This is a wiki; click the edit button at the bottom of this post to add your files.
The time has come for the community to be able to show off the files they have uploaded to this genesis network.
Looking for videos, pictures, text, and so on.
I will ask that people not include test files that are of no interest to anyone. The first entry is only there as an example, and I will delete it later when others have added their files.
NOTE: replies to this topic will be deleted regularly.
This is a wiki, and I ask for one bullet-point entry per upload (file). Let's keep it short; people can explore the files for themselves. A bullet-point list is probably the best layout, since it keeps the description with the datamap address even on phones and other narrow displays. Use the following format: <datamap address> Description
1b5a77a34e172ea3c6596b17eb33bc4e64aa8d2ed378d8be9a52f4a37083eb1f - A song @NAFO_Radio wrote about being a little kid.
Wow, this is interesting! If we can get some distros uploaded, they could work as alternatives to the usual download links, which are often slow. Can we get Ubuntu 25.04 on there?
I am planning to save that one for an official release; I don't want to run out of ISOs too quickly.
Arch Linux, downloading in under 4 minutes:
🔗 Connected to the Network
Fetching file: "archlinux-2025.04.01-x86_64.iso"...
Successfully downloaded data at: 9875177d76c9768edbabe048ad2b2846b8a9de0286bd5e1097813cc0dc75128f
real 3m31.901s
user 0m56.296s
sys 0m13.078s
Would be nice to see Autonomi download links next to the other 'mirrors' on the download pages of many distros and other big-file download sites (game mods etc.) in the not-too-distant future. It would be great advertising while providing a valuable service.
Downloaded perfectly via AntTP too! I got about 5 MB/s (about 40 Mbit/s) over my home connection (while running a bunch of nodes etc.). The MD5 checked out too.
Also, I tried another 2.6 GB ISO download earlier with 32 threads, and my connection couldn't keep up. I dropped AntTP threads to 16, hit retry, and it resumed from 2 GB and completed just fine!
I hadn't thought about it, but the Range headers used for video streaming, fast-forwarding etc. are also used for file downloads. So we get file resume for free!
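To make the "resume for free" point concrete, here is a minimal sketch (not AntTP's actual code) of how an HTTP client resumes a download: send a `Range` header starting at the byte count already on disk, and check for a `206 Partial Content` reply before appending.

```python
def resume_headers(bytes_on_disk: int) -> dict:
    """Build request headers asking the server to skip what we already
    have: an open-ended byte range starting at bytes_on_disk."""
    if bytes_on_disk <= 0:
        return {}  # nothing downloaded yet; do a plain GET
    return {"Range": f"bytes={bytes_on_disk}-"}

def is_resumable(status: int, headers: dict) -> bool:
    """A 206 Partial Content response with a Content-Range header means
    the server honoured the range, so we can append to the partial file."""
    return status == 206 and "Content-Range" in headers
```

For example, with 2 GB already downloaded, `resume_headers(2 * 1024**3)` yields `{"Range": "bytes=2147483648-"}`, which matches the "resumed from 2 GB" behaviour described above.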
toivo@toivo-HP-ProBook-450-G5:~$ time ant file download 142925b90fc5f6e38807713303fbdc91bfcd893a3c0f597e53764087fd70b8ed . #md5sum 094aefdb1dbbdad8aa99600f8413789b ubuntu-24.04.2-desktop-amd64.iso 5.8Gb
Logging to directory:
🔗 Connected to the Network Fetching file: "ubuntu-24.04.2-desktop-amd64.iso"...
⠙ [00:13:35] [----------------------------------------] 0/1
Killed
real 14m32.948s
user 6m3.249s
sys 3m26.582s
export CHUNK_DOWNLOAD_BATCH_SIZE=16
toivo@toivo-HP-ProBook-450-G5:~$ time ant file download 142925b90fc5f6e38807713303fbdc91bfcd893a3c0f597e53764087fd70b8ed . #md5sum 094aefdb1dbbdad8aa99600f8413789b ubuntu-24.04.2-desktop-amd64.iso 5.8Gb
Logging to directory:
🔗 Connected to the Network Fetching file: "ubuntu-24.04.2-desktop-amd64.iso"...
⠙ [00:13:35] [----------------------------------------] 0/1
Killed
real 14m32.948s
user 6m3.249s
sys 3m26.582s
export CHUNK_DOWNLOAD_BATCH_SIZE=8
time ant file download 142925b90fc5f6e38807713303fbdc91bfcd893a3c0f597e53764087fd70b8ed . #md5sum 094aefdb1dbbdad8aa99600f8413789b ubuntu-24.04.2-desktop-amd64.iso 5.8Gb
🔗 Connected to the Network Fetching file: "ubuntu-24.04.2-desktop-amd64.iso"...
⠤ [00:25:02] [----------------------------------------] 0/1
Killed
real 25m25.601s
user 8m35.064s
sys 4m32.363s
export CHUNK_DOWNLOAD_BATCH_SIZE=4
time ant file download 142925b90fc5f6e38807713303fbdc91bfcd893a3c0f597e53764087fd70b8ed . #md5sum 094aefdb1dbbdad8aa99600f8413789b ubuntu-24.04.2-desktop-amd64.iso 5.8Gb
Logging to directory:
🔗 Connected to the Network Fetching file: "ubuntu-24.04.2-desktop-amd64.iso"...
⠉ [00:47:11] [----------------------------------------] 0/1
Killed
real 47m41.690s
user 9m34.740s
sys 5m37.576s
Now I have it running with batch size 2, but I might go to sleep before it finishes, so I decided to report the results so far.
It seems that it just gets killed at a certain level of completion. It will be interesting to see what batch sizes 2 and 1 do, eventually.
Also, it has been interesting that even the larger batch sizes haven't interfered with my other internet use on this same laptop. This forum and other sites have been responsive the whole time.
It seems like the file might be too large for your machine to hold all the chunks in memory while composing the final file?
The streaming feature is not available yet, hence all fetched chunks are kept in memory; only once all chunks have been fetched is the final file composed and flushed to disk.
Within the client logs, you can check the number of chunks fetched at the time of the kill across runs, to see if they all failed at roughly the same threshold.
I'm considering storing the chunks in a file instead of memory too, to reduce the footprint further. For now, the memory usage is modest though, especially when chunks are flowing smoothly.
It uses channels, and the receiver implements the futures::Stream trait, so it plays nicely with Actix.
Believe it or not, I had started to wonder about that myself too. And it seems to be the case: I checked 3 log files, and the largest chunk number fetched is always 117.
That's something to fix, of course, but the bigger news for me is that it seems I can download with batch size 32 without causing any trouble on our home network. Good job!