When a user posted on Discord that Autonomi would be great as storage for a YouTube competitor, it got me thinking: early on, with no caching nodes, video streaming to hundreds of simultaneous clients might not be optimal.
So what might be the best apps and use cases early on, to maximise Autonomi’s potential and to create popular use cases that draw more and more people in and get them using Autonomi?
My thoughts so far:
A Dropbox-killer app with a superior UI: something like an advanced file explorer, but with a simple interface for setting up backups.
A browser add-on for making transactions between users or sharing files; in my mind something similar to MetaMask but also different, maybe simpler yet more powerful.
A messaging app.
News from the most popular news sites, summarized and translated into the languages of countries where foreign news is blocked.
What are your thoughts on how best to optimize the early Autonomi network to create the most traction and gain popularity? Which apps and features?
Many thanks, that is so cool, I hadn’t thought about that. I remember sitting in my college apartment in 2014 looking for new tech that would make a huge impact; after I understood the Solow-Swan model, and due to my health, I needed a life-changing investment, high risk/high reward. I fell in love with the idea and the tech, and it felt like the right path for the future. Those who wait for something good never wait too long, they say.
It’s amazing to see so many who have stuck around for a long time and that we are getting so close now. Cheers to all, never stop fighting and believing.
I think interoperability with other stuff would be nice. Like if Autonomi could function as a torrent seed and maybe scrape things from the BitTorrent DHT. That way, you’d have the benefit of multiple seeders increasing download speed, but you wouldn’t run into the problem of something having no seeds.
Also backing up various archives like the one below is the first thing I’d use Autonomi for.
Even with everybody watching the same video it will be smooth and a perfect fit… Nodes each hold a tiny fraction of the whole file and have plenty of upload bandwidth.
Exactly the same millisecond?.. Let’s say the same second… 0.5 MB × 5k requests = 2,500 MB ≈ 20,000 Mbit of demand for that single chunk… On your 4 Mbit line, one 0.5 MB chunk is about a second of full load… With each holder’s upload capped at e.g. 200 Mbit/s and the load split across ~5 copies, that worst case drains in roughly 20 seconds without any caching mechanisms, and the player’s buffer hides most of it… Nothing super brutal…
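For anyone who wants to poke at those numbers, here’s a tiny back-of-envelope script. The chunk size, replica count and upload cap are all illustrative assumptions on my part, not confirmed Autonomi parameters:

```python
# Worst case: every viewer requests the SAME 0.5 MB chunk within the same second.
# All figures are illustrative assumptions, not measured Autonomi parameters.

CHUNK_MB = 0.5        # assumed chunk size, megabytes
VIEWERS = 5_000       # simultaneous requests for that one chunk
REPLICAS = 5          # assumed number of nodes holding a copy
UPLOAD_MBIT = 200     # assumed upload cap per node, Mbit/s

total_mbit = CHUNK_MB * 8 * VIEWERS       # 0.5 MB = 4 Mbit per request
per_node_mbit = total_mbit / REPLICAS     # load splits across the copies
drain_seconds = per_node_mbit / UPLOAD_MBIT

print(f"aggregate demand: {total_mbit:,.0f} Mbit")                        # 20,000 Mbit
print(f"per replica:      {per_node_mbit:,.0f} Mbit")                     # 4,000 Mbit
print(f"drain time:       {drain_seconds:.0f} s at {UPLOAD_MBIT} Mbit/s") # 20 s
```

And as the next posts argue, that everyone-on-one-chunk case is astronomically unlikely anyway; real demand spreads out across all 20,000 chunks of the video.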
I mentioned that on Discord when it was brought up, and it seems no one else really got to see it. I suspect some just don’t accept the analysis we have made.
A video with 20,000 chunks and 100 people watching would have to win the lottery for even 5 of them to be grabbing the same chunk at the same time (or the same 5 chunks, with buffering in the player). It’s more likely that many seconds pass between successive fetches of any one particular chunk.
But if you increase that 100 to 10,000, then the chance of 2 people grabbing the same set of chunks is very high. And 5 people is perhaps 50-50, and even then you’d have 5 nodes supplying that chunk to the 5 people.
That is hardly a load at all. Even 100,000 people watching the same 20,000-chunk video will be possible and likely fine; better than YouTube in most cases of 100K people watching a decent-length video. A short video has only a small chance that 100K people who want to watch it all do so within the precise same minute or two.
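To put a rough number on that lottery: if each of N viewers sits at a uniformly random position in a 20,000-chunk video, the number of viewers on any one particular chunk is roughly Poisson-distributed. A hypothetical sketch (the uniform-position model is my assumption):

```python
import math

def p_at_least_k_on_chunk(viewers: int, chunks: int, k: int) -> float:
    """P(at least k viewers are fetching one PARTICULAR chunk right now),
    assuming each viewer sits at a uniformly random chunk position."""
    lam = viewers / chunks  # expected viewers per chunk
    # Poisson tail: 1 - P(fewer than k viewers on this chunk)
    return 1 - sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))

for viewers in (100, 10_000, 100_000):
    p = p_at_least_k_on_chunk(viewers, chunks=20_000, k=5)
    print(f"{viewers:>7,} viewers: P(>=5 on one chunk) ~ {p:.2e}")
```

Under that model the 5-way collision on a given chunk is vanishingly rare at 100 viewers and only becomes a coin flip around 100K viewers; across all 20,000 chunks some overlap does become likely at the bigger counts, but even then several holders share each chunk’s load.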
Now, I proposed that the problem will be live streaming, say a soccer event where 100K or more people want it live. Then each chunk is being read within seconds of it being written. Caching will definitely be needed for that to work smoothly. Otherwise a different live-streaming method will be needed, perhaps reverting to old-internet style specifically for live streaming. But that’s fine, as the old internet had to invent add-ons to do stuff like that anyhow.
That’s yet to be implemented.
Although, after discussing active records with Qi_ma, I am pretty sure that there are many, many nodes holding a particular chunk, which means a form of caching is probably already there. Just not the kind originally proposed, which relied on the hopping mechanism that chunk store/retrieval was originally going to use.
Thinking about that: with 2 KBucket bins of nodes potentially holding a chunk, there are more than 5 copies to serve up most chunks, so even 100K people watching a 20,000-chunk video will not have any issues. Live streaming, yes, there will be issues, but not a random 100K watching a very popular recorded video.
Sounds positive, but even if they don’t try to access the same chunk at the same time, 1080p streaming on YouTube is about 5 Mbit/s. How are nodes, without caching nodes, going to handle that for 100, 1,000, 10,000 viewers and so on? Will chunks be spread so evenly over the whole network that all nodes share the bandwidth load?
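For a rough sense of scale, here’s the same kind of back-of-envelope sketch for steady-state streaming. The 5 Mbit/s per stream, 20,000 chunks and ~5 holders per chunk are all illustrative assumptions:

```python
# Steady-state streaming load per chunk holder.
# Illustrative assumptions only: 5 Mbit/s streams, 20,000 chunks, 5 copies each.

STREAM_MBIT = 5
CHUNKS = 20_000
REPLICAS = 5

for viewers in (100, 1_000, 10_000, 100_000):
    aggregate = viewers * STREAM_MBIT      # total demand, Mbit/s
    serving_copies = CHUNKS * REPLICAS     # distinct copies able to serve
    per_copy = aggregate / serving_copies  # average load per copy, Mbit/s
    print(f"{viewers:>7,} viewers: {aggregate:>9,} Mbit/s total, "
          f"~{per_copy:.3f} Mbit/s per chunk holder")
```

So the average load per chunk holder stays tiny if chunks really are spread evenly across the network; the everyone-on-one-chunk hot spot is the worst case sketched earlier.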
But I’m not sure it’s an obvious choice for soon after launch, at least until a trading platform for Autonomi-based assets has been demonstrated, and a mechanism for wrapping / bridging various crypto assets into Autonomi-based tradable tokens has been demonstrated and thoroughly tested.
A good bridging mechanism will also enable Autonomi to be used as a kind of ‘L2’ for any bridged assets, so it will be a big deal if it’s shown to be possible.