Hello guys, I’ve been following the project for some time now and I was wondering: have you considered hosting Wikipedia? Are there any ongoing developments?
In my opinion this could be a win-win situation for everyone involved: the network would gain a considerable number of supporters (farmers), and knowledge of its existence would spread more quickly (I’m thinking of something like a permanent main page entry calling for support for Wikipedia by joining the SAFE network).
The Wikimedia Foundation, on the other hand, wouldn’t need to ask for ever-increasing amounts of money each fall, since their main expenditure is the servers hosting the site. And given the projected amounts of available space and bandwidth, Wikipedia authors could include more videos and higher-resolution images than they do now. Also, the developers would be able to collect data on how the SAFE network performs at large scale in a live environment.
And not least: people taking part in ‘hosting Wikipedia’ could feel good about providing constant support instead of donating money from time to time.
I like the idea, but the Wikimedia Foundation isn’t going to remove Wikipedia from the regular internet until SAFE utterly dominates the world wide web, so at best this would be a mirror/backup.
I think it is a nice idea, and like all these things we need to be able to demonstrate the network running as we make the approach; otherwise we spend quite a lot of resources discussing theory rather than finishing the network and giving them installers. I think the latter is far more powerful. We have quite a long list of people to approach as we launch, and Wikimedia will be on that list for sure.
@Seneca: I agree, but at least the data served via SAFE wouldn’t be billed to them, which would lessen the need for donations.
I don’t think the goal should be for Wikipedia (and similar services) to unplug themselves from the WWW, at least not for a while. IMO it should be a lot more gradual, and the first step would be to provide a bridging app that looks just like a regular web server to current clients/browsers but in reality keeps all its data on SAFE. You don’t even need to implement a web server; you could ship this app as a plugin for existing web servers (e.g. Apache). I’m pretty new to SAFE, so maybe such an app already exists (or is in the making) and I just don’t know of it?
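To make the idea a bit more concrete, here is a minimal sketch in Python of what such a bridging gateway could look like, purely as an illustration: a plain HTTP server that legacy browsers talk to as usual, while the content itself would be fetched from SAFE. The `safe_fetch` function is a hypothetical placeholder, not an actual SAFE client API.

```python
# Minimal sketch of a bridging gateway: an ordinary HTTP server on the front,
# with content looked up on SAFE behind the scenes.
from http.server import BaseHTTPRequestHandler, HTTPServer


def safe_fetch(path):
    """Placeholder: resolve `path` on the SAFE network and return its bytes.

    The real SAFE client API may look completely different; this just returns
    canned demo content so the sketch runs on its own.
    """
    demo_content = {
        "/": b"<html><body><h1>Served via a SAFE gateway (demo)</h1></body></html>",
    }
    return demo_content.get(path)


class SafeGatewayHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = safe_fetch(self.path)
        if body is None:
            self.send_error(404, "Not found on SAFE")
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # Any legacy browser can now hit http://localhost:8080/ as usual.
    HTTPServer(("", 8080), SafeGatewayHandler).serve_forever()
```

The browser never needs to know SAFE exists; the same logic could just as well live inside an Apache/nginx module instead of a standalone process.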
I think the higher latency of SAFE, plus the remaining bandwidth costs won’t make such a solution an attractive option.
There will probably be some HTTP gateways, but the way I imagine this will be implemented is that a lot of sites will have both a legacy web server and a SAFE presence. If a visitor comes from the old internet, they’re shown a little bar at the top of the screen saying “Hi, you’re visiting a legacy support site whose support will be ending in the future; here are instructions on how to download and install the SAFE plugin for [insert user’s browser here] so you can keep using our site!” Of course worded more nicely, with marketing flair.
Anyway, yeah, for a good while both will be supported side by side.
Hmm… doesn’t latency depend very much on the popularity of the content, and couldn’t it be improved a lot by services adding their own nodes? I don’t think viewers would mind non-popular articles loading a few seconds slower than popular ones.
Somewhat at least, yes, but we don’t know for sure how much yet.
No, not really, since they can’t choose to host their own data when they farm.
On SAFE you at least get privacy and parallel downloads back, but if there’s a classic server in between you lose those benefits and are left with only the higher latency.
You don’t even need to implement a web server; you could ship this app as a plugin for existing web servers (e.g. Apache). I’m pretty new to SAFE, so maybe such an app already exists (or is in the making) and I just don’t know of it?
Take a look at http://safepress.io, a project that will enable websites to be hosted on SAFE while being accessed through a standard browser with the planned MaidSafe plugin.
On latency, I’m not sure @Seneca is right here. For typical pages, which these days load tens of files from different URLs, I wonder if SAFE will actually be quicker due to parallelism, superior routing, and caching.
Wikipedia appears to encourage mirroring: see the Wikipedia:Mirrors and forks page.
It looks like the site can be statically generated from the XML and DB dumps.
It just needs someone to maintain the mirror on the SAFE network, really. Obviously, the PUTs would still need to be paid for, but technically it would all be possible.
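As a very rough illustration of the static-generation step, here is a Python sketch that walks a Wikipedia XML dump (dumps are published at https://dumps.wikimedia.org/) and writes one HTML file per page, which could then be PUT onto SAFE. The dump filename is only an example, and a real mirror would render the wikitext into proper HTML rather than escaping it as this sketch does.

```python
# Rough sketch: turn a Wikipedia XML dump into static HTML files that could
# then be uploaded (PUT) to SAFE. The dump schema details are assumptions;
# check against the current MediaWiki export format.
import html
import pathlib
import xml.etree.ElementTree as ET


def local(tag):
    """Strip the XML namespace so we can match on plain element names."""
    return tag.rsplit("}", 1)[-1]


def export_static(dump_path, out_dir, limit=100):
    out = pathlib.Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    title, text, written = None, "", 0
    # iterparse streams the file, so multi-GB dumps don't need to fit in memory.
    for _, elem in ET.iterparse(dump_path):
        if local(elem.tag) == "title":
            title = elem.text
        elif local(elem.tag) == "text":
            text = elem.text or ""
        elif local(elem.tag) == "page" and title:
            # `text` is raw wikitext here; a real mirror would run it through
            # a wikitext-to-HTML renderer instead of just escaping it.
            page = (f"<html><head><title>{html.escape(title)}</title></head>"
                    f"<body><h1>{html.escape(title)}</h1>"
                    f"<pre>{html.escape(text)}</pre></body></html>")
            safe_name = title.replace("/", "_").replace(" ", "_")
            (out / f"{safe_name}.html").write_text(page, encoding="utf-8")
            written += 1
            elem.clear()  # free the finished <page> subtree
            if written >= limit:
                break


if __name__ == "__main__":
    # Example filename only; use whatever dump you actually downloaded.
    export_static("enwiki-latest-pages-articles.xml", "static_wiki")
```

The resulting directory of static files is exactly the kind of content a mirror maintainer would need to pay PUTs for when uploading to the network.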
Wikipedia, for all its benefits, is still highly centralized and censored. I think it does need more mirroring, forking, and competition.