Hybrid, for transition purposes

I was wondering if we can imagine having hybrid websites, or a hybrid ‘web’, once SAFE goes live. I can imagine that not everyone will jump to the new network, especially when, in the beginning, there is still not much on it other than (simple) copies of what we like on the current web. It will take a while before people abandon Google and the like, simply because the current web works for their purposes. People won’t feel the need to switch because SAFE doesn’t solve a real problem for them.

I know that one of the promises is ‘cheap and fast’ hosting, but can we serve that data on the current web? This could be a good start if website makers can use it in the background (I can imagine Dropbox using it; I think that is a very solid use case). Can we think of some sort of ‘framework’ that loads SAFE content into a current website, and also mirrors it in the SAFE world?


Nothing to stop you from running a backend on the SAFE network. :slight_smile:


In that case you’d be using SAFE for storage. The problem with this though, from a cost saving perspective, is that bandwidth will usually be a much larger cost than storage, but if you run a hybrid model you still have to provide and pay for the bandwidth. If you were running a very storage heavy website you might see some savings though.
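To make the storage-vs-bandwidth point concrete, here is a back-of-envelope sketch. All the prices and sizes are invented for illustration; they are not real SAFE or cloud pricing.

```python
# Back-of-envelope comparison of monthly storage vs bandwidth cost for a
# hypothetical website. All rates below are made-up illustrative numbers.

STORAGE_PRICE_PER_GB = 0.02    # $/GB stored per month (assumed)
BANDWIDTH_PRICE_PER_GB = 0.09  # $/GB transferred (assumed)

def monthly_cost(stored_gb, transferred_gb,
                 storage_rate=STORAGE_PRICE_PER_GB,
                 bandwidth_rate=BANDWIDTH_PRICE_PER_GB):
    """Split a site's monthly bill into storage and bandwidth components."""
    storage = stored_gb * storage_rate
    bandwidth = transferred_gb * bandwidth_rate
    return storage, bandwidth

# A typical site serves far more data than it stores:
storage, bandwidth = monthly_cost(stored_gb=50, transferred_gb=2000)
print(f"storage: ${storage:.2f}, bandwidth: ${bandwidth:.2f}")
```

With numbers like these, moving only storage onto SAFE leaves the dominant bandwidth bill untouched, which is the cost problem with the hybrid model described above.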


Will there be a bridge between the current internet and Safe network?
Obviously, it’s going to take a long time for large-scale adoption of the Safe network as a replacement for the current internet. So for people who are just dipping their toes in, storing data on the Safe network could be preferable to saving it to cloud services, and it might even be cheaper than using paid encrypted storage services.

So something like https://www.safestorage.biz (that’s the most credible domain, I find :wink: ), which would appear like a site like SpiderOak. You pay the site to store your files, the site converts your payment to Safecoin and allows you to create an account and Put files up to the value you have paid.
You would be sending your credentials through the site, so that’s clearly a weak point, but it could be a stepping stone for moving people onto the network?

Edit: Actually, just thinking… when the safe network goes live, I guess anyone would be able to set up encrypted storage services on the current internet, and essentially create an account/accounts on the Safe network to store the files (obviously with some magical programming in-between to make it look seamless), without even informing users… Would this be a problem, would it be considered abuse, or would it just mean more safecoin for farmers?
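The payment → Safecoin → Put flow described above can be sketched as a tiny in-memory model. Everything here is hypothetical: there is no real SAFE client API in this snippet, and the class and price are invented, just to show the shape of such a gateway.

```python
# Minimal sketch of a clearnet "storage gateway" that fronts SAFE storage.
# Hypothetical throughout: an in-memory model of payment -> allowance -> Put.

class StorageGateway:
    def __init__(self, price_per_mb=0.01):
        self.price_per_mb = price_per_mb  # assumed fiat price per MB stored
        self.allowance_mb = {}            # user -> remaining Put allowance (MB)

    def pay(self, user, amount):
        """Convert a fiat payment into a Put allowance (stand-in for
        exchanging the payment for Safecoin behind the scenes)."""
        self.allowance_mb[user] = (
            self.allowance_mb.get(user, 0) + amount / self.price_per_mb
        )

    def put(self, user, size_mb):
        """Accept an upload if the user has allowance left; a real gateway
        would forward the data to the SAFE network here."""
        if self.allowance_mb.get(user, 0) < size_mb:
            raise ValueError("insufficient allowance")
        self.allowance_mb[user] -= size_mb
        return True
```

Note that in this design the gateway sees both the payment and the files, which is exactly the weak point and potential-for-abuse question raised above.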

If I understand this correctly, wouldn’t it be the case that large static sites would probably save money while securing their assets, but for sites where content changes frequently (a lot of new Puts without increasing the overall size of the site), the cost would be much higher with SAFE?

I think it’s safe to say there would be an ongoing cost with SAFE; how much, we have yet to see. It may be higher or lower than a server, depending on how it’s implemented. So for instance you design a site that updates, but uses users’ data (videos, blogs etc.), and the users pay a little to post them. That’s nice, as users can then control their data and posts, but it may not suit all providers. It should open up many more business models though.
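The static-vs-frequently-updated difference can be put into numbers with a toy Put-cost model. The per-Put price is an invented figure, not real SAFE pricing; the point is only the shape of the curve.

```python
# Illustrative Put-cost model: a static site pays once to upload, while a
# frequently-updated site keeps paying for new Puts even if its total size
# stays flat. The rate below is invented, not real SAFE pricing.

PRICE_PER_PUT_MB = 0.001  # assumed one-off cost per MB written

def upload_cost(initial_mb, monthly_new_mb, months):
    """Cumulative Put cost over a period; reads are free in this model."""
    return PRICE_PER_PUT_MB * (initial_mb + monthly_new_mb * months)

# A 1 GB static site vs a 1 GB site with 5 GB of new user posts per month:
static_site = upload_cost(initial_mb=1000, monthly_new_mb=0, months=12)
busy_site = upload_cost(initial_mb=1000, monthly_new_mb=5000, months=12)
print(static_site, busy_site)
```

Under this model the static site pays once and is done, while the busy site’s bill grows with every update, which is why letting the posting users carry that cost is attractive.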


We’ll see what costs end up being. An interesting part though is that a site owner can choose to let users pay all these update costs if they want to (assuming the updates are made by users posting stuff etc), and users could pay that cost by sharing their unused hard drive space.


I don’t see how “updating data / static website” has much influence on the cost, since the requests come from visitor traffic and not some FTP-ish activity, right?

Basically, from a simple webmaster’s perspective: I want to use the safenetwork as much as possible, as early as possible, and preferably have no increase in costs, and my users will experience no difference, maybe only faster load times :smiley:

The ‘data security’ is not really an argument here, since I, as an admin, still have access to it, and it could still be breached in 1000 ways other than a direct server attack.

Would it perhaps be possible to run a SAFE client in something like Google Native Client, so that a user can go to an unSAFE website and then be ‘teleported’ into a site hosted on the SAFE network?

Google Native Client or WebAssembly have the same restrictions as JavaScript.

How could you expect SAFE to be faster than something like Azure?

Azure, to be clear, can load a page of 100+ resources in under a second.

SAFE is relying on a distributed group of geo-dispersed machines to try and match that.

The browser can’t parallelise 100 transfers. You can prefetch and lazy load, and if you’ve engineered a page to use these types of techniques to the max, you could possibly be relatively quick.
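The value of parallelising transfers can be shown with a toy simulation. Latencies here are simulated with `asyncio.sleep`, not real network requests; the point is that concurrent fetches take roughly as long as the slowest resource, while sequential fetches take the sum of all of them.

```python
import asyncio
import time

# Toy model of loading a page's 100 resources, each with a simulated
# 10 ms round trip. Not a real network client, just a latency model.

async def fetch(latency):
    await asyncio.sleep(latency)  # stand-in for one network round trip
    return latency

async def load_sequential(latencies):
    """Fetch resources one after another (no parallelism)."""
    for latency in latencies:
        await fetch(latency)

async def load_parallel(latencies):
    """Fetch all resources concurrently."""
    await asyncio.gather(*(fetch(latency) for latency in latencies))

latencies = [0.01] * 100  # 100 resources, 10 ms each

start = time.perf_counter()
asyncio.run(load_sequential(latencies))
sequential = time.perf_counter() - start

start = time.perf_counter()
asyncio.run(load_parallel(latencies))
parallel = time.perf_counter() - start

print(f"sequential ~{sequential:.2f}s, parallel ~{parallel:.2f}s")
```

Real browsers sit somewhere in between: they open a limited number of connections per origin, which is why techniques like prefetching and lazy loading matter so much for a page with 100+ resources.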

However, unless all the files are on low-latency SSD dedicated servers/VPSs near the requestor, I’d never bet on SAFE being a speed match for something like Azure. Given that the business model is based on home users’ spare storage, and that farmers will largely be drawn from lower income levels, which means distant and/or cheaper internet connections and hardware, I just can’t imagine your hosting being quicker on SAFE, unless your current hosting is shared hosting in Timbuktu.

Lots of assumptions there @bullrun. I’m not convinced by your assessment but I don’t think there’s much point arguing over this because there’s not enough data to know what the reality will be like. Prepare to be pleasantly surprised is my suggestion :slight_smile:

Yeah, I’m open to being pleasantly surprised.

Then again, if some of the predominant hosting companies on the planet can’t outperform Azure, I’m not holding my breath waiting for a coalition of personal computers’ spare storage to pull that one off.

Maybe I should spend more time looking for the performance optimisation discussions, such as how we can prefetch, force caching, parallelise transfers, and so on. Any pointers to resources or discussions would be appreciated.


I think it is too early for that too, for me anyway. There are many ways speed and latency can be improved. Some will be in the application layer, and others may be addressed in the network. But until we have a better idea of the parameters and how easy it is to work with them, or not, I think it is too speculative to be the best use of our (OK, my) time.


It is true that SAFE, under current and foreseeable conditions, will not match the performance of a fast, simple (dynamic) website for an individual user’s experience.

What is not clear is the speed for hordes of people. On SAFE, each person will experience approximately the same response time whether one person or hordes of people are accessing the safesite. The point where things start to slow down is when the number of people accessing the safesite actually maxes out SAFE itself.

So if SAFE is really small, say 10,000 nodes, then a few thousand people accessing a safesite will be slowing down SAFE itself and thus that safesite. But if SAFE is, say, 10 million nodes and 100,000 people access that safesite, then it’s quite possible that each person (on average) will have the same experience as if one person were accessing it. In fact, caching could even speed up the experience a little, because all the CSS, JavaScript and images are being cached closer to the people accessing the safesite.
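That scaling argument can be sketched as a toy model: response time stays flat while the network has spare capacity, and degrades once the requests landing on each node exceed what a node can serve. All constants here are invented for illustration, not measured SAFE behaviour.

```python
# Toy scaling model for the argument above. Invented constants throughout.

BASE_RESPONSE_S = 0.5        # assumed response time for a single visitor
REQUESTS_PER_NODE_CAP = 2.0  # assumed concurrent requests a node can serve

def response_time(network_nodes, concurrent_visitors):
    """Per-visitor response time as a function of network size and load."""
    load = concurrent_visitors / network_nodes  # requests landing per node
    if load <= REQUESTS_PER_NODE_CAP:
        return BASE_RESPONSE_S  # spare capacity: the network absorbs the load
    # Saturated: response time degrades in proportion to the overload.
    return BASE_RESPONSE_S * (load / REQUESTS_PER_NODE_CAP)

# Small network, popular safesite: nodes saturate and everyone slows down.
print(response_time(10_000, 50_000))
# Big network, same crowd: each visitor sees the single-user response time.
print(response_time(10_000_000, 100_000))
```

The model ignores caching, which as noted above would actually improve on the flat line for popular content, since hot chunks end up replicated closer to the requestors.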