So let’s say company XYZ has a current or future requirement for secure cloud computing, and let’s assume one objection might be: “Well, I don’t know how I feel about storing this data on random people’s computers — and, for no reason other than a psychological one, I don’t like not having some of my own data secured by me.”
Using that objection, can we tie this back to XOR distancing? If I put my own data onto the network and I run my own farm from the same premises, shouldn’t the network theoretically store that data, or some of it, on my own farm? I fully understand there is no way to actually know whether, which, or how much data is stored on my own farm, but does this at least pass the theoretical test?
No — location shouldn’t matter, because it doesn’t affect the XOR address, which is all the network knows about.
I should add that my calculation is slightly inaccurate because I didn’t factor in duplication. The probability is higher by the number of copies (on average) of each chunk kept by the network. It will still be a tiny probability, though, unless you store a significant proportion of the network.
Imagine a football stadium full of buckets (the whole network) and you have ten of them, out of say 100,000. (bucket = vault.)
Then tennis balls (the data chunks) are thrown at random and every ball lands in a random bucket. All the balls are blue except for yours which are red.
If you have ten buckets, that’s 0.01% of the total storage, so there’s a one in ten thousand chance that any given ball lands in one of your buckets.
With five duplicate copies of each chunk, that becomes five times more likely per chunk: a one in 2,000 chance that at least one copy of a given red ball lands in one of your ten buckets. And since you have ten red balls, that’s roughly a one in 200 chance that at least one of them ends up in one of your ten buckets.
Put another way, to have a significant chance of storing some of your own data you’d need to store a significant proportion of the total network.
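The bucket arithmetic above can be checked with a few lines of Python. The numbers (100,000 vaults, 10 of your own, 5 copies per chunk, 10 chunks of your own data) come straight from the example; the “at least one” probabilities are computed exactly rather than by the rough multiplication, which is why they come out very close to, but not exactly, 1 in 2,000 and 1 in 200.

```python
N_BUCKETS = 100_000   # total vaults on the network (the stadium of buckets)
MY_BUCKETS = 10       # vaults you run yourself
COPIES = 5            # duplicate copies the network keeps of each chunk
MY_CHUNKS = 10        # red balls: chunks of your own data

# Chance a single copy of a chunk lands in one of your buckets
p_copy = MY_BUCKETS / N_BUCKETS                 # 0.0001, i.e. 1 in 10,000

# Chance at least one of a chunk's copies lands with you (~1 in 2,000)
p_chunk = 1 - (1 - p_copy) ** COPIES

# Chance at least one of your chunks ends up on your own vaults (~1 in 200)
p_any = 1 - (1 - p_chunk) ** MY_CHUNKS

print(f"per copy:  1 in {1 / p_copy:,.0f}")
print(f"per chunk: 1 in {1 / p_chunk:,.0f}")
print(f"any chunk: 1 in {1 / p_any:,.0f}")
```

Scaling MY_BUCKETS up shows the same point made above: the probability only becomes significant when you run a significant fraction of the whole network.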
Hope that helps rather than confuses (and that I got my maths right — haven’t had a cup of tea yet).
Think multi-scale SAFE. Build a home/business SAFE network of 2+ computers that acts as your local cache of all your files. Then have this same local SAFE NFS data backed up to the global SAFE network at specific intervals (maybe every X seconds or minutes if you are paranoid). The same computer network you set up to run vaults for your local SAFE could also run other vaults for a regional or planetary SAFE network to serve GETs, thus having multiple functions. If you need your files, they will always be served most efficiently by your local network. Only in the case of some local disaster, like a fire or flood, would you need to pull your backups down from the global SAFE Network to your newly rebuilt local SAFE network.
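The “back up every X seconds or minutes” idea could be sketched as a simple timer loop. This is only an illustration: `backup_to_safe` is a hypothetical placeholder, since the real upload mechanism would depend on whichever SAFE client API you use.

```python
import time

BACKUP_INTERVAL_S = 300  # push local changes to global SAFE every 5 minutes


def backup_to_safe(local_dir: str) -> None:
    """Hypothetical hook: push changed files from the local SAFE network
    to the global SAFE Network. A real implementation would call the
    client API of your SAFE node/vault software here."""
    print(f"backing up {local_dir} to the global SAFE Network ...")


def main() -> None:
    # Endless backup loop; the interval is the knob for how paranoid you are.
    while True:
        backup_to_safe("/mnt/local_safe")
        time.sleep(BACKUP_INTERVAL_S)


if __name__ == "__main__":
    main()
```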
And of course that could be a USB drive instead of a local home SAFE network.
Personally, I would have a mixture of local-only files (temp), local+SAFE files (for speed and sharing), and SAFE-only files. For the important files, I will keep both local and SAFE copies.