TLDR: StoreCost is a rate limit, not a fee for service. Rewards ensure permanence, not security. Membership rules ensure security.
Regarding 'storecost is a rate limit, not a fee for service':
In the past I’ve said things like 'storecost infuses chunks with their future costs', which implies some sort of unpopular-data-pays-for-the-needs-of-popular-data idea. Since then I’ve changed my mind.
Storecost is not a fee for service because:
- We don’t know the cost of services into the future.
- Costs for different operators will vary by location, labour, experience, equipment, etc.
- Costs for each chunk of data will vary by popularity, geolocation, age, etc.
So storecost can and should only be a form of rate limiting or spam protection.
To illustrate why, imagine a hypothetical scenario with no storecost: upload is free and constrained only by the ability of vaults to process it.
In this case there would be a queue of data. Maybe it’s a first-in-first-stored situation. Maybe it’s randomly selected. Maybe all new data is accepted and the most unpopular old data is dropped as the new data comes in. Whatever it is, the missing element of all these is a consistent way for vaults to decide whether or not to store new incoming data.
Storecost aims to bring consistency to that decision, ie it allows the client to attach a priority or value (via a fee amount) to new data (or, to look at the same idea from the other side, it allows the network to specify the threshold for spam / not spam).
So I guess what I’m saying is storecost should be as low as possible so that upload is economically constrained (ie by value) rather than technologically constrained (ie by bandwidth or disk space). We don’t want to allow absolutely everything, eg random data, but we do want to allow as much valuable data as we can. Storecost allows setting the incoming data flow as high as possible, but not so high that it damages the network.
A related tangent: the initial constraint on upload will probably be the difficulty (or ease) of obtaining safecoin. The extra cost, difficulty and friction of obtaining safecoin will prevent a lot of data from being uploaded compared to, say, a free-but-rate-limited situation. But as safecoin becomes more commonly available and its acquisition and spending are simplified, the storecost will become the only limit on uploads, rather than the other safecoin-related frictions. I think the current UX ideas for payment and onboarding are fantastic, which helps reduce the friction as a user takes the journey from initial discovery of SAFE to uploading data.
The current design of storecost does indeed aim at this, since it’s based on spare space, which is probably the main factor in deciding how much to rate limit.
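To make the rate-limit reading concrete, here’s a minimal toy sketch (my own illustration, not the actual network formula) where the price per chunk rises as spare space shrinks, so uploads are throttled harder the closer storage gets to full:

```rust
// Toy storecost: the price per chunk rises as spare space shrinks,
// throttling uploads harder the closer the section gets to full.
fn store_cost(total_space: u64, used_space: u64, base_cost: u64) -> u64 {
    let spare_ratio = total_space.saturating_sub(used_space) as f64 / total_space as f64;
    // Near-empty: cost stays around base_cost. Near-full: cost climbs steeply.
    (base_cost as f64 / spare_ratio.max(0.01)).round() as u64
}

fn main() {
    println!("{}", store_cost(1_000_000, 100_000, 10)); // plenty of space: ~11
    println!("{}", store_cost(1_000_000, 990_000, 10)); // nearly full: ~1000
}
```

The exact curve doesn’t matter for the argument; the point is that the price is a throttle derived from network state, not an estimate of the cost of serving the chunk.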
Regarding 'rewards ensure permanence':
Since storecost is not actually funding the ongoing security and delivery of SAFE network services, this must be done by the reward.
It’s very important that the reward is not biased in favour of popular data. It should reward the retention of all data equally. Data fungibility.
Pay-for-GET (in its simplest form) would mainly benefit popular data. Farmers would be doing a cost-benefit analysis of keeping unpopular data, and at some point that unpopular data would be worth dropping. The reward mechanism should ensure it never becomes viable to drop unpopular data (and this is why it’s important for the storecost to put an initial value hurdle on uploaded data, so that the network is justified in keeping it).
Burstcoin uses storage in the form of computed farming-data. The key idea is that farming-data is relatively costly to obtain but relatively cheap to utilise for farming, so farmers store it rather than continuously re-obtain it, hence proof of storage. Data chunks in SAFE are similar in being relatively costly to acquire but relatively cheap to utilise for farming. It’s important that a) all data is utilised approximately equally for farming, and b) only legitimate data can be used, ie you can’t use cheap home-made data for farming, only costly client-uploaded data.
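As a toy illustration of that 'costly to acquire, cheap to utilise' property (my own sketch, not a SAFE design), a vault could be asked to prove it still holds a chunk by hashing the chunk’s bytes together with a nonce it couldn’t have known in advance; answering is trivial if the data was kept and expensive if it has to be re-fetched:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// A vault proves it still holds a chunk by hashing the chunk's bytes together
// with a nonce it couldn't have known in advance. Cheap if the chunk was kept,
// costly if it has to be fetched again just to answer.
fn storage_proof(chunk_bytes: &[u8], nonce: u64) -> u64 {
    let mut hasher = DefaultHasher::new();
    chunk_bytes.hash(&mut hasher);
    nonce.hash(&mut hasher);
    hasher.finish()
}

fn main() {
    let chunk = b"client-uploaded chunk contents";
    let nonce = 42; // in practice this would come from the network, unpredictably
    println!("proof: {}", storage_proof(chunk, nonce));
}
```

The 'only legitimate data' requirement would then come from only ever issuing challenges over chunks that clients actually paid to upload, so home-made data earns nothing.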
This insight into the reward mechanism and its relation to storecost is important because it shows what not to do. We can’t treat storecost as a benefit to farmers. We can’t treat rewards as a type of fee-for-direct-service; rather, they are a fee-for-aggregate-and-future-potential-service.
Some ideas to illustrate this concept:
- rather than rewarding a GET request for a specific chunk, reward when that (probably popular) GET request is used in combination with some existing (probably unpopular) data. This should make unpopular data just as accountable, useful, and valuable as popular data for the purpose of rewards (see the sketch after this list).
- the frequency and size of rewards should be matched to the rate of change of the network, which is subject to unpredictable shifts in participation and technology. Upload is constrained by total available bandwidth, relocating nodes and joining new ones take time, and rewards should reflect these temporal constraints. GET and PUT volume will vary through the days and the years. Availability of new storage will go through famines and gluts. The reward should be responsive to these things while primarily ensuring the availability of all previously uploaded data. Note that ‘availability’ is mainly ‘can it be downloaded’.
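As a toy sketch of the first bullet (my own framing, all names hypothetical): a vault only earns a reward when it serves the requested chunk and also answers a challenge over a pseudo-randomly chosen chunk from its whole store, so dropping unpopular data makes some rewards unclaimable:

```rust
// All names here are hypothetical. The vault earns a reward only when it serves
// the requested (probably popular) chunk AND proves it holds a second chunk
// chosen pseudo-randomly from everything it stores (probably unpopular).
struct Vault {
    stored_chunks: Vec<Vec<u8>>, // everything this vault holds, popular or not
}

impl Vault {
    fn serve_get(&self, requested: usize, challenge_nonce: u64) -> Option<(Vec<u8>, u64)> {
        let requested_chunk = self.stored_chunks.get(requested)?.clone();
        // Every stored chunk is equally likely to be audited over time, so
        // dropping unpopular data eventually costs the vault rewards.
        let audit_index = challenge_nonce as usize % self.stored_chunks.len();
        let audit_chunk = &self.stored_chunks[audit_index];
        let proof = audit_chunk
            .iter()
            .fold(challenge_nonce, |acc, b| acc.wrapping_add(*b as u64));
        Some((requested_chunk, proof))
    }
}

fn main() {
    let vault = Vault {
        stored_chunks: vec![b"popular".to_vec(), b"unpopular".to_vec()],
    };
    println!("{:?}", vault.serve_get(0, 7)); // serves chunk 0, audits chunk 7 % 2 = 1
}
```

The effect is that a popular GET can’t be monetised without also retaining the rest of the store, which is the data-fungibility property described above.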
Questions to ponder:
Are the rewards and storecost also related to security? I think a little, but the membership rules (join / disallow / relocate / age / punish / kill) are the biggest factor here; as a gut feeling, maybe 90% of security comes from the membership rules and 10% from the reward/storecost mechanisms.
Where should the storecost go? To the initial vault handling the upload? To the final ones storing it? To all vaults on the upload route? To the network as a whole? I think the existing recycling solution (ie storecost is given to the network as a whole to give as future rewards) is a pretty good approach.
How should the reward mechanism be designed to cover all data rather than only popular data?
Should it be possible for storecost to get to zero? If not, how close to zero could it get?
What tools can we give farmers to best understand how to run their operations? For example, if they have lots of spare disk space but their bandwidth is maxed out, how can this best be explained to them?
Can spare space and spare bandwidth and spare compute be measured in a useful way? Or is it only useful to measure stress / failure results and aim toward a certain degree of stress?
If storecost is a rate limit, how does it work for vaults with varying abilities? Not all vaults would want to rate limit the same way, but storecost is a kind of universal rate limit.
As per the topic title, these points aim to clarify potential misconceptions with the safecoin economy, but may themselves be misconceptions. What do you think?