Perpetual Auction Currency

How do you know P for a movie ticket or a beer? You remember roughly how much you paid recently. If you’re short on money, you may offer a bit less at first; if you absolutely need it done right away, you may offer a bit more to make sure. Again, it’s like a free market.

All of this needs to be handled by the client software, of course. Users could set things like “I want my money to last for at least 200 GB” or similar, and the client software would set P accordingly (though preferably as low as possible). Warnings should appear if the request is unrealistic or if rising storage costs make it so. Something like the mobile data indicator on phones would be useful and already familiar.
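To make that concrete, here’s a rough sketch of what the client side might do. The chunk size, the function names, and the idea of bidding at the cheapest recently accepted price are all assumptions for illustration, not an actual API:

```rust
const CHUNK_SIZE: u64 = 1024 * 1024; // assume 1 MiB chunks for the example

/// Pick a bid per chunk so `balance` should cover at least `target_bytes`,
/// or return None (i.e. warn the user) if that looks unrealistic at the
/// prices that were recently accepted.
fn bid_per_chunk(balance: u64, target_bytes: u64, recent_accepted: &[u64]) -> Option<u64> {
    let chunks_needed = (target_bytes + CHUNK_SIZE - 1) / CHUNK_SIZE;
    let affordable = balance / chunks_needed.max(1); // most we can pay per chunk
    // Lowest recently accepted price; a real client might use a low percentile.
    let floor = *recent_accepted.iter().min()?;
    if affordable < floor {
        None // "your money won't last for the requested amount at current prices"
    } else {
        Some(floor) // bid as low as plausibly acceptable
    }
}

fn main() {
    let recent: [u64; 4] = [12, 15, 11, 14];
    match bid_per_chunk(5_000_000, 200 * 1024 * 1024 * 1024, &recent) {
        Some(p) => println!("bidding {} per chunk", p),
        None => println!("warning: budget unlikely to last for the requested amount"),
    }
}
```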

If we wanted to take the market idea really far, we could implement a limit order book with users on the bid side, vaults on the ask side, and chunks waiting in the book to be accepted for storage. Just musing, not suggesting.
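Purely to picture the musing (nothing here is proposed for the real network), a toy version of that matching could look like this:

```rust
use std::cmp::Reverse;
use std::collections::BinaryHeap;

// Client PUT offers on one side, vault asking prices on the other;
// a chunk gets stored when the best bid meets the best ask.
struct Book {
    bids: BinaryHeap<u64>,          // client offers, highest first
    asks: BinaryHeap<Reverse<u64>>, // vault asks, lowest first
}

impl Book {
    fn new() -> Self {
        Book { bids: BinaryHeap::new(), asks: BinaryHeap::new() }
    }

    /// Match one waiting chunk if the best bid covers the best ask.
    fn try_match(&mut self) -> Option<(u64, u64)> {
        let (&bid, &Reverse(ask)) = (self.bids.peek()?, self.asks.peek()?);
        if bid >= ask {
            self.bids.pop();
            self.asks.pop();
            Some((bid, ask))
        } else {
            None
        }
    }
}

fn main() {
    let mut book = Book::new();
    book.bids.push(15);
    book.asks.push(Reverse(12));
    println!("{:?}", book.try_match()); // Some((15, 12))
}
```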

Exposed by whom? I imagined the client would attach it to the request.

Though not absolutely necessary, there could be an informational API call for getting the recent average price, the price at which 99% of the requests were accepted, the last accepted price, or something else that’s both useful and easy to keep track of. As it’s just informational, there’s no need to try to get it absolutely correct.
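A minimal sketch of what such a call could return, assuming the section keeps a short list of recently accepted bids; the struct, the field names, and the reading of the “99%” figure are made up for illustration:

```rust
struct PriceInfo {
    recent_average: u64,
    p99_accepted: u64, // one reading: 99% of recently accepted bids were at or below this
    last_accepted: u64,
}

fn price_info(recent_accepted: &[u64]) -> Option<PriceInfo> {
    let last_accepted = *recent_accepted.last()?;
    let recent_average = recent_accepted.iter().sum::<u64>() / recent_accepted.len() as u64;
    let mut sorted = recent_accepted.to_vec();
    sorted.sort_unstable();
    let idx = (sorted.len() * 99 / 100).min(sorted.len() - 1);
    Some(PriceInfo { recent_average, p99_accepted: sorted[idx], last_accepted })
}

fn main() {
    let info = price_info(&[10, 12, 11, 15, 13]).unwrap();
    println!("avg {} / p99 {} / last {}", info.recent_average, info.p99_accepted, info.last_accepted);
}
```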

If we want to stick to the “saved once, stored forever” paradigm, the result must be a flat-out rejection. I’m not personally convinced that’s a useful paradigm in the general case, but redundancy can only be tuned in very coarse steps when using simple copies as we do, so I’m not sure your suggestion is viable here.

TL;DR The request should be refused.

I’m not sure if that’s what you also meant, but having a free market would remove almost all metrics from the network since we no longer need to invent the perfect government. Individual agents would keep their various metrics but, since all that is outside the core, it’s a very different situation.

But they are. It’s just that the close group (currently of size N) must be a bit larger (M) to accommodate a market process in which only N < M nodes get to store any given chunk.

How do we know the address maps to an actually existing chunk, though? That is, how do we know whether we need to punish a vault for not being able to return data for a given request? So, the problem of storing some information about chunks already exists, at which point it takes little more than an additional bitmap in that entry to also note which N of the M nodes should keep a copy of it.
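For what it’s worth, that extra bookkeeping could be as small as something like this; the types and sizes are made up, the point is only that the per-chunk entry the section already needs can carry a small holder bitmask:

```rust
use std::collections::HashMap;

type ChunkAddr = [u8; 32];

struct ChunkEntry {
    paid_price: u64, // whatever the section already records about the chunk
    holders: u16,    // bit i set => close-group member i must be able to return it
}

struct SectionIndex {
    chunks: HashMap<ChunkAddr, ChunkEntry>,
}

impl SectionIndex {
    /// Should member `i` be punished for failing to serve `addr`? Only if the
    /// address maps to a real chunk *and* that member is a recorded holder.
    fn is_accountable(&self, addr: &ChunkAddr, i: u8) -> bool {
        self.chunks
            .get(addr)
            .map_or(false, |entry| entry.holders & (1 << i) != 0)
    }
}

fn main() {
    let mut index = SectionIndex { chunks: HashMap::new() };
    index.chunks.insert([0u8; 32], ChunkEntry { paid_price: 11, holders: 0b0000_0111 });
    println!("{}", index.is_accountable(&[0u8; 32], 2)); // true: member 2 holds it
    println!("{}", index.is_accountable(&[0u8; 32], 5)); // false: member 5 does not
}
```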

In a distributed network, the best consistency we can hope for is “eventual”. I don’t believe any sufficiently flexible hand-coded method could produce a more globally consistent value than a free market, or that a free market would be more consistent than a hand-coded method, for that matter. However, the market has desirable properties that rigid hand-coded methods lack.

Within any time period, each client will interact with many random vaults and each vault will interact with many random clients. If that won’t result in some global consistency, I don’t know what would.

The same. People don’t care about chunks. They store files that may span multiple chunks (thus, sections) so a single upload would already need to deal with multiple payments of potentially different amounts.

Many great questions here. I vote for assigning orphaned chunks to the cheapest vault in their new close group and paying the difference from network balance.
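A sketch of how that assignment could work; the vault struct, the asking price, and the “top-up from the network balance” are all illustrative assumptions rather than any actual mechanism:

```rust
struct Vault {
    id: u8,
    ask: u64, // price this vault currently wants per chunk
}

/// Pick the cheapest vault in the new close group for an orphaned chunk and
/// return (vault id, top-up to be paid from the section's network balance).
fn reassign_orphan(close_group: &[Vault], originally_paid: u64) -> Option<(u8, u64)> {
    let cheapest = close_group.iter().min_by_key(|v| v.ask)?;
    Some((cheapest.id, cheapest.ask.saturating_sub(originally_paid)))
}

fn main() {
    let group = [Vault { id: 1, ask: 20 }, Vault { id: 2, ask: 14 }];
    println!("{:?}", reassign_orphan(&group, 10)); // Some((2, 4))
}
```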

What should happen if the given section runs out of money? It isn’t all that different from running out of space as a result of a vault leaving the section, though, so there may already be some ideas floating around.

2 Likes

There’s a sign saying the cost. I was more interested in that ‘signage’, as in, how does a client see what storecost nodes are currently accepting? The mechanism seems interesting since clients presumably can’t directly query destination nodes, so is there some easy way to get a list of storecosts so P can be set? Not trying to be negative about it, just trying to work out some logistics (which apply to many economic systems as well as the one you’re describing, so I’m approaching this as a generally useful idea to explore).

I see now (from the way you describe it) that storecost would be more like clients ‘try it and see’, rather than ‘know and pay’. I originally approached it as ‘nodes dictate prices to clients via signage’, but it seems you’re aiming for the emphasis to be more like ‘clients specify their desire’ and nodes accept or reject it, so clients, rather than nodes, are the primary origin of storecost values, while nodes remain the final authority over them. Is this on the right track?

Yes, sorry, I wasn’t clear; we’re in agreement on this. It’s a really nice feature of the market: it simplifies the core an enormous amount.

True, but people do care about cost, specifically about planning their costs and not being charged more than they ‘should’ be. I think maybe this is easily dealt with by saying ‘the network is not for those people’, but I think it’s reasonable to have some expectations around costs and predictability. Perhaps the software will be able to handle it automatically so it’s good enough for almost everyone.

For example, I expect bitcoin transactions to be pretty cheap, but I’m not going to take a punt on my fee. I’m going to look at the current fee pricing and balance my fee based on that before I do my transaction. So I think many people will want to know ‘success or fail of upload for x fee’ before they try to spend their coins. It’s a user experience thing, right? Maybe also a habit thing?!


For me the main change in perspective between the free market concept and PAC is the idea of ‘storecost signage’: PAC has a really clear advertised storecost, vs the free market, where storecost is less explicit and more emergent. It’s pretty cool, but I’m not sure how I’ll go with the ‘anxiety’ of an unadvertised storecost value. Maybe it would be a non-issue; I’m not sure until I try.

Thanks for the clarifications.

2 Likes

I already addressed that a few paragraphs above:

Here, however, I responded to your question about this:

People care about storing their stuff, and whether the price is different vault-to-vault (as I proposed) or section-to-section (an inherent feature of a distributed system), it makes no difference to them: the client software needs to handle different prices within a single document since it can span multiple chunks, thus sections, thus prices.
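Just to illustrate the client-side handling, assuming some fixed chunk size and a per-section quote function (both invented here), the total cost of one upload is simply the sum of whatever each chunk’s section charges:

```rust
const CHUNK_SIZE: usize = 1024 * 1024; // assumed 1 MiB chunks

// `quote_for_section` stands in for however the real client would learn the
// price for the section that a given chunk maps to.
fn upload_cost(file: &[u8], quote_for_section: impl Fn(usize) -> u64) -> u64 {
    file.chunks(CHUNK_SIZE)
        .enumerate()
        .map(|(i, _chunk)| quote_for_section(i)) // each chunk may cost differently
        .sum()
}

fn main() {
    let file = vec![0u8; 3 * CHUNK_SIZE + 1]; // 4 chunks
    // Pretend consecutive chunks land in sections quoting 10, 11, 12, 13.
    let total = upload_cost(&file, |i| 10 + i as u64);
    println!("total cost: {}", total); // 46
}
```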

1 Like

@JoeSmithJr, How do you think the market idea would hold up to node relocations? I think it should hold up fine but wanted to explore it a bit:

Seems to me that users and nodes set storecost based on an agreement to store data, but how well does the storecost value account for future costs which won’t be incurred by the initial storage nodes?

Presumably the initial storecost set by the node must incorporate all [unknown future quantity of] requests for this chunk plus any external future costs (eg the cost to relocate and take on different chunks). So the uploader could be seen to be paying for the future GETs of some other chunks… Does that sound right? Not saying this is bad, just trying to grasp the mechanics and incentives behind it.

To rephrase from a different perspective, it would suck to pay for uploading a chunk and find that relocated nodes thought it was unfair to accept that chunk in the future because they weren’t directly paid to store it (which seems like faulty logic but the idea may seem valid to some). Does a market ensure perpetual data, or does it risk dropping ‘unpaid’ chunks at relocation?

Perhaps I’m questioning the perceptions rather than the mechanism itself…?

1 Like

This isn’t anything new though. Maybe putting it into market terms made it easier to recognize an already existing problem? We will start losing data when the incentives fall short of the costs. The rules may punish certain actions but the maximum punishment is exile. Coincidentally, nodes would “go into exile” voluntarily when they aren’t paid fairly, so no rules can fix the problem of running out of space (including already used space, that is, old data) if participation isn’t profitable.

None of that depends on the details of how payment is arranged, so we’re not dealing with a problem about the mechanisms of the economics but about the basics of supply, demand, and cost.


Let me add a comment: this forum is full of appeals to altruism about people donating their “unused space and bandwidth” because “it doesn’t cost them anything beyond what they’ve already paid for”, but that is wishful thinking at best. One does not build the New Internet™ on hopes and dreams.

2 Likes

I don’t recognise this on the forum. I have though argued that the already sunk costs to which you refer mean that people will be willing to farm for lower rewards than purely commercial operators - who will direct capital where they expect most return. The effect can be framed as altruism, but I don’t agree - I think it is a psychological finger on the scales that will help keep the cost of storage lower than in a purely commercial environment. To what degree I don’t know, it may be insignificant, but the direction is downward pressure. That’s one reason for us to try and facilitate use of spare resources; others are democratisation and decentralisation, both key aims of the project.

Altruism may also play a part; there will be some who run a vault because of this, I’m sure, but that doesn’t mean that running home vaults is always altruistic - people will earn, and they will value that. At the same time many will not regard their cost of farming as significant - they’ll make a judgement based on whether they see their wallet earnings increase enough to bother. If you call that altruism, maybe we should also regard people sharing their data with facebook as altruistic! :stuck_out_tongue_winking_eye:

4 Likes

That is the way I frame it too. Some people at home will be happy to come out each month with some coin. And thinking about it, they may even be happy if it isn’t quite paying the electricity. How many people mine coins expecting the price of the coin to be many times the current price at some stage in the future?

This is not altruism but people wishing to make something by running a farm, expecting what they make to cover the incremental cost of what they already have set up at home, or even to recover the cost of a small SBC (a hundred dollars or less) over 24 months.

They would expect to recover the cost at the current price of the coin or expected future price.

Then of course some here have said they would run a vault anyhow since the cost is very minimal. But that is a special case of users.

3 Likes

I get what you’re saying. I exaggerated on purpose but I did mean those cases as well. It takes either a nerd (“new toy!” – until he needs the space for more hentai), someone extremely stingy (running a home computer does not incur substantial cost), or a speculator (“future value!”) for any of that to matter, none of which comprises either a reliable or a sufficient user base for hosting the future internet.

Hosting vaults must become continuously and sufficiently profitable for the network to start growing and then stay alive.

3 Likes

I am not sure what you are considering the incremental cost to a home user will be, but it is basically the extra electricity for the activity of the drive and CPU to run the app. For some it will be unmeasurable.

1 Like

Possibly, but nobody is saying they won’t or shouldn’t be part of the solution. You seem to be arguing against a straw man - no offence meant - but rather than ‘exaggerating on purpose’ the discussion would be more useful with precision IMO. Why try to polarise it?

3 Likes

You misunderstood. I was talking about the other side: the benefit. Unless it’s substantial, only weirdos will care. Not enough to run the FI™.

It was a comment that managed to side track the topic. My fault. Move it to another thread maybe?

But, if we’re at it now, complaining about lack of precision won’t help the discussion either. This is a serious issue, one that will decide the future of the Safe network.

Any move from the status quo requires not just something good but something that is perceived as significantly better. Privacy, security, and other such things that we notice only in the odd case when we’re affected by their abuse don’t qualify. I’m purposely ignoring the minority who care about these precisely because they are a minority. Ask a few friends if they have checked themselves on haveibeenpwned, or if they use different passwords for their banks, email, and WoW account, and you’ll see. Those are the currently existing free and low-tech solutions for what the Safe network is trying to achieve. Nobody who doesn’t use them will install a vault either.

It’s a lost game if all we have are hopes that privacy, security, or marginal, potential, or imaginary economic benefits will pull in enough people to host vaults. I’ve lived long enough to know that will not happen; it never did, and this time isn’t different. It needs to be profitable enough that people can scale it up to provide substantial passive income, or it will not happen.

2 Likes

Again I think you are arguing with a ghost.

2 Likes

I agree to a point … but the code isn’t going to disappear and the network may not either, hence there should be opportunity down the track to address problems and inadequacies one way or another - even if worst comes to worst and the network requires a hard reboot.

At this point, I personally just want to see the baby get out the door. We’ve waited a loooong time for it … and I’m sure I’m not the only one tiring of the pregnancy.

1 Like

@JoeSmithJr, does the market idea only cover storecost or do you have ideas how it may work for the reward / coin creation side of the economy too?

The only thing I found talking about the reward is this, so I wondered if you’ve had any progress on the reward side of things?


I still haven’t got much further with ideas (just letting them percolate for a while) but wanted to float a concept for the fans of algorithmic control of the network:

Storage, bandwidth, compute power etc all have an unknown future, maybe seeing plateaus in growth, maybe seeing breakthrough spikes in performance or efficiency, but not on any clear timeline. The unpredictability of the change (which compounds over time) makes it challenging to build an algorithm around those all-important health metrics.

However, latency is one metric we can be sure won’t improve by much more than a factor of two, because we’re already getting close to hard limits of physics (mainly the speed of light). Building a future-proof metric around latency (which can only ever improve by roughly a factor of two at most) would probably be a lot safer than building it around any of the others (which may improve massively or not by much, over an unknown time scale).
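A quick back-of-the-envelope check of that claim (all numbers rough): even with perfectly straight fibre between antipodes, the speed of light in glass puts a hard floor under the round-trip time, so today’s worst-case latencies can only shrink by something on the order of 2x:

```rust
// Rough figures only: ~20,000 km between antipodes, light in fibre ~200,000 km/s,
// and ~300 ms as a ballpark for today's worst intercontinental round trips.
fn main() {
    let antipodal_km = 20_000.0_f64;     // half the Earth's circumference
    let light_in_fibre_km_s = 200_000.0; // roughly 2/3 of c
    let floor_rtt_ms = 2.0 * antipodal_km / light_in_fibre_km_s * 1000.0;
    let typical_rtt_ms = 300.0;
    println!(
        "physical floor: {:.0} ms, typical today: {:.0} ms, max improvement: ~{:.1}x",
        floor_rtt_ms,
        typical_rtt_ms,
        typical_rtt_ms / floor_rtt_ms
    );
}
```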

Any thoughts on this as a possible mechanism? I think latency can be used for an economic incentive structure, and it takes care of the rest automatically (big claim for sure). I think it’s also the only metric that when pushed to the extreme simultaneously increases security and performance. Most other metrics increase one at the cost of the other.

5 Likes

As someone who uses AWS heavily, I can say that one big piece of the itemized bill is bandwidth usage (it’s never just about the storage costs but also the data transfer costs, especially between regions). I do think this needs to be seriously taken into account for the Safe network. As much as the focus has been on hard drives, connectivity and its price need to be considered because 1) it’s a resource that most of us have much less power over, and 2) cloud providers charge for it separately for a reason. Right now, connectivity is being bundled into the PUT cost, but it might be better kept granular, as another lever the network can use for seamless function.

Potential concerns with a bandwidth-based mechanism alone:
+ How do you ensure farmers store unpopular, stale data that’s rarely requested?
+ How would the network measure bandwidth then price it?
+ Don’t you run the risk of incentivizing a network of local ISPs rather than that of storage?

A much simpler solution would be to let vault operators set the price of their service to the network because they then can include all of their costs (bandwidth, storage, electricity, etc.) which are local to them. The market/bidding proposal for farming looks like it would best stand the test of time.

5 Likes

Yes it has been considered and is part and parcel of the rewards the vault gets.

Obviously if the all-in-one reward method is not good enough then yes other methods would need to be considered.

But it is reasonable to consider that bandwidth usage is somewhat related to the GETs done on the vault, and as such rewarding for GETs can also reward for the computer usage, the electricity, the storage and the bandwidth required to run a vault.


On another note, the home vault is using spare resources; need I say more about the economics of that?

The user always has the option to agree to run a vault or not, and since it is a not a requirement to use the network then the network can set the price it is willing to pay a vault operator.

By allowing the vault owner to set the price, we would have another set of cloud providers who set the price as high as possible, and we would lose one of the original goals of the network: it would no longer be the SAFE network, but one of Kim’s mega networks.

4 Likes

I got to thinking that maybe there is another solution, one in the middle!

Let’s say there is additionally a voting system, so:

  1. the network alone decides the rewards and PUT price

  2. the vaults have a voting system where they “bid” or request more money

So if the rewards are x and all vaults (or most vaults) vote for more rewards, then the system adjusts a bit to make all vault operators happy.

If the network sees fewer people PUTting data into the network, then it can lower the rewards, and so on!

edit: I imagine it as a voting system where the vaults can say “I am about to stop being a vault, please do something about it!”

1 Like

It already is a voting system. The vaults vote “price is good, here are your chunks and send me some more” by staying connected to the network. The vaults vote “you’re crazy, the price is too low to be worth my effort” when they leave. Based on section population dynamics, the network is free to adjust the price to maintain growth, health, or other metrics.
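To spell out one possible version of that adjustment: the 5% step, the target section size, and the join/leave counters below are all invented for illustration, not the actual algorithm:

```rust
// Treat churn as an implicit vote: pay more when vaults are leaving or the
// section is under its target size, pay less when it is growing comfortably.
fn adjust_reward(current_reward: u64, joined: u32, left: u32, section_size: u32, target_size: u32) -> u64 {
    if left > joined || section_size < target_size {
        // Vaults are "voting with their feet": raise the reward to attract/keep them.
        current_reward + current_reward / 20
    } else if section_size > target_size && joined > left {
        // Plenty of willing vaults: the network can pay less.
        current_reward.saturating_sub(current_reward / 20).max(1)
    } else {
        current_reward
    }
}

fn main() {
    println!("{}", adjust_reward(100, 1, 4, 60, 64)); // 105: churn up, pay more
    println!("{}", adjust_reward(100, 5, 1, 70, 64)); // 95: growing, pay less
}
```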

6 Likes

The biggest voting system is people removing their vaults if dissatisfied. In the original system (and I gather in the upcoming one) this would result in rewards increasing till enough are satisfied. This guarantees the lowest price for the world’s users and creates a system that allows poorer people to participate by storing their data on the network.

Remember, one of the goals was to include as much of the world as possible, which includes countries where people earn a lot less than most in this forum.

EDIT Drats beaten by @jlpell

5 Likes

You guys treat vaults like things when in reality they are just extensions of people, and one thing about people is that they are fickle. In other words, someone who pulls the plug on the latest Great New Thing™ is unlikely to give it another chance a month later. Consequently, a scaling strategy summed up as “just leave if you don’t like the price” is terribly sub-optimal for adoption.

1 Like