Economic reasoning for Storage Emissions in White Paper

Economic reasoning for Storage Emissions – do the reasons given make sense?

For a TL;DR, see the summary at the bottom of this post!

Firstly, I want to say that I’m not against the 70% of token supply entering circulation over time. But I do think there needs to be a good reason given for it, and I don’t think the reasoning given in the White Paper makes it clear that it’s required, or that it will be beneficial.

I’ve sometimes said 49% relating to Storage Emissions, but I think the same reasoning applies to the full 70% if similar methods are expected for future non-storage resource provision.

Here are two quotes from the White Paper that provide the reasoning for storage emissions:

Physical infrastructure-based decentralized platforms require upfront infrastructure availability before usage can scale. To drive node network growth, new token emission incentivises node operators independent of current network usage levels.

The above implies that node supply and user demand aren’t expected to scale in a coordinated manner without intervention. I think this underestimates the coordinating ability of the market, provided prices can clearly signal demand levels to node operators, and supply levels to uploaders.

It also suggests it’s a good idea to incentivise node operators to supply more resources than are demanded at current usage levels of the network, rather than letting the network’s market balance supply and demand.

The foundational premise of Autonomi asserts that larger network sizes correlate with enhanced performance and heightened security. However, the dynamic pricing mechanism, while effective in providing a supply-and-demand-based incentive structure that augments rewards as demand rises, operates retrospectively. In essence, rewards only increase once new demand has emerged. This retrospective nature can potentially delay network growth due to market inefficiencies. The time lag between increased prices and the subsequent generation of additional resource supply could impede network expansion and adversely affect user experience.

This quote builds on the thinking that the network’s market (through Dynamic Pricing alone) may not be sufficient to coordinate supply & demand as the network grows, due to the retrospective response of supply to increases in demand.

The market should be good

I expect that when demand pushes storage cost a bit above the emerging ‘norm’, new nodes will quickly be spun up by opportunistic individuals / organisations. There will be a race to supply new nodes to catch the rewards when they’re juicy, so the network will keep up with demand. Some will likely automate a process to spin up new VPS instances at certain price levels. If demand continually outpaces supply, then perhaps a higher price is exactly what’s needed to keep things in balance, rather than being something to try to fight.

When demand is slow, new nodes won’t be incentivised to join, but why should they need to be if resource supply & demand are in balance, and operators are responsive when demand is strong?

Even if it were the case that in the early days of the network some kind of incentive were needed to help supply keep up with demand, I can’t imagine an optimal solution would be a multi-decade intervention that emits 49% of the token supply in a way that isn’t linked to demand levels, making this reasoning for the Storage Emissions questionable in my view.

Market inefficiencies

It is stated in the second quote above that a lag between demand pushing pricing upwards and nodes responding to this could be considered a market inefficiency. To me that sounds like a very efficient market if the pricing signal is clear. It’s normal to have some lag in supply responding to demand, even if in this case it’ll likely be very small.

Potential issue 1: Reduced network responsiveness to demand increases

An effective way to cause market inefficiencies is to reduce the potency of pricing signals between market participants, which is one potentially significant effect of the proposed Storage Emission plan.

For example, imagine a scenario where increased demand under Dynamic Pricing alone might push up store cost & node earnings by 10%. If Storage Emissions represented 50% of a node’s income at that time, the increased demand would only lead to a 5% increase in node income, which may not inspire the increase in node provision that is warranted by the increase in demand, due to the dilution of the price signal.
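To put rough numbers on that dilution, here’s a toy calculation (the 50% emission share and 10% uplift are just the illustrative figures from the example above):

```python
# Toy calculation: how a flat emission stream dilutes the price signal that
# demand sends to node operators via data payments.

def income_change(data_payment, emission, demand_uplift):
    """Fractional change in total node income when data payments rise by
    `demand_uplift` while emissions stay flat."""
    before = data_payment + emission
    after = data_payment * (1 + demand_uplift) + emission
    return (after - before) / before

# Emissions at 50% of income: a 10% rise in data payments moves income by only ~5%
print(income_change(data_payment=1.0, emission=1.0, demand_uplift=0.10))  # ~0.05
# No emissions: the full 10% signal reaches node operators
print(income_change(data_payment=1.0, emission=0.0, demand_uplift=0.10))  # ~0.10
```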

There are many variables, but the principle is that this intervention, which is intended to solve potential market inefficiency, is highly likely to cause market inefficiency.

Potential issue 2: Distorted market

One thing I love about Autonomi’s market is that it can work purely on supply and demand, with no ‘block subsidy’ needed to secure the network as with Bitcoin, or ‘Staking Rewards’ as with ETH… a pure market for real resources that is scalable and should work over long periods of time seems simpler and better to me.

Bitcoin’s change in economics when the blocksize limit was reached was dramatic in how it affected usage of the network due to rapidly rising fees. No more paying for Beer at the ‘Royal Dick’ (a pub in Edinburgh that accepted Bitcoin… does anyone here remember that?), and many emerging Bitcoin utilising business models became unsustainable as fees rocketed due to changing network economics.

The Storage Emissions as proposed would likely have the intended effect of pushing supply of resources ahead of demand. This may not be a good thing. Due to Dynamic Pricing, it would push down the cost of storing data on the network below the level at which nodes will be willing to provide similar resources on a long-term basis, effectively subsidising the cost of storing data below the market level.

This could be problematic as people start using the network in a way that seems great, but isn’t sustainable as the Storage Emissions reduce and the true market price becomes apparent.

Do we want people / companies in the early days figuring out how to use Autonomi in their daily life / product offerings, only to face steadily rising costs of using the network over time due to reducing Storage Emissions, and making previously enjoyed use cases too expensive? Would it be better if they could see a more accurate long-term price emerging for services on the network?

Of course it won’t be as dramatic as what happened with Bitcoin. It’s hard to say how significant this distortion will be, and the rate of change would be very low due to the slow rate of change in emissions, but it’s very likely that it will exist and may be unhelpful rather than beneficial.

As a side thought, it may be that some node operators would be better incentivised to provide extra resources in the early days if they expected that earned tokens would have a higher future value than if diluted by the release of the Storage Emissions. If this were a common position, the Storage Emissions may not stimulate the additional resource provision hoped for, and no emissions might be more effective in stimulating early participation than big emissions. (I expect the emissions would on balance stimulate more additional supply than zero emissions would… I doubt most operators will be thinking that long term)

Proposed tweak to Storage Emissions

Here’s a suggestion of a tweak to the Storage Emissions plan that may be preferable in some ways.

The tweak here allows the market to operate naturally and as efficiently as possible up to a certain point.

Above this point, the cost-curve to uploaders is made more shallow, while payments to node operators are kept in line with the original Dynamic Pricing curve. The difference would be funded with Storage Emissions.
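As a rough sketch of the idea (the curve shape, threshold and damping factor below are placeholders I’ve made up for illustration, nothing from the White Paper):

```python
# Hypothetical sketch of the tweak: below a chosen fullness threshold uploaders
# and nodes see the same Dynamic Pricing curve; above it, uploaders pay a
# shallower curve while nodes still earn the original curve, with the gap
# topped up from the Storage Emissions pool.

def dynamic_price(fullness):
    # Stand-in for the White Paper's curve: price steepens as nodes fill up.
    return 1.0 / (1.0 - fullness)

def tweaked_prices(fullness, threshold=0.7, damping=0.5):
    node_reward = dynamic_price(fullness)
    if fullness <= threshold:
        # Market operates untouched; no emissions flow.
        return node_reward, node_reward, 0.0
    base = dynamic_price(threshold)
    # Uploaders only feel a fraction of the rise above the threshold...
    uploader_price = base + damping * (node_reward - base)
    # ...and Storage Emissions fund the difference paid to nodes.
    emission_top_up = node_reward - uploader_price
    return uploader_price, node_reward, emission_top_up

for fullness in (0.5, 0.7, 0.8, 0.9):
    uploader, node, emission = tweaked_prices(fullness)
    print(f"{fullness:.1f}  uploader pays {uploader:.2f}  node earns {node:.2f}  emission {emission:.2f}")
```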

Compared with the White Paper’s proposal, this tweak should;

  • Magnify, rather than reduce, price signals to increase storage in response to demand increases above a certain point
  • Reduce market distortions because additional incentives to nodes are proportional to demand
  • Reduce market distortions because the price to uploaders will not be subsidised below a certain point, so there is less potential for data pricing ending up way below a sustainable level

This would still lead to a market distortion above a certain point, but as it’s occurring where the pricing curve is steep, it should quickly lead to more nodes responding to higher prices to ensure more than sufficient resource provision.

One thing this wouldn’t do is ensure the full 70% are emitted in a predictable timeframe. Release would be dependent on demand level, so it may either never fully happen, or may happen quickly if the market settles deep in the yellow zone.

Summary

In summary, I expect that the market mechanism with Dynamic Pricing alone should be sufficient to balance supply and demand on the network & that the Storage Emissions as described are unnecessary and possibly harmful.

To me, the economic rationale given in the Whitepaper felt more like an attempted rationale to explain the 70% emissions, rather than the 70% emissions being a solution to clear problems.

Issuance of 70% of the eventual token supply is a huge event, and I feel a genuine and clear rationale should be given to justify such a significant action. What is given seems highly unsatisfactory (though of course, I could just be wrong!)

One rationale for the 70% emissions that makes sense to me is the justifiable ethical / ideological thinking that early investors should control 100% of the initial supply & not face dilution, but that most of the token supply should emerge through rewards to those who contribute by running nodes.

If something along these lines is the true rationale, I’d like to see it laid out clearly, rather than what is, in my view, weak economic reasoning for a huge supply dilution.

Anyway, those are my long-winded thoughts on the Storage Emissions described in the White Paper.

I hope it makes sense, and provokes some good discussion even if my thinking turns out to be all wrong :smiley:

Edit: further along in this discussion, my thinking has moved on from ‘how to optimise emissions distributions to nodes to avoid pitfalls’ to ‘allocating emissions to nodes is unnecessary; where could the huge value of emissions be allocated to far greater effect?’

8 Likes

I’m not following the chart image you have there. Would you be able to elaborate on that more?

Without the 70% (in any way/shape/form), the rate paid goes up as fullness increases. Your chart shows this, but I don’t understand your proposal to add in some amount of the 70% here.

Off the top of my head, I would think such addition would be bad - as it seems it would cause a drop in the price of the token and complicate making a calculation of the cost-benefit of adding new nodes. E.g. I get more tokens, but the tokens are worth less … so negating the benefit of adding new nodes to some extent.

But again, I don’t understand your graph, so maybe I’m missing something.

My personal gut feeling here is that if the 70% is a done deal (and I assume it is), then it would be best to emit them over time at a steady and EXPECTED rate. This gives the market the ability to price them in and steadies the market.

Adding them in unpredictably may lead to unexpected volatility which could be detrimental to the entire network.

Anyway, thanks for taking the time to do a deep dive on this. :beers:

2 Likes

It isn’t a huge event but a process that spans decades. From just reading your summary (thanks for that!) I don’t share your concerns at this point. I found the rationale explained to be a relief - a way that I expect will handle issues that could otherwise threaten the network.

I won’t be able to see if that makes sense without understanding more about the emissions mechanism, but I think the idea is sound, though I haven’t time to analyse in detail (or read your full post for now).

3 Likes

Sure. It’s a modification of the chart from the White Paper, which shows the Dynamic Pricing concept.

Original chart:

Tweaked chart:

The change from the White Paper image is highlighted with the yellow area. On this version I’ve also changed one line to green to highlight that in the region where Storage Emissions would be active;

  • the price paid by uploaders would be the price at the green line on the lower side of the yellow area
  • the rewards received by nodes would be on the blue line on the upper side of the yellow area

The difference between the price paid and the reward received would be paid from the Storage Emissions pool.

So, it’s linking the Storage Emissions with the Dynamic Pricing to reduce the inefficiencies and distortions caused by a blanket Storage Emission as described in the White Paper.

Hope that clarifies what’s going on in the chart?

My big question here is: what is the rationale for the 70%? The economic rationale provided seems incorrect, or at least insufficient, to me.

If the emissions are supposed to reduce market inefficiencies, why does the proposal increase them, and is it really likely that the market can’t balance supply and demand without these emissions?

Having a set emissions schedule would certainly be more predictable than having a demand-led emissions, but would lead to the issues I’ve mentioned;

  • Reduction of supply sensitivity to demand increases
  • Unsustainably low store-costs (worse at the beginning of emissions, reducing over time)

@TylerAbeoJordan, what do you think of the reasons given for the Storage Emissions in the White Paper? Do you think they make sense, and that the emissions will improve the functioning of the market for Autonomi network resources?

2 Likes

When you’re talking about a currency, or asset, or future value of current earnings, the fact that over 10 years an action will be taken that basically doubles the available supply is a huge event, even if not a fast one.

Good to hear. What issues do you think may threaten the network if not for the distribution of Storage Emissions?

Yes, I look forward to hearing how the actual distribution mechanism will function… though I expect you’ll understand the technical ins-and-outs of that far better than I will!

3 Likes

Yep. I get what you are thinking here now. Again though, I think it would distort the market in a negative manner and potentially cause volatility. For example, the higher prices in the original model suppress uploading while encouraging more space. Your proposal would even out prices, but that negatively affects the network’s ability to alleviate a shortage of storage space. Effectively negating the mediation of the network.

In the OG white paper I suspect (don’t know) that it was to spread out the token supply over time - emulating bitcoin’s method of not handing it all out at once to a lucky group of early adopters. I don’t believe there is any pure economic rationale for the 70% – as you point out.

That’s hard to quantify. It could possibly be unsustainable, but not necessarily so. If the network grows enough in the beginning, then there may be no problems here.

I have expressed concerns though about the sustainability of the permanent data model - mainly that if the network fails to grow, it could trigger a collapse of the network as prices for storage may increase, causing demand to continue to shrink - spiraling until nodes can no longer earn and walk away. In the case of there being extra emissions from the 70% pool, these may alleviate that possibility by giving nodes something - even if it declines in value (via inflation of the currency).

I’ll have to go back and reread it; I’m not clear on what reasons are given in the new white paper. My views are in regard to what I believe were the old reasons - that it prevents the OGs from owning the entire supply. So I can speak to that reason - I think it makes sense from a marketing perspective and facilitates a growing network – as people new to the project will see it as being more financially decentralized. From a pure economic perspective though, I don’t think it’s helpful on its face.

Economic calculations are impossible to model though – which is why we need markets.

2 Likes

Here’s a quick version of the response I gave to a question about the approach during the Q&A the other night, to give you a taste:

So the genesis event, the inception of the network, will create 30% of the overall supply at T-0.

You’ll be able to use the DAG, should you want to, to trace back through parent transactions back to that event.

But when it comes to emissions, the Network is creating these new tokens, based on certain conditions, over 30+ years. It’s like a series of mini-genesis events all across that time.

Because it’s unacceptable to have these tokens withheld by the foundation, or just lying around in the network somehow. They need to come into being over time… so there is no honeypot.

The way it works in bitcoin, as you’ll know, is that money appears in each block as block rewards.

We will have a similar chunk reward system, where chunks at certain chosen addresses are stored by nodes.

Data comes in all the time, addresses depend on the hash of this data, and it’s easy to verify and hard to predict/mine. Chunk rewards are triggered by the storage of chunks at selected reward addresses… and this is when the new token is created, and given as a supplement to the node, on top of the data payment for that data.

During bitcoin verification, money is traced all the way back to the verifiable block where it appeared; in our system, the DAG could have multiple sources starting from verifiable chunk reward addresses.

So we have a pre-determined set of reward events/addresses: a fixed number of them, with fixed values that, summed up, equate to the supply we want to distribute. Similarly to bitcoin, as time passes the distribution will slow down as the probability of encountering those reward addresses diminishes.
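As a very rough illustration of the shape of it (the prefixes, amounts and hashing details below are invented for the example, not the actual scheme):

```python
import hashlib

# Illustrative only: a pre-determined table of reward address prefixes with
# fixed amounts that, summed over the whole table, equal the supply to emit.
REWARD_TABLE = {
    "00ab3f": 100.0,
    "17c9d2": 100.0,
    # ... many more entries, with values tuned so the total matches the supply
}

def chunk_address(chunk: bytes) -> str:
    """Content-addressed: a chunk's address is derived from the hash of its data."""
    return hashlib.sha256(chunk).hexdigest()

def emission_for_chunk(chunk: bytes) -> float:
    """Emission triggered by storing this chunk, if its address hits a reward
    prefix; paid on top of the normal data payment, otherwise zero."""
    addr = chunk_address(chunk)
    for prefix, amount in REWARD_TABLE.items():
        if addr.startswith(prefix):
            return amount
    return 0.0
```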

At the moment we are saying that 70% of that 70% will be distributed over the decades like that… and that could continue to the full supply.

But we also know that there will be evolution of the Network, and that we may want to supplement other types of work that nodes may do in the future, such as compute.

So we refer to that potential in the paper… but that supply isn’t held anywhere, and it will be up to network participants—in other words node operators—to opt in to that through a future network upgrade (like a form of fork) to make that happen… or they just continue as data emissions.

There will be a full write up on this in due course, but I hope that gives more colour to it for now!

8 Likes

Absolutely - brilliant to hear a bit about the proposed mechanism for distribution. Sounds very interesting.

Do you think supplementing rewards is necessary for the efficient operation of the network’s markets for resources?

2 Likes

Yes, my proposal would even out prices (prevent them rising so much as the network fills), but it would also magnify the increase in rewards to nodes, which should make the network alleviate a lack of storage supply more quickly than without the intervention.

Saying that, I agree it would be a distortion that would still likely cause more harm than no emissions at all. But it should be an improvement vs the White Paper proposal.

This is my impression as well.

As you say modelling this would be impossible given all of the dynamics and moving parts.

But it is safe to say that, given the Storage Emissions rate is planned to be highest in the first year, which is also when circulating supply is lowest, this emission will form a significant part of nodes’ income, and will therefore encourage supply at prices below what will be sustainable as emissions reduce.

I don’t think this effect will be reduced if the network grows rapidly, because rapid growth will likely cause token price to rise, therefore increasing the value of the Storage Emissions to nodes & maintaining the effective subsidy on storage.

Absolutely.

As I said, I’m not against the 70% emissions in principle, but I feel that having a well functioning market is highly important for the network, so want to question things that may reduce the efficacy of the network’s market for resources.

2 Likes

I like your input and understand your points.

Different supply and demand shocks could create short-term inefficiencies which could impact the network in the short term.

I support the view that supply should only be inflated if there is a good reason, and that the best thing is a market of pure supply/demand.

With a demand shock where storage cost becomes steeper, I don’t see that becoming a large problem, as the gap would probably be filled fast by new nodes wanting to grab some of that extra profit. So I don’t see why, during a price rise, there would be a need to keep the price of storage down, as I believe it will be solved quite fast.

Also, increased demand gives higher prices, which also cools the demand for storage while new nodes come in.

More problematic are, for example, negative price shocks which cause a negative supply shock; the network could suffer a significant shrink in size as a percentage of total nodes goes offline, until the price goes up and it becomes more profitable again. That is where I would, especially in an early stage, maybe support incentives if necessary.

It would maybe be good to draw a supply/demand chart and think through the consequences of different supply/demand shocks and what they could lead to.

Just some late night thinking out loud, logical mistakes might be there.

One of the worries is: if something were to affect the whole crypto market and the price crashed by -95% or similar, how do we make sure that storage cost covers expenses so that a huge % of the network doesn’t go down? But the curve shown in the white paper gives hope that it can handle dramatic price swings and balance the network.

3 Likes

A couple of questions please.

  1. How are these addresses generated?
  2. If an absolute address:
  • How can you be sure that that address will ever have a chunk stored at it?
  • The 256 bit address range is HUGE, and
  • it would take a billion years or more to even suggest coming close to having a chunk stored at even a tiny portion of it
  3. Or how will the address be considered stored at?
  • some sort of only using the upper 64 bits???
  4. What is to stop a gamer from generating billions of chunks (a few bytes long) a day till they get chunks that hash to addresses where rewards are at, and using the ones that correspond to their nodes being close?

Being open source code, these addresses will be known.

3 Likes

Yeah, it won’t be a lottery over the full 256 bit address space; it’ll be chunks being stored at certain prefixes that will trigger a payout. Nodes will get rewards for being the first to store a chunk at an address that matches that reward condition.

So once a close group gets such a chunk, the node in that group whose node address is closest to the chunk address can claim the supplement, as it’s effectively created in a mini-genesis event. An emission.
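Roughly along these lines, just to illustrate the closeness part (assuming XOR-style distance; not the actual implementation, which the full write-up will detail):

```python
# Sketch: within the close group that holds a reward-matching chunk, the node
# whose address is XOR-closest to the chunk address claims the emission.

def xor_distance(a: int, b: int) -> int:
    return a ^ b

def claiming_node(close_group: list[int], chunk_addr: int) -> int:
    return min(close_group, key=lambda node_addr: xor_distance(node_addr, chunk_addr))
```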

I’ll let Anselme do the full write-up when he has the time though, it’s his baby.

7 Likes

Yes, this has always been part of the consideration, and one that I’ve argued strongly in favour of. It would most definitely be the simplest solution to just forget about the 70% of supply… but not necessarily the right one.

It’s also been part of the prospectus since the 2014 white paper. That’s not a reason to do it in itself, but an observation that it’s not a last minute bolt-on: it’s been laid out since before the ICO.

Your arguments are not unreasonable… and certainly not if we are looking purely at supply and demand, and a nice and efficient market for storage.

But what if there are bumps in the road to getting there? Such as unforeseen lean spells, or external factors that aren’t the sole preserve of that market, like speculation on the token?

We want a resilient and robust network, one that can weather some storms, and have nodes stick around through them.

Distributing the remaining supply steadily, and over an extended period, to those actively supporting and sticking with the Network come rain or shine, seems like a reasonable way of doing it.

And it does so in a decentralised way too.

7 Likes

I think there is a case to be made that, if the 70% is going to happen, the release would be more manageable if done as a “bell curve” over time instead of a large emission in the beginning that tapers over time.

  1. This would give devs time to adjust the network to quirks as more is released and would fade out in the later years so as not to disrupt the market.
  2. Most users of the network aren’t going to be users in the early days, so an early distribution won’t be as decentralized as possible, it would favor early adopters and while that’s nice for them it’s the same sort of early adopter concentration issue that we are trying to eliminate with the 70% in the first place.
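To make the shape comparison concrete, here’s a toy sketch of the two release curves (the numbers are invented, not from the White Paper):

```python
import math

# Toy comparison of two ways to release the same total over ~30 years.
YEARS = range(1, 31)

def front_loaded(year, half_life=5.0):
    """Bitcoin-like: biggest emission at the start, decaying over time."""
    return math.exp(-math.log(2) * (year - 1) / half_life)

def bell_curve(year, peak=10.0, width=5.0):
    """Ramps up, peaks mid-way, then fades out in the later years."""
    return math.exp(-((year - peak) ** 2) / (2 * width ** 2))

def normalise(weights, total=70.0):
    """Scale a schedule so it releases `total` (% of supply) overall."""
    s = sum(weights)
    return [total * w / s for w in weights]

front = normalise([front_loaded(y) for y in YEARS])
bell = normalise([bell_curve(y) for y in YEARS])
print("year 1 release: front-loaded", round(front[0], 1), "% vs bell", round(bell[0], 1), "%")
```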

What is the reasoning for an early large release and tapering? What considerations were made?

3 Likes

We are postulating that the Network will be most fragile in its infancy, which is where we can likely expect the most volatility, and also relatively lower demand for storage until it is established and proven.

This distribution method will always be as distributed as it can be for any given network size.

Whereas a more centralising effect would be a smaller network requiring more support from a small number of entities that can afford to weather a lean patch. Incentivising a greater number of node operators at smaller network sizes with lower demand is more decentralising, is it not?

Also… define “large emission”? It’s still pretty gradual, and over years…

It’s nothing by comparison to an ICO event that happens over a matter of hours of course!

4 Likes

I confess I don’t know what the planned distribution is - just the overall release curve. If it’s not a huge difference over time then good. I was imagining it was more like bitcoin’s release curve which was pretty severe.

This may well be true, but it doesn’t follow that releasing more early on will help with that. More tokens early on will enable more centralization of the tokens as only those who have a clue about the future of the network will buy and hold the most.

Furthermore, bigger players that want to disrupt the market (pump and dump; rinse, wash, repeat) will find it easier to do so if there are more tokens. So contrary to controlling volatility, it may well increase it.

Sure, distribution to users, but in the marketplace it won’t be - and that was my point.

2 Likes

Isn’t the opposite true though? Subsidising in a smaller network leads to lower barriers to entry due to reduced risk, and more incentive to enter earlier, therefore higher decentralisation due to more diversity of supply?

2 Likes

Really, if Autonomi wants to operate as a true supply/demand free market, Autonomi will need transparent price discovery (maybe, as a cousin of Autonomi’s own forthcoming distributed DNS, we need a distributed DEX, where some nodes choose to optionally run either or both and the network reimburses them at some flat rate for doing so), where all the nodes report their current price to the DEX network overlay, and

there should also be automated node-manager tools which allow the number of nodes to be increased or reduced in an automated fashion in reaction to different price movements (and even price classes - not all storage IOPS is created equal, which is why I like full Proof of Resource testing),

The same automated response to transparent price signalling would apply to the cost of uploads, maybe ‘group’ listed within ‘Close Autonomous Systems’ areas, containing many ‘close and relatively close’ private and public ‘close groups’, where the AS might have some use reporting actual latencies from close group a to not-so-close group z (within the AS).

The network would also need to facilitate client ‘market price sensitive’, easy to use ‘upload automation’ to control the rate of uploading one or more files in a row for the safe client, in response to changes in price discovery as described above.

The above, imo, might well be the best way to embed the behaviours of free market supply and demand, placed into the hands of the buyer and the seller found at the edge of the Autonomi network,

where both the buyer (uploader) and the seller (node operator) of storage space are armed equally with automation to respond to market changes in upload prices in near real time, given a distributed DEX signal is available to each.

I do think PoR (Proof of Resource/Capability) is important to get this type of free market model to work. Such PoR/Capability metrics (ingress/egress bandwidth through the edge router, as well as core/thread count, clock speed, available RAM to operate nodes, and local read/write IOPS speed) should be published to the DEX together with the node’s current storage price. Pairing the price info with the PoR/Capability info effectively creates a ‘storage class’ offer from the renting nodes to the buyer/uploader, so their automation code can adjust the payload size and flow rate of files to the network within their AS and even their close private group.
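For example, roughly the kind of thing such node-manager automation could do (the thresholds and price feed here are entirely hypothetical):

```python
# Hypothetical auto-scaler: adjust the number of nodes an operator runs based on
# the latest storage price observed from whatever price-discovery layer exists.

TARGET_PRICE = 1.0    # operator's break-even storage price (illustrative units)
SCALE_UP_AT = 1.2     # spin up nodes when price is 20% above break-even
SCALE_DOWN_AT = 0.8   # wind nodes down when price is 20% below break-even
STEP = 5              # nodes added or removed per adjustment

def adjust_node_count(current_nodes: int, observed_price: float) -> int:
    """Return the node count to run after seeing the latest market price."""
    if observed_price >= TARGET_PRICE * SCALE_UP_AT:
        return current_nodes + STEP
    if observed_price <= TARGET_PRICE * SCALE_DOWN_AT and current_nodes > STEP:
        return current_nodes - STEP
    return current_nodes
```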

that said,

this type of RT free market idea/concept is likely way outside the scope of what the Autonomi network is able to deliver at the moment. :wink:

Anyway my 2 cents…

3 Likes

Thanks for that.

@Anselme ?
But can the issue be addressed of people knowing the address prefixes, generating chunks whose hashes match those prefixes, and then using the ones that would result in a reward for their own nodes? A modern GPU should be able to hash chunks of a few bytes at massive rates per day.

Remember that chunks do not need to be encrypted before uploading to the network and can be as small as 1 byte; you need 3 if doing self-encryption, because the file is split into at least 3 chunks.

Hashing 1-3200 byte data sets is fast, real fast.

So I could set my cpu and gpu to

  • hash small data sets at very high rates
  • keep the data sets that result in a matching hash to one of the emission addresses
  • keep doing that day after day after day
  • run up nodes on another machine
  • choose data sets that would result in one of my nodes getting the emission and upload it
  • reap the rewards
  • repeat: run up new nodes and repeat the reward (if possible)

I may not find them all, but more than enough to cover the costs of a mining rig repurposed to hashing small data sets, cover the electricity, and make a large pot of tokens.

Remember the addresses cannot be secret since the code is open source.
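Just to show how cheap that grinding would be (the prefixes below are made up, of course):

```python
import hashlib
import itertools

REWARD_PREFIXES = ("00ab3f", "17c9d2")   # whatever the open-source code reveals

def grind(prefixes, max_tries=10_000_000):
    """Brute-force tiny payloads whose hash lands on a known reward prefix."""
    hits = []
    for i in itertools.islice(itertools.count(), max_tries):
        data = i.to_bytes(8, "big")      # a few bytes is all a chunk needs
        if hashlib.sha256(data).hexdigest().startswith(prefixes):
            hits.append(data)
    return hits
```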

1 Like

Really good to see this stated explicitly… and I agree it should be done, even if I see potential problems with the proposed method of delivering it.

I feel it would be great if this motivation for the dilution could be included in the whitepaper, alongside the hoped-for economic function of the emissions that is already written.

I agree resilience is important, and I agree that the 70% emissions should happen in principle. But what if those emissions were to work against resilience rather than for it?

I find it hard to imagine an Internet-like network experiencing significant lean times, assuming good apps and a variety of use cases (video, communications, backups, websites etc etc) and global audience, plus ability to schedule lower priority uploads to trigger if store cost were to dip.

But you did say unforeseen lean times, so assuming one did happen, demand for tokens would be low at that time & price falling. If supply were also increasing alongside this, prices falling even more rapidly could be a bigger turn-off for node operators than receiving a higher number of tokens.

Regarding token speculation, I don’t think emissions would help nodes to stick around for similar reasons - if the token looked weak due to speculation, increasing token supply is the last thing that would be needed to inspire confidence to keep providing resources.

I wonder if it’s more likely that emissions as planned would reduce resilience rather than increase it, due to reducing the clarity of pricing signals, lowering expected future value of current earnings for node operators, and worsening lean-times by increasing token supply while demand is low?

Again, there are many variables, but I don’t think it’s clear that these emissions would boost resilience & they could do the opposite.

But if emissions must happen due to the good principles mentioned, and longstanding plans, are there other ways of doing it that avoid these market related issues?

For example, just chuck the whole 70% into the Royalties Pool with some vesting schedule, so the market for resources can work without interference, and all of the dilution only comes into being by adding serious value to the network ecosystem. A vastly better funded ecosystem might be far better for node operators than a boosted number of tokens earned!

2 Likes