Exploration of a live network economy

(Edits made to update for an error in the data)

Next iteration

Each iteration of models will be given a sequential number, and we will simply call them Version 1 .. n. The first proposal was Version 1.

Version 1

Results from v1 simulations are available online in this excel.

After days of exploring the area around where the needle was dropped on the map, I have gathered some insights about that specific solution, which make me believe that there is some fundamental problem with it. So it is time to move on to some other location on the map.

Insights

It was interesting to see what effect these interrelations would have on the system. Digging into this model, and implementing the simulation, gave a better understanding of the various parts. The untried combinations of variables, and the tuning of their values, are still vast. However, as initially suspected, one of the problems was that so many variables were included, which made tuning difficult and - as it seems - made it hard to reach stability in the system.
For that reason, moving on to another version, we want to try something more lightweight.
I do not rule out the usefulness of some iteration of Version 1, but I want to continue the exploration, and perhaps return to this direction at some later point in time.

Version 2

(Results from v2 simulations can be found online in this excel.)

Browsing earlier suggestions, I decided to implement a suggestion for StoreCost from way back in 2015.
For FarmingReward, I made a modification of RFC0012. The reason I did not take RFC0012 as-is, is that I have not been able to reproduce a success rate - based on the modulo of a random hash and FarmingDivisor - that both corresponds to the RFC and gives a result that does not blow up. I can discuss this topic further if someone is interested. But basically, since coins are now divisible, all I needed was the sought success rate, which I could multiply with the reward, to get the equivalent of the probabilistic reward.

So, let’s look at the modified R.

First, a recap of variables:
s = Sections count
f = Filled storage percent (formerly d)
u = Unfarmed coins percent
R = Farming reward
C = Store cost
b = nanosafes per coin

FarmingReward R

The reward is composed as follows:

R = (u * f * b) / s

And the code:

public override Coins FarmingReward(Network network)
{
    // R = (u * f * b) / s
    return network.UnfarmedCoins * network.PercentFilled * (1 / (decimal)network.Sections.Count) * Coins.One;
}
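As a rough sanity check of the magnitudes involved, here is a minimal standalone sketch of the formula, with made-up values for u, f, b and s (these are illustrative numbers, not simulation output):

```csharp
using System;

class FarmingRewardExample
{
    // R = (u * f * b) / s
    public static decimal Reward(decimal u, decimal f, decimal b, decimal s)
        => u * f * b / s;

    static void Main()
    {
        decimal u = 0.85m;          // unfarmed coins percent (made-up)
        decimal f = 0.45m;          // filled storage percent (made-up)
        decimal b = 1_000_000_000m; // nanosafes per coin (assumed 10^9)
        decimal s = 1000m;          // sections count (made-up)
        Console.WriteLine(Reward(u, f, b, s)); // reward in nanosafes per GET
    }
}
```

With these values the reward comes out at 382 500 nanosafes per GET, which is in the same ballpark as the figures discussed in the results below.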

The farming reward in RFC0012 includes f in the calculation of FarmingRate, then does one probabilistic execution based on a random hash and the inverted FarmingRate (i.e. FarmingDivisor), and then another probabilistic execution by checking whether the coin exists or not - so, basically, based on u.

The difference now is that we have replaced the first probabilistic part with a success rate that is proportional to how much storage is used. They are not equivalent, but RFC0012 does base this part on the same variable. However, since I was not able to properly estimate the statistical outcome of that solution, this change was introduced instead.
The second probabilistic part is equivalent: it gives a success rate based on the percent of unfarmed coins, which over time should yield about the same results as trying to create a coin at an address that does not already exist.
In addition to these, the element of network size has been included, to give a gradual decrease of R as the network grows. Simply put, the reward is divided by the number of sections. The motivation for this is the same as in Version 1.
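To illustrate why multiplying by a success rate should, over time, give about the same result as a probabilistic execution, here is a minimal standalone sketch. The all-or-nothing "attempt" model below is my own simplification for illustration, not the RFC0012 algorithm itself:

```csharp
using System;

class ProbabilisticVsDeterministic
{
    // Deterministic variant: scale the reward by the success rate u.
    public static decimal Deterministic(decimal reward, decimal u) => reward * u;

    // Probabilistic variant: average payout over many all-or-nothing attempts,
    // each paying the full reward with probability u.
    public static decimal Probabilistic(decimal reward, double u, int attempts, int seed)
    {
        var rng = new Random(seed);
        decimal total = 0m;
        for (int i = 0; i < attempts; i++)
            if (rng.NextDouble() < u)
                total += reward;
        return total / attempts;
    }

    static void Main()
    {
        decimal reward = 1_000_000m; // nanosafes, made-up value
        Console.WriteLine(Deterministic(reward, 0.85m));
        Console.WriteLine(Probabilistic(reward, 0.85, 100_000, seed: 42));
    }
}
```

The two printed figures converge as the number of attempts grows, which is the sense in which the deterministic scaling is equivalent "over time".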

StoreCost

This is an exact implementation of a suggestion made by @Seneca in 2015: Early demand, and effect on storage costs, after launch - #24 by Seneca

The following is an excerpt:

  1. Close groups must track the total amount of safecoins issued as rewards, and must track the total amount of safecoins absorbed from PUTs
  2. Based on the amount of issued safecoins and the total amount of safecoins in circulation, a target figure of the total amount of safecoins absorbed can be computed
  3. On every PUT, if the actual amount of safecoins absorbed is lower than the target figure, the PUT cost is increased, else if the actual amount of safecoins absorbed is higher than the target figure, the PUT price is decreased.

Let’s put it together. Step 1 and 2:

I = total SafeCoins issued
A = total SafeCoins absorbed
TA = target total SafeCoins absorbed
S = supply of SafeCoin (0.0-1.0)

TA = I * S

S makes sure that the rate of increase in SafeCoin supply (i.e. inflation) tapers off as we approach the cap. At the cap, S == 1.0 , so then the target total SafeCoins absorbed is exactly equal to total SafeCoins issued . Since there may be times when the farming rate algorithm suddenly has to increase rewards, we probably want to keep a buffer of reserve SafeCoins for such times. If we want to keep 10% of SafeCoins in reserve, the formula becomes:

TA = I * (S + 0.1)
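As a hedged numeric sketch of this target formula (the figures are illustrative, picked to match the 30 % starting supply mentioned further down, not simulation output):

```csharp
using System;

class TargetAbsorbedExample
{
    // TA = I * (S + buffer), from the quoted proposal.
    public static decimal TargetAbsorbed(decimal issued, decimal supplyFraction, decimal buffer)
        => issued * (supplyFraction + buffer);

    static void Main()
    {
        decimal I = 0.3m * 4_294_967_296m; // total safecoins issued, assuming 30 % of the 2^32 cap
        decimal S = 0.3m;                  // supply fraction (0.0-1.0)
        Console.WriteLine(TargetAbsorbed(I, S, 0.1m)); // target total safecoins absorbed
    }
}
```

With a 10 % buffer, the network would aim to have absorbed 40 % of what has been issued at this point in the supply curve.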

Step 3:

MB/SC = Megabytes bought per SafeCoin

if (TA > A) {            //fewer SafeCoins have been absorbed than we want
    MB/SC--;             //So increase the PUT price to start absorbing more of them
} else if (TA < A) {     //More SafeCoins have been absorbed than we want
    MB/SC++;             //So decrease the PUT price to start absorbing less of them
}

And the code implementation for this is:

public override Coins StoreCost(Network network)
{
    // TA = I * (S + 0.1), with network.TotalPaid as I
    var targetTotalSafecoinRecycled = (network.CoinsSupply + 0.1m) * network.TotalPaid;
    if (targetTotalSafecoinRecycled > network.TotalPaid)      // fewer coins absorbed than targeted
        --_chunksPerSafecoin;                                 // so increase the PUT price
    else if (network.TotalPaid > targetTotalSafecoinRecycled) // more coins absorbed than targeted
        ++_chunksPerSafecoin;                                 // so decrease the PUT price
    return new Coins(Coins.One.Value / _chunksPerSafecoin);   // cost per chunk (MB)
}

where _chunksPerSafecoin is initialised to correspond to 11 134 nanosafes per MB (based on preliminary voting results in Polls: How much will you spend? How much storage do you need? etc)
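Given the return statement above, the per-MB price and _chunksPerSafecoin are related by a simple inversion. A small sketch, assuming b = 10^9 nanosafes per coin (my assumption, consistent with the variable list earlier):

```csharp
using System;

class ChunksPerSafecoin
{
    const decimal NanosafesPerCoin = 1_000_000_000m; // assumption: b = 10^9

    // nanosafes per MB -> MB (chunks) per safecoin, and vice versa,
    // since cost per chunk = NanosafesPerCoin / chunksPerSafecoin.
    public static decimal FromPricePerMb(decimal nanosafesPerMb)
        => NanosafesPerCoin / nanosafesPerMb;

    static void Main()
    {
        // If the quoted 11 134 nanosafes per MB is the resulting price,
        // the corresponding divisor is roughly 89 815 chunks per coin.
        Console.WriteLine(FromPricePerMb(11_134m));
    }
}
```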

You might spot the constant 0.1 on the first line in the code block; it says that we are aiming for a 10 % buffer of unfarmed coins. It comes from the proposal, but we will also try this with the previously stated goal of keeping 50 % unfarmed as a buffer.

Here is rest of the post:

A great benefit of this approach is that we actually have control over inflation now. Unlike in BTC where the inflation rate is a function of time (block count), with this algorithm the inflation rate is a function of usage of network resources. More usage (growth of the network) increases the inflation rate, less usage decreases the inflation rate.

Since we start with 30%(?) of SafeCoins already in existence, I should be initialized at 0.3 * 2^32, and A should probably be initialized so that TA == A where S = 0.3 .

MB/SC can be initialized at a guesstimate number, the algorithm would quickly correct it to the right value.

Method

Initial values

InitialUsers: 5000
TotalSupply: 4294967296:0
InitialSupply: 644245094:367787776
InitialUserCoins: 128849:18873557
Unfarmed: 3650722201:632215000
InitialUserChunks: 100000
InitialUsersPerVault: 3
ReadWriteRatio: ReadWriteRatioNo3
UsersPerVaultRatio: UsersPerVaultRatioNo3
ActionsPerUserDay: 100
GrowthRate: DemandBasedNo6
FarmingAlgo: RFC12Seneca
CachedFarmingAlgo: True
VaultSize: 500000
DaysSimulated: 3650
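The amounts above use a `coins:nanosafes` notation. Assuming one coin is 10^9 nanosafes (my reading of the format, consistent with b in the variable list, not something the simulator guarantees), a value like `644245094:367787776` can be read as a single decimal amount of coins:

```csharp
using System;
using System.Globalization;

class CoinNotation
{
    // Parses "whole:nano" into a decimal number of coins,
    // assuming 10^9 nanosafes per coin.
    public static decimal Parse(string value)
    {
        var parts = value.Split(':');
        var whole = decimal.Parse(parts[0], CultureInfo.InvariantCulture);
        var nano = decimal.Parse(parts[1], CultureInfo.InvariantCulture);
        return whole + nano / 1_000_000_000m;
    }

    static void Main()
    {
        Console.WriteLine(CoinNotation.Parse("4294967296:0"));        // TotalSupply: 2^32 coins
        Console.WriteLine(CoinNotation.Parse("644245094:367787776")); // InitialSupply in coins
    }
}
```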

Simulation code

    public void Start()
    {
        var days = Parameters.DaysSimulated;
        var actionsPerUserDay = Parameters.ActionsPerUserDay;
        var growthRate = Parameters.GrowthRate;

        var sw = new Stopwatch();

        Report(-1, 0);

        for (int i = 0; i < days; i++)
        {
            sw.Restart();

            var nodeCount = _network.TotalNodeCount;
            var newVaults = (int)(nodeCount * growthRate.GetRateFor(i, _network));
            for (int j = 0; j < newVaults; j++)
                _network.AddVault();

            var totalVaults = nodeCount + newVaults;
            var usersPerVault = Parameters.UsersPerVaultRatio.GetRatioFor(i, _network);
            var totalUsers = (int)(usersPerVault * totalVaults);

            Parallel.For(0, totalUsers, s => Action(i, actionsPerUserDay));

            sw.Stop();
            
            Report(i, sw.ElapsedMilliseconds);

            TryResetCache();
        }

        Output();
    }

and

    void Action(int day, long actionsPerUserDay)
    {
        var ratio = Parameters.ReadWriteRatio.GetRatioFor(day, _network);
        if (ratio > StaticRandom.NextDouble())
            _network.Get(actionsPerUserDay);
        else _network.Put(actionsPerUserDay);
    }

Market model

Growth rate

public class DemandBasedNo6 : GrowthRate
{
    readonly UsersPerVaultRatioNo3 _usersPerVaultRatio = new UsersPerVaultRatioNo3(3);
    const int year = 365;

    public override double GetRateFor(int day, Network network)
    {
        var d = (double)network.PercentFilled;
        var f = Math.Pow(d + 1, 2) - 1;
        var disallowMultiplier = d * f;
        return disallowMultiplier * GrowthRateMultiplier(day, network) * DailyRate(day);
    }

    double GrowthRateMultiplier(int day, Network network)
    {
        var c = network.StoreCost();
        var r = network.FarmingReward();
        var t = (double)(c / r);
        var m = 1 - t;
        var u = _usersPerVaultRatio.GetRatioFor(day, network);
        return day >= 365 ? m * u : m * (365 - day);
    }

    double DailyRate(int day) => YearlyRate(day) / year;
    double YearlyRate(int day)
    {
        if (365 > day) return 0.20;
        else if (730 > day) return 0.16;
        else return 0.12;
    }
}
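A side note on DailyRate above: since the daily rate is applied multiplicatively to the node count each day, the effective yearly growth ends up slightly above the nominal YearlyRate. A small sketch, using the 12 % figure:

```csharp
using System;

class CompoundingCheck
{
    // Effective yearly growth factor when a nominal yearly rate is applied
    // as rate/365 every day, multiplicatively.
    public static double EffectiveYearlyFactor(double yearlyRate)
        => Math.Pow(1 + yearlyRate / 365, 365);

    static void Main()
    {
        Console.WriteLine(EffectiveYearlyFactor(0.12)); // ~1.1275, i.e. ~12.75 % effective yearly growth
    }
}
```

The difference is small at these rates, so the nominal figures are used in the discussion below.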

Users per vault

class UsersPerVaultRatioNo3 : UsersPerVaultRatio
{
    readonly InitialRatioTimeChangeNo1 _ratioTimeChange;
    readonly ReadWriteRatioNo3 _demand = new ReadWriteRatioNo3();

    public UsersPerVaultRatioNo3(double initialRatio)
        => _ratioTimeChange = new InitialRatioTimeChangeNo1(initialRatio);

    public override double GetRatioFor(int day, Network network)
    {
        var c = network.StoreCost();
        var r = network.FarmingReward();
        var demandWeight = (double)(r / c);
        var g = GrowthRateMultiplier(day, network);
        return demandWeight * g * _ratioTimeChange.GetRatio(day);
    }

    double GrowthRateMultiplier(int day, Network network)
        => 1 - _demand.GetRatioFor(day, network);
}

class InitialRatioTimeChangeNo1
{
    readonly double _initialRatio;

    public InitialRatioTimeChangeNo1(double initialRatio)
        => _initialRatio = initialRatio;

    public double GetRatio(int day)
    {
        if (180 >= day) return _initialRatio;
        else if (day > 180 && 365 > day) return 2.2 * _initialRatio;
        else return 3.3 * _initialRatio;
    }
}

Read-write ratio

class ReadWriteRatioNo3 : ReadWriteRatio
{
    public override double GetRatioFor(int day, Network network)
    {
        var c = network.StoreCost();
        var r = network.FarmingReward();
        var w = (double)(c / r);
        var t = w * Math.Pow(1 + w, 2);
        return Sigmoid(t);
    }

    double Sigmoid(double x) => 1 / (1 + Math.Pow(Math.E, -x));
}

Results

Data points can be found online in this excel.

Discussion

Storage

Filled storage percent f rose quite sharply to near 45 % and then stayed roughly there throughout the 10 years simulated. A slight decline was seen from that point, which is not in accordance with the target of balancing around 50 %, so this indicates that the models need further work.

Market

Even though the models for GrowthRate, UsersPerVault and ReadWriteRatio had been fine-tuned in Version 1 to resemble a realistic market - responding to changes in the price of storage as well as the reward - it was surprising to see that the system was very stable already in the very first simulation, with no tweaking done. It seems likely this can be attributed to the previous fine-tuning work.

Growth rate

A growth rate of 12 % per year, after the very early stages of the network have passed (2 years), seems like a reasonable rate. This is roughly the growth in internet users seen between 2005 and 2007.
In later stages (>10 years), the network would probably, much like internet adoption, see an additional decline in the yearly growth rate.

Clients

The number of clients rising so sharply, and then quite steadily falling, is a result of the initially very cheap store cost - as determined by the ratio of C to R (this ratio is the model's measure of how cheap C is) - and a gradual evolution of the network with lots of uploads, in combination with a modeled decline in the users-per-vault ratio.
One could argue that the decline in users per vault is not realistic. The idea has been that it is due to increased adoption of the practice of running a vault. One would then expect the initial mass of clients to transition into vaults, but that is not what we see. Instead, in the later stages of the simulation, these clients are no longer users of the network. A perhaps far-fetched after-the-fact explanation could be that these are users who took advantage of the very cheap storage to upload a lot of data, but later do not use the services of the network to any larger extent. Rather, they just keep their backups for some undefined point in the future.

Store cost

This number is quite steady, and at all times significantly below the farming reward. If we take the farming reward as a baseline indicator of safecoin fiat value (meaning that as the reward decreases, the fiat value is indicated to increase), we can conclude that Version 2 - as intended - gives an initially very cheap store cost, with a gradual increase towards the real market value.

Farming reward

An initial spike up to 6 million nanosafes per GET (0.006 safecoin) at day 7 after launch is followed by a sharp drop to 218,000 nanosafes after about 323 days - a 96 % drop in less than a year. We then see a surge, which is a result of the market model, after which a slower and steady decline continues until the end of the simulation.

Unfarmed coins

In this model too, it seems it will take many years before we get close to 50 %. This simulation used 10 % as the buffer target, which would take even longer to reach.

Vaults

810,000 vaults in 10 years seems like a somewhat pessimistic estimation of growth.
Previous simulations have gone up to a maximum of 24 million vaults (and 1.2 billion clients) in only 2 years. Simulations take much longer at that size, and reaching 10 years with such a large population would probably take weeks, maybe even months or more with continued growth. It is hard to say what a realistic adoption rate is, but it seems fair to believe the number is somewhere between these two values.

Further work

Improvements to the models of user behavior and the market are definitely needed; these are very primitive models. Preferably, the models should be based on observations, data sources, and perhaps existing work in a similar domain. It would also be desirable to try various levels of irrationality and dysfunction in the market, to determine the resilience of the economic model.
