“Depending on the time period under study, bandwidth increases at between 20% and 100% per year. … the large bulk of improvement over time is due to breakthroughs in manufacturing, materials science, semiconductor optics, and signal processing.”
The source is Figure 3 in this 2017 IEEE paper (IEEE Xplore full-text PDF); the whole paper is really fascinating, especially when read from a SAFE Network perspective.
Quoting from the paper itself:
“it is not unreasonable to assume that the DSP underlying a 10-Tb/s interface will be able to fit within a single (or at most within a small number of) CMOS ASICs by 2024.”
The paper talks a lot about the historical and expected increase in demand, as well as how it affects the supply of bandwidth: “network traffic has often been driven by totally unanticipated disruptions”.
It seems the gains from ASICs are not as significant as with Bitcoin, but there are several avenues to explore. The paper discusses “Optical Transport Systems in 2024” in Section III; the potential areas to explore are:
Scaling System Capacity Through Improved Fiber
Scaling Interfaces and Capacity by Advanced Modulation
Scaling System Capacity Through Wavelength Parallelism
Scaling System Capacity in Both Wavelength and Space
I wonder if the gains will be small enough that by ‘fuzzing’ the fastest-response calculation, slower vaults can be kept in the picture without significantly affecting performance.
I still cannot see why all the vaults that have successfully kept the chunk and can verify it in reasonable time cannot be rewarded.
It solves:
the need for “fuzziness” in timing.
the incentive for gaming the speed is reduced a lot.
people with reasonable home computers will get rewarded, which reduces the desire to pimp out their PCs just to compete.
the cost/benefit of pimping out a system and connection is a lot less, since the returns for the fastest system/connection over a decent home system are not as great. It would be expected that pimping out will actually mean earning less profit (and ROI) than a home computer using spare resources.
The reasons to reward all who successfully store and retrieve (sending a signature) in reasonable time are:
The network needs multiple copies
The network does not care if the one machine is always the fastest for a particular chunk
The network needs the other copies for when the fastest fails or is offline (i.e. data security).
Worldwide latency actually helps the home computer, since it’s unlikely that pimped-out machines would be holding every copy of the chunk, especially since being pimped out is not as great a benefit.
The most important point is that the chunk needs to be stored on multiple machines, so it is only fair that all machines that do that successfully be rewarded, since all were needed to do it.
My 2 cents again; in my opinion this solves a number of the issues with changing technologies (bandwidth + PC/storage) and reduces the benefits of pimping out machines, which is limited to the richer group of society (the rich become richer).
The distribution of the rewards is the important thing to get right. For instance:
equal reward almost ruins any incentive for people to improve their machines/connection.
rewarding only the fastest means that there will be a move towards only fast machines on fast connections (centralisation in areas of high speed), as those who spend the money increasingly become the only ones being rewarded.
the right mix means that
home users will get rewarded
the rich will find it more difficult to break even if using pimped-out machines/connections. They would always be fighting an uphill battle to compete with home users. And home users being rewarded for doing the right thing means the vaults will be mostly in the hands of home users.
That makes sense to me. You want competitiveness between vaults to let the network have top performance, but you don’t want centralisation created by ‘super vaults’ (in size or performance). Can a Vault-time-out be an option, where a vault that has successfully delivered x chunks within y seconds gets a time-out ‘penalty’ or pause for a few seconds? That would also affect network speed a bit, but make the Vault system fair for all users.
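The time-out idea could be sketched roughly like this (an illustrative sketch only; the type and field names are hypothetical, not anything from the SAFE codebase): a vault that has served x chunks within a y-second window sits out for a pause before it may compete again.

```rust
use std::time::{Duration, Instant};

/// Hypothetical cooldown tracker: after serving `max_chunks` chunks
/// within `window`, the vault takes a `pause` before serving again.
struct VaultCooldown {
    window: Duration,
    max_chunks: u32,
    pause: Duration,
    served: Vec<Instant>,
    paused_until: Option<Instant>,
}

impl VaultCooldown {
    fn new(window: Duration, max_chunks: u32, pause: Duration) -> Self {
        Self { window, max_chunks, pause, served: Vec::new(), paused_until: None }
    }

    /// Returns true if the vault may serve a chunk at `now`.
    fn may_serve(&mut self, now: Instant) -> bool {
        // Still in the penalty pause?
        if let Some(until) = self.paused_until {
            if now < until {
                return false;
            }
            self.paused_until = None;
            self.served.clear();
        }
        // Forget serves that fell out of the sliding window.
        let window = self.window;
        self.served.retain(|t| now.duration_since(*t) < window);
        if self.served.len() as u32 >= self.max_chunks {
            // Quota hit: start the time-out penalty.
            self.paused_until = Some(now + self.pause);
            return false;
        }
        self.served.push(now);
        true
    }
}

fn main() {
    // x = 3 chunks per 10-second window, 5-second penalty pause.
    let mut v = VaultCooldown::new(Duration::from_secs(10), 3, Duration::from_secs(5));
    let now = Instant::now();
    assert!(v.may_serve(now));
    assert!(v.may_serve(now));
    assert!(v.may_serve(now));
    assert!(!v.may_serve(now)); // fourth request in the window: time-out penalty
}
```

As the post notes, the pause trades a little network speed for fairness between vaults; the tuning question is how x, y and the pause length compare to chunk demand.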
Your home PC vault has been chuntering away for months, 24/7. You have served at least n chunks per day for m months. Your hard-earned vacation is coming up soon. You earn a 2-week break with no loss of status. Presumably with some extra security checking on restart…
It’s all in the ratio of the reward for the fastest compared to the rest. This provides an incentive for people to improve their home computers and, if not set too high, discourages the race to the most pimped-out machines/connections (and thus centralisation).
Just a quick note here that if we were to store SafeCoin amounts as arbitrary-precision rational numbers (storing numerator and denominator), then we could operate with exact amounts when dividing up farming rewards (e.g. 1/3 instead of 0.3333333333333), and would not bump into the division limitations of fixed decimal point when paying out small rewards to, e.g., billions of contributors. So there is no “tiny remainder” like in Superman III or Office Space. Each contributor would receive the theoretically exact amount.
There are at least 2 rust libs for mathematical operations with rational numbers, one of them a wrapper around GMP. I’m talking about internal representation only, used for math operations… user facing display amounts could still be decimal.
Math ops with rational numbers are slower compared to integer ops, but maybe worth it in the long run…?
I haven’t heard of any cryptocurrency doing this yet, btw. anyone know of one?
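To make the idea concrete, here is a toy exact-rational type in plain Rust (standard library only; the real libs mentioned above, such as the GMP wrapper, use arbitrary-precision integers, while this sketch uses fixed-width i128 and omits sign normalisation and overflow handling):

```rust
/// Toy exact rational: numerator and denominator stored separately,
/// always reduced by their gcd. Illustration only, not a real library.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Frac {
    num: i128,
    den: i128,
}

fn gcd(a: i128, b: i128) -> i128 {
    if b == 0 { a.abs() } else { gcd(b, a % b) }
}

impl Frac {
    fn new(num: i128, den: i128) -> Self {
        let g = gcd(num, den);
        Frac { num: num / g, den: den / g }
    }

    /// Exact addition: a/b + c/d = (a*d + c*b) / (b*d), then reduce.
    fn add(self, other: Frac) -> Frac {
        Frac::new(
            self.num * other.den + other.num * self.den,
            self.den * other.den,
        )
    }
}

fn main() {
    // Split 100 coins exactly three ways: each share is 100/3, not 33.333...
    let share = Frac::new(100, 3);
    let total = share.add(share).add(share);
    assert_eq!(total, Frac::new(100, 1)); // the shares sum back to exactly 100
}
```

The point of the sketch: the three shares recombine to exactly 100, with no dust left over, which is the property the fixed-point representation cannot give for divisors like 3.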
There is nothing wrong with doing financial stuff with fixed point integers. It is a well established part of computer science and financial institutions.
We discussed the issues with rounding errors, which is one reason the industry uses fixed-point integers.
I think perhaps you are confusing rational numbers with floating-point numbers. The former represents fractions exactly by storing both the numerator and the denominator and performs division as fractions, giving exact answers; the latter has rounding errors and approximations.
One would use rational number data type exactly because one wants to avoid rounding errors when dealing with ratios/division. (storing amounts as int does not fix the division problem).
We don’t need the former (1/3); the latter (0.333333333) is enough. It is exactly 333333333 nanos in the fixed-point integer representation implemented for safecoins. As @neo said, there is no risk of rounding errors with this representation.
Yes, this isn’t 1/3, but we don’t need this value. And then why not also square_root(2), PI, e, ln(2), … which are not even rational numbers? All of these are values that can be expressed in some math packages like Maple, but they are not needed in the financial domain.
Slower yes, worth no.
Not me, not even in the more general financial domain.
@tfa yes, I understand perfectly well the integer representation, where decimals are calculated only for display. I have built financial apps. I understand that is what cryptocurrency developers are comfortable with. I am simply thinking outside the box for a moment, and expressing the idea so at least it is “out there”.
I brought up the rational number data type because it is seemingly the best fit with a perfectly ideal reward distribution to all that contributed, no matter how small the contribution – as I described above. I am not saying we should use them, only that it is interesting to think about in that context.
A rational number reward is provably more equitable, even for a small number of participants. Let’s say we have a reward of 100 SafeCoin to distribute amongst 3 farmers. Using a rational number, each receives 33 1/3, and all is well. Using the integer representation, rounding must occur. I will use 2 decimal places for brevity. So the first two receive 33.33. The last one may receive either 33.33 or 33.34. If the former, that is “equitable” but each party has been robbed of a teeny amount they earned and a unit of currency has been lost/dropped. How do we account for it? If the latter, it is simply not equitable, as one party received more than the two for the same amount of contribution.
This may seem an inconsequential amount, particularly with 64 bit (or larger) ints, but to me it is still interesting to think what if we can solve it exactly instead of constantly fudging the numbers at various levels of code.
Slower yes, worth no.
That seems a value judgement / bias. To me, having exact numbers just feels “right”. The only reason NOT to would be if we simply cannot due to a technical constraint. The most obvious might be a performance issue. But it is also quite possible that the overhead of other operations (such as signature checking, hashing, etc.) will be comparatively so huge that int vs rational makes approximately zero difference in practice for transactions. We wouldn’t really know until we benchmark both options, other things being equal.
Not as a direct response to this quote, more to indicate the scale of how I’m thinking - Spread Networks is many orders of magnitude more significant than a ‘pimped out machine’, having built a $300M fibre cable directly between Chicago and New York for high frequency trading, for a ‘tiny’ 1.5 ms advantage (taking a 14.5 ms trip to 13 ms).
This is the sort of stuff that will arrive at the SAFE network (ie large scale industry level developments). I’m not saying it’s a problem, but that it’s worth considering and designing the economy to work with these people so that their development ends up being to the benefit of everyone not just for the operators. I’m not suggesting it’s easy, but a really interesting thing to think about.
I’m sure there are many other examples, especially in microwave technology which doesn’t suffer from latency losses the way fibre does.
Interesting idea to think about, it’s a nice perspective to look at the problem from.
Not sure how this would work in the user interface.
Also numerators and denominators have some limit, eg u64, so when someone tries to have a wallet with very large unequal denominators it requires rounding, eg 1 / MAX added to 1 / (MAX-1) cannot be added because the denominator cannot become large enough.
I would imagine that amounts would be shown in decimal in the UI, possibly with a toggle or tooltip to show the “real” fractional amount. The key thing is that all math operations would take place using the rational number data type.
Also numerators and denominators have some limit, eg u64, so when someone tries to have a wallet with very large unequal denominators it requires rounding, eg 1 / MAX added to 1 / (MAX-1) cannot be added because the denominator cannot become large enough.
That’s why I specified arbitrary precision rational number. The idea is to use arbitrary precision numbers for both numerator and denominator, which is what GMP does, for example. No rounding required.
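The overflow point above is easy to check: adding 1/MAX and 1/(MAX-1) gives (2·MAX − 1) / (MAX·(MAX − 1)), and since MAX and MAX−1 share no common factor, the fraction is already in lowest terms. A quick sketch (using u128 just to hold the product) shows the denominator cannot fit in a u64, which is why fixed-width rationals must round and arbitrary precision is needed:

```rust
fn main() {
    // 1/MAX + 1/(MAX-1) = (2*MAX - 1) / (MAX * (MAX - 1)).
    // gcd(MAX, MAX-1) = 1, so the fraction is already fully reduced:
    // the denominator genuinely needs MAX * (MAX - 1).
    let max = u64::MAX as u128;
    let denom = max * (max - 1);
    assert!(denom > u64::MAX as u128); // does not fit in a u64 denominator
}
```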
Yes, it is an insignificant amount. In your example, 2 people receive 33.333333333 and the third one receives 33.333333334. How much will 0.000000001 ever be worth? Some people have big dreams, but I am a realist, and this will always be dust.
And the contract agreed by the participants doesn’t need to be 1/3 for each but can approximate this. There can be a clause that stipulates that rounding dust is given to a participant chosen randomly or given to the network or to a mutual fund, ….
I have no specific examples of this, but I think that iterative operations can reach the physical limits of the numerator and denominator (for example daily compound interests over several years).
Same here: we don’t need them because the contract can stipulate explicitly that rounding dust is to be credited to the bank (for example) and the next step starts from the rounded value.
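In the fixed-point representation, the dust-to-the-network clause described above is a one-line policy. A minimal sketch, assuming 10^9 nanos per coin (the function and account names are illustrative, not SAFE APIs):

```rust
/// Split `total` nanos equally among `n` recipients; the indivisible
/// remainder ("dust") is credited elsewhere per the agreed contract,
/// e.g. to the network. Illustrative only; assumes n > 0.
fn split_with_dust(total: u64, n: u64) -> (u64, u64) {
    (total / n, total % n) // (each recipient's share, dust for the network)
}

fn main() {
    // 100 coins = 100_000_000_000 nanos, split among 3 farmers.
    let (share, dust) = split_with_dust(100_000_000_000, 3);
    assert_eq!(share, 33_333_333_333); // each farmer's share in nanos
    assert_eq!(dust, 1);               // 1 nano credited to the network
}
```

Integer division plus modulo guarantees share * n + dust == total, so nothing is ever lost or created; the contract only has to say who gets the dust.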
@tfa I find it interesting to consider/discuss theoretically exact calculations without need for handling special cases every time there is a division operation. If you don’t, that’s fine. The integer approach and its tradeoffs is well understood.
You need to examine 70 years of financial programming; if you do, you will see all these arguments and will always find that, in the end, fixed point has a lot of advantages. Not bias, but historical experience.
As @tfa says, the real need for a 1/3 situation does not happen all that often. Also, we have lived successfully and happily without needing 1/3 precise to the infinite place: 1/3 of a $1000 deal in which no one is concerned with the 1 cent left over, or the micro-dollar in a financial account.
Also, people are taught decimal currency from before school, and that is what they understand. The populace is not taught rational numbers for currency or similar, and thus rounding can occur in the human interaction. When thinking of money and currency, if I say to 7 people that I am going to split 1/11th of the proceeds with them, then rational numbers become meaningless to the general person. If the proceeds are, say, $77, it is easier to just say I am going to give them all $1.00.
Also, as @tfa says, the ninth place will always be dust, and if EVER it is not, then we will need to move from 64-bit integers to 96- or 128-bit integers long before that happens.
Also, speed is always preferred over a perfectionist’s precision.
Never slow things down to get infinite precision when practical precision is all that is needed.
And as a side note, the time used to implement rational numbers is taking away from implementing the essentials of the network, when it is of questionable benefit/detraction.
@neo thx, you’ve summarized the status quo “think” quite nicely.
Though if you actually have a link to meaningful discussion or research around using a modern rational number data type library for a financial application, I’d be interested to read that.
the real need for 1/3 situation does not happen all that often
If you say so, but going by my own experience, the need for division has occurred in several financial apps I’ve worked on, and the rounding solution is always gross to me, no matter how many extra units of precision are used.
the time used to implement rational numbers is taking away from implementing the essentials of the network
Don’t worry… no one is implementing rational number support in SAFE now, and doubtful ever will be. It’s just a little thought experiment I put out into the universe in the context of a perfectly fair reward to all contributors, which is itself just a thought experiment in this thread, and something I’ve contemplated since early days of Bitcoin.
If someone were to implement it, it’s a simple matter to use rug::Rational for the amount in a Transaction. Though I’m not sure whether that would play well with AT2, so there’s that.
Actually, because this is a well-established science (computer science), I feel the need is for someone to show the factual need for rational numbers in financial accounts, beyond infinite precision or it being a cool idea.
In my post I pointed to the reasons: specifically the way financial systems work, how people actually interact, and the way humans are educated from before school to work with fixed-point finances. Both of these are evidence for there being no practical need for rational numbers in finance in our world system.
Remember, we need to program financial account-keeping (coin balances etc.) for the masses out there, and not for a few rather rare special situations. When we try to be novel with finances we often create problems for the users of that system; this is from experience. It needs to feel familiar and easy for the layman and the accountant more than for us engineers/scientific people. It’s finances.
Now, for other scientific, engineering and mathematical purposes, rational numbers would be great.
Having done years (> 2 decades) in scientific, technology and systems programming, and over a decade or two doing financial programming, I see huge differences between the computer science of the two.