Continuing the discussion from Early demand, and effect on storage costs, after launch:
@dirvine, I love(d) the show Futurama. They did an episode called "The Farnsworth Parabox" - Farnsworth being the resident scientist - in which they illustrated the Parallel Universe Theory, a sub-theory of the multiverse theory.
What you're talking about is something that becomes a part of the very thing it created.
Which is close to the concept of some forms of panentheism:
…the cosmos [Network code] exists within God [the Network itself], who [which] in turn “transcends”, “pervades” or is “in” the cosmos.
– Wikipedia - Panentheism
[…SPOILER ALERT…]
Now, in the episode - to exemplify the point - there existed two boxes. The box in Universe-A contained Universe-B, and conversely, the box containing Universe-A was located in Universe-B.
Hilarity ensues, and we end up with the people in each universe reaching inside their respective boxes and trying to pull them inside-out from the other side.
Back at Planet Express, both Universe-1 and A pull their respective boxes from the inside-out. Essentially, Universe-A has a box containing their own universe. Bender shakes the box a little, causing an earthquake. As Fry sits on the box to watch television, the universe momentarily folds in on itself. [Think fullscreen → widescreen]
– Futurama Wiki - The Farnsworth Parabox
In the Network’s case, what you have is the code that governs the network completely contained within the Network itself. It’s like the Neo of apps, except that he would now be the network, and not merely of the network. The Neotwork.
Now my thought is that, typically, in these situations, it's up to the internals of the universe to protect that which is the universe itself - the Neotwork (ha!) - and distributed decentralization does just that. It entrusts it completely to the entire populace.
As an aside, since it can be held by any of the populace at any given time, it can be said to be held by all of them with equal probability, no matter what any given party knows.
So in the Neotwork's case, its inherent design is protected within its own implementation. At this point we reach homeostasis. Or, more accurately, equilibrium. But perhaps even that isn't accurate enough.
It's more of a mix of the two. As for the homeostasis argument: it cannot be destroyed. It's public and distributed throughout the network like any other file. So once it exists inside the network - given that the network has a critical mass of users - it can't be destroyed. So it - despite entropy/time - statically exists inside the network. Therefore it achieves homeostasis from within itself.
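To make the "can't be destroyed" point slightly more concrete, here's a minimal sketch of content-addressed storage - the general technique by which a network can store any file, its own code included, as self-verifying, replaceable chunks. The names here (`ChunkStore`, `put`, `get`) are hypothetical and purely illustrative, not the actual SAFE/MaidSafe API.

```python
import hashlib

# Minimal sketch (hypothetical names): each chunk of data - the network's
# own code included - is addressed by the hash of its contents, so every
# copy anywhere on the network is self-verifying and interchangeable.
class ChunkStore:
    def __init__(self):
        self._chunks = {}  # address (hex digest) -> raw bytes

    def put(self, data: bytes) -> str:
        """Store a chunk under the hash of its own contents."""
        address = hashlib.sha256(data).hexdigest()
        self._chunks[address] = data
        return address

    def get(self, address: str) -> bytes:
        """Fetch a chunk and verify it still matches its address."""
        data = self._chunks[address]
        assert hashlib.sha256(data).hexdigest() == address, "corrupted chunk"
        return data

# The network's code stored like any other file: as long as one honest
# replica of each chunk survives somewhere, the code survives.
store = ChunkStore()
addr = store.put(b"fn run_network() { /* the code that governs the network */ }")
print(addr, store.get(addr))
```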
But it can also be manipulated from within. It can be improved upon, and it can improve its own environment. Only once we have a working upgrade merge process can the network be improved upon from inside its own environment - collectively. Therefore it achieves equilibrium inside itself. However, that equilibrium shifts instantaneously, and the changes are imperceptible.
To do this, the upgrade merge process has to be tested against every aspect of the network. It can then upgrade flawlessly without breaking usability. And it's instantaneous as well - there's approximately zero recognition that the upgrade was applied. So one may say it's a static equilibrium.
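As a rough sketch of the idea (nothing here is a defined mechanism of the network - the class and function names are invented for illustration): a candidate upgrade only replaces the running code if it passes the whole test suite, and the swap itself is a single reference change, so users see no interruption.

```python
import threading

# Hypothetical illustration of the "upgrade merge" gate described above.
class UpgradableNetwork:
    def __init__(self, code_version: str):
        self._lock = threading.Lock()
        self._active = code_version

    def try_upgrade(self, candidate: str, test_suite) -> bool:
        """Apply `candidate` only if it passes every test; otherwise reject it."""
        if not all(test(candidate) for test in test_suite):
            return False  # would break usability, so never applied
        with self._lock:
            self._active = candidate  # single swap: no visible downtime
        return True

# A "test" here is just any callable that inspects the candidate code.
net = UpgradableNetwork("v1")
print(net.try_upgrade("v2", test_suite=[lambda code: code.startswith("v")]))
```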
At this point the Network is an App of itself (as explained above): it is modified, maintained, and upgraded as one. Core devs will just be serving the same purpose as any other App dev: they get users using their product, and they get rewarded for that use (assuming an App-rewarding scheme approaching perfection). But the reward is for the users' use of each improvement/new feature.
Using this thinking, we can see how App rewards can approach zero given sufficient entropy.
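As a purely made-up illustration of "approaching zero" (the actual reward scheme isn't defined here, and the decay rate below is an arbitrary placeholder): a feature's reward share could fall off toward zero as time/entropy passes and newer features dilute its share of total usage.

```python
import math

# Hypothetical per-feature reward that decays toward zero over time/entropy.
def feature_reward(initial_reward: float, age: float, decay_rate: float = 0.5) -> float:
    """Reward attributed to one feature after `age` units of time/entropy."""
    return initial_reward * math.exp(-decay_rate * age)

for age in range(0, 11, 2):
    print(age, round(feature_reward(100.0, age), 4))
# The reward never hits exactly zero, but it approaches zero - which is all
# the argument above needs.
```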
Now, this benefit of the network is not yet present. There are a couple of items required:
- Path to Merge Upgrade to App Codebase
- App Reward Mechanism
Once these mechanisms sufficiently approach perfection, core dev payments will look exactly like app devs': a percentage based on the new features added, which approaches zero over time/entropy for each new feature.