Well, I’m happy that royalties are gone. They added complexity, and not just in code but also for people trying to determine market valuation. The KISS principle is best. I’ve been against such things from the early days.
Now if I could just get the team to wake up to the reality that forever data puts the entire network at risk of a node-implosion collapse.
We need to be rid of the idea of permanent data. Nothing in nature is permanent - not even atoms … the higher the complexity of the beast, the less capable it is of maintaining fixed data as part of its system - let’s call that Tyler’s postulate. The complexity of the network is such that permanent data will end up being a cancer that kills it in a few years’ time.
and if you’re right with your postulate that permanent data storage is an issue, then a countdown that can be extended by paying for it can be added to every chunk later on as well
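something roughly like this, maybe … (just a rough Rust sketch of the idea - every type, field, and number here is made up for illustration, nothing from the actual codebase):

```rust
// Rough sketch only: types, fields, and prices are all hypothetical.
use std::time::{Duration, SystemTime};

/// Illustrative price, in arbitrary token units, to keep one chunk alive for one day.
const PRICE_PER_DAY: u64 = 1_000;

struct ChunkLease {
    chunk_address: [u8; 32],
    expires_at: SystemTime,
}

impl ChunkLease {
    /// Extend the countdown in proportion to the payment attached to the request.
    fn extend(&mut self, payment: u64) {
        let extra_days = payment / PRICE_PER_DAY;
        self.expires_at += Duration::from_secs(extra_days * 24 * 60 * 60);
    }

    /// Nodes would only consider garbage-collecting a chunk once its lease has lapsed.
    fn is_expired(&self, now: SystemTime) -> bool {
        now > self.expires_at
    }
}

fn main() {
    let mut lease = ChunkLease {
        chunk_address: [0u8; 32],
        expires_at: SystemTime::now(),
    };
    lease.extend(30 * PRICE_PER_DAY); // pay to keep the chunk for ~30 more days
    println!(
        "chunk {:?}… expired: {}",
        &lease.chunk_address[..4],
        lease.is_expired(SystemTime::now())
    );
}
```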
until this moment in time, Moore’s Law can be considered to be working out pretty well, which would mean forever storage shouldn’t be an issue … unless storage growth stops at some point in the future …
It’s not storage capacity that is the issue though. The implosion happens because of costs, demand, and the price of the token.
If demand drops off significantly at any point in time, then the token price drops, but at the same time the network increases the cost to upload data (as data doesn’t expire on the network) … so costs stay fixed while less and less data is uploaded and the token price keeps falling … meaning nodes will start shutting off, which again pushes up the price to store data, further lowering demand.
It’s a doom loop death spiral network implosion.
Data needs to expire in order to keep the price for data from going exponential and destroying demand.
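To make the loop concrete, here’s a toy simulation (in Rust; every number and relationship below is my own illustrative assumption, not a measured property of the real network). The only point is the direction things move once a demand shock hits and stored data can never be shed:

```rust
// Toy model of the feedback loop described above; all parameters are invented.
fn main() {
    let fiat_cost_per_chunk = 0.10; // fiat revenue a node needs per stored chunk
    let mut token_price = 1.0_f64;  // fiat value of one token
    let mut demand = 60.0_f64;      // uploads per period, after a ~40% demand shock from 100
    let mut nodes = 1_000.0_f64;    // active nodes

    for period in 1..=8 {
        // Token-denominated upload price: fiat costs are fixed, so a falling
        // token price pushes the quoted price (in tokens) up, never down.
        let price_in_tokens = fiat_cost_per_chunk / token_price;

        // Assumed elasticity: dearer uploads suppress demand further.
        demand /= 1.0 + (price_in_tokens - fiat_cost_per_chunk);

        // Less upload revenue spread over the same permanent data set:
        // some operators shut off, shrinking capacity.
        let revenue_per_node = demand * fiat_cost_per_chunk / nodes;
        if revenue_per_node < 0.006 {
            nodes *= 0.9;
        }

        // Weaker demand also drags the token's fiat price down.
        token_price *= 0.90 + 0.10 * (demand / 100.0);

        println!(
            "period {period}: {price_in_tokens:.3} tokens/upload, demand {demand:.1}, nodes {nodes:.0}, token ${token_price:.2}"
        );
    }
}
```

Run it and the quoted price in tokens only ever climbs, while demand, node count, and the token price drift down together.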
even if you’re right - it would then be slow enough to come up with a fix in time … no need to add the complexity now and give up before trying …
…with Arweave we can see it has worked for at least a couple of years, so there is surely no rush, and they don’t look like they’re dying or regretting the pricing model …
“not gonna happen” – That’s not an actual argument, so not sure why you bothered to write it.
“it would then be slow enough to come up with a fix” – this may be true, emphasis on “may”, but what is certainly true is that data upload isn’t consistent, and when we have a significant drop in demand - or even a token price drop - then we are at significant risk of a network implosion and of losing everyone’s data. Which means that all important data needs to be backed up somewhere else, which means the network has less value, and so less demand, from the get-go.
as if the network would be considered a perfectly safe data store from launch if we had data rot instead (and as if everyone would suddenly stop keeping other backups) …
the promise of forever storage might add some additional demand too - don’t you think?
…if I had to pay continuous fees for every byte I upload to the network, I would at first only upload what I really need/use … if I have the promise of forever storage, I will upload whatever I think I might want on the network at some point … delaying uploads doesn’t make sense - it just makes the data less available …
So many educational sites would rot away. That is one of the problems the SAFE network was being designed to solve.
Also, unused data takes up very little space in the future compared to incoming data. As others have said, when they upgrade their computers they can keep all their old data in just a small area of the new disk. The thing many miss is that old data that isn’t being used doesn’t cost anything, since the network is designed to use spare resources. If it were designed along the lines of data-centre storage, then there is a definite possibility that old data would have ongoing costs.
I just don’t see the problem that is being solved here and risk is being heaped on everyone’s plate.
If we have temp data, then I can pay for a period and even set it up to auto renew the storage later.
This would open the door to the main way data is stored now, give more surety and confidence in the network, and data could still remain on the network for as long as one wants it to stay.
As storage prices go down so will the renewal fees. Temporary data isn’t a compromise, it’s a realistic and proven working model for data storage.
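For example, an auto-renew agent on the client side could be as simple as this sketch (all types and names here are hypothetical placeholders, not a real API, and they assume a renewable lease along the lines discussed above):

```rust
// Hypothetical client-side auto-renew loop; nothing here is a real network API.
use std::time::{Duration, SystemTime};

struct StoredItem {
    name: String,
    expires_at: SystemTime,
    keep: bool, // the owner's choice: renew, or let it lapse
}

/// Renew anything the owner still wants that lapses within the grace window.
fn auto_renew(items: &mut [StoredItem], window: Duration, extension: Duration) {
    let deadline = SystemTime::now() + window;
    for item in items.iter_mut() {
        if item.keep && item.expires_at <= deadline {
            // In a real client this is where a payment would be attached.
            item.expires_at += extension;
            println!("renewed {} for another period", item.name);
        }
    }
}

fn main() {
    let mut items = vec![
        StoredItem {
            name: "family-photos".into(),
            expires_at: SystemTime::now() + Duration::from_secs(3 * 24 * 3600),
            keep: true,
        },
        StoredItem {
            name: "old-build-artifacts".into(),
            expires_at: SystemTime::now() + Duration::from_secs(3 * 24 * 3600),
            keep: false, // deliberately allowed to expire
        },
    ];
    auto_renew(
        &mut items,
        Duration::from_secs(7 * 24 * 3600),    // renew anything lapsing within a week
        Duration::from_secs(365 * 24 * 3600),  // extend by a year
    );
}
```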
That seems like marketing jargon … please explain that - how is my sovereignty threatened by having only temp data?
What makes the network special is full decentralization, not permanent data. It’s a malicious actor’s inability to take down or remove data that gives sovereignty. Temp data can be renewed by me when I need and, importantly, IF I WANT, which is actually greater sovereignty than permanent data – which itself is a misnomer, as it won’t really be permanent anyway. The more correct term would be “indefinite data” … but that’s not going to market the product very well.
Aside from renegotiating, there’s no “there” there. The majority of that video is just marketing. Also, renegotiation can be done automatically, or by agents in the near future.
The data is also only perpetual so long as the network doesn’t implode … and my arguments for that appear solid. None have countered them logically yet - meaning it’s a real threat until proven otherwise. Just because Arweave has gotten away with it so far does not prove permanent data is a valid proposition.
I have a feeling some are looking to the network as a replacement for their own drives, memory sticks and so on.
If data is temporary, then do what Unix, Linux, Windows, Mac OS, etc. do and have a temp directory/folder for it. Temporary files are not what you want to store with your permanent files. Why pay to store temporary files?
If it’s temporary in the sense of a long time, then since it was paid for, keeping it just becomes part of the overhead that includes bandwidth, etc. It’s simply a cost of providing persistent data: so web sites do not rot away, so educational material doesn’t rot away because the uploaders are no longer with us, and so files that aren’t on anyone’s list to be “refreshed” but are still very important to those using them don’t disappear either.
The network was designed for persistent data so the world’s valuable data doesn’t just rot away.
I would suggest that if a network is desired where files can be deleted (potentially deleting records referenced by other files, or where dedup meant others who uploaded the same file didn’t need to store it again, only to have it deleted from under them), then another network needs to be spun up for that.
If the Bitcoin blockchain wasn’t immutable and append-only, if transactions and balances could drop off the end and be lost, would your financial sovereignty be threatened?
Are you really sure you want that to be an argument here? As it sounds like you are saying that if we lose data on a permanent data network, it’s all good.
Edit: BTW, I don’t like Bitcoin because of that nature - it’s why I like the native token concept, where there is no global history nor any need for one.
I think it’s a bit late to start debating one of the fundamentals that has been key since the start, don’t you?
Counter-arguments could be made, but just as you can’t prove that the impossible network is indeed impossible, others can’t prove that it is indeed possible.
The whole point of this project is to try things nobody believes can be done and solve some important problems by doing so.
Even if it fails, it’s a successful experiment in many ways.
It’s always been a ridiculously ambitious, risky, vision-driven project, and that’s what all of us long-termers bought into.
I’m afraid it’s pointless trying to change minds about that now, at least I hope so.
To me, decentralised permanent storage is the only thing we can reasonably expect at this point.
That being unique to this project is what keeps me here, though what I’m doing has changed because that is all that’s left (until it isn’t and the native token becomes reality).
Well obviously I don’t think it’s too late. Sooner or later this will need to be fixed.
I disagree. The point is to produce something in the end that works. Dream of the castle in the sky, then build the foundation under it, and if that’s not possible, lower your expectations. The team is already lowering expectations for some things, and I believe that reality must kick in at some point and they will change this flawed premise.
I should probably clarify a bit more. Bitcoin’s mining difficulty adjusts in proportion to the demand for mining. But what we have here with permanent data is reversed! If the token value drops (demand drops), then the number of tokens people will demand for payment increases, leading to the implosion event.
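A back-of-the-envelope contrast, using made-up numbers and assuming my description above is right:

```rust
// Illustrative comparison only: Bitcoin's difficulty retargets downward when
// miners leave, but a fixed fiat cost priced in a falling token can only get
// more expensive in token terms.
fn main() {
    // Bitcoin-style: half the miners leave, difficulty retargets down, block time recovers.
    let difficulty = 100.0_f64;
    let hashrate_ratio = 0.5;
    let retargeted = difficulty * hashrate_ratio;
    println!("difficulty after retarget: {retargeted}"); // 50: the cost of participating falls

    // Permanent-data pricing: the fiat cost of storing a chunk stays fixed,
    // so when the token's fiat price halves, the price in tokens doubles.
    let fiat_cost = 0.10;
    let token_price = 0.50; // token lost half its fiat value
    let price_in_tokens = fiat_cost / token_price;
    println!("upload price: {price_in_tokens} tokens"); // 0.2 tokens, up from 0.1
}
```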
This is a serious issue and people want to wave it away, but sooner or later it needs to be dealt with by incorporating a temp-data protocol and allowing old data to expire, thus letting the network shrink to match lower future demand (as the market determines). Trying to force the market to accept data that people don’t care about anymore puts good (still-in-demand) data at risk as well.