Update 19th September, 2024

I think the fact that we are only at the very tip of the iceberg makes this even more powerful.

With open-ended (continuously learning) AI basically compressing all the world’s data, where does that leave us?

This is definitely something that plays on my mind a lot. The fact that I can take a current 1B-param model (3 GB or so) and with it restart humanity, i.e. make fire, kill animals, build shelter, design chips and computers and so on, is mind blowing. Compressing human knowledge like that is wildly underestimated right now.
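As a rough sanity check on that “3 GB or so” figure (my own back-of-envelope numbers, not anything from the post itself), a model’s file size is roughly parameter count × bytes per parameter:

```python
# Back-of-envelope model size estimate (illustrative only).
# Assumes size ≈ parameter_count * bytes_per_parameter, ignoring
# tokenizer files, headers and any further compression.

def model_size_gb(params: float, bytes_per_param: float) -> float:
    """Approximate on-disk model size in gigabytes."""
    return params * bytes_per_param / 1e9

params = 1e9  # a "1B param" model

for precision, nbytes in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1)]:
    print(f"{precision}: ~{model_size_gb(params, nbytes):.1f} GB")

# fp32 ≈ 4 GB, fp16 ≈ 2 GB, int8 ≈ 1 GB, so "3 GB or so" is the right
# ballpark for an unquantised or lightly quantised 1B-param model.
```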

So where are we going? I suspect old data may be compressed into models and we won’t care who wrote what, who discovered what and so on; all we will care about is the knowledge. That this knowledge will end up in a super-compressed form seems pretty obvious to me.

From there we have current ephemeral data, the “thinking” type of temporary data, and the question of managing it. Is that a Safe network? Possibly.

So the route humanity is on is incredible. But the requirement for secure data storage (compressed or not) is still very important, critical even.

Interesting times…

16 Likes

And just on communication: most of us will have had 5-10k good conversations across a lifetime. This is having millions at a time, will soon have had billions, and in all the languages. A savant human might be able to absorb 28k books over a lifetime. These have distilled, with good or plausibly perfect recall, millions of books, and that’s just scratching the surface. Right now they are already better than any human at taking the other’s perspective, on tests that try to measure that. And of course we are glacial in our comparative speed of thought.

By the time a lot of us get into conversations with it, a limited number of months hence, it will know us better than we know ourselves and better than any human has ever known another human. We will be accessing its clarity to interpret not just other languages but ourselves and our friends and family. Human experts will be so forgotten we won’t be able to believe we ever accepted such a concept.

The stabilizing and sedative effect on our warring ways will be like an injection of the most powerful sedative. And it’s hitting the most aware, engaged and connected people first. It’s hitting the pot stirrers. Our warring, war-against-ourselves ways will suddenly have lost their mindless unconsciousness as we drift off. We will float across the river on this raft even if we sleep the whole way.

2 Likes

Thanks, old friend, for not implying I’m a long-term sleeper saboteur.

Anyway, I think you outline one major technical compromise: relying on oracles creates vulnerabilities to surveillance and access, for a start.

I have little faith that the solution is a fork. What made MaidSafe trustworthy was their obsession with decentralising everything, and David not being motivated by money.

His character and the lengths he went to for his vision are what made this project so attractive, and remain so, as the cesspool of also-rans emerged from blockchain chancers and worse.

So I do not expect we will see anyone with the resources to make a successful fork do so with a tenth of David’s integrity and ability to turn down riches and personal aggrandisement to stick to his vision.

But the die is cast so I am no longer going to spend time arguing my case or thinking about these or other issues around the September plan.

I’m flabbergasted that they are still calling the October event ‘Launch’. That’s what I mean by sophistry.

I’m also surprised everyone else has let that pass without comment, although I know that in private not everyone is as OK with it as it appears in public.

If I’m wrong, and in 5 years we have S.A.F.E. and those fundamentals are truly pretty much met, I will be as pleased as anyone here. I don’t usually like to be proved wrong on anything, but in this case I’ll be delighted.

Take care and don’t get any new tattoos just yet :smile:

11 Likes

I guess it might make more sense to call January ‘launch’, as there will be a TGE and permanent data.

Maybe October should be re-branded as ‘Network token test’ or something?

I’d be interested to hear why October is being called ‘launch’… maybe launching public engagement, e.g. marketing. Or maybe it’s kicking off the ‘launch’ process that ends in the network running in January if all goes to plan?

Still, they’ve been clear recently on defining what they mean by ‘launch’ and what the process will be like, so that’s ok for me.

2 Likes

Launch of this project has always meant the working network, marked by token distribution to holders.

Reaching that by October was what the roadmap promised, and the growing dissonance between that possibility and what we were being told (that launch was still going ahead on time) is what was misleading in the run-up to the about-face and pivot of the September plan.

That wasn’t necessary - there are ways to communicate with the community and others without damaging their credibility or the project. In the past we have seen that done well quite a few times.

I think it’s deceitful to publish a roadmap which says launch in October and then, when we are eight weeks from the date, change the meaning of ‘launch’ to make it appear as if the roadmap’s culmination has been successfully reached.

As you point out, they could have been clearer and more honest by renaming the October event.

I’m amazed that they chose not to do that; it reflects badly on them and undermines our ability to trust them. As noted, that’s not just a one-off but an underlying theme of this year. It’s just the most blatant and, to me, inexplicable shot in the foot, because it was so unnecessary.

4 Likes

The ability to adapt to changing circumstances is a strength, not a fault.

2 Likes

A network launch is a launch with no persistent data and no persistent token :wink:



4 Likes

It could be done by David himself… not a fork but a parallel network. I’m surprised no one has thought of this solution (although I am not aware of possible technical obstacles), and I’m sure it’s the only way to avoid regulatory problems and eventually get the native network up and running as originally intended.

You’re not! Damn, there goes my conspiracy theory out the window. :rofl:

More like a beta launch.

Glad my name isn’t “everyone else”. I’ve said it before: we are really only in the alpha phase of testing at the moment.

Yeah, without the native token it’ll be really hard to come close.

You either split the data or duplicate it, which costs even more to upload “forever”, since to ensure the data survives you have to upload it to both. Whichever network gets favoured, the other will die off: people choose the more popular one to save on upload costs, and the more popular one holds all or most of the data, which increases its popularity even more and ensures the other is not used.

5 Likes

It begins to look more like the launch of a ship, where a bottle of Champagne is smashed on the hull, there’s a nice speech, a band plays and it’s slid into the water (the boat, not the band!), but then there are a few things still left to fit and tune before it goes on its merry way.

3 Likes

Hmmm, that’s not how things work. If you look at the crypto world you will see countless copies of the same code being used.

Maybe if you added the clarification that you are talking about 100 years in the future, there would be a chance that things turn out like this.

People use different things for different reasons, and price is just one of the reasons.

And what will survive is only speculation; there are things that are unknowable and unpredictable, and the behavior of billions of people is one of them.



3 Likes

And, as the team and David say, this is not a crypto project.

Data is different from a financial style of thing. That is why we really only have one internet, and not thousands of networks where data is kept separate from each other.

2 Likes

The data part is not blockchain, but the token is, so don’t be surprised if, in a year or so, crypto behaves like the internet thanks to blockchain abstraction:

https://www.theblock.co/learn/312777/what-is-blockchain-abstraction?modal=newsletter



Forking the network would mean forking the data too, but in practice the fork starts with a clean slate; the fork only gives you a new network to store data in.

If the network were only the token, then you would be 100% correct.

I agree the ERC-20 side is forkable. But you cannot separate the two, and my consideration is what would happen to the two networks as a whole, not what happens if the data is separated from the token.

2 Likes