Launch Planning: Community Update šŸš€

Of course I think they will be enough, but everything in life takes time. Do you think a baby can take care of itself from day 1?

It’s crazy to think that an unproven network will be used enough from day 1 to pay the node operators and grow.

Over the years I’ve seen plenty of unproven networks that were initially subsidized with tokens minted out of thin air. The previous plan, with subsidies continuing for decades, was good.



3 Likes

6 posts were split to a new topic: Communism

It’s not crazy when the network in question is based on a market for resources, rather than based only on expected future value / network effects. It can function economically at small scale or globally.

Dilution of supply isn’t that effective as a subsidy. If there’s a given demand for network tokens, increasing supply ahead of demand only reduces the token price, which also will reduce the value of any prior token earnings by nodes.

I think nodes will be more excited to earn tokens that won’t be significantly diluted vs tokens that will face the additional sell-pressure and token supply growth that a dilution based subsidy would entail.

They’ll also be more confident to hold them long term with less dilution.
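The dilution point can be put into toy numbers. This is purely my own illustrative sketch, assuming demand (total market cap) stays fixed while supply grows; all figures are made up:

```python
# Toy model: fixed demand (market cap) spread over a growing token supply.
# All numbers are hypothetical, for illustration only.

def token_price(market_cap: float, supply: float) -> float:
    """Price if total demand is spread evenly over the whole supply."""
    return market_cap / supply

market_cap = 10_000_000.0   # assumed fixed demand, in USD
supply = 1_000_000.0        # initial token supply

node_earnings = 1_000.0     # tokens a node earned early on
price_before = token_price(market_cap, supply)

# Mint 25% more tokens as a subsidy, with demand unchanged.
supply_after = supply * 1.25
price_after = token_price(market_cap, supply_after)

value_before = node_earnings * price_before   # 10,000 USD
value_after = node_earnings * price_after     # 8,000 USD

print(f"price: {price_before:.2f} -> {price_after:.2f}")
print(f"value of prior earnings: {value_before:.0f} -> {value_after:.0f}")
```

Under that (simplistic) assumption, a 25% supply expansion knocks 20% off the value of every token a node already earned, which is the sell-pressure point above.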

3 Likes

Absolutely not. People who run Bitcoin nodes, Ethereum nodes, and everything else with a subsidy are actually buying tokens from the network with their resources, speculating that they will get more tokens than if they bought them from the market.

I know because I’ve also mined Ether and Sia and all sorts of other things. If you rely only on the market from day 1, you have no predictability as a miner: how do I calculate whether it is more profitable for me to buy from the market or to start nodes and buy from the network?



4 Likes

That is true. A key difference with Autonomi is that nodes are selling resources to users, rather than buying tokens.

Perhaps, though there’s also more certainty in some ways, as you don’t have the uncertainty of a surprise step-change in technology (e.g. when ASICs appeared on the BTC mining scene) killing your expected ROI after you bought what you thought was state-of-the-art mining equipment.

Bitcoin / Ethereum mining needs to be profitable for it to work.

Running an Autonomi node only needs to pay enough to ensure nodes add capacity to the network at a rate that matches demand, so it’s a very different beast, where no ā€˜block subsidy’ equivalent is required.
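As a toy illustration of that ā€˜capacity matches demand’ loop (this is purely my own sketch, not Autonomi’s actual pricing function): if the store cost rises as spare capacity shrinks, higher earnings attract new nodes until supply catches up, with no subsidy needed.

```python
# Toy feedback model: store cost rises as spare capacity shrinks,
# which should attract new nodes until supply matches demand.
# Hypothetical formula, NOT the real Autonomi pricing function.

def store_cost(used: float, total: float, base: float = 1.0) -> float:
    """Price per chunk grows as the network fills up."""
    fill = used / total
    return base / (1.0 - fill)   # grows without bound as the network fills

print(store_cost(used=20, total=100))   # plenty of space: near base price
print(store_cost(used=90, total=100))   # nearly full: 10x base price
```

When the network is nearly full, earnings per chunk spike, new nodes join, the fill ratio drops, and the price falls back: the equilibrium replaces the block subsidy.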

2 Likes

Yes, without a subsidy, it will be exactly the same as with Bitcoin in 100 years. Then miners will only sell services to network users.

A subsidy means a predictable flow of tokens. Since the idea of Autonomi is to use only spare resources, in theory everything you earn above zero is pure profit, and the network can grow very quickly.

In practice, what we saw in the beta is that people don’t use free resources, they buy them.

This means that there is a real danger that these resources will be turned off if enough people don’t buy network storage, which is very likely to happen, because who will use something unproven at scale?



3 Likes

Subsidised storage is a double-edged sword.

You can get faster short-term growth, but the risk is that people get used to not paying the real price.

For a short period of time, maybe a year or so, it might give benefits. But I think it would be important to make it known that it is for a limited time.

Companies that subsidise risk getting stuck in an endless loop of negative cash flow.

I am with you that it might be a necessary evil to subsidise storage to get some short-term traction.

1 Like

Interesting, I like your thoughts on this, re:

Scrypto and RADIX, the founder

Dan Hughes @ RADIX should probably take a look at this Phala model, distributed elastic AI Compute Workload governed by SCs…

imo there is certainly merit in the opportunity you point out.

On that note

I took a closer look at RADIX and Scrypto this morning. Certainly the way Hughes has organized the SC to be a ā€˜component’ with self-contained buckets and multiple permanent vaults is well organized and intuitive, making writing an SC very compact,

the caveat being, of course, that the Scrypto ā€˜SC’ developer must use the RADIX engine and related DLT… and know Rust + Scrypto, which does have a steeper (what the heck is this stuff) learning curve :wink:

so really RADIX + Scrypto is a complete ā€˜alternate system architecture’ (and universe) to what is Solidity and ETH,

and given programmers are generally lazy, well you get the picture

So that is a big uphill battle… for Hughes, Scrypto and RADIX

(the best tech does not always win; distribution always wins, i.e. MS)

It might be interesting for Hughes to look at Spiderchain, as they are EVM agnostic; he might be able to port/transform part of his RADIX engine to become a Spiderchain RADIX_VM running the Scrypto lang, without a lot of work, to become mainstream on BTC

There is a lot of dPoS SC VM resurgence going on in the BTC community at the moment. It might be the right time for Hughes to pivot RADIX and the Scrypto ā€˜SC’ lang to take advantage of what Spiderchain has done there in BTC land: getting BTC taproot actually working with their innovative distributed peg-in/peg-out + multi-sig scaling of a VM overlay network with truly distributed PoS, scaling ERC20 SCs with the EVM and Spiderchain in a huge scale-out way… given Spiderchain is agnostic in the VM sense by design.

That said, keeping Autonomi Network in context, it’s all ā€˜just’ software

so in theory, Hughes transforming the principles of his scrypto lang for SC and the RADIX engine to interact with a DAG (Autonomi)

so as to work as a composable distributed set of SCs that can be elastically scaled up/down to service a ā€˜SC Workload’

like Phala (for AI workloads like LLM Training or Tuning) on top of Autonomi , that settles initially using ERC20

is possible…, and not as crazy as it sounds :wink:

which maybe means, in the case of somehow transforming this all to work as a layer two on the Autonomi Network,

is worth a second glance by @dirvine , at a minimum to inspire the Maidsafe team in their own journey to do the same.

Hughes is in the UK, so same time zone…

Given any such aspirations for any developer in theory to tackle the above challenge of scrypto SCs on Spiderchain protected DLT VM network operating on an L1 DAG (sounds like a lot of work, but it really isn’t if the software architecture is designed right to a clear set of requirements before writing a stitch of code)

1 Like

If people stop buying network storage, the network is dead anyway. Who’s going to want to run nodes to earn a token that is used to pay for resources that nobody wants? Printing more of that token won’t work magic.

As has been said many times before, people always create new data that they want to store, so as long as the network does a good job in hosting & serving that data for people, there will always be new videos / music / documents / forum posts / emails etc etc to upload.

The ongoing demand for people to store data on Autonomi should lead to the ā€˜predictable flow of tokens’ that BTC mining offered in the early days.

There is a huge difference between Autonomi and BTC.

If BTC didn’t give out a block subsidy, fees would be the only source of revenue for miners, which in a small network wouldn’t have worked.

With Autonomi, even in a small network, 1 GB of storage space is a valuable resource that people will pay for.

2 Likes

In practice, what we saw in the beta is that some people are perfectly happy to spend THEIR OWN MONEY on ensuring that a critical mass of nodes is available FOR US ALL and for the general good of the project.

We also saw, through a particularly ill-thought-out, rushed and then botched reward scheme, that there were others who thought they could make a quick buck by running or controlling many, many nodes. The motivations of these two groups (with some admitted overlap) were quite different.

Neither is a reliable indicator of how this network will develop. Especially at this scale and eventually at much larger scale.

So beware making sweeping judgements at this time based on what is frankly a minuscule amount of data, gathered in skewed circumstances, i.e. the reward scheme and largely non-functional nodes from home. When of course (all other things being equal) it is nodes from home (and other under-utilised resources) that will either make this project fly or fail.

4 Likes

None of the above is critical of the pragmatic requirement of having bulk amounts of data from a few sources early on to validate the basic functionality of the network and to get up to some kind of critical mass.
So please don’t take this as a rant against @Bux’s ā€œdata partnersā€. They are very necessary at this time, but in saying that, the tail must not be allowed to wag the dog.
And in balancing all that, it’s why @JimCollinson and @Bux get the big bucks - and why we probably have too high expectations of them.

It won’t stop us holding you to these unrealistic expectations though…

5 Likes

Congratulations on your upcoming launch :rocket:

So, ELI5: Autonomi (Maidsafe) is finally beta launching on October 29th using an Ethereum-based ā€˜Beta Token’, and the full mainnet launch & TGE is in January 2025, when data storage will also become permanent.

Most OGs are aware of Maidsafe, so there is huge brand recognition among early adopters, but it has fallen off our radar over the last few years.

So if this is well communicated, I think there is a deep well of users and investors who will support the network.

Well done on never giving up.

28 Likes

Welcome to the forum!

Yes, that’s right. The TGE token will also be ERC20, but hopefully it’ll be possible to swap it for the native token when that’s ready, which is expected to be some time after launch (no specific timeline given… I hope sooner rather than later!).

I hope many other crypto OGs who will be familiar with Maidsafe from back in the day will get involved when Autonomi launches.

13 Likes

I pivoted my interest back to this project because the ā€˜remnant’ team persevered, and also because the direction of the team’s effort pivoted back to be aligned more with the original design intent.

Really, dPoS and staking in its current form favours those who laid out the cash to build big nodes and acquired the most token value in staking terms.

It will be interesting to see how the Maidsafe team addresses the massive tectonic dual-plate shift in the crypto world of DeFi on BTC and ā€˜DePIN for AI’ when adding that layer on top of the Autonomi Network, to keep that layer of the network widely distributed and avoid the concentration of wealth directed by those with god complexes,

it’s a tough ā€˜dual headed beast’ problem,

which imo needs to be tackled,

which in part means re-arming small biz and the consumer with the tools they need to be successful as part of a bigger mesh of interests.

whitelist curation of genAI agent use of one or more LLMs’ ML training sources to provide responses is, imo, one key place to start adding capabilities to the Autonomi Client, in order to get a human leash on AI ā€˜re-organized’ and curated (often cleansed/censored) data quickly…

4 Likes

What does this refer to? I understand ā€˜watch the code base’. But 15% royalties: paid to whom, and why, and for how long, and how is it paid, and by whom?

1 Like

Judging by the 24 view count I suspect that I am not the only one who just read all of that again thinking you posted it to show the updated changes :rofl:

5 Likes

Ha! Sorry! No that’s pre-changes (read: scrapping)

4 Likes

Maybe I’m remembering incorrectly, but wasn’t the royalty originally 10% to the foundation on every upload?

At least using ERC-20 and smart contracts will prevent a hard fork removing the royalty from the native token client, given the DAG wasn’t able to keep up so that nodes could validate that an upload had paid the royalty. It would have been nice to have seen that acknowledged as a commercial concern in the token economics - I can’t be the only person to notice this :thinking:

Are we able to get any more clarity on which chain is being used? With some of the hints dropped about Paymasters and RPC endpoints, I’ve got the assumption that it’s going to be ERA based, which isn’t a bad thing apart from the 24-hour L2 to L1 lock-in frustration; maybe the team could bring that up with Matter Labs.

I’ve been doing some calculations, and with ERA’s block sizes fixed at around 5-6k transactions, settlement back to L1 is going to be a pinch point and a gas guzzler - are there any thoughts that can be shared on what is planned for the chunk size and smart contract implementation?

My assumption is that it’s going to have to go much bigger? But that’s going to hit the de-dupe efficiency? Some initial workings would be really beneficial in helping us understand how this is going to avoid overloading any current L2 scaling solution or smart contract memory constraints, without compromising the chunk distribution / network efficiency by nature of the distribution of chunks / validation of chunks in close groups, and the download speed from nodes.
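To put rough numbers on that chunk-size / de-dupe trade-off (every figure here is my own hypothetical assumption, not anything the team has stated):

```python
# Back-of-the-envelope: on-chain payment transactions needed per GB
# uploaded, under different chunk sizes and payment batching.
# All parameters hypothetical, for illustration only.

def payments_per_gb(chunk_size_mb: float, chunks_per_payment: int) -> float:
    """Payment transactions required to pay for 1 GB (decimal) of chunks."""
    chunks = 1_000 / chunk_size_mb      # chunks in 1 GB
    return chunks / chunks_per_payment

# Small chunks, one payment each: many transactions per GB.
print(payments_per_gb(0.5, 1))    # 2000.0
# Bigger chunks cut the transaction count but coarsen de-duplication.
print(payments_per_gb(4.0, 1))    # 250.0
# Batching payments keeps small chunks while cutting on-chain load.
print(payments_per_gb(0.5, 64))   # 31.25
```

If a block holds ~5k transactions, 2,000 payments per GB eats a good fraction of a block per upload, which is why either bigger chunks or batched payments (with the de-dupe and contract-memory consequences above) seem hard to avoid.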

#Edit: Also, are we able to have some visibility on proposed trading pairs? The assumption is it will be ANT/USDC - ANT/ETH (where ANT is the yet-to-be-named token), and some clarification on how long the implementation of the native paymaster on L2 will take after token launch?

Also, where is the liquidity pool coming from? Is the foundation initially going to be funding a DEX bot on L2, or will there be another funding round?

4 Likes

Just lost 1266 nanos transferring between wallets:

```
Failed to send NanoTokens(1266) to 86544878e86ceb216232eaacf3e442473300b720bf3612db28a50bc8034335c55c3db67e205468d7add17a768c148b82 due to Wallet(PubKeyMismatch("/home/safe/.local/share/safe/client/wallet")).
Completed with Err(
   0: Wallet Error Main pub key doesn't match the key found when loading wallet from path: "/home/safe/.local/share/safe/client/wallet".
   1: Main pub key doesn't match the key found when loading wallet from path: "/home/safe/.local/share/safe/client/wallet"

Location:
   sn_cli/src/bin/subcommands/wallet/hot_wallet.rs:357

Backtrace omitted. Run with RUST_BACKTRACE=1 environment variable to display it.
Run with RUST_BACKTRACE=full to include source snippets.) of execute "Wallet(Send { amount: \"0.000001266\", to: \"86544878e86ceb216232eaacf3e442473300b720bf3612db28a50bc8034335c55c3db67e205468d7add17a768c148b82\" })"
```

The same command completed successfully some time later with a much smaller amount.
NOTHING ELSE CHANGED.

When the simple basics are still not working reliably there really is not much point in planning for a ā€œlaunchā€.

If things don’t improve rapidly on many fronts, I am likely to cash in my MAID for whatever I can get and resume my interest in flight simming.
This project has been subverted.

And I’m certainly not lashing out £70/month any more to rent a big server for a project that seems as if it will only ever be of value to large corporations. So you can kiss a few hundred nodes goodbye as well.

8 Likes