Community Token Proposal :IF:

I am not saying you want a closed network; I am saying our network will not be optimized for fast transactions.

You write about a test network in the lab, while Autonomi is a working network operating in the real market that has proven to scale - currently around 4.5 million nodes.

Furthermore, Sonic’s Lachesis consensus uses aBFT (Asynchronous Byzantine Fault Tolerance) and a DAG structure, which in theory allow parallel transaction processing - so you contradict yourself a bit when you write about the use of a DAG in Autonomi.

The high TPS figures Sonic declares (e.g. 10,000) are only achievable under ideal network conditions, while real usage (e.g. DeFi and NFT interactions) introduces delays.

Read about the challenges of the Sonic network:

However, you might come to the conclusion that blockchains are years behind the times…? :wink:

2 Likes

Well, if you’re saying you need a closed network for fast transactions, then I’d reject that option on principle.

Autonomi should be quick enough to allow data to be uploaded for forum/blog posts, etc. I suspect that will be 100s of milliseconds once the network has been refined, but a few seconds is probably fine too.

When I tap my card in a shop, it takes a few seconds. That is perfectly fine.

What we also want is massive throughput of such transactions. So millions of people around the world can make those transactions and still only wait a similar amount of time.

Having close groups manage each transaction in isolation will scale very well. Extremely well. It’s exactly why the network will be able to support millions of simultaneous blog or forum posts too… at the same time, on the same network.
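
To picture why that holds, here is a toy sketch (my own illustration, not Autonomi code, and the numbers are made up): each close group does its own settlement work in parallel, so total throughput grows with the number of groups while no single transaction gets slower.

```rust
// Toy illustration (not Autonomi code, numbers are made up): each "close group"
// settles its own transactions independently of every other group, so adding
// more groups adds throughput without making any single transaction slower.

use std::thread;
use std::time::{Duration, Instant};

fn main() {
    let groups = 8;          // pretend close groups working in parallel
    let tx_per_group = 100;  // transactions each group settles in this run
    let start = Instant::now();

    let handles: Vec<_> = (0..groups)
        .map(|_| {
            thread::spawn(move || {
                for _ in 0..tx_per_group {
                    // stand-in for one group's validation work on one transaction
                    thread::sleep(Duration::from_micros(100));
                }
            })
        })
        .collect();

    for handle in handles {
        handle.join().unwrap();
    }

    println!(
        "{} transactions settled by {} independent groups in {:?}",
        groups * tx_per_group,
        groups,
        start.elapsed()
    );
}
```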

5 Likes

Why spend all this energy on an L2 token of an already-layered token (ERC20) of a crypto token (ETH), instead of just investing your energy into creating a native token? Blockchain sucks. I’m sorry, I don’t understand the point of this other than to prove just how absolutely awful the current blockchain solution is, and as with the Lightning Network, it only makes using crypto more complicated to understand and even more likely to turn people away.

3 Likes

That does not equate to transaction rate.

I can have a transaction take 5 seconds to process, yet the system can still achieve 100,000 transactions per second. This is basic transaction processing that has been well documented for decades.

With the client doing the bulk of the work, one person might see a transaction take from 1 to 10 seconds due to internet lag. But in each small segment of the network there might be 5,000 clients all doing their work, giving 1,000 transactions per second, and at network scale there will be 100s (1,000s later on) of these areas doing that. That is where decentralisation of digital currency has the edge over any blockchain.
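
To make that latency-vs-throughput distinction concrete, a quick back-of-the-envelope sketch (the numbers are the illustrative ones from above, not measurements):

```rust
// Back-of-the-envelope numbers for the point above (illustrative, not measured):
// latency is what one client waits; throughput is what all the network's
// independent segments clear per second, added together.

fn main() {
    let per_tx_latency_secs = 5;        // one client may wait this long end to end
    let tx_per_segment_per_sec = 1_000; // assumed rate one network segment sustains
    let segments = 100;                 // "100s (1,000s later on) of these areas"

    let aggregate_tps = tx_per_segment_per_sec * segments;
    println!("one payment takes ~{per_tx_latency_secs}s to settle...");
    println!("...yet the network clears ~{aggregate_tps} transactions per second in total");
}
```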

6 Likes

In our consensus algorithm, an event containing transactions is represented by a vertex in a DAG, and edges represent the relationships between the events. (docs.soniclabs.com/technology/consensus)

Sui utilizes a DAG in its network architecture offering greater efficiency and performance for applications downstream. (blog.sui.io/all-about-directed-acyclic-graphs)
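
For concreteness, a minimal, purely illustrative sketch (not Sonic’s or Sui’s actual code) of the structure both quotes describe - an event bundles transactions into a vertex, and references to earlier events form the edges:

```rust
// Not Sonic's or Sui's actual code - just a minimal picture of the idea both
// quotes describe: an "event" bundles transactions and becomes a vertex, and
// references to earlier events form the DAG's edges, which is what lets events
// from different creators be processed in parallel.

type EventId = [u8; 32]; // assumed to be a hash of the event's contents

struct Event {
    transactions: Vec<Vec<u8>>, // opaque transaction payloads carried by this vertex
    parents: Vec<EventId>,      // edges: earlier events this creator has seen
    creator: u64,               // which participant emitted the event
}

fn main() {
    let genesis = Event { transactions: vec![], parents: vec![], creator: 0 };
    let next = Event {
        transactions: vec![b"pay 5 tokens to alice".to_vec()],
        parents: vec![[0u8; 32]], // would be genesis's real hash in practice
        creator: 1,
    };
    println!(
        "node {}'s event carries {} tx and references {} parent(s); genesis references {}",
        next.creator,
        next.transactions.len(),
        next.parents.len(),
        genesis.parents.len()
    );
}
```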

In our design, though, since all validation/consensus happens client-side, I would not expect stellar transaction speeds. A Native Token, however, could be validated by nodes, as @dirvine once suggested, making it faster and more secure.

4 Likes

This project is about utilizing technology meant for a native token, but for a whole family of tokens.

One of them could become the native token, but that would require modifying the node software so that nodes accept it as payment for data storage, and perhaps validate it server(node)-side, which would bring better speed and security.

Beyond the basics, instead of going native, we chose to explore bridging and exchanging, because modifying the node software would require full support and close collaboration with MaidSafe, and they are focused on other areas now. An alternative would be doing it without them, but that would mean reinventing the wheel, losing much of the know-how, and putting the network in danger of forking.

10 Likes

How does that scale, though? This is the problem. Bitcoin was fast and cheap when it was small. :ant: scales, so it is not as fast as a highly optimized hardware/software solution at low volume, but because we are highly decentralized, we can scale much better – meaning less impact on the speed and cost of transactions as our network grows.

6 Likes

Why is forking dangerous?

1 Like

It’s dangerous right now, as it could split the team and community before we reach a (as-yet uncalculated) critical mass.
A few years down the road, I am with @Dimitar in that forking is inevitable. Right now, my take is that it would be a very Bad Thing.

7 Likes

Why do you think (@Dimitar too) that a network fork is inevitable within a few years? What would it be caused by?

1 Like

Human nature. There will be people who want to and believe they can do better if they change a few things. I would be very surprised if there weren’t hundreds of networks after the technology is proven to work…

3 Likes

What @Dimitar said above.

dunno about hundreds though - a good few however - say 20+ after 3-4 years

1 Like

I also predict 95% of them will be expensive failures, but with the benefit of 3-4 years hindsight, I think some very specialised off-shoots may prosper in their own niche.

Unless of course Maidsafe/Autonomi make a roaring c*nt of it all. But I have few worries on that score.

Whatever - in 5 years time there will be several implementations of the original vision from Troon 2008.

2 Likes

If you mean attempts to create private (commercial) networks, then there will probably be some who will try to use Autonomi’s technology for their own purposes…

But I think the problem will not be the technology but the legal and regulatory aspects. Anyone who thinks things through will recognise that it is better to build solutions on a core network that systemically solves all these problems than to adapt technology that does not offer regulatory compliance for certain areas of activity.

I used to think it would be possible to implement SN technology in my own independent network, but after much analysis and delving into the details of all the issues, the thought effectively blew out of my head. Sure, with the right resources everything can be done, but the question is - to what end?

1 Like

I am still keen to explore private Autonomi-like solutions for individual datacentres or groups of datacentres.
How much physical disk space is “wasted” on hardware RAID? I have not had the need to look for a few years, but are good hardware RAID boards still pretty expensive?

@rreive ?

1 Like

Indeed. Similarly, I would like to see multiple voluntary public networks that do not use any tokens/payments, with uploads via a group key (moderator nodes see each upload first and permit or deny it) and small file limits.
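
Roughly how I picture that moderation step - a purely hypothetical sketch, so none of these names, types or limits come from any actual Autonomi API:

```rust
// Purely illustrative sketch of the moderation idea above (all names and
// limits are hypothetical, not part of any Autonomi API): an upload is first
// shown to moderator nodes holding the group key, and is stored only if it is
// small enough and a moderator approves it.

const MAX_FILE_BYTES: usize = 256 * 1024; // assumed small-file limit

struct Upload {
    payload: Vec<u8>,
    signed_with_group_key: bool, // did the uploader sign with the group key?
}

enum Verdict {
    Accepted,
    Rejected(&'static str),
}

fn moderate(upload: &Upload, moderator_approves: bool) -> Verdict {
    if !upload.signed_with_group_key {
        return Verdict::Rejected("missing group-key signature");
    }
    if upload.payload.len() > MAX_FILE_BYTES {
        return Verdict::Rejected("exceeds small-file limit");
    }
    if !moderator_approves {
        return Verdict::Rejected("denied by moderator node");
    }
    Verdict::Accepted
}

fn main() {
    let post = Upload { payload: vec![0u8; 1024], signed_with_group_key: true };
    match moderate(&post, true) {
        Verdict::Accepted => println!("stored on the voluntary network"),
        Verdict::Rejected(reason) => println!("rejected: {reason}"),
    }
}
```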

2 Likes

Week 1 summary

This week has been mostly an infrastructure and ideation week. The project took off, was registered, qualified for the Impossible Futures program, and got its own place on GitHub.

(more info in daily updates)

Support from the community has been overwhelming - lots of discussions here on the forum, as well as in private, which I am grateful for, because they always bring new insight.

Next week I plan to focus more on code, preparing a simple test case of some example token manipulation.

11 Likes

I’ve been brainstorming about a problem: a cheater could create such a long chain of valid transactions after an invalid one that the payee would not have time to check them all. I came up with an additional validation process, carried out by all clients “in the meantime”, to mark risky transactions ahead of time. I’m curious what you think about it.
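
Here is a rough sketch of what I have in mind (purely illustrative - the types and names are made up, not anything from the existing code):

```rust
// A minimal sketch of the "in the meantime" idea (all names are hypothetical):
// idle clients walk transaction chains in the background and publish a "risky"
// mark for every transaction sitting on top of an invalid one, so a payee
// facing a very long chain can consult the marks instead of re-validating the
// whole chain at payment time.

use std::collections::HashSet;

type TxId = u64; // stand-in for a real transaction hash

struct Tx {
    id: TxId,
    valid: bool, // result of checking this single transaction on its own
}

/// Background pass over an ordered chain: everything after an invalid
/// transaction gets marked as risky.
fn mark_risky(chain: &[Tx]) -> HashSet<TxId> {
    let mut risky = HashSet::new();
    let mut tainted = false;
    for tx in chain {
        if !tx.valid {
            tainted = true;
        }
        if tainted {
            risky.insert(tx.id);
        }
    }
    risky
}

fn main() {
    // A long chain where transaction 3 is invalid; everything after it is tainted.
    let chain: Vec<Tx> = (1..=10).map(|id| Tx { id, valid: id != 3 }).collect();

    let mut marked: Vec<TxId> = mark_risky(&chain).into_iter().collect();
    marked.sort();
    println!("transactions marked risky ahead of time: {marked:?}");
}
```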

There was also another idea for how to cope with the problem, but I threw it away.

7 Likes