I was quite curious how many people would be able to use the Ethereum blockchain at any given moment. Bitcoin is quite clear: it is stuck at about 7 transactions per second. But Ethereum will do a lot more than just transactions; its blockchain will also be used for computation. So what if people start playing a dice game every block? What about Dapps that need to interact with the blockchain many times? And what if people use software like Augur, where large amounts of data are stored in blocks? Here’s the link to the topic.
Here’s my question:
Is there already any clue on how many active nodes the network can handle? Some say Bitcoin will break at 7 transactions a second. How many full nodes can Ethereum handle? Where’s the limit? 100K, 1 million, 10 million?
Vitalik was kind enough to reply with some calculations:
Ethereum has no fixed-size block limit; indeed, we’ve had a dynamically adjusted limit based on usage all the way since January 2014. Theoretically, it could thus process 1 billion tx/second. However, in that scenario, the only parties able to run nodes would be extremely large data centers run by companies like Google and Amazon (realistically, clever software implementations would be required to parallelize transaction processing, and a lot of txs would be thrown out, creating DDoS vectors, though this could be engineered around to a moderate extent); hence, the issue is not scalability by itself, but rather the scalability/centralization tradeoff frontier.
Assuming all nodes are on normal laptops, we have the following limits:
At the minimum gas limit of 3141592, we can process a maximum of 149 transactions per block, i.e. 12 tps, assuming each transaction takes 21000 gas (for a brief period of time, you can use SSTORE clear refunds to get transactions that cost ~13000 gas, but we’ll ignore this for now since it’s irrelevant to normal usage). If we assume an average transaction consumes 50k gas, this goes down to 62 tx per block, i.e. 5 tps. However, if usage actually gets this high, the limit will adjust upward, so it’s not a hard limit.
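That block-throughput arithmetic can be sketched in a few lines. Note this is a back-of-the-envelope sketch: the ~12 s average block time is an assumption of mine, not stated in the text.

```python
# Back-of-the-envelope tps from the gas limit.
GAS_LIMIT = 3_141_592      # minimum gas limit from the text
BLOCK_TIME_S = 12          # assumed average block time

def max_throughput(gas_per_tx):
    """Return (transactions per block, transactions per second)."""
    txs_per_block = GAS_LIMIT // gas_per_tx
    return txs_per_block, txs_per_block / BLOCK_TIME_S

print(max_throughput(21_000))  # minimal transfer: 149 tx/block, ~12 tps
print(max_throughput(50_000))  # average tx: 62 tx/block, ~5 tps
```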
The python client on a desktop PC can currently process ~25 elliptic curve signature recoveries per second and ~1-2M gas per second of VM execution (average usage, which includes storage operations, arithmetic, etc.). Hence, at 50k gas, one tx will take ~70ms conservatively, so python will start to croak at around 14 tps. A recent update included a faster elliptic curve verification library, so this figure may improve somewhat.
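As a sketch of that per-transaction cost estimate (the midpoint of the quoted 1-2M gas/sec VM speed is my assumption):

```python
# Estimated python-client time per transaction: one signature
# recovery plus VM execution of the transaction's gas.
SIG_RECOVERIES_PER_S = 25    # ~25 ECDSA recoveries/sec (from the text)
VM_GAS_PER_S = 1_500_000     # assumed midpoint of the quoted 1-2M range

def tx_time_s(gas=50_000):
    return 1 / SIG_RECOVERIES_PER_S + gas / VM_GAS_PER_S

t = tx_time_s()   # ~0.073 s per tx
print(1 / t)      # ~13-14 tps before the client croaks
```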
The go client’s elliptic curve signature verification takes about 2-3ms, and its VM is correspondingly faster, so go will likely croak at around 200 tx/sec.
From a data standpoint, the 3141592 gas limit has been calculated to force a maximum state growth of ~110 MB/day, though actual usage is much less (ie. anything close to 110 MB/day would only be achieved if a block contained a single transaction containing a single for loop to do SSTORE operations).
Right now, the size of the python database is ~2.9 GB from testnet usage, so ~40 MB/day. However, this is highly suboptimal because python currently never deletes anything, and anything which ever becomes part of a merkle tree goes into the database.
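A quick consistency check on those storage figures (all numbers taken from the text):

```python
# How long the testnet must have been growing at the observed rate.
DB_SIZE_MB = 2_900             # ~2.9 GB python database
OBSERVED_MB_PER_DAY = 40       # observed state growth
WORST_CASE_MB_PER_DAY = 110    # theoretical max at the 3141592 gas limit

days_of_usage = DB_SIZE_MB / OBSERVED_MB_PER_DAY
print(days_of_usage)   # ~72.5 days of testnet usage at the observed rate
print(OBSERVED_MB_PER_DAY / WORST_CASE_MB_PER_DAY)  # ~36% of worst case
```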
100 tx/sec would mean a block size of about 225 kB. Under our current p2p infrastructure, block propagation time starts to become a serious problem at around 20-40 kB, last time I checked, so 10-20 tx/sec is a reasonable safe maximum initially. However, we can expect the network to improve in quality over time, and even if it does not, the network would still work - it would just have more uncles. (Another solution is for miners to create their own sub-net, where you need proof that you mined a block in order to join - that’s the best way to do this without making it a cartel - and route blocks between each other internally there.) Note that this problem is NOT because we have a fast block time; if blocks were 4x further apart, each one would be 4x bigger. However, fast block times do make IBLTs harder to implement.
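The block-size numbers can be reproduced like this. Two assumptions of mine: the ~187-byte average transaction size is inferred from the "100 tx/sec ≈ 225 kB" figure, and the ~12 s block time is assumed.

```python
BLOCK_TIME_S = 12       # assumed average block time
TX_SIZE_BYTES = 187     # inferred average encoded transaction size

def block_kb(tps):
    """Block size in kB implied by a given throughput."""
    return tps * BLOCK_TIME_S * TX_SIZE_BYTES / 1000

def safe_tps(block_kb_limit):
    """Throughput that keeps blocks under a propagation-friendly size."""
    return block_kb_limit * 1000 / (BLOCK_TIME_S * TX_SIZE_BYTES)

print(block_kb(100))               # ~224 kB, matching the ~225 kB figure
print(safe_tps(20), safe_tps(40))  # ~9-18 tps, matching the 10-20 range
```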
So, 10-100 tx/sec is likely the point where things start to get problematic, at least from an initial analysis.
As far as users go, it depends entirely on the level of usage. At 5 tx/day per user (reasonable for financial apps), 100 tx/sec would give 100 * 86400 / 5 = 1,728,000 users. At 500 tx/day per user (reasonable for an “Ethereum replaces the internet” scenario), we get 17,280 users. Hence why we need Ethereum 2.0
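The user-count arithmetic, spelled out (pure arithmetic on the numbers in the text):

```python
SECONDS_PER_DAY = 86_400

def supported_users(network_tps, tx_per_user_per_day):
    """Users the network can serve at a given throughput and usage level."""
    return network_tps * SECONDS_PER_DAY // tx_per_user_per_day

print(supported_users(100, 5))    # 1728000 users (financial-app usage)
print(supported_users(100, 500))  # 17280 users (heavy usage)
```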
Now we have to see how popular Ethereum gets. I think it could hit these limits within one or two years.