I've recently spent way too much time putting together an analysis of the limits on block size and transactions/second based on various technical bottlenecks. The methodology I use is to choose specific operating goals and then calculate estimates of throughput and maximum block size for each of various different operating requirements for Bitcoin nodes and for the Bitcoin network as a whole. The smallest bottleneck represents the actual throughput limit for the chosen goals, and therefore solving that bottleneck should be the top priority.
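The "smallest bottleneck wins" idea can be sketched in a few lines. The numbers below are placeholders for illustration only, not the paper's actual estimates (those come from the linked spreadsheet):

```python
# Hypothetical per-bottleneck throughput estimates (tx/s) under some
# chosen set of operating goals. These values are made up for the sketch.
bottlenecks = {
    "bandwidth": 180,
    "initial block download": 60,
    "blockchain + UTXO storage": 12,
    "UTXO memory": 9,
    "CPU validation": 200,
}

# The network can only sustain the throughput of its slowest component,
# so the effective limit is the minimum across all bottlenecks.
limit_name = min(bottlenecks, key=bottlenecks.get)
limit_tps = bottlenecks[limit_name]
print(f"Binding bottleneck: {limit_name} at {limit_tps} tx/s")
```

Whichever entry comes out as the minimum is the one worth fixing first, since improving any other bottleneck wouldn't raise the overall limit at all.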
The goals I chose are supported by some research into available machine resources in the world, and to my knowledge this is the first paper that suggests any specific operating goals for Bitcoin. However, the goals I chose are very rough and very much up for debate. I strongly recommend that the Bitcoin community come to some consensus on what the goals should be and how they should evolve over time, because choosing these goals makes it possible to do unambiguous quantitative analysis that will make the blocksize debate much more clear-cut and make coming to decisions about that debate much simpler. In particular, it will make clear whether people are disagreeing about the goals themselves or disagreeing about the solutions for how best to achieve those goals.
There are many simplifications I made in my estimations, and I fully expect to have made plenty of errors. I would appreciate it if people could review the paper and point out any errors, insufficiently supported logic, or missing information so those issues can be addressed and corrected. Any feedback would help (especially review of my math)!
Here's the paper: https://github.com/fresheneesz/bitcoinThroughputAnalysis
Oh, I should also mention that there's a spreadsheet you can download and use to play around with the goals yourself and look closer at how the numbers were calculated. Also, there was a discussion on r/BitcoinDiscussion a while back.
Bitcoin's most constraining bottlenecks right now, according to the goals I chose, are storage of the blockchain and UTXO set, and the memory used by the UTXO database. These two things almost certainly don't meet the chosen goals.
The highest-priority improvement should probably be fraud proofs, since they drastically improve network security in an environment where most users run SPV nodes.
The second-highest-priority improvement should be Assume-UTXO, done in a way where historical data can be ignored by most nodes (fraud proofs are required for this).
The easiest future improvement will probably be some form of accumulator (like Utreexo), since that would eliminate the size of the UTXO set as a bottleneck entirely.
In 10 years, if all these current ideas are implemented, we can probably safely get to 100 transactions per second on-chain.
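As a rough sanity check on what 100 tx/s would mean for block sizes, here's a back-of-the-envelope calculation. The ~250 bytes per average transaction is my own assumption for illustration, not a figure from the paper:

```python
# Back-of-the-envelope: block size implied by a given on-chain throughput.
tps = 100                 # target transactions per second
tx_size_bytes = 250       # assumed average transaction size (illustrative)
block_interval_s = 600    # Bitcoin's ~10-minute block target

block_size_bytes = tps * tx_size_bytes * block_interval_s
print(f"Implied block size: ~{block_size_bytes / 1e6:.0f} MB")
```

Under those assumptions, 100 tx/s works out to roughly 15 MB blocks every 10 minutes, which gives a feel for why storage and bandwidth goals dominate the analysis.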