"Beautiful" Pattern: Bitcoin Will See a Colossal Rise ...

It is time to usher in a new phase of Bitcoin development - based not on crypto, hashing, and networking (that stuff's already done), but on clever refactorings of data structures in pursuit of massive, and perhaps unlimited, new forms of scaling.

Debates among devs are normal and important.
Debates between programmers are the epitome of decentralized development, and as such they are arguably the most important mechanism ensuring the ongoing success of Bitcoin (and of cryptocurrency projects generally).
Therefore, we would be wise to encourage such debates, rather than trying to make them go away by calling them "personal attacks".
In the real world, there aren't a whole lot of different ways to hammer a nail into a board or pour cement into a hole. But in the abstract world of mathematics and programming, there are many, many different ways to represent and manipulate a data structure, limited only by our imaginations - so it is actually appropriate to expect, and even demand, lots of jostling and critiquing from our programmers as they "try to invent a better mousetrap."
In fact, this is the kind of informal jockeying and shop talk that always has gone on and always will go on among mathematicians and programmers - and quite rightly so, because it is precisely the mechanism whereby they maintain order among their ranks, by making subtle and cogent observations about who knows what.
A famous example of this typical sort of jockeying and shop talk can be seen in the ongoing debates between programmers of the "procedural" / "object-oriented" school (C/C++, Java) and the "functional" school (Haskell, ML). It's always quite an eye-opener for a procedural programmer who's been writing "loops" all their life to finally discover higher-order functions such as "map" and "fold" in functional programming. Both approaches "accomplish" the same thing, of course - but in radically and subtly different ways, since in a functional language a function is a "first-class citizen" which can be passed around as an argument parameterizing another function, etc. - allowing much more compact and expressive (and sometimes even more efficient) code.
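To make the contrast concrete, here is a minimal sketch (Python chosen purely for brevity) of the same computation written both ways - an explicit loop versus a composition of first-class functions:

```python
from functools import reduce

# Procedural style: an explicit loop mutating an accumulator variable.
def sum_of_squares_loop(xs):
    total = 0
    for x in xs:
        total += x * x
    return total

# Functional style: the "loop body" is a first-class value (a lambda)
# handed to the higher-order function reduce, so it can be passed
# around, swapped out, and composed like any other argument.
def sum_of_squares_functional(xs):
    return reduce(lambda acc, x: acc + x * x, xs, 0)

print(sum_of_squares_loop([1, 2, 3]))        # 14
print(sum_of_squares_functional([1, 2, 3]))  # 14
```

Both produce the same result; the difference is that the functional version treats the traversal itself as data that can be parameterized.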
Different Bitcoin dev skill sets are required for different stages of Bitcoin's life cycle
An example of the debate between various devs can be seen here:
It is "clear that Greg Maxwell actually has a fairly superficial understanding of large swaths of computer science, information theory, physics and mathematics."- Dr. Peter Rizun (managing editor of the journal Ledger)
What Peter R is saying here is simply that a different skill set is needed to usefully contribute to Bitcoin development now that it has moved well beyond its "proof-of-concept and initial rollout" stages (hey, this thing actually works) and is now trying to move into its "massive scaling" stages (let's try to roll this thing out to millions or billions of people).
Bitcoin's "proof-of-concept and initial rollout" stages
Initially, during the "proof-of-concept and initial rollout" stages, the skill set required to be a "Bitcoin dev" merely involved knowing enough cryptography, hashing, networking, "game theory", rudimentary economics, and C/C++ programming to understand Satoshi's original vision and implementation. The work consisted of simple and obvious refactorings, cleanups, and optimizations that respected the overall design decisions captured in the original C/C++ code, while maintaining the brilliant "game theory" incentives baked therein. The most notable of these is the thing some mathematicians have taken to calling "Nakamoto Consensus" (a useful emerging mathematical-historical term along the lines of "Nash Equilibrium") - ie, Satoshi's brilliant cobbling-together of several existing concepts from crypto, hashing, game theory, and rudimentary economics to provide a good-enough solution to the long-standing Byzantine Generals Problem, which mathematicians and programmers had heretofore (for decades) considered to be unsolvable.
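As an illustrative sketch only - not Bitcoin's actual implementation - the proof-of-work search at the heart of Nakamoto Consensus looks roughly like this:

```python
import hashlib

def mine(header: bytes, difficulty_bits: int) -> int:
    """Toy proof-of-work: find a nonce whose double-SHA256 of
    (header + nonce) falls below a target.  Real Bitcoin hashes an
    80-byte block header against a compact-encoded target, but the
    principle - a brute-force lottery weighted by hashpower - is the same."""
    target = 2 ** (256 - difficulty_bits)  # fewer bits = easier puzzle
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# Expect on the order of 2**12 attempts at this toy difficulty.
nonce = mine(b"example-header", 12)
print(nonce)
```

The point of the cobbled-together incentives is that finding such a nonce is expensive, but verifying it takes a single hash - which is what lets honest nodes cheaply agree on the longest (most-work) chain.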
In particular, during the "proof-of-concept and initial rollout" stages, the crypto and hashing stuff was all pretty much finished: the elliptic-curve cryptography has been decided upon (and, by the way, Satoshi very carefully managed to pick one of the few elliptic curves whose parameters are not suspected of NSA influence), the various hashing algorithms (SHA-256, RIPEMD-160) are actually quite old, coming from previous work, and the recipe for combining them has been battle-tested and should work fine for the next few decades or so (assuming that practical quantum computing is probably not going to come along on that time scale).
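For instance, the battle-tested recipe layering those two old hash functions - RIPEMD-160 over SHA-256, commonly called "HASH160" - is what Bitcoin uses to derive addresses from public keys. A minimal sketch (note that ripemd160 availability depends on the OpenSSL build backing Python's hashlib):

```python
import hashlib

def hash160(data: bytes) -> bytes:
    """Bitcoin's address-hashing recipe: RIPEMD-160 applied to the
    SHA-256 of the input, yielding a compact 20-byte digest."""
    sha = hashlib.sha256(data).digest()
    ripe = hashlib.new("ripemd160")  # may be unavailable on some OpenSSL builds
    ripe.update(sha)
    return ripe.digest()

digest = hash160(b"some public key bytes")  # 20-byte result
print(digest.hex())
```

Nothing here is novel cryptography - it is exactly the kind of "already done" combining of old primitives the paragraph above describes.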
Similarly, during the "proof-of-concept and initial rollout" stages, the networking, incentives, and game theory were all pretty much finished: the way the mempool gets relayed; the way miners race to solve blocks while trying to minimize orphaning; and the incentives, provided currently mainly by the coinbase subsidy and to be provided much later (after more halvings and/or more increases in volume and price) mainly by transaction fees. This stuff has also been decided upon, and it is working well enough - within the parameters of our existing imperfect regulatory, economic, and networking landscape, where things such as ASIC chips, cheap electricity and cooling in China, and the Great Firewall of China have come to the fore as major factors driving decisions about who mines where.
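The coinbase subsidy schedule mentioned above is simple enough to state in a few lines - 50 BTC at genesis, halving every 210,000 blocks (roughly every four years) until it reaches zero:

```python
def block_subsidy(height: int) -> int:
    """Coinbase subsidy in satoshis at a given block height:
    50 BTC initially, halved every 210,000 blocks."""
    halvings = height // 210_000
    if halvings >= 64:  # shifting past 63 bits would zero it out anyway
        return 0
    return (50 * 100_000_000) >> halvings

print(block_subsidy(0))        # 5000000000 satoshis = 50 BTC
print(block_subsidy(210_000))  # 2500000000 satoshis = 25 BTC
print(block_subsidy(420_000))  # 1250000000 satoshis = 12.5 BTC
```

It is precisely because this schedule is fixed and winding down that transaction fees must eventually carry the incentive load - which is why the capacity question matters so much.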
Bitcoin's "massive scaling" stages
Now, as we attempt to enter the "massive scaling" stage, a different skill set is required. As I've outlined above, the crypto and the hashing and the incentives are all pretty much done now - and mining has become concentrated where it's most profitable, and we are actually starting to hit the "capacity ceiling" a few times (up till now just some spam attacks and stress tests - but soon, more worryingly, possibly even within the next few months, really hitting the capacity ceiling with "real" transactions).
Early scaling debates centered around blocksize
And so, for the past year, we've gone through the never-ending debates on scaling - most of them focusing up till now (perhaps rather naïvely, some have argued) on the notion of "maximum blocksize", which was set at 1 MB by Satoshi as a temporary anti-spam kludge.
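For a sense of scale, a back-of-envelope calculation (the average transaction size here is my own rough assumption) shows the throughput ceiling the 1 MB cap implies:

```python
# Rough, illustrative numbers - not consensus constants beyond the 1 MB cap.
MAX_BLOCK_BYTES = 1_000_000      # Satoshi's anti-spam limit
AVG_TX_BYTES = 250               # assumed average transaction size
BLOCK_INTERVAL_SECONDS = 600     # ten-minute target block interval

txs_per_block = MAX_BLOCK_BYTES // AVG_TX_BYTES
tps = txs_per_block / BLOCK_INTERVAL_SECONDS
print(txs_per_block)        # 4000 transactions per block
print(round(tps, 1))        # ~6.7 transactions per second
```

A ceiling in the single digits of transactions per second is the arithmetic backdrop against which the whole smallblock-versus-bigblock fight has played out.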
The smallblock proponents have been claiming that pretty much all "scaling solutions" based on simply increasing the maximum blocksize could have bad effects such as decreasing the number of nodes (decreasing this important type of decentralization) or increasing the number of orphans (decreasing profits for certain miners) - so they have been quite adamant in resisting any such proposals.
Meanwhile the bigblock proponents have been claiming that increased adoption (higher price and volume) should be more than enough to eventually offset / counteract any supposed decrease in node count and miner profits that might happen immediately after bigblocks would be rolled out.
For the most part, both sides appear to be arguing in good faith (with the possible exception of private companies hoping to be able to peddle future, for-profit "solutions" to the "problem" of artificially scarce level-one on-chain block space - eg, Blockstream's Lightning Network) - so the battles have raged on, the community has become divided, and investors are becoming hesitant.
New approaches transcending the blocksize debates
In this mathematical-historical context, it is important to understand the fundamental difference in approach taken by Peter__R. He is neither arguing for smallblocks nor for bigblocks nor for a level-2 solution. He is instead (with his recently released groundbreaking paper on Subchains - not to be confused with sidechains or treechains =) sidestepping and transcending those approaches to focus on an entirely different, heretofore largely unexplored approach to the problem - the novel concept of "nested subchains":
By nesting subchains, weak block confirmation times approaching the theoretical limits imposed by speed-of-light constraints would become possible with future technology improvements.
Now, this is a new paper, and it will still undergo a lot of peer review before we can be sure that it can deliver on what it promises. But at first glance, it is very promising - not least of all because it is attacking the whole problem of "scaling" from a new and possibly highly productive angle: not involving bigblocks or smallblocks or bolt-ons (LN) but instead examining the novel possibility of decomposing the monolithic "blocks" being appended to the "chain" into some sort of "substructures" ("subchains"), in the hopes that this may permit some sort of efficiencies and economies at the network relay level.
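As a toy illustration of the weak-block intuition underlying subchains - the thresholds and names below are mine, not the paper's - miners can share "weak" solutions that clear a much easier target than a full block, pre-propagating block contents before a full-difficulty solution is found:

```python
import hashlib

# Illustrative difficulty thresholds only; the paper's parameters differ.
STRONG_BITS = 16  # full-difficulty blocks appended to the chain
WEAK_BITS = 10    # "weak blocks" clear a much easier target

def pow_value(header: bytes, nonce: int) -> int:
    """Double-SHA256 of (header + nonce), interpreted as a big integer."""
    digest = hashlib.sha256(
        hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
    ).digest()
    return int.from_bytes(digest, "big")

def classify(header: bytes, nonce: int) -> str:
    v = pow_value(header, nonce)
    if v < 2 ** (256 - STRONG_BITS):
        return "strong"  # a real block: append it to the chain
    if v < 2 ** (256 - WEAK_BITS):
        return "weak"    # pre-announce contents so peers already hold them
    return "none"

# Weak solutions turn up far more often than strong ones, which is what
# allows block data to trickle out to the network ahead of time.
weak_count = sum(classify(b"hdr", n) == "weak" for n in range(20_000))
print(weak_count)
```

Because every strong solution also clears the weak target, by the time a full block is found most of its contents have (in this idealized picture) already been relayed - which is the proposed source of the efficiency gains at the network level.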
"Substructural refactoring"-based approaches
So what we are seeing here is essentially a different mathematical technique being applied, for the first time, to a different part of the problem in an attempt to provide a "massive scaling" solution for Bitcoin. (I'm not sure what to call this technique - but the name "substructural refactoring" is the first thing that comes to mind.)
While there had indeed been some sporadic discussions among existing devs along the lines of "weak blocks" and "subchains", this paper from Peter R is apparently the first time that anyone has made a comprehensive attempt to tie all the ideas together in a serious presentation including, in particular, detailed analysis of how subchains would dovetail with infrastructure (bandwidth and processing) constraints and miner incentives in order for this to actually work in practice.
Graphs reminiscent of elasticity and equilibrium graphs from economics
For example, if you skim through the PDF you'll see the kinds of graphs you often see in economics papers involving concepts such as elasticity and equilibrium and optimization (eg, a graph where there's a "gap" between two curves which we're hoping will decrease in size, or another graph where there's a descending curve and an ascending curve which intersect at some presumably optimum point).
Now, you can see from the vagueness of some of my arguments and illustrations above that I am by no means an expert in the mathematics and economics involved here, but am instead merely a curious bystander with only a hobbyist's understanding of these complex subjects (although a rather mature one at that, having worked most of my long and chequered career in math, programming, and finance).
But I am fairly confident that what we are seeing here is the emergence of a new sort of "skill set" which will be needed from the kind of Bitcoin developers who can lead us to a successful future where millions or billions of people (and perhaps also machines) are able to transact routinely and directly on the blockchain.
And if a developer like Peter R wants to direct some criticism at another developer who has failed to have these insights, I think that is a natural manifestation of human ego and competitiveness - and a healthy one, since it keeps these guys on their toes.
A new era of Bitcoin development
The time for tweaking the crypto and hashing is long past - which means that the skills of guys like nullc and petertodd may no longer be as important as they were in the past. (In fact, entirely separate objections can be raised against Peter Todd, given his proclivity for proving that he can, at the mathematical level, break systems which actually do work "good enough" by relying on constraints imposed at the "social level" - a level which PTodd evidently does not much believe in. For the most egregious example of this, see his decision to force his Opt-In (soon to become On-By-Default) Full RBF - which breaks existing "good-enough" risk-mitigation practices many businesses had up till now relied on to profitably use zero-conf for retail.)
Likewise the skills of adam3us may also not be as important as they were in the past: he is, after all, the guy who invented hashcash, so he is clearly a brilliant cryptographer and pioneer cypherpunk who laid the groundwork for what Bitcoin has become today, but it is unclear whether he now has (or ever had) the vision to appreciate how big (and fast) Bitcoin can become (at "level 1" - ie, directly on the blockchain itself).
In this regard, it is important to point out the serious lack of vision and optimism that nullc and petertodd and adam3us have shown on this question.
TL;DR: Times are a-changin'. The old dev skill sets for Bitcoin's early years (crypto, hashing, networking) are becoming less important, while new dev skill sets are becoming more important (such as something one might call "substructural refactoring"). We should encourage competition as new devs emerge who have these new skill sets, because they may be the way out of the "dead end" of the blocksize-based approaches to scaling, opening up massive and perhaps unlimited new forms of "fractal-like" scaling instead.
submitted by ydtm to btc
