Saturday, December 16, 2017

How much energy does bitcoin mining really use? It's complicated







This is more a measure of just how big this has all gotten, and it is certainly not over. My own back-of-the-envelope calculation tells me that the actual potential market size, which we are certainly working toward, implies a price point of around $1,000,000 per coin.

At that point it matches gold, and to meet real human needs it will still go much higher than this. So this story is not over. Energy consumption can indeed become a major force, but rising efficiency will also counter that. Bitcoin is quite capable of simply replacing the US dollar as the global reserve currency.


What it certainly does is put all national currencies on notice that they are imminently replaceable. That is really a good thing, because it finally strips the King of his power of the purse and credit. Add in the need to apply fractional banking to the natural community and we have a completely fresh economic dispensation.







How much energy does bitcoin mining really use? It's complicated



Bitcoin hype has reached an all-time high. But if running the bitcoin network uses up as much yearly electricity as a medium-sized country, is it worth it?



By NICOLE KOBIE



Saturday 2 December 2017




http://www.wired.co.uk/article/how-much-energy-does-bitcoin-mining-really-use




Bitcoin chews through masses of energy, but exactly how much is up for debate. Regardless of the actual number, it's climbing — so is the environmental cost of the digital currency becoming too high? In short, it’s complicated. So let’s look at the numbers…





This being bitcoin, the numbers are confusing and largely made up. Power consumption is one of the major costs of bitcoin mining, as dedicated machines crunch the algorithms that build a record of every single bitcoin transaction and are rewarded with tiny fractions of a bitcoin for their efforts. As mining gets more difficult, it requires increasingly powerful hardware to be competitive. As the value of the digital currency goes up — and it's skyrocketed this year — miners are more likely to invest in ever more sophisticated hardware. Back in 2009, you could mine competitively with your desktop computer, but now you'll need specialist hardware, such as the Antminer S9 – a dedicated mining rig that weighs six kilos and costs well over £1,000 before you even start thinking about the electricity bill.
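To give a flavour of what those machines are actually doing, here is a toy sketch in Python of the proof-of-work hashing loop at the heart of mining. The header bytes and target below are made-up illustrations; the real network hashes candidate block headers against a far smaller target, which is why specialist hardware matters.

```python
import hashlib
import struct

def mine(block_header: bytes, target: int, max_nonce: int = 2**32):
    """Toy proof-of-work loop: find a nonce so that the double-SHA-256 hash
    of (header + nonce) falls below the target."""
    for nonce in range(max_nonce):
        candidate = block_header + struct.pack("<I", nonce)
        digest = hashlib.sha256(hashlib.sha256(candidate).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
    return None

# An artificially easy target, so an ordinary laptop finds a solution quickly.
# The real network target is vastly smaller, which is why dedicated ASICs
# (and so much electricity) are needed to compete.
easy_target = 2 ** 240
print(mine(b"example block header", easy_target))
```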



That evolution, as well as the global spread of miners, makes it difficult to assess exactly how much energy is spent on the digital checks that underpin bitcoin, but there are plenty of people trying to get a handle on just how much power it's chewing through.



Why does energy consumption matter? Regardless of whether bitcoin is a bubble or not, we're investing heavily in infrastructure and burning through huge amounts of energy. If this is going to be a viable alternative financial system, it needs to be financially and environmentally sustainable. And if it never has a chance of being truly useful, and is just a get rich quick scheme, are we destroying the climate for something totally trivial?



Of course, if you own bitcoin, which has leapt in value from $1,000 earlier this year to above $10,000, even a fraction of a bitcoin is no longer a trivial amount of money. No matter how lucrative, is a currency experiment worth churning through oodles of energy for?



What are the estimates?



It's nigh on impossible to know exactly how much energy is being used, but cryptocurrency tracking site Digiconomist is the source of one oft-cited estimate. According to its Bitcoin Energy Consumption Index, the network of computers that verify bitcoin transactions draws 3.4 gigawatts (GW) — a single watt is a joule per second, and your laptop probably uses about 60W. That 3.4GW adds up to 30.1 terawatt hours (TWh) of energy per year — not a per-hour rate, but the total energy a continuous 3.4GW draw works out to over twelve months. In this case, that 30.1TWh is equivalent to the energy used by the entire nation of Morocco annually. Some dispute this figure. Fervently. Oscar Lafarga, co-founder of cryptocurrency consultancy and developer SetOcean, reckons the real answer is likely half as much. In Bitcoin Magazine, Marc Bevand suggests it's likely lower still, at between 470MW and 540MW.
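If you want to sanity-check that conversion yourself, it's simple arithmetic: a continuous draw in gigawatts times the hours in a year gives the yearly energy. A minimal sketch in Python, using the figures above:

```python
HOURS_PER_YEAR = 24 * 365  # 8,760

def gw_to_twh_per_year(gigawatts):
    """Continuous power draw in GW -> energy used over a year in TWh."""
    return gigawatts * HOURS_PER_YEAR / 1000  # 1 TWh = 1,000 GWh

print(gw_to_twh_per_year(3.4))  # ~29.8 TWh, in line with the 30.1TWh cited
print(gw_to_twh_per_year(0.5))  # ~4.4 TWh, around the low end of the estimates
```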



There are other figures, if those don't appeal. In 2014 a pair of Irish researchers published one of the first papers on this topic. Karl O'Dwyer and David Malone estimated the total power use of bitcoin would be somewhere between 100MW and 10GW, but settled on a figure in the middle, 3GW – comparable to their home country's consumption. Malone now pegs it at around 0.5GW, but also agrees with Digiconomist's overall estimate, because it's also within the realm of possibility. Others have picked different figures: in 2015, researcher Hass McCook pinned it at 120MW, while in 2016, a paper presented at the International Symposium on Computer Architecture said the power used by ASIC clouds, purpose-built datacentres of specialised mining equipment, alone was between 300MW and 500MW.



That's a lot of numbers (sorry, but it gets worse). There are plenty of other estimates, but the key point is they're all very different. The real range is probably somewhere between 100MW and 3.4GW. That's like guessing someone's age as between 15 and 65, while admitting there's a margin of error of ten years.


Let's start with timing. The moment you make your guess skews the figures, because the bitcoin network changes so quickly — there's always more activity and more processing power, though that's somewhat balanced by more efficient hardware. Harald Vranken, associate professor at the Netherlands' Open University, studied the energy draw of bitcoin earlier this year, positing that it was in the 100MW to 500MW range, versus Digiconomist's 3.4GW. "At first glance, it appears that these are quite different figures, however this is not the case," Vranken says, because when it comes to bitcoin, numbers that are an order of magnitude apart are actually kind of the same.





As he explained to WIRED, his numbers are for January of this year and since then the network hash rate — a measure of the bitcoin network's processing power, looking at how quickly it solves the equations that run the network — has leapt by a factor of 4.2. The revenue from mining in January was $716 million, while now it's $8 billion — a factor of 11.4. Feed those factors into Vranken’s equation and bitcoin’s energy draw is between 5GW and 7GW. That's more than Digiconomist's figure, but that methodology has other inputs. "Digiconomist furthermore considers that miners nowadays spend 60 per cent of their revenues on operational costs, which would mean that my figures now would be 3GW to 4.3 GW," he says, adding that means the Digiconomist figure "is in line with my figures."
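The last step of that adjustment is plain arithmetic. A quick check in Python, assuming the 5GW to 7GW range and the 60 per cent operational-cost share described above:

```python
# Vranken's scaled-up range, before the operational-cost adjustment.
raw_low_gw, raw_high_gw = 5.0, 7.0

# Assume roughly 60 per cent of mining revenue goes on operational costs,
# as Digiconomist does, so only that share maps onto electricity.
operational_share = 0.60

print(raw_low_gw * operational_share, raw_high_gw * operational_share)
# 3.0 4.2 -- roughly the "3GW to 4.3GW" quoted above
```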



So while those two figures look different, they're roughly the same. What a difference a year makes.



How to calculate power use



Another factor influencing these figures is methodology. There are a few pieces of information we know: how hard it is to solve the proof of work, how much energy various hardware uses, how much revenue miners stand to make, and how much energy is used by the entire world as a useful top-line figure. Using those pieces of the puzzle, we can attempt to fill in the rest.






For example, Vranken notes in his paper that in January power consumption could vary from 45MW when using ASIC hardware to 450TW when using standard CPUs — but we know the latter isn't likely. "Since the worldwide annual electricity consumption is about 2.3TW, it is clear that 450TW is completely unrealistic," he notes. Bitcoin is popular, but it hasn't actually taken over the world, yet.



That first Irish paper used a similar methodology that examined the types of hardware used, explains David Malone, one of the authors from Maynooth University. "In our paper, we estimated a range, with the top end based on everyone using old inefficient hardware and the bottom end based on everyone using new efficient hardware," Malone explains. "This gave us a range with Ireland's energy consumption somewhere in the middle. You can also try to get estimates by balancing the cost of electricity for mining against the value of mining, but the idea is very similar."
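Here is a rough sketch in Python of that bounding approach. The network hash rate and device figures below are illustrative assumptions for late 2017, not the numbers used in the paper:

```python
# Bound the network's power draw by imagining every hash came from one
# kind of hardware. All figures below are illustrative assumptions.
NETWORK_HASH_RATE = 12e18  # hashes per second, a rough late-2017 ballpark

def network_power_gw(device_hash_rate, device_power_w):
    """Total power if the whole network ran on this one device type."""
    devices = NETWORK_HASH_RATE / device_hash_rate
    return devices * device_power_w / 1e9  # watts -> gigawatts

# Upper bound: older, less efficient rigs. Lower bound: modern S9-class ASICs.
print(network_power_gw(device_hash_rate=4.7e12, device_power_w=1300))  # ~3.3 GW
print(network_power_gw(device_hash_rate=14e12, device_power_w=1400))   # ~1.2 GW
```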



Malone has actually reduced his estimate, saying that while it's hard to know exactly what hardware is being used, it's likely all professional grade at this point, which is much more efficient. "The difficulty [of mining] has also increased, but I reckon a significant portion of the increase in difficulty may have been counterbalanced by the increase in efficiency."



Digiconomist, meanwhile, works on the premise that miners spend a certain amount on operational costs, improving their hardware when prices go up, shifting from standard desktop PCs to GPUs then to specially designed ASIC machines. And that evolution in hardware can have a huge impact on the amount of power used.



"The index is based on the idea that more hashpower will be added as long as it's profitable to produce more," says Digiconomist founder Alex de Vries. "The costs are mainly electricity and capital equipment costs. It can be calculated that the lifetime electricity costs are then about 60 per cent of the total, based on past performance. This doesn’t mean costs are always 60 per cent, since that wouldn’t factor in production limits. It takes a few months for machines to be produced and installed. Costs are estimated at less than 20 per cent now by the index."






Tweak, correct or otherwise fiddle with any of the factors in the various equations, and the result changes — that's just how maths works, apparently. So it's no wonder we have such a wide range of estimates.



What's the cost of cash?



Bitcoin may well have merit above and beyond making miners rich, but compared to traditional payment systems — gold, cash, credit cards — is it an energy hog? The consumption range leaves bitcoin either much more expensive in terms of energy than existing transactional systems or much cheaper. Once again, it's how you pick your data.



To put these figures in some context, Digiconomist suggests Visa's payment systems use the energy equivalent of 50,000 US households to run 350 million transactions, while bitcoin uses the energy equivalent of 2.8 million US households to run 350,000 transactions on a good day — in short, Visa does more with less. As the site's rationale explains, bitcoin is increasingly becoming a tool for the rich, but we're all paying the price for a system that uses 20,000 times (give or take) more energy than traditional systems per transaction.



But in his paper, Vranken counters that in the 100MW to 500MW range, bitcoin mining requires between 0.8TWh and 4.4TWh per year, whereas the energy required for mining and recycling gold – which backs US currency – is 138TWh a year, while printing paper notes and minting coins takes 11TWh. He pins the banking system, including not only its data centres but also its branches and ATMs, at 650TWh. In other words, there's more to our traditional financial system than one brand of payment card. That said, he notes bitcoin is a much, much smaller system than cash and traditional banking, but as bitcoin scales up, so does the energy required for mining.



Using a Visa card may well be less of an energy suck than bitcoin, but in a way that point is moot — we still have both, and will for the foreseeable future, no matter how successfully bitcoin goes mainstream. You're likely using them in tandem, such as selling off bitcoin to earn the dollars to pay off your Visa bill.



Is bitcoin worth it?



Whether researchers choose the high end or low end of the energy consumption range largely seems to depend on what they think of the currency itself. Digiconomist founder de Vries has a long list of criticisms regarding sustainability, so his number trends a bit higher. His critics are bitcoin fans, so they push the consumption guess down to suggest it's not a wasteful activity – Bevand notes that, at his figures, mining eats up around 4TWh annually, roughly a third less than the energy used by Christmas lights in the US.



Regardless of how much energy bitcoin chews through now, those figures are helpful as a baseline, as its consumption is going to increase. Bitcoin's proof of work gets harder to solve as time goes on and returns fewer coins – go back to Vranken's maths at the beginning, and that's the increase in power consumption over less than a year, despite massively more efficient hardware. The system works by rewarding miners for computation, so they keep on computing.
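The "returns fewer coins" part is baked into the protocol: the block subsidy halves every 210,000 blocks, roughly every four years. A minimal sketch:

```python
# The block subsidy halves every 210,000 blocks (roughly every four years),
# so the same amount of mining work returns fewer new coins over time.
def block_subsidy(block_height):
    halvings = block_height // 210_000
    return 50.0 / (2 ** halvings)

for height in (0, 210_000, 420_000, 630_000):
    print(height, block_subsidy(height))
# 0 -> 50.0, 210000 -> 25.0, 420000 -> 12.5, 630000 -> 6.25
```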



Is there another way? Aside from pushing for more efficient hardware, there are other "proof" techniques that are less demanding, though they may introduce security concerns. Proof of stake is the most frequently mooted solution: it uses a less demanding system to prove ownership of coins and dole them out via a raffle-like scheme, Vranken says. There's also proof of space, which he explains sees the miner use a specified amount of memory to compute the proof. And there's proof-of-space-time, which adds in a temporal element, but at this point that sounds a bit like he's trolling us all.
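To make the raffle analogy concrete, here is a toy sketch in Python of stake-weighted selection. Real proof-of-stake protocols wrap far more machinery (randomness sources, penalties for misbehaviour) around this basic idea, and the names and figures below are made up:

```python
import random

# Toy "raffle": the chance of producing the next block is proportional to
# the stake each participant holds. Names and stakes are illustrative.
stakes = {"alice": 40.0, "bob": 35.0, "carol": 25.0}

def pick_validator(stakes, rng):
    holders = list(stakes)
    weights = list(stakes.values())
    return rng.choices(holders, weights=weights, k=1)[0]

rng = random.Random(42)
counts = {name: 0 for name in stakes}
for _ in range(10_000):
    counts[pick_validator(stakes, rng)] += 1
print(counts)  # roughly 40% / 35% / 25% of the picks
```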



Bitcoin-style currencies might get more efficient, but don't expect them to get any easier to understand.
