There have been many claims in recent years that bitcoin and the miners securing the network via SHA-256 proof of work consume an unconscionable amount of energy. But what data are these claims based on? Are the calculations behind them built on flawed or sound approaches and assumptions? How much electrical power does the network draw, and how much electrical energy has the Bitcoin network used historically?
Methodologies & Misconceptions
Due to the vast, globally distributed topology of the Bitcoin network, the amount of electrical power and energy that miners consume isn't exactly verifiable; instead, it must be estimated. Amid the energy consumption hysteria [(1) (2) (3)] of the past few years, a surprisingly large number of reputable sources have weighed in and attempted to estimate the Bitcoin network's energy consumption in more level-headed and data-derived ways:
- University of Cambridge, Judge Business School (JBS)
- The International Energy Agency (IEA)
- Electric Power Research Institute (EPRI)
- Coin Center
- Marc Bevand
- Hass McCook
- Alex de Vries
Estimation methodologies fall into two major categories: economics-based approaches rooted in financial assumptions, and physics-based procedures grounded in engineering principles. These two estimation approaches were thoroughly compared and contrasted at BTC2019. Some estimations also use a hybrid of both techniques.
It’s important to understand when digesting all of these estimations that electrical consumption is typically measured in two ways: instantaneously (power: watts, kilowatts, etc.) and as that instantaneous power integrated over a period of time (energy: joules, kilowatt-hours, etc.).
Economics-based approaches that estimate Bitcoin network energy consumption generally assume perfectly rational market behavior, and can easily be manipulated with a few input variable misassumptions.
In theory, the Bitcoin mining industry is rational, profit maximizing and perfectly competitive: mining marginal revenue should tend to equal marginal cost (MR = MC). Meaning, on long enough time horizons, the market should find an equilibrium, where the cost of energy consumed in a unit of bitcoin’s production would be roughly equivalent to the unit’s market value at the time of minting. This calculation process can be distilled as, “how much can Bitcoin network miners afford to spend on electricity?”
Typically, these types of estimations are too dependent on a single volatile variable: the market exchange price of bitcoin. Below is a quick simplified example of this type of estimation:
- [MR] = [MC]
- [(blocks/day) * (BTC/block) * ($/BTC)] = [(kWh/day) * ($/kWh)]
- [(blocks/day) * (BTC/block) * ($/BTC)] / ($/kWh) = (kWh/day)
Let’s try this estimation. Bitcoin blocks are generated roughly every 10 minutes: a rate of 6 per hour, or 144 every day. Currently, a single bitcoin block carries a coinbase subsidy of 6.25 BTC; that’s 37.5 BTC per hour, or 900 newly minted bitcoin rewarded to miners daily. At bitcoin’s current market exchange price of about $10,750, that is roughly $9,675,000 earned per day that bitcoin miners have available to spend on electricity.
- [(144 blocks/day) * (6.25 BTC/block) * ($10,750/BTC)] / ($0.10/kWh) = 96.75 GWh/day
This amount of daily energy equates to roughly 35.3 TWh of yearly usage that the bitcoin miners could afford to consume for an entire year, if we take a snapshot today and assume a constant bitcoin price for a year and U.S. average electrical costs.
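The MR = MC arithmetic above can be sketched in a few lines of Python. The price and rate inputs are the snapshot assumptions from the text, not live data:

```python
# Economics-based (MR = MC) estimate, using the snapshot figures from the text:
# 144 blocks/day, 6.25 BTC subsidy, ~$10,750/BTC, $0.10/kWh U.S. average rate.
BLOCKS_PER_DAY = 144
SUBSIDY_BTC = 6.25
BTC_PRICE_USD = 10_750            # market price snapshot (assumption)
ENERGY_COST_USD_PER_KWH = 0.10    # assumed average electricity cost

# Daily miner revenue is the ceiling on what miners can spend on electricity.
daily_revenue_usd = BLOCKS_PER_DAY * SUBSIDY_BTC * BTC_PRICE_USD
daily_energy_kwh = daily_revenue_usd / ENERGY_COST_USD_PER_KWH
yearly_energy_twh = daily_energy_kwh * 365 / 1e9   # kWh/day -> TWh/year

print(f"{daily_energy_kwh / 1e6:.2f} GWh/day")     # 96.75 GWh/day
print(f"{yearly_energy_twh:.1f} TWh/year")         # ~35.3 TWh/year
```

Changing either input constant immediately moves the result, which is exactly the sensitivity problem discussed next.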
While this method is over-reliant on bitcoin price, it is also heavily dependent on the assumed electrical energy cost for miners. The calculations and conclusions of this kind of estimate can be drastically different or even manipulated depending on the assumptions used as inputs: energy costs ($/kWh) and the price of bitcoin ($/BTC).
Here we used the average U.S. electrical cost of $0.10/kWh. However, in the U.S., electrical costs actually vary seasonally, from state to state, city to city, and in some cases neighborhood to neighborhood. Global electrical costs show the same incongruity. This doesn’t even account for wide-ranging industrial, commercial and residential electrical energy rates, adding even more sources of error to these economics-based estimation techniques. And, in fact, this calculation’s heavy energy price dependence has yet another flaw: some ingenious miners have near-zero fuel cost because they harvest excess, otherwise wasted, inaccessible or curtailed energy sources.
This quick exercise highlights why this type of economics-based estimation approach is a gross oversimplification fraught with the following issues:
- Bitcoin mining, network hash rate and therefore network energy consumption aren’t as responsive to sudden price movements as these economics-based estimation methods imply.
- The economics-based model claims energy usage is cut in half along with network miner rewards after each bitcoin block reward halving, which occurs every 210,000 blocks or about every 4 years, while difficulty and proof-of-work data disprove this.
- This type of model assumes a single average global energy cost ($/kWh); electrical energy costs vary widely by region, season, and even by energy source.
- Likely to be an upper-bound estimation.
Physics-based network energy estimation approaches, on the other hand, tend to be a very rigorous type of “running the numbers” the bitcoin community is accustomed to.
These methods use independently verifiable on-chain difficulty, proof-of-work data and original equipment manufacturer (OEM)-published heat rate specifications to more accurately estimate historical energy inputs into the bitcoin mining system. The physics estimation attempt may best be described as a “bitcoin stoichiometric ratio unit analysis calculation:”
- Bitcoin Difficulty (unitless) → Bitcoin Hash Rate (daily average TH/s)
- Daily Average Hash Rate (TH/s) → Yearly Hashes (TH/Year)
- Yearly Hashes (TH/Year) * Yearly Hash Heat Rate (Joules/TH) = (J/Year)
- Energy/Year (J/Year) → (kWh/Year) → (TWh/Year) → (ktoe/Year)
So, let’s try out this style of estimation using bitcoin proof of work difficulty data and OEM-published data. Bitcoin network difficulty self-adjusts once every 2,016 blocks, or roughly once every two-week period. This difficulty adjustment is to compensate for block production speed discrepancies and, thus, network hash rate fluctuations.
This difficulty and proof-of-work relationship allows us to derive an estimate for network hash rate based on the block production rate and the associated difficulty level. From the amount of work done at the various difficulty levels over the previous decade, we can roughly estimate the number of SHA-256 hashes computed per year on the Bitcoin network, shown below in terahashes per year (TH/Year), where one terahash is a trillion hashes. We can do this same exercise at daily granularity as well (spoiler: keep reading).
Bitcoin is on pace to have roughly 3,934 yottahash computed on the network during the year 2020, or about 3,934 septillion hashes.
“Yotta” is the largest SI prefix to date (10²⁴); “septillion” is the corresponding number name.
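The difficulty-to-hash-rate step can be sketched using the standard relationship that each unit of difficulty represents roughly 2³² expected hashes per block. The difficulty value below is an illustrative round number in the 2020 range, not the article's exact dataset:

```python
# Derive average network hash rate from difficulty.
# Standard relationship: expected hashes per block ~= difficulty * 2^32.
def hashrate_ths(difficulty: float, block_time_s: float = 600.0) -> float:
    """Average network hash rate in TH/s implied by a difficulty level."""
    hashes_per_block = difficulty * 2**32
    return hashes_per_block / block_time_s / 1e12   # H/s -> TH/s

# Illustrative difficulty of ~17 trillion (roughly the 2020 range; assumption).
d = 17e12
ths = hashrate_ths(d)                    # ~1.2e8 TH/s, i.e. ~120 EH/s
yearly_th = ths * 86_400 * 365           # TH/s -> TH/year
print(f"{ths / 1e6:.0f} EH/s, {yearly_th:.3e} TH/year")
```

Held for a full year, that illustrative difficulty yields roughly 3.8e15 TH, i.e. on the order of the ~3,900 yottahash figure quoted above.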
Now that we have an estimate of the number of hashes per year, we must next compile mining rig efficiency data over the past 11 years to understand how much energy would have been required to produce that amount of work.
Here it is important to understand the different types of mining equipment that have provided work toward the Bitcoin blockchain over the years. Each era and year has had distinctly different proof-of-work efficiency characteristics, which change the network’s energy consumption values over time.
From the humble beginnings of the Bitcoin genesis block being built by work derived from CPUs (central processing units), to blocks eventually being constructed with GPUs (graphics processing units), then on to FPGAs (field programmable gate arrays), and finally ASICs (application specific integrated circuits), the bitcoin network has evolved at a stunning pace.
Important note: efficiency is defined as useful work performed per unit of energy expended (terahashes per joule, TH/J). However, ASIC original equipment manufacturers typically cite a type of heat rate specification, the inverse of efficiency, showing energy expended per unit of useful work (joules per terahash, J/TH).
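The inversion, and why heat rate is so convenient, can be shown in a few lines: because a watt is a joule per second, a rig's power draw is simply its hash rate times its heat rate. The figures below are illustrative assumptions, not specific OEM specs:

```python
# Heat rate (J/TH) is the reciprocal of efficiency (TH/J), and it maps
# directly onto power draw: watts = (TH/s) * (J/TH), since W = J/s.
heat_rate_j_per_th = 50.0                    # hypothetical ASIC-class figure
efficiency_th_per_j = 1.0 / heat_rate_j_per_th   # 0.02 TH/J

rig_hashrate_ths = 100.0                     # hypothetical rig speed, TH/s
rig_power_w = rig_hashrate_ths * heat_rate_j_per_th
print(f"{rig_power_w:.0f} W")                # 5000 W draw
```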
As you can see in the log scale chart below, over the past eight years, bitcoin mining ASIC heat rates have been steadily marching lower every year, meaning network efficiency has increased.
Translating this data into an average yearly heat rate shows a similar steep decline over the entire history of bitcoin mining. CPU, GPU and FPGA benchmarks, along with published OEM power usage data, were used to estimate the 2009–2012 network average heat rate. ASIC miners announced in 2020 were visualized above and below to show the continued decrease in heat rate, but they were excluded from these energy estimations as they are not yet publicly available.
So now that we have compiled all of the necessary data (yearly hashes and yearly hash heat rate), let’s combine them via an engineer’s attempt at bitcoin mining energy stoichiometry:
- Yearly Hashes (TH/Year) * Yearly Heat Rate (Joules/TH) = (J/Year)
- Energy per Year (J/Year) → (kWh/Year) → (TWh/Year)
Simply multiply the yearly work completed (TH/Year) by the estimated yearly heat rate (Joules/TH) for miners on the system, and you arrive at a Joules/Year estimate. We then convert from Joules/Year to kWh/Year (a kWh is equal to 3.6 megajoules), and those yearly energy estimates are charted below.
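Those two conversion steps can be run end to end. The yearly hash count is the 2020 figure from the text; the blended network heat rate is an assumed value chosen to land near the estimate quoted later, not a published statistic:

```python
# Physics-based yearly energy: yearly hashes (TH) * average heat rate (J/TH),
# then J -> kWh -> TWh. 1 kWh = 3.6e6 J.
YEARLY_HASHES_TH = 3.934e15        # ~3,934 yottahash expressed in TH (from text)
AVG_HEAT_RATE_J_PER_TH = 36.8      # assumed blended network heat rate

joules_per_year = YEARLY_HASHES_TH * AVG_HEAT_RATE_J_PER_TH
kwh_per_year = joules_per_year / 3.6e6     # joules -> kilowatt-hours
twh_per_year = kwh_per_year / 1e9          # kWh -> TWh
print(f"{twh_per_year:.1f} TWh/year")      # ~40.2 TWh/year
```

With these inputs the chain reproduces a figure in the ~40 TWh/year range of the physics-based estimate.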
However, this physics-based estimation method also has some issues:
- The quantity of active miners by level of efficiency isn’t known, and this physics-based model assumes equal participation from all miner models available on the market by year released.
- This physics-based model used stair-stepped yearly heat rate data as an input. That yearly data abruptly changes on the 1st of each year; a gradual heat rate decline is more realistic, as older miners steadily retire and new ones fire up.
- It assumes old miners retire after a year of usage, which is also unlikely, as equipment life cycles now range two or more years.
- Likely to be a lower-bound type of estimation.
Where do these yearly energy consumption estimations fall among the previously cited calculation attempts? Interestingly enough, even with all of the shortcomings discussed above, our two calculation attempts, the economics-based estimation (35.3 TWh) and the physics-based estimation (40.17 TWh), are very similar in value. They also fall within the range of a variety of other popular estimations from noteworthy individuals, entities and institutions shown in the chart below. That all of these estimations are fairly similar in magnitude lends credibility to the various estimators as well as to the wide variety of methodologies and assumptions used.
Noteworthy below: it appears that bitcoin hash rate (EH/s) is beginning to decouple from the general yearly energy (TWh/Yr) estimation trend. This may be due to the decreasing heat rate of SHA-256 ASIC mining equipment if the estimate is physics based, or due to the halving and price stagnation if the estimate is economics based.
The chart above shows yearly energy estimation snapshots in TWh/Year along a timeline of publication dates, but a few of these sources [University of Cambridge (C-BECI) & Alex de Vries (D-BECI)] actually publish these yearly estimates on a daily graph going back a few years. This gets back to the earlier energy vs. power discussion: logic should prevent plotting yearly energy estimations on a daily axis. Regardless, I thought it would be worth comparing these published interval estimates with our above methodologies using more continuous time series data going back to late 2017, near the date of the previous market exchange price maximum. Our economic and physics calculations, the Cambridge estimates, and Digiconomist's results are all fairly similar in magnitude over time, again adding some peer review and validity to these different estimation techniques.
Our above estimation methodologies appear to align nicely with the other daily interval yearly energy estimates, so all of these time series estimates were averaged together to create a sort of Composite Bitcoin Energy Index (CBEI), shown below in TWh/Year. Each of these estimations has different assumptions and varying levels and sources of inaccuracy, so their composite may have a smoothing effect and could be more accurate. This combined estimation (CBEI) has just recently retested the 60 TWh/year threshold for bitcoin network energy consumption.
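The compositing step is just a point-by-point mean across the aligned series. The series names and values below are placeholders for illustration, not the actual published index data:

```python
# Build a composite index by averaging aligned daily estimate series
# (each list is TWh/Year values on matching dates; placeholder numbers).
def composite(estimates):
    """Element-wise mean across aligned estimate series."""
    series = list(estimates.values())
    return [sum(vals) / len(vals) for vals in zip(*series)]

cbei = composite({
    "economic": [35.3, 36.0],
    "physics":  [40.2, 41.0],
    "c_beci":   [55.0, 58.0],
    "d_beci":   [70.0, 71.0],
})
print(cbei)   # ~[50.1, 51.5]
```

A weighted mean (by each estimator's presumed accuracy) would be a natural refinement, but a plain average keeps every methodology on equal footing.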
How does this composite energy estimation index compare to Bitcoin network hash rate over time? The CBEI shows a decoupling of hash rate and energy around early 2019, with hash rate continuing to rise and energy consumption staying relatively steady as ASIC heat rates and bitcoin mining incentives have shrunk.
Interestingly, many of these bitcoin estimations are extrapolated to an entire year and presented as an energy value in TWh/Year without supporting time data or evidence. Daily network power estimations would be much preferable to yearly energy consumption estimations plotted on a daily chart. So, I took the liberty of converting these daily interval yearly energy estimates into a daily power usage estimation to correct for all of our chart crimes and electrical power and energy misconceptions.
I present the Composite Bitcoin Power Index (CBPI), built by compiling and converting estimates from Digiconomist (D-BECI and D-BECI-Minimum), Cambridge (C-BECI, C-BECI-Maximum, and C-BECI-Minimum), as well as our above economics- and physics-based estimates. This composite estimates the Bitcoin network's instantaneous electrical draw, expressed in watts, the SI unit of electrical power. The CBPI crested in mid-2019 at nearly 7.58 GW, or about 6 DeLorean time machines running at a full 1.21 gigawatts. (jigawatts?)
Energy values this large are difficult to digest, especially in a yearly context, so let’s put the Composite Bitcoin Energy Index (CBEI) estimation of 60 TWh/year in perspective by comparing it with some thorough estimations previously completed by Hass McCook:
- 650 TWh/Year on the banking system
- 200 TWh/Year on gold mining
- 75 TWh/Year on PC & console gaming
- 60 TWh/Year on bitcoin mining (CBEI)
- 11 TWh/Year on paper currency and coin minting
- 7 TWh/Year on Christmas lights in the US
Based on our estimations above, the Bitcoin network consumes roughly 40–60 TWh/year: just about 0.15% of global yearly electricity generation (26,700 TWh) and only about 0.024% of global total energy production (14,421,151 ktoe).
A ktoe is also a unit of energy: a kilotonne of oil equivalent, equal to about 11.63 GWh.
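Those two percentages follow directly from the figures above; using the low end of the 40–60 TWh range as an assumption:

```python
# Bitcoin's share of global electricity and total energy, using the
# article's figures: 26,700 TWh generation; 14,421,151 ktoe production;
# 1 ktoe = 11.63 GWh.
btc_twh = 40.0                                   # low end of the range (assumption)
global_electricity_twh = 26_700
global_energy_twh = 14_421_151 * 11.63 / 1_000   # ktoe -> GWh -> TWh

share_electricity = btc_twh / global_electricity_twh * 100
share_energy = btc_twh / global_energy_twh * 100
print(f"{share_electricity:.2f}% of electricity generation")  # ~0.15%
print(f"{share_energy:.3f}% of total energy production")      # ~0.024%
```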
So, bitcoin energy consumption today is only a tiny portion of what many consider a significant civilization-level problem: ever-increasing human energy consumption. Check out interesting solutions to this problem outlined a century ago by Nikola Tesla. As recently as last month (September 2020), a study claimed nearly 76% of the bitcoin network is powered by clean energy sources. Also, remember that once Einstein discovered mass-energy equivalence and humanity harnessed the energy embedded in the atom, energy for the advancement of mankind became materially abundant.