Bitcoin Uses a Lot of Energy, But Gold Mining Uses More
Much has been written about the large amount of electricity required to mine Bitcoin. There is no doubt that the amount of power consumed by Bitcoin mining is large in absolute terms, but it’s worth putting into perspective. Bitcoin isn’t a country, so its energy usage really shouldn’t be compared to the power usage of a country or countries. Since BTC and physical gold are both stores of value, it makes more sense to compare the energy usage and costs of digital gold mining to physical gold mining.
Estimating Bitcoin’s energy usage
Bitcoin is a decentralized network with millions of mining units and no structure for reporting power consumption. As a result, it is not possible to be sure precisely how much power it consumes. There are, however, two rough methods for estimating power consumption: the "top-down" approach and the "bottom-up" approach.
The top-down approach estimates power consumption from block rewards. Every time a new block is created, 12.5 Bitcoins are awarded to the miner. Assuming 10-minute intervals between blocks and a price of about $7,500 per Bitcoin, miners earn about $562,500 every hour (12.5 x $7,500 x 6). If miners spend 30% of that $562,500 per hour on electricity, how much power does it buy? Electricity prices vary greatly, but we can assume about $100 per MWh for the cost of power to the ultimate end-user. That works out to about 1,688 MWh purchased each hour, or roughly 1,688 MW of continuous power supporting the Bitcoin blockchain.
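The top-down arithmetic can be sketched in a few lines of Python, using only the assumptions stated above (12.5 BTC reward, roughly $7,500 per BTC, 30% of revenue spent on power at $100/MWh):

```python
# Top-down estimate of Bitcoin's power draw from miner revenue.
# Assumptions from the text: 12.5 BTC block reward, ~$7,500/BTC,
# 6 blocks/hour, 30% of revenue spent on power at $100/MWh.
btc_per_block = 12.5
usd_per_btc = 7_500
blocks_per_hour = 6

hourly_revenue = btc_per_block * usd_per_btc * blocks_per_hour  # $562,500/hour
electricity_spend = hourly_revenue * 0.30                       # $168,750/hour
power_mw = electricity_spend / 100   # MWh bought per hour, i.e. ~MW of draw

print(f"${hourly_revenue:,.0f}/hour revenue, ~{power_mw:,.0f} MW of power")
```

Note that MWh bought per hour is numerically the same as MW of continuous draw, which is why the two units are interchangeable here.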
Using the bottom-up approach, we look at the network hash rate, which can be inferred from the difficulty of each block. If we assume that all miners are using the most efficient hardware, we can derive a lower bound on electricity consumption. The latest Antminer S9 is rated at about 14.5 terahashes per second. The total Bitcoin network hash rate is about 50 exahashes per second (exa denotes one quintillion, or 10^18).
That means the equivalent of about 3.45 million Antminer S9 units is required to support the Bitcoin blockchain. The S9 is rated to consume 1,650 watts, so 3.45 million S9s would draw about 5,693 MW. Starting from 5,693 MW and assuming that older mining hardware is about 25% less efficient than the latest S9s, roughly 7,000 MW are currently required to support the blockchain.
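The bottom-up lower bound follows the same pattern, again using only the figures given above (50 EH/s network rate, S9 at 14.5 TH/s and 1,650 W, a 25% inefficiency adjustment for older hardware):

```python
# Bottom-up lower bound on Bitcoin's power draw from the hash rate.
# Assumptions from the text: ~50 EH/s network hash rate, Antminer S9
# at 14.5 TH/s drawing 1,650 W, older hardware ~25% less efficient.
network_hashrate = 50e18   # hashes per second (50 exahashes)
s9_hashrate = 14.5e12      # hashes per second (14.5 terahashes)
s9_watts = 1_650

units = network_hashrate / s9_hashrate   # ~3.45 million S9 equivalents
power_mw = units * s9_watts / 1e6        # ~5,690 MW if every unit were an S9
adjusted_mw = power_mw * 1.25            # ~7,100 MW allowing for older rigs

print(f"~{units/1e6:.2f}M S9 equivalents, ~{adjusted_mw:,.0f} MW adjusted")
```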
Averaging the 1,688 MW top-down estimate with the 7,000 MW bottom-up estimate gives a total Bitcoin network power draw of about 4,344 MW.
So how much is 4,344 MW? Ravenswood, the largest power plant in New York City, is rated at 2,480 MW of baseload power, so about 1.75 Ravenswoods. Widely used GE wind turbines are rated at a minimum of 1.7 MW each, so about 2,555 of those. At $100 per MWh, 4,344 MW running around the clock implies a total annual electricity cost of about $3.8 billion (4,344 MW x 8,760 hours x $100/MWh).
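Averaging the two estimates and annualizing the electricity bill is a one-liner each; note that recomputing the bill from these exact assumptions gives a figure closer to $3.8 billion:

```python
# Average the two power estimates and annualize the electricity cost.
top_down_mw = 1_688
bottom_up_mw = 7_000
avg_mw = (top_down_mw + bottom_up_mw) / 2       # 4,344 MW

hours_per_year = 8_760
usd_per_mwh = 100
annual_cost = avg_mw * hours_per_year * usd_per_mwh

print(f"~{avg_mw:,.0f} MW, ~${annual_cost/1e9:.1f} billion per year")
```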
How does that compare to gold?
Physical gold mining is far more resource-intensive than Bitcoin mining. While there are ample statistics on the global gold mining industry, such “top-down” data does not include information on resource utilization. Estimating global resource utilization “bottom-up” would require analyzing the financial statements of every significant publicly reporting gold mining company, which is prohibitively time-consuming. As a rough approximation, I have projected data from Barrick Gold Corporation (the world’s largest gold mining company) onto the global gold mining industry.
In 2017, Barrick produced 5.3 million oz. of gold. Page 32 of Barrick’s 2017 annual report discloses a gold production cost of $794/oz., or $4.2 billion in total. Each year approximately 88 million oz. of gold are mined worldwide. Assuming Barrick’s $794/oz. cost, it costs about $70 billion per year to mine the world’s gold production. While Barrick does not directly disclose its energy usage, there are clues in its annual report. Most of Barrick’s direct energy costs are diesel fuel. Page 40 of its 2017 annual report shows $346 million of 2018 future crude oil exposure at $78/bbl. Using current forward prices of $65/bbl, that comes to $288 million per year of energy costs, or $54/oz. in oil costs, for Barrick.
If we assume the world gold industry has a similar cost structure, we can estimate $4.8 billion in annual energy costs for the gold mining industry. Barrick, however, has a world-class portfolio of low-cost mines, so let’s assume that the global cost average is 25% higher than Barrick’s. That would imply $6.0 billion of energy costs and $87.3 billion of total annual costs.
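The projection from Barrick to the global industry can be checked in a few lines, using the disclosed figures above ($794/oz. production cost, $288 million of oil costs over 5.3 million oz., 88 million oz. mined worldwide, a 25% uplift over Barrick's costs):

```python
# Project Barrick's cost structure onto worldwide gold production.
# Assumptions from the text: $794/oz production cost, Barrick's
# $288M of oil costs over 5.3M oz, 88M oz mined globally per year,
# and global average costs 25% above Barrick's.
global_oz = 88e6
cost_per_oz = 794
oil_cost_per_oz = 288e6 / 5.3e6   # ~$54/oz in diesel/oil costs

total_cost = global_oz * cost_per_oz * 1.25        # ~$87.3 billion
energy_cost = global_oz * oil_cost_per_oz * 1.25   # ~$6.0 billion

print(f"~${energy_cost/1e9:.1f}B energy, ~${total_cost/1e9:.1f}B total")
```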
To summarize, gold mining’s $6.0 billion of direct energy spending buys about 92 million barrels of oil annually ($6.0 billion / $65 per bbl). The world consumes about 34 billion barrels annually, so gold mining’s direct energy costs (diesel fuel) represent about 0.27% of worldwide oil consumption. Bitcoin, in comparison, uses about 0.07% of worldwide electricity-generating capacity.
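The two percentage comparisons follow directly from the figures already derived ($6.0 billion of oil at $65/bbl against 34 billion barrels of world consumption; 4,344 MW against roughly 6,142,000 MW of world generating capacity):

```python
# Express each store of value's energy footprint as a share of the
# relevant worldwide total, using the figures derived in the text.
gold_barrels = 6.0e9 / 65          # ~92 million barrels of oil per year
gold_share = gold_barrels / 34e9   # share of world oil consumption

btc_share = 4_344 / 6_142_000      # share of world generating capacity

print(f"gold: {gold_share:.2%} of oil, bitcoin: {btc_share:.2%} of capacity")
```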
One last thing. Ecuador, the country that Bitcoin’s energy usage is most often compared with, has a total installed capacity of 8,070 MW. The total installed electricity-generating capacity worldwide is approximately 6,142,000 MW, so Ecuador has about 0.13% of worldwide capacity. The Bitcoin network’s estimated 4,344 MW draw is a bit more than half of Ecuador’s installed capacity, so that comparison isn’t unreasonable, but Bitcoin mining is still far more efficient than gold mining. I think the world would be a better place if we mined less gold and increased our adoption of Bitcoin as a store of value.
Vladimir Jelisavcic is an investor in Panda Analytics Inc., which has created a Proof-of-Work Index Fund. The calculations in this article are based on the latest available figures, but there is inherent uncertainty in estimating the energy usage of Bitcoin and gold mining, as not all necessary data are available.