How Does a “Hash Rate” Differ from “Network Difficulty”?

Hash rate is the total combined computational power miners devote to securing a Proof-of-Work network, measured in hashes per second. Network difficulty is a protocol-defined measure of how hard it is to find a valid block hash, and the protocol adjusts it automatically.
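The two quantities are linked through the expected number of hashes needed per block. As a minimal sketch, assuming Bitcoin's convention that difficulty 1 corresponds to roughly 2^32 hashes per block, the expected block time follows directly from hash rate and difficulty (the function name and the numeric figures below are illustrative assumptions, not real protocol values):

```python
# Illustrative relationship between hash rate, difficulty, and block time,
# using Bitcoin's convention that difficulty 1 corresponds to ~2^32 hashes
# per block on average. Helper name and sample numbers are hypothetical.

def expected_seconds_per_block(network_hashrate_hps: float, difficulty: float) -> float:
    """Average time to find a block at the given total hash rate and difficulty."""
    hashes_per_block = difficulty * 2**32   # expected hashes needed for one valid block
    return hashes_per_block / network_hashrate_hps

if __name__ == "__main__":
    hashrate = 600e18        # assumed total network hash rate: 600 EH/s
    difficulty = 83e12       # assumed difficulty value
    # Lands near the 600-second (10-minute) target for these inputs.
    print(f"{expected_seconds_per_block(hashrate, difficulty):.0f} s per block")
```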

The difficulty adjusts so that the average time between blocks stays close to a fixed target (e.g., 10 minutes for Bitcoin), regardless of fluctuations in the total hash rate.

A rising total hash rate therefore leads to a higher difficulty setting, and a falling hash rate to a lower one, as the sketch below illustrates.
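As a simplified sketch of Bitcoin-style retargeting: every 2016 blocks, the difficulty is rescaled so the next window should again take about two weeks, with each adjustment clamped to a factor of 4. The real client works on the compact "bits" target rather than the difficulty value directly, so this is an approximation for illustration:

```python
# Simplified sketch of Bitcoin-style difficulty retargeting. Every 2016 blocks,
# difficulty is rescaled toward the two-week window target, with the measured
# timespan clamped so no single adjustment exceeds a factor of 4.

RETARGET_INTERVAL = 2016                                  # blocks per adjustment window
TARGET_BLOCK_TIME = 600                                   # seconds (10 minutes)
EXPECTED_WINDOW = RETARGET_INTERVAL * TARGET_BLOCK_TIME   # ~2 weeks in seconds

def retarget(old_difficulty: float, actual_window_seconds: float) -> float:
    """Return the new difficulty after one 2016-block window."""
    # Clamp the measured timespan to limit each step to at most 4x up or down.
    span = min(max(actual_window_seconds, EXPECTED_WINDOW / 4), EXPECTED_WINDOW * 4)
    # Blocks arrived too fast -> span < expected -> difficulty rises, and vice versa.
    return old_difficulty * EXPECTED_WINDOW / span

if __name__ == "__main__":
    # Example: if the last 2016 blocks took one week instead of two,
    # the difficulty roughly doubles.
    print(retarget(100.0, EXPECTED_WINDOW / 2))   # -> ~200.0
```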

