Zero-knowledge proofs often seem mysterious because the math behind them is very advanced. Many people jokingly call this “moon math,” since it feels like something magical or from another world.

We want to remove this confusion and help everyone understand what zero-knowledge proofs actually do. Even though they still feel magical, we believe the community should understand the technical ideas behind our work.

In this post, we introduce an important building block used in many zero-knowledge proof systems: polynomial commitment schemes. After that, we explain KZG, one of the most popular and practical types of polynomial commitment schemes.

We then describe how KZG is used inside zk-rollups, and how Ethereum also uses KZG in Proto-Danksharding. Finally, we show how zk-rollups and Proto-Danksharding can work together smoothly and efficiently — something that is possible specifically because both systems use polynomial commitment schemes.

Why are we talking about polynomials?

Polynomials are powerful mathematical tools because they allow us to represent large or complex objects efficiently. One common example is representing an n-dimensional vector of field elements v using a single polynomial. We do this by constructing a polynomial φ(x) that passes through the points (i, v_i) for every index i = 1, 2, …, n. For example, the 3-dimensional vector v = [2, 0, 6] can be encoded by the polynomial φ(x) = 4x² − 14x + 12 because plugging in the values gives φ(1) = 2, φ(2) = 0, and φ(3) = 6. In general, given any n points, there always exists a unique polynomial of degree at most n − 1 that passes through all of them. This is why we say that n points fully define a polynomial of degree up to n − 1.

The process of constructing a polynomial from a set of points is called polynomial interpolation.

We know that n points determine a unique polynomial of degree at most n − 1. One of the most widely used interpolation techniques is Lagrange interpolation: even though its general formula may look complicated, the underlying idea is straightforward, and it gives a direct way to build the polynomial that passes through all the given points.

For example, suppose we want to find a polynomial P of degree at most 2 (that is, a quadratic polynomial). To do this, we need exactly 3 points, and the polynomial must satisfy all three of those constraints. Using the coordinates of these 3 points, Lagrange interpolation will construct the exact polynomial that fits them perfectly.

To do that, we'll build 3 sub-polynomials (one per constraint) P₁, P₂, and P₃, each of degree at most 2.

These 3 sub-polynomials must follow these sub-constraints:

| x | P₁(x) | P₂(x) | P₃(x) |
|---|-------|-------|-------|
| 1 | 2     | 0     | 0     |
| 2 | 0     | 0     | 0     |
| 3 | 0     | 0     | 6     |

Each sub-polynomial evaluates to 0 at all constraint points but one.

Finally, we set:

P(x) = P₁(x) + P₂(x) + P₃(x)

Let's quickly check, referring to the table above, that P respects all its constraints:

P(1) = P₁(1) + P₂(1) + P₃(1) = 2 + 0 + 0 = 2
P(2) = P₁(2) + P₂(2) + P₃(2) = 0 + 0 + 0 = 0
P(3) = P₁(3) + P₂(3) + P₃(3) = 0 + 0 + 6 = 6

We just verified that P respects its 3 constraints. Now, let's build P₁, P₂, and P₃.

Using the factorised form, we can define P₁ as:

P₁(x) = A(x − 2)(x − 3)

P₁ already satisfies constraints:

P₁(2) = P₁(3) = 0

Now let's find A such that:

P₁(1) = 2

We have the following equation:

P₁(1) = A(1 − 2)(1 − 3) = 2A = 2

Which gives us:

A = 1

And finally:

P₁(x) = (x − 2)(x − 3)

P₁ now satisfies its 3 sub-constraints.

Using the factorised form, we can define P₂ as:

P₂(x) = B(x − 1)(x − 3)

P₂ already satisfies constraints:

P₂(1) = P₂(3) = 0

Now let's find B such that:

P₂(2) = 0

We have the following equation:

P₂(2) = B(2 − 1)(2 − 3) = −B = 0

Which gives us:

B = 0

And finally:

P₂(x) = 0(x − 1)(x − 3) = 0

P₂ now satisfies its 3 sub-constraints.

Using the factorised form, we can define P₃ as:

P₃(x) = C(x − 1)(x − 2)

P₃ already satisfies constraints:

P₃(1) = P₃(2) = 0

Now let's find C such that:

P₃(3) = 6

We have the following equation:

P₃(3) = C(3 − 1)(3 − 2) = 2C = 6

Which gives us:

C = 3

And finally:

P₃(x) = 3(x − 1)(x − 2)

P₃ now satisfies its 3 sub-constraints.

Checking P

Summing the three sub-polynomials gives P(x) = (x − 2)(x − 3) + 0 + 3(x − 1)(x − 2) = 4x² − 14x + 12. Let's quickly check that P does respect the 3 constraints:

P(1) = 4(1²) - 14(1) + 12 = 4 - 14 + 12 = 2
P(2) = 4(2²) - 14(2) + 12 = 16 - 28 + 12 = 0
P(3) = 4(3²) - 14(3) + 12 = 36 - 48 + 12 = 6
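The interpolation we just did by hand can also be carried out in a few lines of code. Below is a minimal Python sketch (the function name is ours, chosen for illustration) that uses exact rational arithmetic from the standard library and rebuilds the polynomial from the three example points:

```python
from fractions import Fraction

def lagrange_interpolate(points):
    """Return the coefficients (lowest degree first) of the unique
    polynomial of degree < len(points) through all the given points."""
    n = len(points)
    coeffs = [Fraction(0)] * n
    for i, (xi, yi) in enumerate(points):
        # Build the basis polynomial that is 1 at x_i and 0 at every
        # other x_j -- the same "sub-polynomial" trick as above.
        basis = [Fraction(1)]
        denom = Fraction(1)
        for j, (xj, _) in enumerate(points):
            if j == i:
                continue
            # Multiply basis by (x - x_j)
            basis = [Fraction(0)] + basis
            for k in range(len(basis) - 1):
                basis[k] -= xj * basis[k + 1]
            denom *= (xi - xj)
        for k, bk in enumerate(basis):
            coeffs[k] += yi * bk / denom  # scale so it hits (x_i, y_i)
    return coeffs

print(lagrange_interpolate([(1, 2), (2, 0), (3, 6)]))
```

Running this on the points (1, 2), (2, 0), (3, 6) returns the coefficients 12, −14, and 4, matching φ(x) = 4x² − 14x + 12 from earlier.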

What are polynomial commitment schemes, and why are they useful?

Polynomial commitment schemes are special tools that let someone commit to an entire polynomial without showing what the polynomial actually is. They work similarly to normal commitment schemes, where a person commits to a message and later reveals it. A good commitment scheme must be binding, meaning you cannot change the message after committing, and hiding, meaning no one can learn the message from the commitment alone. Polynomial commitments follow the same rules, but instead of committing to one message, you commit to a whole polynomial with many coefficients.

The powerful part of polynomial commitments is that you can later prove the value of the polynomial at specific points without revealing the entire polynomial. For example, if someone wants to prove that their secret polynomial ϕ(x) has the value 66 at x = 4, they can do so without exposing the rest of the polynomial. They simply give a short proof that shows “ϕ(4) = 66” is true, and the verifier can check it using the earlier commitment. The verifier learns nothing else about the polynomial itself. This feature is incredibly useful in zero-knowledge proofs, where the goal is to prove something is true without revealing extra information.

Another reason polynomial commitments are useful is that the commitment is much smaller than the polynomial. A polynomial may have hundreds or thousands of values, but the commitment can be just a single small group element, like 48 bytes. This is extremely important for blockchains, where storing or posting large data is expensive. By compressing a large polynomial into one tiny commitment, systems like zk-rollups and Proto-Danksharding can save a lot of space and reduce costs.

To make this easier to understand, imagine Alice has a secret polynomial, such as ϕ(x) = 3x² + 5x + 2. She does not want to reveal the polynomial, but Bob wants proof that ϕ(4) = 66. Alice commits to the polynomial using a polynomial commitment scheme and sends Bob the commitment. Later, she reveals only the value 66 and provides a short proof showing that this value is correct for x = 4. Bob checks the proof against the commitment and becomes convinced, without ever learning anything else about the polynomial. This is why polynomial commitments are powerful tools in modern cryptography and essential for scalable blockchain systems.

KZG Polynomial Commitment

1. Commitment

In a polynomial commitment, the prover first turns their data into a polynomial. Then they create a special cryptographic “commitment” to this polynomial. You can think of this commitment like sealing the polynomial inside a locked box: the prover cannot change it later, and the verifier cannot see what is inside. This step guarantees two things — the polynomial cannot be altered (binding) and its contents stay private (hiding).

2. Evaluation

Next, the prover wants to prove something about the polynomial without revealing it. So they pick a point x, plug it into the polynomial, and calculate the answer y=P(x). They send only this value y, plus a small proof that shows the value came from the committed polynomial. The actual polynomial stays hidden the whole time. This allows the prover to show the polynomial behaves correctly at one point without exposing the entire polynomial.

3. Verification

Finally, the verifier checks if the value y really matches the committed polynomial at point x. Using the commitment and the proof, the verifier performs a cryptographic check. If everything is valid, the verifier knows P(x)=y must be true — even though they never saw the polynomial itself. If the prover tries to lie or cheat, the verification step will catch it. This makes polynomial commitments secure and very useful in technologies like zero-knowledge proofs and Ethereum scaling.

KZG Polynomial Commitment Scheme

KZG has four main steps.

Step 1 - Trusted Setup

A one-time setup done before the system runs.

  1. Choose a generator g of an elliptic-curve group G that supports pairings.
  2. Choose the maximum degree l of the polynomial.
  3. Pick a secret random value τ ∈ Fp.
  4. Compute and publish:

(g, g^τ, g^(τ²), …, g^(τ^l))

Only these powers of g are public. The value τ must remain secret forever: if someone knows τ, they can forge proofs.

Step 2 - Commit to a Polynomial

Suppose we have the polynomial:

ϕ(x) = ∑ᵢ₌₀ˡ ϕᵢ xⁱ

We want to compute the commitment:

C = g^{ϕ(τ)}

Although the committer cannot compute g^{ϕ(τ)} directly, since he doesn't know τ, he can compute it from the published setup powers:

C = ∏ᵢ₌₀ˡ (g^{τⁱ})^{ϕᵢ} = g^{∑ ϕᵢ τⁱ} = g^{ϕ(τ)}
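To make the "compute g^{ϕ(τ)} without knowing τ" step concrete, here is a toy Python sketch over a small multiplicative subgroup of the integers mod 23, standing in for the elliptic-curve group. These tiny parameters are for illustration only and offer no security:

```python
# Toy illustration of the KZG commitment computation. A small
# multiplicative subgroup of Z_23* stands in for the pairing-friendly
# elliptic-curve group used by real KZG (e.g. BLS12-381).
p = 23    # modulus: g = 2 generates a subgroup of order q = 11 in Z_23*
q = 11    # group order; polynomial coefficients live in F_q
g = 2

# Trusted setup: publish (g, g^tau, g^(tau^2)) for a secret tau.
tau = 7   # secret -- destroyed after the ceremony in practice
setup = [pow(g, pow(tau, i, q), p) for i in range(3)]

# Commit to phi(x) = 2 + 3x + x^2 WITHOUT using tau:
# C = prod_i (g^(tau^i))^(phi_i) = g^(phi(tau))
phi = [2, 3, 1]
C = 1
for coeff, power in zip(phi, setup):
    C = (C * pow(power, coeff, p)) % p

# Sanity check (only possible here because we kept tau around):
phi_tau = sum(c * pow(tau, i, q) for i, c in enumerate(phi)) % q
assert C == pow(g, phi_tau, p)   # C is indeed g^(phi(tau))
```

The committer only ever multiplies the published setup powers, yet ends up with exactly g^{ϕ(τ)}.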

Step 3 - Create a Proof for Evaluation ϕ(a)=b

To prove that:

ϕ(a)=b

Compute the quotient polynomial:

q(x) = (ϕ(x) − b) / (x − a)

By the factor theorem, (x − a) divides ϕ(x) − b with no remainder exactly when ϕ(a) = b, so a valid quotient polynomial exists only if the evaluation is correct.

Then compute the proof:

π = g^{q(τ)}

This is the KZG evaluation proof.
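The quotient can be computed with ordinary synthetic division. Here is a small Python sketch (the function name is ours; shown over the integers rather than a finite field for readability):

```python
# Computing the quotient polynomial q(x) = (phi(x) - b) / (x - a)
# with synthetic division. Coefficients are lists, lowest degree first.
def quotient_poly(phi, a, b):
    """Divide phi(x) - b by (x - a). Returns (quotient, remainder);
    by the factor theorem the remainder is zero exactly when
    phi(a) == b, so an honest prover always gets a clean quotient."""
    coeffs = list(phi)
    coeffs[0] -= b
    quot = [0] * (len(coeffs) - 1)
    rem = coeffs[-1]              # start from the leading coefficient
    for i in range(len(coeffs) - 2, -1, -1):
        quot[i] = rem
        rem = coeffs[i] + rem * a
    return quot, rem

# phi(x) = 2 + 3x + x^2, so phi(4) = 30
q, r = quotient_poly([2, 3, 1], 4, 30)
print(q, r)    # [7, 1] 0  -> q(x) = x + 7 with no remainder

# A false claim phi(4) = 31 leaves a nonzero remainder:
_, r_bad = quotient_poly([2, 3, 1], 4, 31)
print(r_bad)   # -1: no valid quotient, hence no valid proof
```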

Step 4 - Verification

Given the commitment C, the claimed evaluation ϕ(a) = b, and the proof π, the verifier checks:

e(C/g^b, g) = e(π, g^τ/g^a)

Here e(⋅, ⋅) is a bilinear pairing.

This equation is equivalent to verifying:

q(τ) = (ϕ(τ) − b) / (τ − a)

If the pairing check holds, the evaluation is accepted as correct.

Short explanation of the pairing check

LHS: e(C/g^b, g) = e(g, g)^{ϕ(τ)−b}, since C = g^{ϕ(τ)}.

RHS: e(π, g^τ/g^a) = e(g, g)^{q(τ)·(τ−a)}, since π = g^{q(τ)}.

Equality implies ϕ(τ) − b = q(τ)·(τ − a), which is the quotient identity q(x) = (ϕ(x) − b)/(x − a) evaluated at x = τ; this enforces ϕ(a) = b whenever q(x) was correctly formed.
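Since bilinear pairings aren't available in plain Python, the sketch below simulates the check by comparing the two sides directly in a tiny subgroup of Z₂₃*, using τ openly. This is purely illustrative: a real verifier never learns τ and relies on the pairing to do this comparison "in the exponent".

```python
# Simulating the pairing check in a toy subgroup of Z_23* (order 11,
# generator 2). Insecure toy parameters, for illustration only.
p, q, g = 23, 11, 2
tau = 7                      # the setup secret

def eval_poly(coeffs, x, mod):
    return sum(c * pow(x, i, mod) for i, c in enumerate(coeffs)) % mod

phi = [2, 3, 1]              # phi(x) = 2 + 3x + x^2
a, b = 4, 30 % q             # claim: phi(4) = 30, i.e. 8 in F_11
qx  = [7, 1]                 # quotient (phi(x) - b)/(x - a) = x + 7

C  = pow(g, eval_poly(phi, tau, q), p)   # commitment g^phi(tau)
pi = pow(g, eval_poly(qx, tau, q), p)    # proof      g^q(tau)

# e(C/g^b, g) = e(pi, g^tau/g^a) reduces to comparing exponents:
#   phi(tau) - b == q(tau) * (tau - a)
lhs = (C * pow(pow(g, b, p), -1, p)) % p   # g^(phi(tau) - b)
rhs = pow(pi, (tau - a) % q, p)            # g^(q(tau) * (tau - a))
assert lhs == rhs                          # the claim is accepted
```

If the prover had claimed a wrong value b, the quotient identity would fail and the two sides would disagree.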

Use Cases:

  1. zk-rollups

In zk-rollups, we must prove that the work done on Layer 2 (L2) is correct. To do this, all the steps of the computation are converted into a big table (a 2D matrix) during a process called witness generation. Each column of this table represents one part of the computation, and each column can be turned into a polynomial. So instead of handling a huge matrix directly, we work with a list of polynomials, and the correctness of the computation can be described using mathematical rules between them. For example, imagine three columns of the table correspond to polynomials a(x), b(x), and c(x), and correctness requires:

a(x)⋅b(x)−c(x)=0

This means “the first polynomial multiplied by the second must equal the third.” It’s like saying: column 1 × column 2 must produce column 3.

Instead of checking this rule for every possible value of x (which would be slow), zk-rollups check it only at a few random points. If the rule is correct at those random points, then with very high probability, the whole computation is correct.
Example: If I want to check whether two long lists match a rule, I don’t need to compare every item—checking a few random items usually tells me if the entire relationship is valid.
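A minimal Python sketch of this random-point check, with toy polynomials and a toy field chosen for illustration:

```python
import random

# Checking the constraint a(x)*b(x) - c(x) = 0 at random points,
# instead of comparing all coefficients. Over a large field F_p, a
# wrong polynomial identity survives a random check with probability
# at most deg/p (the Schwartz-Zippel lemma), which is negligible.
p = 2**31 - 1                       # a large prime field

def eval_poly(coeffs, x):
    return sum(c * pow(x, i, p) for i, c in enumerate(coeffs)) % p

a = [1, 2]          # a(x) = 1 + 2x
b = [3, 4]          # b(x) = 3 + 4x
c = [3, 10, 8]      # c(x) = (1 + 2x)(3 + 4x) = 3 + 10x + 8x^2

for _ in range(5):
    z = random.randrange(p)
    assert (eval_poly(a, z) * eval_poly(b, z) - eval_poly(c, z)) % p == 0
```

If c(x) were tampered with in even one coefficient, a single random evaluation would almost certainly expose the mismatch.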

Polynomial commitment schemes—such as KZG—are perfect for this. The rollup commits to all the polynomials that describe the L2 computation (something like locking them inside a cryptographic box). Later, the verifier can ask for the values of these polynomials at specific random points. With those values and the commitments, the verifier checks whether the correctness rules hold. If they do, the verifier knows the entire computation is valid.

  2. Ethereum’s Proto-Danksharding

Proto-Danksharding (EIP-4844) is an upgrade designed to make it much cheaper for rollups to publish their data on Ethereum’s Layer 1. It introduces a new kind of transaction called a blob-carrying transaction. This type of transaction includes a large piece of data called a blob (around 128 kB). However, this blob is not accessible to smart contracts or the execution layer. Smart contracts can only see a commitment to the blob, not the blob itself.

Now the question is: How should Ethereum create this commitment to the blob?
One option is to take the blob and just hash it. But hashing is limited: if we only store a hash, then later we cannot prove anything about the data inside the blob without revealing the entire blob. This is too restrictive for Ethereum’s future scaling plans.

Instead, we can treat the blob as a polynomial. (Earlier, we explained how vectors or data can be represented as polynomials.) Using a polynomial commitment scheme such as KZG, Ethereum can commit to the blob in a way that not only hides the data, but also allows verification of certain properties without downloading the entire blob.

This capability is essential for something called Data Availability Sampling (DAS). DAS allows validators to check whether the blob is available and correct without downloading all 128 kB. Instead, validators download only tiny random pieces. Thanks to the math behind polynomial commitments, if enough random samples are correct, validators can be highly confident that the whole blob is available. (Although DAS is not included in the very first version of Proto-Danksharding, it will be added soon as Ethereum continues toward “full Danksharding.”)

Ethereum has chosen KZG as the polynomial commitment scheme for Proto-Danksharding and future sharding upgrades. Researchers compared several schemes, and concluded that KZG provides the best balance of efficiency, proof size, and simplicity for Ethereum’s roadmap in the short and medium term.

How zk-rollups and Ethereum’s Proto-Danksharding interact

zk-rollups and Ethereum’s Proto-Danksharding may seem like separate systems, but they both use KZG commitments in ways that allow them to work together smoothly. Scroll uses KZG to commit to the computations performed on Layer 2, while Ethereum uses KZG to commit to large data blobs posted on Layer 1. Surprisingly, these two uses can interact in a very elegant way.

When Scroll finishes processing a batch of L2 transactions and computes a new state root, it must publish three things on Ethereum’s L1:

  1. T — the list of L2 transactions,
  2. Sᵢ — the new state root after those transactions are applied,
  3. π — a proof that the new state root is correct.

Ethereum must verify two things: that the proof π is valid, and that the transaction list used inside the proof matches the transaction list T published in the blob.

Ethereum stores T as a data blob, which means the verifier only has access to a KZG commitment to that blob - let’s call this commitment Cₜ. Meanwhile, the proof π also contains KZG commitments to various polynomials used during the computation, including the polynomial that represents the transaction list. That polynomial has its own commitment inside the proof—call it Cₚ.

Now we have two different KZG commitments (Cₜ from the blob and Cₚ from the proof) that should represent the same polynomial ϕₜ (the polynomial representation of the transaction list). We need to check whether Cₜ and Cₚ really refer to the same data.

To do this, we use a technique called a proof of equivalence. The idea is simple:

  1. Pick a random-like value
    z = hash(Cₜ ∥ Cₚ). This makes z unpredictable and unique to these two commitments.
  2. Open both commitments at the point z and check that they yield the same value a.
    That is, prove that:
    • ϕₜ(z) = a under commitment Cₜ, and
    • ϕₜ(z) = a under commitment Cₚ.

If both commitments give the same value at the same random point, then with extremely high probability, they represent the same polynomial.

Example: Imagine two people, each claiming they have the same secret polynomial. Instead of revealing the whole polynomial, each evaluates it at a random point, say x = 103. If both evaluations match, the chance that they hold two different polynomials that coincidentally agree at that exact random point is astronomically small.
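A small Python sketch of the challenge derivation; the commitment byte-encodings and the field modulus are illustrative stand-ins, not Ethereum's actual parameters:

```python
import hashlib

# Sketch of deriving the challenge point z for a proof of equivalence.
p = 2**31 - 1   # toy field modulus (stand-in)

def challenge_point(c_t: bytes, c_p: bytes) -> int:
    """z = hash(C_t || C_p), reduced into the field, so neither party
    can choose z after seeing the commitments."""
    digest = hashlib.sha256(c_t + c_p).digest()
    return int.from_bytes(digest, "big") % p

z = challenge_point(b"C_t: blob commitment", b"C_p: proof commitment")
# Both commitments are then opened at z; they are accepted as committing
# to the same polynomial only if both openings return the same value.
```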

This same logic lets Ethereum verify that the transaction list used in the proof π is exactly the same as the transaction list published in the blob.

A nice bonus is that this equivalence check works even if the two commitments use different polynomial commitment schemes. For example, one could be KZG and the other FRI. As long as both commitments support opening at a point, the verifier can compare them, making this approach highly flexible.

Erasure Coding With Polynomials

Another powerful concept used both in KZG and PeerDAS is erasure coding. This technique allows data to be recovered even if parts of it are missing. Using polynomials, we can take a small set of original values and extend them into a longer set by evaluating the polynomial at additional points. The key property is that if the degree stays the same, the original data can be recovered from any subset of enough points. This is exactly how Ethereum’s data-availability sampling (DAS) works: nodes don’t need every piece of data, only a random sample that lets them reconstruct the original information with high probability.
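A toy Python sketch of this idea, reusing Lagrange interpolation (a real system works over a finite field and at far larger scale):

```python
from fractions import Fraction

def eval_lagrange(points, x):
    """Evaluate at x the unique lowest-degree polynomial through points."""
    total = Fraction(0)
    for i, (xi, yi) in enumerate(points):
        term = Fraction(yi)
        for j, (xj, _) in enumerate(points):
            if j != i:
                term *= Fraction(x - xj, xi - xj)
        total += term
    return total

# Original data: 3 values -> a polynomial of degree at most 2.
data = [(1, 2), (2, 0), (3, 6)]

# Extend: evaluate the same polynomial at extra points 4, 5, 6.
extended = data + [(x, eval_lagrange(data, x)) for x in (4, 5, 6)]

# "Lose" three of the six pieces; any 3 surviving points suffice,
# because 3 points pin down a degree-2 polynomial uniquely.
survivors = [extended[1], extended[3], extended[5]]
recovered = [eval_lagrange(survivors, x) for x in (1, 2, 3)]
assert recovered == [2, 0, 6]   # the original data is back
```

This is the property DAS exploits: any sufficiently large random sample of the extended data determines the whole blob.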

Why Are Polynomial Commitments Needed?

Using polynomials and erasure codes solves many problems, but creates a new one:
How do we prove that a piece of data truly belongs to the committed polynomial?

This is where KZG polynomial commitments come in: a KZG commitment allows a verifier to check, with a short constant-size proof, that a claimed piece of data really is an evaluation of the committed polynomial.

This property is essential for Ethereum’s scaling roadmap because validators must verify data availability efficiently without reading megabytes of blob data.

PeerDAS and Proto-Danksharding rely on polynomials to split data into columns, cells, and blobs. Each blob is treated as a polynomial whose evaluations make up the data. When the data is extended and arranged into columns, validators only receive small pieces, yet the polynomial structure ensures that all pieces are consistent. To verify this consistency without downloading entire blobs, Ethereum uses KZG proofs: each proof confirms that a specific cell corresponds to the polynomial committed in the block header.

Thanks to polynomial commitments, Ethereum can safely scale data availability while drastically reducing the load on individual nodes. This is one of the main reasons KZG was chosen for EIP-4844 and PeerDAS.

Next, we will take a closer look at how PeerDAS works, how KZG commitments play a key role in it, and how erasure coding allows the network to recover missing data.