How Powdr translates PIL to Plonky3, and what about stwo — 2
Following the first post, there are two tasks: generate the trace from PIL for stwo, and implement the FrameworkEval trait (which stwo uses to define constraints) for PowdrCircuit.

Let's talk about trace generation first. In stwo, the trace has the format shown in the last post:

ColumnVec<CircleEvaluation<SimdBackend, BaseField, BitReversedOrder>>

How does Powdr do this for Plonky3? The into_p3_field() call converts the witness into a Plonky3 field. For the Fibonacci example, the input witness looks like this:

Map { iter: Chain { a: Some(Iter([("Fibonacci::ISLAST", [GoldilocksField(0), GoldilocksField(0), GoldilocksField(0), GoldilocksField(1)])])), b: Some(Iter([("Fibonacci::y", [GoldilocksField(0), GoldilocksField(0), GoldilocksField(0), GoldilocksField(1)])])) } }

In the output RowMajorMatrix, the values become [0, 0, 0, 0, 0, 0, 1, 1]: the data is reorganized by rows, with the rows concatenated one after another. The trace is probably laid out this way because of how Plonky3 evaluates constraints, namely how the eval function in the AIR-related traits walks the trace. so I …
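The column-to-row reorganization described above can be sketched in a few lines. This is a simplified stand-in, not Powdr's actual code (the function name `columns_to_row_major` is ours): it interleaves named witness columns row by row, producing exactly the flat row-major layout of the matrix above.

```rust
// Hypothetical sketch of the reorganization, not Powdr's implementation:
// interleave witness columns row by row into one flat row-major vector.
fn columns_to_row_major(columns: &[(&str, Vec<u64>)]) -> Vec<u64> {
    let height = columns[0].1.len();
    let mut values = Vec::with_capacity(height * columns.len());
    for row in 0..height {
        for (_name, col) in columns {
            values.push(col[row]); // row-major: all columns of row i, then row i+1
        }
    }
    values
}

fn main() {
    // the Fibonacci witness columns from the debug print above
    let cols = vec![
        ("Fibonacci::ISLAST", vec![0, 0, 0, 1]),
        ("Fibonacci::y", vec![0, 0, 0, 1]),
    ];
    // rows [0,0], [0,0], [0,0], [1,1] concatenated
    assert_eq!(columns_to_row_major(&cols), vec![0, 0, 0, 0, 0, 0, 1, 1]);
}
```

Running this reproduces the [0, 0, 0, 0, 0, 0, 1, 1] layout seen in the RowMajorMatrix debug output.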
Powdr command list
Prove with stwo:

cargo run -r --features stwo pil test_data/pil/fibonacci.pil -o output -f --field m31 --prove-with stwo

Prove with Plonky3:

cargo run pil test_data/pil/fibonacci.pil -o output -f --field bb --prove-with plonky3

Hello-world example:

cargo run pil test_data/asm/book/hello_world.asm --inputs 0 -o output --field m31 --prove-with plonky3 -f …
How Powdr translates PIL to Plonky3 STARK — 1
In Powdr, the Plonky3 proving logic lives in the function prove_with_key, in stark.rs. Among the inputs of prove_with_key there is "air", which is what defines the constraints; Powdr passes &circuit to it, a PowdrCircuit, so PowdrCircuit presumably implements the Air trait. At this point I think I need to understand how Plonky3 defines an AIR, using this example to learn. In this example, the AIR is defined like this:

// Define your AIR constraint inputs by declaring a struct with the relevant inputs inside it.
pub struct FibonacciAir {
    pub num_steps: usize,  // number of steps to run the Fibonacci iteration
    pub final_value: u32,  // precomputed final result after that number of steps
}

// Define your execution trace row size.
impl<F: Field> BaseAir<F> for FibonacciAir {
    fn width(&self) -> usize {
        2 // row width for the Fibonacci program is 2
    }
}

// Define your …
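To see why an AIR wants a row-major trace, here is a toy sketch (not Plonky3's real API; the trait and all names are ours) of how an eval-style function walks the trace: it looks at a "local" row and the "next" row together, and in a row-major layout those two rows are contiguous slices.

```rust
// Toy stand-in for an AIR's eval loop; assumptions: 2-column Fibonacci trace,
// transition constraints a' = b and b' = a + b between consecutive rows.
struct RowMajorTrace {
    width: usize,
    values: Vec<u64>, // rows concatenated one after another
}

impl RowMajorTrace {
    fn row(&self, i: usize) -> &[u64] {
        &self.values[i * self.width..(i + 1) * self.width]
    }
    fn height(&self) -> usize {
        self.values.len() / self.width
    }
}

// Check the Fibonacci transition on every (local, next) row pair.
fn check_fib_transitions(trace: &RowMajorTrace) -> bool {
    (0..trace.height() - 1).all(|i| {
        let (local, next) = (trace.row(i), trace.row(i + 1));
        next[0] == local[1] && next[1] == local[0] + local[1]
    })
}

fn main() {
    let trace = RowMajorTrace {
        width: 2,
        values: vec![0, 1, 1, 1, 1, 2, 2, 3], // rows (0,1),(1,1),(1,2),(2,3)
    };
    assert!(check_fib_transitions(&trace));
}
```

The real Plonky3 eval receives row views through an AirBuilder rather than raw slices, but the access pattern, local row plus next row, is the same idea.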
stwo fibonacci
This is about this example in stwo. There are two parameters: FIB_SEQUENCE_LENGTH specifies the number of rounds of the Fibonacci sequence; LOG_N_INSTANCES, since stwo uses SIMD (a data-parallel data structure), specifies how many instances of the Fibonacci sequence are proved at the same time. The data are packed in chunks of N_LANES = 16 values each, so LOG_N_INSTANCES is at least 5, i.e. at least 32 instances (I don't know why it is not 16).

With FIB_SEQUENCE_LENGTH = 4 and LOG_N_INSTANCES = 5, running this test function and printing the input, I get:

[FibInput { a: PackedM31([1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]), b: PackedM31([0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15]) }, FibInput { a: PackedM31([1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]), b: PackedM31([16, 17, 18, 19, 20, 21, …
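The packing above can be sketched without stwo at all. This is a simplified stand-in for PackedM31 (plain u32 lanes, no Mersenne-31 arithmetic; the names `PackedU32` and `pack` are ours): 2^LOG_N_INSTANCES scalar inputs are grouped into lanes of 16, so each packed value carries 16 independent instances, which is exactly the shape of the FibInput debug print.

```rust
// Simplified sketch of SIMD lane packing; not stwo's PackedM31.
const N_LANES: usize = 16;

#[derive(Debug, PartialEq)]
struct PackedU32([u32; N_LANES]);

// Group a flat list of per-instance values into 16-lane chunks.
fn pack(values: &[u32]) -> Vec<PackedU32> {
    values
        .chunks_exact(N_LANES)
        .map(|c| PackedU32(c.try_into().unwrap()))
        .collect()
}

fn main() {
    let log_n_instances = 5; // 32 instances, as in the example above
    let b_values: Vec<u32> = (0..1u32 << log_n_instances).collect();
    let packed = pack(&b_values);
    assert_eq!(packed.len(), 2); // 32 instances / 16 lanes = 2 packed values
    assert_eq!(packed[0].0[..4], [0, 1, 2, 3]);    // first chunk: b = 0..15
    assert_eq!(packed[1].0[..4], [16, 17, 18, 19]); // second chunk: b = 16..31
}
```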
Stwo backend note2
Following note 1. PackedM31 is a vector of length 16. Trying to understand how the trace is generated: the bottom lines actually transform the M31 values into a CircleDomain.

From the start: N is log N instances. This code initialises the witness columns, later used to store the witness: it generates N BaseColumns, each of length 2^log_size, but each is represented in chunks of length 16, as PackedM31.

Constraints: in the code, the constraints are generated by the component. WideFibonacciComponent is a FrameworkComponent with a generic type C that implements FrameworkEval. FrameworkEval requires defining a function evaluate, which has a generic type E that implements EvalAtRow. WideFibonacciComponent is defined like this; FrameworkComponent is defined like this. When you see eval, remember it is a type that implements FrameworkEval, which does not mean it is a struct called FrameworkEval. FrameworkEval looks like this, it is a trait!!! EvalAtRow looks …
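The column-allocation shape described above can be sketched concretely. This is our own simplified stand-in for the BaseColumn allocation (no field arithmetic, plain u32 lanes): N columns, each holding 2^log_size values, stored as 2^(log_size - LOG_N_LANES) chunks of 16.

```rust
// Sketch only; assumptions ours, not stwo's BaseColumn implementation.
const LOG_N_LANES: u32 = 4; // 2^4 = 16 lanes per packed chunk

// Allocate n zeroed witness columns, each of length 2^log_size,
// represented as chunks of 16 (stand-ins for PackedM31).
fn init_columns(n: usize, log_size: u32) -> Vec<Vec<[u32; 16]>> {
    let n_chunks = 1usize << (log_size - LOG_N_LANES);
    vec![vec![[0u32; 16]; n_chunks]; n]
}

fn main() {
    let cols = init_columns(3, 6); // 3 columns of 2^6 = 64 values each
    assert_eq!(cols.len(), 3);
    assert_eq!(cols[0].len(), 4); // 64 values / 16 lanes = 4 packed chunks
}
```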
stwo backend note1
Powdr-side Backend trait: in Powdr, it has verify, prove, and export_verification_key. How does Plonky3 implement it? This is in mod.rs: there the Backend trait is implemented for Plonky3Prover, and its prove function calls prove in stark.rs, where Plonky3Prover itself is implemented. So this prove function in stark.rs should be similarly implemented in prover.rs for stwo.

Recall STARK: get a trace with the witness, and interpolate the witness to get a polynomial: the witness values are the evaluations on the domain (in circle STARK, the circle domain). Commit to the polynomial and do the low-degree extension; this step includes the FFT and inverse FFT. Define the constraints, boundary constraints and transition constraints: for instance, "the first element should be 1" means the committed polynomial satisfies \(f(w^0)=1\), which means \( f(x)-1 \) has a root at \(x=w^0\), so the polynomial must be divisible by \(x-w^0\).

STWO side, constraint related: there is a FrameworkComponent in stwo seems to …
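The boundary-constraint argument above can be checked numerically on a toy example. This sketch uses plain integers instead of a circle-STARK field (the helper `synthetic_div` is ours): if \(f(w^0)=1\), then dividing \(f(x)-1\) by \(x - w^0\) via synthetic division must leave remainder zero.

```rust
// Toy divisibility check for the boundary constraint; plain integer
// coefficients, not field elements. Coefficients are highest-degree first.
// Divides a polynomial by (x - r); returns (quotient, remainder).
fn synthetic_div(coeffs: &[i64], r: i64) -> (Vec<i64>, i64) {
    let mut q = Vec::with_capacity(coeffs.len());
    let mut acc = 0i64;
    for &c in coeffs {
        acc = acc * r + c; // Horner step doubles as synthetic division
        q.push(acc);
    }
    let rem = q.pop().unwrap(); // last accumulator value is the remainder
    (q, rem)
}

fn main() {
    // f(x) = x^2 + x - 1 with w^0 = 1: f(1) = 1, so f(x) - 1 = x^2 + x - 2
    let g = [1, 1, -2]; // f(x) - 1, highest degree first
    let (quotient, remainder) = synthetic_div(&g, 1);
    assert_eq!(remainder, 0);           // divisible by (x - 1), as claimed
    assert_eq!(quotient, vec![1, 2]);   // quotient is x + 2
}
```

The same Horner-style loop evaluates the polynomial at r, so remainder zero and \(f(w^0)=1\) are literally the same check.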