arkworks integration (#13031)

* fix

* fix

* fix

* fix

* fix

* fix

* fix

* fix

* fix

* fix

* fix

* fix

* fix

* fix

* fix compression

* fix compression

* fix compression

* fix compression

* fix compression

* fix std leak

* fix std leak

* fix std leak

* merge master

* merge master

* cargo update

* cargo update

* cargo update

* cargo update

* cargo update

* use serialize_result

* cargo update

* cargo update

* cargo update

* cargo update

* reduce boilerplate code

* remove host function muls

* reduce boilerplate code

* remove patches

* use correct ark-substrate branch

* reduce boilerplate code

* cleanup

* cleanup

* proper error handling

* derive serialize for error

* proper error handling

* proper error handling

* proper error handling

* derive Debug for PairingError

* sp-arkworks path

* cargo update

* adapt tests to error handling

* fix tests

* cargo update

* remove results

* deserialize as G2Affine

* cargo update

* add codec index to PairingError

* replace Vec<Vec<u8>>

* replace Vec<Vec<u8>>

* use into_iter for chunks

* use chunks for scalars

* fix ersialized_size

* use into

* collect as vec

* collect as vec

* no collect Vec

* use into_iter

* import AffineRepr

* fix typo

* cargo update

* new serialization

* fix typo

* unwrap results

* unwrap results

* use correct deserialization

* fix bugs, cleanup

* correct len

* vec without capacity

* Revert "vec without capacity"

This reverts commit 2b1cd004f9f3f7cb1b0513c794f9ea781bb75ef1.

* Revert "correct len"

This reverts commit b85de8606364260c310f3c306b0a920e184e7e53.

* Revert "fix bugs, cleanup"

This reverts commit eef4c77ac99c0ed2e4b4857702e6ab5f1d2ce36c.

* Revert "use correct deserialization"

This reverts commit 9eacba93150bd41614e198cc6f2838d57d14f8db.

* Revert "unwrap results"

This reverts commit b0df1e1bdbd2518baa23040e0c6663ca69d2ba25.

* Revert "unwrap results"

This reverts commit de3cfbd04964dd66faeae5616b5763b1d30520e2.

* Revert "fix typo"

This reverts commit c12045d78f2468800be30ee1b31b12768aa7a786.

* Revert "new serialization"

This reverts commit e56a088be7612e4511382817afaf61f65b0c3aca.

* Revert "cargo update"

This reverts commit 15898da94677a5f19290a7f15fb15cb4cbd8f431.

* Revert "fix typo"

This reverts commit c89e96331f1d07e3b9b6a00ea9c89896553d67c6.

* Revert "import AffineRepr"

This reverts commit 5a103ac1b3506736181ddda040d896930bd8f83a.

* Revert "use into_iter"

This reverts commit 2e31d912bd4103529b40b250410f9f5b1a980ce4.

* Revert "no collect Vec"

This reverts commit db18dcac34fc3c3ddc20c3b42331f8d5fa7014b5.

* Revert "collect as vec"

This reverts commit dd3f809e965cec361a0feaab9abfae7115756e2c.

* Revert "collect as vec"

This reverts commit 9167d5984d8ecc3903d24f96d8c9fcac45c87bf7.

* Revert "use into"

This reverts commit 344cfffbd38fde130225df35f36259872754bd3a.

* Revert "fix ersialized_size"

This reverts commit c6a760986551cbbcaa3748564dd5e3c7630209c6.

* Revert "use chunks for scalars"

This reverts commit 67987ae0bbba7e3963ccba0dd9f1fbaa4c922d4f.

* Revert "use into_iter for chunks"

This reverts commit 1ddd6b89c2f8fb4e6dd26768be0edaca2d1be3f9.

* Revert "replace Vec<Vec<u8>>"

This reverts commit 4d3b13c02a9db0ea6bd130bda38c851f2371ec6e.

* cargo update

* cargo update

* Revert "replace Vec<Vec<u8>>"

This reverts commit 4389714068d939abc97288c5b06ee23d399a19ad.

* cargo update

* add error

* add error

* add error

* fix typo

* fix imports

* import coded

* import codec

* import PairingError

* fix patches

* sp-arkworks

* sp-arkworks

* use random values for multiplications

* cargo update

* fix imports

* fix imports

* add host functions

* re-add mul impls

* cargo update

* cargo update

* cargo update

* cargo update

* cargo update

* cargo update

* cargo update

* PairingError -> ()

* remove PairingError

* cargo update

* cargo update

* cargo update

* reduce boilerplate code

* cargo update

* update comments

* cargo update

* optimize code quality

* use ark_scale (#13954)

* use ark_scale

* fix tests

* fix tests

* cleanup & comments

* use correct PR branch

* hazmat

* ed curves, use ArkScaleProjective

* Achimcc/arkworks integration remove affine hostcalls (#13971)

* remove affine host-calls

* remove affine host-call impls, also in tests

* cargo update

* ark-substrate: use main branch

* cargo update

* Achimcc/arkworks integration bandersnatch (#13977)

* use bandersnatch

* bandersnatch

* add bandersnatch sw msm

* use correct PR branch

* cargo update

* cargo update

* fix tests

* cleanup

* cleanup

* fix tests

* refactor tests

* cargo update

* cargo update

* cargo update

* refactor tests

* cleanup & update tests

* upgrade arkworks/algebra

* cargo update

* adapt tests

* versioning ark-substrate

* cargo update

* remove patched deps

* bump ark-scale

* use crates-io deps

* fix doc comments

* Cargo.toml, linebreaks at end

* reorganize tests

* sp-arkworks -> sp-crypto-ec-utils

* move host functions to crypto-ec-utils

* fmt

* remove sp-ec-crypto-utils from io

* remove unwrap from te msm

* remove elliptic_curves references in test

* elliptic_curves references in test

* update doc comments

* remove warn missing docs

* fmt

* cargo update

* update doc comments

* cargo update

* cargo update, bump arkworks, codec versions

* bump runtime version in sp-crypto-ec-utils

* remove feature flag ec-utils-experimental

* crypto-ec-utils -> crypto/ec-utils

* tests/ -> test-data/

* update doc comments for signatures

* update comments

* update doc comments for signatures

* fix doc comments

* fix doc comments

* fix doc comments

* fix doc comments

* fix doc comments

* cleanup

* fix doc comments

* cargo update

* fix doc comments

* cargo update
Author: Achim Schneider
Date: 2023-06-06 12:23:07 +02:00
Committed by: GitHub
Parent: 35cc4162df
Commit: 03490d5dd0
14 changed files with 1530 additions and 285 deletions
@@ -0,0 +1,130 @@
// This file is part of Substrate.
// Copyright (C) Parity Technologies (UK) Ltd.
// SPDX-License-Identifier: Apache-2.0
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
//! Generic implementations of the operations on arkworks elliptic curves,
//! which get instantiated by the corresponding concrete curves.
use ark_ec::{
	pairing::{MillerLoopOutput, Pairing, PairingOutput},
	short_weierstrass,
	short_weierstrass::SWCurveConfig,
	twisted_edwards,
	twisted_edwards::TECurveConfig,
	CurveConfig, VariableBaseMSM,
};
use ark_scale::hazmat::ArkScaleProjective;
use ark_std::vec::Vec;
use codec::{Decode, Encode};

const HOST_CALL: ark_scale::Usage = ark_scale::HOST_CALL;
type ArkScale<T> = ark_scale::ArkScale<T, HOST_CALL>;

/// Compute a multi Miller loop over SCALE-encoded vectors of `G1` and `G2` points.
pub(crate) fn multi_miller_loop_generic<Curve: Pairing>(
	g1: Vec<u8>,
	g2: Vec<u8>,
) -> Result<Vec<u8>, ()> {
	let g1 = <ArkScale<Vec<<Curve as Pairing>::G1Affine>> as Decode>::decode(&mut g1.as_slice())
		.map_err(|_| ())?;
	let g2 = <ArkScale<Vec<<Curve as Pairing>::G2Affine>> as Decode>::decode(&mut g2.as_slice())
		.map_err(|_| ())?;

	let result = Curve::multi_miller_loop(g1.0, g2.0).0;

	let result: ArkScale<<Curve as Pairing>::TargetField> = result.into();
	Ok(result.encode())
}

/// Compute the final exponentiation of a SCALE-encoded Miller loop output.
pub(crate) fn final_exponentiation_generic<Curve: Pairing>(target: Vec<u8>) -> Result<Vec<u8>, ()> {
	let target =
		<ArkScale<<Curve as Pairing>::TargetField> as Decode>::decode(&mut target.as_slice())
			.map_err(|_| ())?;

	let result = Curve::final_exponentiation(MillerLoopOutput(target.0)).ok_or(())?;

	let result: ArkScale<PairingOutput<Curve>> = result.into();
	Ok(result.encode())
}

/// Compute a multi scalar multiplication on a short Weierstrass curve.
pub(crate) fn msm_sw_generic<Curve: SWCurveConfig>(
	bases: Vec<u8>,
	scalars: Vec<u8>,
) -> Result<Vec<u8>, ()> {
	let bases =
		<ArkScale<Vec<short_weierstrass::Affine<Curve>>> as Decode>::decode(&mut bases.as_slice())
			.map_err(|_| ())?;
	let scalars = <ArkScale<Vec<<Curve as CurveConfig>::ScalarField>> as Decode>::decode(
		&mut scalars.as_slice(),
	)
	.map_err(|_| ())?;

	let result =
		<short_weierstrass::Projective<Curve> as VariableBaseMSM>::msm(&bases.0, &scalars.0)
			.map_err(|_| ())?;

	let result: ArkScaleProjective<short_weierstrass::Projective<Curve>> = result.into();
	Ok(result.encode())
}

/// Compute a multi scalar multiplication on a twisted Edwards curve.
pub(crate) fn msm_te_generic<Curve: TECurveConfig>(
	bases: Vec<u8>,
	scalars: Vec<u8>,
) -> Result<Vec<u8>, ()> {
	let bases =
		<ArkScale<Vec<twisted_edwards::Affine<Curve>>> as Decode>::decode(&mut bases.as_slice())
			.map_err(|_| ())?;
	let scalars = <ArkScale<Vec<<Curve as CurveConfig>::ScalarField>> as Decode>::decode(
		&mut scalars.as_slice(),
	)
	.map_err(|_| ())?;

	let result = <twisted_edwards::Projective<Curve> as VariableBaseMSM>::msm(&bases.0, &scalars.0)
		.map_err(|_| ())?;

	let result: ArkScaleProjective<twisted_edwards::Projective<Curve>> = result.into();
	Ok(result.encode())
}

/// Compute a projective scalar multiplication on a short Weierstrass curve.
pub(crate) fn mul_projective_generic<Group: SWCurveConfig>(
	base: Vec<u8>,
	scalar: Vec<u8>,
) -> Result<Vec<u8>, ()> {
	let base = <ArkScaleProjective<short_weierstrass::Projective<Group>> as Decode>::decode(
		&mut base.as_slice(),
	)
	.map_err(|_| ())?;
	let scalar = <ArkScale<Vec<u64>> as Decode>::decode(&mut scalar.as_slice()).map_err(|_| ())?;

	let result = <Group as SWCurveConfig>::mul_projective(&base.0, &scalar.0);

	let result: ArkScaleProjective<short_weierstrass::Projective<Group>> = result.into();
	Ok(result.encode())
}

/// Compute a projective scalar multiplication on a twisted Edwards curve.
pub(crate) fn mul_projective_te_generic<Group: TECurveConfig>(
	base: Vec<u8>,
	scalar: Vec<u8>,
) -> Result<Vec<u8>, ()> {
	let base = <ArkScaleProjective<twisted_edwards::Projective<Group>> as Decode>::decode(
		&mut base.as_slice(),
	)
	.map_err(|_| ())?;
	let scalar = <ArkScale<Vec<u64>> as Decode>::decode(&mut scalar.as_slice()).map_err(|_| ())?;

	let result = <Group as TECurveConfig>::mul_projective(&base.0, &scalar.0);

	let result: ArkScaleProjective<twisted_edwards::Projective<Group>> = result.into();
	Ok(result.encode())
}
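Every helper in this file follows the same shape: decode opaque SCALE bytes into arkworks types, run the curve operation, and encode the result back into bytes, collapsing every failure to `()` at the host-call boundary. A toy, dependency-free sketch of that decode → compute → encode pattern (all names here are hypothetical; the real code uses `ArkScale`/SCALE encoding and arkworks group types rather than raw little-endian integers):

```rust
// Hypothetical stand-in for ArkScale decoding: a fixed-width
// little-endian u64 read from an opaque byte buffer.
fn decode_u64(bytes: &[u8]) -> Result<u64, ()> {
	let arr: [u8; 8] = bytes.try_into().map_err(|_| ())?;
	Ok(u64::from_le_bytes(arr))
}

// Mirrors the shape of `mul_projective_generic`: bytes in, bytes out,
// with every decode or compute error mapped to `()`.
fn mul_generic(base: Vec<u8>, scalar: Vec<u8>) -> Result<Vec<u8>, ()> {
	let base = decode_u64(&base)?;
	let scalar = decode_u64(&scalar)?;
	let result = base.wrapping_mul(scalar);
	Ok(result.to_le_bytes().to_vec())
}
```

Keeping the host-call boundary in plain `Vec<u8>` is what lets a single generic implementation serve every instantiated curve: the runtime and the host only ever exchange opaque byte buffers.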