Inner hashing of value in state trie (runtime versioning). (#9732)

* starting

* Updated from other branch.

* setting flag

* flag in storage struct

* fix flagging to access and insert.

* added todo to fix

* also missing serialize meta to storage proof

* extract meta.

* Isolate old trie layout.

* failing test that requires storing in meta when old hash scheme is used.

* old hash compatibility

* Db migrate.

* Running tests with both states where relevant.

* fix chain spec test with serde default.

* export state (missing trie function).

* Pending using new branch, lacking genericity on layout resolution.

* extract and set global meta

* Update to branch 4

* fix iterator with root flag (no longer insert node).

* fix trie root hashing of root

* complete basic backend.

* Remove old_hash meta from proofs that do not use inner_hashing.

* fix trie test for empty (force layout on empty deltas).

* Root update fix.

* debug on meta

* Use trie key iteration that does not include values in proofs.

* switch default test ext to use inner hash.

* Small integration test, and fix tx cache mgmt in ext; test failing.

* Proof scenario at state-machine level.

* trace for db upgrade

* try different param

* act more like iter_from.

* Bigger batches.

* Update trie dependency.

* drafting codec changes and refact

* Before removing unused branch-no-value alt hashing.
More work to do: rename all flag vars to alt_hash, and remove the extrinsic,
replacing it with a storage query at every storage_root call.

* alt hashing only for branch with value.

* fix trie tests

* Hash of value include the encoded size.

* removing fields(broken)

* fix trie_stream to also include value length in inner hash.
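The two bullets above say the inner (alt) hash is computed over the value together with its SCALE compact-encoded length, not over the raw value bytes alone. As a hedged illustration of what that hash input looks like, the sketch below reimplements only the first two SCALE compact modes; `compact_u32` and `inner_hash_input` are illustrative names, not the sp-trie/trie_stream API.

```rust
/// SCALE compact encoding of a u32 (sketch: single-byte and two-byte modes
/// only, which covers typical value lengths).
fn compact_u32(v: u32) -> Vec<u8> {
	if v < 64 {
		// Single-byte mode: value shifted left by two mode bits (0b00).
		vec![(v as u8) << 2]
	} else if v < 16384 {
		// Two-byte mode: (value << 2) | 0b01, little-endian.
		let x = (v << 2) | 0b01;
		vec![x as u8, (x >> 8) as u8]
	} else {
		unimplemented!("larger modes not needed for this sketch")
	}
}

/// Input over which the inner value hash would be computed per the bullets
/// above: compact-encoded length, then the value bytes.
fn inner_hash_input(value: &[u8]) -> Vec<u8> {
	let mut out = compact_u32(value.len() as u32);
	out.extend_from_slice(value);
	out
}
```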

* triedbmut only using alt type if inner hashing.

* trie_stream to also only use alt hashing type when actually alt hashing.

* Refactor meta state; logic should work with a change of trie threshold.
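A minimal sketch of the threshold rule this bullet refers to: under the new layout, a value whose length reaches the threshold (33 bytes, mirroring `sp_core::storage::TRIE_VALUE_NODE_THRESHOLD`) is detached from the node and only its hash is kept; with no threshold (the old layout), everything stays inline. The hash function here is a stand-in, not the Blake2-256 used by Substrate, and all names are illustrative.

```rust
/// Threshold mirroring `sp_core::storage::TRIE_VALUE_NODE_THRESHOLD`:
/// values of at least this many bytes are stored as a separate value node.
const TRIE_VALUE_NODE_THRESHOLD: usize = 33;

/// How a value ends up stored in a trie node (illustrative enum).
#[derive(Debug, PartialEq)]
enum StoredValue {
	/// Small value, kept inline in the node encoding.
	Inline(Vec<u8>),
	/// Large value: only its hash is kept in the node.
	Node([u8; 32]),
}

/// Stand-in hash, NOT the real Blake2-256.
fn placeholder_hash(data: &[u8]) -> [u8; 32] {
	let mut out = [0u8; 32];
	for (i, b) in data.iter().enumerate() {
		out[i % 32] ^= *b;
	}
	out
}

/// Decide inline vs hashed storage; `None` models the old (V0) layout
/// with no threshold.
fn store_value(value: Vec<u8>, threshold: Option<usize>) -> StoredValue {
	match threshold {
		Some(t) if value.len() >= t => StoredValue::Node(placeholder_hash(&value)),
		_ => StoredValue::Inline(value),
	}
}
```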

* Remove NoMeta variant.

* Remove state_hashed trigger specific functions.

* Pending switch to using the threshold; the new storage root api does not
make much sense.

* refactoring to use state from backend (not possible payload changes).

* Applying from previous state

* Remove default from storage, genesis need a special build.

* rem empty space

* Catch problem: when using triedb with the default layout we should not revert
nodes, otherwise things such as the trie codec cannot decode-encode without
changing state.
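The invariant being caught above is round-trip stability: decoding a stored node and re-encoding it must reproduce the exact original bytes, or the codec would silently change state. A toy illustration under stated assumptions (a hypothetical single-byte codec covering only the first SCALE compact mode; none of these helpers exist in sp-trie):

```rust
// Toy codec: SCALE compact single-byte mode only (values 0..=63).
fn encode_small(v: u8) -> Vec<u8> {
	assert!(v < 64);
	vec![v << 2]
}

fn decode_small(data: &[u8]) -> u8 {
	data[0] >> 2
}

/// The invariant described above: decode then re-encode must be
/// byte-identical to the input.
fn round_trips(data: &[u8]) -> bool {
	encode_small(decode_small(data)) == data
}
```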

* fix compilation

* Right logic to avoid switch on reencode when default layout.

* Clean up some todos

* remove trie meta from root upstream

* update upstream and fix benches.

* split some long lines.

* Update trie crate to work with new design.

* Finish update to refactored upstream.

* update to latest triedb changes.

* Clean up.

* fix executor test.

* rust fmt from master.

* rust format.

* rustfmt

* fix

* start host function driven versioning

* update state-machine part

* still need access to state version from runtime

* state hash in mem: wrong

* Direction likely correct, but passing the call to the code executor for
genesis init seems awkward.

* State version serialized in runtime: wrong approach; just initializing it
with no threshold for core api < 4 seems more proper.

* stateversion from runtime version (core api >= 4).
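A sketch of the resolution rule this bullet describes: runtimes whose Core API version is below 4 predate the state version field, so they are pinned to the old (V0) trie layout, while newer runtimes report their own version. The enum mirrors `sp_runtime::StateVersion`; the function name and signature are illustrative, not the sp_version API.

```rust
/// State trie layout version (mirrors `sp_runtime::StateVersion`).
#[derive(Debug, Clone, Copy, PartialEq)]
enum StateVersion {
	V0,
	V1,
}

/// Resolve the effective state version from the declared Core runtime
/// API version: pre-v4 runtimes did not declare one and stay on V0.
fn state_version_from_core_api(core_api_version: u32, declared: StateVersion) -> StateVersion {
	if core_api_version < 4 {
		StateVersion::V0
	} else {
		declared
	}
}
```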

* update trie, fix tests

* unused import

* clean some TODOs

* Require RuntimeVersionOf for executor

* use RuntimeVersionOf to resolve genesis state version.

* update runtime version test

* fix state-machine tests

* TODO

* Use runtime version from storage wasm with fast sync.

* rustfmt

* fmt

* fix test

* revert useless changes.

* clean some unused changes

* fmt

* removing useless trait function.

* remove remaining reference to state_hash

* fix some imports

* Follow chain state version management.

* trie update, fix and constant threshold for trie layouts.

* update deps

* Update to latest trie pr changes.

* fix benches

* Verify proof requires right layout.

* update trie_root

* Update trie deps to latest

* Update to latest trie versioning

* Removing patch

* update lock

* extrinsic for sc-service-test using layout v0.

* Adding RuntimeVersionOf to CallExecutor works.

* fmt

* error when resolving version and no wasm in storage.

* use existing utils to instantiate runtime code.

* Patch to delay runtime switch.

* Revert "Patch to delay runtime switch."

This reverts commit 67e55fee468f1a0cda853f5362b22e0d775786da.

* useless closure

* remove remaining state_hash variables.

* Remove outdated comment

* useless inner hash

* fmt

* fmt and opt-in feature to apply state change.

* feature gate core version, use new test feature for node and test node

* Use a 'State' api version instead of Core one.

* fix merge of test function

* use blake macro.

* Fix state api (require declaring the api in runtime).

* Opt out feature, fix macro for io to select a given version
instead of latest.

* run test nodes on new state.

* fix

* Apply review change (docs and error).

* fmt

* use explicit runtime_interface in doc test

* fix ui test

* fix doc test

* fmt

* use default for path and specname when resolving version.

* small review related changes.

* doc value size requirement.

* rename old_state feature

* Remove macro changes

* feature rename

* state version as host function parameter

* remove flag for client api

* fix tests

* switch storage chain proof to V1

* host functions, pass by state version enum

* use WrappedRuntimeCode

* start

* state_version in runtime version

* rust fmt

* Update storage proof of max size.

* fix runtime version rpc test

* right intent of convert from compat

* fix doc test

* fix doc test

* split proof

* decode without replay, and remove some reexports.

* Decode with compatibility by default.

* switch state_version to u8. And remove RuntimeVersionBasis.
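With the state version carried as a plain `u8` in `RuntimeVersion`, decoding it back into the enum needs a fallible conversion so unknown values are rejected rather than misread. A hedged sketch, assuming only the two versions this PR introduces (the enum mirrors `sp_runtime::StateVersion`; the error type here is a placeholder):

```rust
use core::convert::TryFrom;

/// State trie layout version (mirrors `sp_runtime::StateVersion`).
#[derive(Debug, Clone, Copy, PartialEq)]
enum StateVersion {
	V0,
	V1,
}

impl TryFrom<u8> for StateVersion {
	// Placeholder error type for the sketch.
	type Error = ();

	fn try_from(v: u8) -> Result<Self, Self::Error> {
		match v {
			0 => Ok(StateVersion::V0),
			1 => Ok(StateVersion::V1),
			// Any other byte is not a known state version.
			_ => Err(()),
		}
	}
}
```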

* test

* use api when reading embedded version

* fix decode with apis

* extract core version instead

* test fix

* unused import

* review changes.

Co-authored-by: kianenigma <kian@parity.io>
This commit is contained in: cheme, 2021-12-24 09:54:07 +01:00, committed by GitHub.
parent ea30c739ea
commit 4c651637f2
78 changed files with 1814 additions and 782 deletions
@@ -24,7 +24,7 @@ use hash_db::Hasher;
 use sp_std::{borrow::Borrow, marker::PhantomData, ops::Range, vec::Vec};
 use trie_db::{
 	self, nibble_ops,
-	node::{NibbleSlicePlan, NodeHandlePlan, NodePlan},
+	node::{NibbleSlicePlan, NodeHandlePlan, NodePlan, Value, ValuePlan},
 	ChildReference, NodeCodec as NodeCodecT, Partial,
 };
@@ -80,7 +80,11 @@ impl<'a> Input for ByteSliceInput<'a> {
 #[derive(Default, Clone)]
 pub struct NodeCodec<H>(PhantomData<H>);
 
-impl<H: Hasher> NodeCodecT for NodeCodec<H> {
+impl<H> NodeCodecT for NodeCodec<H>
+where
+	H: Hasher,
+{
 	const ESCAPE_HEADER: Option<u8> = Some(trie_constants::ESCAPE_COMPACT_HEADER);
 	type Error = Error;
 	type HashOut = H::Out;
@@ -88,11 +92,22 @@ impl<H: Hasher> NodeCodecT for NodeCodec<H> {
 		H::hash(<Self as NodeCodecT>::empty_node())
 	}
 
-	fn decode_plan(data: &[u8]) -> sp_std::result::Result<NodePlan, Self::Error> {
+	fn decode_plan(data: &[u8]) -> Result<NodePlan, Self::Error> {
 		let mut input = ByteSliceInput::new(data);
-		match NodeHeader::decode(&mut input)? {
+		let header = NodeHeader::decode(&mut input)?;
+		let contains_hash = header.contains_hash_of_value();
+		let branch_has_value = if let NodeHeader::Branch(has_value, _) = &header {
+			*has_value
+		} else {
+			// hashed_value_branch
+			true
+		};
+		match header {
 			NodeHeader::Null => Ok(NodePlan::Empty),
-			NodeHeader::Branch(has_value, nibble_count) => {
+			NodeHeader::HashedValueBranch(nibble_count) | NodeHeader::Branch(_, nibble_count) => {
 				let padding = nibble_count % nibble_ops::NIBBLE_PER_BYTE != 0;
 				// check that the padding is valid (if any)
 				if padding && nibble_ops::pad_left(data[input.offset]) != 0 {
@@ -105,9 +120,13 @@ impl<H: Hasher> NodeCodecT for NodeCodec<H> {
 				let partial_padding = nibble_ops::number_padding(nibble_count);
 				let bitmap_range = input.take(BITMAP_LENGTH)?;
 				let bitmap = Bitmap::decode(&data[bitmap_range])?;
-				let value = if has_value {
-					let count = <Compact<u32>>::decode(&mut input)?.0 as usize;
-					Some(input.take(count)?)
+				let value = if branch_has_value {
+					Some(if contains_hash {
+						ValuePlan::Node(input.take(H::LENGTH)?)
+					} else {
+						let count = <Compact<u32>>::decode(&mut input)?.0 as usize;
+						ValuePlan::Inline(input.take(count)?)
+					})
 				} else {
 					None
 				};
@@ -132,7 +151,7 @@ impl<H: Hasher> NodeCodecT for NodeCodec<H> {
 					children,
 				})
 			},
-			NodeHeader::Leaf(nibble_count) => {
+			NodeHeader::HashedValueLeaf(nibble_count) | NodeHeader::Leaf(nibble_count) => {
 				let padding = nibble_count % nibble_ops::NIBBLE_PER_BYTE != 0;
 				// check that the padding is valid (if any)
 				if padding && nibble_ops::pad_left(data[input.offset]) != 0 {
@@ -143,10 +162,16 @@ impl<H: Hasher> NodeCodecT for NodeCodec<H> {
 					nibble_ops::NIBBLE_PER_BYTE,
 				)?;
 				let partial_padding = nibble_ops::number_padding(nibble_count);
-				let count = <Compact<u32>>::decode(&mut input)?.0 as usize;
+				let value = if contains_hash {
+					ValuePlan::Node(input.take(H::LENGTH)?)
+				} else {
+					let count = <Compact<u32>>::decode(&mut input)?.0 as usize;
+					ValuePlan::Inline(input.take(count)?)
+				};
 				Ok(NodePlan::Leaf {
 					partial: NibbleSlicePlan::new(partial, partial_padding),
-					value: input.take(count)?,
+					value,
 				})
 			},
 		}
@@ -160,9 +185,23 @@ impl<H: Hasher> NodeCodecT for NodeCodec<H> {
 		&[trie_constants::EMPTY_TRIE]
 	}
 
-	fn leaf_node(partial: Partial, value: &[u8]) -> Vec<u8> {
-		let mut output = partial_encode(partial, NodeKind::Leaf);
-		value.encode_to(&mut output);
+	fn leaf_node(partial: Partial, value: Value) -> Vec<u8> {
+		let contains_hash = matches!(&value, Value::Node(..));
+		let mut output = if contains_hash {
+			partial_encode(partial, NodeKind::HashedValueLeaf)
+		} else {
+			partial_encode(partial, NodeKind::Leaf)
+		};
+		match value {
+			Value::Inline(value) => {
+				Compact(value.len() as u32).encode_to(&mut output);
+				output.extend_from_slice(value);
+			},
+			Value::Node(hash, _) => {
+				debug_assert!(hash.len() == H::LENGTH);
+				output.extend_from_slice(hash);
+			},
+		}
 		output
 	}
@@ -171,33 +210,46 @@ impl<H: Hasher> NodeCodecT for NodeCodec<H> {
 		_nbnibble: usize,
 		_child: ChildReference<<H as Hasher>::Out>,
 	) -> Vec<u8> {
-		unreachable!()
+		unreachable!("No extension codec.")
 	}
 
 	fn branch_node(
 		_children: impl Iterator<Item = impl Borrow<Option<ChildReference<<H as Hasher>::Out>>>>,
-		_maybe_value: Option<&[u8]>,
+		_maybe_value: Option<Value>,
 	) -> Vec<u8> {
-		unreachable!()
+		unreachable!("No extension codec.")
 	}
 
 	fn branch_node_nibbled(
 		partial: impl Iterator<Item = u8>,
 		number_nibble: usize,
 		children: impl Iterator<Item = impl Borrow<Option<ChildReference<<H as Hasher>::Out>>>>,
-		maybe_value: Option<&[u8]>,
+		value: Option<Value>,
 	) -> Vec<u8> {
-		let mut output = if maybe_value.is_some() {
-			partial_from_iterator_encode(partial, number_nibble, NodeKind::BranchWithValue)
-		} else {
-			partial_from_iterator_encode(partial, number_nibble, NodeKind::BranchNoValue)
+		let contains_hash = matches!(&value, Some(Value::Node(..)));
+		let mut output = match (&value, contains_hash) {
+			(&None, _) =>
+				partial_from_iterator_encode(partial, number_nibble, NodeKind::BranchNoValue),
+			(_, false) =>
+				partial_from_iterator_encode(partial, number_nibble, NodeKind::BranchWithValue),
+			(_, true) =>
+				partial_from_iterator_encode(partial, number_nibble, NodeKind::HashedValueBranch),
 		};
 		let bitmap_index = output.len();
 		let mut bitmap: [u8; BITMAP_LENGTH] = [0; BITMAP_LENGTH];
 		(0..BITMAP_LENGTH).for_each(|_| output.push(0));
-		if let Some(value) = maybe_value {
-			value.encode_to(&mut output);
-		};
+		match value {
+			Some(Value::Inline(value)) => {
+				Compact(value.len() as u32).encode_to(&mut output);
+				output.extend_from_slice(value);
+			},
+			Some(Value::Node(hash, _)) => {
+				debug_assert!(hash.len() == H::LENGTH);
+				output.extend_from_slice(hash);
+			},
+			None => (),
+		}
 		Bitmap::encode(
 			children.map(|maybe_child| match maybe_child.borrow() {
 				Some(ChildReference::Hash(h)) => {
@@ -229,11 +281,15 @@ fn partial_from_iterator_encode<I: Iterator<Item = u8>>(
 ) -> Vec<u8> {
 	let nibble_count = sp_std::cmp::min(trie_constants::NIBBLE_SIZE_BOUND, nibble_count);
 
-	let mut output = Vec::with_capacity(3 + (nibble_count / nibble_ops::NIBBLE_PER_BYTE));
+	let mut output = Vec::with_capacity(4 + (nibble_count / nibble_ops::NIBBLE_PER_BYTE));
 	match node_kind {
 		NodeKind::Leaf => NodeHeader::Leaf(nibble_count).encode_to(&mut output),
 		NodeKind::BranchWithValue => NodeHeader::Branch(true, nibble_count).encode_to(&mut output),
 		NodeKind::BranchNoValue => NodeHeader::Branch(false, nibble_count).encode_to(&mut output),
+		NodeKind::HashedValueLeaf =>
+			NodeHeader::HashedValueLeaf(nibble_count).encode_to(&mut output),
+		NodeKind::HashedValueBranch =>
+			NodeHeader::HashedValueBranch(nibble_count).encode_to(&mut output),
 	};
 	output.extend(partial);
 	output
@@ -247,11 +303,15 @@ fn partial_encode(partial: Partial, node_kind: NodeKind) -> Vec<u8> {
 	let nibble_count = sp_std::cmp::min(trie_constants::NIBBLE_SIZE_BOUND, nibble_count);
 
-	let mut output = Vec::with_capacity(3 + partial.1.len());
+	let mut output = Vec::with_capacity(4 + partial.1.len());
 	match node_kind {
 		NodeKind::Leaf => NodeHeader::Leaf(nibble_count).encode_to(&mut output),
 		NodeKind::BranchWithValue => NodeHeader::Branch(true, nibble_count).encode_to(&mut output),
 		NodeKind::BranchNoValue => NodeHeader::Branch(false, nibble_count).encode_to(&mut output),
+		NodeKind::HashedValueLeaf =>
+			NodeHeader::HashedValueLeaf(nibble_count).encode_to(&mut output),
+		NodeKind::HashedValueBranch =>
+			NodeHeader::HashedValueBranch(nibble_count).encode_to(&mut output),
 	};
 	if number_nibble_encoded > 0 {
 		output.push(nibble_ops::pad_right((partial.0).1));