mirror of
https://github.com/pezkuwichain/pezkuwi-subxt.git
synced 2026-05-06 18:28:03 +00:00
Introduce trie level cache and remove state cache (#11407)
* trie state cache
* Also cache missing access on read.
* fix comp
* bis
* fix
* use has_lru
* remove local storage cache on size 0.
* No cache.
* local cache only
* trie cache and local cache
* storage cache (with local)
* trie cache no local cache
* Add state access benchmark
* Remove warnings etc
* Add trie cache benchmark
* No extra "clone" required
* Change benchmark to use multiple blocks
* Use patches
* Integrate shitty implementation
* More stuff
* Revert "Merge branch 'master' into trie_state_cache"
This reverts commit 947cd8e6d43fced10e21b76d5b92ffa57b57c318, reversing
changes made to 29ff036463.
* Improve benchmark
* Adapt to latest changes
* Adapt to changes in trie
* Add a test that uses iterator
* Start fixing it
* Remove obsolete file
* Make it compile
* Start rewriting the trie node cache
* More work on the cache
* More docs and code etc
* Make data cache an optional
* Tests
* Remove debug stuff
* Recorder
* Some docs and a simple test for the recorder
* Compile fixes
* Make it compile
* More fixes
* More fixes
* Fix fix fix
* Make sure cache and recorder work together for basic stuff
* Test that data caching and recording works
* Test `TrieDBMut` with caching
* Try something
* Fixes, fixes, fixes
* Forward the recorder
* Make it compile
* Use recorder in more places
* Switch to new `with_optional_recorder` fn
* Refactor and cleanups
* Move `ProvingBackend` tests
* Simplify
* Move over all functionality to the essence
* Fix compilation
* Implement estimate encoded size for StorageProof
* Start using the `cache` everywhere
* Use the cache everywhere
* Fix compilation
* Fix tests
* Adds `TrieBackendBuilder` and enhances the tests
* Ensure that recorder drain checks that values are found as expected
* Switch over to `TrieBackendBuilder`
* Start fixing the problem with child tries and recording
* Fix recording of child tries
* Make it compile
* Overwrite `storage_hash` in `TrieBackend`
* Add `storage_cache` to the benchmarks
* Fix `no_std` build
* Speed up cache lookup
* Extend the state access benchmark to also hash a runtime
* Fix build
* Fix compilation
* Rewrite value cache
* Add lru cache
* Ensure that the cache lru works
* Value cache should not be optional
* Add support for keeping the shared node cache in its bounds
* Make the cache configurable
* Check that the cache respects the bounds
* Adds a new test
* Fixes
* Docs and some renamings
* More docs
* Start using the new recorder
* Fix more code
* Take `self` argument
* Remove warnings
* Fix benchmark
* Fix accounting
* Rip off the state cache
* Start fixing fallout after removing the state cache
* Make it compile after trie changes
* Fix test
* Add some logging
* Some docs
* Some fixups and clean ups
* Fix benchmark
* Remove unneeded file
* Use git for patching
* Make CI happy
* Update primitives/trie/Cargo.toml
Co-authored-by: Koute <koute@users.noreply.github.com>
* Update primitives/state-machine/src/trie_backend.rs
Co-authored-by: cheme <emericchevalier.pro@gmail.com>
* Introduce new `AsTrieBackend` trait
* Make the LocalTrieCache not clonable
* Make it work in no_std and add docs
* Remove duplicate dependency
* Switch to ahash for better performance
* Speedup value cache merge
* Output errors on underflow
* Ensure the internal LRU map doesn't grow too much
* Use const fn to calculate the value cache element size
* Remove cache configuration
* Fix
* Clear the cache in between for more testing
* Try to come up with a failing test case
* Make the test fail
* Fix the child trie recording
* Make everything compile after the changes to trie
* Adapt to latest trie-db changes
* Fix on stable
* Update primitives/trie/src/cache.rs
Co-authored-by: cheme <emericchevalier.pro@gmail.com>
* Fix wrong merge
* Docs
* Fix warnings
* Cargo.lock
* Bump pin-project
* Fix warnings
* Switch to released crate version
* More fixes
* Make clippy and rustdocs happy
* More clippy
* Print error when using deprecated `--state-cache-size`
* 🤦
* Fixes
* Fix storage_hash linkings
* Update client/rpc/src/dev/mod.rs
Co-authored-by: Arkadiy Paronyan <arkady.paronyan@gmail.com>
* Review feedback
* encode bound
* Rework the shared value cache
Instead of using a `u64` to represent the key, we now use an `Arc<[u8]>`. The key bytes are
additionally stored in an extra `HashSet` to de-duplicate keys across different storage roots.
When the last usage of a key is dropped from the LRU, we also remove the key from the `HashSet`.
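The de-duplication idea described above can be sketched roughly as follows. This is a simplified, hypothetical illustration using only `std` (the type and method names here are invented for the example; the real `SharedValueCache` also enforces LRU bounds and does size accounting): value-cache entries are keyed by `(storage root, storage key)`, but the key bytes themselves are interned once in a shared `HashSet<Arc<[u8]>>`, so the same key used under many storage roots is allocated only once.

```rust
use std::collections::{HashMap, HashSet};
use std::sync::Arc;

/// Illustrative sketch (not the real implementation) of key de-duplication
/// across storage roots in a shared value cache.
struct SharedValueCache {
    /// De-duplicated storage keys, shared across all storage roots.
    known_keys: HashSet<Arc<[u8]>>,
    /// `(storage_root, storage_key)` -> cached value.
    cache: HashMap<([u8; 32], Arc<[u8]>), Vec<u8>>,
}

impl SharedValueCache {
    fn new() -> Self {
        Self { known_keys: HashSet::new(), cache: HashMap::new() }
    }

    /// Interns `key`, allocating the `Arc<[u8]>` only on first use.
    fn intern(&mut self, key: &[u8]) -> Arc<[u8]> {
        if let Some(existing) = self.known_keys.get(key) {
            existing.clone()
        } else {
            let arc: Arc<[u8]> = Arc::from(key);
            self.known_keys.insert(arc.clone());
            arc
        }
    }

    fn insert(&mut self, root: [u8; 32], key: &[u8], value: Vec<u8>) {
        let key = self.intern(key);
        self.cache.insert((root, key), value);
    }

    fn get(&self, root: [u8; 32], key: &[u8]) -> Option<&Vec<u8>> {
        // Lookup by `&[u8]` works because `Arc<[u8]>: Borrow<[u8]>`.
        let key = self.known_keys.get(key)?;
        self.cache.get(&(root, key.clone()))
    }
}

fn main() {
    let mut cache = SharedValueCache::new();
    let root_a = [1u8; 32];
    let root_b = [2u8; 32];

    cache.insert(root_a, b":code", vec![1]);
    cache.insert(root_b, b":code", vec![2]);

    // The bytes of ":code" are stored once even though two roots use them.
    assert_eq!(cache.known_keys.len(), 1);
    assert_eq!(cache.get(root_a, b":code"), Some(&vec![1]));
    assert_eq!(cache.get(root_b, b":code"), Some(&vec![2]));
}
```

A real implementation would additionally evict `(root, key)` pairs from an LRU and drop a key from `known_keys` once its last `Arc` reference outside the set is gone; that bookkeeping is omitted here.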
* Improve the cache by merging the old and new solutions
* FMT
* Please stop coming back all the time :crying:
* Update primitives/trie/src/cache/shared_cache.rs
Co-authored-by: Arkadiy Paronyan <arkady.paronyan@gmail.com>
* Fixes
* Make clippy happy
* Ensure we don't deadlock
* Only use one lock to simplify the code
* Do not depend on `Hasher`
* Fix tests
* FMT
* Clippy 🤦
Co-authored-by: cheme <emericchevalier.pro@gmail.com>
Co-authored-by: Koute <koute@users.noreply.github.com>
Co-authored-by: Arkadiy Paronyan <arkady.paronyan@gmail.com>
@@ -25,7 +25,7 @@ use sp_std::{borrow::Borrow, marker::PhantomData, ops::Range, vec::Vec};
 use trie_db::{
 	nibble_ops,
 	node::{NibbleSlicePlan, NodeHandlePlan, NodePlan, Value, ValuePlan},
-	ChildReference, NodeCodec as NodeCodecT, Partial,
+	ChildReference, NodeCodec as NodeCodecT,
 };
 
 /// Helper struct for trie node decoder. This implements `codec::Input` on a byte slice, while
@@ -85,7 +85,7 @@ where
 	H: Hasher,
 {
 	const ESCAPE_HEADER: Option<u8> = Some(trie_constants::ESCAPE_COMPACT_HEADER);
-	type Error = Error;
+	type Error = Error<H::Out>;
 	type HashOut = H::Out;
 
 	fn hashed_null_node() -> <H as Hasher>::Out {
@@ -185,19 +185,19 @@ where
 		&[trie_constants::EMPTY_TRIE]
 	}
 
-	fn leaf_node(partial: Partial, value: Value) -> Vec<u8> {
+	fn leaf_node(partial: impl Iterator<Item = u8>, number_nibble: usize, value: Value) -> Vec<u8> {
 		let contains_hash = matches!(&value, Value::Node(..));
 		let mut output = if contains_hash {
-			partial_encode(partial, NodeKind::HashedValueLeaf)
+			partial_from_iterator_encode(partial, number_nibble, NodeKind::HashedValueLeaf)
 		} else {
-			partial_encode(partial, NodeKind::Leaf)
+			partial_from_iterator_encode(partial, number_nibble, NodeKind::Leaf)
 		};
 		match value {
 			Value::Inline(value) => {
 				Compact(value.len() as u32).encode_to(&mut output);
 				output.extend_from_slice(value);
 			},
-			Value::Node(hash, _) => {
+			Value::Node(hash) => {
 				debug_assert!(hash.len() == H::LENGTH);
 				output.extend_from_slice(hash);
 			},
@@ -244,7 +244,7 @@ where
 				Compact(value.len() as u32).encode_to(&mut output);
 				output.extend_from_slice(value);
 			},
-			Some(Value::Node(hash, _)) => {
+			Some(Value::Node(hash)) => {
 				debug_assert!(hash.len() == H::LENGTH);
 				output.extend_from_slice(hash);
 			},
@@ -295,31 +295,6 @@ fn partial_from_iterator_encode<I: Iterator<Item = u8>>(
 	output
 }
 
-/// Encode and allocate node type header (type and size), and partial value.
-/// Same as `partial_from_iterator_encode` but uses non encoded `Partial` as input.
-fn partial_encode(partial: Partial, node_kind: NodeKind) -> Vec<u8> {
-	let number_nibble_encoded = (partial.0).0 as usize;
-	let nibble_count = partial.1.len() * nibble_ops::NIBBLE_PER_BYTE + number_nibble_encoded;
-
-	let nibble_count = sp_std::cmp::min(trie_constants::NIBBLE_SIZE_BOUND, nibble_count);
-
-	let mut output = Vec::with_capacity(4 + partial.1.len());
-	match node_kind {
-		NodeKind::Leaf => NodeHeader::Leaf(nibble_count).encode_to(&mut output),
-		NodeKind::BranchWithValue => NodeHeader::Branch(true, nibble_count).encode_to(&mut output),
-		NodeKind::BranchNoValue => NodeHeader::Branch(false, nibble_count).encode_to(&mut output),
-		NodeKind::HashedValueLeaf =>
-			NodeHeader::HashedValueLeaf(nibble_count).encode_to(&mut output),
-		NodeKind::HashedValueBranch =>
-			NodeHeader::HashedValueBranch(nibble_count).encode_to(&mut output),
-	};
-	if number_nibble_encoded > 0 {
-		output.push(nibble_ops::pad_right((partial.0).1));
-	}
-	output.extend_from_slice(partial.1);
-	output
-}
-
 const BITMAP_LENGTH: usize = 2;
 
 /// Radix 16 trie, bitmap encoding implementation,
@@ -329,7 +304,7 @@ const BITMAP_LENGTH: usize = 2;
 pub(crate) struct Bitmap(u16);
 
 impl Bitmap {
-	pub fn decode(mut data: &[u8]) -> Result<Self, Error> {
+	pub fn decode(mut data: &[u8]) -> Result<Self, codec::Error> {
 		Ok(Bitmap(u16::decode(&mut data)?))
 	}