Mirror of https://github.com/pezkuwichain/serde.git, synced 2026-04-25 17:27:55 +00:00.
Compare commits (206 commits)
+52 −31

````diff
@@ -1,45 +1,66 @@
 # Contributing to Serde
 
-Serde welcomes contribution from everyone. Here are the guidelines if you are
-thinking of helping us:
+Serde welcomes contribution from everyone in the form of suggestions, bug
+reports, pull requests, and feedback. This document gives some guidance if you
+are thinking of helping us.
 
-## Contributions
+Please reach out here in a GitHub issue or in the #serde IRC channel on
+[`irc.mozilla.org`] if we can do anything to help you contribute.
 
-Contributions to Serde or its dependencies should be made in the form of GitHub
-pull requests. Each pull request will be reviewed by a core contributor
-(someone with permission to land patches) and either landed in the main tree or
-given feedback for changes that would be required. All contributions should
-follow this format, even those from core contributors.
+[`irc.mozilla.org`]: https://wiki.mozilla.org/IRC
 
-Should you wish to work on an issue, please claim it first by commenting on
-the GitHub issue that you want to work on it. This is to prevent duplicated
-efforts from contributors on the same issue.
+## Submitting bug reports and feature requests
 
-## Pull Request Checklist
+Serde development is spread across lots of repositories, but this serde-rs/serde
+repository is always a safe choice for opening any issues related to Serde.
 
-- Branch from the master branch and, if needed, rebase to the current master
-  branch before submitting your pull request. If it doesn't merge cleanly with
-  master you may be asked to rebase your changes.
+When reporting a bug or asking for help, please include enough details so that
+the people helping you can reproduce the behavior you are seeing. For some tips
+on how to approach this, read about how to produce a [Minimal, Complete, and
+Verifiable example].
 
-- Commits should be as small as possible, while ensuring that each commit is
-  correct independently (i.e., each commit should compile and pass tests).
+[Minimal, Complete, and Verifiable example]: https://stackoverflow.com/help/mcve
 
-- If your patch is not getting reviewed or you need a specific person to review
-  it, you can @-reply a reviewer asking for a review in the pull request or a
-  comment, or you can ask for a review in `#serde` on `irc.mozilla.org`.
+When making a feature request, please make it clear what problem you intend to
+solve with the feature, any ideas for how Serde could support solving that
+problem, any possible alternatives, and any disadvantages.
 
-- Add tests relevant to the fixed bug or new feature.
+## Running the test suite
+
+We encourage you to check that the test suite passes locally before submitting a
+pull request with your changes. If anything does not pass, typically it will be
+easier to iterate and fix it locally than waiting for the CI servers to run
+tests for you.
+
+##### In the [`serde`] directory
+
+```sh
+# Test all the example code in Serde documentation
+cargo test
+```
+
+##### In the [`test_suite/deps`] directory
+
+```sh
+# This is a prerequisite for running the full test suite
+cargo clean && cargo update && cargo build
+```
+
+##### In the [`test_suite`] directory
+
+```sh
+# Run the full test suite, including tests of unstable functionality
+cargo test --features unstable
+```
+
+[`serde`]: https://github.com/serde-rs/serde/tree/master/serde
+[`test_suite/deps`]: https://github.com/serde-rs/serde/tree/master/test_suite/deps
+[`test_suite`]: https://github.com/serde-rs/serde/tree/master/test_suite
 
 ## Conduct
 
-In all Serde-related forums, we follow the [Rust Code of
-Conduct](https://www.rust-lang.org/conduct.html). For escalation or moderation
-issues, please contact Erick (erick.tryzelaar@gmail.com) instead of the Rust
-moderation team.
-
-## Communication
-
-Beyond opening tickets on the
-[serde-rs/serde](https://github.com/serde-rs/serde) project, Serde contributors
-frequent the `#serde` channel on
-[`irc.mozilla.org`](https://wiki.mozilla.org/IRC).
+In all Serde-related forums, we follow the [Rust Code of Conduct]. For
+escalation or moderation issues please contact Erick (erick.tryzelaar@gmail.com)
+instead of the Rust moderation team.
+
+[Rust Code of Conduct]: https://www.rust-lang.org/conduct.html
````
````diff
@@ -5,5 +5,4 @@ members = [
     "serde_derive_internals",
     "serde_test",
     "test_suite",
-    "test_suite/no_std",
 ]
````
````diff
@@ -20,9 +20,30 @@ You may be looking for:
 
 ## Serde in action
 
-<a href="http://play.integer32.com/?gist=9003c5b88c1f4989941925d7190c6eec" target="_blank">
-<img align="right" width="50" src="https://raw.githubusercontent.com/serde-rs/serde-rs.github.io/master/img/run.png">
-</a>
+<details>
+<summary>
+Click to show Cargo.toml.
+<a href="http://play.integer32.com/?gist=9003c5b88c1f4989941925d7190c6eec" target="_blank">Run this code in the playground.</a>
+</summary>
+
+```toml
+[dependencies]
+
+# The core APIs, including the Serialize and Deserialize traits. Always
+# required when using Serde.
+serde = "1.0"
+
+# Support for #[derive(Serialize, Deserialize)]. Required if you want Serde
+# to work for structs and enums defined in your crate.
+serde_derive = "1.0"
+
+# Each data format lives in its own crate; the sample code below uses JSON
+# but you may be using a different one.
+serde_json = "1.0"
+```
+
+</details>
+<p></p>
 
 ```rust
 #[macro_use]
````
+2 −5

````diff
@@ -1,5 +1,2 @@
-fn_args_layout = "Block"
-array_layout = "Block"
-where_style = "Rfc"
-generics_indent = "Block"
-fn_call_style = "Block"
+error_on_line_overflow = false
+same_line_attributes = false
````
+4 −11

````diff
@@ -1,6 +1,6 @@
 [package]
 name = "serde"
-version = "1.0.2" # remember to update html_root_url
+version = "1.0.25" # remember to update html_root_url
 authors = ["Erick Tryzelaar <erick.tryzelaar@gmail.com>", "David Tolnay <dtolnay@gmail.com>"]
 license = "MIT/Apache-2.0"
 description = "A generic serialization/deserialization framework"
@@ -48,22 +48,15 @@ std = []
 # https://github.com/serde-rs/serde/issues/812
 unstable = []
 
-# Provide impls for types that require memory allocation like Box<T> and Rc<T>.
-# This is a subset of std but may be enabled without depending on all of std.
+# Provide impls for types in the Rust core allocation and collections library
+# including String, Box<T>, Vec<T>, and Cow<T>. This is a subset of std but may
+# be enabled without depending on all of std.
 #
 # Requires a dependency on the unstable core allocation library:
 #
 # https://doc.rust-lang.org/alloc/
 alloc = ["unstable"]
 
-# Provide impls for collection types like String and Cow<T>. This is a subset of
-# std but may be enabled without depending on all of std.
-#
-# Requires a dependency on the unstable collections library:
-#
-# https://doc.rust-lang.org/collections/
-collections = ["alloc"]
-
 # Opt into impls for Rc<T> and Arc<T>. Serializing and deserializing these types
 # does not preserve identity and may result in multiple copies of the same data.
 # Be sure that this is what you want before enabling this feature.
````
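These feature flags are consumed from a downstream crate's manifest. As a minimal sketch (version taken from the diff above, everything else assumed), a no_std crate depending on this release of serde with the merged `alloc` feature might declare:

```toml
[dependencies]
# default-features = false drops "std"; the unified "alloc" feature
# (which this diff folds the old "collections" feature into) re-enables
# the String, Box<T>, Vec<T>, and Cow<T> impls on unstable Rust.
serde = { version = "1.0.25", default-features = false, features = ["alloc"] }
```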
````diff
@@ -8,7 +8,7 @@
 
 use lib::*;
 
-use de::{Deserialize, Deserializer, Visitor, SeqAccess, MapAccess, Error};
+use de::{Deserialize, Deserializer, Error, MapAccess, SeqAccess, Visitor};
 
 /// An efficient way of discarding data from a deserializer.
 ///
````
+724 −268 — file diff suppressed because it is too large.
+104 −4

````diff
@@ -94,6 +94,7 @@
 //! - OsString
 //! - **Miscellaneous standard library types**:
 //! - Duration
+//! - SystemTime
 //! - Path
 //! - PathBuf
 //! - Range\<T\>
@@ -503,6 +504,35 @@ pub trait Deserialize<'de>: Sized {
     fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
     where
         D: Deserializer<'de>;
+
+    /// Deserializes a value into `self` from the given Deserializer.
+    ///
+    /// The purpose of this method is to allow the deserializer to reuse
+    /// resources and avoid copies. As such, if this method returns an error,
+    /// `self` will be in an indeterminate state where some parts of the struct
+    /// have been overwritten. Although whatever state that is will be
+    /// memory-safe.
+    ///
+    /// This is generally useful when repeateadly deserializing values that
+    /// are processed one at a time, where the value of `self` doesn't matter
+    /// when the next deserialization occurs.
+    ///
+    /// If you manually implement this, your recursive deserializations should
+    /// use `deserialize_in_place`.
+    ///
+    /// This method is stable and an official public API, but hidden from the
+    /// documentation because it is almost never what newbies are looking for.
+    /// Showing it in rustdoc would cause it to be featured more prominently
+    /// than it deserves.
+    #[doc(hidden)]
+    fn deserialize_in_place<D>(deserializer: D, place: &mut Self) -> Result<(), D::Error>
+    where
+        D: Deserializer<'de>,
+    {
+        // Default implementation just delegates to `deserialize` impl.
+        *place = Deserialize::deserialize(deserializer)?;
+        Ok(())
+    }
 }
 
 /// A data structure that can be deserialized without borrowing any data from
@@ -1010,6 +1040,76 @@ pub trait Deserializer<'de>: Sized {
     fn deserialize_ignored_any<V>(self, visitor: V) -> Result<V::Value, Self::Error>
     where
         V: Visitor<'de>;
+
+    /// Determine whether `Deserialize` implementations should expect to
+    /// deserialize their human-readable form.
+    ///
+    /// Some types have a human-readable form that may be somewhat expensive to
+    /// construct, as well as a binary form that is compact and efficient.
+    /// Generally text-based formats like JSON and YAML will prefer to use the
+    /// human-readable one and binary formats like Bincode will prefer the
+    /// compact one.
+    ///
+    /// ```
+    /// # use std::ops::Add;
+    /// # use std::str::FromStr;
+    /// #
+    /// # struct Timestamp;
+    /// #
+    /// # impl Timestamp {
+    /// #     const EPOCH: Timestamp = Timestamp;
+    /// # }
+    /// #
+    /// # impl FromStr for Timestamp {
+    /// #     type Err = String;
+    /// #     fn from_str(_: &str) -> Result<Self, Self::Err> {
+    /// #         unimplemented!()
+    /// #     }
+    /// # }
+    /// #
+    /// # struct Duration;
+    /// #
+    /// # impl Duration {
+    /// #     fn seconds(_: u64) -> Self { unimplemented!() }
+    /// # }
+    /// #
+    /// # impl Add<Duration> for Timestamp {
+    /// #     type Output = Timestamp;
+    /// #     fn add(self, _: Duration) -> Self::Output {
+    /// #         unimplemented!()
+    /// #     }
+    /// # }
+    /// #
+    /// use serde::de::{self, Deserialize, Deserializer};
+    ///
+    /// impl<'de> Deserialize<'de> for Timestamp {
+    ///     fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
+    ///         where D: Deserializer<'de>
+    ///     {
+    ///         if deserializer.is_human_readable() {
+    ///             // Deserialize from a human-readable string like "2015-05-15T17:01:00Z".
+    ///             let s = String::deserialize(deserializer)?;
+    ///             Timestamp::from_str(&s).map_err(de::Error::custom)
+    ///         } else {
+    ///             // Deserialize from a compact binary representation, seconds since
+    ///             // the Unix epoch.
+    ///             let n = u64::deserialize(deserializer)?;
+    ///             Ok(Timestamp::EPOCH + Duration::seconds(n))
+    ///         }
+    ///     }
+    /// }
+    /// ```
+    ///
+    /// The default implementation of this method returns `true`. Data formats
+    /// may override this to `false` to request a compact form for types that
+    /// support one. Note that modifying this method to change a format from
+    /// human-readable to compact or vice versa should be regarded as a breaking
+    /// change, as a value serialized in human-readable mode is not required to
+    /// deserialize from the same data in compact mode.
+    #[inline]
+    fn is_human_readable(&self) -> bool {
+        true
+    }
 }
 
 ////////////////////////////////////////////////////////////////////////////////
@@ -1119,7 +1219,7 @@ pub trait Visitor<'de>: Sized {
         self.visit_i64(v as i64)
     }
 
-    /// The input contains an `i32`.
+    /// The input contains an `i64`.
    ///
    /// The default implementation fails with a type error.
    fn visit_i64<E>(self, v: i64) -> Result<Self::Value, E>
@@ -1262,7 +1362,7 @@ pub trait Visitor<'de>: Sized {
    /// The default implementation forwards to `visit_str` and then drops the
    /// `String`.
    #[inline]
-    #[cfg(any(feature = "std", feature = "collections"))]
+    #[cfg(any(feature = "std", feature = "alloc"))]
    fn visit_string<E>(self, v: String) -> Result<Self::Value, E>
    where
        E: Error,
@@ -1321,7 +1421,7 @@ pub trait Visitor<'de>: Sized {
    ///
    /// The default implementation forwards to `visit_bytes` and then drops the
    /// `Vec<u8>`.
-    #[cfg(any(feature = "std", feature = "collections"))]
+    #[cfg(any(feature = "std", feature = "alloc"))]
    fn visit_byte_buf<E>(self, v: Vec<u8>) -> Result<Self::Value, E>
    where
        E: Error,
@@ -1423,7 +1523,7 @@ pub trait SeqAccess<'de> {
    /// `Ok(None)` if there are no more remaining items.
    ///
    /// `Deserialize` implementations should typically use
-    /// `SeqAcccess::next_element` instead.
+    /// `SeqAccess::next_element` instead.
    fn next_element_seed<T>(&mut self, seed: T) -> Result<Option<T::Value>, Self::Error>
    where
        T: DeserializeSeed<'de>;
````
|||||||
+169
-38
@@ -37,7 +37,7 @@
|
|||||||
|
|
||||||
use lib::*;
|
use lib::*;
|
||||||
|
|
||||||
use de::{self, IntoDeserializer, Expected, SeqAccess};
|
use de::{self, Expected, IntoDeserializer, SeqAccess};
|
||||||
use private::de::size_hint;
|
use private::de::size_hint;
|
||||||
use ser;
|
use ser;
|
||||||
use self::private::{First, Second};
|
use self::private::{First, Second};
|
||||||
@@ -51,21 +51,23 @@ pub struct Error {
|
|||||||
err: ErrorImpl,
|
err: ErrorImpl,
|
||||||
}
|
}
|
||||||
|
|
||||||
#[cfg(any(feature = "std", feature = "collections"))]
|
#[cfg(any(feature = "std", feature = "alloc"))]
|
||||||
type ErrorImpl = Box<str>;
|
type ErrorImpl = Box<str>;
|
||||||
#[cfg(not(any(feature = "std", feature = "collections")))]
|
#[cfg(not(any(feature = "std", feature = "alloc")))]
|
||||||
type ErrorImpl = ();
|
type ErrorImpl = ();
|
||||||
|
|
||||||
impl de::Error for Error {
|
impl de::Error for Error {
|
||||||
#[cfg(any(feature = "std", feature = "collections"))]
|
#[cfg(any(feature = "std", feature = "alloc"))]
|
||||||
fn custom<T>(msg: T) -> Self
|
fn custom<T>(msg: T) -> Self
|
||||||
where
|
where
|
||||||
T: Display,
|
T: Display,
|
||||||
{
|
{
|
||||||
Error { err: msg.to_string().into_boxed_str() }
|
Error {
|
||||||
|
err: msg.to_string().into_boxed_str(),
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
#[cfg(not(any(feature = "std", feature = "collections")))]
|
#[cfg(not(any(feature = "std", feature = "alloc")))]
|
||||||
fn custom<T>(msg: T) -> Self
|
fn custom<T>(msg: T) -> Self
|
||||||
where
|
where
|
||||||
T: Display,
|
T: Display,
|
||||||
@@ -85,12 +87,12 @@ impl ser::Error for Error {
|
|||||||
}
|
}
|
||||||
|
|
||||||
impl Display for Error {
|
impl Display for Error {
|
||||||
#[cfg(any(feature = "std", feature = "collections"))]
|
#[cfg(any(feature = "std", feature = "alloc"))]
|
||||||
fn fmt(&self, formatter: &mut fmt::Formatter) -> Result<(), fmt::Error> {
|
fn fmt(&self, formatter: &mut fmt::Formatter) -> Result<(), fmt::Error> {
|
||||||
formatter.write_str(&self.err)
|
formatter.write_str(&self.err)
|
||||||
}
|
}
|
||||||
|
|
||||||
#[cfg(not(any(feature = "std", feature = "collections")))]
|
#[cfg(not(any(feature = "std", feature = "alloc")))]
|
||||||
fn fmt(&self, formatter: &mut fmt::Formatter) -> Result<(), fmt::Error> {
|
fn fmt(&self, formatter: &mut fmt::Formatter) -> Result<(), fmt::Error> {
|
||||||
formatter.write_str("Serde deserialization error")
|
formatter.write_str("Serde deserialization error")
|
||||||
}
|
}
|
||||||
@@ -112,7 +114,9 @@ where
|
|||||||
type Deserializer = UnitDeserializer<E>;
|
type Deserializer = UnitDeserializer<E>;
|
||||||
|
|
||||||
fn into_deserializer(self) -> UnitDeserializer<E> {
|
fn into_deserializer(self) -> UnitDeserializer<E> {
|
||||||
UnitDeserializer { marker: PhantomData }
|
UnitDeserializer {
|
||||||
|
marker: PhantomData,
|
||||||
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -355,15 +359,84 @@ where
|
|||||||
|
|
||||||
////////////////////////////////////////////////////////////////////////////////
|
////////////////////////////////////////////////////////////////////////////////
|
||||||
|
|
||||||
|
/// A deserializer holding a `&str` with a lifetime tied to another
|
||||||
|
/// deserializer.
|
||||||
|
#[derive(Clone, Debug)]
|
||||||
|
pub struct BorrowedStrDeserializer<'de, E> {
|
||||||
|
value: &'de str,
|
||||||
|
marker: PhantomData<E>,
|
||||||
|
}
|
||||||
|
|
||||||
|
impl<'de, E> BorrowedStrDeserializer<'de, E> {
|
||||||
|
/// Create a new borrowed deserializer from the given string.
|
||||||
|
pub fn new(value: &'de str) -> BorrowedStrDeserializer<'de, E> {
|
||||||
|
BorrowedStrDeserializer {
|
||||||
|
value: value,
|
||||||
|
marker: PhantomData,
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
impl<'de, E> de::Deserializer<'de> for BorrowedStrDeserializer<'de, E>
|
||||||
|
where
|
||||||
|
E: de::Error,
|
||||||
|
{
|
||||||
|
type Error = E;
|
||||||
|
|
||||||
|
fn deserialize_any<V>(self, visitor: V) -> Result<V::Value, Self::Error>
|
||||||
|
where
|
||||||
|
V: de::Visitor<'de>,
|
||||||
|
{
|
||||||
|
visitor.visit_borrowed_str(self.value)
|
||||||
|
}
|
||||||
|
|
||||||
|
fn deserialize_enum<V>(
|
||||||
|
self,
|
||||||
|
name: &str,
|
||||||
|
variants: &'static [&'static str],
|
||||||
|
visitor: V,
|
||||||
|
) -> Result<V::Value, Self::Error>
|
||||||
|
where
|
||||||
|
V: de::Visitor<'de>,
|
||||||
|
{
|
||||||
|
let _ = name;
|
||||||
|
let _ = variants;
|
||||||
|
visitor.visit_enum(self)
|
||||||
|
}
|
||||||
|
|
||||||
|
forward_to_deserialize_any! {
|
||||||
|
bool i8 i16 i32 i64 u8 u16 u32 u64 f32 f64 char str string bytes
|
||||||
|
byte_buf option unit unit_struct newtype_struct seq tuple tuple_struct
|
||||||
|
map struct identifier ignored_any
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
impl<'de, E> de::EnumAccess<'de> for BorrowedStrDeserializer<'de, E>
|
||||||
|
where
|
||||||
|
E: de::Error,
|
||||||
|
{
|
||||||
|
type Error = E;
|
||||||
|
type Variant = private::UnitOnly<E>;
|
||||||
|
|
||||||
|
fn variant_seed<T>(self, seed: T) -> Result<(T::Value, Self::Variant), Self::Error>
|
||||||
|
where
|
||||||
|
T: de::DeserializeSeed<'de>,
|
||||||
|
{
|
||||||
|
seed.deserialize(self).map(private::unit_only)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
////////////////////////////////////////////////////////////////////////////////
|
||||||
|
|
||||||
/// A deserializer holding a `String`.
|
/// A deserializer holding a `String`.
|
||||||
#[cfg(any(feature = "std", feature = "collections"))]
|
#[cfg(any(feature = "std", feature = "alloc"))]
|
||||||
#[derive(Clone, Debug)]
|
#[derive(Clone, Debug)]
|
||||||
pub struct StringDeserializer<E> {
|
pub struct StringDeserializer<E> {
|
||||||
value: String,
|
value: String,
|
||||||
marker: PhantomData<E>,
|
marker: PhantomData<E>,
|
||||||
}
|
}
|
||||||
|
|
||||||
#[cfg(any(feature = "std", feature = "collections"))]
|
#[cfg(any(feature = "std", feature = "alloc"))]
|
||||||
impl<'de, E> IntoDeserializer<'de, E> for String
|
impl<'de, E> IntoDeserializer<'de, E> for String
|
||||||
where
|
where
|
||||||
E: de::Error,
|
E: de::Error,
|
||||||
@@ -378,7 +451,7 @@ where
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
#[cfg(any(feature = "std", feature = "collections"))]
|
#[cfg(any(feature = "std", feature = "alloc"))]
|
||||||
impl<'de, E> de::Deserializer<'de> for StringDeserializer<E>
|
impl<'de, E> de::Deserializer<'de> for StringDeserializer<E>
|
||||||
where
|
where
|
||||||
E: de::Error,
|
E: de::Error,
|
||||||
@@ -413,7 +486,7 @@ where
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
#[cfg(any(feature = "std", feature = "collections"))]
|
#[cfg(any(feature = "std", feature = "alloc"))]
|
||||||
impl<'de, 'a, E> de::EnumAccess<'de> for StringDeserializer<E>
|
impl<'de, 'a, E> de::EnumAccess<'de> for StringDeserializer<E>
|
||||||
where
|
where
|
||||||
E: de::Error,
|
E: de::Error,
|
||||||
@@ -432,14 +505,14 @@ where
|
|||||||
////////////////////////////////////////////////////////////////////////////////
|
////////////////////////////////////////////////////////////////////////////////
|
||||||
|
|
||||||
/// A deserializer holding a `Cow<str>`.
|
/// A deserializer holding a `Cow<str>`.
|
||||||
#[cfg(any(feature = "std", feature = "collections"))]
|
#[cfg(any(feature = "std", feature = "alloc"))]
|
||||||
#[derive(Clone, Debug)]
|
#[derive(Clone, Debug)]
|
||||||
pub struct CowStrDeserializer<'a, E> {
|
pub struct CowStrDeserializer<'a, E> {
|
||||||
value: Cow<'a, str>,
|
value: Cow<'a, str>,
|
||||||
marker: PhantomData<E>,
|
marker: PhantomData<E>,
|
||||||
}
|
}
|
||||||
|
|
||||||
#[cfg(any(feature = "std", feature = "collections"))]
|
#[cfg(any(feature = "std", feature = "alloc"))]
|
||||||
impl<'de, 'a, E> IntoDeserializer<'de, E> for Cow<'a, str>
|
impl<'de, 'a, E> IntoDeserializer<'de, E> for Cow<'a, str>
|
||||||
where
|
where
|
||||||
E: de::Error,
|
E: de::Error,
|
||||||
@@ -454,7 +527,7 @@ where
|
|||||||
}
|
}
|
 }
 
-#[cfg(any(feature = "std", feature = "collections"))]
+#[cfg(any(feature = "std", feature = "alloc"))]
 impl<'de, 'a, E> de::Deserializer<'de> for CowStrDeserializer<'a, E>
 where
     E: de::Error,
@@ -492,7 +565,7 @@ where
     }
 }
 
-#[cfg(any(feature = "std", feature = "collections"))]
+#[cfg(any(feature = "std", feature = "alloc"))]
 impl<'de, 'a, E> de::EnumAccess<'de> for CowStrDeserializer<'a, E>
 where
     E: de::Error,
@@ -510,6 +583,46 @@ where
 
 ////////////////////////////////////////////////////////////////////////////////
 
+/// A deserializer holding a `&[u8]` with a lifetime tied to another
+/// deserializer.
+#[derive(Clone, Debug)]
+pub struct BorrowedBytesDeserializer<'de, E> {
+    value: &'de [u8],
+    marker: PhantomData<E>,
+}
+
+impl<'de, E> BorrowedBytesDeserializer<'de, E> {
+    /// Create a new borrowed deserializer from the given byte slice.
+    pub fn new(value: &'de [u8]) -> BorrowedBytesDeserializer<'de, E> {
+        BorrowedBytesDeserializer {
+            value: value,
+            marker: PhantomData,
+        }
+    }
+}
+
+impl<'de, E> de::Deserializer<'de> for BorrowedBytesDeserializer<'de, E>
+where
+    E: de::Error,
+{
+    type Error = E;
+
+    fn deserialize_any<V>(self, visitor: V) -> Result<V::Value, Self::Error>
+    where
+        V: de::Visitor<'de>,
+    {
+        visitor.visit_borrowed_bytes(self.value)
+    }
+
+    forward_to_deserialize_any! {
+        bool i8 i16 i32 i64 u8 u16 u32 u64 f32 f64 char str string bytes
+        byte_buf option unit unit_struct newtype_struct seq tuple tuple_struct
+        map struct identifier ignored_any enum
+    }
+}
+
+////////////////////////////////////////////////////////////////////////////////
+
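The `BorrowedBytesDeserializer` added above hands the visitor a slice that borrows from the original input (`visit_borrowed_bytes`), not from the deserializer itself. A std-only sketch of that lifetime shape, with serde omitted and the names `BorrowedBytes`/`into_inner` invented for illustration:

```rust
use std::marker::PhantomData;

// Sketch: the slice handed back is tied to the input's lifetime 'de,
// so it stays valid after the holder is consumed (zero-copy borrowing).
struct BorrowedBytes<'de, E> {
    value: &'de [u8],
    marker: PhantomData<E>,
}

impl<'de, E> BorrowedBytes<'de, E> {
    fn new(value: &'de [u8]) -> BorrowedBytes<'de, E> {
        BorrowedBytes {
            value,
            marker: PhantomData,
        }
    }

    // Returns &'de [u8]: the borrow outlives the holder itself.
    fn into_inner(self) -> &'de [u8] {
        self.value
    }
}

fn main() {
    let input = b"zero-copy".to_vec();
    let holder: BorrowedBytes<()> = BorrowedBytes::new(&input);
    let bytes = holder.into_inner(); // holder consumed; borrow of `input` remains
    assert_eq!(bytes, b"zero-copy");
}
```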
 /// A deserializer that iterates over a sequence.
 #[derive(Clone, Debug)]
 pub struct SeqDeserializer<I, E> {
@@ -549,7 +662,10 @@ where
         } else {
             // First argument is the number of elements in the data, second
             // argument is the number of elements expected by the Deserialize.
-            Err(de::Error::invalid_length(self.count + remaining, &ExpectedInSeq(self.count)),)
+            Err(de::Error::invalid_length(
+                self.count + remaining,
+                &ExpectedInSeq(self.count),
+            ))
         }
     }
 }
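The comment in the hunk above explains the two arguments: the total number of elements in the data versus the number the `Deserialize` impl actually consumed. A plain-std sketch of the resulting message, assuming serde's default "invalid length …, expected …" wording and an `ExpectedInSeq` description along the lines of "`n` elements in sequence":

```rust
// Stand-in for de::Error::invalid_length's default message (assumption:
// the real serde error formats "invalid length {len}, expected {expected}").
fn invalid_length(len: usize, expected: &str) -> String {
    format!("invalid length {}, expected {}", len, expected)
}

fn main() {
    let count = 3; // elements consumed by the Deserialize impl
    let remaining = 2; // elements still left in the data
    let msg = invalid_length(
        count + remaining,
        &format!("{} elements in sequence", count),
    );
    assert_eq!(msg, "invalid length 5, expected 3 elements in sequence");
}
```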
@@ -618,26 +734,26 @@ impl Expected for ExpectedInSeq {
 
 ////////////////////////////////////////////////////////////////////////////////
 
-#[cfg(any(feature = "std", feature = "collections"))]
+#[cfg(any(feature = "std", feature = "alloc"))]
 impl<'de, T, E> IntoDeserializer<'de, E> for Vec<T>
 where
     T: IntoDeserializer<'de, E>,
     E: de::Error,
 {
-    type Deserializer = SeqDeserializer<<Vec<T> as IntoIterator>::IntoIter, E>;
+    type Deserializer = SeqDeserializer<<Self as IntoIterator>::IntoIter, E>;
 
     fn into_deserializer(self) -> Self::Deserializer {
         SeqDeserializer::new(self.into_iter())
     }
 }
 
-#[cfg(any(feature = "std", feature = "collections"))]
+#[cfg(any(feature = "std", feature = "alloc"))]
 impl<'de, T, E> IntoDeserializer<'de, E> for BTreeSet<T>
 where
     T: IntoDeserializer<'de, E> + Eq + Ord,
     E: de::Error,
 {
-    type Deserializer = SeqDeserializer<<BTreeSet<T> as IntoIterator>::IntoIter, E>;
+    type Deserializer = SeqDeserializer<<Self as IntoIterator>::IntoIter, E>;
 
     fn into_deserializer(self) -> Self::Deserializer {
         SeqDeserializer::new(self.into_iter())
@@ -645,12 +761,13 @@ where
     }
 }
 
 #[cfg(feature = "std")]
-impl<'de, T, E> IntoDeserializer<'de, E> for HashSet<T>
+impl<'de, T, S, E> IntoDeserializer<'de, E> for HashSet<T, S>
 where
     T: IntoDeserializer<'de, E> + Eq + Hash,
+    S: BuildHasher,
     E: de::Error,
 {
-    type Deserializer = SeqDeserializer<<HashSet<T> as IntoIterator>::IntoIter, E>;
+    type Deserializer = SeqDeserializer<<Self as IntoIterator>::IntoIter, E>;
 
     fn into_deserializer(self) -> Self::Deserializer {
         SeqDeserializer::new(self.into_iter())
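The hunk above generalizes the `HashSet` impl from the default hasher (`HashSet<T>` means `HashSet<T, RandomState>`) to any `S: BuildHasher`. A std-only sketch of why the extra parameter matters, with `element_sum` as a hypothetical stand-in for the generalized impl:

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::HashSet;
use std::hash::{BuildHasher, BuildHasherDefault};

// Under the old bound only HashSet<T, RandomState> qualified; bounding on
// S: BuildHasher accepts sets built with any hasher state.
fn element_sum<S: BuildHasher>(set: &HashSet<u32, S>) -> u32 {
    set.iter().sum()
}

fn main() {
    // A HashSet with a non-default build hasher now qualifies too.
    let mut set: HashSet<u32, BuildHasherDefault<DefaultHasher>> = HashSet::default();
    set.insert(1);
    set.insert(2);
    set.insert(4);
    assert_eq!(element_sum(&set), 7);
}
```

The same reasoning applies to the `HashMap<K, V, S>` impl later in this diff.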
@@ -742,7 +859,10 @@ where
         } else {
             // First argument is the number of elements in the data, second
             // argument is the number of elements expected by the Deserialize.
-            Err(de::Error::invalid_length(self.count + remaining, &ExpectedInMap(self.count)),)
+            Err(de::Error::invalid_length(
+                self.count + remaining,
+                &ExpectedInMap(self.count),
+            ))
         }
     }
 }
@@ -791,11 +911,7 @@ where
         Ok(value)
     }
 
-    fn deserialize_tuple<V>(
-        self,
-        len: usize,
-        visitor: V,
-    ) -> Result<V::Value, Self::Error>
+    fn deserialize_tuple<V>(self, len: usize, visitor: V) -> Result<V::Value, Self::Error>
     where
         V: de::Visitor<'de>,
     {
@@ -1036,14 +1152,14 @@ impl Expected for ExpectedInMap {
 
 ////////////////////////////////////////////////////////////////////////////////
 
-#[cfg(any(feature = "std", feature = "collections"))]
+#[cfg(any(feature = "std", feature = "alloc"))]
 impl<'de, K, V, E> IntoDeserializer<'de, E> for BTreeMap<K, V>
 where
     K: IntoDeserializer<'de, E> + Eq + Ord,
     V: IntoDeserializer<'de, E>,
     E: de::Error,
 {
-    type Deserializer = MapDeserializer<'de, <BTreeMap<K, V> as IntoIterator>::IntoIter, E>;
+    type Deserializer = MapDeserializer<'de, <Self as IntoIterator>::IntoIter, E>;
 
     fn into_deserializer(self) -> Self::Deserializer {
         MapDeserializer::new(self.into_iter())
@@ -1051,13 +1167,14 @@ where
     }
 }
 
 #[cfg(feature = "std")]
-impl<'de, K, V, E> IntoDeserializer<'de, E> for HashMap<K, V>
+impl<'de, K, V, S, E> IntoDeserializer<'de, E> for HashMap<K, V, S>
 where
     K: IntoDeserializer<'de, E> + Eq + Hash,
     V: IntoDeserializer<'de, E>,
+    S: BuildHasher,
     E: de::Error,
 {
-    type Deserializer = MapDeserializer<'de, <HashMap<K, V> as IntoIterator>::IntoIter, E>;
+    type Deserializer = MapDeserializer<'de, <Self as IntoIterator>::IntoIter, E>;
 
     fn into_deserializer(self) -> Self::Deserializer {
         MapDeserializer::new(self.into_iter())
@@ -1112,7 +1229,12 @@ mod private {
     }
 
     pub fn unit_only<T, E>(t: T) -> (T, UnitOnly<E>) {
-        (t, UnitOnly { marker: PhantomData })
+        (
+            t,
+            UnitOnly {
+                marker: PhantomData,
+            },
+        )
     }
 
     impl<'de, E> de::VariantAccess<'de> for UnitOnly<E>
@@ -1129,14 +1251,20 @@ mod private {
         where
             T: de::DeserializeSeed<'de>,
         {
-            Err(de::Error::invalid_type(Unexpected::UnitVariant, &"newtype variant"),)
+            Err(de::Error::invalid_type(
+                Unexpected::UnitVariant,
+                &"newtype variant",
+            ))
         }
 
         fn tuple_variant<V>(self, _len: usize, _visitor: V) -> Result<V::Value, Self::Error>
         where
             V: de::Visitor<'de>,
         {
-            Err(de::Error::invalid_type(Unexpected::UnitVariant, &"tuple variant"),)
+            Err(de::Error::invalid_type(
+                Unexpected::UnitVariant,
+                &"tuple variant",
+            ))
         }
 
         fn struct_variant<V>(
@@ -1147,7 +1275,10 @@ mod private {
         where
             V: de::Visitor<'de>,
         {
-            Err(de::Error::invalid_type(Unexpected::UnitVariant, &"struct variant"),)
+            Err(de::Error::invalid_type(
+                Unexpected::UnitVariant,
+                &"struct variant",
+            ))
         }
     }
 
+3 -3
@@ -12,14 +12,14 @@ pub use lib::default::Default;
 pub use lib::fmt::{self, Formatter};
 pub use lib::marker::PhantomData;
 pub use lib::option::Option::{self, None, Some};
-pub use lib::result::Result::{self, Ok, Err};
+pub use lib::result::Result::{self, Err, Ok};
 
 pub use self::string::from_utf8_lossy;
 
 mod string {
     use lib::*;
 
-    #[cfg(any(feature = "std", feature = "collections"))]
+    #[cfg(any(feature = "std", feature = "alloc"))]
     pub fn from_utf8_lossy(bytes: &[u8]) -> Cow<str> {
         String::from_utf8_lossy(bytes)
     }
@@ -31,7 +31,7 @@ mod string {
     //
     // so it is okay for the return type to be different from the std case as long
    // as the above works.
-    #[cfg(not(any(feature = "std", feature = "collections")))]
+    #[cfg(not(any(feature = "std", feature = "alloc")))]
    pub fn from_utf8_lossy(bytes: &[u8]) -> &str {
        // Three unicode replacement characters if it fails. They look like a
        // white-on-black question mark. The user will recognize it as invalid
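The `std`/`alloc` path above delegates to `String::from_utf8_lossy`, which substitutes U+FFFD (the replacement character the comment describes) for each invalid byte. A quick demonstration:

```rust
fn main() {
    // Valid UTF-8 passes through unchanged (borrowed, no allocation).
    assert_eq!(String::from_utf8_lossy(b"serde"), "serde");

    // Each invalid byte becomes U+FFFD, the "white-on-black question mark"
    // mentioned in the comment above.
    assert_eq!(String::from_utf8_lossy(&[b'o', b'k', 0xff]), "ok\u{fffd}");
}
```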
+48 -32
@@ -79,43 +79,57 @@
 ////////////////////////////////////////////////////////////////////////////////
 
 // Serde types in rustdoc of other crates get linked to here.
-#![doc(html_root_url = "https://docs.rs/serde/1.0.2")]
+#![doc(html_root_url = "https://docs.rs/serde/1.0.25")]
 
 // Support using Serde without the standard library!
 #![cfg_attr(not(feature = "std"), no_std)]
 
 // Unstable functionality only if the user asks for it. For tracking and
 // discussion of these features please refer to this issue:
 //
 // https://github.com/serde-rs/serde/issues/812
 #![cfg_attr(feature = "unstable", feature(nonzero, specialization))]
-#![cfg_attr(all(feature = "std", feature = "unstable"), feature(into_boxed_c_str))]
 #![cfg_attr(feature = "alloc", feature(alloc))]
-#![cfg_attr(feature = "collections", feature(collections))]
-// Whitelisted clippy lints.
-#![cfg_attr(feature = "cargo-clippy", allow(doc_markdown))]
-#![cfg_attr(feature = "cargo-clippy", allow(linkedlist))]
-#![cfg_attr(feature = "cargo-clippy", allow(type_complexity))]
-#![cfg_attr(feature = "cargo-clippy", allow(zero_prefixed_literal))]
+#![cfg_attr(feature = "cargo-clippy", deny(clippy, clippy_pedantic))]
+// Whitelisted clippy lints
+#![cfg_attr(feature = "cargo-clippy",
+            allow(cast_lossless, const_static_lifetime, doc_markdown, linkedlist,
+                  needless_pass_by_value, type_complexity, unreadable_literal,
+                  zero_prefixed_literal))]
+// Whitelisted clippy_pedantic lints
+#![cfg_attr(feature = "cargo-clippy", allow(
+    // integer and float ser/de requires these sorts of casts
+    cast_possible_truncation,
+    cast_possible_wrap,
+    cast_precision_loss,
+    cast_sign_loss,
+    // simplifies some macros
+    invalid_upcast_comparisons,
+    // things are often more readable this way
+    option_unwrap_used,
+    result_unwrap_used,
+    shadow_reuse,
+    single_match_else,
+    stutter,
+    use_self,
+    // not practical
+    missing_docs_in_private_items,
+    // alternative is not stable
+    empty_enum,
+    use_debug,
+))]
 // Blacklisted Rust lints.
 #![deny(missing_docs, unused_imports)]
 
 ////////////////////////////////////////////////////////////////////////////////
 
-#[cfg(feature = "collections")]
-extern crate collections;
-
 #[cfg(feature = "alloc")]
 extern crate alloc;
 
 #[cfg(all(feature = "unstable", feature = "std"))]
 extern crate core;
 
-/// A facade around all the types we need from the `std`, `core`, `alloc`, and
-/// `collections` crates. This avoids elaborate import wrangling having to
-/// happen in every module.
+/// A facade around all the types we need from the `std`, `core`, and `alloc`
+/// crates. This avoids elaborate import wrangling having to happen in every
+/// module.
 mod lib {
     mod core {
         #[cfg(feature = "std")]
@@ -125,8 +139,8 @@ mod lib {
     }
 
     pub use self::core::{cmp, iter, mem, ops, slice, str};
-    pub use self::core::{i8, i16, i32, i64, isize};
-    pub use self::core::{u8, u16, u32, u64, usize};
+    pub use self::core::{isize, i16, i32, i64, i8};
+    pub use self::core::{usize, u16, u32, u64, u8};
     pub use self::core::{f32, f64};
 
     pub use self::core::cell::{Cell, RefCell};
@@ -140,18 +154,18 @@ mod lib {
 
     #[cfg(feature = "std")]
     pub use std::borrow::{Cow, ToOwned};
-    #[cfg(all(feature = "collections", not(feature = "std")))]
-    pub use collections::borrow::{Cow, ToOwned};
+    #[cfg(all(feature = "alloc", not(feature = "std")))]
+    pub use alloc::borrow::{Cow, ToOwned};
 
     #[cfg(feature = "std")]
     pub use std::string::String;
-    #[cfg(all(feature = "collections", not(feature = "std")))]
-    pub use collections::string::{String, ToString};
+    #[cfg(all(feature = "alloc", not(feature = "std")))]
+    pub use alloc::string::{String, ToString};
 
     #[cfg(feature = "std")]
     pub use std::vec::Vec;
-    #[cfg(all(feature = "collections", not(feature = "std")))]
-    pub use collections::vec::Vec;
+    #[cfg(all(feature = "alloc", not(feature = "std")))]
+    pub use alloc::vec::Vec;
 
     #[cfg(feature = "std")]
     pub use std::boxed::Box;
@@ -169,9 +183,9 @@ mod lib {
     pub use alloc::arc::Arc;
 
     #[cfg(feature = "std")]
-    pub use std::collections::{BinaryHeap, BTreeMap, BTreeSet, LinkedList, VecDeque};
-    #[cfg(all(feature = "collections", not(feature = "std")))]
-    pub use collections::{BinaryHeap, BTreeMap, BTreeSet, LinkedList, VecDeque};
+    pub use std::collections::{BTreeMap, BTreeSet, BinaryHeap, LinkedList, VecDeque};
+    #[cfg(all(feature = "alloc", not(feature = "std")))]
+    pub use alloc::{BTreeMap, BTreeSet, BinaryHeap, LinkedList, VecDeque};
 
     #[cfg(feature = "std")]
     pub use std::{error, net};
@@ -179,15 +193,17 @@ mod lib {
     #[cfg(feature = "std")]
     pub use std::collections::{HashMap, HashSet};
     #[cfg(feature = "std")]
-    pub use std::ffi::{CString, CStr, OsString, OsStr};
+    pub use std::ffi::{CStr, CString, OsStr, OsString};
     #[cfg(feature = "std")]
-    pub use std::hash::{Hash, BuildHasher};
+    pub use std::hash::{BuildHasher, Hash};
     #[cfg(feature = "std")]
     pub use std::io::Write;
     #[cfg(feature = "std")]
+    pub use std::num::Wrapping;
+    #[cfg(feature = "std")]
     pub use std::path::{Path, PathBuf};
     #[cfg(feature = "std")]
-    pub use std::time::Duration;
+    pub use std::time::{Duration, SystemTime, UNIX_EPOCH};
     #[cfg(feature = "std")]
     pub use std::sync::{Mutex, RwLock};
 
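The `mod lib` facade above lets every other module write `use lib::*;` and stay oblivious to whether a name comes from `std`, `core`, or `alloc`. A minimal std-only sketch of the pattern (the real re-exports are cfg-gated as shown in the diff):

```rust
// One module centralizes re-exports; callers never name `std` directly.
// In serde itself each line is duplicated under a cfg gate, e.g.
//   #[cfg(all(feature = "alloc", not(feature = "std")))]
//   pub use alloc::vec::Vec;
mod lib {
    pub use std::borrow::Cow;
    pub use std::vec::Vec;
}

fn main() {
    let v: lib::Vec<u8> = vec![1, 2, 3];
    let c: lib::Cow<str> = lib::Cow::Borrowed("facade");
    assert_eq!(v.len(), 3);
    assert_eq!(c, "facade");
}
```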
+396 -226
File diff suppressed because it is too large.
+111 -100
@@ -8,10 +8,10 @@
 
 use lib::*;
 
-use ser::{self, Serialize, Serializer, SerializeMap, SerializeStruct, Impossible};
+use ser::{self, Impossible, Serialize, SerializeMap, SerializeStruct, Serializer};
 
-#[cfg(any(feature = "std", feature = "collections"))]
-use self::content::{SerializeTupleVariantAsMapValue, SerializeStructVariantAsMapValue};
+#[cfg(any(feature = "std", feature = "alloc"))]
+use self::content::{SerializeStructVariantAsMapValue, SerializeTupleVariantAsMapValue};
 
 /// Used to check that serde(getter) attributes return the expected type.
 /// Not public API.
@@ -32,15 +32,13 @@ where
     S: Serializer,
     T: Serialize,
 {
-    value.serialize(
-        TaggedSerializer {
-            type_ident: type_ident,
-            variant_ident: variant_ident,
-            tag: tag,
-            variant_name: variant_name,
-            delegate: serializer,
-        },
-    )
+    value.serialize(TaggedSerializer {
+        type_ident: type_ident,
+        variant_ident: variant_ident,
+        tag: tag,
+        variant_name: variant_name,
+        delegate: serializer,
+    })
 }
 
 struct TaggedSerializer<S> {
@@ -60,11 +58,10 @@ enum Unsupported {
     ByteArray,
     Optional,
     Unit,
-    UnitStruct,
     Sequence,
     Tuple,
     TupleStruct,
-    #[cfg(not(any(feature = "std", feature = "collections")))]
+    #[cfg(not(any(feature = "std", feature = "alloc")))]
     Enum,
 }
 
@@ -79,11 +76,10 @@ impl Display for Unsupported {
             Unsupported::ByteArray => formatter.write_str("a byte array"),
             Unsupported::Optional => formatter.write_str("an optional"),
             Unsupported::Unit => formatter.write_str("unit"),
-            Unsupported::UnitStruct => formatter.write_str("a unit struct"),
             Unsupported::Sequence => formatter.write_str("a sequence"),
             Unsupported::Tuple => formatter.write_str("a tuple"),
             Unsupported::TupleStruct => formatter.write_str("a tuple struct"),
-            #[cfg(not(any(feature = "std", feature = "collections")))]
+            #[cfg(not(any(feature = "std", feature = "alloc")))]
             Unsupported::Enum => formatter.write_str("an enum"),
         }
     }
@@ -94,13 +90,10 @@ where
     S: Serializer,
 {
     fn bad_type(self, what: Unsupported) -> S::Error {
-        ser::Error::custom(
-            format_args!(
-                "cannot serialize tagged newtype variant {}::{} containing {}",
-                self.type_ident,
-                self.variant_ident,
-                what),
-        )
+        ser::Error::custom(format_args!(
+            "cannot serialize tagged newtype variant {}::{} containing {}",
+            self.type_ident, self.variant_ident, what
+        ))
    }
 }
 
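In the `bad_type` hunk above, the `format_args!` invocation is passed straight to `custom`, which accepts any `Display` value, so the message is formatted directly into the final error rather than through an intermediate `String`. A std-only sketch of that shape, with `custom_error` as a hypothetical stand-in for `ser::Error::custom`:

```rust
use std::fmt::Display;

// custom_error accepts any Display, so format_args! (which allocates
// nothing by itself) can be handed over directly.
fn custom_error<T: Display>(msg: T) -> String {
    format!("error: {}", msg)
}

fn main() {
    let (type_ident, variant_ident, what) = ("Enum", "Variant", "a sequence");
    let err = custom_error(format_args!(
        "cannot serialize tagged newtype variant {}::{} containing {}",
        type_ident, variant_ident, what
    ));
    assert_eq!(
        err,
        "error: cannot serialize tagged newtype variant Enum::Variant containing a sequence"
    );
}
```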
@@ -117,14 +110,14 @@ where
     type SerializeMap = S::SerializeMap;
     type SerializeStruct = S::SerializeStruct;
 
-    #[cfg(not(any(feature = "std", feature = "collections")))]
+    #[cfg(not(any(feature = "std", feature = "alloc")))]
     type SerializeTupleVariant = Impossible<S::Ok, S::Error>;
-    #[cfg(any(feature = "std", feature = "collections"))]
+    #[cfg(any(feature = "std", feature = "alloc"))]
     type SerializeTupleVariant = SerializeTupleVariantAsMapValue<S::SerializeMap>;
 
-    #[cfg(not(any(feature = "std", feature = "collections")))]
+    #[cfg(not(any(feature = "std", feature = "alloc")))]
     type SerializeStructVariant = Impossible<S::Ok, S::Error>;
-    #[cfg(any(feature = "std", feature = "collections"))]
+    #[cfg(any(feature = "std", feature = "alloc"))]
     type SerializeStructVariant = SerializeStructVariantAsMapValue<S::SerializeMap>;
 
     fn serialize_bool(self, _: bool) -> Result<Self::Ok, Self::Error> {
@@ -199,7 +192,9 @@ where
     }
 
     fn serialize_unit_struct(self, _: &'static str) -> Result<Self::Ok, Self::Error> {
-        Err(self.bad_type(Unsupported::UnitStruct))
+        let mut map = try!(self.delegate.serialize_map(Some(1)));
+        try!(map.serialize_entry(self.tag, self.variant_name));
+        map.end()
    }
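The `serialize_unit_struct` hunk above (together with the removal of `Unsupported::UnitStruct`) stops rejecting unit structs inside internally tagged enums: they now serialize as a one-entry map of tag to variant name. A std-only sketch of the emitted shape, with `tagged_unit_struct` invented for illustration:

```rust
use std::collections::BTreeMap;

// Mirrors the new body: serialize_map(Some(1)) followed by one
// serialize_entry(tag, variant_name) yields { tag: variant_name }.
fn tagged_unit_struct(tag: &str, variant_name: &str) -> BTreeMap<String, String> {
    let mut map = BTreeMap::new();
    map.insert(tag.to_owned(), variant_name.to_owned());
    map
}

fn main() {
    let repr = tagged_unit_struct("type", "Message");
    assert_eq!(repr.len(), 1);
    assert_eq!(repr.get("type").map(String::as_str), Some("Message"));
}
```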
 
     fn serialize_unit_variant(
@@ -257,7 +252,7 @@ where
         Err(self.bad_type(Unsupported::TupleStruct))
     }
 
-    #[cfg(not(any(feature = "std", feature = "collections")))]
+    #[cfg(not(any(feature = "std", feature = "alloc")))]
     fn serialize_tuple_variant(
         self,
         _: &'static str,
@@ -270,7 +265,7 @@ where
         Err(self.bad_type(Unsupported::Enum))
     }
 
-    #[cfg(any(feature = "std", feature = "collections"))]
+    #[cfg(any(feature = "std", feature = "alloc"))]
     fn serialize_tuple_variant(
         self,
         _: &'static str,
@@ -281,7 +276,11 @@ where
         let mut map = try!(self.delegate.serialize_map(Some(2)));
         try!(map.serialize_entry(self.tag, self.variant_name));
         try!(map.serialize_key(inner_variant));
-        Ok(SerializeTupleVariantAsMapValue::new(map, inner_variant, len),)
+        Ok(SerializeTupleVariantAsMapValue::new(
+            map,
+            inner_variant,
+            len,
+        ))
     }
 
     fn serialize_map(self, len: Option<usize>) -> Result<Self::SerializeMap, Self::Error> {
@@ -300,7 +299,7 @@ where
         Ok(state)
     }
 
-    #[cfg(not(any(feature = "std", feature = "collections")))]
+    #[cfg(not(any(feature = "std", feature = "alloc")))]
     fn serialize_struct_variant(
         self,
         _: &'static str,
@@ -313,7 +312,7 @@ where
         Err(self.bad_type(Unsupported::Enum))
     }
 
-    #[cfg(any(feature = "std", feature = "collections"))]
+    #[cfg(any(feature = "std", feature = "alloc"))]
     fn serialize_struct_variant(
         self,
         _: &'static str,
@@ -324,10 +323,14 @@ where
         let mut map = try!(self.delegate.serialize_map(Some(2)));
         try!(map.serialize_entry(self.tag, self.variant_name));
         try!(map.serialize_key(inner_variant));
-        Ok(SerializeStructVariantAsMapValue::new(map, inner_variant, len),)
+        Ok(SerializeStructVariantAsMapValue::new(
+            map,
+            inner_variant,
+            len,
+        ))
     }
 
-    #[cfg(not(any(feature = "std", feature = "collections")))]
+    #[cfg(not(any(feature = "std", feature = "alloc")))]
     fn collect_str<T: ?Sized>(self, _: &T) -> Result<Self::Ok, Self::Error>
     where
         T: Display,
@@ -363,7 +366,7 @@ impl Display for Error {
     }
 }
 
-#[cfg(any(feature = "std", feature = "collections"))]
+#[cfg(any(feature = "std", feature = "alloc"))]
 mod content {
     use lib::*;
 
@@ -402,7 +405,10 @@ mod content {
         }
 
         fn end(mut self) -> Result<M::Ok, M::Error> {
-            try!(self.map.serialize_value(&Content::TupleStruct(self.name, self.fields)));
+            try!(
+                self.map
+                    .serialize_value(&Content::TupleStruct(self.name, self.fields))
+            );
             self.map.end()
         }
     }
@@ -444,7 +450,10 @@ mod content {
         }
 
         fn end(mut self) -> Result<M::Ok, M::Error> {
-            try!(self.map.serialize_value(&Content::Struct(self.name, self.fields)));
+            try!(
+                self.map
+                    .serialize_value(&Content::Struct(self.name, self.fields))
+            );
             self.map.end()
         }
     }
@@ -485,7 +494,12 @@ mod content {
         TupleVariant(&'static str, u32, &'static str, Vec<Content>),
         Map(Vec<(Content, Content)>),
         Struct(&'static str, Vec<(&'static str, Content)>),
-        StructVariant(&'static str, u32, &'static str, Vec<(&'static str, Content)>),
+        StructVariant(
+            &'static str,
+            u32,
+            &'static str,
+            Vec<(&'static str, Content)>,
+        ),
     }
 
     impl Serialize for Content {
@@ -687,7 +701,10 @@ mod content {
         where
             T: Serialize,
         {
-            Ok(Content::NewtypeStruct(name, Box::new(try!(value.serialize(self)))),)
+            Ok(Content::NewtypeStruct(
+                name,
+                Box::new(try!(value.serialize(self))),
+            ))
         }
 
         fn serialize_newtype_variant<T: ?Sized>(
@@ -700,32 +717,26 @@ mod content {
         where
             T: Serialize,
         {
-            Ok(
-                Content::NewtypeVariant(
-                    name,
-                    variant_index,
-                    variant,
-                    Box::new(try!(value.serialize(self))),
-                ),
-            )
+            Ok(Content::NewtypeVariant(
+                name,
+                variant_index,
+                variant,
+                Box::new(try!(value.serialize(self))),
+            ))
         }
 
         fn serialize_seq(self, len: Option<usize>) -> Result<Self::SerializeSeq, E> {
-            Ok(
-                SerializeSeq {
-                    elements: Vec::with_capacity(len.unwrap_or(0)),
-                    error: PhantomData,
-                },
-            )
+            Ok(SerializeSeq {
+                elements: Vec::with_capacity(len.unwrap_or(0)),
+                error: PhantomData,
+            })
         }
 
         fn serialize_tuple(self, len: usize) -> Result<Self::SerializeTuple, E> {
-            Ok(
-                SerializeTuple {
-                    elements: Vec::with_capacity(len),
-                    error: PhantomData,
-                },
-            )
+            Ok(SerializeTuple {
+                elements: Vec::with_capacity(len),
+                error: PhantomData,
+            })
        }
 
        fn serialize_tuple_struct(
@@ -733,13 +744,11 @@ mod content {
            name: &'static str,
            len: usize,
        ) -> Result<Self::SerializeTupleStruct, E> {
-            Ok(
-                SerializeTupleStruct {
-                    name: name,
-                    fields: Vec::with_capacity(len),
-                    error: PhantomData,
-                },
-            )
+            Ok(SerializeTupleStruct {
+                name: name,
+                fields: Vec::with_capacity(len),
+                error: PhantomData,
+            })
        }
 
        fn serialize_tuple_variant(
@@ -749,25 +758,21 @@ mod content {
            variant: &'static str,
            len: usize,
        ) -> Result<Self::SerializeTupleVariant, E> {
-            Ok(
-                SerializeTupleVariant {
-                    name: name,
-                    variant_index: variant_index,
-                    variant: variant,
-                    fields: Vec::with_capacity(len),
-                    error: PhantomData,
-                },
-            )
+            Ok(SerializeTupleVariant {
+                name: name,
+                variant_index: variant_index,
+                variant: variant,
+                fields: Vec::with_capacity(len),
+                error: PhantomData,
+            })
        }
 
        fn serialize_map(self, len: Option<usize>) -> Result<Self::SerializeMap, E> {
-            Ok(
-                SerializeMap {
-                    entries: Vec::with_capacity(len.unwrap_or(0)),
-                    key: None,
-                    error: PhantomData,
-                },
-            )
+            Ok(SerializeMap {
+                entries: Vec::with_capacity(len.unwrap_or(0)),
+                key: None,
+                error: PhantomData,
+            })
        }
 
        fn serialize_struct(
@@ -775,13 +780,11 @@ mod content {
            name: &'static str,
            len: usize,
        ) -> Result<Self::SerializeStruct, E> {
-            Ok(
-                SerializeStruct {
-                    name: name,
-                    fields: Vec::with_capacity(len),
-                    error: PhantomData,
-                },
-            )
+            Ok(SerializeStruct {
+                name: name,
+                fields: Vec::with_capacity(len),
+                error: PhantomData,
+            })
        }
 
        fn serialize_struct_variant(
@@ -791,15 +794,13 @@ mod content {
            variant: &'static str,
            len: usize,
        ) -> Result<Self::SerializeStructVariant, E> {
-            Ok(
-                SerializeStructVariant {
-                    name: name,
-                    variant_index: variant_index,
-                    variant: variant,
-                    fields: Vec::with_capacity(len),
+            Ok(SerializeStructVariant {
+                name: name,
+                variant_index: variant_index,
+                variant: variant,
+                fields: Vec::with_capacity(len),
|
||||||
fields: Vec::with_capacity(len),
|
error: PhantomData,
|
||||||
error: PhantomData,
|
})
|
||||||
},
|
|
||||||
)
|
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -907,7 +908,12 @@ mod content {
|
|||||||
}
|
}
|
||||||
|
|
||||||
fn end(self) -> Result<Content, E> {
|
fn end(self) -> Result<Content, E> {
|
||||||
Ok(Content::TupleVariant(self.name, self.variant_index, self.variant, self.fields),)
|
Ok(Content::TupleVariant(
|
||||||
|
self.name,
|
||||||
|
self.variant_index,
|
||||||
|
self.variant,
|
||||||
|
self.fields,
|
||||||
|
))
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -1013,7 +1019,12 @@ mod content {
|
|||||||
}
|
}
|
||||||
|
|
||||||
fn end(self) -> Result<Content, E> {
|
fn end(self) -> Result<Content, E> {
|
||||||
Ok(Content::StructVariant(self.name, self.variant_index, self.variant, self.fields),)
|
Ok(Content::StructVariant(
|
||||||
|
self.name,
|
||||||
|
self.variant_index,
|
||||||
|
self.variant,
|
||||||
|
self.fields,
|
||||||
|
))
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
+118 -31
@@ -56,7 +56,7 @@ impl Serialize for str {
     }
 }

-#[cfg(any(feature = "std", feature = "collections"))]
+#[cfg(any(feature = "std", feature = "alloc"))]
 impl Serialize for String {
     #[inline]
     fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
@@ -177,6 +177,7 @@ where
     }
 }

+#[cfg(any(feature = "std", feature = "alloc"))]
 macro_rules! seq_impl {
     ($ty:ident < T $(: $tbound1:ident $(+ $tbound2:ident)*)* $(, $typaram:ident : $bound:ident)* >) => {
         impl<T $(, $typaram)*> Serialize for $ty<T $(, $typaram)*>

@@ -195,22 +196,22 @@ macro_rules! seq_impl {
     }
 }

-#[cfg(any(feature = "std", feature = "collections"))]
+#[cfg(any(feature = "std", feature = "alloc"))]
 seq_impl!(BinaryHeap<T: Ord>);

-#[cfg(any(feature = "std", feature = "collections"))]
+#[cfg(any(feature = "std", feature = "alloc"))]
 seq_impl!(BTreeSet<T: Ord>);

 #[cfg(feature = "std")]
 seq_impl!(HashSet<T: Eq + Hash, H: BuildHasher>);

-#[cfg(any(feature = "std", feature = "collections"))]
+#[cfg(any(feature = "std", feature = "alloc"))]
 seq_impl!(LinkedList<T>);

-#[cfg(any(feature = "std", feature = "collections"))]
+#[cfg(any(feature = "std", feature = "alloc"))]
 seq_impl!(Vec<T>);

-#[cfg(any(feature = "std", feature = "collections"))]
+#[cfg(any(feature = "std", feature = "alloc"))]
 seq_impl!(VecDeque<T>);

 ////////////////////////////////////////////////////////////////////////////////

@@ -290,6 +291,7 @@ tuple_impls! {

 ////////////////////////////////////////////////////////////////////////////////

+#[cfg(any(feature = "std", feature = "alloc"))]
 macro_rules! map_impl {
     ($ty:ident < K $(: $kbound1:ident $(+ $kbound2:ident)*)*, V $(, $typaram:ident : $bound:ident)* >) => {
         impl<K, V $(, $typaram)*> Serialize for $ty<K, V $(, $typaram)*>

@@ -309,7 +311,7 @@ macro_rules! map_impl {
     }
 }

-#[cfg(any(feature = "std", feature = "collections"))]
+#[cfg(any(feature = "std", feature = "alloc"))]
 map_impl!(BTreeMap<K: Ord, V>);

 #[cfg(feature = "std")]

@@ -338,19 +340,29 @@ deref_impl!(<'a, T: ?Sized> Serialize for &'a mut T where T: Serialize);
 deref_impl!(<T: ?Sized> Serialize for Box<T> where T: Serialize);

 #[cfg(all(feature = "rc", any(feature = "std", feature = "alloc")))]
-deref_impl!(<T> Serialize for Rc<T> where T: Serialize);
+deref_impl!(<T: ?Sized> Serialize for Rc<T> where T: Serialize);

 #[cfg(all(feature = "rc", any(feature = "std", feature = "alloc")))]
-deref_impl!(<T> Serialize for Arc<T> where T: Serialize);
+deref_impl!(<T: ?Sized> Serialize for Arc<T> where T: Serialize);

-#[cfg(any(feature = "std", feature = "collections"))]
+#[cfg(any(feature = "std", feature = "alloc"))]
 deref_impl!(<'a, T: ?Sized> Serialize for Cow<'a, T> where T: Serialize + ToOwned);

-#[cfg(feature = "unstable")]
-deref_impl!(<T> Serialize for NonZero<T> where T: Serialize + Zeroable);
-
 ////////////////////////////////////////////////////////////////////////////////

+#[cfg(feature = "unstable")]
+impl<T> Serialize for NonZero<T>
+where
+    T: Serialize + Zeroable + Clone,
+{
+    fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
+    where
+        S: Serializer,
+    {
+        self.clone().get().serialize(serializer)
+    }
+}
+
 impl<T> Serialize for Cell<T>
 where
     T: Serialize + Copy,
@@ -445,6 +457,24 @@ impl Serialize for Duration {

 ////////////////////////////////////////////////////////////////////////////////

+#[cfg(feature = "std")]
+impl Serialize for SystemTime {
+    fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
+    where
+        S: Serializer,
+    {
+        use super::SerializeStruct;
+        let duration_since_epoch = self.duration_since(UNIX_EPOCH)
+            .expect("SystemTime must be later than UNIX_EPOCH");
+        let mut state = try!(serializer.serialize_struct("SystemTime", 2));
+        try!(state.serialize_field("secs_since_epoch", &duration_since_epoch.as_secs()));
+        try!(state.serialize_field("nanos_since_epoch", &duration_since_epoch.subsec_nanos()));
+        state.end()
+    }
+}
+
+////////////////////////////////////////////////////////////////////////////////
+
 /// Serialize a value that implements `Display` as a string, when that string is
 /// statically known to never have more than a constant `MAX_LEN` bytes.
 ///
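The `SystemTime` impl added above reduces a timestamp to two struct fields: whole seconds and sub-second nanoseconds since the Unix epoch. A standalone sketch of that decomposition (plain `std`, no serde; the function name is ours):

```rust
use std::time::{Duration, SystemTime, UNIX_EPOCH};

// Mirror of the decomposition the new impl performs: a SystemTime is
// reduced to whole seconds plus sub-second nanoseconds, the two values
// the serializer emits as "secs_since_epoch" / "nanos_since_epoch".
fn split_since_epoch(t: SystemTime) -> (u64, u32) {
    let d: Duration = t
        .duration_since(UNIX_EPOCH)
        .expect("SystemTime must be later than UNIX_EPOCH");
    (d.as_secs(), d.subsec_nanos())
}

fn main() {
    let t = UNIX_EPOCH + Duration::new(5, 123_000_000);
    let (secs, nanos) = split_since_epoch(t);
    println!("secs_since_epoch={} nanos_since_epoch={}", secs, nanos);
}
```

Note the panic for times before the epoch, which is exactly the `expect` the real impl carries.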
@@ -477,9 +507,20 @@ impl Serialize for net::IpAddr {
     where
         S: Serializer,
     {
-        match *self {
-            net::IpAddr::V4(ref a) => a.serialize(serializer),
-            net::IpAddr::V6(ref a) => a.serialize(serializer),
+        if serializer.is_human_readable() {
+            match *self {
+                net::IpAddr::V4(ref a) => a.serialize(serializer),
+                net::IpAddr::V6(ref a) => a.serialize(serializer),
+            }
+        } else {
+            match *self {
+                net::IpAddr::V4(ref a) => {
+                    serializer.serialize_newtype_variant("IpAddr", 0, "V4", a)
+                }
+                net::IpAddr::V6(ref a) => {
+                    serializer.serialize_newtype_variant("IpAddr", 1, "V6", a)
+                }
+            }
         }
     }
 }
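The rewritten `IpAddr` impl branches on whether the serializer wants human-readable output: text formats get the `Display` form, binary formats get a compact structured form. The same branching in isolation (no serde; the `human_readable` flag and `Encoded` type are stand-ins of ours for `Serializer::is_human_readable()` and the data format's output):

```rust
use std::net::Ipv4Addr;

// Two encodings of the same address, mirroring the branch in the diff:
// text formats get the Display string, binary formats the raw octets.
#[derive(Debug, PartialEq)]
enum Encoded {
    Text(String),
    Bytes([u8; 4]),
}

fn encode(addr: Ipv4Addr, human_readable: bool) -> Encoded {
    if human_readable {
        // What a JSON-like format would receive.
        Encoded::Text(addr.to_string())
    } else {
        // What a Bincode-like format would receive.
        Encoded::Bytes(addr.octets())
    }
}

fn main() {
    let addr = Ipv4Addr::new(127, 0, 0, 1);
    println!("{:?} / {:?}", encode(addr, true), encode(addr, false));
}
```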
@@ -490,9 +531,13 @@ impl Serialize for net::Ipv4Addr {
     where
         S: Serializer,
     {
-        /// "101.102.103.104".len()
-        const MAX_LEN: usize = 15;
-        serialize_display_bounded_length!(self, MAX_LEN, serializer)
+        if serializer.is_human_readable() {
+            const MAX_LEN: usize = 15;
+            debug_assert_eq!(MAX_LEN, "101.102.103.104".len());
+            serialize_display_bounded_length!(self, MAX_LEN, serializer)
+        } else {
+            self.octets().serialize(serializer)
+        }
     }
 }

@@ -502,9 +547,13 @@ impl Serialize for net::Ipv6Addr {
     where
         S: Serializer,
     {
-        /// "1000:1002:1003:1004:1005:1006:1007:1008".len()
-        const MAX_LEN: usize = 39;
-        serialize_display_bounded_length!(self, MAX_LEN, serializer)
+        if serializer.is_human_readable() {
+            const MAX_LEN: usize = 39;
+            debug_assert_eq!(MAX_LEN, "1001:1002:1003:1004:1005:1006:1007:1008".len());
+            serialize_display_bounded_length!(self, MAX_LEN, serializer)
+        } else {
+            self.octets().serialize(serializer)
+        }
     }
 }

@@ -514,9 +563,20 @@ impl Serialize for net::SocketAddr {
     where
         S: Serializer,
     {
-        match *self {
-            net::SocketAddr::V4(ref addr) => addr.serialize(serializer),
-            net::SocketAddr::V6(ref addr) => addr.serialize(serializer),
+        if serializer.is_human_readable() {
+            match *self {
+                net::SocketAddr::V4(ref addr) => addr.serialize(serializer),
+                net::SocketAddr::V6(ref addr) => addr.serialize(serializer),
+            }
+        } else {
+            match *self {
+                net::SocketAddr::V4(ref addr) => {
+                    serializer.serialize_newtype_variant("SocketAddr", 0, "V4", addr)
+                }
+                net::SocketAddr::V6(ref addr) => {
+                    serializer.serialize_newtype_variant("SocketAddr", 1, "V6", addr)
+                }
+            }
         }
     }
 }

@@ -527,9 +587,13 @@ impl Serialize for net::SocketAddrV4 {
     where
         S: Serializer,
     {
-        /// "101.102.103.104:65000".len()
-        const MAX_LEN: usize = 21;
-        serialize_display_bounded_length!(self, MAX_LEN, serializer)
+        if serializer.is_human_readable() {
+            const MAX_LEN: usize = 21;
+            debug_assert_eq!(MAX_LEN, "101.102.103.104:65000".len());
+            serialize_display_bounded_length!(self, MAX_LEN, serializer)
+        } else {
+            (self.ip(), self.port()).serialize(serializer)
+        }
     }
 }

@@ -539,9 +603,16 @@ impl Serialize for net::SocketAddrV6 {
     where
         S: Serializer,
     {
-        /// "[1000:1002:1003:1004:1005:1006:1007:1008]:65000".len()
-        const MAX_LEN: usize = 47;
-        serialize_display_bounded_length!(self, MAX_LEN, serializer)
+        if serializer.is_human_readable() {
+            const MAX_LEN: usize = 47;
+            debug_assert_eq!(
+                MAX_LEN,
+                "[1001:1002:1003:1004:1005:1006:1007:1008]:65000".len()
+            );
+            serialize_display_bounded_length!(self, MAX_LEN, serializer)
+        } else {
+            (self.ip(), self.port()).serialize(serializer)
+        }
     }
 }
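The hunks above turn doc comments like `/// "101.102.103.104".len()` into `debug_assert_eq!` checks, so the documented worst-case `Display` lengths are actually verified in debug builds instead of only stated. The four constants can be checked directly:

```rust
fn main() {
    // Worst-case Display lengths asserted by the new code.
    const V4_MAX_LEN: usize = 15; // "101.102.103.104"
    const V4_SOCKET_MAX_LEN: usize = 21; // "101.102.103.104:65000"
    const V6_MAX_LEN: usize = 39; // "1001:1002:1003:1004:1005:1006:1007:1008"
    const V6_SOCKET_MAX_LEN: usize = 47; // "[1001:...]:65000"

    assert_eq!(V4_MAX_LEN, "101.102.103.104".len());
    assert_eq!(V4_SOCKET_MAX_LEN, "101.102.103.104:65000".len());
    assert_eq!(V6_MAX_LEN, "1001:1002:1003:1004:1005:1006:1007:1008".len());
    assert_eq!(
        V6_SOCKET_MAX_LEN,
        "[1001:1002:1003:1004:1005:1006:1007:1008]:65000".len()
    );
    println!("all four length constants check out");
}
```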
@@ -601,3 +672,19 @@ impl Serialize for OsString {
         self.as_os_str().serialize(serializer)
     }
 }
+
+////////////////////////////////////////////////////////////////////////////////
+
+#[cfg(feature = "std")]
+impl<T> Serialize for Wrapping<T>
+where
+    T: Serialize,
+{
+    #[inline]
+    fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
+    where
+        S: Serializer,
+    {
+        self.0.serialize(serializer)
+    }
+}
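The new `Wrapping<T>` impl is a one-liner that forwards to the inner value via `self.0`, so `Wrapping` is invisible in the serialized output. The transparent-newtype delegation in isolation (the `ToRepr` trait is a toy stand-in of ours, not serde's `Serialize`):

```rust
use std::num::Wrapping;

// A stand-in "serializer" that renders to a string; the point is the
// delegation: Wrapping<T> has no representation of its own, it forwards
// to the wrapped value exactly like the new impl's `self.0.serialize(...)`.
trait ToRepr {
    fn to_repr(&self) -> String;
}

impl ToRepr for u8 {
    fn to_repr(&self) -> String {
        self.to_string()
    }
}

impl<T: ToRepr> ToRepr for Wrapping<T> {
    fn to_repr(&self) -> String {
        self.0.to_repr() // delegate to the inner value
    }
}

fn main() {
    // Wrapping arithmetic overflows silently; the representation is
    // still just the inner number.
    let x = Wrapping(255u8) + Wrapping(1);
    println!("{}", x.to_repr());
}
```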
@@ -10,8 +10,8 @@

 use lib::*;

-use ser::{self, Serialize, SerializeSeq, SerializeTuple, SerializeTupleStruct,
-          SerializeTupleVariant, SerializeMap, SerializeStruct, SerializeStructVariant};
+use ser::{self, Serialize, SerializeMap, SerializeSeq, SerializeStruct, SerializeStructVariant,
+          SerializeTuple, SerializeTupleStruct, SerializeTupleVariant};

 /// Helper type for implementing a `Serializer` that does not support
 /// serializing one of the compound types.
+69 -2
@@ -89,6 +89,7 @@
 //!  - OsString
 //! - **Miscellaneous standard library types**:
 //!  - Duration
+//!  - SystemTime
 //!  - Path
 //!  - PathBuf
 //!  - Range\<T\>
@@ -1321,7 +1322,7 @@ pub trait Serializer: Sized {
     ///
     /// [`String`]: https://doc.rust-lang.org/std/string/struct.String.html
     /// [`serialize_str`]: #tymethod.serialize_str
-    #[cfg(any(feature = "std", feature = "collections"))]
+    #[cfg(any(feature = "std", feature = "alloc"))]
     fn collect_str<T: ?Sized>(self, value: &T) -> Result<Self::Ok, Self::Error>
     where
         T: Display,
@@ -1358,10 +1359,62 @@
     ///     }
     /// }
     /// ```
-    #[cfg(not(any(feature = "std", feature = "collections")))]
+    #[cfg(not(any(feature = "std", feature = "alloc")))]
     fn collect_str<T: ?Sized>(self, value: &T) -> Result<Self::Ok, Self::Error>
     where
         T: Display;
+
+    /// Determine whether `Serialize` implementations should serialize in
+    /// human-readable form.
+    ///
+    /// Some types have a human-readable form that may be somewhat expensive to
+    /// construct, as well as a binary form that is compact and efficient.
+    /// Generally text-based formats like JSON and YAML will prefer to use the
+    /// human-readable one and binary formats like Bincode will prefer the
+    /// compact one.
+    ///
+    /// ```
+    /// # use std::fmt::{self, Display};
+    /// #
+    /// # struct Timestamp;
+    /// #
+    /// # impl Timestamp {
+    /// #     fn seconds_since_epoch(&self) -> u64 { unimplemented!() }
+    /// # }
+    /// #
+    /// # impl Display for Timestamp {
+    /// #     fn fmt(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
+    /// #         unimplemented!()
+    /// #     }
+    /// # }
+    /// #
+    /// use serde::{Serialize, Serializer};
+    ///
+    /// impl Serialize for Timestamp {
+    ///     fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
+    ///         where S: Serializer
+    ///     {
+    ///         if serializer.is_human_readable() {
+    ///             // Serialize to a human-readable string "2015-05-15T17:01:00Z".
+    ///             self.to_string().serialize(serializer)
+    ///         } else {
+    ///             // Serialize to a compact binary representation.
+    ///             self.seconds_since_epoch().serialize(serializer)
+    ///         }
+    ///     }
+    /// }
+    /// ```
+    ///
+    /// The default implementation of this method returns `true`. Data formats
+    /// may override this to `false` to request a compact form for types that
+    /// support one. Note that modifying this method to change a format from
+    /// human-readable to compact or vice versa should be regarded as a breaking
+    /// change, as a value serialized in human-readable mode is not required to
+    /// deserialize from the same data in compact mode.
+    #[inline]
+    fn is_human_readable(&self) -> bool {
+        true
+    }
 }

 /// Returned from `Serializer::serialize_seq`.
@@ -1726,6 +1779,13 @@ pub trait SerializeStruct {
     where
         T: Serialize;

+    /// Indicate that a struct field has been skipped.
+    #[inline]
+    fn skip_field(&mut self, key: &'static str) -> Result<(), Self::Error> {
+        let _ = key;
+        Ok(())
+    }
+
     /// Finish serializing a struct.
     fn end(self) -> Result<Self::Ok, Self::Error>;
 }

@@ -1771,6 +1831,13 @@ pub trait SerializeStructVariant {
     where
         T: Serialize;

+    /// Indicate that a struct variant field has been skipped.
+    #[inline]
+    fn skip_field(&mut self, key: &'static str) -> Result<(), Self::Error> {
+        let _ = key;
+        Ok(())
+    }
+
     /// Finish serializing a struct variant.
     fn end(self) -> Result<Self::Ok, Self::Error>;
 }
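`skip_field` is added with a default body, so every existing `SerializeStruct` and `SerializeStructVariant` impl keeps compiling, and `let _ = key;` silences the unused-parameter warning in the default. The backward-compatible-default pattern with a toy trait of ours (not serde's):

```rust
// Adding a method to a public trait without breaking downstream impls:
// give it a default body. Existing impls of `Sink` compile unchanged,
// while new impls may override `skip_field` to record skipped keys.
trait Sink {
    fn field(&mut self, key: &'static str);

    fn skip_field(&mut self, key: &'static str) {
        // Default: ignore the key; `let _ =` marks it intentionally unused.
        let _ = key;
    }
}

struct Recorder {
    seen: Vec<&'static str>,
}

impl Sink for Recorder {
    fn field(&mut self, key: &'static str) {
        self.seen.push(key);
    }
    // skip_field is not written out here: the default no-op applies.
}

fn main() {
    let mut r = Recorder { seen: Vec::new() };
    r.field("a");
    r.skip_field("b"); // does nothing, by default
    println!("{:?}", r.seen);
}
```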
@@ -1,6 +1,6 @@
 [package]
 name = "serde_derive"
-version = "1.0.2" # remember to update html_root_url
+version = "1.0.25" # remember to update html_root_url
 authors = ["Erick Tryzelaar <erick.tryzelaar@gmail.com>", "David Tolnay <dtolnay@gmail.com>"]
 license = "MIT/Apache-2.0"
 description = "Macros 1.1 implementation of #[derive(Serialize, Deserialize)]"

@@ -14,11 +14,18 @@ include = ["Cargo.toml", "src/**/*.rs", "README.md", "LICENSE-APACHE", "LICENSE-
 [badges]
 travis-ci = { repository = "serde-rs/serde" }

+[features]
+default = []
+deserialize_in_place = []
+
 [lib]
 name = "serde_derive"
 proc-macro = true

 [dependencies]
 quote = "0.3.8"
-serde_derive_internals = { version = "=0.15.0", default-features = false, path = "../serde_derive_internals" }
+serde_derive_internals = { version = "=0.18.1", default-features = false, path = "../serde_derive_internals" }
 syn = { version = "0.11", features = ["visit"] }
+
+[dev-dependencies]
+serde = { version = "1.0", path = "../serde" }
+69 -67
@@ -10,12 +10,12 @@ use std::collections::HashSet;

 use syn::{self, visit};

-use internals::ast::Container;
+use internals::ast::{Body, Container};
 use internals::attr;

 macro_rules! path {
     ($($path:tt)+) => {
-        syn::parse_path(stringify!($($path)+)).unwrap()
+        syn::parse_path(quote!($($path)+).as_str()).unwrap()
     };
 }

@@ -27,14 +27,10 @@ pub fn without_defaults(generics: &syn::Generics) -> syn::Generics {
         ty_params: generics
             .ty_params
             .iter()
-            .map(
-                |ty_param| {
-                    syn::TyParam {
-                        default: None,
-                        ..ty_param.clone()
-                    }
-                },
-            )
+            .map(|ty_param| syn::TyParam {
+                default: None,
+                ..ty_param.clone()
+            })
             .collect(),
         ..generics.clone()
     }

@@ -88,7 +84,7 @@ pub fn with_bound<F>(
     bound: &syn::Path,
 ) -> syn::Generics
 where
-    F: Fn(&attr::Field) -> bool,
+    F: Fn(&attr::Field, Option<&attr::Variant>) -> bool,
 {
     struct FindTyParams {
         // Set of all generic type parameters on the current struct (A, B, C in
@@ -116,6 +112,14 @@ where
             }
             visit::walk_path(self, path);
         }
+
+        // Type parameter should not be considered used by a macro path.
+        //
+        //     struct TypeMacro<T> {
+        //         mac: T!(),
+        //         marker: PhantomData<T>,
+        //     }
+        fn visit_mac(&mut self, _mac: &syn::Mac) {}
     }

     let all_ty_params: HashSet<_> = generics

@@ -124,17 +128,25 @@ where
         .map(|ty_param| ty_param.ident.clone())
         .collect();

-    let relevant_tys = cont.body
-        .all_fields()
-        .filter(|&field| filter(&field.attrs))
-        .map(|field| &field.ty);
-
     let mut visitor = FindTyParams {
         all_ty_params: all_ty_params,
         relevant_ty_params: HashSet::new(),
     };
-    for ty in relevant_tys {
-        visit::walk_ty(&mut visitor, ty);
+    match cont.body {
+        Body::Enum(ref variants) => for variant in variants.iter() {
+            let relevant_fields = variant
+                .fields
+                .iter()
+                .filter(|field| filter(&field.attrs, Some(&variant.attrs)));
+            for field in relevant_fields {
+                visit::walk_ty(&mut visitor, field.ty);
+            }
+        },
+        Body::Struct(_, ref fields) => {
+            for field in fields.iter().filter(|field| filter(&field.attrs, None)) {
+                visit::walk_ty(&mut visitor, field.ty);
+            }
+        }
     }

     let new_predicates = generics
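`with_bound` now matches on `Body` instead of iterating a flat `all_fields()`, so the filter closure sees the enclosing variant's attributes for enum fields and `None` for struct fields. A reduced model of that traversal (all names and the `skip` flags are ours, not serde_derive's real AST types):

```rust
// Reduced model of the new traversal: enum fields are filtered with
// their variant's attrs available, struct fields with None.
struct Field { ty: &'static str, skip: bool }
struct Variant { skip: bool, fields: Vec<Field> }
enum Body {
    Enum(Vec<Variant>),
    Struct(Vec<Field>),
}

fn relevant_tys(body: &Body) -> Vec<&'static str> {
    // Stand-in for the `filter` closure: a field is relevant unless it,
    // or its whole variant, is marked as skipped.
    let filter = |field: &Field, variant: Option<&Variant>| {
        !field.skip && variant.map_or(true, |v| !v.skip)
    };
    let mut tys = Vec::new();
    match *body {
        Body::Enum(ref variants) => for variant in variants {
            for field in variant.fields.iter().filter(|f| filter(f, Some(variant))) {
                tys.push(field.ty);
            }
        },
        Body::Struct(ref fields) => {
            for field in fields.iter().filter(|f| filter(f, None)) {
                tys.push(field.ty);
            }
        }
    }
    tys
}

fn main() {
    let body = Body::Enum(vec![
        Variant { skip: false, fields: vec![Field { ty: "T", skip: false }] },
        Variant { skip: true, fields: vec![Field { ty: "U", skip: false }] },
    ]);
    println!("{:?}", relevant_tys(&body));
}
```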
@@ -142,27 +154,23 @@ where
         .iter()
         .map(|ty_param| ty_param.ident.clone())
         .filter(|id| visitor.relevant_ty_params.contains(id))
-        .map(
-            |id| {
-                syn::WherePredicate::BoundPredicate(
-                    syn::WhereBoundPredicate {
-                        bound_lifetimes: Vec::new(),
-                        // the type parameter that is being bounded e.g. T
-                        bounded_ty: syn::Ty::Path(None, id.into()),
-                        // the bound e.g. Serialize
-                        bounds: vec![
-                            syn::TyParamBound::Trait(
-                                syn::PolyTraitRef {
-                                    bound_lifetimes: Vec::new(),
-                                    trait_ref: bound.clone(),
-                                },
-                                syn::TraitBoundModifier::None,
-                            ),
-                        ],
-                    },
-                )
-            },
-        );
+        .map(|id| {
+            syn::WherePredicate::BoundPredicate(syn::WhereBoundPredicate {
+                bound_lifetimes: Vec::new(),
+                // the type parameter that is being bounded e.g. T
+                bounded_ty: syn::Ty::Path(None, id.into()),
+                // the bound e.g. Serialize
+                bounds: vec![
+                    syn::TyParamBound::Trait(
+                        syn::PolyTraitRef {
+                            bound_lifetimes: Vec::new(),
+                            trait_ref: bound.clone(),
+                        },
+                        syn::TraitBoundModifier::None,
+                    ),
+                ],
+            })
+        });

     let mut generics = generics.clone();
     generics.where_clause.predicates.extend(new_predicates);

@@ -178,25 +186,23 @@ pub fn with_self_bound(
     generics
         .where_clause
         .predicates
-        .push(
-            syn::WherePredicate::BoundPredicate(
-                syn::WhereBoundPredicate {
-                    bound_lifetimes: Vec::new(),
-                    // the type that is being bounded e.g. MyStruct<'a, T>
-                    bounded_ty: type_of_item(cont),
-                    // the bound e.g. Default
-                    bounds: vec![
-                        syn::TyParamBound::Trait(
-                            syn::PolyTraitRef {
-                                bound_lifetimes: Vec::new(),
-                                trait_ref: bound.clone(),
-                            },
-                            syn::TraitBoundModifier::None,
-                        ),
-                    ],
-                },
-            ),
-        );
+        .push(syn::WherePredicate::BoundPredicate(
+            syn::WhereBoundPredicate {
+                bound_lifetimes: Vec::new(),
+                // the type that is being bounded e.g. MyStruct<'a, T>
+                bounded_ty: type_of_item(cont),
+                // the bound e.g. Default
+                bounds: vec![
+                    syn::TyParamBound::Trait(
+                        syn::PolyTraitRef {
+                            bound_lifetimes: Vec::new(),
+                            trait_ref: bound.clone(),
+                        },
+                        syn::TraitBoundModifier::None,
+                    ),
+                ],
+            },
+        ));
     generics
 }

@@ -213,15 +219,11 @@ pub fn with_lifetime_bound(generics: &syn::Generics, lifetime: &str) -> syn::Gen
             .push(syn::TyParamBound::Region(syn::Lifetime::new(lifetime)));
     }

-    generics
-        .lifetimes
-        .push(
-            syn::LifetimeDef {
-                attrs: Vec::new(),
-                lifetime: syn::Lifetime::new(lifetime),
-                bounds: Vec::new(),
-            },
-        );
+    generics.lifetimes.push(syn::LifetimeDef {
+        attrs: Vec::new(),
+        lifetime: syn::Lifetime::new(lifetime),
+        bounds: Vec::new(),
+    });

     generics
 }
+1042 -325
File diff suppressed because it is too large
@@ -6,7 +6,7 @@
 // option. This file may not be copied, modified, or distributed
 // except according to those terms.

-use quote::{Tokens, ToTokens};
+use quote::{ToTokens, Tokens};

 pub enum Fragment {
     /// Tokens that can be used as an expression.

@@ -73,3 +73,12 @@ impl ToTokens for Match {
         }
     }
 }
+
+impl AsRef<Tokens> for Fragment {
+    fn as_ref(&self) -> &Tokens {
+        match *self {
+            Fragment::Expr(ref expr) => expr,
+            Fragment::Block(ref block) => block,
+        }
+    }
+}
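The new `AsRef<Tokens>` impl lets either `Fragment` variant be borrowed as its underlying token stream, since both variants hold the same inner type. The same shape with `String` standing in for quote's `Tokens` (our substitution, to keep the sketch dependency-free):

```rust
// Shape of the new impl: both variants hold the same inner type, so
// AsRef can hand out a reference regardless of which variant we have.
enum Fragment {
    Expr(String),
    Block(String),
}

impl AsRef<String> for Fragment {
    fn as_ref(&self) -> &String {
        match *self {
            Fragment::Expr(ref expr) => expr,
            Fragment::Block(ref block) => block,
        }
    }
}

// Generic code can now accept any AsRef<String> without caring which
// variant it was handed.
fn takes_as_ref<T: AsRef<String>>(t: &T) -> usize {
    t.as_ref().len()
}

fn main() {
    let f = Fragment::Expr("1 + 1".to_string());
    println!("{}", takes_as_ref(&f));
}
```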
@@ -8,25 +8,29 @@

 //! This crate provides Serde's two derive macros.
 //!
-//! ```rust,ignore
+//! ```rust
+//! # #[macro_use]
+//! # extern crate serde_derive;
+//! #
 //! #[derive(Serialize, Deserialize)]
+//! # struct S;
+//! #
+//! # fn main() {}
 //! ```
 //!
 //! Please refer to [https://serde.rs/derive.html] for how to set this up.
 //!
 //! [https://serde.rs/derive.html]: https://serde.rs/derive.html

-#![doc(html_root_url = "https://docs.rs/serde_derive/1.0.2")]
+#![doc(html_root_url = "https://docs.rs/serde_derive/1.0.25")]

 #![cfg_attr(feature = "cargo-clippy", allow(too_many_arguments))]
 #![cfg_attr(feature = "cargo-clippy", allow(used_underscore_binding))]

 // The `quote!` macro requires deep recursion.
 #![recursion_limit = "192"]

-extern crate syn;
 #[macro_use]
 extern crate quote;
+extern crate syn;

 extern crate serde_derive_internals as internals;
|||||||
+272
-203
@@ -10,7 +10,7 @@ use syn::{self, Ident};
 use quote::Tokens;
 
 use bound;
-use fragment::{Fragment, Stmts, Match};
+use fragment::{Fragment, Match, Stmts};
 use internals::ast::{Body, Container, Field, Style, Variant};
 use internals::{attr, Ctxt};
 
@@ -29,9 +29,10 @@ pub fn expand_derive_serialize(input: &syn::DeriveInput) -> Result<Tokens, Strin
     let body = Stmts(serialize_body(&cont, &params));
 
     let impl_block = if let Some(remote) = cont.attrs.remote() {
+        let vis = &input.vis;
         quote! {
             impl #impl_generics #ident #ty_generics #where_clause {
-                fn serialize<__S>(__self: &#remote #ty_generics, __serializer: __S) -> _serde::export::Result<__S::Ok, __S::Error>
+                #vis fn serialize<__S>(__self: &#remote #ty_generics, __serializer: __S) -> _serde::export::Result<__S::Ok, __S::Error>
                     where __S: _serde::Serializer
                 {
                     #body
@@ -131,23 +132,23 @@ fn build_generics(cont: &Container) -> syn::Generics {
 
     match cont.attrs.ser_bound() {
         Some(predicates) => bound::with_where_predicates(&generics, predicates),
-        None => {
-            bound::with_bound(
-                cont,
-                &generics,
-                needs_serialize_bound,
-                &path!(_serde::Serialize),
-            )
-        }
+        None => bound::with_bound(
+            cont,
+            &generics,
+            needs_serialize_bound,
+            &path!(_serde::Serialize),
+        ),
     }
 }
 
-// Fields with a `skip_serializing` or `serialize_with` attribute are not
-// serialized by us so we do not generate a bound. Fields with a `bound`
-// attribute specify their own bound so we do not generate one. All other fields
-// may need a `T: Serialize` bound where T is the type of the field.
-fn needs_serialize_bound(attrs: &attr::Field) -> bool {
-    !attrs.skip_serializing() && attrs.serialize_with().is_none() && attrs.ser_bound().is_none()
+// Fields with a `skip_serializing` or `serialize_with` attribute, or which
+// belong to a variant with a `serialize_with` attribute, are not serialized by
+// us so we do not generate a bound. Fields with a `bound` attribute specify
+// their own bound so we do not generate one. All other fields may need a `T:
+// Serialize` bound where T is the type of the field.
+fn needs_serialize_bound(field: &attr::Field, variant: Option<&attr::Variant>) -> bool {
+    !field.skip_serializing() && field.serialize_with().is_none() && field.ser_bound().is_none()
+        && variant.map_or(true, |variant| variant.serialize_with().is_none())
 }
 
 fn serialize_body(cont: &Container, params: &Parameters) -> Fragment {
@@ -202,7 +203,7 @@ fn serialize_newtype_struct(
 
     let mut field_expr = get_field(params, field, 0);
     if let Some(path) = field.attrs.serialize_with() {
-        field_expr = wrap_serialize_with(params, field.ty, path, field_expr);
+        field_expr = wrap_serialize_field_with(params, field.ty, path, field_expr);
     }
 
     quote_expr! {
@@ -241,6 +242,7 @@ fn serialize_struct(params: &Parameters, fields: &[Field], cattrs: &attr::Contai
         params,
         false,
         quote!(_serde::ser::SerializeStruct::serialize_field),
+        quote!(_serde::ser::SerializeStruct::skip_field),
     );
 
     let type_name = cattrs.name().serialize_name();
@@ -253,16 +255,14 @@ fn serialize_struct(params: &Parameters, fields: &[Field], cattrs: &attr::Contai
     let let_mut = mut_if(serialized_fields.peek().is_some());
 
     let len = serialized_fields
-        .map(
-            |field| match field.attrs.skip_serializing_if() {
-                None => quote!(1),
-                Some(path) => {
-                    let ident = field.ident.clone().expect("struct has unnamed fields");
-                    let field_expr = get_field(params, field, ident);
-                    quote!(if #path(#field_expr) { 0 } else { 1 })
-                }
-            },
-        )
+        .map(|field| match field.attrs.skip_serializing_if() {
+            None => quote!(1),
+            Some(path) => {
+                let ident = field.ident.clone().expect("struct has unnamed fields");
+                let field_expr = get_field(params, field, ident);
+                quote!(if #path(#field_expr) { 0 } else { 1 })
+            }
+        })
         .fold(quote!(0), |sum, expr| quote!(#sum + #expr));
 
     quote_block! {
@@ -280,11 +280,9 @@ fn serialize_enum(params: &Parameters, variants: &[Variant], cattrs: &attr::Cont
     let arms: Vec<_> = variants
         .iter()
         .enumerate()
-        .map(
-            |(variant_index, variant)| {
-                serialize_variant(params, variant, variant_index as u32, cattrs)
-            },
-        )
+        .map(|(variant_index, variant)| {
+            serialize_variant(params, variant, variant_index as u32, cattrs)
+        })
        .collect();
 
     quote_expr! {
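The revised `needs_serialize_bound` above now also consults the enclosing variant, so fields covered by a variant-level `serialize_with` get no generated bound. A stdlib-only sketch of that decision logic (the `FieldAttrs`/`VariantAttrs` types here are hypothetical stand-ins for serde's `attr::Field`/`attr::Variant`):

```rust
// Hypothetical stand-ins for serde_derive_internals' attr::Field / attr::Variant.
struct FieldAttrs {
    skip_serializing: bool,
    has_serialize_with: bool,
    has_ser_bound: bool,
}

struct VariantAttrs {
    has_serialize_with: bool,
}

// Mirrors the decision in the diff: a field needs a `T: Serialize` bound only
// when serde itself serializes it and no custom bound was requested.
fn needs_serialize_bound(field: &FieldAttrs, variant: Option<&VariantAttrs>) -> bool {
    !field.skip_serializing && !field.has_serialize_with && !field.has_ser_bound
        && variant.map_or(true, |v| !v.has_serialize_with)
}

fn main() {
    let plain = FieldAttrs {
        skip_serializing: false,
        has_serialize_with: false,
        has_ser_bound: false,
    };
    // An ordinary field gets a bound.
    assert!(needs_serialize_bound(&plain, None));
    // The new case: a field inside a variant with `serialize_with` gets none.
    let custom = VariantAttrs { has_serialize_with: true };
    assert!(!needs_serialize_bound(&plain, Some(&custom)));
}
```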
@@ -315,13 +313,7 @@ fn serialize_variant(
         let fields_pat = match variant.style {
             Style::Unit => quote!(),
             Style::Newtype | Style::Tuple => quote!((..)),
-            Style::Struct => {
-                quote!(
-                    {
-                        ..
-                    }
-                )
-            }
+            Style::Struct => quote!({ .. }),
         };
         quote! {
             #this::#variant_ident #fields_pat => #skipped_err,
@@ -350,34 +342,26 @@ fn serialize_variant(
                 let fields = variant
                     .fields
                     .iter()
-                    .map(
-                        |f| {
-                            f.ident
-                                .clone()
-                                .expect("struct variant has unnamed fields")
-                        },
-                    );
+                    .map(|f| f.ident.clone().expect("struct variant has unnamed fields"));
                 quote! {
                     #this::#variant_ident { #(ref #fields),* }
                 }
             }
         };
 
-        let body = Match(
-            match *cattrs.tag() {
-                attr::EnumTag::External => {
-                    serialize_externally_tagged_variant(params, variant, variant_index, cattrs)
-                }
-                attr::EnumTag::Internal { ref tag } => {
-                    serialize_internally_tagged_variant(params, variant, cattrs, tag)
-                }
-                attr::EnumTag::Adjacent {
-                    ref tag,
-                    ref content,
-                } => serialize_adjacently_tagged_variant(params, variant, cattrs, tag, content),
-                attr::EnumTag::None => serialize_untagged_variant(params, variant, cattrs),
-            },
-        );
+        let body = Match(match *cattrs.tag() {
+            attr::EnumTag::External => {
+                serialize_externally_tagged_variant(params, variant, variant_index, cattrs)
+            }
+            attr::EnumTag::Internal { ref tag } => {
+                serialize_internally_tagged_variant(params, variant, cattrs, tag)
+            }
+            attr::EnumTag::Adjacent {
+                ref tag,
+                ref content,
+            } => serialize_adjacently_tagged_variant(params, variant, cattrs, tag, content),
+            attr::EnumTag::None => serialize_untagged_variant(params, variant, cattrs),
+        });
 
         quote! {
             #case => #body
@@ -394,6 +378,19 @@ fn serialize_externally_tagged_variant(
     let type_name = cattrs.name().serialize_name();
     let variant_name = variant.attrs.name().serialize_name();
 
+    if let Some(path) = variant.attrs.serialize_with() {
+        let ser = wrap_serialize_variant_with(params, path, &variant);
+        return quote_expr! {
+            _serde::Serializer::serialize_newtype_variant(
+                __serializer,
+                #type_name,
+                #variant_index,
+                #variant_name,
+                #ser,
+            )
+        };
+    }
+
     match variant.style {
         Style::Unit => {
             quote_expr! {
@@ -409,7 +406,7 @@ fn serialize_externally_tagged_variant(
             let field = &variant.fields[0];
             let mut field_expr = quote!(__field0);
             if let Some(path) = field.attrs.serialize_with() {
-                field_expr = wrap_serialize_with(params, field.ty, path, field_expr);
+                field_expr = wrap_serialize_field_with(params, field.ty, path, field_expr);
            }
 
             quote_expr! {
@@ -422,28 +419,24 @@ fn serialize_externally_tagged_variant(
                 )
             }
         }
-        Style::Tuple => {
-            serialize_tuple_variant(
-                TupleVariant::ExternallyTagged {
-                    type_name: type_name,
-                    variant_index: variant_index,
-                    variant_name: variant_name,
-                },
-                params,
-                &variant.fields,
-            )
-        }
-        Style::Struct => {
-            serialize_struct_variant(
-                StructVariant::ExternallyTagged {
-                    variant_index: variant_index,
-                    variant_name: variant_name,
-                },
-                params,
-                &variant.fields,
-                &type_name,
-            )
-        }
+        Style::Tuple => serialize_tuple_variant(
+            TupleVariant::ExternallyTagged {
+                type_name: type_name,
+                variant_index: variant_index,
+                variant_name: variant_name,
+            },
+            params,
+            &variant.fields,
+        ),
+        Style::Struct => serialize_struct_variant(
+            StructVariant::ExternallyTagged {
+                variant_index: variant_index,
+                variant_name: variant_name,
+            },
+            params,
+            &variant.fields,
+            &type_name,
+        ),
     }
 }
 
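The `EnumTag` match above dispatches between serde's four enum representations. A stdlib-only sketch of what each representation looks like on the wire, building JSON strings by hand (illustrative only, not serde's code; `value` stands for the variant's own serialized object):

```rust
// The four tagging strategies mirrored from serde's attr::EnumTag.
enum EnumTag<'a> {
    External,
    Internal { tag: &'a str },
    Adjacent { tag: &'a str, content: &'a str },
    None,
}

// `value` is the variant's own JSON object, e.g. `{"a":1}`.
fn render(tag: &EnumTag, variant: &str, value: &str) -> String {
    match *tag {
        // {"Variant":{...}}
        EnumTag::External => format!(r#"{{"{}":{}}}"#, variant, value),
        // {"tag":"Variant",...fields} -- the tag is spliced into the object.
        EnumTag::Internal { tag } => format!(r#"{{"{}":"{}",{}"#, tag, variant, &value[1..]),
        // {"tag":"Variant","content":{...}}
        EnumTag::Adjacent { tag, content } => {
            format!(r#"{{"{}":"{}","{}":{}}}"#, tag, variant, content, value)
        }
        // Untagged: just the fields.
        EnumTag::None => value.to_string(),
    }
}

fn main() {
    let v = r#"{"a":1}"#;
    assert_eq!(render(&EnumTag::External, "T", v), r#"{"T":{"a":1}}"#);
    assert_eq!(render(&EnumTag::Internal { tag: "type" }, "T", v), r#"{"type":"T","a":1}"#);
    assert_eq!(
        render(&EnumTag::Adjacent { tag: "t", content: "c" }, "T", v),
        r#"{"t":"T","c":{"a":1}}"#
    );
    assert_eq!(render(&EnumTag::None, "T", v), v);
}
```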
@@ -459,6 +452,20 @@ fn serialize_internally_tagged_variant(
     let enum_ident_str = params.type_name();
     let variant_ident_str = variant.ident.as_ref();
 
+    if let Some(path) = variant.attrs.serialize_with() {
+        let ser = wrap_serialize_variant_with(params, path, &variant);
+        return quote_expr! {
+            _serde::private::ser::serialize_tagged_newtype(
+                __serializer,
+                #enum_ident_str,
+                #variant_ident_str,
+                #tag,
+                #variant_name,
+                #ser,
+            )
+        };
+    }
+
     match variant.style {
         Style::Unit => {
             quote_block! {
@@ -473,7 +480,7 @@ fn serialize_internally_tagged_variant(
             let field = &variant.fields[0];
             let mut field_expr = quote!(__field0);
             if let Some(path) = field.attrs.serialize_with() {
-                field_expr = wrap_serialize_with(params, field.ty, path, field_expr);
+                field_expr = wrap_serialize_field_with(params, field.ty, path, field_expr);
             }
 
             quote_expr! {
@@ -487,17 +494,15 @@ fn serialize_internally_tagged_variant(
                 )
             }
         }
-        Style::Struct => {
-            serialize_struct_variant(
-                StructVariant::InternallyTagged {
-                    tag: tag,
-                    variant_name: variant_name,
-                },
-                params,
-                &variant.fields,
-                &type_name,
-            )
-        }
+        Style::Struct => serialize_struct_variant(
+            StructVariant::InternallyTagged {
+                tag: tag,
+                variant_name: variant_name,
+            },
+            params,
+            &variant.fields,
+            &type_name,
+        ),
         Style::Tuple => unreachable!("checked in serde_derive_internals"),
     }
 }
@@ -513,7 +518,12 @@ fn serialize_adjacently_tagged_variant(
     let type_name = cattrs.name().serialize_name();
     let variant_name = variant.attrs.name().serialize_name();
 
-    let inner = Stmts(
+    let inner = Stmts(if let Some(path) = variant.attrs.serialize_with() {
+        let ser = wrap_serialize_variant_with(params, path, &variant);
+        quote_expr! {
+            _serde::Serialize::serialize(#ser, __serializer)
+        }
+    } else {
         match variant.style {
             Style::Unit => {
                 return quote_block! {
@@ -528,7 +538,7 @@ fn serialize_adjacently_tagged_variant(
                 let field = &variant.fields[0];
                 let mut field_expr = quote!(__field0);
                 if let Some(path) = field.attrs.serialize_with() {
-                    field_expr = wrap_serialize_with(params, field.ty, path, field_expr);
+                    field_expr = wrap_serialize_field_with(params, field.ty, path, field_expr);
                }
 
                 quote_expr! {
@@ -538,44 +548,42 @@ fn serialize_adjacently_tagged_variant(
             Style::Tuple => {
                 serialize_tuple_variant(TupleVariant::Untagged, params, &variant.fields)
             }
-            Style::Struct => {
-                serialize_struct_variant(
-                    StructVariant::Untagged,
-                    params,
-                    &variant.fields,
-                    &variant_name,
-                )
-            }
-        },
-    );
+            Style::Struct => serialize_struct_variant(
+                StructVariant::Untagged,
+                params,
+                &variant.fields,
+                &variant_name,
+            ),
+        }
+    });
 
     let fields_ty = variant.fields.iter().map(|f| &f.ty);
     let ref fields_ident: Vec<_> = match variant.style {
-        Style::Unit => unreachable!(),
+        Style::Unit => {
+            if variant.attrs.serialize_with().is_some() {
+                vec![]
+            } else {
+                unreachable!()
+            }
+        }
         Style::Newtype => vec![Ident::new("__field0")],
-        Style::Tuple => {
-            (0..variant.fields.len())
-                .map(|i| Ident::new(format!("__field{}", i)))
-                .collect()
-        }
-        Style::Struct => {
-            variant
-                .fields
-                .iter()
-                .map(
-                    |f| {
-                        f.ident
-                            .clone()
-                            .expect("struct variant has unnamed fields")
-                    },
-                )
-                .collect()
-        }
+        Style::Tuple => (0..variant.fields.len())
+            .map(|i| Ident::new(format!("__field{}", i)))
+            .collect(),
+        Style::Struct => variant
+            .fields
+            .iter()
+            .map(|f| f.ident.clone().expect("struct variant has unnamed fields"))
+            .collect(),
     };
 
     let (_, ty_generics, where_clause) = params.generics.split_for_impl();
 
-    let wrapper_generics = bound::with_lifetime_bound(&params.generics, "'__a");
+    let wrapper_generics = if let Style::Unit = variant.style {
+        params.generics.clone()
+    } else {
+        bound::with_lifetime_bound(&params.generics, "'__a")
+    };
     let (wrapper_impl_generics, wrapper_ty_generics, _) = wrapper_generics.split_for_impl();
 
     quote_block! {
@@ -611,6 +619,13 @@ fn serialize_untagged_variant(
     variant: &Variant,
     cattrs: &attr::Container,
 ) -> Fragment {
+    if let Some(path) = variant.attrs.serialize_with() {
+        let ser = wrap_serialize_variant_with(params, path, &variant);
+        return quote_expr! {
+            _serde::Serialize::serialize(#ser, __serializer)
+        };
+    }
+
     match variant.style {
         Style::Unit => {
             quote_expr! {
@@ -621,7 +636,7 @@ fn serialize_untagged_variant(
             let field = &variant.fields[0];
             let mut field_expr = quote!(__field0);
             if let Some(path) = field.attrs.serialize_with() {
-                field_expr = wrap_serialize_with(params, field.ty, path, field_expr);
+                field_expr = wrap_serialize_field_with(params, field.ty, path, field_expr);
             }
 
             quote_expr! {
@@ -696,7 +711,10 @@ enum StructVariant<'a> {
         variant_index: u32,
         variant_name: String,
     },
-    InternallyTagged { tag: &'a str, variant_name: String },
+    InternallyTagged {
+        tag: &'a str,
+        variant_name: String,
+    },
     Untagged,
 }
 
@@ -706,15 +724,18 @@ fn serialize_struct_variant<'a>(
     fields: &[Field],
     name: &str,
 ) -> Fragment {
-    let method = match context {
-        StructVariant::ExternallyTagged { .. } => {
-            quote!(_serde::ser::SerializeStructVariant::serialize_field)
-        }
-        StructVariant::InternallyTagged { .. } |
-        StructVariant::Untagged => quote!(_serde::ser::SerializeStruct::serialize_field),
+    let (method, skip_method) = match context {
+        StructVariant::ExternallyTagged { .. } => (
+            quote!(_serde::ser::SerializeStructVariant::serialize_field),
+            quote!(_serde::ser::SerializeStructVariant::skip_field),
+        ),
+        StructVariant::InternallyTagged { .. } | StructVariant::Untagged => (
+            quote!(_serde::ser::SerializeStruct::serialize_field),
+            quote!(_serde::ser::SerializeStruct::skip_field),
+        ),
     };
 
-    let serialize_fields = serialize_struct_visitor(fields, params, true, method);
+    let serialize_fields = serialize_struct_visitor(fields, params, true, method, skip_method);
 
     let mut serialized_fields = fields
         .iter()
@@ -724,16 +745,14 @@ fn serialize_struct_variant<'a>(
     let let_mut = mut_if(serialized_fields.peek().is_some());
 
     let len = serialized_fields
-        .map(
-            |field| {
-                let ident = field.ident.clone().expect("struct has unnamed fields");
-
-                match field.attrs.skip_serializing_if() {
-                    Some(path) => quote!(if #path(#ident) { 0 } else { 1 }),
-                    None => quote!(1),
-                }
-            },
-        )
+        .map(|field| {
+            let ident = field.ident.clone().expect("struct has unnamed fields");
+
+            match field.attrs.skip_serializing_if() {
+                Some(path) => quote!(if #path(#ident) { 0 } else { 1 }),
+                None => quote!(1),
+            }
+        })
         .fold(quote!(0), |sum, expr| quote!(#sum + #expr));
 
     match context {
@@ -792,34 +811,32 @@ fn serialize_tuple_struct_visitor(
     fields
         .iter()
         .enumerate()
-        .map(
-            |(i, field)| {
-                let mut field_expr = if is_enum {
-                    let id = Ident::new(format!("__field{}", i));
-                    quote!(#id)
-                } else {
-                    get_field(params, field, i)
-                };
-
-                let skip = field
-                    .attrs
-                    .skip_serializing_if()
-                    .map(|path| quote!(#path(#field_expr)));
-
-                if let Some(path) = field.attrs.serialize_with() {
-                    field_expr = wrap_serialize_with(params, field.ty, path, field_expr);
-                }
-
-                let ser = quote! {
-                    try!(#func(&mut __serde_state, #field_expr));
-                };
-
-                match skip {
-                    None => ser,
-                    Some(skip) => quote!(if !#skip { #ser }),
-                }
-            },
-        )
+        .map(|(i, field)| {
+            let mut field_expr = if is_enum {
+                let id = Ident::new(format!("__field{}", i));
+                quote!(#id)
+            } else {
+                get_field(params, field, i)
+            };
+
+            let skip = field
+                .attrs
+                .skip_serializing_if()
+                .map(|path| quote!(#path(#field_expr)));
+
+            if let Some(path) = field.attrs.serialize_with() {
+                field_expr = wrap_serialize_field_with(params, field.ty, path, field_expr);
+            }
+
+            let ser = quote! {
+                try!(#func(&mut __serde_state, #field_expr));
+            };
+
+            match skip {
+                None => ser,
+                Some(skip) => quote!(if !#skip { #ser }),
+            }
+        })
         .collect()
 }
 
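The `len` computations in the hunks above fold per-field 0/1 contributions, where a field with a `skip_serializing_if` predicate contributes 0 when the predicate fires. The real code emits tokens that perform this count at runtime; a stand-alone, stdlib-only model of the counting (hypothetical simplification):

```rust
// Each field contributes 1 unless its skip_serializing_if predicate fires.
fn serialized_len<T>(fields: &[T], skip_if: &[Option<fn(&T) -> bool>]) -> usize {
    fields
        .iter()
        .zip(skip_if.iter())
        .map(|(field, skip)| match *skip {
            None => 1,
            Some(pred) => {
                if pred(field) {
                    0
                } else {
                    1
                }
            }
        })
        .fold(0, |sum, n| sum + n)
}

fn main() {
    let values = [Some(1), None, Some(3)];
    let skip: [Option<fn(&Option<i32>) -> bool>; 3] =
        [None, Some(|v| v.is_none()), Some(|v| v.is_none())];
    // The middle field is skipped because it is `None`.
    assert_eq!(serialized_len(&values, &skip), 2);
}
```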
@@ -828,58 +845,106 @@ fn serialize_struct_visitor(
     params: &Parameters,
     is_enum: bool,
     func: Tokens,
+    skip_func: Tokens,
 ) -> Vec<Tokens> {
     fields
         .iter()
         .filter(|&field| !field.attrs.skip_serializing())
-        .map(
-            |field| {
-                let field_ident = field.ident.clone().expect("struct has unnamed field");
-                let mut field_expr = if is_enum {
-                    quote!(#field_ident)
-                } else {
-                    get_field(params, field, field_ident)
-                };
-
-                let key_expr = field.attrs.name().serialize_name();
-
-                let skip = field
-                    .attrs
-                    .skip_serializing_if()
-                    .map(|path| quote!(#path(#field_expr)));
-
-                if let Some(path) = field.attrs.serialize_with() {
-                    field_expr = wrap_serialize_with(params, field.ty, path, field_expr)
-                }
-
-                let ser = quote! {
-                    try!(#func(&mut __serde_state, #key_expr, #field_expr));
-                };
-
-                match skip {
-                    None => ser,
-                    Some(skip) => quote!(if !#skip { #ser }),
-                }
-            },
-        )
+        .map(|field| {
+            let field_ident = field.ident.clone().expect("struct has unnamed field");
+            let mut field_expr = if is_enum {
+                quote!(#field_ident)
+            } else {
+                get_field(params, field, field_ident)
+            };
+
+            let key_expr = field.attrs.name().serialize_name();
+
+            let skip = field
+                .attrs
+                .skip_serializing_if()
+                .map(|path| quote!(#path(#field_expr)));
+
+            if let Some(path) = field.attrs.serialize_with() {
+                field_expr = wrap_serialize_field_with(params, field.ty, path, field_expr);
+            }
+
+            let ser = quote! {
+                try!(#func(&mut __serde_state, #key_expr, #field_expr));
+            };
+
+            match skip {
+                None => ser,
+                Some(skip) => {
+                    quote! {
+                        if !#skip {
+                            #ser
+                        } else {
+                            try!(#skip_func(&mut __serde_state, #key_expr));
+                        }
+                    }
+                }
+            }
+        })
        .collect()
 }
 
+fn wrap_serialize_field_with(
+    params: &Parameters,
+    field_ty: &syn::Ty,
+    serialize_with: &syn::Path,
+    field_expr: Tokens,
+) -> Tokens {
+    wrap_serialize_with(params, serialize_with, &[field_ty], &[quote!(#field_expr)])
+}
+
+fn wrap_serialize_variant_with(
+    params: &Parameters,
+    serialize_with: &syn::Path,
+    variant: &Variant,
+) -> Tokens {
+    let field_tys: Vec<_> = variant.fields.iter().map(|field| field.ty).collect();
+    let field_exprs: Vec<_> = variant
+        .fields
+        .iter()
+        .enumerate()
+        .map(|(i, field)| {
+            let id = field
+                .ident
+                .as_ref()
+                .map_or_else(|| Ident::new(format!("__field{}", i)), |id| id.clone());
+            quote!(#id)
+        })
+        .collect();
+    wrap_serialize_with(
+        params,
+        serialize_with,
+        field_tys.as_slice(),
+        field_exprs.as_slice(),
+    )
+}
+
 fn wrap_serialize_with(
     params: &Parameters,
-    field_ty: &syn::Ty,
     serialize_with: &syn::Path,
-    value: Tokens,
+    field_tys: &[&syn::Ty],
+    field_exprs: &[Tokens],
 ) -> Tokens {
     let this = &params.this;
     let (_, ty_generics, where_clause) = params.generics.split_for_impl();
 
-    let wrapper_generics = bound::with_lifetime_bound(&params.generics, "'__a");
+    let wrapper_generics = if field_exprs.len() == 0 {
+        params.generics.clone()
+    } else {
+        bound::with_lifetime_bound(&params.generics, "'__a")
+    };
     let (wrapper_impl_generics, wrapper_ty_generics, _) = wrapper_generics.split_for_impl();
 
+    let field_access = (0..field_exprs.len()).map(|n| Ident::new(format!("{}", n)));
+
     quote!({
         struct __SerializeWith #wrapper_impl_generics #where_clause {
-            value: &'__a #field_ty,
+            values: (#(&'__a #field_tys, )*),
             phantom: _serde::export::PhantomData<#this #ty_generics>,
         }
 
@@ -887,12 +952,12 @@ fn wrap_serialize_with(
             fn serialize<__S>(&self, __s: __S) -> _serde::export::Result<__S::Ok, __S::Error>
                 where __S: _serde::Serializer
             {
-                #serialize_with(self.value, __s)
+                #serialize_with(#(self.values.#field_access, )* __s)
             }
         }
 
         &__SerializeWith {
-            value: #value,
+            values: (#(#field_exprs, )*),
             phantom: _serde::export::PhantomData::<#this #ty_generics>,
         }
     })
@@ -905,7 +970,11 @@ fn wrap_serialize_with(
 //
 // where we want to omit the `mut` to avoid a warning.
 fn mut_if(is_mut: bool) -> Option<Tokens> {
-    if is_mut { Some(quote!(mut)) } else { None }
+    if is_mut {
+        Some(quote!(mut))
+    } else {
+        None
+    }
 }
 
 fn get_field<I>(params: &Parameters, field: &Field, ident: I) -> Tokens
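The generalized `wrap_serialize_with` above replaces the single `value: &'__a #field_ty` field with a tuple of references, so one user-supplied function can receive all of a variant's fields. A stdlib-only analogue of the generated `__SerializeWith` wrapper (the types and the `func` signature here are hypothetical, fixed for illustration):

```rust
// Captures the fields as a tuple of references and forwards them, in order,
// to a single user-supplied function, the way the generated wrapper forwards
// `self.values.0, self.values.1, ...` to `#serialize_with`.
struct SerializeWith<'a> {
    values: (&'a u32, &'a String),
    func: fn(&u32, &String) -> String,
}

impl<'a> SerializeWith<'a> {
    fn serialize(&self) -> String {
        (self.func)(self.values.0, self.values.1)
    }
}

fn main() {
    let n = 7u32;
    let s = "x".to_string();
    let w = SerializeWith {
        values: (&n, &s),
        func: |n, s| format!("{}-{}", n, s),
    };
    assert_eq!(w.serialize(), "7-x");
}
```

The `'__a` lifetime exists only to hold those borrows, which is why the real code skips it when there are no fields to capture.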
@@ -1,6 +1,6 @@
 [package]
 name = "serde_derive_internals"
-version = "0.15.0" # remember to update html_root_url
+version = "0.18.1" # remember to update html_root_url
 authors = ["Erick Tryzelaar <erick.tryzelaar@gmail.com>", "David Tolnay <dtolnay@gmail.com>"]
 license = "MIT/Apache-2.0"
 description = "AST representation used by Serde derive macros. Unstable."
@@ -49,27 +49,25 @@ impl<'a> Container<'a> {
|
|||||||
let attrs = attr::Container::from_ast(cx, item);
|
let attrs = attr::Container::from_ast(cx, item);
|
||||||
|
|
||||||
let mut body = match item.body {
|
let mut body = match item.body {
|
||||||
syn::Body::Enum(ref variants) => Body::Enum(enum_from_ast(cx, variants)),
|
syn::Body::Enum(ref variants) => {
|
||||||
|
Body::Enum(enum_from_ast(cx, variants, &attrs.default()))
|
||||||
|
}
|
||||||
syn::Body::Struct(ref variant_data) => {
|
syn::Body::Struct(ref variant_data) => {
|
||||||
let (style, fields) = struct_from_ast(cx, variant_data);
|
let (style, fields) = struct_from_ast(cx, variant_data, None, &attrs.default());
|
||||||
Body::Struct(style, fields)
|
Body::Struct(style, fields)
|
||||||
}
|
}
|
||||||
};
|
};
|
||||||
|
|
||||||
match body {
|
match body {
|
||||||
-            Body::Enum(ref mut variants) => {
-                for ref mut variant in variants {
-                    variant.attrs.rename_by_rule(attrs.rename_all());
-                    for ref mut field in &mut variant.fields {
-                        field.attrs.rename_by_rule(variant.attrs.rename_all());
-                    }
-                }
-            }
-            Body::Struct(_, ref mut fields) => {
-                for field in fields {
-                    field.attrs.rename_by_rule(attrs.rename_all());
-                }
-            }
+            Body::Enum(ref mut variants) => for ref mut variant in variants {
+                variant.attrs.rename_by_rule(attrs.rename_all());
+                for ref mut field in &mut variant.fields {
+                    field.attrs.rename_by_rule(variant.attrs.rename_all());
+                }
+            },
+            Body::Struct(_, ref mut fields) => for field in fields {
+                field.attrs.rename_by_rule(attrs.rename_all());
+            },
         }
     }

     let item = Container {
@@ -98,46 +96,63 @@ impl<'a> Body<'a> {
     }
 }

-fn enum_from_ast<'a>(cx: &Ctxt, variants: &'a [syn::Variant]) -> Vec<Variant<'a>> {
+fn enum_from_ast<'a>(
+    cx: &Ctxt,
+    variants: &'a [syn::Variant],
+    container_default: &attr::Default,
+) -> Vec<Variant<'a>> {
     variants
         .iter()
-        .map(
-            |variant| {
-                let (style, fields) = struct_from_ast(cx, &variant.data);
-                Variant {
-                    ident: variant.ident.clone(),
-                    attrs: attr::Variant::from_ast(cx, variant),
-                    style: style,
-                    fields: fields,
-                }
-            },
-        )
+        .map(|variant| {
+            let attrs = attr::Variant::from_ast(cx, variant);
+            let (style, fields) =
+                struct_from_ast(cx, &variant.data, Some(&attrs), container_default);
+            Variant {
+                ident: variant.ident.clone(),
+                attrs: attrs,
+                style: style,
+                fields: fields,
+            }
+        })
         .collect()
 }

-fn struct_from_ast<'a>(cx: &Ctxt, data: &'a syn::VariantData) -> (Style, Vec<Field<'a>>) {
+fn struct_from_ast<'a>(
+    cx: &Ctxt,
+    data: &'a syn::VariantData,
+    attrs: Option<&attr::Variant>,
+    container_default: &attr::Default,
+) -> (Style, Vec<Field<'a>>) {
     match *data {
-        syn::VariantData::Struct(ref fields) => (Style::Struct, fields_from_ast(cx, fields)),
-        syn::VariantData::Tuple(ref fields) if fields.len() == 1 => {
-            (Style::Newtype, fields_from_ast(cx, fields))
-        }
-        syn::VariantData::Tuple(ref fields) => (Style::Tuple, fields_from_ast(cx, fields)),
+        syn::VariantData::Struct(ref fields) => (
+            Style::Struct,
+            fields_from_ast(cx, fields, attrs, container_default),
+        ),
+        syn::VariantData::Tuple(ref fields) if fields.len() == 1 => (
+            Style::Newtype,
+            fields_from_ast(cx, fields, attrs, container_default),
+        ),
+        syn::VariantData::Tuple(ref fields) => (
+            Style::Tuple,
+            fields_from_ast(cx, fields, attrs, container_default),
+        ),
         syn::VariantData::Unit => (Style::Unit, Vec::new()),
     }
 }

-fn fields_from_ast<'a>(cx: &Ctxt, fields: &'a [syn::Field]) -> Vec<Field<'a>> {
+fn fields_from_ast<'a>(
+    cx: &Ctxt,
+    fields: &'a [syn::Field],
+    attrs: Option<&attr::Variant>,
+    container_default: &attr::Default,
+) -> Vec<Field<'a>> {
     fields
         .iter()
         .enumerate()
-        .map(
-            |(i, field)| {
-                Field {
-                    ident: field.ident.clone(),
-                    attrs: attr::Field::from_ast(cx, i, field),
-                    ty: &field.ty,
-                }
-            },
-        )
+        .map(|(i, field)| Field {
+            ident: field.ident.clone(),
+            attrs: attr::Field::from_ast(cx, i, field, attrs, container_default),
+            ty: &field.ty,
+        })
         .collect()
 }
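The change above threads attributes downward: the container's `rename_all` rule renames variants, and each variant's own rule renames its fields. The renaming step itself can be sketched in isolation. This is a hypothetical stand-in, not serde's `attr::RenameRule`: `Rule` and `apply` are invented here, and the CamelCase-to-snake conversion is deliberately naive.

```rust
// Hypothetical stand-in for attr::RenameRule showing how one rule rewrites
// one identifier; `Rule` and `apply` are invented for illustration.
enum Rule {
    None,
    SnakeCase,
    KebabCase,
}

fn apply(rule: Rule, name: &str) -> String {
    match rule {
        Rule::None => name.to_string(),
        // naive CamelCase -> snake_case conversion
        Rule::SnakeCase => {
            let mut out = String::new();
            for (i, ch) in name.chars().enumerate() {
                if ch.is_uppercase() {
                    if i > 0 {
                        out.push('_');
                    }
                    out.extend(ch.to_lowercase());
                } else {
                    out.push(ch);
                }
            }
            out
        }
        // kebab-case is snake_case with '-' separators, as in case.rs
        Rule::KebabCase => apply(Rule::SnakeCase, name).replace('_', "-"),
    }
}
```

The cascade in the diff amounts to calling `apply` once per variant with the container's rule, then once per field with the variant's rule.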
+184 -112
@@ -164,6 +164,15 @@ pub enum Identifier {
     Variant,
 }

+impl Identifier {
+    pub fn is_some(self) -> bool {
+        match self {
+            Identifier::No => false,
+            Identifier::Field | Identifier::Variant => true,
+        }
+    }
+}
+
 impl Container {
     /// Extract out the `#[serde(...)]` attributes from an item.
     pub fn from_ast(cx: &Ctxt, item: &syn::DeriveInput) -> Self {
@@ -207,11 +216,11 @@ impl Container {
                 if let Ok(s) = get_string_from_lit(cx, name.as_ref(), name.as_ref(), lit) {
                     match RenameRule::from_str(&s) {
                         Ok(rename_rule) => rename_all.set(rename_rule),
-                        Err(()) => {
-                            cx.error(format!("unknown rename rule for #[serde(rename_all \
-                                              = {:?})]",
-                                             s))
-                        }
+                        Err(()) => cx.error(format!(
+                            "unknown rename rule for #[serde(rename_all \
+                             = {:?})]",
+                            s
+                        )),
                     }
                 }
             }
@@ -222,19 +231,15 @@ impl Container {
                 }

                 // Parse `#[serde(default)]`
-                MetaItem(Word(ref name)) if name == "default" => {
-                    match item.body {
-                        syn::Body::Struct(syn::VariantData::Struct(_)) => {
-                            default.set(Default::Default);
-                        }
-                        _ => {
-                            cx.error(
-                                "#[serde(default)] can only be used on structs \
-                                 with named fields",
-                            )
-                        }
-                    }
-                }
+                MetaItem(Word(ref name)) if name == "default" => match item.body {
+                    syn::Body::Struct(syn::VariantData::Struct(_)) => {
+                        default.set(Default::Default);
+                    }
+                    _ => cx.error(
+                        "#[serde(default)] can only be used on structs \
+                         with named fields",
+                    ),
+                },

                 // Parse `#[serde(default = "...")]`
                 MetaItem(NameValue(ref name, ref lit)) if name == "default" => {
@@ -243,12 +248,10 @@ impl Container {
                         syn::Body::Struct(syn::VariantData::Struct(_)) => {
                             default.set(Default::Path(path));
                         }
-                        _ => {
-                            cx.error(
-                                "#[serde(default = \"...\")] can only be used \
-                                 on structs with named fields",
-                            )
-                        }
+                        _ => cx.error(
+                            "#[serde(default = \"...\")] can only be used \
+                             on structs with named fields",
+                        ),
                     }
                 }
             }
@@ -256,7 +259,8 @@ impl Container {
                 // Parse `#[serde(bound = "D: Serialize")]`
                 MetaItem(NameValue(ref name, ref lit)) if name == "bound" => {
                     if let Ok(where_predicates) =
-                        parse_lit_into_where(cx, name.as_ref(), name.as_ref(), lit) {
+                        parse_lit_into_where(cx, name.as_ref(), name.as_ref(), lit)
+                    {
                         ser_bound.set(where_predicates.clone());
                         de_bound.set(where_predicates);
                     }
@@ -271,16 +275,14 @@ impl Container {
                 }

                 // Parse `#[serde(untagged)]`
-                MetaItem(Word(ref name)) if name == "untagged" => {
-                    match item.body {
-                        syn::Body::Enum(_) => {
-                            untagged.set_true();
-                        }
-                        syn::Body::Struct(_) => {
-                            cx.error("#[serde(untagged)] can only be used on enums")
-                        }
-                    }
-                }
+                MetaItem(Word(ref name)) if name == "untagged" => match item.body {
+                    syn::Body::Enum(_) => {
+                        untagged.set_true();
+                    }
+                    syn::Body::Struct(_) => {
+                        cx.error("#[serde(untagged)] can only be used on enums")
+                    }
+                },

                 // Parse `#[serde(tag = "type")]`
                 MetaItem(NameValue(ref name, ref lit)) if name == "tag" => {
@@ -303,12 +305,10 @@ impl Container {
                         syn::Body::Enum(_) => {
                             content.set(s);
                         }
-                        syn::Body::Struct(_) => {
-                            cx.error(
-                                "#[serde(content = \"...\")] can only be used on \
-                                 enums",
-                            )
-                        }
+                        syn::Body::Struct(_) => cx.error(
+                            "#[serde(content = \"...\")] can only be used on \
+                             enums",
+                        ),
                     }
                 }
             }
@@ -345,8 +345,10 @@ impl Container {
                 }

                 MetaItem(ref meta_item) => {
-                    cx.error(format!("unknown serde container attribute `{}`",
-                                     meta_item.name()));
+                    cx.error(format!(
+                        "unknown serde container attribute `{}`",
+                        meta_item.name()
+                    ));
                 }

                 Literal(_) => {
@@ -434,13 +436,12 @@ fn decide_tag(
     if let syn::Body::Enum(ref variants) = item.body {
         for variant in variants {
             match variant.data {
-                syn::VariantData::Struct(_) |
-                syn::VariantData::Unit => {}
+                syn::VariantData::Struct(_) | syn::VariantData::Unit => {}
                 syn::VariantData::Tuple(ref fields) => {
                     if fields.len() != 1 {
                         cx.error(
                             "#[serde(tag = \"...\")] cannot be used with tuple \
                              variants",
                         );
                         break;
                     }
@@ -455,21 +456,19 @@ fn decide_tag(
             EnumTag::External // doesn't matter, will error
         }
         (false, None, Some(_)) => {
-            cx.error("#[serde(tag = \"...\", content = \"...\")] must be used together",);
+            cx.error("#[serde(tag = \"...\", content = \"...\")] must be used together");
             EnumTag::External
         }
         (true, None, Some(_)) => {
             cx.error("untagged enum cannot have #[serde(content = \"...\")]");
             EnumTag::External
         }
-        (false, Some(tag), Some(content)) => {
-            EnumTag::Adjacent {
-                tag: tag,
-                content: content,
-            }
-        }
+        (false, Some(tag), Some(content)) => EnumTag::Adjacent {
+            tag: tag,
+            content: content,
+        },
         (true, Some(_), Some(_)) => {
-            cx.error("untagged enum cannot have #[serde(tag = \"...\", content = \"...\")]",);
+            cx.error("untagged enum cannot have #[serde(tag = \"...\", content = \"...\")]");
             EnumTag::External
         }
     }
@@ -484,7 +483,7 @@ fn decide_identifier(
     match (&item.body, field_identifier.get(), variant_identifier.get()) {
         (_, false, false) => Identifier::No,
         (_, true, true) => {
-            cx.error("`field_identifier` and `variant_identifier` cannot both be set",);
+            cx.error("`field_identifier` and `variant_identifier` cannot both be set");
             Identifier::No
         }
         (&syn::Body::Struct(_), true, false) => {
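The tag-resolution table in `decide_tag` above can be reduced to a pure function over the three inputs (untagged flag, tag, content). A sketch under simplifying assumptions: the enum mirrors serde_derive's `EnumTag`, but error reporting through `cx.error` is replaced by a silent fall-through to `External`, matching the "doesn't matter, will error" comment.

```rust
// Simplified mirror of serde_derive's EnumTag; `decide_tag` here drops the
// cx.error reporting and keeps only the representation decision.
#[derive(Debug, PartialEq)]
enum EnumTag {
    External,
    Internal { tag: String },
    Adjacent { tag: String, content: String },
    None, // untagged
}

fn decide_tag(untagged: bool, tag: Option<String>, content: Option<String>) -> EnumTag {
    match (untagged, tag, content) {
        (false, None, None) => EnumTag::External,
        (true, None, None) => EnumTag::None,
        (false, Some(tag), None) => EnumTag::Internal { tag: tag },
        (false, Some(tag), Some(content)) => EnumTag::Adjacent { tag, content },
        // All conflicting combinations error in the real code and fall back
        // to the external representation.
        _ => EnumTag::External,
    }
}
```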
@@ -510,6 +509,9 @@ pub struct Variant {
     skip_deserializing: bool,
     skip_serializing: bool,
     other: bool,
+    serialize_with: Option<syn::Path>,
+    deserialize_with: Option<syn::Path>,
+    borrow: Option<syn::MetaItem>,
 }

 impl Variant {
@@ -520,6 +522,9 @@ impl Variant {
         let mut skip_serializing = BoolAttr::none(cx, "skip_serializing");
         let mut rename_all = Attr::none(cx, "rename_all");
         let mut other = BoolAttr::none(cx, "other");
+        let mut serialize_with = Attr::none(cx, "serialize_with");
+        let mut deserialize_with = Attr::none(cx, "deserialize_with");
+        let mut borrow = Attr::none(cx, "borrow");

         for meta_items in variant.attrs.iter().filter_map(get_serde_meta_items) {
             for meta_item in meta_items {
@@ -545,11 +550,11 @@ impl Variant {
                 if let Ok(s) = get_string_from_lit(cx, name.as_ref(), name.as_ref(), lit) {
                     match RenameRule::from_str(&s) {
                         Ok(rename_rule) => rename_all.set(rename_rule),
-                        Err(()) => {
-                            cx.error(format!("unknown rename rule for #[serde(rename_all \
-                                              = {:?})]",
-                                             s))
-                        }
+                        Err(()) => cx.error(format!(
+                            "unknown rename rule for #[serde(rename_all \
+                             = {:?})]",
+                            s
+                        )),
                     }
                 }
             }
@@ -569,8 +574,47 @@ impl Variant {
                     other.set_true();
                 }

+                // Parse `#[serde(with = "...")]`
+                MetaItem(NameValue(ref name, ref lit)) if name == "with" => {
+                    if let Ok(path) = parse_lit_into_path(cx, name.as_ref(), lit) {
+                        let mut ser_path = path.clone();
+                        ser_path.segments.push("serialize".into());
+                        serialize_with.set(ser_path);
+                        let mut de_path = path;
+                        de_path.segments.push("deserialize".into());
+                        deserialize_with.set(de_path);
+                    }
+                }
+
+                // Parse `#[serde(serialize_with = "...")]`
+                MetaItem(NameValue(ref name, ref lit)) if name == "serialize_with" => {
+                    if let Ok(path) = parse_lit_into_path(cx, name.as_ref(), lit) {
+                        serialize_with.set(path);
+                    }
+                }
+
+                // Parse `#[serde(deserialize_with = "...")]`
+                MetaItem(NameValue(ref name, ref lit)) if name == "deserialize_with" => {
+                    if let Ok(path) = parse_lit_into_path(cx, name.as_ref(), lit) {
+                        deserialize_with.set(path);
+                    }
+                }
+
+                // Defer `#[serde(borrow)]` and `#[serde(borrow = "'a + 'b")]`
+                MetaItem(ref mi) if mi.name() == "borrow" => match variant.data {
+                    syn::VariantData::Tuple(ref fields) if fields.len() == 1 => {
+                        borrow.set(mi.clone());
+                    }
+                    _ => {
+                        cx.error("#[serde(borrow)] may only be used on newtype variants");
+                    }
+                },
+
                 MetaItem(ref meta_item) => {
-                    cx.error(format!("unknown serde variant attribute `{}`", meta_item.name()));
+                    cx.error(format!(
+                        "unknown serde variant attribute `{}`",
+                        meta_item.name()
+                    ));
                 }

                 Literal(_) => {
@@ -595,6 +639,9 @@ impl Variant {
             skip_deserializing: skip_deserializing.get(),
             skip_serializing: skip_serializing.get(),
             other: other.get(),
+            serialize_with: serialize_with.get(),
+            deserialize_with: deserialize_with.get(),
+            borrow: borrow.get(),
         }
     }

@@ -626,6 +673,14 @@ impl Variant {
     pub fn other(&self) -> bool {
         self.other
     }
+
+    pub fn serialize_with(&self) -> Option<&syn::Path> {
+        self.serialize_with.as_ref()
+    }
+
+    pub fn deserialize_with(&self) -> Option<&syn::Path> {
+        self.deserialize_with.as_ref()
+    }
 }

 /// Represents field attribute information
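The `#[serde(with = "...")]` handling added above expands one module path into two function paths by appending `serialize` and `deserialize` segments (the real code pushes segments onto a `syn::Path`). A string-based sketch of just that expansion:

```rust
// String-based sketch of the with = "..." expansion; the real code mutates
// syn::Path::segments instead of formatting strings.
fn expand_with(path: &str) -> (String, String) {
    (
        format!("{}::serialize", path),   // used as serialize_with
        format!("{}::deserialize", path), // used as deserialize_with
    )
}
```

So `#[serde(with = "my_mod")]` behaves like `serialize_with = "my_mod::serialize"` plus `deserialize_with = "my_mod::deserialize"`.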
@@ -659,7 +714,13 @@ pub enum Default {

 impl Field {
     /// Extract out the `#[serde(...)]` attributes from a struct field.
-    pub fn from_ast(cx: &Ctxt, index: usize, field: &syn::Field) -> Self {
+    pub fn from_ast(
+        cx: &Ctxt,
+        index: usize,
+        field: &syn::Field,
+        attrs: Option<&Variant>,
+        container_default: &Default,
+    ) -> Self {
         let mut ser_name = Attr::none(cx, "rename");
         let mut de_name = Attr::none(cx, "rename");
         let mut skip_serializing = BoolAttr::none(cx, "skip_serializing");
@@ -678,7 +739,18 @@ impl Field {
             None => index.to_string(),
         };

-        for meta_items in field.attrs.iter().filter_map(get_serde_meta_items) {
+        let variant_borrow = attrs
+            .map(|variant| &variant.borrow)
+            .unwrap_or(&None)
+            .as_ref()
+            .map(|borrow| vec![MetaItem(borrow.clone())]);
+
+        for meta_items in field
+            .attrs
+            .iter()
+            .filter_map(get_serde_meta_items)
+            .chain(variant_borrow)
+        {
             for meta_item in meta_items {
                 match meta_item {
                     // Parse `#[serde(rename = "foo")]`
@@ -719,6 +791,12 @@ impl Field {
                         skip_deserializing.set_true();
                     }

+                    // Parse `#[serde(skip)]`
+                    MetaItem(Word(ref name)) if name == "skip" => {
+                        skip_serializing.set_true();
+                        skip_deserializing.set_true();
+                    }
+
                     // Parse `#[serde(skip_serializing_if = "...")]`
                     MetaItem(NameValue(ref name, ref lit)) if name == "skip_serializing_if" => {
                         if let Ok(path) = parse_lit_into_path(cx, name.as_ref(), lit) {
@@ -755,7 +833,8 @@ impl Field {
                     // Parse `#[serde(bound = "D: Serialize")]`
                     MetaItem(NameValue(ref name, ref lit)) if name == "bound" => {
                         if let Ok(where_predicates) =
-                            parse_lit_into_where(cx, name.as_ref(), name.as_ref(), lit) {
+                            parse_lit_into_where(cx, name.as_ref(), name.as_ref(), lit)
+                        {
                             ser_bound.set(where_predicates.clone());
                             de_bound.set(where_predicates);
                         }
@@ -782,13 +861,10 @@ impl Field {
                         if let Ok(borrowable) = borrowable_lifetimes(cx, &ident, &field.ty) {
                             for lifetime in &lifetimes {
                                 if !borrowable.contains(lifetime) {
-                                    cx.error(
-                                        format!(
-                                            "field `{}` does not have lifetime {}",
-                                            ident,
-                                            lifetime.ident
-                                        ),
-                                    );
+                                    cx.error(format!(
+                                        "field `{}` does not have lifetime {}",
+                                        ident, lifetime.ident
+                                    ));
                                 }
                             }
                             borrowed_lifetimes.set(lifetimes);
@@ -804,7 +880,10 @@ impl Field {
                     }

                     MetaItem(ref meta_item) => {
-                        cx.error(format!("unknown serde field attribute `{}`", meta_item.name()),);
+                        cx.error(format!(
+                            "unknown serde field attribute `{}`",
+                            meta_item.name()
+                        ));
                     }

                     Literal(_) => {
@@ -814,9 +893,10 @@ impl Field {
             }
         }

-        // Is skip_deserializing, initialize the field to Default::default()
-        // unless a different default is specified by `#[serde(default = "...")]`
-        if skip_deserializing.0.value.is_some() {
+        // Is skip_deserializing, initialize the field to Default::default() unless a different
+        // default is specified by `#[serde(default = "...")]` on ourselves or our container (e.g.
+        // the struct we are in).
+        if container_default == &Default::None && skip_deserializing.0.value.is_some() {
             default.set_if_none(Default::Default);
         }

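The `container_default` check added above changes when a skipped field falls back to `Default::default()`: only when neither the field nor its container supplied a default. A sketch of that decision as a pure function; `FieldDefault` and `resolve_default` are invented names standing in for `attr::Default` and the `set_if_none` logic.

```rust
// Invented mirror of attr::Default and the skip_deserializing fallback rule.
#[derive(Debug, PartialEq, Clone)]
enum FieldDefault {
    None,         // no default specified
    Default,      // fall back to std::default::Default
    Path(String), // #[serde(default = "...")]
}

fn resolve_default(
    field_default: FieldDefault,
    container_default: &FieldDefault,
    skip_deserializing: bool,
) -> FieldDefault {
    // set_if_none: only fill in Default when nothing else was specified
    // anywhere, matching the `container_default == &Default::None` guard.
    if *container_default == FieldDefault::None
        && skip_deserializing
        && field_default == FieldDefault::None
    {
        FieldDefault::Default
    } else {
        field_default
    }
}
```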
@@ -951,13 +1031,11 @@ where
         }

         _ => {
-            cx.error(
-                format!(
-                    "malformed {0} attribute, expected `{0}(serialize = ..., \
-                     deserialize = ...)`",
-                    attr_name
-                ),
-            );
+            cx.error(format!(
+                "malformed {0} attribute, expected `{0}(serialize = ..., \
+                 deserialize = ...)`",
+                attr_name
+            ));
             return Err(());
         }
     }
@@ -993,13 +1071,10 @@ fn get_string_from_lit(
     if let syn::Lit::Str(ref s, _) = *lit {
         Ok(s.clone())
     } else {
-        cx.error(
-            format!(
-                "expected serde {} attribute to be a string: `{} = \"...\"`",
-                attr_name,
-                meta_item_name
-            ),
-        );
+        cx.error(format!(
+            "expected serde {} attribute to be a string: `{} = \"...\"`",
+            attr_name, meta_item_name
+        ));
         Err(())
     }
 }
@@ -1030,11 +1105,12 @@ fn parse_lit_into_where(
 fn parse_lit_into_ty(cx: &Ctxt, attr_name: &str, lit: &syn::Lit) -> Result<syn::Ty, ()> {
     let string = try!(get_string_from_lit(cx, attr_name, attr_name, lit));

-    syn::parse_type(&string).map_err(
-        |_| {
-            cx.error(format!("failed to parse type: {} = {:?}", attr_name, string),)
-        },
-    )
+    syn::parse_type(&string).map_err(|_| {
+        cx.error(format!(
+            "failed to parse type: {} = {:?}",
+            attr_name, string
+        ))
+    })
 }

 // Parses a string literal like "'a + 'b + 'c" containing a nonempty list of
@@ -1065,7 +1141,7 @@ fn parse_lit_into_lifetimes(
             return Ok(set);
         }
     }
-    Err(cx.error(format!("failed to parse borrowed lifetimes: {:?}", string)),)
+    Err(cx.error(format!("failed to parse borrowed lifetimes: {:?}", string)))
 }

 // Whether the type looks like it might be `std::borrow::Cow<T>` where elem="T".
@@ -1109,8 +1185,8 @@ fn is_cow(ty: &syn::Ty, elem: &str) -> bool {
             return false;
         }
     };
-    seg.ident == "Cow" && params.lifetimes.len() == 1 &&
-    params.types == vec![syn::parse_type(elem).unwrap()] && params.bindings.is_empty()
+    seg.ident == "Cow" && params.lifetimes.len() == 1
+        && params.types == vec![syn::parse_type(elem).unwrap()] && params.bindings.is_empty()
 }

 // Whether the type looks like it might be `&T` where elem="T". This can have
@@ -1136,8 +1212,8 @@ fn is_cow(ty: &syn::Ty, elem: &str) -> bool {
 fn is_rptr(ty: &syn::Ty, elem: &str) -> bool {
     match *ty {
         syn::Ty::Rptr(Some(_), ref mut_ty) => {
-            mut_ty.mutability == syn::Mutability::Immutable &&
-            mut_ty.ty == syn::parse_type(elem).unwrap()
+            mut_ty.mutability == syn::Mutability::Immutable
+                && mut_ty.ty == syn::parse_type(elem).unwrap()
         }
         _ => false,
     }
@@ -1158,7 +1234,7 @@ fn borrowable_lifetimes(
     let mut lifetimes = BTreeSet::new();
     collect_lifetimes(ty, &mut lifetimes);
     if lifetimes.is_empty() {
-        Err(cx.error(format!("field `{}` has no lifetimes to borrow", name)),)
+        Err(cx.error(format!("field `{}` has no lifetimes to borrow", name)))
     } else {
         Ok(lifetimes)
     }
@@ -1166,9 +1242,7 @@ fn borrowable_lifetimes(

 fn collect_lifetimes(ty: &syn::Ty, out: &mut BTreeSet<syn::Lifetime>) {
     match *ty {
-        syn::Ty::Slice(ref elem) |
-        syn::Ty::Array(ref elem, _) |
-        syn::Ty::Paren(ref elem) => {
+        syn::Ty::Slice(ref elem) | syn::Ty::Array(ref elem, _) | syn::Ty::Paren(ref elem) => {
             collect_lifetimes(elem, out);
         }
         syn::Ty::Ptr(ref elem) => {
@@ -1178,11 +1252,9 @@ fn collect_lifetimes(ty: &syn::Ty, out: &mut BTreeSet<syn::Lifetime>) {
             out.extend(lifetime.iter().cloned());
             collect_lifetimes(&elem.ty, out);
         }
-        syn::Ty::Tup(ref elems) => {
-            for elem in elems {
-                collect_lifetimes(elem, out);
-            }
-        }
+        syn::Ty::Tup(ref elems) => for elem in elems {
+            collect_lifetimes(elem, out);
+        },
         syn::Ty::Path(ref qself, ref path) => {
             if let Some(ref qself) = *qself {
                 collect_lifetimes(&qself.ty, out);
@@ -1199,11 +1271,11 @@ fn collect_lifetimes(ty: &syn::Ty, out: &mut BTreeSet<syn::Lifetime>) {
             }
         }
-        syn::Ty::BareFn(_) |
-        syn::Ty::Never |
-        syn::Ty::TraitObject(_) |
-        syn::Ty::ImplTrait(_) |
-        syn::Ty::Infer |
-        syn::Ty::Mac(_) => {}
+        syn::Ty::BareFn(_)
+        | syn::Ty::Never
+        | syn::Ty::TraitObject(_)
+        | syn::Ty::ImplTrait(_)
+        | syn::Ty::Infer
+        | syn::Ty::Mac(_) => {}
     }
 }
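`collect_lifetimes` above recursively walks a `syn::Ty` and gathers every lifetime it mentions, which `borrowable_lifetimes` then checks against `#[serde(borrow)]`. The traversal pattern can be shown on a toy type tree; `Ty` here is a drastically reduced stand-in for `syn::Ty`, not the real AST.

```rust
use std::collections::BTreeSet;

// Toy stand-in for syn::Ty: just enough variants to show the recursion.
enum Ty {
    Slice(Box<Ty>),
    Rptr(Option<String>, Box<Ty>), // reference with an optional lifetime
    Tup(Vec<Ty>),
    Path(String), // leaf: no lifetimes of its own here
}

// Gather every lifetime reachable in the type, as collect_lifetimes does.
fn collect_lifetimes(ty: &Ty, out: &mut BTreeSet<String>) {
    match *ty {
        Ty::Slice(ref elem) => collect_lifetimes(elem, out),
        Ty::Rptr(ref lifetime, ref elem) => {
            out.extend(lifetime.iter().cloned());
            collect_lifetimes(elem, out);
        }
        Ty::Tup(ref elems) => {
            for elem in elems {
                collect_lifetimes(elem, out);
            }
        }
        Ty::Path(_) => {}
    }
}
```

For a type shaped like `&'a (&'b str, u8)`, the walk collects both `'a` and `'b`.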
|
|||||||
@@ -6,7 +6,10 @@
|
|||||||
// option. This file may not be copied, modified, or distributed
|
// option. This file may not be copied, modified, or distributed
|
||||||
// except according to those terms.
|
// except according to those terms.
|
||||||
|
|
||||||
|
// See https://users.rust-lang.org/t/psa-dealing-with-warning-unused-import-std-ascii-asciiext-in-today-s-nightly/13726
|
||||||
|
#[allow(unused_imports)]
|
||||||
use std::ascii::AsciiExt;
|
use std::ascii::AsciiExt;
|
||||||
|
|
||||||
use std::str::FromStr;
|
use std::str::FromStr;
|
||||||
|
|
||||||
use self::RenameRule::*;
|
use self::RenameRule::*;
|
||||||
@@ -27,6 +30,8 @@ pub enum RenameRule {
|
|||||||
ScreamingSnakeCase,
|
ScreamingSnakeCase,
|
||||||
/// Rename direct children to "kebab-case" style.
|
/// Rename direct children to "kebab-case" style.
|
||||||
KebabCase,
|
KebabCase,
|
||||||
|
/// Rename direct children to "SCREAMING-KEBAB-CASE" style.
|
||||||
|
ScreamingKebabCase,
|
||||||
}
|
}
|
||||||
|
|
||||||
impl RenameRule {
|
impl RenameRule {
|
||||||
@@ -47,6 +52,9 @@ impl RenameRule {
|
|||||||
}
|
}
|
||||||
ScreamingSnakeCase => SnakeCase.apply_to_variant(variant).to_ascii_uppercase(),
|
ScreamingSnakeCase => SnakeCase.apply_to_variant(variant).to_ascii_uppercase(),
|
||||||
KebabCase => SnakeCase.apply_to_variant(variant).replace('_', "-"),
|
KebabCase => SnakeCase.apply_to_variant(variant).replace('_', "-"),
|
||||||
|
ScreamingKebabCase => ScreamingSnakeCase
|
||||||
|
.apply_to_variant(variant)
|
||||||
|
.replace('_', "-"),
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -74,6 +82,7 @@ impl RenameRule {
|
|||||||
}
|
}
|
||||||
ScreamingSnakeCase => field.to_ascii_uppercase(),
|
ScreamingSnakeCase => field.to_ascii_uppercase(),
|
||||||
KebabCase => field.replace('_', "-"),
|
KebabCase => field.replace('_', "-"),
|
||||||
|
ScreamingKebabCase => ScreamingSnakeCase.apply_to_field(field).replace('_', "-"),
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
@@ -89,6 +98,7 @@ impl FromStr for RenameRule {
|
|||||||
"snake_case" => Ok(SnakeCase),
|
"snake_case" => Ok(SnakeCase),
|
||||||
"SCREAMING_SNAKE_CASE" => Ok(ScreamingSnakeCase),
|
"SCREAMING_SNAKE_CASE" => Ok(ScreamingSnakeCase),
|
||||||
"kebab-case" => Ok(KebabCase),
|
"kebab-case" => Ok(KebabCase),
|
||||||
|
"SCREAMING-KEBAB-CASE" => Ok(ScreamingKebabCase),
|
||||||
_ => Err(()),
|
_ => Err(()),
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
@@ -96,13 +106,28 @@ impl FromStr for RenameRule {
|
|||||||
|
|
||||||
#[test]
|
#[test]
|
||||||
fn rename_variants() {
|
fn rename_variants() {
|
||||||
for &(original, lower, camel, snake, screaming, kebab) in
|
for &(original, lower, camel, snake, screaming, kebab, screaming_kebab) in &[
|
||||||
&[
|
(
|
||||||
("Outcome", "outcome", "outcome", "outcome", "OUTCOME", "outcome"),
|
"Outcome",
|
||||||
("VeryTasty", "verytasty", "veryTasty", "very_tasty", "VERY_TASTY", "very-tasty"),
|
"outcome",
|
||||||
("A", "a", "a", "a", "A", "a"),
|
"outcome",
|
||||||
("Z42", "z42", "z42", "z42", "Z42", "z42"),
|
"outcome",
|
||||||
] {
|
"OUTCOME",
|
||||||
|
"outcome",
|
||||||
|
"OUTCOME",
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"VeryTasty",
|
||||||
|
"verytasty",
|
||||||
|
"veryTasty",
|
||||||
|
"very_tasty",
|
||||||
|
"VERY_TASTY",
|
||||||
|
"very-tasty",
|
||||||
|
"VERY-TASTY",
|
||||||
|
),
|
||||||
|
("A", "a", "a", "a", "A", "a", "A"),
|
||||||
|
("Z42", "z42", "z42", "z42", "Z42", "z42", "Z42"),
|
||||||
|
] {
|
||||||
assert_eq!(None.apply_to_variant(original), original);
|
assert_eq!(None.apply_to_variant(original), original);
|
||||||
assert_eq!(LowerCase.apply_to_variant(original), lower);
|
assert_eq!(LowerCase.apply_to_variant(original), lower);
|
||||||
assert_eq!(PascalCase.apply_to_variant(original), original);
|
assert_eq!(PascalCase.apply_to_variant(original), original);
|
||||||
@@ -110,23 +135,41 @@ fn rename_variants() {
         assert_eq!(SnakeCase.apply_to_variant(original), snake);
         assert_eq!(ScreamingSnakeCase.apply_to_variant(original), screaming);
         assert_eq!(KebabCase.apply_to_variant(original), kebab);
+        assert_eq!(
+            ScreamingKebabCase.apply_to_variant(original),
+            screaming_kebab
+        );
     }
 }
 
 #[test]
 fn rename_fields() {
-    for &(original, pascal, camel, screaming, kebab) in
-        &[
-            ("outcome", "Outcome", "outcome", "OUTCOME", "outcome"),
-            ("very_tasty", "VeryTasty", "veryTasty", "VERY_TASTY", "very-tasty"),
-            ("a", "A", "a", "A", "a"),
-            ("z42", "Z42", "z42", "Z42", "z42"),
-        ] {
+    for &(original, pascal, camel, screaming, kebab, screaming_kebab) in &[
+        (
+            "outcome",
+            "Outcome",
+            "outcome",
+            "OUTCOME",
+            "outcome",
+            "OUTCOME",
+        ),
+        (
+            "very_tasty",
+            "VeryTasty",
+            "veryTasty",
+            "VERY_TASTY",
+            "very-tasty",
+            "VERY-TASTY",
+        ),
+        ("a", "A", "a", "A", "a", "A"),
+        ("z42", "Z42", "z42", "Z42", "z42", "Z42"),
+    ] {
         assert_eq!(None.apply_to_field(original), original);
         assert_eq!(PascalCase.apply_to_field(original), pascal);
         assert_eq!(CamelCase.apply_to_field(original), camel);
         assert_eq!(SnakeCase.apply_to_field(original), original);
         assert_eq!(ScreamingSnakeCase.apply_to_field(original), screaming);
         assert_eq!(KebabCase.apply_to_field(original), kebab);
+        assert_eq!(ScreamingKebabCase.apply_to_field(original), screaming_kebab);
     }
 }
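The test above adds a SCREAMING-KEBAB-CASE column to the rename-rule table. As a rough, self-contained sketch of what these conversions do — the helper functions below are invented for illustration and are not the actual `RenameRule` API in serde_derive_internals:

```rust
// Hypothetical stand-ins mirroring the case conversions exercised by the
// rename tests. Input is assumed to be a snake_case field name.

/// "very_tasty" -> "very-tasty"
fn to_kebab(field: &str) -> String {
    field.replace('_', "-")
}

/// "very_tasty" -> "VERY-TASTY"
fn to_screaming_kebab(field: &str) -> String {
    to_kebab(field).to_uppercase()
}

/// "very_tasty" -> "VeryTasty"
fn to_pascal(field: &str) -> String {
    field
        .split('_')
        .map(|word| {
            let mut chars = word.chars();
            match chars.next() {
                // Uppercase the first character, keep the rest as-is.
                Some(first) => first.to_uppercase().collect::<String>() + chars.as_str(),
                None => String::new(),
            }
        })
        .collect()
}

fn main() {
    assert_eq!(to_kebab("very_tasty"), "very-tasty");
    assert_eq!(to_screaming_kebab("very_tasty"), "VERY-TASTY");
    assert_eq!(to_pascal("very_tasty"), "VeryTasty");
}
```

Note how single-word and digit-bearing names ("a", "z42") are fixed points of the kebab transforms, matching the table rows above.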
@@ -15,6 +15,7 @@ use Ctxt;
 pub fn check(cx: &Ctxt, cont: &Container) {
     check_getter(cx, cont);
     check_identifier(cx, cont);
+    check_variant_skip_attrs(cx, cont);
 }
 
 /// Getters are only allowed inside structs (not enums) with the `remote`
@@ -30,7 +31,7 @@ fn check_getter(cx: &Ctxt, cont: &Container) {
     if cont.body.has_getter() && cont.attrs.remote().is_none() {
         cx.error(
             "#[serde(getter = \"...\")] can only be used in structs \
              that have #[serde(remote = \"...\")]",
         );
     }
 }
@@ -52,10 +53,13 @@ fn check_identifier(cx: &Ctxt, cont: &Container) {
     };
 
     for (i, variant) in variants.iter().enumerate() {
-        match (variant.style, cont.attrs.identifier(), variant.attrs.other()) {
+        match (
+            variant.style,
+            cont.attrs.identifier(),
+            variant.attrs.other(),
+        ) {
             // The `other` attribute may only be used in a field_identifier.
-            (_, Identifier::Variant, true) |
-            (_, Identifier::No, true) => {
+            (_, Identifier::Variant, true) | (_, Identifier::No, true) => {
                 cx.error("#[serde(other)] may only be used inside a field_identifier");
             }
 
@@ -94,3 +98,74 @@ fn check_identifier(cx: &Ctxt, cont: &Container) {
         }
     }
 }
+
+/// Skip-(de)serializing attributes are not allowed on variants marked
+/// (de)serialize_with.
+fn check_variant_skip_attrs(cx: &Ctxt, cont: &Container) {
+    let variants = match cont.body {
+        Body::Enum(ref variants) => variants,
+        Body::Struct(_, _) => {
+            return;
+        }
+    };
+
+    for variant in variants.iter() {
+        if variant.attrs.serialize_with().is_some() {
+            if variant.attrs.skip_serializing() {
+                cx.error(format!(
+                    "variant `{}` cannot have both #[serde(serialize_with)] and \
+                     #[serde(skip_serializing)]",
+                    variant.ident
+                ));
+            }
+
+            for (i, field) in variant.fields.iter().enumerate() {
+                let ident = field
+                    .ident
+                    .as_ref()
+                    .map_or_else(|| format!("{}", i), |ident| format!("`{}`", ident));
+
+                if field.attrs.skip_serializing() {
+                    cx.error(format!(
+                        "variant `{}` cannot have both #[serde(serialize_with)] and \
+                         a field {} marked with #[serde(skip_serializing)]",
+                        variant.ident, ident
+                    ));
+                }
+
+                if field.attrs.skip_serializing_if().is_some() {
+                    cx.error(format!(
+                        "variant `{}` cannot have both #[serde(serialize_with)] and \
+                         a field {} marked with #[serde(skip_serializing_if)]",
+                        variant.ident, ident
+                    ));
+                }
+            }
+        }
+
+        if variant.attrs.deserialize_with().is_some() {
+            if variant.attrs.skip_deserializing() {
+                cx.error(format!(
+                    "variant `{}` cannot have both #[serde(deserialize_with)] and \
+                     #[serde(skip_deserializing)]",
+                    variant.ident
+                ));
+            }
+
+            for (i, field) in variant.fields.iter().enumerate() {
+                if field.attrs.skip_deserializing() {
+                    let ident = field
+                        .ident
+                        .as_ref()
+                        .map_or_else(|| format!("{}", i), |ident| format!("`{}`", ident));
+
+                    cx.error(format!(
+                        "variant `{}` cannot have both #[serde(deserialize_with)] \
+                         and a field {} marked with #[serde(skip_deserializing)]",
+                        variant.ident, ident
+                    ));
+                }
+            }
+        }
+    }
+}
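The new `check_variant_skip_attrs` pass rejects contradictory attribute combinations at derive time. The general shape of such a validation pass can be sketched with plain data; `VariantAttrs` below is an invented stand-in for serde_derive_internals' attribute model, not its real type:

```rust
// Minimal sketch of an attribute-conflict validation pass. The types and
// field names here are hypothetical illustrations only.

#[derive(Default)]
struct VariantAttrs {
    serialize_with: Option<String>, // path given in #[serde(serialize_with = "...")]
    skip_serializing: bool,         // presence of #[serde(skip_serializing)]
}

fn check_variant(name: &str, attrs: &VariantAttrs, errors: &mut Vec<String>) {
    // serialize_with takes over serialization of the whole variant, so a
    // skip attribute on the same variant can never take effect.
    if attrs.serialize_with.is_some() && attrs.skip_serializing {
        errors.push(format!(
            "variant `{}` cannot have both #[serde(serialize_with)] and #[serde(skip_serializing)]",
            name
        ));
    }
}

fn main() {
    let mut errors = Vec::new();
    let bad = VariantAttrs {
        serialize_with: Some("path::to::ser".to_owned()),
        skip_serializing: true,
    };
    check_variant("Bad", &bad, &mut errors);
    assert_eq!(errors.len(), 1);
}
```

Accumulating errors in a context rather than aborting on the first one mirrors how `Ctxt` collects every problem before the derive fails.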
@@ -16,7 +16,9 @@ pub struct Ctxt {
 
 impl Ctxt {
     pub fn new() -> Self {
-        Ctxt { errors: RefCell::new(Some(Vec::new())) }
+        Ctxt {
+            errors: RefCell::new(Some(Vec::new())),
+        }
     }
 
     pub fn error<T: Display>(&self, msg: T) {
@@ -6,7 +6,7 @@
 // option. This file may not be copied, modified, or distributed
 // except according to those terms.
 
-#![doc(html_root_url = "https://docs.rs/serde_derive_internals/0.15.0")]
+#![doc(html_root_url = "https://docs.rs/serde_derive_internals/0.18.1")]
 
 extern crate syn;
 #[macro_use]
@@ -1,6 +1,6 @@
 [package]
 name = "serde_test"
-version = "1.0.2" # remember to update html_root_url
+version = "1.0.25" # remember to update html_root_url
 authors = ["Erick Tryzelaar <erick.tryzelaar@gmail.com>", "David Tolnay <dtolnay@gmail.com>"]
 license = "MIT/Apache-2.0"
 description = "Token De/Serializer for testing De/Serialize implementations"
@@ -12,10 +12,10 @@ readme = "README.md"
 include = ["Cargo.toml", "src/**/*.rs", "README.md", "LICENSE-APACHE", "LICENSE-MIT"]
 
 [dependencies]
-serde = { version = "1.0", path = "../serde" }
+serde = { version = "1.0.16", path = "../serde" }
 
 [dev-dependencies]
-serde = { version = "1.0", path = "../serde", features = ["rc"] }
+serde = { version = "1.0.16", path = "../serde", features = ["rc"] }
 serde_derive = { version = "1.0", path = "../serde_derive" }
 
 [badges]
@@ -6,7 +6,7 @@
 // option. This file may not be copied, modified, or distributed
 // except according to those terms.
 
-use serde::{Serialize, Deserialize};
+use serde::{Deserialize, Serialize};
 
 use de::Deserializer;
 use ser::Serializer;
@@ -184,11 +184,27 @@ where
     T: Deserialize<'de> + PartialEq + Debug,
 {
     let mut de = Deserializer::new(tokens);
-    match T::deserialize(&mut de) {
-        Ok(v) => assert_eq!(v, *value),
+    let mut deserialized_val = match T::deserialize(&mut de) {
+        Ok(v) => {
+            assert_eq!(v, *value);
+            v
+        }
         Err(e) => panic!("tokens failed to deserialize: {}", e),
+    };
+    if de.remaining() > 0 {
+        panic!("{} remaining tokens", de.remaining());
     }
 
+    // Do the same thing for deserialize_in_place. This isn't *great* because a
+    // no-op impl of deserialize_in_place can technically succeed here. Still,
+    // this should catch a lot of junk.
+    let mut de = Deserializer::new(tokens);
+    match T::deserialize_in_place(&mut de, &mut deserialized_val) {
+        Ok(()) => {
+            assert_eq!(deserialized_val, *value);
+        }
+        Err(e) => panic!("tokens failed to deserialize_in_place: {}", e),
+    }
     if de.remaining() > 0 {
         panic!("{} remaining tokens", de.remaining());
     }
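The change above makes the token-based assertion run twice: once through the by-value `deserialize`, then again through `deserialize_in_place` against the already-populated value. The double-check shape can be sketched with a toy trait; `FromTokens` is invented here purely to illustrate why exercising both paths catches in-place implementations that disagree with the by-value ones:

```rust
// Toy stand-in for serde's Deserialize, with a by-value path and an
// in-place path that defaults to the by-value one (as serde's does).

#[derive(Debug, PartialEq)]
struct Point {
    x: i32,
    y: i32,
}

trait FromTokens: Sized {
    fn from_tokens(tokens: &[i32]) -> Self;

    // Default in-place path: rebuild by value and overwrite, mirroring
    // serde's default deserialize_in_place.
    fn from_tokens_in_place(tokens: &[i32], place: &mut Self) {
        *place = Self::from_tokens(tokens);
    }
}

impl FromTokens for Point {
    fn from_tokens(tokens: &[i32]) -> Self {
        Point { x: tokens[0], y: tokens[1] }
    }
}

// Assert both paths agree with the expected value, like the patched
// assert_de_tokens above.
fn assert_from_tokens<T: FromTokens + PartialEq + std::fmt::Debug>(value: &T, tokens: &[i32]) {
    let mut v = T::from_tokens(tokens);
    assert_eq!(&v, value);
    T::from_tokens_in_place(tokens, &mut v);
    assert_eq!(&v, value);
}

fn main() {
    assert_from_tokens(&Point { x: 1, y: 2 }, &[1, 2]);
}
```

As the diff's own comment notes, a no-op in-place implementation can still pass this check (the value already equals the target), so it is a junk filter rather than a proof of correctness.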
@@ -215,7 +231,7 @@ where
 ///
 /// assert_de_tokens_error::<S>(
 ///     &[
-///         Token::Struct { name: "S", len: 1 },
+///         Token::Struct { name: "S", len: 2 },
 ///         Token::Str("x"),
 ///     ],
 ///     "unknown field `x`, expected `a` or `b`",
@@ -0,0 +1,664 @@
|
|||||||
|
use std::fmt;
|
||||||
|
|
||||||
|
use serde::{Deserialize, Deserializer, Serialize, Serializer};
|
||||||
|
use serde::ser::{SerializeMap, SerializeSeq, SerializeStruct, SerializeStructVariant,
|
||||||
|
SerializeTuple, SerializeTupleStruct, SerializeTupleVariant};
|
||||||
|
|
||||||
|
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord)]
|
||||||
|
pub struct Readable<T: ?Sized>(T);
|
||||||
|
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord)]
|
||||||
|
pub struct Compact<T: ?Sized>(T);
|
||||||
|
|
||||||
|
/// Trait to determine whether a value is represented in human-readable or
|
||||||
|
/// compact form.
|
||||||
|
///
|
||||||
|
/// ```
|
||||||
|
/// extern crate serde;
|
||||||
|
/// extern crate serde_test;
|
||||||
|
///
|
||||||
|
/// use serde::{Deserialize, Deserializer, Serialize, Serializer};
|
||||||
|
/// use serde_test::{Configure, Token, assert_tokens};
|
||||||
|
///
|
||||||
|
/// #[derive(Debug, PartialEq)]
|
||||||
|
/// struct Example(u8, u8);
|
||||||
|
///
|
||||||
|
/// impl Serialize for Example {
|
||||||
|
/// fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
|
||||||
|
/// where S: Serializer,
|
||||||
|
/// {
|
||||||
|
/// if serializer.is_human_readable() {
|
||||||
|
/// format!("{}.{}", self.0, self.1).serialize(serializer)
|
||||||
|
/// } else {
|
||||||
|
/// (self.0, self.1).serialize(serializer)
|
||||||
|
/// }
|
||||||
|
/// }
|
||||||
|
/// }
|
||||||
|
///
|
||||||
|
/// impl<'de> Deserialize<'de> for Example {
|
||||||
|
/// fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
|
||||||
|
/// where D: Deserializer<'de>,
|
||||||
|
/// {
|
||||||
|
/// use serde::de::Error;
|
||||||
|
/// if deserializer.is_human_readable() {
|
||||||
|
/// let s = String::deserialize(deserializer)?;
|
||||||
|
/// let parts: Vec<_> = s.split('.').collect();
|
||||||
|
/// Ok(Example(
|
||||||
|
/// parts[0].parse().map_err(D::Error::custom)?,
|
||||||
|
/// parts[1].parse().map_err(D::Error::custom)?,
|
||||||
|
/// ))
|
||||||
|
/// } else {
|
||||||
|
/// let (x, y) = Deserialize::deserialize(deserializer)?;
|
||||||
|
/// Ok(Example(x, y))
|
||||||
|
/// }
|
||||||
|
/// }
|
||||||
|
/// }
|
||||||
|
///
|
||||||
|
/// fn main() {
|
||||||
|
/// assert_tokens(
|
||||||
|
/// &Example(1, 0).compact(),
|
||||||
|
/// &[
|
||||||
|
/// Token::Tuple { len: 2 },
|
||||||
|
/// Token::U8(1),
|
||||||
|
/// Token::U8(0),
|
||||||
|
/// Token::TupleEnd,
|
||||||
|
/// ],
|
||||||
|
/// );
|
||||||
|
/// assert_tokens(
|
||||||
|
/// &Example(1, 0).readable(),
|
||||||
|
/// &[
|
||||||
|
/// Token::Str("1.0"),
|
||||||
|
/// ],
|
||||||
|
/// );
|
||||||
|
/// }
|
||||||
|
/// ```
|
||||||
|
pub trait Configure {
|
||||||
|
/// Marks `self` as using `is_human_readable == true`
|
||||||
|
fn readable(self) -> Readable<Self>
|
||||||
|
where
|
||||||
|
Self: Sized,
|
||||||
|
{
|
||||||
|
Readable(self)
|
||||||
|
}
|
||||||
|
/// Marks `self` as using `is_human_readable == false`
|
||||||
|
fn compact(self) -> Compact<Self>
|
||||||
|
where
|
||||||
|
Self: Sized,
|
||||||
|
{
|
||||||
|
Compact(self)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
impl<T: ?Sized> Configure for T {}
|
||||||
|
|
||||||
|
impl<T: ?Sized> Serialize for Readable<T>
|
||||||
|
where
|
||||||
|
T: Serialize,
|
||||||
|
{
|
||||||
|
#[inline]
|
||||||
|
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
|
||||||
|
where
|
||||||
|
S: Serializer,
|
||||||
|
{
|
||||||
|
self.0.serialize(Readable(serializer))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
impl<T: ?Sized> Serialize for Compact<T>
|
||||||
|
where
|
||||||
|
T: Serialize,
|
||||||
|
{
|
||||||
|
#[inline]
|
||||||
|
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
|
||||||
|
where
|
||||||
|
S: Serializer,
|
||||||
|
{
|
||||||
|
self.0.serialize(Compact(serializer))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
impl<'de, T> Deserialize<'de> for Readable<T>
|
||||||
|
where
|
||||||
|
T: Deserialize<'de>,
|
||||||
|
{
|
||||||
|
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
|
||||||
|
where
|
||||||
|
D: Deserializer<'de>,
|
||||||
|
{
|
||||||
|
T::deserialize(Readable(deserializer)).map(Readable)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
impl<'de, T> Deserialize<'de> for Compact<T>
|
||||||
|
where
|
||||||
|
T: Deserialize<'de>,
|
||||||
|
{
|
||||||
|
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
|
||||||
|
where
|
||||||
|
D: Deserializer<'de>,
|
||||||
|
{
|
||||||
|
T::deserialize(Compact(deserializer)).map(Compact)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
impl<'de, T> DeserializeSeed<'de> for Readable<T>
|
||||||
|
where
|
||||||
|
T: DeserializeSeed<'de>,
|
||||||
|
{
|
||||||
|
type Value = T::Value;
|
||||||
|
|
||||||
|
fn deserialize<D>(self, deserializer: D) -> Result<Self::Value, D::Error>
|
||||||
|
where
|
||||||
|
D: Deserializer<'de>,
|
||||||
|
{
|
||||||
|
self.0.deserialize(Readable(deserializer))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
impl<'de, T> DeserializeSeed<'de> for Compact<T>
|
||||||
|
where
|
||||||
|
T: DeserializeSeed<'de>,
|
||||||
|
{
|
||||||
|
type Value = T::Value;
|
||||||
|
|
||||||
|
fn deserialize<D>(self, deserializer: D) -> Result<Self::Value, D::Error>
|
||||||
|
where
|
||||||
|
D: Deserializer<'de>,
|
||||||
|
{
|
||||||
|
self.0.deserialize(Compact(deserializer))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
macro_rules! forward_method {
|
||||||
|
($name: ident (self $(, $arg: ident : $arg_type: ty)* ) -> $return_type: ty) => {
|
||||||
|
fn $name (self $(, $arg : $arg_type)* ) -> $return_type {
|
||||||
|
(self.0).$name( $($arg),* )
|
||||||
|
}
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
macro_rules! forward_serialize_methods {
|
||||||
|
( $( $name: ident $arg_type: ty ),* ) => {
|
||||||
|
$(
|
||||||
|
forward_method!($name(self, v : $arg_type) -> Result<Self::Ok, Self::Error>);
|
||||||
|
)*
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
macro_rules! impl_serializer {
|
||||||
|
($wrapper: ident, $is_human_readable : expr) => {
|
||||||
|
impl<S> Serializer for $wrapper<S>
|
||||||
|
where
|
||||||
|
S: Serializer,
|
||||||
|
{
|
||||||
|
type Ok = S::Ok;
|
||||||
|
type Error = S::Error;
|
||||||
|
|
||||||
|
type SerializeSeq = $wrapper<S::SerializeSeq>;
|
||||||
|
type SerializeTuple = $wrapper<S::SerializeTuple>;
|
||||||
|
type SerializeTupleStruct = $wrapper<S::SerializeTupleStruct>;
|
||||||
|
type SerializeTupleVariant = $wrapper<S::SerializeTupleVariant>;
|
||||||
|
type SerializeMap = $wrapper<S::SerializeMap>;
|
||||||
|
type SerializeStruct = $wrapper<S::SerializeStruct>;
|
||||||
|
type SerializeStructVariant = $wrapper<S::SerializeStructVariant>;
|
||||||
|
|
||||||
|
fn is_human_readable(&self) -> bool {
|
||||||
|
$is_human_readable
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
forward_serialize_methods!{
|
||||||
|
serialize_bool bool,
|
||||||
|
serialize_i8 i8,
|
||||||
|
serialize_i16 i16,
|
||||||
|
serialize_i32 i32,
|
||||||
|
serialize_i64 i64,
|
||||||
|
serialize_u8 u8,
|
||||||
|
serialize_u16 u16,
|
||||||
|
serialize_u32 u32,
|
||||||
|
serialize_u64 u64,
|
||||||
|
serialize_f32 f32,
|
||||||
|
serialize_f64 f64,
|
||||||
|
serialize_char char,
|
||||||
|
serialize_str &str,
|
||||||
|
serialize_bytes &[u8],
|
||||||
|
serialize_unit_struct &'static str
|
||||||
|
|
||||||
|
}
|
||||||
|
|
||||||
|
fn serialize_unit(self) -> Result<S::Ok, S::Error> {
|
||||||
|
self.0.serialize_unit()
|
||||||
|
}
|
||||||
|
|
||||||
|
fn serialize_unit_variant(
|
||||||
|
self,
|
||||||
|
name: &'static str,
|
||||||
|
variant_index: u32,
|
||||||
|
variant: &'static str,
|
||||||
|
) -> Result<S::Ok, S::Error> {
|
||||||
|
self.0.serialize_unit_variant(name, variant_index, variant)
|
||||||
|
}
|
||||||
|
|
||||||
|
fn serialize_newtype_struct<T: ?Sized>(
|
||||||
|
self,
|
||||||
|
name: &'static str,
|
||||||
|
value: &T,
|
||||||
|
) -> Result<S::Ok, S::Error>
|
||||||
|
where
|
||||||
|
T: Serialize,
|
||||||
|
{
|
||||||
|
self.0.serialize_newtype_struct(name, &$wrapper(value))
|
||||||
|
}
|
||||||
|
|
||||||
|
fn serialize_newtype_variant<T: ?Sized>(
|
||||||
|
self,
|
||||||
|
name: &'static str,
|
||||||
|
variant_index: u32,
|
||||||
|
variant: &'static str,
|
||||||
|
value: &T,
|
||||||
|
) -> Result<S::Ok, S::Error>
|
||||||
|
where
|
||||||
|
T: Serialize,
|
||||||
|
{
|
||||||
|
self.0
|
||||||
|
.serialize_newtype_variant(name, variant_index, variant, &$wrapper(value))
|
||||||
|
}
|
||||||
|
|
||||||
|
fn serialize_none(self) -> Result<S::Ok, Self::Error> {
|
||||||
|
self.0.serialize_none()
|
||||||
|
}
|
||||||
|
|
||||||
|
fn serialize_some<T: ?Sized>(self, value: &T) -> Result<S::Ok, Self::Error>
|
||||||
|
where
|
||||||
|
T: Serialize,
|
||||||
|
{
|
||||||
|
self.0.serialize_some(&$wrapper(value))
|
||||||
|
}
|
||||||
|
|
||||||
|
fn serialize_seq(self, len: Option<usize>) -> Result<Self::SerializeSeq, Self::Error> {
|
||||||
|
self.0.serialize_seq(len).map($wrapper)
|
||||||
|
}
|
||||||
|
|
||||||
|
fn serialize_tuple(self, len: usize) -> Result<Self::SerializeTuple, Self::Error> {
|
||||||
|
self.0.serialize_tuple(len).map($wrapper)
|
||||||
|
}
|
||||||
|
|
||||||
|
fn serialize_tuple_struct(
|
||||||
|
self,
|
||||||
|
name: &'static str,
|
||||||
|
len: usize,
|
||||||
|
) -> Result<Self::SerializeTupleStruct, Self::Error> {
|
||||||
|
self.0.serialize_tuple_struct(name, len).map($wrapper)
|
||||||
|
}
|
||||||
|
|
||||||
|
fn serialize_tuple_variant(
|
||||||
|
self,
|
||||||
|
name: &'static str,
|
||||||
|
variant_index: u32,
|
||||||
|
variant: &'static str,
|
||||||
|
len: usize,
|
||||||
|
) -> Result<Self::SerializeTupleVariant, Self::Error> {
|
||||||
|
self.0
|
||||||
|
.serialize_tuple_variant(name, variant_index, variant, len)
|
||||||
|
.map($wrapper)
|
||||||
|
}
|
||||||
|
|
||||||
|
fn serialize_map(self, len: Option<usize>) -> Result<Self::SerializeMap, Self::Error> {
|
||||||
|
self.0.serialize_map(len).map($wrapper)
|
||||||
|
}
|
||||||
|
|
||||||
|
fn serialize_struct(
|
||||||
|
self,
|
||||||
|
name: &'static str,
|
||||||
|
len: usize,
|
||||||
|
) -> Result<Self::SerializeStruct, Self::Error> {
|
||||||
|
self.0.serialize_struct(name, len).map($wrapper)
|
||||||
|
}
|
||||||
|
|
||||||
|
fn serialize_struct_variant(
|
||||||
|
self,
|
||||||
|
name: &'static str,
|
||||||
|
variant_index: u32,
|
||||||
|
variant: &'static str,
|
||||||
|
len: usize,
|
||||||
|
) -> Result<Self::SerializeStructVariant, Self::Error> {
|
||||||
|
self.0
|
||||||
|
.serialize_struct_variant(name, variant_index, variant, len)
|
||||||
|
.map($wrapper)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
impl<S> SerializeSeq for $wrapper<S>
|
||||||
|
where
|
||||||
|
S: SerializeSeq,
|
||||||
|
{
|
||||||
|
type Ok = S::Ok;
|
||||||
|
type Error = S::Error;
|
||||||
|
fn serialize_element<T: ?Sized>(&mut self, value: &T) -> Result<(), S::Error>
|
||||||
|
where
|
||||||
|
T: Serialize,
|
||||||
|
{
|
||||||
|
self.0.serialize_element(&$wrapper(value))
|
||||||
|
}
|
||||||
|
fn end(self) -> Result<S::Ok, S::Error> {
|
||||||
|
self.0.end()
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
impl<S> SerializeTuple for $wrapper<S>
|
||||||
|
where
|
||||||
|
S: SerializeTuple,
|
||||||
|
{
|
||||||
|
type Ok = S::Ok;
|
||||||
|
type Error = S::Error;
|
||||||
|
fn serialize_element<T: ?Sized>(&mut self, value: &T) -> Result<(), S::Error>
|
||||||
|
where
|
||||||
|
T: Serialize,
|
||||||
|
{
|
||||||
|
self.0.serialize_element(&$wrapper(value))
|
||||||
|
}
|
||||||
|
fn end(self) -> Result<S::Ok, S::Error> {
|
||||||
|
self.0.end()
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
impl<S> SerializeTupleStruct for $wrapper<S>
|
||||||
|
where
|
||||||
|
S: SerializeTupleStruct,
|
||||||
|
{
|
||||||
|
type Ok = S::Ok;
|
||||||
|
type Error = S::Error;
|
||||||
|
fn serialize_field<T: ?Sized>(&mut self, value: &T) -> Result<(), S::Error>
|
||||||
|
where
|
||||||
|
T: Serialize,
|
||||||
|
{
|
||||||
|
self.0.serialize_field(&$wrapper(value))
|
||||||
|
}
|
||||||
|
fn end(self) -> Result<S::Ok, S::Error> {
|
||||||
|
self.0.end()
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
impl<S> SerializeTupleVariant for $wrapper<S>
|
||||||
|
where
|
||||||
|
S: SerializeTupleVariant,
|
||||||
|
{
|
||||||
|
type Ok = S::Ok;
|
||||||
|
type Error = S::Error;
|
||||||
|
fn serialize_field<T: ?Sized>(&mut self, value: &T) -> Result<(), S::Error>
|
||||||
|
where
|
||||||
|
T: Serialize,
|
||||||
|
{
|
||||||
|
self.0.serialize_field(&$wrapper(value))
|
||||||
|
}
|
||||||
|
fn end(self) -> Result<S::Ok, S::Error> {
|
||||||
|
self.0.end()
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
impl<S> SerializeMap for $wrapper<S>
|
||||||
|
where
|
||||||
|
S: SerializeMap,
|
||||||
|
{
|
||||||
|
type Ok = S::Ok;
|
||||||
|
type Error = S::Error;
|
||||||
|
fn serialize_key<T: ?Sized>(&mut self, key: &T) -> Result<(), S::Error>
|
||||||
|
where
|
||||||
|
T: Serialize,
|
||||||
|
{
|
||||||
|
self.0.serialize_key(&$wrapper(key))
|
||||||
|
}
|
||||||
|
fn serialize_value<T: ?Sized>(&mut self, value: &T) -> Result<(), S::Error>
|
||||||
|
where
|
||||||
|
T: Serialize,
|
||||||
|
{
|
||||||
|
self.0.serialize_value(&$wrapper(value))
|
||||||
|
}
|
||||||
|
fn serialize_entry<K: ?Sized, V: ?Sized>(&mut self, key: &K, value: &V) -> Result<(), S::Error>
|
||||||
|
where
|
||||||
|
K: Serialize,
|
||||||
|
V: Serialize,
|
||||||
|
{
|
||||||
|
self.0.serialize_entry(key, &$wrapper(value))
|
||||||
|
}
|
||||||
|
fn end(self) -> Result<S::Ok, S::Error> {
|
||||||
|
self.0.end()
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
impl<S> SerializeStruct for $wrapper<S>
|
||||||
|
where
|
||||||
|
S: SerializeStruct,
|
||||||
|
{
|
||||||
|
type Ok = S::Ok;
|
||||||
|
type Error = S::Error;
|
||||||
|
fn serialize_field<T: ?Sized>(&mut self, name: &'static str, field: &T) -> Result<(), S::Error>
|
||||||
|
where
|
||||||
|
T: Serialize,
|
||||||
|
{
|
||||||
|
self.0.serialize_field(name, &$wrapper(field))
|
||||||
|
}
|
||||||
|
fn end(self) -> Result<S::Ok, S::Error> {
|
||||||
|
self.0.end()
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
impl<S> SerializeStructVariant for $wrapper<S>
|
||||||
|
where
|
||||||
|
S: SerializeStructVariant,
|
||||||
|
{
|
||||||
|
type Ok = S::Ok;
|
||||||
|
type Error = S::Error;
|
||||||
|
fn serialize_field<T: ?Sized>(&mut self, name: &'static str, field: &T) -> Result<(), S::Error>
|
||||||
|
where
|
||||||
|
T: Serialize,
|
||||||
|
{
|
||||||
|
self.0.serialize_field(name, &$wrapper(field))
|
||||||
|
}
|
||||||
|
fn end(self) -> Result<S::Ok, S::Error> {
|
||||||
|
self.0.end()
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
impl_serializer!(Readable, true);
|
||||||
|
impl_serializer!(Compact, false);
|
||||||
|
|
||||||
|
use serde::de::{DeserializeSeed, EnumAccess, Error, MapAccess, SeqAccess, VariantAccess, Visitor};
|
||||||
|
|
||||||
|
macro_rules! forward_deserialize_methods {
|
||||||
|
( $wrapper : ident ( $( $name: ident ),* ) ) => {
|
||||||
|
$(
|
||||||
|
fn $name<V>(self, visitor: V) -> Result<V::Value, D::Error> where V: Visitor<'de> {
|
||||||
|
(self.0).$name($wrapper(visitor))
|
||||||
|
}
|
||||||
|
)*
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
macro_rules! impl_deserializer {
|
||||||
|
($wrapper : ident, $is_human_readable : expr) => {
|
||||||
|
impl <'de, D> Deserializer<'de> for $wrapper<D> where D: Deserializer<'de> {
|
||||||
|
type Error = D::Error;
|
||||||
|
|
||||||
|
forward_deserialize_methods! {
|
||||||
|
$wrapper (
|
||||||
|
deserialize_any,
|
||||||
|
deserialize_bool,
|
||||||
|
deserialize_u8,
|
||||||
|
deserialize_u16,
|
||||||
|
deserialize_u32,
|
||||||
|
deserialize_u64,
|
||||||
|
deserialize_i8,
|
||||||
|
deserialize_i16,
|
||||||
|
deserialize_i32,
|
||||||
|
deserialize_i64,
|
||||||
|
deserialize_f32,
|
||||||
|
deserialize_f64,
|
||||||
|
deserialize_char,
|
||||||
|
deserialize_str,
|
||||||
|
deserialize_string,
|
||||||
|
deserialize_bytes,
|
||||||
|
deserialize_byte_buf,
|
||||||
|
deserialize_option,
|
||||||
|
deserialize_unit,
|
||||||
|
deserialize_seq,
|
||||||
|
deserialize_map,
|
||||||
|
deserialize_identifier,
|
||||||
|
deserialize_ignored_any
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
fn deserialize_unit_struct<V>(self, name: &'static str, visitor: V) -> Result<V::Value, D::Error> where V: Visitor<'de> {
|
||||||
|
self.0.deserialize_unit_struct(name, $wrapper(visitor))
|
||||||
|
}
|
||||||
|
fn deserialize_newtype_struct<V>(self, name: &'static str, visitor: V) -> Result<V::Value, D::Error> where V: Visitor<'de> {
|
||||||
|
self.0.deserialize_newtype_struct(name, $wrapper(visitor))
|
||||||
|
}
|
||||||
|
fn deserialize_tuple<V>(self, len: usize, visitor: V) -> Result<V::Value, D::Error> where V: Visitor<'de> {
|
||||||
|
self.0.deserialize_tuple(len, $wrapper(visitor))
|
||||||
|
}
|
||||||
|
fn deserialize_tuple_struct<V>(self, name: &'static str, len: usize, visitor: V) -> Result<V::Value, D::Error> where V: Visitor<'de> {
|
||||||
|
self.0.deserialize_tuple_struct(name, len, $wrapper(visitor))
|
||||||
|
}
|
||||||
|
fn deserialize_struct<V>(self, name: &'static str, fields: &'static [&'static str], visitor: V) -> Result<V::Value, D::Error> where V: Visitor<'de> {
|
||||||
|
self.0.deserialize_struct(name, fields, $wrapper(visitor))
|
||||||
|
}
|
||||||
|
fn deserialize_enum<V>(self, name: &'static str, variants: &'static [&'static str], visitor: V) -> Result<V::Value, D::Error> where V: Visitor<'de> {
|
||||||
|
self.0.deserialize_enum(name, variants, $wrapper(visitor))
|
||||||
|
}
|
||||||
|
|
||||||
|
fn is_human_readable(&self) -> bool {
|
||||||
|
$is_human_readable
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
impl<'de, D> Visitor<'de> for $wrapper<D> where D: Visitor<'de> {
|
||||||
|
type Value = D::Value;
|
||||||
|
fn expecting(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
|
||||||
|
self.0.expecting(formatter)
|
||||||
|
}
|
||||||
|
fn visit_bool<E>(self, v: bool) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_bool(v)
|
||||||
|
}
|
||||||
|
fn visit_i8<E>(self, v: i8) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_i8(v)
|
||||||
|
}
|
||||||
|
fn visit_i16<E>(self, v: i16) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_i16(v)
|
||||||
|
}
|
||||||
|
fn visit_i32<E>(self, v: i32) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_i32(v)
|
||||||
|
}
|
||||||
|
fn visit_i64<E>(self, v: i64) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_i64(v)
|
||||||
|
}
|
||||||
|
fn visit_u8<E>(self, v: u8) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_u8(v)
|
||||||
|
}
|
||||||
|
fn visit_u16<E>(self, v: u16) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_u16(v)
|
||||||
|
}
|
||||||
|
fn visit_u32<E>(self, v: u32) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_u32(v)
|
||||||
|
}
|
||||||
|
fn visit_u64<E>(self, v: u64) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_u64(v)
|
||||||
|
}
|
||||||
|
fn visit_f32<E>(self, v: f32) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_f32(v)
|
||||||
|
}
|
||||||
|
fn visit_f64<E>(self, v: f64) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_f64(v)
|
||||||
|
}
|
||||||
|
fn visit_char<E>(self, v: char) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_char(v)
|
||||||
|
}
|
||||||
|
fn visit_str<E>(self, v: &str) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_str(v)
|
||||||
|
}
|
||||||
|
fn visit_borrowed_str<E>(self, v: &'de str) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_borrowed_str(v)
|
||||||
|
}
|
||||||
|
fn visit_string<E>(self, v: String) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_string(v)
|
||||||
|
}
|
||||||
|
fn visit_bytes<E>(self, v: &[u8]) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_bytes(v)
|
||||||
|
}
|
||||||
|
fn visit_borrowed_bytes<E>(self, v: &'de [u8]) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_borrowed_bytes(v)
|
||||||
|
}
|
||||||
|
fn visit_byte_buf<E>(self, v: Vec<u8>) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_byte_buf(v)
|
||||||
|
}
|
||||||
|
fn visit_none<E>(self) -> Result<D::Value, E> where E: Error {
|
||||||
|
self.0.visit_none()
|
||||||
|
}
|
||||||
|
fn visit_some<D2>(self, deserializer: D2) -> Result<Self::Value, D2::Error> where D2: Deserializer<'de> {
|
            self.0.visit_some($wrapper(deserializer))
        }

        fn visit_unit<E>(self) -> Result<D::Value, E> where E: Error {
            self.0.visit_unit()
        }

        fn visit_newtype_struct<D2>(self, deserializer: D2) -> Result<Self::Value, D2::Error> where D2: Deserializer<'de> {
            self.0.visit_newtype_struct($wrapper(deserializer))
        }

        fn visit_seq<V>(self, seq: V) -> Result<D::Value, V::Error> where V: SeqAccess<'de> {
            self.0.visit_seq($wrapper(seq))
        }

        fn visit_map<V>(self, map: V) -> Result<D::Value, V::Error> where V: MapAccess<'de> {
            self.0.visit_map($wrapper(map))
        }

        fn visit_enum<V>(self, data: V) -> Result<D::Value, V::Error> where V: EnumAccess<'de> {
            self.0.visit_enum($wrapper(data))
        }
    }

    impl<'de, D> SeqAccess<'de> for $wrapper<D> where D: SeqAccess<'de> {
        type Error = D::Error;

        fn next_element_seed<T>(&mut self, seed: T) -> Result<Option<T::Value>, D::Error> where T: DeserializeSeed<'de> {
            self.0.next_element_seed($wrapper(seed))
        }

        fn size_hint(&self) -> Option<usize> {
            self.0.size_hint()
        }
    }

    impl<'de, D> MapAccess<'de> for $wrapper<D> where D: MapAccess<'de> {
        type Error = D::Error;

        fn next_key_seed<K>(&mut self, seed: K) -> Result<Option<K::Value>, D::Error> where K: DeserializeSeed<'de> {
            self.0.next_key_seed($wrapper(seed))
        }

        fn next_value_seed<V>(&mut self, seed: V) -> Result<V::Value, D::Error> where V: DeserializeSeed<'de> {
            self.0.next_value_seed($wrapper(seed))
        }

        fn size_hint(&self) -> Option<usize> {
            self.0.size_hint()
        }
    }

    impl<'de, D> EnumAccess<'de> for $wrapper<D> where D: EnumAccess<'de> {
        type Error = D::Error;
        type Variant = $wrapper<D::Variant>;

        fn variant_seed<V>(self, seed: V) -> Result<(V::Value, Self::Variant), Self::Error> where V: DeserializeSeed<'de> {
            self.0.variant_seed($wrapper(seed)).map(|(value, variant)| (value, $wrapper(variant)))
        }
    }

    impl<'de, D> VariantAccess<'de> for $wrapper<D> where D: VariantAccess<'de> {
        type Error = D::Error;

        fn unit_variant(self) -> Result<(), D::Error> {
            self.0.unit_variant()
        }

        fn newtype_variant_seed<T>(self, seed: T) -> Result<T::Value, D::Error> where T: DeserializeSeed<'de> {
            self.0.newtype_variant_seed($wrapper(seed))
        }

        fn tuple_variant<V>(self, len: usize, visitor: V) -> Result<V::Value, D::Error> where V: Visitor<'de> {
            self.0.tuple_variant(len, $wrapper(visitor))
        }

        fn struct_variant<V>(self, fields: &'static [&'static str], visitor: V) -> Result<V::Value, D::Error> where V: Visitor<'de> {
            self.0.struct_variant(fields, $wrapper(visitor))
        }
    }
    }
}

impl_deserializer!(Readable, true);
impl_deserializer!(Compact, false);
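The `impl_deserializer!` macro above generates delegation newtypes: `Readable<D>` and `Compact<D>` wrap an inner value, forward every trait method to it, and only differ in how they answer the human-readable question. A minimal std-only sketch of that pattern (the `Access` trait and `Inner` type here are illustrative stand-ins, not serde_test's real types):

```rust
// Delegating-newtype pattern: the wrapper forwards most trait methods to the
// inner value and overrides the one decision it exists to make.
trait Access {
    fn size_hint(&self) -> Option<usize>;
    fn human_readable(&self) -> bool;
}

struct Inner(usize);

impl Access for Inner {
    fn size_hint(&self) -> Option<usize> {
        Some(self.0)
    }
    fn human_readable(&self) -> bool {
        // Mirrors the panic in the diff: the inner value cannot answer this;
        // only a wrapper may decide.
        panic!("wrap the value in Readable or Compact first")
    }
}

// `Readable`-style wrapper: pure delegation for `size_hint`, fixed answer
// for `human_readable`.
struct Readable<T>(T);

impl<T: Access> Access for Readable<T> {
    fn size_hint(&self) -> Option<usize> {
        self.0.size_hint() // forwarded unchanged
    }
    fn human_readable(&self) -> bool {
        true // the wrapper supplies the policy
    }
}

fn main() {
    let r = Readable(Inner(3));
    assert_eq!(r.size_hint(), Some(3));
    assert!(r.human_readable());
}
```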
+91 -71
@@ -95,15 +95,11 @@ impl<'de> Deserializer<'de> {
     where
         V: Visitor<'de>,
     {
-        let value = try!(
-            visitor.visit_seq(
-                DeserializerSeqVisitor {
-                    de: self,
-                    len: len,
-                    end: end.clone(),
-                },
-            )
-        );
+        let value = try!(visitor.visit_seq(DeserializerSeqVisitor {
+            de: self,
+            len: len,
+            end: end,
+        },));
         assert_next_token!(self, end);
         Ok(value)
     }
@@ -117,15 +113,11 @@ impl<'de> Deserializer<'de> {
     where
         V: Visitor<'de>,
     {
-        let value = try!(
-            visitor.visit_map(
-                DeserializerMapVisitor {
-                    de: self,
-                    len: len,
-                    end: end.clone(),
-                },
-            )
-        );
+        let value = try!(visitor.visit_map(DeserializerMapVisitor {
+            de: self,
+            len: len,
+            end: end,
+        },));
         assert_next_token!(self, end);
         Ok(value)
     }
@@ -165,15 +157,16 @@ impl<'de, 'a> de::Deserializer<'de> for &'a mut Deserializer<'de> {
             Token::ByteBuf(v) => visitor.visit_byte_buf(v.to_vec()),
             Token::None => visitor.visit_none(),
             Token::Some => visitor.visit_some(self),
-            Token::Unit => visitor.visit_unit(),
-            Token::UnitStruct { name: _ } => visitor.visit_unit(),
-            Token::NewtypeStruct { name: _ } => visitor.visit_newtype_struct(self),
+            Token::Unit | Token::UnitStruct { .. } => visitor.visit_unit(),
+            Token::NewtypeStruct { .. } => visitor.visit_newtype_struct(self),
             Token::Seq { len } => self.visit_seq(len, Token::SeqEnd, visitor),
             Token::Tuple { len } => self.visit_seq(Some(len), Token::TupleEnd, visitor),
-            Token::TupleStruct { name: _, len } => self.visit_seq(Some(len), Token::TupleStructEnd, visitor),
+            Token::TupleStruct { len, .. } => {
+                self.visit_seq(Some(len), Token::TupleStructEnd, visitor)
+            }
             Token::Map { len } => self.visit_map(len, Token::MapEnd, visitor),
-            Token::Struct { name: _, len } => self.visit_map(Some(len), Token::StructEnd, visitor),
-            Token::Enum { name: _ } => {
+            Token::Struct { len, .. } => self.visit_map(Some(len), Token::StructEnd, visitor),
+            Token::Enum { .. } => {
                 let variant = self.next_token();
                 let next = self.peek_token();
                 match (variant, next) {
@@ -195,18 +188,29 @@ impl<'de, 'a> de::Deserializer<'de> for &'a mut Deserializer<'de> {
                     }
                 }
             }
-            Token::UnitVariant { name: _, variant } => visitor.visit_str(variant),
-            Token::NewtypeVariant { name: _, variant } => {
-                visitor.visit_map(EnumMapVisitor::new(self, Token::Str(variant), EnumFormat::Any),)
-            }
-            Token::TupleVariant { name: _, variant, len: _ } => {
-                visitor.visit_map(EnumMapVisitor::new(self, Token::Str(variant), EnumFormat::Seq),)
-            }
-            Token::StructVariant { name: _, variant, len: _ } => {
-                visitor.visit_map(EnumMapVisitor::new(self, Token::Str(variant), EnumFormat::Map),)
-            }
-            Token::SeqEnd | Token::TupleEnd | Token::TupleStructEnd | Token::MapEnd |
-            Token::StructEnd | Token::TupleVariantEnd | Token::StructVariantEnd => {
+            Token::UnitVariant { variant, .. } => visitor.visit_str(variant),
+            Token::NewtypeVariant { variant, .. } => visitor.visit_map(EnumMapVisitor::new(
+                self,
+                Token::Str(variant),
+                EnumFormat::Any,
+            )),
+            Token::TupleVariant { variant, .. } => visitor.visit_map(EnumMapVisitor::new(
+                self,
+                Token::Str(variant),
+                EnumFormat::Seq,
+            )),
+            Token::StructVariant { variant, .. } => visitor.visit_map(EnumMapVisitor::new(
+                self,
+                Token::Str(variant),
+                EnumFormat::Map,
+            )),
+            Token::SeqEnd
+            | Token::TupleEnd
+            | Token::TupleStructEnd
+            | Token::MapEnd
+            | Token::StructEnd
+            | Token::TupleVariantEnd
+            | Token::StructVariantEnd => {
                 unexpected!(token);
             }
         }
@@ -217,8 +221,7 @@ impl<'de, 'a> de::Deserializer<'de> for &'a mut Deserializer<'de> {
         V: Visitor<'de>,
     {
         match self.peek_token() {
-            Token::Unit |
-            Token::None => {
+            Token::Unit | Token::None => {
                 self.next_token();
                 visitor.visit_none()
             }
@@ -245,10 +248,11 @@ impl<'de, 'a> de::Deserializer<'de> for &'a mut Deserializer<'de> {

                 visitor.visit_enum(DeserializerEnumVisitor { de: self })
             }
-            Token::UnitVariant { name: n, variant: _ } |
-            Token::NewtypeVariant { name: n, variant: _ } |
-            Token::TupleVariant { name: n, variant: _, len: _ } |
-            Token::StructVariant { name: n, variant: _, len: _ } if name == n => {
+            Token::UnitVariant { name: n, .. }
+            | Token::NewtypeVariant { name: n, .. }
+            | Token::TupleVariant { name: n, .. }
+            | Token::StructVariant { name: n, .. } if name == n =>
+            {
                 visitor.visit_enum(DeserializerEnumVisitor { de: self })
             }
             _ => {
@@ -262,7 +266,7 @@ impl<'de, 'a> de::Deserializer<'de> for &'a mut Deserializer<'de> {
         V: Visitor<'de>,
     {
         match self.peek_token() {
-            Token::UnitStruct { name: _ } => {
+            Token::UnitStruct { .. } => {
                 assert_next_token!(self, Token::UnitStruct { name: name });
                 visitor.visit_unit()
             }
@@ -270,12 +274,16 @@ impl<'de, 'a> de::Deserializer<'de> for &'a mut Deserializer<'de> {
         }
     }

-    fn deserialize_newtype_struct<V>(self, name: &'static str, visitor: V) -> Result<V::Value, Error>
+    fn deserialize_newtype_struct<V>(
+        self,
+        name: &'static str,
+        visitor: V,
+    ) -> Result<V::Value, Error>
     where
         V: Visitor<'de>,
     {
         match self.peek_token() {
-            Token::NewtypeStruct { name: _ } => {
+            Token::NewtypeStruct { .. } => {
                 assert_next_token!(self, Token::NewtypeStruct { name: name });
                 visitor.visit_newtype_struct(self)
             }
@@ -288,20 +296,19 @@ impl<'de, 'a> de::Deserializer<'de> for &'a mut Deserializer<'de> {
         V: Visitor<'de>,
     {
         match self.peek_token() {
-            Token::Unit |
-            Token::UnitStruct { name: _ } => {
+            Token::Unit | Token::UnitStruct { .. } => {
                 self.next_token();
                 visitor.visit_unit()
             }
-            Token::Seq { len: _ } => {
+            Token::Seq { .. } => {
                 self.next_token();
                 self.visit_seq(Some(len), Token::SeqEnd, visitor)
             }
-            Token::Tuple { len: _ } => {
+            Token::Tuple { .. } => {
                 self.next_token();
                 self.visit_seq(Some(len), Token::TupleEnd, visitor)
             }
-            Token::TupleStruct { name: _, len: _ } => {
+            Token::TupleStruct { .. } => {
                 self.next_token();
                 self.visit_seq(Some(len), Token::TupleStructEnd, visitor)
             }
@@ -323,19 +330,19 @@ impl<'de, 'a> de::Deserializer<'de> for &'a mut Deserializer<'de> {
                 self.next_token();
                 visitor.visit_unit()
             }
-            Token::UnitStruct { name: _ } => {
+            Token::UnitStruct { .. } => {
                 assert_next_token!(self, Token::UnitStruct { name: name });
                 visitor.visit_unit()
             }
-            Token::Seq { len: _ } => {
+            Token::Seq { .. } => {
                 self.next_token();
                 self.visit_seq(Some(len), Token::SeqEnd, visitor)
             }
-            Token::Tuple { len: _ } => {
+            Token::Tuple { .. } => {
                 self.next_token();
                 self.visit_seq(Some(len), Token::TupleEnd, visitor)
             }
-            Token::TupleStruct { name: _, len: n } => {
+            Token::TupleStruct { len: n, .. } => {
                 assert_next_token!(self, Token::TupleStruct { name: name, len: n });
                 self.visit_seq(Some(len), Token::TupleStructEnd, visitor)
             }
@@ -353,17 +360,24 @@ impl<'de, 'a> de::Deserializer<'de> for &'a mut Deserializer<'de> {
         V: Visitor<'de>,
     {
         match self.peek_token() {
-            Token::Struct { name: _, len: n } => {
+            Token::Struct { len: n, .. } => {
                 assert_next_token!(self, Token::Struct { name: name, len: n });
                 self.visit_map(Some(fields.len()), Token::StructEnd, visitor)
             }
-            Token::Map { len: _ } => {
+            Token::Map { .. } => {
                 self.next_token();
                 self.visit_map(Some(fields.len()), Token::MapEnd, visitor)
             }
             _ => self.deserialize_any(visitor),
         }
     }
+
+    fn is_human_readable(&self) -> bool {
+        panic!(
+            "Types which have different human-readable and compact representations \
+             must explicitly mark their test cases with `serde_test::Configure`"
+        );
+    }
 }

 //////////////////////////////////////////////////////////////////////////
@@ -442,10 +456,10 @@ impl<'de, 'a> EnumAccess<'de> for DeserializerEnumVisitor<'a, 'de> {
         V: DeserializeSeed<'de>,
     {
         match self.de.peek_token() {
-            Token::UnitVariant { name: _, variant: v } |
-            Token::NewtypeVariant { name: _, variant: v } |
-            Token::TupleVariant { name: _, variant: v, len: _ } |
-            Token::StructVariant { name: _, variant: v, len: _ } => {
+            Token::UnitVariant { variant: v, .. }
+            | Token::NewtypeVariant { variant: v, .. }
+            | Token::TupleVariant { variant: v, .. }
+            | Token::StructVariant { variant: v, .. } => {
                 let de = v.into_deserializer();
                 let value = try!(seed.deserialize(de));
                 Ok((value, self))
@@ -463,7 +477,7 @@ impl<'de, 'a> VariantAccess<'de> for DeserializerEnumVisitor<'a, 'de> {

     fn unit_variant(self) -> Result<(), Error> {
         match self.de.peek_token() {
-            Token::UnitVariant { name: _, variant: _ } => {
+            Token::UnitVariant { .. } => {
                 self.de.next_token();
                 Ok(())
             }
@@ -476,7 +490,7 @@ impl<'de, 'a> VariantAccess<'de> for DeserializerEnumVisitor<'a, 'de> {
         T: DeserializeSeed<'de>,
     {
         match self.de.peek_token() {
-            Token::NewtypeVariant { name: _, variant: _ } => {
+            Token::NewtypeVariant { .. } => {
                 self.de.next_token();
                 seed.deserialize(self.de)
             }
@@ -489,7 +503,7 @@ impl<'de, 'a> VariantAccess<'de> for DeserializerEnumVisitor<'a, 'de> {
         V: Visitor<'de>,
     {
         match self.de.peek_token() {
-            Token::TupleVariant { name: _, variant: _, len: enum_len } => {
+            Token::TupleVariant { len: enum_len, .. } => {
                 let token = self.de.next_token();

                 if len == enum_len {
@@ -499,7 +513,9 @@ impl<'de, 'a> VariantAccess<'de> for DeserializerEnumVisitor<'a, 'de> {
                     unexpected!(token);
                 }
             }
-            Token::Seq { len: Some(enum_len) } => {
+            Token::Seq {
+                len: Some(enum_len),
+            } => {
                 let token = self.de.next_token();

                 if len == enum_len {
@@ -512,12 +528,16 @@ impl<'de, 'a> VariantAccess<'de> for DeserializerEnumVisitor<'a, 'de> {
         }
     }

-    fn struct_variant<V>(self, fields: &'static [&'static str], visitor: V) -> Result<V::Value, Error>
+    fn struct_variant<V>(
+        self,
+        fields: &'static [&'static str],
+        visitor: V,
+    ) -> Result<V::Value, Error>
     where
         V: Visitor<'de>,
     {
         match self.de.peek_token() {
-            Token::StructVariant { name: _, variant: _, len: enum_len } => {
+            Token::StructVariant { len: enum_len, .. } => {
                 let token = self.de.next_token();

                 if fields.len() == enum_len {
@@ -527,7 +547,9 @@ impl<'de, 'a> VariantAccess<'de> for DeserializerEnumVisitor<'a, 'de> {
                     unexpected!(token);
                 }
             }
-            Token::Map { len: Some(enum_len) } => {
+            Token::Map {
+                len: Some(enum_len),
+            } => {
                 let token = self.de.next_token();

                 if fields.len() == enum_len {
@@ -575,10 +597,8 @@ impl<'de, 'a> MapAccess<'de> for EnumMapVisitor<'a, 'de> {
     {
         match self.variant.take() {
             Some(Token::Str(variant)) => seed.deserialize(variant.into_deserializer()).map(Some),
-            Some(Token::Bytes(variant)) => {
-                seed.deserialize(BytesDeserializer { value: variant })
-                    .map(Some)
-            }
+            Some(Token::Bytes(variant)) => seed.deserialize(BytesDeserializer { value: variant })
+                .map(Some),
             Some(Token::U32(variant)) => seed.deserialize(variant.into_deserializer()).map(Some),
             Some(other) => unexpected!(other),
             None => Ok(None),
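Most hunks in the diff above swap exhaustive `field: _` bindings for the `..` rest pattern, which ignores any remaining fields while still letting individual ones be bound by name. A small self-contained illustration (the two-variant `Token` enum here is a stand-in, not serde_test's real `Token`):

```rust
// Struct-like enum variants matched with `..` instead of spelling out
// `field: _` for every ignored field, mirroring the diff's pattern change.
enum Token {
    UnitVariant { name: &'static str, variant: &'static str },
    TupleVariant { name: &'static str, variant: &'static str, len: usize },
}

fn describe(t: &Token) -> String {
    match t {
        // Old style would be: Token::UnitVariant { name: _, variant } => ...
        Token::UnitVariant { variant, .. } => format!("unit {}", variant),
        // `..` also lets us bind only the field we need, here `len`.
        Token::TupleVariant { len, .. } => format!("tuple of {}", len),
    }
}

fn main() {
    let u = Token::UnitVariant { name: "E", variant: "A" };
    let t = Token::TupleVariant { name: "E", variant: "B", len: 2 };
    assert_eq!(describe(&u), "unit A");
    assert_eq!(describe(&t), "tuple of 2");
}
```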
@@ -9,7 +9,7 @@
 use std::error;
 use std::fmt::{self, Display};

-use serde::{ser, de};
+use serde::{de, ser};

 #[derive(Clone, Debug)]
 pub struct Error {
@@ -17,14 +17,18 @@ pub struct Error {
 }

 impl ser::Error for Error {
-    fn custom<T: Display>(msg: T) -> Error {
-        Error { msg: msg.to_string() }
+    fn custom<T: Display>(msg: T) -> Self {
+        Error {
+            msg: msg.to_string(),
+        }
     }
 }

 impl de::Error for Error {
-    fn custom<T: Display>(msg: T) -> Error {
-        Error { msg: msg.to_string() }
+    fn custom<T: Display>(msg: T) -> Self {
+        Error {
+            msg: msg.to_string(),
+        }
     }
 }

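The `custom<T: Display>` constructors reformatted above accept anything printable and store its rendered form as a `String`. A runnable sketch of the same pattern with serde's `ser::Error`/`de::Error` traits left out (plain std only; the inherent `custom` method here is illustrative):

```rust
use std::error;
use std::fmt::{self, Display};

#[derive(Clone, Debug)]
pub struct Error {
    msg: String,
}

impl Error {
    // Any Display-able value becomes the stored message, exactly as in the
    // trait impls above.
    fn custom<T: Display>(msg: T) -> Self {
        Error {
            msg: msg.to_string(),
        }
    }
}

impl Display for Error {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        f.write_str(&self.msg)
    }
}

impl error::Error for Error {}

fn main() {
    let e = Error::custom(42); // even integers work, anything Display
    assert_eq!(e.to_string(), "42");
    assert_eq!(Error::custom("boom").to_string(), "boom");
}
```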
+12 -3
@@ -155,7 +155,13 @@
 //! # }
 //! ```

-#![doc(html_root_url = "https://docs.rs/serde_test/1.0.2")]
+#![doc(html_root_url = "https://docs.rs/serde_test/1.0.25")]
+#![cfg_attr(feature = "cargo-clippy", deny(clippy, clippy_pedantic))]
+// Whitelisted clippy lints
+#![cfg_attr(feature = "cargo-clippy", allow(float_cmp))]
+// Whitelisted clippy_pedantic lints
+#![cfg_attr(feature = "cargo-clippy",
+            allow(missing_docs_in_private_items, stutter, use_debug, use_self))]

 #[macro_use]
 extern crate serde;
@@ -164,12 +170,15 @@ mod ser;
 mod de;
 mod error;

+mod configure;
 mod token;
 mod assert;

 pub use token::Token;
-pub use assert::{assert_tokens, assert_ser_tokens, assert_ser_tokens_error, assert_de_tokens,
-                 assert_de_tokens_error};
+pub use assert::{assert_de_tokens, assert_de_tokens_error, assert_ser_tokens,
+                 assert_ser_tokens_error, assert_tokens};
+
+pub use configure::{Compact, Configure, Readable};

 // Not public API.
 #[doc(hidden)]
+47 -10
@@ -40,25 +40,43 @@ impl<'a> Serializer<'a> {

 macro_rules! assert_next_token {
     ($ser:expr, $expected:ident) => {
-        assert_next_token!($ser, $expected, Token::$expected, true);
+        assert_next_token!($ser, stringify!($expected), Token::$expected, true);
     };
     ($ser:expr, $expected:ident($v:expr)) => {
-        assert_next_token!($ser, $expected, Token::$expected(v), v == $v);
+        assert_next_token!(
+            $ser,
+            format_args!("{}({:?})", stringify!($expected), $v),
+            Token::$expected(v),
+            v == $v
+        );
     };
     ($ser:expr, $expected:ident { $($k:ident),* }) => {
         let compare = ($($k,)*);
-        assert_next_token!($ser, $expected, Token::$expected { $($k),* }, ($($k,)*) == compare);
+        let field_format = || {
+            use std::fmt::Write;
+            let mut buffer = String::new();
+            $(
+                write!(&mut buffer, "{}: {:?}, ", stringify!($k), $k).unwrap();
+            )*
+            buffer
+        };
+        assert_next_token!(
+            $ser,
+            format_args!("{} {{ {}}}", stringify!($expected), field_format()),
+            Token::$expected { $($k),* },
+            ($($k,)*) == compare
+        );
     };
-    ($ser:expr, $expected:ident, $pat:pat, $guard:expr) => {
+    ($ser:expr, $expected:expr, $pat:pat, $guard:expr) => {
         match $ser.next_token() {
             Some($pat) if $guard => {}
             Some(other) => {
                 panic!("expected Token::{} but serialized as {}",
-                       stringify!($expected), other);
+                       $expected, other);
             }
             None => {
                 panic!("expected Token::{} after end of serialized tokens",
-                       stringify!($expected));
+                       $expected);
             }
         }
     };
@@ -247,10 +265,16 @@ impl<'s, 'a> ser::Serializer for &'s mut Serializer<'a> {
             assert_next_token!(self, Str(variant));
             let len = Some(len);
             assert_next_token!(self, Seq { len });
-            Ok(Variant { ser: self, end: Token::SeqEnd })
+            Ok(Variant {
+                ser: self,
+                end: Token::SeqEnd,
+            })
         } else {
             assert_next_token!(self, TupleVariant { name, variant, len });
-            Ok(Variant { ser: self, end: Token::TupleVariantEnd })
+            Ok(Variant {
+                ser: self,
+                end: Token::TupleVariantEnd,
+            })
         }
     }

@@ -276,12 +300,25 @@ impl<'s, 'a> ser::Serializer for &'s mut Serializer<'a> {
             assert_next_token!(self, Str(variant));
             let len = Some(len);
             assert_next_token!(self, Map { len });
-            Ok(Variant { ser: self, end: Token::MapEnd })
+            Ok(Variant {
+                ser: self,
+                end: Token::MapEnd,
+            })
         } else {
             assert_next_token!(self, StructVariant { name, variant, len });
-            Ok(Variant { ser: self, end: Token::StructVariantEnd })
+            Ok(Variant {
+                ser: self,
+                end: Token::StructVariantEnd,
+            })
         }
     }
+
+    fn is_human_readable(&self) -> bool {
+        panic!(
+            "Types which have different human-readable and compact representations \
+             must explicitly mark their test cases with `serde_test::Configure`"
+        );
+    }
 }

 pub struct Variant<'s, 'a: 's> {
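The key change to `assert_next_token!` above is that the fallback arm now takes `$expected:expr` instead of `$expected:ident`, so the earlier arms can hand it a preformatted `format_args!` value rather than relying on `stringify!` in the final arm. A simplified, runnable macro showing the same ident-vs-expr dispatch (the `check_token!` macro here is hypothetical, not the real one):

```rust
// Two macro arms: an `:ident` arm that stringifies the token name itself,
// and an `:expr` fallback arm that accepts any Display-able expectation,
// mirroring the `$expected:ident` -> `$expected:expr` change in the diff.
macro_rules! check_token {
    ($actual:expr, $expected:ident) => {
        check_token!($actual, stringify!($expected))
    };
    ($actual:expr, $expected:expr) => {
        if $actual == $expected.to_string() {
            Ok(())
        } else {
            Err(format!(
                "expected Token::{} but serialized as {}",
                $expected, $actual
            ))
        }
    };
}

fn main() {
    // ident arm: the macro stringifies `Bool` for us.
    assert_eq!(check_token!("Bool", Bool), Ok(()));
    // expr arm: the caller formats the expectation, as with format_args!
    // in the rewritten assert_next_token!.
    let err = check_token!("U8", format_args!("{}({})", "U32", 7)).unwrap_err();
    assert_eq!(err, "expected Token::U32(7) but serialized as U8");
}
```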
+18 -4
@@ -232,7 +232,10 @@ pub enum Token {
     /// assert_tokens(&a, &[Token::UnitVariant { name: "E", variant: "A" }]);
     /// # }
     /// ```
-    UnitVariant { name: &'static str, variant: &'static str },
+    UnitVariant {
+        name: &'static str,
+        variant: &'static str,
+    },

     /// The header to a serialized newtype struct of the given name.
     ///
@@ -286,7 +289,10 @@ pub enum Token {
     /// ]);
     /// # }
     /// ```
-    NewtypeVariant { name: &'static str, variant: &'static str },
+    NewtypeVariant {
+        name: &'static str,
+        variant: &'static str,
+    },

     /// The header to a sequence.
     ///
@@ -391,7 +397,11 @@ pub enum Token {
     /// ]);
     /// # }
     /// ```
-    TupleVariant { name: &'static str, variant: &'static str, len: usize },
+    TupleVariant {
+        name: &'static str,
+        variant: &'static str,
+        len: usize,
+    },

     /// An indicator of the end of a tuple variant.
     TupleVariantEnd,
@@ -488,7 +498,11 @@ pub enum Token {
     /// ]);
     /// # }
     /// ```
-    StructVariant { name: &'static str, variant: &'static str, len: usize },
+    StructVariant {
+        name: &'static str,
+        variant: &'static str,
+        len: usize,
+    },

     /// An indicator of the end of a struct variant.
     StructVariantEnd,
@@ -10,9 +10,9 @@ unstable = ["serde/unstable", "compiletest_rs"]
 [dev-dependencies]
 fnv = "1.0"
 rustc-serialize = "0.3.16"
-serde = { path = "../serde" }
-serde_derive = { path = "../serde_derive" }
+serde = { path = "../serde", features = ["rc"] }
+serde_derive = { path = "../serde_derive", features = ["deserialize_in_place"] }
 serde_test = { path = "../serde_test" }

 [dependencies]
-compiletest_rs = { version = "0.2", optional = true }
+compiletest_rs = { version = "0.3", optional = true }
@@ -0,0 +1,11 @@
+// Copyright 2017 Serde Developers
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+#![feature(/*=============================================]
+#![=== Serde test suite requires a nightly compiler. ===]
+#![====================================================*/)]
@@ -4,5 +4,8 @@ version = "0.0.0"
 publish = false

 [dependencies]
+libc = { version = "0.2", default-features = false }
 serde = { path = "../../serde", default-features = false }
 serde_derive = { path = "../../serde_derive" }
+
+[workspace]
@@ -1,4 +1,12 @@
-#![feature(lang_items, start, libc, compiler_builtins_lib)]
+// Copyright 2017 Serde Developers
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+#![feature(lang_items, start, compiler_builtins_lib)]
 #![no_std]

 extern crate libc;
@@ -0,0 +1,21 @@
+// Copyright 2017 Serde Developers
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+#[macro_use]
+extern crate serde_derive;
+
+#[derive(Deserialize)]
+struct Str<'a>(&'a str);
+
+#[derive(Deserialize)] //~ ERROR: proc-macro derive panicked
+enum Test<'a> {
+    #[serde(borrow)] //~^^ HELP: duplicate serde attribute `borrow`
+    S(#[serde(borrow)] Str<'a>)
+}
+
+fn main() {}
@@ -0,0 +1,21 @@
+// Copyright 2017 Serde Developers
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+#[macro_use]
+extern crate serde_derive;
+
+#[derive(Deserialize)]
+struct Str<'a>(&'a str);
+
+#[derive(Deserialize)] //~ ERROR: proc-macro derive panicked
+enum Test<'a> {
+    #[serde(borrow)] //~^^ HELP: #[serde(borrow)] may only be used on newtype variants
+    S { s: Str<'a> }
+}
+
+fn main() {}
@@ -16,7 +16,7 @@ mod remote {
     }
 }

-#[derive(Serialize, Deserialize)] //~ ERROR: missing field `b` in initializer of `remote::S`
+#[derive(Serialize, Deserialize)]
 #[serde(remote = "remote::S")]
 struct S {
     a: u8, //~^^^ ERROR: missing field `b` in initializer of `remote::S`
@@ -18,7 +18,8 @@ mod remote {
 #[derive(Serialize, Deserialize)]
 #[serde(remote = "remote::S")]
 struct S {
-    b: u8, //~^^^ ERROR: no field `b` on type `&remote::S`
+    //~^^^ ERROR: struct `remote::S` has no field named `b`
+    b: u8, //~^^^^ ERROR: no field `b` on type `&remote::S`
 }

 fn main() {}
@@ -0,0 +1,19 @@
+// Copyright 2017 Serde Developers
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+#[macro_use]
+extern crate serde_derive;
+
+#[derive(Deserialize)] //~ ERROR: proc-macro derive panicked
+//~^ HELP: variant `Newtype` cannot have both #[serde(deserialize_with)] and a field 0 marked with #[serde(skip_deserializing)]
+enum Enum {
+    #[serde(deserialize_with = "deserialize_some_newtype_variant")]
+    Newtype(#[serde(skip_deserializing)] String),
+}
+
+fn main() { }
@@ -0,0 +1,23 @@
// Copyright 2017 Serde Developers
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

#[macro_use]
extern crate serde_derive;

#[derive(Deserialize)] //~ ERROR: proc-macro derive panicked
//~^ HELP: variant `Struct` cannot have both #[serde(deserialize_with)] and a field `f1` marked with #[serde(skip_deserializing)]
enum Enum {
    #[serde(deserialize_with = "deserialize_some_other_variant")]
    Struct {
        #[serde(skip_deserializing)]
        f1: String,
        f2: u8,
    },
}

fn main() { }
@@ -0,0 +1,19 @@
// Copyright 2017 Serde Developers
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

#[macro_use]
extern crate serde_derive;

#[derive(Deserialize)] //~ ERROR: proc-macro derive panicked
//~^ HELP: variant `Tuple` cannot have both #[serde(deserialize_with)] and a field 0 marked with #[serde(skip_deserializing)]
enum Enum {
    #[serde(deserialize_with = "deserialize_some_other_variant")]
    Tuple(#[serde(skip_deserializing)] String, u8),
}

fn main() { }
@@ -0,0 +1,20 @@
// Copyright 2017 Serde Developers
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

#[macro_use]
extern crate serde_derive;

#[derive(Deserialize)] //~ ERROR: proc-macro derive panicked
//~^ HELP: variant `Unit` cannot have both #[serde(deserialize_with)] and #[serde(skip_deserializing)]
enum Enum {
    #[serde(deserialize_with = "deserialize_some_unit_variant")]
    #[serde(skip_deserializing)]
    Unit,
}

fn main() { }
@@ -0,0 +1,19 @@
// Copyright 2017 Serde Developers
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

#[macro_use]
extern crate serde_derive;

#[derive(Serialize)] //~ ERROR: proc-macro derive panicked
//~^ HELP: variant `Newtype` cannot have both #[serde(serialize_with)] and a field 0 marked with #[serde(skip_serializing)]
enum Enum {
    #[serde(serialize_with = "serialize_some_newtype_variant")]
    Newtype(#[serde(skip_serializing)] String),
}

fn main() { }
@@ -0,0 +1,19 @@
// Copyright 2017 Serde Developers
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

#[macro_use]
extern crate serde_derive;

#[derive(Serialize)] //~ ERROR: proc-macro derive panicked
//~^ HELP: variant `Newtype` cannot have both #[serde(serialize_with)] and a field 0 marked with #[serde(skip_serializing_if)]
enum Enum {
    #[serde(serialize_with = "serialize_some_newtype_variant")]
    Newtype(#[serde(skip_serializing_if = "always")] String),
}

fn main() { }
@@ -0,0 +1,23 @@
// Copyright 2017 Serde Developers
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

#[macro_use]
extern crate serde_derive;

#[derive(Serialize)] //~ ERROR: proc-macro derive panicked
//~^ HELP: variant `Struct` cannot have both #[serde(serialize_with)] and a field `f1` marked with #[serde(skip_serializing)]
enum Enum {
    #[serde(serialize_with = "serialize_some_other_variant")]
    Struct {
        #[serde(skip_serializing)]
        f1: String,
        f2: u8,
    },
}

fn main() { }
@@ -0,0 +1,23 @@
// Copyright 2017 Serde Developers
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

#[macro_use]
extern crate serde_derive;

#[derive(Serialize)] //~ ERROR: proc-macro derive panicked
//~^ HELP: variant `Struct` cannot have both #[serde(serialize_with)] and a field `f1` marked with #[serde(skip_serializing_if)]
enum Enum {
    #[serde(serialize_with = "serialize_some_newtype_variant")]
    Struct {
        #[serde(skip_serializing_if = "always")]
        f1: String,
        f2: u8,
    },
}

fn main() { }
@@ -0,0 +1,19 @@
// Copyright 2017 Serde Developers
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

#[macro_use]
extern crate serde_derive;

#[derive(Serialize)] //~ ERROR: proc-macro derive panicked
//~^ HELP: variant `Tuple` cannot have both #[serde(serialize_with)] and a field 0 marked with #[serde(skip_serializing)]
enum Enum {
    #[serde(serialize_with = "serialize_some_other_variant")]
    Tuple(#[serde(skip_serializing)] String, u8),
}

fn main() { }
@@ -0,0 +1,19 @@
// Copyright 2017 Serde Developers
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

#[macro_use]
extern crate serde_derive;

#[derive(Serialize)] //~ ERROR: proc-macro derive panicked
//~^ HELP: variant `Tuple` cannot have both #[serde(serialize_with)] and a field 0 marked with #[serde(skip_serializing_if)]
enum Enum {
    #[serde(serialize_with = "serialize_some_other_variant")]
    Tuple(#[serde(skip_serializing_if = "always")] String, u8),
}

fn main() { }
@@ -0,0 +1,20 @@
// Copyright 2017 Serde Developers
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

#[macro_use]
extern crate serde_derive;

#[derive(Serialize)] //~ ERROR: proc-macro derive panicked
//~^ HELP: variant `Unit` cannot have both #[serde(serialize_with)] and #[serde(skip_serializing)]
enum Enum {
    #[serde(serialize_with = "serialize_some_unit_variant")]
    #[serde(skip_serializing)]
    Unit,
}

fn main() { }
@@ -13,7 +13,7 @@ extern crate compiletest_rs as compiletest;
 use std::env;

 fn run_mode(mode: &'static str) {
-    let mut config = compiletest::default_config();
+    let mut config = compiletest::Config::default();

     config.mode = mode.parse().expect("invalid mode");
     config.target_rustcflags = Some("-L deps/target/debug/deps".to_owned());
@@ -73,3 +73,29 @@ macro_rules! hashmap {
         }
     }
 }
+
+macro_rules! seq_impl {
+    (seq $first:expr,) => {
+        seq_impl!(seq $first)
+    };
+    ($first:expr,) => {
+        seq_impl!($first)
+    };
+    (seq $first:expr) => {
+        $first.into_iter()
+    };
+    ($first:expr) => {
+        Some($first).into_iter()
+    };
+    (seq $first:expr , $($elem:tt)*) => {
+        $first.into_iter().chain(seq!( $($elem)* ))
+    };
+    ($first:expr , $($elem:tt)*) => {
+        Some($first).into_iter().chain(seq!( $($elem)* ))
+    }
+}
+
+macro_rules! seq {
+    ($($tt:tt)*) => {
+        seq_impl!($($tt)*).collect::<Vec<_>>()
+    };
+}
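The `seq!`/`seq_impl!` helpers added above build one flat `Vec` from a mix of single elements (wrapped in `Some(..)` and iterated) and `seq`-prefixed iterables (spliced in whole via `chain`). A standalone sketch of the same macros with an illustrative call (the `seq vec![2, 3]` input is my example, not from the test suite):

```rust
// Standalone copy of the test-support macros: a plain element expands to
// Some(elem).into_iter(), a `seq`-prefixed element expands to
// elem.into_iter(), and successive pieces are chained, then collected.
macro_rules! seq_impl {
    (seq $first:expr,) => { seq_impl!(seq $first) };
    ($first:expr,) => { seq_impl!($first) };
    (seq $first:expr) => { $first.into_iter() };
    ($first:expr) => { Some($first).into_iter() };
    (seq $first:expr , $($elem:tt)*) => {
        $first.into_iter().chain(seq!( $($elem)* ))
    };
    ($first:expr , $($elem:tt)*) => {
        Some($first).into_iter().chain(seq!( $($elem)* ))
    }
}

macro_rules! seq {
    ($($tt:tt)*) => {
        seq_impl!($($tt)*).collect::<Vec<_>>()
    };
}

fn main() {
    // `seq vec![2, 3]` splices the whole vector between the scalars 1 and 4.
    let v: Vec<i32> = seq![1, seq vec![2, 3], 4];
    assert_eq!(v, vec![1, 2, 3, 4]);
    println!("{:?}", v);
}
```

This is why the compact socket-address tests below can write `seq b"1234".iter().map(|&b| Token::U8(b))` between literal `Token` values: the mapped iterator is flattened into the surrounding token list.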
File diff suppressed because it is too large
@@ -13,7 +13,7 @@ extern crate serde;
 use serde::{Deserialize, Deserializer};

 extern crate serde_test;
-use serde_test::{Token, assert_de_tokens, assert_de_tokens_error};
+use serde_test::{assert_de_tokens, assert_de_tokens_error, Token};

 use std::borrow::Cow;

@@ -87,18 +87,18 @@ fn test_struct() {

     assert_de_tokens(
         &Borrowing {
             bs: "str",
             bb: b"bytes",
         },
         &[
-            Token::Struct { name: "Borrowing", len: 2 },
+            Token::Struct {
+                name: "Borrowing",
+                len: 2,
+            },
             Token::BorrowedStr("bs"),
             Token::BorrowedStr("str"),

             Token::BorrowedStr("bb"),
             Token::BorrowedBytes(b"bytes"),

             Token::StructEnd,
         ],
     );
@@ -115,14 +115,14 @@ fn test_cow() {
     }

     let tokens = &[
-        Token::Struct { name: "Cows", len: 2 },
+        Token::Struct {
+            name: "Cows",
+            len: 2,
+        },
         Token::Str("copied"),
         Token::BorrowedStr("copied"),

         Token::Str("borrowed"),
         Token::BorrowedStr("borrowed"),

         Token::StructEnd,
     ];

+248 -55
@@ -6,17 +6,18 @@
 // option. This file may not be copied, modified, or distributed
 // except according to those terms.

-#![cfg_attr(feature = "unstable", feature(into_boxed_c_str))]
-
 #[macro_use]
 extern crate serde_derive;

 use std::collections::{BTreeMap, BTreeSet, HashMap, HashSet};
 use std::net;
-use std::path::PathBuf;
-use std::time::Duration;
+use std::path::{Path, PathBuf};
+use std::time::{Duration, UNIX_EPOCH};
 use std::default::Default;
 use std::ffi::{CString, OsString};
+use std::rc::Rc;
+use std::sync::Arc;
+use std::num::Wrapping;

 #[cfg(feature = "unstable")]
 use std::ffi::CStr;
@@ -28,7 +29,7 @@ extern crate fnv;
 use self::fnv::FnvHasher;

 extern crate serde_test;
-use self::serde_test::{Token, assert_de_tokens, assert_de_tokens_error};
+use self::serde_test::{assert_de_tokens, assert_de_tokens_error, Configure, Token};

 #[macro_use]
 mod macros;
@@ -82,6 +83,26 @@ struct StructSkipAll {
     a: i32,
 }

+#[derive(PartialEq, Debug, Deserialize)]
+#[serde(default)]
+struct StructSkipDefault {
+    #[serde(skip_deserializing)]
+    a: i32,
+}
+
+#[derive(PartialEq, Debug, Deserialize)]
+#[serde(default)]
+struct StructSkipDefaultGeneric<T> {
+    #[serde(skip_deserializing)]
+    t: T,
+}
+
+impl Default for StructSkipDefault {
+    fn default() -> Self {
+        StructSkipDefault { a: 16 }
+    }
+}
+
 #[derive(PartialEq, Debug, Deserialize)]
 #[serde(deny_unknown_fields)]
 struct StructSkipAllDenyUnknown {
@@ -97,7 +118,11 @@ enum Enum {
     Unit,
     Simple(i32),
     Seq(i32, i32, i32),
-    Map { a: i32, b: i32, c: i32 },
+    Map {
+        a: i32,
+        b: i32,
+        c: i32,
+    },
 }

 #[derive(PartialEq, Debug, Deserialize)]
@@ -109,25 +134,37 @@ enum EnumSkipAll {

 //////////////////////////////////////////////////////////////////////////

-macro_rules! declare_test {
-    ($name:ident { $($value:expr => $tokens:expr,)+ }) => {
-        #[test]
-        fn $name() {
-            $(
-                // Test ser/de roundtripping
-                assert_de_tokens(&$value, $tokens);
-
-                // Test that the tokens are ignorable
-                assert_de_tokens_ignore($tokens);
-            )+
-        }
-    }
-}
-
 macro_rules! declare_tests {
+    (
+        $readable:tt
+        $($name:ident { $($value:expr => $tokens:expr,)+ })+
+    ) => {
+        $(
+            #[test]
+            fn $name() {
+                $(
+                    // Test ser/de roundtripping
+                    assert_de_tokens(&$value.$readable(), $tokens);
+
+                    // Test that the tokens are ignorable
+                    assert_de_tokens_ignore($tokens);
+                )+
+            }
+        )+
+    };
+
     ($($name:ident { $($value:expr => $tokens:expr,)+ })+) => {
         $(
-            declare_test!($name { $($value => $tokens,)+ });
+            #[test]
+            fn $name() {
+                $(
+                    // Test ser/de roundtripping
+                    assert_de_tokens(&$value, $tokens);
+
+                    // Test that the tokens are ignorable
+                    assert_de_tokens_ignore($tokens);
+                )+
+            }
         )+
     }
 }
@@ -155,13 +192,11 @@ fn assert_de_tokens_ignore(ignorable_tokens: &[Token]) {
         Token::Map { len: Some(2) },
         Token::Str("a"),
         Token::I32(1),

        Token::Str("ignored"),
-    ]
-        .into_iter()
+    ].into_iter()
         .chain(ignorable_tokens.to_vec().into_iter())
         .chain(vec![Token::MapEnd].into_iter())
         .collect();

     let mut de = serde_test::Deserializer::new(&concated_tokens);
     let base = IgnoreBase::deserialize(&mut de).unwrap();
@@ -522,7 +557,16 @@ declare_tests! {
             Token::MapEnd,
         ],
         Struct { a: 1, b: 2, c: 0 } => &[
-            Token::Struct { name: "Struct", len: 3 },
+            Token::Map { len: Some(3) },
+            Token::U32(0),
+            Token::I32(1),
+
+            Token::U32(1),
+            Token::I32(2),
+            Token::MapEnd,
+        ],
+        Struct { a: 1, b: 2, c: 0 } => &[
+            Token::Struct { name: "Struct", len: 2 },
             Token::Str("a"),
             Token::I32(1),

@@ -554,7 +598,7 @@ declare_tests! {
             Token::MapEnd,
         ],
         Struct { a: 1, b: 2, c: 0 } => &[
-            Token::Struct { name: "Struct", len: 3 },
+            Token::Struct { name: "Struct", len: 2 },
             Token::Str("a"),
             Token::I32(1),

@@ -575,7 +619,7 @@ declare_tests! {
             Token::StructEnd,
         ],
         StructSkipAll { a: 0 } => &[
-            Token::Struct { name: "StructSkipAll", len: 1 },
+            Token::Struct { name: "StructSkipAll", len: 0 },
             Token::Str("a"),
             Token::I32(1),

@@ -584,6 +628,12 @@ declare_tests! {
             Token::StructEnd,
         ],
     }
+    test_struct_skip_default {
+        StructSkipDefault { a: 16 } => &[
+            Token::Struct { name: "StructSkipDefault", len: 0 },
+            Token::StructEnd,
+        ],
+    }
     test_struct_skip_all_deny_unknown {
         StructSkipAllDenyUnknown { a: 0 } => &[
             Token::Struct { name: "StructSkipAllDenyUnknown", len: 0 },
@@ -592,7 +642,7 @@ declare_tests! {
     }
     test_struct_default {
         StructDefault { a: 50, b: "overwritten".to_string() } => &[
-            Token::Struct { name: "StructDefault", len: 1 },
+            Token::Struct { name: "StructDefault", len: 2 },
             Token::Str("a"),
             Token::I32(50),

@@ -601,7 +651,7 @@ declare_tests! {
             Token::StructEnd,
         ],
         StructDefault { a: 100, b: "default".to_string() } => &[
-            Token::Struct { name: "StructDefault", len: 0 },
+            Token::Struct { name: "StructDefault", len: 2 },
             Token::StructEnd,
         ],
     }
@@ -682,6 +732,23 @@ declare_tests! {
             Token::SeqEnd,
         ],
     }
+    test_system_time {
+        UNIX_EPOCH + Duration::new(1, 2) => &[
+            Token::Struct { name: "SystemTime", len: 2 },
+            Token::Str("secs_since_epoch"),
+            Token::U64(1),
+
+            Token::Str("nanos_since_epoch"),
+            Token::U32(2),
+            Token::StructEnd,
+        ],
+        UNIX_EPOCH + Duration::new(1, 2) => &[
+            Token::Seq { len: Some(2) },
+            Token::I64(1),
+            Token::I64(2),
+            Token::SeqEnd,
+        ],
+    }
     test_range {
         1u32..2u32 => &[
             Token::Struct { name: "Range", len: 2 },
@@ -699,16 +766,10 @@ declare_tests! {
             Token::SeqEnd,
         ],
     }
-    test_net_ipv4addr {
-        "1.2.3.4".parse::<net::Ipv4Addr>().unwrap() => &[Token::Str("1.2.3.4")],
-    }
-    test_net_ipv6addr {
-        "::1".parse::<net::Ipv6Addr>().unwrap() => &[Token::Str("::1")],
-    }
-    test_net_socketaddr {
-        "1.2.3.4:1234".parse::<net::SocketAddr>().unwrap() => &[Token::Str("1.2.3.4:1234")],
-        "1.2.3.4:1234".parse::<net::SocketAddrV4>().unwrap() => &[Token::Str("1.2.3.4:1234")],
-        "[::1]:1234".parse::<net::SocketAddrV6>().unwrap() => &[Token::Str("[::1]:1234")],
-    }
+    test_path {
+        Path::new("/usr/local/lib") => &[
+            Token::BorrowedStr("/usr/local/lib"),
+        ],
+    }
     test_path_buf {
         PathBuf::from("/usr/local/lib") => &[
@@ -720,6 +781,141 @@ declare_tests! {
             Token::Bytes(b"abc"),
         ],
     }
+    test_rc {
+        Rc::new(true) => &[
+            Token::Bool(true),
+        ],
+    }
+    test_arc {
+        Arc::new(true) => &[
+            Token::Bool(true),
+        ],
+    }
+    test_wrapping {
+        Wrapping(1usize) => &[
+            Token::U32(1),
+        ],
+        Wrapping(1usize) => &[
+            Token::U64(1),
+        ],
+    }
+}
+
+declare_tests! {
+    readable
+
+    test_net_ipv4addr_readable {
+        "1.2.3.4".parse::<net::Ipv4Addr>().unwrap() => &[Token::Str("1.2.3.4")],
+    }
+    test_net_ipv6addr_readable {
+        "::1".parse::<net::Ipv6Addr>().unwrap() => &[Token::Str("::1")],
+    }
+    test_net_ipaddr_readable {
+        "1.2.3.4".parse::<net::IpAddr>().unwrap() => &[Token::Str("1.2.3.4")],
+    }
+    test_net_socketaddr_readable {
+        "1.2.3.4:1234".parse::<net::SocketAddr>().unwrap() => &[Token::Str("1.2.3.4:1234")],
+        "1.2.3.4:1234".parse::<net::SocketAddrV4>().unwrap() => &[Token::Str("1.2.3.4:1234")],
+        "[::1]:1234".parse::<net::SocketAddrV6>().unwrap() => &[Token::Str("[::1]:1234")],
+    }
+}
+
+declare_tests! {
+    compact
+
+    test_net_ipv4addr_compact {
+        net::Ipv4Addr::from(*b"1234") => &seq![
+            Token::Tuple { len: 4 },
+            seq b"1234".iter().map(|&b| Token::U8(b)),
+            Token::TupleEnd
+        ],
+    }
+    test_net_ipv6addr_compact {
+        net::Ipv6Addr::from(*b"1234567890123456") => &seq![
+            Token::Tuple { len: 4 },
+            seq b"1234567890123456".iter().map(|&b| Token::U8(b)),
+            Token::TupleEnd
+        ],
+    }
+    test_net_ipaddr_compact {
+        net::IpAddr::from(*b"1234") => &seq![
+            Token::NewtypeVariant { name: "IpAddr", variant: "V4" },
+
+            Token::Tuple { len: 4 },
+            seq b"1234".iter().map(|&b| Token::U8(b)),
+            Token::TupleEnd
+        ],
+    }
+    test_net_socketaddr_compact {
+        net::SocketAddr::from((*b"1234567890123456", 1234)) => &seq![
+            Token::NewtypeVariant { name: "SocketAddr", variant: "V6" },
+
+            Token::Tuple { len: 2 },
+
+            Token::Tuple { len: 16 },
+            seq b"1234567890123456".iter().map(|&b| Token::U8(b)),
+            Token::TupleEnd,
+
+            Token::U16(1234),
+            Token::TupleEnd
+        ],
+        net::SocketAddr::from((*b"1234", 1234)) => &seq![
+            Token::NewtypeVariant { name: "SocketAddr", variant: "V4" },
+
+            Token::Tuple { len: 2 },
+
+            Token::Tuple { len: 4 },
+            seq b"1234".iter().map(|&b| Token::U8(b)),
+            Token::TupleEnd,
+
+            Token::U16(1234),
+            Token::TupleEnd
+        ],
+        net::SocketAddrV4::new(net::Ipv4Addr::from(*b"1234"), 1234) => &seq![
+            Token::Tuple { len: 2 },
+
+            Token::Tuple { len: 4 },
+            seq b"1234".iter().map(|&b| Token::U8(b)),
+            Token::TupleEnd,
+
+            Token::U16(1234),
+            Token::TupleEnd
+        ],
+        net::SocketAddrV6::new(net::Ipv6Addr::from(*b"1234567890123456"), 1234, 0, 0) => &seq![
+            Token::Tuple { len: 2 },
+
+            Token::Tuple { len: 16 },
+            seq b"1234567890123456".iter().map(|&b| Token::U8(b)),
+            Token::TupleEnd,
+
+            Token::U16(1234),
+            Token::TupleEnd
+        ],
+    }
+}
+
+#[cfg(feature = "unstable")]
+declare_tests! {
+    test_rc_dst {
+        Rc::<str>::from("s") => &[
+            Token::Str("s"),
+        ],
+        Rc::<[bool]>::from(&[true][..]) => &[
+            Token::Seq { len: Some(1) },
+            Token::Bool(true),
+            Token::SeqEnd,
+        ],
+    }
+    test_arc_dst {
+        Arc::<str>::from("s") => &[
+            Token::Str("s"),
+        ],
+        Arc::<[bool]>::from(&[true][..]) => &[
+            Token::Seq { len: Some(1) },
+            Token::Bool(true),
+            Token::SeqEnd,
+        ],
+    }
 }

 #[cfg(unix)]
@@ -771,15 +967,6 @@ fn test_cstr() {
     );
 }

-#[cfg(feature = "unstable")]
-#[test]
-fn test_net_ipaddr() {
-    assert_de_tokens(
-        &"1.2.3.4".parse::<net::IpAddr>().unwrap(),
-        &[Token::Str("1.2.3.4")],
-    );
-}
-
 #[cfg(feature = "unstable")]
 #[test]
 fn test_cstr_internal_null() {
@@ -801,7 +988,7 @@ fn test_cstr_internal_null_end() {
 declare_error_tests! {
     test_unknown_field<StructDenyUnknown> {
         &[
-            Token::Struct { name: "StructDenyUnknown", len: 2 },
+            Token::Struct { name: "StructDenyUnknown", len: 1 },
             Token::Str("a"),
             Token::I32(0),

@@ -811,14 +998,14 @@ declare_error_tests! {
     }
     test_skipped_field_is_unknown<StructDenyUnknown> {
         &[
-            Token::Struct { name: "StructDenyUnknown", len: 2 },
+            Token::Struct { name: "StructDenyUnknown", len: 1 },
             Token::Str("b"),
         ],
         "unknown field `b`, expected `a`",
     }
     test_skip_all_deny_unknown<StructSkipAllDenyUnknown> {
         &[
-            Token::Struct { name: "StructSkipAllDenyUnknown", len: 1 },
+            Token::Struct { name: "StructSkipAllDenyUnknown", len: 0 },
             Token::Str("a"),
         ],
         "unknown field `a`, there are no fields",
@@ -1021,4 +1208,10 @@ declare_error_tests! {
         ],
         "invalid type: sequence, expected unit struct UnitStruct",
     }
+    test_wrapping_overflow<Wrapping<u16>> {
+        &[
+            Token::U32(65_536),
+        ],
+        "invalid value: integer `65536`, expected u16",
+    }
 }
+271 -41
@@ -10,12 +10,9 @@
 // successfully when there are a variety of generics and non-(de)serializable
 // types involved.

+#![deny(warnings)]
 #![cfg_attr(feature = "unstable", feature(non_ascii_idents))]

-// Clippy false positive
-// https://github.com/Manishearth/rust-clippy/issues/292
-#![cfg_attr(feature = "cargo-clippy", allow(needless_lifetimes))]
-
 #[macro_use]
 extern crate serde_derive;

@@ -25,6 +22,7 @@ use self::serde::de::{DeserializeOwned, Deserializer};

 use std::borrow::Cow;
 use std::marker::PhantomData;
+use std::option::Option as StdOption;
 use std::result::Result as StdResult;

 // Try to trip up the generated code if it fails to use fully qualified paths.
@@ -34,6 +32,12 @@ struct Result;
 struct Ok;
 #[allow(dead_code)]
 struct Err;
+#[allow(dead_code)]
+struct Option;
+#[allow(dead_code)]
+struct Some;
+#[allow(dead_code)]
+struct None;

 //////////////////////////////////////////////////////////////////////////

@@ -42,7 +46,7 @@ fn test_gen() {
     #[derive(Serialize, Deserialize)]
     struct With<T> {
         t: T,
-        #[serde(serialize_with="ser_x", deserialize_with="de_x")]
+        #[serde(serialize_with = "ser_x", deserialize_with = "de_x")]
         x: X,
     }
     assert::<With<i32>>();
@@ -50,7 +54,7 @@ fn test_gen() {
     #[derive(Serialize, Deserialize)]
    struct WithTogether<T> {
         t: T,
-        #[serde(with="both_x")]
+        #[serde(with = "both_x")]
         x: X,
     }
     assert::<WithTogether<i32>>();
@@ -58,8 +62,8 @@ fn test_gen() {
     #[derive(Serialize, Deserialize)]
     struct WithRef<'a, T: 'a> {
         #[serde(skip_deserializing)]
-        t: Option<&'a T>,
-        #[serde(serialize_with="ser_x", deserialize_with="de_x")]
+        t: StdOption<&'a T>,
+        #[serde(serialize_with = "ser_x", deserialize_with = "de_x")]
         x: X,
     }
     assert::<WithRef<i32>>();
@@ -79,9 +83,9 @@ fn test_gen() {
     #[derive(Serialize, Deserialize)]
     struct NoBounds<T> {
         t: T,
-        option: Option<T>,
+        option: StdOption<T>,
         boxed: Box<T>,
-        option_boxed: Option<Box<T>>,
+        option_boxed: StdOption<Box<T>>,
     }
     assert::<NoBounds<i32>>();
 
@@ -89,17 +93,17 @@ fn test_gen() {
     enum EnumWith<T> {
         Unit,
         Newtype(
-            #[serde(serialize_with="ser_x", deserialize_with="de_x")]
-            X
+            #[serde(serialize_with = "ser_x", deserialize_with = "de_x")]
+            X,
         ),
         Tuple(
             T,
-            #[serde(serialize_with="ser_x", deserialize_with="de_x")]
-            X
+            #[serde(serialize_with = "ser_x", deserialize_with = "de_x")]
+            X,
         ),
         Struct {
             t: T,
-            #[serde(serialize_with="ser_x", deserialize_with="de_x")]
+            #[serde(serialize_with = "ser_x", deserialize_with = "de_x")]
             x: X,
         },
     }
@@ -119,16 +123,16 @@ fn test_gen() {
 
     #[derive(Serialize, Deserialize)]
     struct Newtype(
-        #[serde(serialize_with="ser_x", deserialize_with="de_x")]
-        X
+        #[serde(serialize_with = "ser_x", deserialize_with = "de_x")]
+        X,
     );
     assert::<Newtype>();
 
     #[derive(Serialize, Deserialize)]
     struct Tuple<T>(
         T,
-        #[serde(serialize_with="ser_x", deserialize_with="de_x")]
-        X
+        #[serde(serialize_with = "ser_x", deserialize_with = "de_x")]
+        X,
     );
     assert::<Tuple<i32>>();
 
@@ -138,7 +142,9 @@ fn test_gen() {
             left: Box<TreeNode<D>>,
             right: Box<TreeNode<D>>,
         },
-        Leaf { data: D },
+        Leaf {
+            data: D,
+        },
     }
     assert::<TreeNode<i32>>();
 
@@ -177,35 +183,34 @@ fn test_gen() {
 
     #[derive(Serialize)]
     struct OptionStatic<'a> {
-        a: Option<&'a str>,
-        b: Option<&'static str>,
+        a: StdOption<&'a str>,
+        b: StdOption<&'static str>,
     }
     assert_ser::<OptionStatic>();
 
     #[derive(Serialize, Deserialize)]
-    #[serde(bound="D: SerializeWith + DeserializeWith")]
+    #[serde(bound = "D: SerializeWith + DeserializeWith")]
     struct WithTraits1<D, E> {
-        #[serde(serialize_with="SerializeWith::serialize_with",
-                deserialize_with="DeserializeWith::deserialize_with")]
+        #[serde(serialize_with = "SerializeWith::serialize_with",
+                deserialize_with = "DeserializeWith::deserialize_with")]
         d: D,
-        #[serde(serialize_with="SerializeWith::serialize_with",
-                deserialize_with="DeserializeWith::deserialize_with",
-                bound="E: SerializeWith + DeserializeWith")]
+        #[serde(serialize_with = "SerializeWith::serialize_with",
+                deserialize_with = "DeserializeWith::deserialize_with",
+                bound = "E: SerializeWith + DeserializeWith")]
         e: E,
     }
     assert::<WithTraits1<X, X>>();
 
     #[derive(Serialize, Deserialize)]
-    #[serde(bound(serialize="D: SerializeWith",
-                  deserialize="D: DeserializeWith"))]
+    #[serde(bound(serialize = "D: SerializeWith", deserialize = "D: DeserializeWith"))]
     struct WithTraits2<D, E> {
-        #[serde(serialize_with="SerializeWith::serialize_with",
-                deserialize_with="DeserializeWith::deserialize_with")]
+        #[serde(serialize_with = "SerializeWith::serialize_with",
+                deserialize_with = "DeserializeWith::deserialize_with")]
         d: D,
-        #[serde(serialize_with="SerializeWith::serialize_with",
-                bound(serialize="E: SerializeWith"))]
-        #[serde(deserialize_with="DeserializeWith::deserialize_with",
-                bound(deserialize="E: DeserializeWith"))]
+        #[serde(serialize_with = "SerializeWith::serialize_with",
+                bound(serialize = "E: SerializeWith"))]
+        #[serde(deserialize_with = "DeserializeWith::deserialize_with",
+                bound(deserialize = "E: DeserializeWith"))]
         e: E,
     }
     assert::<WithTraits2<X, X>>();
@@ -267,14 +272,14 @@ fn test_gen() {
     #[derive(Serialize, Deserialize)]
     struct TupleSkipAll(
         #[serde(skip_deserializing)]
-        u8
+        u8,
     );
 
     #[derive(Serialize, Deserialize)]
     #[serde(deny_unknown_fields)]
     struct TupleSkipAllDenyUnknown(
         #[serde(skip_deserializing)]
-        u8
+        u8,
     );
 
     #[derive(Serialize, Deserialize)]
@@ -302,7 +307,7 @@ fn test_gen() {
         },
         TupleSkip(
            #[serde(skip_deserializing)]
-            u8
+            u8,
         ),
     }
 
@@ -318,7 +323,7 @@ fn test_gen() {
         },
         TupleSkip(
             #[serde(skip_deserializing)]
-            u8
+            u8,
         ),
     }
 
@@ -330,6 +335,199 @@ fn test_gen() {
     struct EmptyArray {
         empty: [X; 0],
     }
 
+    enum Or<A, B> {
+        A(A),
+        B(B),
+    }
+
+    #[derive(Serialize, Deserialize)]
+    #[serde(untagged, remote = "Or")]
+    enum OrDef<A, B> {
+        #[allow(dead_code)]
+        A(A),
+        #[allow(dead_code)]
+        B(B),
+    }
+
+    struct Str<'a>(&'a str);
+
+    #[derive(Serialize, Deserialize)]
+    #[serde(remote = "Str")]
+    struct StrDef<'a>(&'a str);
+
+    #[derive(Serialize, Deserialize)]
+    struct Remote<'a> {
+        #[serde(with = "OrDef")]
+        or: Or<u8, bool>,
+        #[serde(borrow, with = "StrDef")]
+        s: Str<'a>,
+    }
+
+    #[derive(Serialize, Deserialize)]
+    enum BorrowVariant<'a> {
+        #[serde(borrow, with = "StrDef")]
+        S(Str<'a>),
+    }
+
+    mod vis {
+        pub struct S;
+
+        #[derive(Serialize, Deserialize)]
+        #[serde(remote = "S")]
+        pub struct SDef;
+    }
+
+    // This would not work if SDef::serialize / deserialize are private.
+    #[derive(Serialize, Deserialize)]
+    struct RemoteVisibility {
+        #[serde(with = "vis::SDef")]
+        s: vis::S,
+    }
+
+    #[derive(Serialize, Deserialize)]
+    enum ExternallyTaggedVariantWith {
+        #[allow(dead_code)]
+        Normal { f1: String },
+
+        #[serde(serialize_with = "ser_x")]
+        #[serde(deserialize_with = "de_x")]
+        #[allow(dead_code)]
+        Newtype(X),
+
+        #[serde(serialize_with = "serialize_some_other_variant")]
+        #[serde(deserialize_with = "deserialize_some_other_variant")]
+        #[allow(dead_code)]
+        Tuple(String, u8),
+
+        #[serde(serialize_with = "serialize_some_other_variant")]
+        #[serde(deserialize_with = "deserialize_some_other_variant")]
+        #[allow(dead_code)]
+        Struct { f1: String, f2: u8 },
+
+        #[serde(serialize_with = "serialize_some_unit_variant")]
+        #[serde(deserialize_with = "deserialize_some_unit_variant")]
+        #[allow(dead_code)]
+        Unit,
+    }
+    assert_ser::<ExternallyTaggedVariantWith>();
+
+    #[derive(Serialize, Deserialize)]
+    #[serde(tag = "t")]
+    enum InternallyTaggedVariantWith {
+        #[allow(dead_code)]
+        Normal { f1: String },
+
+        #[serde(serialize_with = "ser_x")]
+        #[serde(deserialize_with = "de_x")]
+        #[allow(dead_code)]
+        Newtype(X),
+
+        #[serde(serialize_with = "serialize_some_other_variant")]
+        #[serde(deserialize_with = "deserialize_some_other_variant")]
+        #[allow(dead_code)]
+        Struct { f1: String, f2: u8 },
+
+        #[serde(serialize_with = "serialize_some_unit_variant")]
+        #[serde(deserialize_with = "deserialize_some_unit_variant")]
+        #[allow(dead_code)]
+        Unit,
+    }
+    assert_ser::<InternallyTaggedVariantWith>();
+
+    #[derive(Serialize, Deserialize)]
+    #[serde(tag = "t", content = "c")]
+    enum AdjacentlyTaggedVariantWith {
+        #[allow(dead_code)]
+        Normal { f1: String },
+
+        #[serde(serialize_with = "ser_x")]
+        #[serde(deserialize_with = "de_x")]
+        #[allow(dead_code)]
+        Newtype(X),
+
+        #[serde(serialize_with = "serialize_some_other_variant")]
+        #[serde(deserialize_with = "deserialize_some_other_variant")]
+        #[allow(dead_code)]
+        Tuple(String, u8),
+
+        #[serde(serialize_with = "serialize_some_other_variant")]
+        #[serde(deserialize_with = "deserialize_some_other_variant")]
+        #[allow(dead_code)]
+        Struct { f1: String, f2: u8 },
+
+        #[serde(serialize_with = "serialize_some_unit_variant")]
+        #[serde(deserialize_with = "deserialize_some_unit_variant")]
+        #[allow(dead_code)]
+        Unit,
+    }
+    assert_ser::<AdjacentlyTaggedVariantWith>();
+
+    #[derive(Serialize, Deserialize)]
+    #[serde(untagged)]
+    enum UntaggedVariantWith {
+        #[allow(dead_code)]
+        Normal { f1: String },
+
+        #[serde(serialize_with = "ser_x")]
+        #[serde(deserialize_with = "de_x")]
+        #[allow(dead_code)]
+        Newtype(X),
+
+        #[serde(serialize_with = "serialize_some_other_variant")]
+        #[serde(deserialize_with = "deserialize_some_other_variant")]
+        #[allow(dead_code)]
+        Tuple(String, u8),
+
+        #[serde(serialize_with = "serialize_some_other_variant")]
+        #[serde(deserialize_with = "deserialize_some_other_variant")]
+        #[allow(dead_code)]
+        Struct { f1: String, f2: u8 },
+
+        #[serde(serialize_with = "serialize_some_unit_variant")]
+        #[serde(deserialize_with = "deserialize_some_unit_variant")]
+        #[allow(dead_code)]
+        Unit,
+    }
+    assert_ser::<UntaggedVariantWith>();
+
+    #[derive(Serialize, Deserialize)]
+    struct StaticStrStruct<'a> {
+        a: &'a str,
+        b: &'static str,
+    }
+
+    #[derive(Serialize, Deserialize)]
+    struct StaticStrTupleStruct<'a>(&'a str, &'static str);
+
+    #[derive(Serialize, Deserialize)]
+    struct StaticStrNewtypeStruct(&'static str);
+
+    #[derive(Serialize, Deserialize)]
+    enum StaticStrEnum<'a> {
+        Struct { a: &'a str, b: &'static str },
+        Tuple(&'a str, &'static str),
+        Newtype(&'static str),
+    }
+
+    #[derive(Serialize, Deserialize)]
+    struct SkippedStaticStr {
+        #[serde(skip_deserializing)]
+        skipped: &'static str,
+        other: isize,
+    }
+    assert::<SkippedStaticStr>();
+
+    macro_rules! T {
+        () => { () }
+    }
+
+    #[derive(Serialize, Deserialize)]
+    struct TypeMacro<T> {
+        mac: T!(),
+        marker: PhantomData<T>,
+    }
+    assert::<TypeMacro<X>>();
 }
 
 //////////////////////////////////////////////////////////////////////////
@@ -357,7 +555,7 @@ pub fn de_x<'de, D: Deserializer<'de>>(_: D) -> StdResult<X, D::Error> {
 }
 
 mod both_x {
-    pub use super::{ser_x as serialize, de_x as deserialize};
+    pub use super::{de_x as deserialize, ser_x as serialize};
 }
 
 impl SerializeWith for X {
@@ -371,3 +569,35 @@ impl DeserializeWith for X {
         unimplemented!()
     }
 }
+
+pub fn serialize_some_unit_variant<S>(_: S) -> StdResult<S::Ok, S::Error>
+where
+    S: Serializer,
+{
+    unimplemented!()
+}
+
+pub fn deserialize_some_unit_variant<'de, D>(_: D) -> StdResult<(), D::Error>
+where
+    D: Deserializer<'de>,
+{
+    unimplemented!()
+}
+
+pub fn serialize_some_other_variant<S>(_: &str, _: &u8, _: S) -> StdResult<S::Ok, S::Error>
+where
+    S: Serializer,
+{
+    unimplemented!()
+}
+
+pub fn deserialize_some_other_variant<'de, D>(_: D) -> StdResult<(String, u8), D::Error>
+where
+    D: Deserializer<'de>,
+{
+    unimplemented!()
+}
+
+pub fn is_zero(n: &u8) -> bool {
+    *n == 0
+}
@@ -9,10 +9,8 @@
 #[macro_use]
 extern crate serde_derive;
 
-extern crate serde;
-
 extern crate serde_test;
-use serde_test::{Token, assert_de_tokens};
+use serde_test::{assert_de_tokens, Token};
 
 #[test]
 fn test_variant_identifier() {
@@ -23,7 +21,10 @@ fn test_variant_identifier() {
         Bbb,
     }
 
+    assert_de_tokens(&V::Aaa, &[Token::U8(0)]);
+    assert_de_tokens(&V::Aaa, &[Token::U16(0)]);
     assert_de_tokens(&V::Aaa, &[Token::U32(0)]);
+    assert_de_tokens(&V::Aaa, &[Token::U64(0)]);
     assert_de_tokens(&V::Aaa, &[Token::Str("Aaa")]);
     assert_de_tokens(&V::Aaa, &[Token::Bytes(b"Aaa")]);
 }
+465 -221  File diff suppressed because it is too large
@@ -9,8 +9,6 @@
 #[macro_use]
 extern crate serde_derive;
 
-extern crate serde;
-
 mod remote {
     pub struct Unit;
 
@@ -123,7 +121,7 @@ struct UnitDef;
 #[serde(remote = "remote::PrimitivePriv")]
 struct PrimitivePrivDef(
     #[serde(getter = "remote::PrimitivePriv::get")]
-    u8
+    u8,
 );
 
 #[derive(Serialize, Deserialize)]
@@ -134,14 +132,14 @@ struct PrimitivePubDef(u8);
 #[serde(remote = "remote::NewtypePriv")]
 struct NewtypePrivDef(
     #[serde(getter = "remote::NewtypePriv::get", with = "UnitDef")]
-    remote::Unit
+    remote::Unit,
 );
 
 #[derive(Serialize, Deserialize)]
 #[serde(remote = "remote::NewtypePub")]
 struct NewtypePubDef(
     #[serde(with = "UnitDef")]
-    remote::Unit
+    remote::Unit,
 );
 
 #[derive(Serialize, Deserialize)]
@@ -150,7 +148,7 @@ struct TuplePrivDef(
     #[serde(getter = "remote::TuplePriv::first")]
     u8,
     #[serde(getter = "remote::TuplePriv::second", with = "UnitDef")]
-    remote::Unit
+    remote::Unit,
 );
 
 #[derive(Serialize, Deserialize)]
@@ -158,7 +156,7 @@ struct TuplePrivDef(
 struct TuplePubDef(
     u8,
     #[serde(with = "UnitDef")]
-    remote::Unit
+    remote::Unit,
 );
 
 #[derive(Serialize, Deserialize)]
@@ -168,7 +166,7 @@ struct StructPrivDef {
     a: u8,
 
     #[serde(getter = "remote::StructPriv::b")]
-    #[serde(with= "UnitDef")]
+    #[serde(with = "UnitDef")]
     b: remote::Unit,
 }
 
@@ -179,7 +177,7 @@ struct StructPubDef {
     a: u8,
 
     #[allow(dead_code)]
-    #[serde(with= "UnitDef")]
+    #[serde(with = "UnitDef")]
     b: remote::Unit,
 }
 
@@ -0,0 +1,49 @@
+// Copyright 2017 Serde Developers
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+extern crate serde_test;
+use self::serde_test::{assert_tokens, Configure, Token};
+
+use std::net;
+
+#[macro_use]
+#[allow(unused_macros)]
+mod macros;
+
+#[test]
+fn ip_addr_roundtrip() {
+    assert_tokens(
+        &net::IpAddr::from(*b"1234").compact(),
+        &seq![
+            Token::NewtypeVariant { name: "IpAddr", variant: "V4" },
+
+            Token::Tuple { len: 4 },
+            seq b"1234".iter().map(|&b| Token::U8(b)),
+            Token::TupleEnd,
+        ],
+    );
+}
+
+#[test]
+fn socked_addr_roundtrip() {
+    assert_tokens(
+        &net::SocketAddr::from((*b"1234567890123456", 1234)).compact(),
+        &seq![
+            Token::NewtypeVariant { name: "SocketAddr", variant: "V6" },
+
+            Token::Tuple { len: 2 },
+
+            Token::Tuple { len: 16 },
+            seq b"1234567890123456".iter().map(|&b| Token::U8(b)),
+            Token::TupleEnd,
+
+            Token::U16(1234),
+            Token::TupleEnd,
+        ],
+    );
+}
+169 -29
@@ -9,19 +9,20 @@
 #[macro_use]
 extern crate serde_derive;
 
-use std::collections::{BTreeMap, HashMap, HashSet};
+use std::collections::{BTreeMap, BTreeSet, HashMap, HashSet};
 use std::net;
 use std::path::{Path, PathBuf};
-use std::time::Duration;
+use std::time::{Duration, UNIX_EPOCH};
 use std::ffi::CString;
+use std::rc::Rc;
+use std::sync::Arc;
+use std::num::Wrapping;
 
 #[cfg(unix)]
 use std::str;
 
-extern crate serde;
-
 extern crate serde_test;
-use self::serde_test::{Token, assert_ser_tokens, assert_ser_tokens_error};
+use self::serde_test::{assert_ser_tokens, assert_ser_tokens_error, Configure, Token};
 
 extern crate fnv;
 use self::fnv::FnvHasher;
@@ -49,7 +50,10 @@ enum Enum {
     Unit,
     One(i32),
     Seq(i32, i32),
-    Map { a: i32, b: i32 },
+    Map {
+        a: i32,
+        b: i32,
+    },
     #[serde(skip_serializing)]
     SkippedUnit,
     #[serde(skip_serializing)]
@@ -57,12 +61,29 @@ enum Enum {
     #[serde(skip_serializing)]
     SkippedSeq(i32, i32),
     #[serde(skip_serializing)]
-    SkippedMap { _a: i32, _b: i32 },
+    SkippedMap {
+        _a: i32,
+        _b: i32,
+    },
 }
 
 //////////////////////////////////////////////////////////////////////////
 
 macro_rules! declare_tests {
+    (
+        $readable:tt
+        $($name:ident { $($value:expr => $tokens:expr,)+ })+
+    ) => {
+        $(
+            #[test]
+            fn $name() {
+                $(
+                    assert_ser_tokens(&$value.$readable(), $tokens);
+                )+
+            }
+        )+
+    };
+
     ($($name:ident { $($value:expr => $tokens:expr,)+ })+) => {
         $(
             #[test]
@@ -170,6 +191,17 @@ declare_tests! {
             Token::SeqEnd,
         ],
     }
+    test_btreeset {
+        BTreeSet::<isize>::new() => &[
+            Token::Seq { len: Some(0) },
+            Token::SeqEnd,
+        ],
+        btreeset![1] => &[
+            Token::Seq { len: Some(1) },
+            Token::I32(1),
+            Token::SeqEnd,
+        ],
+    }
     test_hashset {
         HashSet::<isize>::new() => &[
             Token::Seq { len: Some(0) },
@@ -319,6 +351,17 @@ declare_tests! {
             Token::StructEnd,
         ],
     }
+    test_system_time {
+        UNIX_EPOCH + Duration::new(1, 200) => &[
+            Token::Struct { name: "SystemTime", len: 2 },
+            Token::Str("secs_since_epoch"),
+            Token::U64(1),
+
+            Token::Str("nanos_since_epoch"),
+            Token::U32(200),
+            Token::StructEnd,
+        ],
+    }
     test_range {
         1u32..2u32 => &[
             Token::Struct { name: "Range", len: 2 },
@@ -330,17 +373,6 @@ declare_tests! {
             Token::StructEnd,
         ],
     }
-    test_net_ipv4addr {
-        "1.2.3.4".parse::<net::Ipv4Addr>().unwrap() => &[Token::Str("1.2.3.4")],
-    }
-    test_net_ipv6addr {
-        "::1".parse::<net::Ipv6Addr>().unwrap() => &[Token::Str("::1")],
-    }
-    test_net_socketaddr {
-        "1.2.3.4:1234".parse::<net::SocketAddr>().unwrap() => &[Token::Str("1.2.3.4:1234")],
-        "1.2.3.4:1234".parse::<net::SocketAddrV4>().unwrap() => &[Token::Str("1.2.3.4:1234")],
-        "[::1]:1234".parse::<net::SocketAddrV6>().unwrap() => &[Token::Str("[::1]:1234")],
-    }
     test_path {
         Path::new("/usr/local/lib") => &[
             Token::Str("/usr/local/lib"),
@@ -361,15 +393,127 @@ declare_tests! {
             Token::Bytes(b"abc"),
         ],
     }
+    test_rc {
+        Rc::new(true) => &[
+            Token::Bool(true),
+        ],
+    }
+    test_arc {
+        Arc::new(true) => &[
+            Token::Bool(true),
+        ],
+    }
+    test_wrapping {
+        Wrapping(1usize) => &[
+            Token::U64(1),
+        ],
+    }
 }
 
+declare_tests! {
+    readable
+
+    test_net_ipv4addr_readable {
+        "1.2.3.4".parse::<net::Ipv4Addr>().unwrap() => &[Token::Str("1.2.3.4")],
+    }
+    test_net_ipv6addr_readable {
+        "::1".parse::<net::Ipv6Addr>().unwrap() => &[Token::Str("::1")],
+    }
+    test_net_ipaddr_readable {
+        "1.2.3.4".parse::<net::IpAddr>().unwrap() => &[Token::Str("1.2.3.4")],
+    }
+    test_net_socketaddr_readable {
+        "1.2.3.4:1234".parse::<net::SocketAddr>().unwrap() => &[Token::Str("1.2.3.4:1234")],
+        "1.2.3.4:1234".parse::<net::SocketAddrV4>().unwrap() => &[Token::Str("1.2.3.4:1234")],
+        "[::1]:1234".parse::<net::SocketAddrV6>().unwrap() => &[Token::Str("[::1]:1234")],
+    }
+}
+
+declare_tests! {
+    compact
+
+    test_net_ipv4addr_compact {
+        net::Ipv4Addr::from(*b"1234") => &seq![
+            Token::Tuple { len: 4 },
+            seq b"1234".iter().map(|&b| Token::U8(b)),
+            Token::TupleEnd,
+        ],
+    }
+    test_net_ipv6addr_compact {
+        net::Ipv6Addr::from(*b"1234567890123456") => &seq![
+            Token::Tuple { len: 16 },
+            seq b"1234567890123456".iter().map(|&b| Token::U8(b)),
+            Token::TupleEnd,
+        ],
+    }
+    test_net_ipaddr_compact {
+        net::IpAddr::from(*b"1234") => &seq![
+            Token::NewtypeVariant { name: "IpAddr", variant: "V4" },
+
+            Token::Tuple { len: 4 },
+            seq b"1234".iter().map(|&b| Token::U8(b)),
+            Token::TupleEnd,
+        ],
+    }
+    test_net_socketaddr_compact {
+        net::SocketAddr::from((*b"1234567890123456", 1234)) => &seq![
+            Token::NewtypeVariant { name: "SocketAddr", variant: "V6" },
+
+            Token::Tuple { len: 2 },
+
+            Token::Tuple { len: 16 },
+            seq b"1234567890123456".iter().map(|&b| Token::U8(b)),
+            Token::TupleEnd,
+
+            Token::U16(1234),
+            Token::TupleEnd,
+        ],
+        net::SocketAddrV4::new(net::Ipv4Addr::from(*b"1234"), 1234) => &seq![
+            Token::Tuple { len: 2 },
+
+            Token::Tuple { len: 4 },
+            seq b"1234".iter().map(|&b| Token::U8(b)),
+            Token::TupleEnd,
+
+            Token::U16(1234),
+            Token::TupleEnd,
+        ],
+        net::SocketAddrV6::new(net::Ipv6Addr::from(*b"1234567890123456"), 1234, 0, 0) => &seq![
+            Token::Tuple { len: 2 },
+
+            Token::Tuple { len: 16 },
+            seq b"1234567890123456".iter().map(|&b| Token::U8(b)),
+            Token::TupleEnd,
+
+            Token::U16(1234),
+            Token::TupleEnd,
+        ],
+    }
+}
+
+// Serde's implementation is not unstable, but the constructors are.
 #[cfg(feature = "unstable")]
-#[test]
-fn test_net_ipaddr() {
-    assert_ser_tokens(
-        &"1.2.3.4".parse::<net::IpAddr>().unwrap(),
-        &[Token::Str("1.2.3.4")],
-    );
+declare_tests! {
+    test_rc_dst {
+        Rc::<str>::from("s") => &[
+            Token::Str("s"),
+        ],
+        Rc::<[bool]>::from(&[true][..]) => &[
+            Token::Seq { len: Some(1) },
+            Token::Bool(true),
+            Token::SeqEnd,
+        ],
+    }
+    test_arc_dst {
+        Arc::<str>::from("s") => &[
+            Token::Str("s"),
+        ],
+        Arc::<[bool]>::from(&[true][..]) => &[
+            Token::Seq { len: Some(1) },
+            Token::Bool(true),
+            Token::SeqEnd,
+        ],
+    }
 }
 
 #[test]
@@ -385,11 +529,7 @@ fn test_cannot_serialize_paths() {
     let mut path_buf = PathBuf::new();
     path_buf.push(path);
 
-    assert_ser_tokens_error(
-        &path_buf,
-        &[],
-        "path contains invalid UTF-8 characters",
-    );
+    assert_ser_tokens_error(&path_buf, &[], "path contains invalid UTF-8 characters");
 }
 
 #[test]
@@ -34,6 +34,9 @@ if [ -n "${CLIPPY}" ]; then
    cd "$DIR/serde_derive"
    cargo clippy -- -Dclippy
 
+    cd "$DIR/serde_test"
+    cargo clippy -- -Dclippy
+
    cd "$DIR/test_suite"
    cargo clippy --features unstable -- -Dclippy
 
@@ -41,21 +44,24 @@ if [ -n "${CLIPPY}" ]; then
    cargo clippy -- -Dclippy
 else
    CHANNEL=nightly
+    cd "$DIR"
    cargo clean
    cd "$DIR/serde"
    channel build
    channel build --no-default-features
    channel build --no-default-features --features alloc
-    channel build --no-default-features --features collections
    channel test --features 'rc unstable'
    cd "$DIR/test_suite/deps"
    channel build
    cd "$DIR/test_suite"
    channel test --features unstable
-    cd "$DIR/test_suite/no_std"
-    channel build
+    if [ -z "${APPVEYOR}" ]; then
+        cd "$DIR/test_suite/no_std"
+        channel build
+    fi
 
    CHANNEL=beta
+    cd "$DIR"
    cargo clean
    cd "$DIR/serde"
    channel build --features rc
@@ -63,6 +69,7 @@ else
    channel test
 
    CHANNEL=stable
+    cd "$DIR"
    cargo clean
    cd "$DIR/serde"
    channel build --features rc
@@ -72,6 +79,7 @@ else
    channel test
 
    CHANNEL=1.13.0
+    cd "$DIR"
    cargo clean
    cd "$DIR/serde"
    channel build --features rc