fix typos #1574

Closed · wants to merge 4 commits
4 changes: 2 additions & 2 deletions docs/overview.md
@@ -31,10 +31,10 @@ If this is the intended way of usage, take a look at the [`pallet-ethereum`](../

An Ethereum-based blockchain can use the pre-block feeding strategy to migrate to Substrate.
In the post-block generation model, the Ethereum block is generated *after* runtime execution.
- In the pre-block feeding model, the Ethereum block is feeded in *before* runtime execution.
+ In the pre-block feeding model, the Ethereum block is fed in *before* runtime execution.

A blockchain can first use pre-block feeding with empty extrinsic requirement.
- In this way, because no other external information is feeded, combined with a suitable consensus engine, one Ethereum block will have an exact corresponding Substrate block.
+ In this way, because no other external information is fed, combined with a suitable consensus engine, one Ethereum block will have an exact corresponding Substrate block.
This is called the [wrapper block](https://corepaper.org/substrate/wrapper/) strategy, and it allows Frontier to function as a normal Ethereum client.

With a sufficient number of the network running a Frontier node, the blockchain can then initiate a hard fork, allowing extrinsic to be added in.
4 changes: 2 additions & 2 deletions frame/evm/precompile/bls12377/src/lib.rs
@@ -590,7 +590,7 @@ impl Bls12377MapG1 {

impl Precompile for Bls12377MapG1 {
/// Implements EIP-2539 Map_To_G1 precompile.
- /// > Field-to-curve call expects `64` bytes an an input that is interpreted as a an element of the base field.
+ /// > Field-to-curve call expects `64` bytes an input that is interpreted as an element of the base field.
/// > Output of this call is `128` bytes and is G1 point following respective encoding rules.
fn execute(handle: &mut impl PrecompileHandle) -> PrecompileResult {
handle.record_cost(Bls12377MapG1::GAS_COST)?;
@@ -629,7 +629,7 @@ impl Bls12377MapG2 {

impl Precompile for Bls12377MapG2 {
/// Implements EIP-2539 Map_FP2_TO_G2 precompile logic.
- /// > Field-to-curve call expects `128` bytes an an input that is interpreted as a an element of the quadratic extension field.
+ /// > Field-to-curve call expects `128` bytes an input that is interpreted as an element of the quadratic extension field.
/// > Output of this call is `256` bytes and is G2 point following respective encoding rules.
fn execute(handle: &mut impl PrecompileHandle) -> PrecompileResult {
handle.record_cost(Bls12377MapG2::GAS_COST)?;
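
For illustration only, here is a minimal sketch of the input-length expectations the two doc comments above describe: MAP_TO_G1 takes a single 64-byte base-field element and MAP_FP2_TO_G2 a single 128-byte quadratic-extension element. `check_map_input` is a hypothetical helper, not part of the pallet.

```rust
// Hypothetical helper, not the pallet's code: validates the raw input length
// before it would be interpreted as a field element per EIP-2539.
fn check_map_input(input: &[u8], expected_len: usize) -> Result<(), String> {
    if input.len() != expected_len {
        return Err(format!(
            "expected {} input bytes, got {}",
            expected_len,
            input.len()
        ));
    }
    Ok(())
}

fn main() {
    assert!(check_map_input(&[0u8; 64], 64).is_ok()); // MAP_TO_G1 input
    assert!(check_map_input(&[0u8; 128], 128).is_ok()); // MAP_FP2_TO_G2 input
    assert!(check_map_input(&[0u8; 63], 64).is_err());
}
```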
2 changes: 1 addition & 1 deletion precompiles/src/solidity/codec/mod.rs
@@ -37,7 +37,7 @@ pub use native::{Address, BoundedVec};
// derive macro
pub use precompile_utils_macro::Codec;

- /// Data that can be encoded/encoded followiong the Solidity ABI Specification.
+ /// Data that can be encoded/encoded following the Solidity ABI Specification.
pub trait Codec: Sized {
fn read(reader: &mut Reader) -> MayRevert<Self>;
fn write(writer: &mut Writer, value: Self);
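
As a side note, a minimal sketch of what "following the Solidity ABI Specification" means for a simple static type: values are laid out as left-padded 32-byte words. This is illustrative only and does not use the crate's `Reader`/`Writer` machinery; `abi_encode_u64` is a hypothetical helper.

```rust
// Illustrative only: a u64 occupies one 32-byte word, left-padded with zeros,
// under the Solidity ABI. The crate's real `Codec` impls go through
// `Reader`/`Writer` instead of returning raw arrays.
fn abi_encode_u64(value: u64) -> [u8; 32] {
    let mut word = [0u8; 32];
    word[24..].copy_from_slice(&value.to_be_bytes());
    word
}

fn main() {
    let word = abi_encode_u64(42);
    assert!(word[..24].iter().all(|b| *b == 0)); // padding
    assert_eq!(word[31], 42); // value in the least-significant byte
}
```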
2 changes: 1 addition & 1 deletion precompiles/src/solidity/codec/native.rs
@@ -322,7 +322,7 @@ impl<T: Codec, S: Get<u32>> Codec for BoundedVec<T, S> {

for inner in value {
// Any offset in items are relative to the start of the item instead of the
- // start of the array. However if there is offseted data it must but appended after
+ // start of the array. However if there is offsetted data it must but appended after
Contributor:

AFAIK, the more widely accepted form is offset rather than offsetted.

Contributor Author:

> AFAIK, the more widely accepted form is offset rather than offsetted.

I'll redo it in the next PR

// all items (offsets) are written. We thus need to rely on `compute_offsets` to do
// that, and must store a "shift" to correct the offsets.
let shift = inner_writer.data.len();
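
To make the comment's reasoning concrete, here is a small, self-contained sketch of the offset arithmetic it describes (a hypothetical helper, not the crate's implementation): each item's offset must point past the head of 32-byte offset slots plus the data of every item already appended, which is exactly the "shift" being tracked.

```rust
// Hypothetical illustration of the shift-corrected offsets for a dynamic array
// of dynamic items under the Solidity ABI: offsets are measured from the start
// of the array payload, so each one equals the head size plus the total length
// of the item payloads written before it.
fn compute_item_offsets(item_payload_lens: &[usize]) -> Vec<usize> {
    let head_len = item_payload_lens.len() * 32; // one 32-byte offset slot per item
    let mut offsets = Vec::with_capacity(item_payload_lens.len());
    let mut shift = head_len; // item data begins right after the head
    for len in item_payload_lens {
        offsets.push(shift);
        shift += len;
    }
    offsets
}

fn main() {
    // Three items whose encoded payloads are 64, 32 and 96 bytes long.
    assert_eq!(compute_item_offsets(&[64, 32, 96]), vec![96, 160, 192]);
}
```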