Add testing documentation (#17411)
* Add testing documentation
* Implement diff filtering by regex
* Change to relative import

parent 6b64572f8d
commit 26f6ae4c7c
.github/workflows/fitness-functions.yml (new file, 28 lines)
@@ -0,0 +1,28 @@
name: Fitness Functions CI

on:
  pull_request:
    types: [assigned, opened, synchronize, reopened]

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Use Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '16'

      - name: Run fitness functions
        env:
          HEAD_REF: ${{ github.event.pull_request.head.ref }}
          BASE_REF: ${{ github.event.pull_request.base.ref }}
        run: |
          git fetch origin $HEAD_REF
          git fetch origin $BASE_REF
          git diff origin/$BASE_REF origin/$HEAD_REF -- . ':(exclude)development/fitness-functions/*' > diff
          npm run fitness-functions -- "ci" "./diff"
.husky/pre-commit (new executable file, 4 lines)
@@ -0,0 +1,4 @@
#!/usr/bin/env sh
. "$(dirname -- "$0")/_/husky.sh"

npm run fitness-functions -- "pre-commit-hook"
README.md (65 lines changed)
@@ -15,21 +15,29 @@ To learn how to contribute to the MetaMask project itself, visit our [Internal D
## Building locally

- Install [Node.js](https://nodejs.org) version 16
  - If you are using [nvm](https://github.com/nvm-sh/nvm#installing-and-updating) (recommended), running `nvm use` will automatically choose the right Node.js version for you.
- Install [Yarn v3](https://yarnpkg.com/getting-started/install)
- Install dependencies: `yarn`
- Copy the `.metamaskrc.dist` file to `.metamaskrc`
  - Replace the `INFURA_PROJECT_ID` value with your own personal [Infura Project ID](https://infura.io/docs).
  - If debugging MetaMetrics, you'll need to add a value for `SEGMENT_WRITE_KEY` ([Segment write key](https://segment.com/docs/connections/find-writekey/)); see [Developing on MetaMask - Segment](./development/README.md#segment).
  - If debugging unhandled exceptions, you'll need to add a value for `SENTRY_DSN` ([Sentry DSN](https://docs.sentry.io/product/sentry-basics/dsn-explainer/)); see [Developing on MetaMask - Sentry](./development/README.md#sentry).
  - Optionally, replace the `PASSWORD` value with your development wallet password to avoid entering it each time you open the app.
- Build the project to the `./dist/` folder with `yarn dist`.
- Optionally, you may run `yarn start` to run dev mode.

Uncompressed builds can be found in `/dist`, compressed builds can be found in `/builds` once they're built.

See the [build system readme](./development/build/README.md) for build system usage information.

## Git Hooks

To get quick feedback from our shared code quality fitness functions before committing the code, you can install our git hooks with Husky.

`$ yarn githooks:install`

You can read more about them in our [testing documentation](./docs/testing.md#fitness-functions-measuring-progress-in-code-quality-and-preventing-regressions-using-custom-git-hooks).

## Contributing

### Development builds
@@ -41,6 +49,7 @@ To start a development build (e.g. with logging and file watching) run `yarn sta

To start the [React DevTools](https://github.com/facebook/react-devtools), run `yarn devtools:react` with a development build installed in a browser. This will open in a separate window; no browser extension is required.

To start the [Redux DevTools Extension](https://github.com/reduxjs/redux-devtools/tree/main/extension):

- Install the package `remotedev-server` globally (e.g. `yarn global add remotedev-server`)
- Install the Redux Devtools extension.
- Open the Redux DevTools extension and check the "Use custom (local) server" checkbox in the Remote DevTools Settings, using the default server configuration (host `localhost`, port `8000`, secure connection checkbox unchecked).
@@ -67,8 +76,9 @@ Our e2e test suite can be run on either Firefox or Chrome.

1. **required** `yarn build:test` to create a test build.
2. Run tests, targeting the browser:
   - Firefox e2e tests can be run with `yarn test:e2e:firefox`.
   - Chrome e2e tests can be run with `yarn test:e2e:chrome`. The `chromedriver` package major version must match the major version of your local Chrome installation. If they don't match, update whichever is behind before running Chrome e2e tests.

These test scripts all support additional options, which might be helpful for debugging. Run the script with the flag `--help` to see all options.
@@ -95,25 +105,25 @@ For example, to run the `account-details` tests using Chrome, with debug logging

Whenever you change dependencies (adding, removing, or updating, either in `package.json` or `yarn.lock`), there are various files that must be kept up-to-date.

- `yarn.lock`:
  - Run `yarn` again after your changes to ensure `yarn.lock` has been properly updated.
  - Run `yarn lint:lockfile:dedupe:fix` to remove duplicate dependencies from the lockfile.
- The `allow-scripts` configuration in `package.json`
  - Run `yarn allow-scripts auto` to update the `allow-scripts` configuration automatically. This config determines whether the package's install/postinstall scripts are allowed to run. Review each new package to determine whether the install script needs to run or not, testing if necessary.
  - Unfortunately, `yarn allow-scripts auto` will behave inconsistently on different platforms. macOS and Windows users may see extraneous changes relating to optional dependencies.
- The LavaMoat policy files. The _tl;dr_ is to run `yarn lavamoat:auto` to update these files, but there can be devils in the details:
  - There are two sets of LavaMoat policy files:
    - The production LavaMoat policy files (`lavamoat/browserify/*/policy.json`), which are re-generated using `yarn lavamoat:background:auto`. Add `--help` for usage.
      - These should be regenerated whenever the production dependencies for the background change.
    - The build system LavaMoat policy file (`lavamoat/build-system/policy.json`), which is re-generated using `yarn lavamoat:build:auto`.
      - This should be regenerated whenever the dependencies used by the build system itself change.
  - Whenever you regenerate a policy file, review the changes to determine whether the access granted to each package seems appropriate.
  - Unfortunately, `yarn lavamoat:auto` will behave inconsistently on different platforms. macOS and Windows users may see extraneous changes relating to optional dependencies.
  - If you keep getting policy failures even after regenerating the policy files, try regenerating the policies after a clean install by doing:
    - `rm -rf node_modules/ && yarn && yarn lavamoat:auto`
  - Keep in mind that any kind of dynamic import or dynamic use of globals may elude LavaMoat's static analysis. Refer to the LavaMoat documentation or ask for help if you run into any issues.

## Architecture
@@ -133,6 +143,7 @@ Whenever you change dependencies (adding, removing, or updating, either in `pack

- [How to generate a visualization of this repository's development](./development/gource-viz.sh)

## Dapp Developer Resources

- [Extend MetaMask's features w/ MetaMask Snaps.](https://docs.metamask.io/guide/snaps.html)
- [Prompt your users to add and switch to a new network.](https://medium.com/metamask/connect-users-to-layer-2-networks-with-the-metamask-custom-networks-api-d0873fac51e5)
- [Change the logo that appears when your dapp connects to MetaMask.](https://docs.metamask.io/guide/defining-your-icon.html)
development/fitness-functions/check-mocha-syntax.js (new file, 44 lines)
@@ -0,0 +1,44 @@
const {
  filterDiffAdditions,
  filterDiffByFilePath,
  hasNumberOfCodeBlocksIncreased,
} = require('./shared');

function checkMochaSyntax(diff) {
  const ruleHeading = 'favor-jest-instead-of-mocha';
  const codeBlocks = [
    "import { strict as assert } from 'assert';",
    'assert.deepEqual',
    'assert.equal',
    'assert.rejects',
    'assert.strictEqual',
    'sinon.',
  ];

  console.log(`Checking ${ruleHeading}...`);

  const jsFilesExcludingE2ETests = '^(?!.*/test/e2e/).*.(js|ts|jsx)$';
  const diffByFilePath = filterDiffByFilePath(diff, jsFilesExcludingE2ETests);
  const diffAdditions = filterDiffAdditions(diffByFilePath);
  const hashmap = hasNumberOfCodeBlocksIncreased(diffAdditions, codeBlocks);

  Object.keys(hashmap).forEach((key) => {
    if (hashmap[key]) {
      console.error(`Number of occurrences of "${key}" has increased.`);
    }
  });

  if (Object.values(hashmap).includes(true)) {
    console.error(
      `...changes have not been committed.\nFor more info, see: https://github.com/MetaMask/metamask-extension/blob/develop/docs/testing.md#${ruleHeading}`,
    );
    process.exit(1);
  } else {
    console.log(
      `...number of occurrences has not increased for any code block.`,
    );
    process.exit(0);
  }
}

module.exports = { checkMochaSyntax };
development/fitness-functions/index.js (new file, 53 lines)
@@ -0,0 +1,53 @@
const fs = require('fs');
const { execSync } = require('child_process');
const { checkMochaSyntax } = require('./check-mocha-syntax');

const AUTOMATION_TYPE = Object.freeze({
  CI: 'ci',
  PRE_COMMIT_HOOK: 'pre-commit-hook',
  PRE_PUSH_HOOK: 'pre-push-hook',
});

const automationType = process.argv[2];

if (automationType === AUTOMATION_TYPE.CI) {
  const optionalArguments = process.argv.slice(3);
  if (optionalArguments.length !== 1) {
    console.error('Invalid number of arguments.');
    process.exit(1);
  }

  const diff = fs.readFileSync(optionalArguments[0], {
    encoding: 'utf8',
    flag: 'r',
  });

  checkMochaSyntax(diff);
} else if (automationType === AUTOMATION_TYPE.PRE_COMMIT_HOOK) {
  const diff = getPreCommitHookDiff();

  checkMochaSyntax(diff);
} else if (automationType === AUTOMATION_TYPE.PRE_PUSH_HOOK) {
  const diff = getPrePushHookDiff();

  checkMochaSyntax(diff);
} else {
  console.error('Invalid automation type.');
  process.exit(1);
}

function getPreCommitHookDiff() {
  return execSync(`git diff --cached HEAD`).toString().trim();
}

function getPrePushHookDiff() {
  const currentBranch = execSync(`git rev-parse --abbrev-ref HEAD`)
    .toString()
    .trim();

  return execSync(
    `git diff ${currentBranch} origin/${currentBranch} -- . ':(exclude)development/fitness-functions/'`,
  )
    .toString()
    .trim();
}
development/fitness-functions/shared.js (new file, 70 lines)
@@ -0,0 +1,70 @@
function filterDiffByFilePath(diff, regex) {
  // split by `diff --git` and remove the first element which is empty
  const diffBlocks = diff.split(`diff --git`).slice(1);

  const filteredDiff = diffBlocks
    .map((block) => block.trim())
    .filter((block) => {
      let didAPathInBlockMatchRegEx = false;

      block
        // get the first line of the block which has the paths
        .split('\n')[0]
        .trim()
        // split the two paths
        .split(' ')
        // remove `a/` and `b/` from the paths
        .map((path) => path.substring(2))
        // if at least one of the two paths matches the regex, filter the
        // corresponding diff block in
        .forEach((path) => {
          if (new RegExp(regex, 'u').test(path)) {
            didAPathInBlockMatchRegEx = true;
          }
        });

      return didAPathInBlockMatchRegEx;
    })
    // prepend `diff --git` to each block
    .map((block) => `diff --git ${block}`)
    .join('\n');

  return filteredDiff;
}

function filterDiffAdditions(diff) {
  const diffLines = diff.split('\n');

  const diffAdditionLines = diffLines.filter((line) => {
    const isAdditionLine = line.startsWith('+') && !line.startsWith('+++');

    return isAdditionLine;
  });

  return diffAdditionLines.join('\n');
}

function hasNumberOfCodeBlocksIncreased(diffFragment, codeBlocks) {
  const diffLines = diffFragment.split('\n');

  const codeBlockFound = {};

  for (const codeBlock of codeBlocks) {
    codeBlockFound[codeBlock] = false;

    for (const diffLine of diffLines) {
      if (diffLine.includes(codeBlock)) {
        codeBlockFound[codeBlock] = true;
        break;
      }
    }
  }

  return codeBlockFound;
}

module.exports = {
  filterDiffByFilePath,
  filterDiffAdditions,
  hasNumberOfCodeBlocksIncreased,
};
development/fitness-functions/shared.test.js (new file, 158 lines)
@@ -0,0 +1,158 @@
const {
  filterDiffAdditions,
  hasNumberOfCodeBlocksIncreased,
  filterDiffByFilePath,
} = require('./shared');

const generateCreateFileDiff = (filePath, content) => `
diff --git a/${filePath} b/${filePath}
new file mode 100644
index 000000000..30d74d258
--- /dev/null
+++ b/${filePath}
@@ -0,0 +1 @@
+${content}
`;

const generateModifyFilesDiff = (filePath, addition, removal) => `
diff --git a/${filePath} b/${filePath}
index 57d5de75c..808d8ba37 100644
--- a/${filePath}
+++ b/${filePath}
@@ -1,3 +1,8 @@
+${addition}
@@ -34,33 +39,4 @@
-${removal}
`;

describe('filterDiffAdditions()', () => {
  it('should return code additions in the diff', () => {
    const testFilePath = 'new-file.js';
    const testAddition = 'foo';
    const testFileDiff = generateCreateFileDiff(testFilePath, testAddition);

    const actualResult = filterDiffAdditions(testFileDiff);
    const expectedResult = `+${testAddition}`;

    expect(actualResult).toStrictEqual(expectedResult);
  });
});

describe('hasNumberOfCodeBlocksIncreased()', () => {
  it('should show which code blocks have increased', () => {
    const testDiffFragment = `
+foo
+bar
+baz`;
    const testCodeBlocks = ['code block 1', 'foo', 'baz'];

    const actualResult = hasNumberOfCodeBlocksIncreased(
      testDiffFragment,
      testCodeBlocks,
    );
    const expectedResult = { 'code block 1': false, foo: true, baz: true };

    expect(actualResult).toStrictEqual(expectedResult);
  });
});

describe('filterDiffByFilePath()', () => {
  const testFileDiff = [
    generateModifyFilesDiff('new-file.ts', 'foo', 'bar'),
    generateModifyFilesDiff('old-file.js', 'ping', 'pong'),
    generateModifyFilesDiff('old-file.jsx', 'yin', 'yang'),
  ].join('');

  it('should return the right diff for a generic matcher', () => {
    const actualResult = filterDiffByFilePath(
      testFileDiff,
      '.*/.*.(js|ts)$|.*.(js|ts)$',
    );

    expect(actualResult).toMatchInlineSnapshot(`
"diff --git a/new-file.ts b/new-file.ts
index 57d5de75c..808d8ba37 100644
--- a/new-file.ts
+++ b/new-file.ts
@@ -1,3 +1,8 @@
+foo
@@ -34,33 +39,4 @@
-bar
diff --git a/old-file.js b/old-file.js
index 57d5de75c..808d8ba37 100644
--- a/old-file.js
+++ b/old-file.js
@@ -1,3 +1,8 @@
+ping
@@ -34,33 +39,4 @@
-pong"
`);
  });

  it('should return the right diff for a specific file in any dir matcher', () => {
    const actualResult = filterDiffByFilePath(testFileDiff, '.*old-file.js$');

    expect(actualResult).toMatchInlineSnapshot(`
"diff --git a/old-file.js b/old-file.js
index 57d5de75c..808d8ba37 100644
--- a/old-file.js
+++ b/old-file.js
@@ -1,3 +1,8 @@
+ping
@@ -34,33 +39,4 @@
-pong"
`);
  });

  it('should return the right diff for a multiple file extension (OR) matcher', () => {
    const actualResult = filterDiffByFilePath(
      testFileDiff,
      '^(./)*old-file.(js|ts|jsx)$',
    );

    expect(actualResult).toMatchInlineSnapshot(`
"diff --git a/old-file.js b/old-file.js
index 57d5de75c..808d8ba37 100644
--- a/old-file.js
+++ b/old-file.js
@@ -1,3 +1,8 @@
+ping
@@ -34,33 +39,4 @@
-pong
diff --git a/old-file.jsx b/old-file.jsx
index 57d5de75c..808d8ba37 100644
--- a/old-file.jsx
+++ b/old-file.jsx
@@ -1,3 +1,8 @@
+yin
@@ -34,33 +39,4 @@
-yang"
`);
  });

  it('should return the right diff for a file name negation matcher', () => {
    const actualResult = filterDiffByFilePath(
      testFileDiff,
      '^(?!.*old-file.js$).*.[a-zA-Z]+$',
    );

    expect(actualResult).toMatchInlineSnapshot(`
"diff --git a/new-file.ts b/new-file.ts
index 57d5de75c..808d8ba37 100644
--- a/new-file.ts
+++ b/new-file.ts
@@ -1,3 +1,8 @@
+foo
@@ -34,33 +39,4 @@
-bar
diff --git a/old-file.jsx b/old-file.jsx
index 57d5de75c..808d8ba37 100644
--- a/old-file.jsx
+++ b/old-file.jsx
@@ -1,3 +1,8 @@
+yin
@@ -34,33 +39,4 @@
-yang"
`);
  });
});
docs/testing.md (new file, 212 lines)
@@ -0,0 +1,212 @@
# Automated Testing in the MetaMask Extension Repository

The purpose of this document is to summarize the testing tactics we want to align on for this repository.

In its current version, this document does not include discussions on manual testing, performance testing, or browser testing. Our thoughts on our e2e tests will be shared in a separate file (to be linked here).

To contribute, submit a PR. We encourage you to read the section about fitness functions below and include automation in your contribution.

## Why We Test

Testing is important because it helps ensure that software works as intended.

Without automated tests, bugs organically occur in production or during manual regression testing for release candidates.

This can lead to user dissatisfaction and costly rework. It also carries a big process and organizational overhead, because people have to stop what they are doing multiple times and reach out to each other to uphold correctness.

Speeding up the feedback loop from writing software to discovering bugs directly increases the productivity of the software delivery team. Apart from that, it minimizes context switching, increasing developer happiness and, indirectly, developer productivity.

Testing also teaches developers to write better and more readable code. Unit tests, for example, encourage separating functionality into smaller units that do less.

Tests are generally the best way to document code. They are better than comments because they don't grow stale: if a block of code is moved or changed, developers must remember to update the corresponding comments, whereas automated tests will break, making it evident to the developer what needs fixing.

## Best Practices & Recommendations

### Favor Jest instead of Mocha

For consistency in syntax and coverage tooling, we favor using Jest for all unit tests. That means we don't need to use Sinon for mocks and stubs or Assert for assertions, and that we'll eventually not need Mocha to run the tests. Jest replaces functionality from all these dependencies in one package.

While Mocha is a test runner that can be used for unit, integration, and end-to-end testing, Jest has been built specifically for unit testing and is much faster.

To get a sense of the syntactic differences between the two approaches, you can see this [example PR](https://github.com/MetaMask/metamask-extension/pull/17226/files).

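As a rough sketch of what that conversion looks like (the `tokenService` object and `getDisplayRate` function below are made up for illustration, not code from this repository), a Sinon/assert-style test and its Jest-only equivalent could be written as follows:

```javascript
// Hypothetical unit under test, defined inline so the example is self-contained.
const tokenService = {
  async fetchRate() {
    return 42;
  },
};

async function getDisplayRate() {
  const rate = await tokenService.fetchRate();
  return String(rate);
}

describe('getDisplayRate()', () => {
  it('returns the rate fetched from the token service', async () => {
    // jest.spyOn + mockResolvedValue replaces sinon.stub().resolves(),
    // and expect() replaces assert.equal / assert.strictEqual.
    const fetchRateMock = jest
      .spyOn(tokenService, 'fetchRate')
      .mockResolvedValue(7);

    expect(await getDisplayRate()).toBe('7');
    expect(fetchRateMock).toHaveBeenCalledTimes(1);
  });
});
```
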
### Tests Should Mirror How Software is Used

One possible way to write tests is to manipulate the internal state directly and artificially increase test coverage. Unfortunately, these tests are ephemeral and brittle because they are coupled with implementation details. In other words, if a piece of code is altered or refactored, the tests need to be substantially rewritten, and we can't conclusively determine whether or not functionality has changed.

The goal of tests is precisely to improve the quality of the implementation while keeping functionality constant. When testing the UI, the test might include inspecting the text in DOM elements and simulating clicking behaviors. [This PR includes good examples](https://github.com/MetaMask/metamask-extension/pull/17360/files) of that. Instead of changing the internal state and props of the component programmatically, the testing API encourages interaction with DOM elements. This was a [deliberate design choice](https://testing-library.com/docs/guiding-principles) for React Testing Library.

For unit tests that are not related to the UI of the extension, we can't rely on the library to direct us on how to write our tests, but the same principle can be generalized. We aim to test a unit of code for all the possible interaction scenarios. Mocks should be used sparingly, as they often make the test diverge from how the unit is used in production.

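A minimal sketch of this style, using a made-up component (and assuming the React Testing Library and `@testing-library/jest-dom` matchers already used elsewhere in our tests):

```javascript
import React, { useState } from 'react';
import { render, fireEvent } from '@testing-library/react';

// Hypothetical component, defined inline so the example is self-contained.
function RevealSecretPhrase({ phrase }) {
  const [revealed, setRevealed] = useState(false);
  return (
    <div>
      <button onClick={() => setRevealed(true)}>Reveal</button>
      <span>{revealed ? phrase : 'Hidden'}</span>
    </div>
  );
}

describe('<RevealSecretPhrase />', () => {
  it('only shows the phrase after the user clicks Reveal', () => {
    // Drive the component through the DOM, the way a user would,
    // rather than setting `revealed` or props programmatically.
    const { getByText, queryByText } = render(
      <RevealSecretPhrase phrase="correct horse battery staple" />,
    );

    expect(queryByText('correct horse battery staple')).toBeNull();

    fireEvent.click(getByText('Reveal'));

    expect(getByText('correct horse battery staple')).toBeInTheDocument();
  });
});
```
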
### The Anatomy of a Test

A good way to reason about tests is to think of their typical structure.

The first block is the SETUP, where the object of the test is instantiated or rendered. This typically includes initializing the state to be passed as an argument and mocking or stubbing functions as needed.

ACT is the trigger after which the functionality should occur. (The first part of "**When foo**, then bar")

ASSERT is the consequence of the trigger that has happened. (The second part of "When foo, **then bar**")

The TEARDOWN block typically cancels the setup by cleaning up the state to ensure test isolation (see the section "Isolation & Independence").

You can read more about [this structure here](https://github.com/goldbergyoni/nodebestpractices/blob/master/sections/testingandquality/aaa.md).

Jest provides [APIs](https://jestjs.io/docs/setup-teardown) to remove duplication on setup and teardown.

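Put together, a test following this structure might look like the sketch below (the `RecentAddressBook` class is made up for illustration):

```javascript
// Hypothetical unit under test, defined inline so the example is self-contained.
class RecentAddressBook {
  constructor(storage) {
    this.storage = storage;
  }

  add(address) {
    this.storage.push(address);
  }

  latest() {
    return this.storage[this.storage.length - 1];
  }
}

describe('RecentAddressBook', () => {
  let storage;
  let addressBook;

  // SETUP: build a fresh fixture for every test.
  beforeEach(() => {
    storage = [];
    addressBook = new RecentAddressBook(storage);
  });

  // TEARDOWN: undo whatever the setup created, keeping tests isolated.
  afterEach(() => {
    storage.length = 0;
  });

  it('returns the most recently added address', () => {
    // ACT: the trigger ("when foo").
    addressBook.add('0x0000000000000000000000000000000000000001');
    addressBook.add('0x0000000000000000000000000000000000000002');

    // ASSERT: the consequence ("then bar").
    expect(addressBook.latest()).toBe(
      '0x0000000000000000000000000000000000000002',
    );
  });
});
```
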
### Isolation & Independence

Tests should not depend on each other to pass. This can be verified by running only one test of the suite or by changing the test order. Setup and teardown steps are used to ensure this.

You can read more on the topic [here](https://github.com/goldbergyoni/nodebestpractices/blob/master/sections/testingandquality/avoid-global-test-fixture.md).

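A common source of coupling is a module-level fixture that one test mutates and a later test silently depends on; re-creating the fixture in `beforeEach` avoids it. A minimal sketch (the preferences object is made up for illustration):

```javascript
// Anti-pattern: a single object shared by every test in the file.
// const sharedPreferences = { currency: 'usd' };

describe('preferences', () => {
  let preferences;

  beforeEach(() => {
    // Each test gets its own copy, so the suite passes in any order
    // and each test also passes when run on its own.
    preferences = { currency: 'usd' };
  });

  it('can switch the display currency', () => {
    preferences.currency = 'eur';
    expect(preferences.currency).toBe('eur');
  });

  it('defaults to usd', () => {
    expect(preferences.currency).toBe('usd');
  });
});
```
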
### Manage Test Data

Test data makes up a lot of the boilerplate for tests.

To avoid duplication hell and enhance readability, it's sometimes helpful to transform the base objects into object generators that accept overrides.

```javascript
const generateGasEstimatorProps = (overrides) => ({
  maxPriorityFeePerGas: '10',
  ...overrides,
});

const generateAppState = (overrides) => ({
  networkDropdownOpen: false,
  gasIsLoading: false,
  isLoading: false,
  modal: {
    open: false,
    modalState: {
      name: null,
      props: {},
    },
    previousModalState: {
      name: null,
    },
  },
  warning: null,
  ...overrides,
});

describe('<GasEstimator />', () => {
  const generateMockState = (overrides) => ({
    appState: generateAppState(),
    ...overrides,
  });

  const defaultState = generateMockState();
  const defaultStore = configureMockStore()(defaultState);

  it('renders likely estimate', async () => {
    const props = generateGasEstimatorProps({
      maxPriorityFeePerGas: '10',
    });

    const { queryByText } = renderWithProvider(
      <GasEstimator {...props} />,
      defaultStore,
    );

    await waitFor(() => {
      expect(queryByText(/Very likely/u)).toBeInTheDocument();
    });
  });

  it('renders unlikely estimate', async () => {
    const props = generateGasEstimatorProps({
      maxPriorityFeePerGas: '100',
    });

    const { queryByText } = renderWithProvider(
      <GasEstimator {...props} />,
      defaultStore,
    );

    await waitFor(() => {
      expect(queryByText(/Very unlikely/u)).toBeInTheDocument();
    });
  });
});
```

Using this pattern, each test contains the minimum sufficient information for setting up the assertion (setting the `maxPriorityFeePerGas`).

This also means test data is not entirely reused between different tests. In each test, the relevant data is explicitly stated.

The data generation functions can be placed at the top of the file if they are used exclusively in it, or in shared files if they are reused for testing multiple components or units of the repository.

### Using Jest timers

Imperatively waiting for time to pass in a test locks up the test execution more than it needs to and can cause flakiness and timeouts on slower machines, so it should generally be avoided.

Here's an example of that:

```javascript
// Wait 5 seconds
await new Promise<void>((resolve) => setTimeout(() => resolve(), 5000));
```

Jest has a [set of timer APIs](https://jestjs.io/docs/timer-mocks) to replace that anti-pattern. From the docs:

> The native timer functions (i.e., setTimeout(), setInterval(), clearTimeout(), clearInterval()) are less than ideal for a testing environment since they depend on real-time to elapse. Jest can swap out timers with functions that allow you to control the passage of time.

Using `jest.advanceTimersByTime(msToRun)`, we can tell Jest what effect we want to simulate on the test (declarative programming) instead of having to provoke the effect (imperative programming) using the promisified `setTimeout`.

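For instance, a test for a hypothetical polling helper (made up for illustration) can simulate the passage of time declaratively:

```javascript
// Hypothetical helper, defined inline so the example is self-contained.
function pollEverySecond(callback) {
  return setInterval(callback, 1000);
}

describe('pollEverySecond()', () => {
  beforeEach(() => {
    jest.useFakeTimers();
  });

  afterEach(() => {
    jest.useRealTimers();
  });

  it('invokes the callback once per elapsed second', () => {
    const callback = jest.fn();
    const handle = pollEverySecond(callback);

    // Simulate 5 seconds passing instead of actually waiting for them.
    jest.advanceTimersByTime(5000);

    expect(callback).toHaveBeenCalledTimes(5);
    clearInterval(handle);
  });
});
```
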
Unfortunately, using these APIs with asynchronous code leaves unresolved promises, so we can use a custom `flushPromises` method to resolve them and let Jest optimize the execution speed in the background ([source](https://stackoverflow.com/questions/52177631/jest-timer-and-promise-dont-work-well-settimeout-and-async-function/58716087#58716087)):

```javascript
function flushPromises(): Promise<unknown> {
  return new Promise(jest.requireActual('timers').setImmediate);
}

async function advanceTimersByTime(ms) {
  jest.advanceTimersByTime(ms);
  await flushPromises();
}
```

Here's [an example PR](https://github.com/MetaMask/core/pull/1002/files) for this pattern in our core library.

For UI tests, React Testing Library exposes another API that relies on the same idea of abstracting the passage of time instead of explicitly managing it:

```javascript
waitFor(() => {
  expect(<element that will disappear once the DOM updates>).toBeRemoved()
})
```

## Automation

### Code Coverage Requirements

Thanks to @brad.decker, we can now see our total test coverage using [codecov](https://app.codecov.io/gh/MetaMask/metamask-extension).

Codecov's quality gate enforces that code coverage does not decrease. As per our 2023 Q1 OKR of ["Increase platform-wide reliability by expanding test coverage, and improving tooling and infrastructure"](https://docs.google.com/document/d/1dUCE9PJA0L6EMlVgG3y0dyiDrht1-MG-DhbEXp78QAs/edit#bookmark=id.s8dqtv3asm7x), we are working towards 80% code coverage.

For that to happen, simply not decreasing code coverage is not enough. Developers should progressively increase the coverage of the files they change. We don't currently have an automated way of expressing this, but it can be checked manually by using the Jest flag `--changedSince` ([read the Jest docs for more info](https://jestjs.io/docs/cli#--changedsince)). Apart from our individual contributions, during PR reviews it's OK to encourage others to continue increasing code coverage for their changes, within reason. Remember that code coverage is not a silver bullet and that the quality of our tests is just as important.

In the future, we may use other tools to ensure an upward trajectory for code coverage, such as [Sonarcloud](https://docs.sonarcloud.io/improving/new-code-definition/) or other static checking tools.

### Fitness functions: measuring progress in code quality and preventing regressions using custom git hooks

When designing software architecture changes, we need to define objectively what "better" or "quality" means.

This may sound like a philosophical problem at first, but it's quite pragmatic. Recommendations such as the ones enumerated above are only as good as they can be defined, and communication and enforcement of standards become harder as team size increases. This is especially true when contributions are varied, as in this repository, which has internal contributors from several teams as well as external contributors.

Just as we use code coverage as an imperfect yet directionally correct way to assess the state of testing, we can extend this concept to other aspects of our codebase and tests.

Widespread code coverage and static checks tools support an incremental approach to software improvement, but linting tools don't. And given the size of our codebase, it's not always realistic to make changes across the entire codebase in one PR.

One way to go about this is to define and automate [fitness functions](https://www.thoughtworks.com/radar/techniques/architectural-fitness-function), used to drive architecture changes in complex systems and measure progress against those goals. Just as important, this framework of fitness functions allows us to prevent regressions in code quality.

We use shareable git hooks with Husky to drive these incremental changes. Fitness functions are then version controlled, reviewed as part of the standard software development workflow, and permanently prevent regressions while encouraging progress.

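To make this concrete, new rules can reuse the diff helpers added in this commit (`development/fitness-functions/shared.js`) and be wired up the same way as `check-mocha-syntax.js`. The sketch below is a hypothetical rule, not one that exists in the codebase:

```javascript
const {
  filterDiffAdditions,
  filterDiffByFilePath,
  hasNumberOfCodeBlocksIncreased,
} = require('./shared');

// Hypothetical rule: fail the check when a diff adds new `console.log`
// calls to application code.
function checkNoNewConsoleLog(diff) {
  const ruleHeading = 'avoid-console-log';
  const codeBlocks = ['console.log('];

  // Only inspect added lines in app code, mirroring check-mocha-syntax.js.
  const appFilesOnly = '^(?!development/).*.(js|ts|jsx|tsx)$';
  const diffByFilePath = filterDiffByFilePath(diff, appFilesOnly);
  const diffAdditions = filterDiffAdditions(diffByFilePath);
  const hashmap = hasNumberOfCodeBlocksIncreased(diffAdditions, codeBlocks);

  if (Object.values(hashmap).includes(true)) {
    console.error(`${ruleHeading}: new console.log calls were introduced.`);
    process.exit(1);
  }
}

module.exports = { checkNoNewConsoleLog };
```
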
## References and further reading

https://github.com/MetaMask/core/issues/413

https://github.com/MetaMask/metamask-extension/pull/17056/files#r1064860162

https://github.com/testjavascript/nodejs-integration-tests-best-practices

jest.config.js
@@ -10,6 +10,7 @@ module.exports = {
    '<rootDir>/app/scripts/platforms/*.js',
    '<rootDir>/shared/**/*.(js|ts|tsx)',
    '<rootDir>/ui/**/*.(js|ts|tsx)',
    '<rootDir>/development/fitness-functions/**/*.test.(js|ts|tsx)',
  ],
  coverageDirectory: './coverage',
  coveragePathIgnorePatterns: ['.stories.js', '.snap'],
@@ -45,6 +46,7 @@ module.exports = {
    '<rootDir>/app/scripts/platforms/*.test.js',
    '<rootDir>/shared/**/*.test.(js|ts)',
    '<rootDir>/ui/**/*.test.(js|ts|tsx)',
    '<rootDir>/development/fitness-functions/**/*.test.(js|ts|tsx)',
  ],
  testTimeout: 2500,
  // We have to specify the environment we are running in, which is jsdom. The
package.json
@@ -89,7 +89,9 @@
    "ts-migration:dashboard:watch": "yarn ts-migration:dashboard:build --watch",
    "ts-migration:enumerate": "ts-node development/ts-migration-dashboard/scripts/write-list-of-files-to-convert.ts",
    "test-storybook": "test-storybook -c .storybook",
    "test-storybook:ci": "concurrently -k -s first -n \"SB,TEST\" -c \"magenta,blue\" \"yarn storybook:build && npx http-server storybook-build --port 6006 \" \"wait-on tcp:6006 && yarn test-storybook --maxWorkers=2\"",
    "githooks:install": "husky install",
    "fitness-functions": "node development/fitness-functions/index.js"
  },
  "resolutions": {
    "analytics-node/axios": "^0.21.2",
@@ -467,6 +469,7 @@
    "gulp-watch": "^5.0.1",
    "gulp-zip": "^5.1.0",
    "history": "^5.0.0",
    "husky": "^8.0.3",
    "improved-yarn-audit": "^3.0.0",
    "ini": "^3.0.0",
    "istanbul-lib-coverage": "^3.2.0",
yarn.lock (10 lines changed)
@@ -19387,6 +19387,15 @@ __metadata:
  languageName: node
  linkType: hard

"husky@npm:^8.0.3":
  version: 8.0.3
  resolution: "husky@npm:8.0.3"
  bin:
    husky: lib/bin.js
  checksum: 837bc7e4413e58c1f2946d38fb050f5d7324c6f16b0fd66411ffce5703b294bd21429e8ba58711cd331951ee86ed529c5be4f76805959ff668a337dbfa82a1b0
  languageName: node
  linkType: hard

"hyphenate-style-name@npm:^1.0.3":
  version: 1.0.3
  resolution: "hyphenate-style-name@npm:1.0.3"
@@ -24346,6 +24355,7 @@ __metadata:
    gulp-zip: ^5.1.0
    history: ^5.0.0
    human-standard-token-abi: ^2.0.0
    husky: ^8.0.3
    immer: ^9.0.6
    improved-yarn-audit: ^3.0.0
    ini: ^3.0.0