Ocean Marketplace

🏄 Get Started

The app is built with React, Gatsby.js, TypeScript, and CSS modules, and connects to Ocean components on Rinkeby by default.

To start local development:

git clone git@github.com:oceanprotocol/market.git
cd market

npm install
npm start

This will start the development server under http://localhost:8000.

To explore the generated GraphQL data structure, fire up the accompanying GraphiQL IDE under http://localhost:8000/___graphql.

Local components with Barge

If you prefer to connect to locally running components instead of remote ones, you can spin up Barge and use a local Ganache network in another terminal before running npm start:

git clone git@github.com:oceanprotocol/barge.git
cd barge

# startup with local Ganache node
./start_ocean.sh

Barge will deploy contracts to the local Ganache node, which will take some time. At the end, the compiled artifacts need to be copied over to this project into node_modules/@oceanprotocol/contracts/artifacts. This script will do that for you:

./scripts/copy-contracts.sh

Finally, set environment variables to use this local connection in .env in the app:

# modify env variables, setting GATSBY_NETWORK="development"
cp .env.example .env

npm start

To use the app together with MetaMask, importing one of the accounts auto-generated by the Ganache container is the easiest way to have test ETH available. All of them have 100 ETH by default. Upon start, the ocean_ganache_1 container will print out the private keys of multiple accounts in its logs. Pick one of them and import it into MetaMask.

To fully test all The Graph integrations, you have to run your own local Graph node with our ocean-subgraph deployed to it. Barge does not include a local subgraph, so by default config.subgraphUri is hardcoded to the Rinkeby subgraph in our NetworkMonitor component.
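
If you do run a local Graph node with ocean-subgraph deployed to it, the URI to point to looks roughly like the sketch below; the port and subgraph name are assumptions based on graph-node defaults and may need adjusting (graph-node's default GraphQL port also clashes with the Gatsby dev server's port 8000):

// Sketch only — adjust host, port, and subgraph name to your local setup.
const localSubgraphUri =
  'http://localhost:8000/subgraphs/name/oceanprotocol/ocean-subgraph'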

Cleaning all Docker images so they are fetched fresh is often a good idea to make sure no issues are caused by old or stale images: docker system prune --all --volumes

🦑 Environment variables

The app.config.js file is set up to prioritize environment variables for setting each Ocean component endpoint. By setting environment variables, you can easily switch between the Ocean networks the app connects to, without directly modifying app.config.js.

For local development, you can use a .env file:

# modify env variables; Rinkeby is enabled by default when using this file
cp .env.example .env
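
Conceptually, app.config.js resolves each value by giving an environment variable priority over a built-in default, along these lines (a sketch only; apart from GATSBY_NETWORK the variable names are assumptions, not the actual contents of app.config.js):

// Sketch of the env-var-first pattern — not the actual contents of app.config.js.
module.exports = {
  network: process.env.GATSBY_NETWORK || 'rinkeby'
  // other endpoints follow the same pattern, e.g.
  // metadataCacheUri: process.env.GATSBY_METADATA_CACHE_URI || '<default Aquarius URI>'
}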

🦀 Data Sources

All displayed data in the app is presented around the concept of one data set, which is a combination of the items below; a rough sketch of this combined shape follows the list:

  • metadata about a data set
  • the actual data set files
  • the datatoken which represents the data set
  • financial data connected to this datatoken, either a pool or a fixed rate exchange contract
  • calculations and conversions based on financial data
  • metadata about publishers
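
Loosely, that combined shape could be typed like this (a sketch only; the actual interfaces live in the app and ocean.js and differ in detail):

// Sketch only — illustrative shape, not the app's actual types.
interface AssetView {
  ddo: Record<string, unknown> // metadata (DDO) about the data set, from Aquarius
  datatokenAddress: string // the datatoken which represents the data set
  price: {
    type: 'pool' | 'exchange' // pool or fixed rate exchange contract
    value: number // calculated and converted from on-chain financial data
  }
  publisher: { accountId: string; name?: string } // publisher metadata, e.g. from 3Box
}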

All this data then comes from multiple sources:

Aquarius

All initial data sets and their metadata (DDO) are retrieved client-side at run-time from the Aquarius instance for each network. All app calls to Aquarius are done with 2 internal methods which mimic the same methods in ocean.js, but allow us:

  • to cancel requests when components get unmounted, in combination with axios
  • to hit Aquarius as early as possible without relying on any ocean.js initialization

Aquarius runs Elasticsearch under the hood, so its stored metadata can be queried with Elasticsearch queries:

import React, { ReactElement, useEffect, useState } from 'react'
import axios from 'axios'
import { QueryResult } from '@oceanprotocol/lib/dist/node/metadatacache/MetadataCache'
// Import paths for useOcean and queryMetadata depend on where this component lives.
import { useOcean } from '../../providers/Ocean'
import { queryMetadata } from '../../utils/aquarius'

const queryLatest = {
  page: 1,
  offset: 9,
  query: {
    // https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-query-string-query.html
    query_string: { query: `-isInPurgatory:true` }
  },
  sort: { created: -1 }
}

function Component(): ReactElement {
  const { config } = useOcean()
  const [result, setResult] = useState<QueryResult>()

  useEffect(() => {
    if (!config?.metadataCacheUri) return
    const source = axios.CancelToken.source()

    async function init() {
      const result = await queryMetadata(
        queryLatest,
        config.metadataCacheUri,
        source.token
      )
      setResult(result)
    }
    init()

    return () => {
      source.cancel()
    }
  }, [config?.metadataCacheUri])

  // Render something primitive from the result, e.g. the total number of hits.
  return <div>{result?.totalResults}</div>
}

For components within a single data set view, the useAsset() hook can be used, which in the background gets the respective metadata from Aquarius.

import React, { ReactElement } from 'react'
import { useAsset } from '../../../providers/Asset'

function Component(): ReactElement {
  const { ddo } = useAsset()
  // ddo is an object, so render one of its fields rather than the object itself.
  return <div>{ddo?.id}</div>
}

Ocean Protocol Subgraph

Most financial data in the market is retrieved with GraphQL from our own subgraph, rendered on top of the initial data coming from Aquarius.

The app has Apollo Client set up to query the respective subgraph based on the network. In any component this client can be used like so, passing query variables along (the poolAddress prop below is illustrative):

import React, { ReactElement } from 'react'
import { gql, useQuery } from '@apollo/client'

const query = gql`
  query PoolLiquidity($id: ID!) {
    pool(id: $id) {
      id
      totalShares
    }
  }
`

// poolAddress is illustrative: the pool contract address used as the entity id in the subgraph.
function Component({ poolAddress }: { poolAddress: string }): ReactElement {
  const { data } = useQuery(query, {
    variables: { id: poolAddress },
    pollInterval: 5000
  })
  return <div>{data?.pool.totalShares}</div>
}

3Box

Publishers can create a profile on 3Box Hub, and when one is found, it will be displayed in the app.

For this, our own 3box-proxy is used. Within the app, the utility method get3BoxProfile() can be used to get all profile info:

import React, { ReactElement, useEffect, useState } from 'react'
import axios from 'axios'
// The Profile type's export location may differ from this path.
import get3BoxProfile, { Profile } from '../../../utils/profile'

// account: the publisher's Ethereum address, passed in here for illustration.
function Component({ account }: { account: string }): ReactElement {
  const [profile, setProfile] = useState<Profile>()

  useEffect(() => {
    if (!account) return
    const source = axios.CancelToken.source()

    async function get3Box() {
      const profile = await get3BoxProfile(account, source.token)
      if (!profile) return

      setProfile(profile)
    }
    get3Box()

    return () => {
      source.cancel()
    }
  }, [account])

  return (
    <div>
      {profile?.emoji} {profile?.name}
    </div>
  )
}

Purgatory

Based on list-purgatory, some data sets get additional data. Within most components this can be done with the internal useAsset() hook, which fetches data from the market-purgatory endpoint in the background.

For asset purgatory:

import { useAsset } from '../../../providers/Asset'

function Component() {
  const { isInPurgatory, purgatoryData } = useAsset()
  return isInPurgatory ? <div>{purgatoryData.reason}</div> : null
}

For account purgatory:

import { useWeb3 } from '../../../providers/Web3'
import { useAccountPurgatory } from '../../../hooks/useAccountPurgatory'

function Component() {
  const { accountId } = useWeb3()
  const { isInPurgatory, purgatoryData } = useAccountPurgatory(accountId)
  return isInPurgatory ? <div>{purgatoryData.reason}</div> : null
}

Network Metadata

All displayed chain & network metadata is retrieved from https://chainid.network at build time and integrated into Gatsby's GraphQL layer. This data source is a community-maintained GitHub repository under ethereum-lists/chains.

Within components this metadata can be queried for under allNetworksMetadataJson. The useWeb3() hook does this in the background to expose the final networkDisplayName for use in components:

import React, { ReactElement } from 'react'
// Import path depends on the component's location.
import { useWeb3 } from '../../../providers/Web3'

export default function NetworkName(): ReactElement {
  const { networkDisplayName, isTestnet } = useWeb3()
  return (
    <>
      {networkDisplayName} {isTestnet && `(Test)`}
    </>
  )
}
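
A sketch of querying this metadata directly with Gatsby's useStaticQuery; the field names under allNetworksMetadataJson mirror the ethereum-lists/chains JSON and are assumptions here:

import { graphql, useStaticQuery } from 'gatsby'

// Sketch only — field names follow the ethereum-lists/chains schema and may
// differ from the fields actually exposed under allNetworksMetadataJson.
const networksQuery = graphql`
  query NetworksQuery {
    allNetworksMetadataJson {
      edges {
        node {
          chainId
          name
          nativeCurrency {
            symbol
          }
        }
      }
    }
  }
`

export function useNetworkMetadata() {
  const data = useStaticQuery(networksQuery)
  return data.allNetworksMetadataJson.edges.map(({ node }) => node)
}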

🎨 Storybook

TODO: this is broken for most components. See https://github.com/oceanprotocol/market/issues/128

Storybook is set up for this project and is used for UI development of components. Stories are created inside src/components/ alongside each component in the form of ComponentName.stories.tsx.
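
For illustration, a story file for a hypothetical Button component could look like this (Component Story Format; the component and its props are assumptions, not actual market components):

import React from 'react'
import Button from './Button'

// Hypothetical Button.stories.tsx — component and props are illustrative only.
export default {
  title: 'Components/Button',
  component: Button
}

export const Primary = () => <Button onClick={() => null}>Click me</Button>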

To run the Storybook server, execute in your Terminal:

npm run storybook

This will launch the Storybook UI with all stories loaded under localhost:4000.

Code Style

Code style is automatically enforced through ESLint & Prettier rules:

  • Git pre-commit hook runs prettier on staged files, set up with Husky
  • VS Code suggested extensions and settings for auto-formatting on file save
  • CI runs a linting & TypeScript typings check with npm run lint, and fails if errors are found

For running linting and auto-formatting manually, you can use from the root of the project:

# linting check, also runs TypeScript typings check
npm run lint

# auto format all files in the project with prettier, taking all configs into account
npm run format

👩‍🔬 Testing

TODO: this is broken and never runs in CI. See https://github.com/oceanprotocol/market/issues/128

The test suite for unit tests is set up with Jest as the test runner.

To run all linting and unit tests:

npm test

For local development, you can start the test runner in watch mode:

npm run test:watch
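
For illustration, a unit test next to a component could look roughly like this (a sketch using @testing-library/react; the Button component and the test file name are assumptions):

import React from 'react'
import { render, screen } from '@testing-library/react'
import Button from './Button'

// Hypothetical Button.test.tsx — the component and assertion are illustrative only.
describe('Button', () => {
  it('renders its label', () => {
    render(<Button onClick={() => null}>Click me</Button>)
    expect(screen.getByText('Click me')).toBeDefined()
  })
})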

For analyzing the generated JavaScript bundle sizes you can use:

npm run analyze

🛳 Production

To create a production build, run from the root of the project:

npm run build
# serve production build
npm run serve

⬆️ Deployment

Every branch or Pull Request is automatically deployed to multiple hosts for redundancy and emergency reasons.

A link to a preview deployment will appear under each Pull Request.

The latest deployment of the main branch is automatically aliased to market.oceanprotocol.com, where the deployment on Netlify is the current live deployment.

💖 Contributing

We welcome contributions in the form of bug reports, feature requests, code changes, or documentation improvements. Have a look at our contribution documentation for instructions and workflows.

🏛 License

Copyright 2021 Ocean Protocol Foundation Ltd.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

   http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.