diff --git a/.gitbook.yaml b/.gitbook.yaml
new file mode 100644
index 00000000..fb7a797b
--- /dev/null
+++ b/.gitbook.yaml
@@ -0,0 +1,65 @@
+root: ./
+
+redirects:
+  readme/metamask-setup: discover/wallets/metamask-setup
+  readme/wallets: discover/wallets
+  readme/wallets-and-ocean-tokens: discover/wallets-and-ocean-tokens
+  core-concepts: developers
+  core-concepts/architecture: developers/architecture
+  core-concepts/datanft-and-datatoken: developers/contracts/datanft-and-datatoken
+  core-concepts/roles: developers/contracts/roles
+  core-concepts/networks: discover/networks
+  core-concepts/networks/bridges: discover/networks/bridges
+  core-concepts/fees: developers/contracts/fees
+  core-concepts/asset-pricing: developers/contracts/pricing-schemas
+  core-concepts/did-ddo: developers/identifiers
+  using-ocean-market: user-guides/using-ocean-market
+  using-ocean-market/marketplace-publish-data-asset: user-guides/publish-data-nfts
+  using-ocean-market/marketplace-download-data-asset: user-guides/buy-data-nfts
+  using-ocean-market/asset-hosting: user-guides/asset-hosting
+  using-ocean-market/remove-liquidity-using-etherscan: user-guides/remove-liquidity-pools
+  building-with-ocean: developers
+  building-with-ocean/build-a-marketplace: developers/build-a-marketplace
+  building-with-ocean/build-a-marketplace/forking-ocean-market: developers/build-a-marketplace/forking-ocean-market
+  building-with-ocean/build-a-marketplace/customising-your-market: developers/build-a-marketplace/customising-your-market
+  building-with-ocean/build-a-marketplace/deploying-market: developers/build-a-marketplace/deploying-market
+  building-with-ocean/using-ocean-libraries: developers/ocean.js
+  building-with-ocean/using-ocean-libraries/configuration: developers/ocean.js/configuration
+  building-with-ocean/using-ocean-libraries/creating_dataNFT: developers/ocean.js/creating-datanft
+  building-with-ocean/using-ocean-libraries/create-datatoken-with-fixed-pricing: developers/ocean.js/publish
+  building-with-ocean/using-ocean-libraries/mint-datatoken: developers/ocean.js/mint-datatoken
+  building-with-ocean/using-ocean-libraries/update-metadata: developers/ocean.js/update-metadata
+  building-with-ocean/compute-to-data: developers/compute-to-data
+  building-with-ocean/compute-to-data/compute-to-data-architecture: developers/compute-to-data/compute-to-data-architecture
+  building-with-ocean/compute-to-data/compute-to-data-datasets-algorithms: developers/compute-to-data/compute-to-data-datasets-algorithms
+  building-with-ocean/compute-to-data/compute-to-data-algorithms: developers/compute-to-data/compute-to-data-algorithms
+  building-with-ocean/compute-to-data/compute-to-data-minikube: infrastructure/compute-to-data-minikube
+  building-with-ocean/compute-to-data/compute-to-data-docker-registry: infrastructure/compute-to-data-docker-registry
+  building-with-ocean/compute-to-data/user-defined-parameters: developers/compute-to-data/compute-options
+  building-with-ocean/deploying-components: infrastructure
+  building-with-ocean/deploying-components/setup-server: infrastructure/setup-server
+  building-with-ocean/deploying-components/deploying-ocean-subgraph: infrastructure/deploying-ocean-subgraph
+  building-with-ocean/deploying-components/deploying-marketplace: infrastructure/deploying-marketplace
+  building-with-ocean/deploying-components/deploying-aquarius: infrastructure/deploying-aquarius
+  building-with-ocean/deploying-components/deploying-provider: infrastructure/deploying-provider
+  building-with-ocean/using-ocean-subgraph: developers/subgraph
+  building-with-ocean/using-ocean-subgraph/list-data-nfts: developers/subgraph/list-data-nfts
+  building-with-ocean/using-ocean-subgraph/list-datatokens: developers/subgraph/list-datatokens
+  building-with-ocean/using-ocean-subgraph/get-data-nft-information: developers/subgraph/get-data-nft-information
+  building-with-ocean/using-ocean-subgraph/get-datatoken-information: developers/subgraph/get-datatoken-information
+  building-with-ocean/using-ocean-subgraph/list-fixed-rate-exchanges: developers/subgraph/list-fixed-rate-exchanges
+  building-with-ocean/using-ocean-subgraph/deploying-ocean-subgraph: infrastructure/deploying-components/deploying-ocean-subgraph
+  building-with-ocean/contributing: contribute
+  building-with-ocean/contributing/code-of-conduct: contribute/code-of-conduct
+  building-with-ocean/contributing/legal-reqs: contribute/legal-reqs
+  building-with-ocean/projects-using-ocean: contribute/projects-using-ocean
+  veocean-data-farming: rewards
+  veocean-data-farming/veocean: rewards/veocean
+  veocean-data-farming/df-intro: rewards/df-intro
+  veocean-data-farming/df-background: rewards/df-max-out-yield#a-brief-history-of-data-farming
+  veocean-data-farming/emissions-apys: rewards/df-emissions-apys
+  veocean-data-farming/delegation: rewards#delegation
+  rewards/veOcean-Data-Farming-Tutorial: user-guides/get-started-df
+  api-references: developers
+  api-references/aquarius-rest-api: aquarius/asset-requests
+  api-references/provider-rest-api: developers/provider/general-endpoints
diff --git a/.gitbook/assets/architecture.png b/.gitbook/assets/architecture.png
deleted file mode 100644
index 97084f85..00000000
Binary files a/.gitbook/assets/architecture.png and /dev/null differ
diff --git a/.gitbook/assets/architecture/DataNFT&Datatokens.png b/.gitbook/assets/architecture/DataNFT&Datatokens.png
new file mode 100644
index 00000000..b33f6cff
Binary files /dev/null and b/.gitbook/assets/architecture/DataNFT&Datatokens.png differ
diff --git a/.gitbook/assets/architecture/Ocean101.png b/.gitbook/assets/architecture/Ocean101.png
new file mode 100644
index 00000000..5a430327
Binary files /dev/null and b/.gitbook/assets/architecture/Ocean101.png differ
diff --git a/.gitbook/assets/architecture/architecture_overview.png b/.gitbook/assets/architecture/architecture_overview.png
new file mode 100644
index 00000000..fa75be2b
Binary files /dev/null and b/.gitbook/assets/architecture/architecture_overview.png differ
diff --git a/.gitbook/assets/architecture/datanfts_and_datatokens_flow.png b/.gitbook/assets/architecture/datanfts_and_datatokens_flow.png
new file mode 100644
index 00000000..2b2d85a3
Binary files /dev/null and b/.gitbook/assets/architecture/datanfts_and_datatokens_flow.png differ
diff --git a/.gitbook/assets/architecture/decentralized_exchanges_marketplaces.png b/.gitbook/assets/architecture/decentralized_exchanges_marketplaces.png
new file mode 100644
index 00000000..d8947336
Binary files /dev/null and b/.gitbook/assets/architecture/decentralized_exchanges_marketplaces.png differ
diff --git a/.gitbook/assets/architecture/publish_and_retrieve_ddos.png b/.gitbook/assets/architecture/publish_and_retrieve_ddos.png
new file mode 100644
index 00000000..4328e865
Binary files /dev/null and b/.gitbook/assets/architecture/publish_and_retrieve_ddos.png differ
diff --git a/.gitbook/assets/blowfish b/.gitbook/assets/blowfish
deleted file mode 100644
index 742b5c49..00000000
Binary files a/.gitbook/assets/blowfish and /dev/null differ
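The `redirects` block in the new `.gitbook.yaml` above is a flat map from each legacy docs path (`readme/*`, `core-concepts/*`, `using-ocean-market/*`, `building-with-ocean/*`, `veocean-data-farming/*`, `api-references/*`) to its post-restructure location. A few targets carry a `#fragment` (for example `rewards#delegation`), so the old URL lands on a specific heading rather than a whole page. A minimal sketch of how such a lookup behaves, using entries copied from the map above; the `resolve` helper is hypothetical and is not GitBook's API:

```python
# Illustration only: GitBook applies .gitbook.yaml redirects internally.
# `resolve` is a hypothetical helper showing the old-path -> new-path lookup.

REDIRECTS = {
    # Entries copied from the .gitbook.yaml diff above.
    "core-concepts/fees": "developers/contracts/fees",
    "building-with-ocean/using-ocean-libraries": "developers/ocean.js",
    # Some targets include an anchor so the old page lands on a heading.
    "veocean-data-farming/delegation": "rewards#delegation",
}

def resolve(path: str) -> str:
    """Return the redirect target for a legacy path, or the path unchanged."""
    return REDIRECTS.get(path.strip("/"), path)

print(resolve("core-concepts/fees"))               # developers/contracts/fees
print(resolve("veocean-data-farming/delegation"))  # rewards#delegation
print(resolve("developers/ocean.js"))              # unchanged: already a new path
```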
diff --git a/.gitbook/assets/c2d/Set-a-price-algo.png b/.gitbook/assets/c2d/Set-a-price-algo.png
new file mode 100644
index 00000000..540af226
Binary files /dev/null and b/.gitbook/assets/c2d/Set-a-price-algo.png differ
diff --git a/.gitbook/assets/c2d/Sign-transactions.png b/.gitbook/assets/c2d/Sign-transactions.png
new file mode 100644
index 00000000..348ccf43
Binary files /dev/null and b/.gitbook/assets/c2d/Sign-transactions.png differ
diff --git a/.gitbook/assets/c2d/Submit-compute-settings.png b/.gitbook/assets/c2d/Submit-compute-settings.png
new file mode 100644
index 00000000..3e424eaa
Binary files /dev/null and b/.gitbook/assets/c2d/Submit-compute-settings.png differ
diff --git a/.gitbook/assets/c2d/algo-asset.png b/.gitbook/assets/c2d/algo-asset.png
new file mode 100644
index 00000000..8dcb5aad
Binary files /dev/null and b/.gitbook/assets/c2d/algo-asset.png differ
diff --git a/.gitbook/assets/c2d/algorithm-privacy.png b/.gitbook/assets/c2d/algorithm-privacy.png
new file mode 100644
index 00000000..54069c2a
Binary files /dev/null and b/.gitbook/assets/c2d/algorithm-privacy.png differ
diff --git a/.gitbook/assets/c2d/buy-compute-job.png b/.gitbook/assets/c2d/buy-compute-job.png
new file mode 100644
index 00000000..c2c360e5
Binary files /dev/null and b/.gitbook/assets/c2d/buy-compute-job.png differ
diff --git a/.gitbook/assets/c2d/c2d_compute_job.png b/.gitbook/assets/c2d/c2d_compute_job.png
new file mode 100644
index 00000000..6c3c4e3d
Binary files /dev/null and b/.gitbook/assets/c2d/c2d_compute_job.png differ
diff --git a/.gitbook/assets/compute-to-data-parameters-publish-algorithm.png b/.gitbook/assets/c2d/compute-to-data-parameters-publish-algorithm.png
similarity index 100%
rename from .gitbook/assets/compute-to-data-parameters-publish-algorithm.png
rename to .gitbook/assets/c2d/compute-to-data-parameters-publish-algorithm.png
diff --git a/.gitbook/assets/compute-to-data-parameters-publish-dataset.png b/.gitbook/assets/c2d/compute-to-data-parameters-publish-dataset.png
similarity index 100%
rename from .gitbook/assets/compute-to-data-parameters-publish-dataset.png
rename to .gitbook/assets/c2d/compute-to-data-parameters-publish-dataset.png
diff --git a/.gitbook/assets/c2d/data-nft-c2d-preview.png b/.gitbook/assets/c2d/data-nft-c2d-preview.png
new file mode 100644
index 00000000..68216e5e
Binary files /dev/null and b/.gitbook/assets/c2d/data-nft-c2d-preview.png differ
diff --git a/.gitbook/assets/c2d/dataset-compute-option.png b/.gitbook/assets/c2d/dataset-compute-option.png
new file mode 100644
index 00000000..a9f98314
Binary files /dev/null and b/.gitbook/assets/c2d/dataset-compute-option.png differ
diff --git a/.gitbook/assets/c2d/dataset-default-option.png b/.gitbook/assets/c2d/dataset-default-option.png
new file mode 100644
index 00000000..bf25c247
Binary files /dev/null and b/.gitbook/assets/c2d/dataset-default-option.png differ
diff --git a/.gitbook/assets/c2d/docker-image.png b/.gitbook/assets/c2d/docker-image.png
new file mode 100644
index 00000000..1e5c1a17
Binary files /dev/null and b/.gitbook/assets/c2d/docker-image.png differ
diff --git a/.gitbook/assets/c2d/double-check-work.png b/.gitbook/assets/c2d/double-check-work.png
new file mode 100644
index 00000000..cbd39efc
Binary files /dev/null and b/.gitbook/assets/c2d/double-check-work.png differ
diff --git a/.gitbook/assets/c2d/edit-asset-link.png b/.gitbook/assets/c2d/edit-asset-link.png
new file mode 100644
index 00000000..c988720c
Binary files /dev/null and b/.gitbook/assets/c2d/edit-asset-link.png differ
diff --git a/.gitbook/assets/c2d/edit-compute-settings.png b/.gitbook/assets/c2d/edit-compute-settings.png
new file mode 100644
index 00000000..a2022e8e
Binary files /dev/null and b/.gitbook/assets/c2d/edit-compute-settings.png differ
diff --git a/.gitbook/assets/c2d/preview-publish.png b/.gitbook/assets/c2d/preview-publish.png
new file mode 100644
index 00000000..c0c36272
Binary files /dev/null and b/.gitbook/assets/c2d/preview-publish.png differ
diff --git a/.gitbook/assets/c2d/publish.png b/.gitbook/assets/c2d/publish.png
new file mode 100644
index 00000000..bf3a381d
Binary files /dev/null and b/.gitbook/assets/c2d/publish.png differ
diff --git a/.gitbook/assets/c2d/select-algorithm-for-compute.png b/.gitbook/assets/c2d/select-algorithm-for-compute.png
new file mode 100644
index 00000000..e6b1b07f
Binary files /dev/null and b/.gitbook/assets/c2d/select-algorithm-for-compute.png differ
diff --git a/.gitbook/assets/components/aquarius.png b/.gitbook/assets/components/aquarius.png
new file mode 100644
index 00000000..79d41d38
Binary files /dev/null and b/.gitbook/assets/components/aquarius.png differ
diff --git a/.gitbook/assets/components/aquarius_deployment.jpg b/.gitbook/assets/components/aquarius_deployment.jpg
new file mode 100644
index 00000000..5ba73df5
Binary files /dev/null and b/.gitbook/assets/components/aquarius_deployment.jpg differ
diff --git a/.gitbook/assets/components/barge.png b/.gitbook/assets/components/barge.png
new file mode 100644
index 00000000..d1e26cd6
Binary files /dev/null and b/.gitbook/assets/components/barge.png differ
diff --git a/.gitbook/assets/components/ocean_py.png b/.gitbook/assets/components/ocean_py.png
new file mode 100644
index 00000000..9becf0ac
Binary files /dev/null and b/.gitbook/assets/components/ocean_py.png differ
diff --git a/.gitbook/assets/components/provider.png b/.gitbook/assets/components/provider.png
new file mode 100644
index 00000000..70bf5777
Binary files /dev/null and b/.gitbook/assets/components/provider.png differ
diff --git a/.gitbook/assets/components/subgraph.png b/.gitbook/assets/components/subgraph.png
new file mode 100644
index 00000000..c8a3dff2
Binary files /dev/null and b/.gitbook/assets/components/subgraph.png differ
diff --git a/.gitbook/assets/connect-wallet.png b/.gitbook/assets/connect-wallet.png
deleted file mode 100644
index cf71156a..00000000
Binary files a/.gitbook/assets/connect-wallet.png and /dev/null differ
diff --git a/.gitbook/assets/contracts/pricing_schemas.png b/.gitbook/assets/contracts/pricing_schemas.png
new file mode 100644
index 00000000..88a64781
Binary files /dev/null and b/.gitbook/assets/contracts/pricing_schemas.png differ
diff --git a/.gitbook/assets/contracts/publish_detailed_flow.png b/.gitbook/assets/contracts/publish_detailed_flow.png
new file mode 100644
index 00000000..a876df90
Binary files /dev/null and b/.gitbook/assets/contracts/publish_detailed_flow.png differ
diff --git a/.gitbook/assets/contracts/roles_datatokens_level.png b/.gitbook/assets/contracts/roles_datatokens_level.png
new file mode 100644
index 00000000..0c5ef5ae
Binary files /dev/null and b/.gitbook/assets/contracts/roles_datatokens_level.png differ
diff --git a/.gitbook/assets/contracts/roles_nft_level.png b/.gitbook/assets/contracts/roles_nft_level.png
new file mode 100644
index 00000000..4593a88d
Binary files /dev/null and b/.gitbook/assets/contracts/roles_nft_level.png differ
diff --git a/.gitbook/assets/contracts/smart-contracts.png b/.gitbook/assets/contracts/smart-contracts.png
new file mode 100644
index 00000000..9af02ac2
Binary files /dev/null and b/.gitbook/assets/contracts/smart-contracts.png differ
diff --git a/.gitbook/assets/contracts/v4_contracts_overview.png b/.gitbook/assets/contracts/v4_contracts_overview.png
new file mode 100644
index 00000000..a4aa4e02
Binary files /dev/null and b/.gitbook/assets/contracts/v4_contracts_overview.png differ
diff --git a/.gitbook/assets/cover/contribute_banner.png b/.gitbook/assets/cover/contribute_banner.png
new file mode 100644
index 00000000..b627c8d5
Binary files /dev/null and b/.gitbook/assets/cover/contribute_banner.png differ
diff --git a/.gitbook/assets/cover/contribute_card.png b/.gitbook/assets/cover/contribute_card.png
new file mode 100644
index 00000000..4ee76eef
Binary files /dev/null and b/.gitbook/assets/cover/contribute_card.png differ
diff --git a/.gitbook/assets/cover/data_science_banner.png b/.gitbook/assets/cover/data_science_banner.png
new file mode 100644
index 00000000..3eb30dcb
Binary files /dev/null and b/.gitbook/assets/cover/data_science_banner.png differ
diff --git a/.gitbook/assets/cover/data_science_card.png b/.gitbook/assets/cover/data_science_card.png
new file mode 100644
index 00000000..862b97e8
Binary files /dev/null and b/.gitbook/assets/cover/data_science_card.png differ
diff --git a/.gitbook/assets/cover/defi_banner.png b/.gitbook/assets/cover/defi_banner.png
new file mode 100644
index 00000000..a453ebfe
Binary files /dev/null and b/.gitbook/assets/cover/defi_banner.png differ
diff --git a/.gitbook/assets/cover/defi_card.png b/.gitbook/assets/cover/defi_card.png
new file mode 100644
index 00000000..c1f097bf
Binary files /dev/null and b/.gitbook/assets/cover/defi_card.png differ
diff --git a/.gitbook/assets/cover/developer_banner.png b/.gitbook/assets/cover/developer_banner.png
new file mode 100644
index 00000000..cc039eae
Binary files /dev/null and b/.gitbook/assets/cover/developer_banner.png differ
diff --git a/.gitbook/assets/cover/developer_card.png b/.gitbook/assets/cover/developer_card.png
new file mode 100644
index 00000000..f0f5ebe9
Binary files /dev/null and b/.gitbook/assets/cover/developer_card.png differ
diff --git a/.gitbook/assets/cover/discover_banner.png b/.gitbook/assets/cover/discover_banner.png
new file mode 100644
index 00000000..00b9d1e9
Binary files /dev/null and b/.gitbook/assets/cover/discover_banner.png differ
diff --git a/.gitbook/assets/cover/discover_card.png b/.gitbook/assets/cover/discover_card.png
new file mode 100644
index 00000000..b41976df
Binary files /dev/null and b/.gitbook/assets/cover/discover_card.png differ
diff --git a/.gitbook/assets/cover/docs_banner.png b/.gitbook/assets/cover/docs_banner.png
new file mode 100644
index 00000000..427abfb6
Binary files /dev/null and b/.gitbook/assets/cover/docs_banner.png differ
diff --git a/.gitbook/assets/cover/infrastructure_banner.png b/.gitbook/assets/cover/infrastructure_banner.png
new file mode 100644
index 00000000..a2568603
Binary files /dev/null and b/.gitbook/assets/cover/infrastructure_banner.png differ
diff --git a/.gitbook/assets/cover/infrastructure_card.png b/.gitbook/assets/cover/infrastructure_card.png
new file mode 100644
index 00000000..395fb229
Binary files /dev/null and b/.gitbook/assets/cover/infrastructure_card.png differ
diff --git a/.gitbook/assets/cover/rewards_banner.png b/.gitbook/assets/cover/rewards_banner.png
new file mode 100644
index 00000000..daa78170
Binary files /dev/null and b/.gitbook/assets/cover/rewards_banner.png differ
diff --git a/.gitbook/assets/cover/rewards_card.png b/.gitbook/assets/cover/rewards_card.png
new file mode 100644
index 00000000..ce298ae4
Binary files /dev/null and b/.gitbook/assets/cover/rewards_card.png differ
diff --git a/.gitbook/assets/cover/user_guides_banner.png b/.gitbook/assets/cover/user_guides_banner.png
new file mode 100644
index 00000000..db2d6197
Binary files /dev/null and b/.gitbook/assets/cover/user_guides_banner.png differ
diff --git a/.gitbook/assets/cover/user_guides_card.png b/.gitbook/assets/cover/user_guides_card.png
new file mode 100644
index 00000000..ec1321c6
Binary files /dev/null and b/.gitbook/assets/cover/user_guides_card.png differ
diff --git a/.gitbook/assets/datanft-and-datatoken.png b/.gitbook/assets/datanft-and-datatoken.png
deleted file mode 100644
index 1457d0d2..00000000
Binary files a/.gitbook/assets/datanft-and-datatoken.png and /dev/null differ
diff --git a/.gitbook/assets/ddo flow b/.gitbook/assets/ddo flow
deleted file mode 100644
index fab9a027..00000000
Binary files a/.gitbook/assets/ddo flow and /dev/null differ
diff --git a/.gitbook/assets/deployment/image (1).png b/.gitbook/assets/deployment/image (1).png
new file mode 100644
index 00000000..8c49613f
Binary files /dev/null and b/.gitbook/assets/deployment/image (1).png differ
diff --git a/.gitbook/assets/deployment/image (2).png b/.gitbook/assets/deployment/image (2).png
new file mode 100644
index 00000000..12de50b0
Binary files /dev/null and b/.gitbook/assets/deployment/image (2).png differ
diff --git a/.gitbook/assets/deployment/image (3).png b/.gitbook/assets/deployment/image (3).png
new file mode 100644
index 00000000..8c49613f
Binary files /dev/null and b/.gitbook/assets/deployment/image (3).png differ
diff --git a/.gitbook/assets/deployment/image (4).png b/.gitbook/assets/deployment/image (4).png
new file mode 100644
index 00000000..8c49613f
Binary files /dev/null and b/.gitbook/assets/deployment/image (4).png differ
diff --git a/.gitbook/assets/deployment/image (5).png b/.gitbook/assets/deployment/image (5).png
new file mode 100644
index 00000000..9722765b
Binary files /dev/null and b/.gitbook/assets/deployment/image (5).png differ
diff --git a/.gitbook/assets/deployment/image (6).png b/.gitbook/assets/deployment/image (6).png
new file mode 100644
index 00000000..570dafdd
Binary files /dev/null and b/.gitbook/assets/deployment/image (6).png differ
diff --git a/.gitbook/assets/deployment/image.png b/.gitbook/assets/deployment/image.png
new file mode 100644
index 00000000..61594cef
Binary files /dev/null and b/.gitbook/assets/deployment/image.png differ
diff --git a/.gitbook/assets/dynamic asset pricing b/.gitbook/assets/dynamic asset pricing
deleted file mode 100644
index 156114e3..00000000
Binary files a/.gitbook/assets/dynamic asset pricing and /dev/null differ
diff --git a/.gitbook/assets/feature-compute@2x.webp b/.gitbook/assets/feature-compute@2x.webp
deleted file mode 100644
index 09d216a8..00000000
Binary files a/.gitbook/assets/feature-compute@2x.webp and /dev/null differ
diff --git a/.gitbook/assets/feature-datascience@2x.webp b/.gitbook/assets/feature-datascience@2x.webp
deleted file mode 100644
index f8b8923b..00000000
Binary files a/.gitbook/assets/feature-datascience@2x.webp and /dev/null differ
diff --git a/.gitbook/assets/feature-marketplaces@2x.webp b/.gitbook/assets/feature-marketplaces@2x.webp
deleted file mode 100644
index 9548f4cb..00000000
Binary files a/.gitbook/assets/feature-marketplaces@2x.webp and /dev/null differ
diff --git a/.gitbook/assets/fixed-asset-pricing.png b/.gitbook/assets/fixed-asset-pricing.png
deleted file mode 100644
index cc55112a..00000000
Binary files a/.gitbook/assets/fixed-asset-pricing.png and /dev/null differ
diff --git a/.gitbook/assets/free-asset-pricing.png b/.gitbook/assets/free-asset-pricing.png
deleted file mode 100644
index 8876bc31..00000000
Binary files a/.gitbook/assets/free-asset-pricing.png and /dev/null differ
diff --git a/.gitbook/assets/general/dao.jpeg b/.gitbook/assets/general/dao.jpeg
new file mode 100644
index 00000000..6c291bc3
Binary files /dev/null and b/.gitbook/assets/general/dao.jpeg differ
diff --git a/.gitbook/assets/general/developers.png b/.gitbook/assets/general/developers.png
new file mode 100644
index 00000000..3f6a86bf
Binary files /dev/null and b/.gitbook/assets/general/developers.png differ
diff --git a/.gitbook/assets/general/explore_ocean.png b/.gitbook/assets/general/explore_ocean.png
new file mode 100644
index 00000000..f2140ae8
Binary files /dev/null and b/.gitbook/assets/general/explore_ocean.png differ
diff --git a/.gitbook/assets/gif/200.webp b/.gitbook/assets/gif/200.webp
new file mode 100644
index 00000000..b978dd0a
Binary files /dev/null and b/.gitbook/assets/gif/200.webp differ
diff --git a/.gitbook/assets/gif/anchorman-teamwork.gif b/.gitbook/assets/gif/anchorman-teamwork.gif
new file mode 100644
index 00000000..19ec6d2c
Binary files /dev/null and b/.gitbook/assets/gif/anchorman-teamwork.gif differ
diff --git a/.gitbook/assets/gif/big-money.gif b/.gitbook/assets/gif/big-money.gif
new file mode 100644
index 00000000..9276d391
Binary files /dev/null and b/.gitbook/assets/gif/big-money.gif differ
diff --git a/.gitbook/assets/gif/cash-flow.gif b/.gitbook/assets/gif/cash-flow.gif
new file mode 100644
index 00000000..89d495c4
Binary files /dev/null and b/.gitbook/assets/gif/cash-flow.gif differ
diff --git a/.gitbook/assets/gif/clueless-shopping.gif b/.gitbook/assets/gif/clueless-shopping.gif
new file mode 100644
index 00000000..67146b0a
Binary files /dev/null and b/.gitbook/assets/gif/clueless-shopping.gif differ
diff --git a/.gitbook/assets/gif/data_everywhere.gif b/.gitbook/assets/gif/data_everywhere.gif
new file mode 100644
index 00000000..e24abcfb
Binary files /dev/null and b/.gitbook/assets/gif/data_everywhere.gif differ
diff --git a/.gitbook/assets/gif/drew-barrymore-notes.gif b/.gitbook/assets/gif/drew-barrymore-notes.gif
new file mode 100644
index 00000000..3e8e3b40
Binary files /dev/null and b/.gitbook/assets/gif/drew-barrymore-notes.gif differ
diff --git a/.gitbook/assets/gif/farming.gif b/.gitbook/assets/gif/farming.gif
new file mode 100644
index 00000000..d7d59599
Binary files /dev/null and b/.gitbook/assets/gif/farming.gif differ
diff --git a/.gitbook/assets/gif/follow-instructions.gif b/.gitbook/assets/gif/follow-instructions.gif
new file mode 100644
index 00000000..306837a3
Binary files /dev/null and b/.gitbook/assets/gif/follow-instructions.gif differ
diff --git a/.gitbook/assets/gif/giphy.gif b/.gitbook/assets/gif/giphy.gif
new file mode 100644
index 00000000..a5e74195
Binary files /dev/null and b/.gitbook/assets/gif/giphy.gif differ
diff --git a/.gitbook/assets/gif/giphy.webp b/.gitbook/assets/gif/giphy.webp
new file mode 100644
index 00000000..21682362
Binary files /dev/null and b/.gitbook/assets/gif/giphy.webp differ
diff --git a/.gitbook/assets/gif/hustlin.gif b/.gitbook/assets/gif/hustlin.gif
new file mode 100644
index 00000000..6a2e918e
Binary files /dev/null and b/.gitbook/assets/gif/hustlin.gif differ
diff --git a/.gitbook/assets/gif/i-know-kung-fu.gif b/.gitbook/assets/gif/i-know-kung-fu.gif
new file mode 100644
index 00000000..0d2c674f
Binary files /dev/null and b/.gitbook/assets/gif/i-know-kung-fu.gif differ
diff --git a/.gitbook/assets/gif/just-publish.gif b/.gitbook/assets/gif/just-publish.gif
new file mode 100644
index 00000000..dfa0b1e3
Binary files /dev/null and b/.gitbook/assets/gif/just-publish.gif differ
diff --git a/.gitbook/assets/gif/kermit-typing.gif b/.gitbook/assets/gif/kermit-typing.gif
new file mode 100644
index 00000000..4e099165
Binary files /dev/null and b/.gitbook/assets/gif/kermit-typing.gif differ
diff --git a/.gitbook/assets/gif/like-a-boss.gif b/.gitbook/assets/gif/like-a-boss.gif
new file mode 100644
index 00000000..3cf63a3a
Binary files /dev/null and b/.gitbook/assets/gif/like-a-boss.gif differ
diff --git a/.gitbook/assets/gif/matrix-code.gif b/.gitbook/assets/gif/matrix-code.gif
new file mode 100644
index 00000000..9f3356ae
Binary files /dev/null and b/.gitbook/assets/gif/matrix-code.gif differ
diff --git a/.gitbook/assets/gif/morpheus-taunting.gif b/.gitbook/assets/gif/morpheus-taunting.gif
new file mode 100644
index 00000000..2211fb35
Binary files /dev/null and b/.gitbook/assets/gif/morpheus-taunting.gif differ
diff --git a/.gitbook/assets/gif/morpheus.gif b/.gitbook/assets/gif/morpheus.gif
new file mode 100644
index 00000000..343bb67a
Binary files /dev/null and b/.gitbook/assets/gif/morpheus.gif differ
diff --git a/.gitbook/assets/gif/my-data.gif b/.gitbook/assets/gif/my-data.gif
new file mode 100644
index 00000000..f2dd42fb
Binary files /dev/null and b/.gitbook/assets/gif/my-data.gif differ
diff --git a/.gitbook/assets/gif/neo-bb.gif b/.gitbook/assets/gif/neo-bb.gif
new file mode 100644
index 00000000..067f696e
Binary files /dev/null and b/.gitbook/assets/gif/neo-bb.gif differ
diff --git a/.gitbook/assets/gif/neo-blocking.gif b/.gitbook/assets/gif/neo-blocking.gif
new file mode 100644
index 00000000..4260200f
Binary files /dev/null and b/.gitbook/assets/gif/neo-blocking.gif differ
diff --git a/.gitbook/assets/gif/neo-kinda-martial-arts.gif b/.gitbook/assets/gif/neo-kinda-martial-arts.gif
new file mode 100644
index 00000000..31c92293
Binary files /dev/null and b/.gitbook/assets/gif/neo-kinda-martial-arts.gif differ
diff --git a/.gitbook/assets/gif/passive-income.gif b/.gitbook/assets/gif/passive-income.gif
new file mode 100644
index 00000000..b776a466
Binary files /dev/null and b/.gitbook/assets/gif/passive-income.gif differ
diff --git a/.gitbook/assets/gif/shopping-minions.gif b/.gitbook/assets/gif/shopping-minions.gif
new file mode 100644
index 00000000..d49c7876
Binary files /dev/null and b/.gitbook/assets/gif/shopping-minions.gif differ
diff --git a/.gitbook/assets/gif/sponge-money.gif b/.gitbook/assets/gif/sponge-money.gif
new file mode 100644
index 00000000..c2507e64
Binary files /dev/null and b/.gitbook/assets/gif/sponge-money.gif differ
diff --git a/.gitbook/assets/gif/super-mario-coins.gif b/.gitbook/assets/gif/super-mario-coins.gif
new file mode 100644
index 00000000..e53e987a
Binary files /dev/null and b/.gitbook/assets/gif/super-mario-coins.gif differ
diff --git a/.gitbook/assets/gif/talk-data-to-me.gif b/.gitbook/assets/gif/talk-data-to-me.gif
new file mode 100644
index 00000000..c147240a
Binary files /dev/null and b/.gitbook/assets/gif/talk-data-to-me.gif differ
diff --git a/.gitbook/assets/gif/tell-me-more.gif b/.gitbook/assets/gif/tell-me-more.gif
new file mode 100644
index 00000000..ff542c91
Binary files /dev/null and b/.gitbook/assets/gif/tell-me-more.gif differ
diff --git a/.gitbook/assets/gif/the-algorithm.gif b/.gitbook/assets/gif/the-algorithm.gif
new file mode 100644
index 00000000..4df0b339
Binary files /dev/null and b/.gitbook/assets/gif/the-algorithm.gif differ
diff --git a/.gitbook/assets/gif/to-the-computer.gif b/.gitbook/assets/gif/to-the-computer.gif
new file mode 100644
index 00000000..76a6210b
Binary files /dev/null and b/.gitbook/assets/gif/to-the-computer.gif differ
diff --git a/.gitbook/assets/gif/underwater-treasure.gif b/.gitbook/assets/gif/underwater-treasure.gif
new file mode 100644
index 00000000..6cc01470
Binary files /dev/null and b/.gitbook/assets/gif/underwater-treasure.gif differ
diff --git a/.gitbook/assets/gif/welcome-to-my-dojo.gif b/.gitbook/assets/gif/welcome-to-my-dojo.gif
new file mode 100644
index 00000000..309ebcb0
Binary files /dev/null and b/.gitbook/assets/gif/welcome-to-my-dojo.gif differ
diff --git a/.gitbook/assets/gif/whats-a-wallet.gif b/.gitbook/assets/gif/whats-a-wallet.gif
new file mode 100644
index 00000000..11d7ce42
Binary files /dev/null and b/.gitbook/assets/gif/whats-a-wallet.gif differ
diff --git a/.gitbook/assets/hosting/Raw-URL.png b/.gitbook/assets/hosting/Raw-URL.png
new file mode 100644
index 00000000..f2713090
Binary files /dev/null and b/.gitbook/assets/hosting/Raw-URL.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.52.29.png b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.52.29.png
new file mode 100644
index 00000000..7da50512
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.52.29.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.54.21.png b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.54.21.png
new file mode 100644
index 00000000..a92499ee
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.54.21.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.55.16.png b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.55.16.png
new file mode 100644
index 00000000..aa1dac52
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.55.16.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.56.34.png b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.56.34.png
new file mode 100644
index 00000000..2b937b34
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.56.34.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.58.29.png b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.58.29.png
new file mode 100644
index 00000000..21fbda19
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.58.29.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-15 at 16.08.42.png b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 16.08.42.png
new file mode 100644
index 00000000..981af7a3
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 16.08.42.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-15 at 16.12.10.png b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 16.12.10.png
new file mode 100644
index 00000000..e378dd55
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 16.12.10.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-15 at 16.26.56.png b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 16.26.56.png
new file mode 100644
index 00000000..3d094c1c
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 16.26.56.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.50.27.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.50.27.png
new file mode 100644
index 00000000..e42698b7
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.50.27.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.51.14.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.51.14.png
new file mode 100644
index 00000000..6feb1f77
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.51.14.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.54.29.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.54.29.png
new file mode 100644
index 00000000..fbbeb05a
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.54.29.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.56.01.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.56.01.png
new file mode 100644
index 00000000..7c7b728c
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.56.01.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.58.20.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.58.20.png
new file mode 100644
index 00000000..6ba9d12e
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.58.20.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.59.38.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.59.38.png
new file mode 100644
index 00000000..f2215469
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.59.38.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.02.25.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.02.25.png
new file mode 100644
index 00000000..bbc86bb3
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.02.25.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.05.41.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.05.41.png
new file mode 100644
index 00000000..39af555d
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.05.41.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.08.12.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.08.12.png
new file mode 100644
index 00000000..b2545bba
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.08.12.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.15.45.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.15.45.png
new file mode 100644
index 00000000..7d71185e
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.15.45.png differ
diff --git a/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.16.46.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.16.46.png
new file mode 100644
index 00000000..1768146f
Binary files /dev/null and b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.16.46.png differ
diff --git a/.gitbook/assets/arweave-1.png b/.gitbook/assets/hosting/arweave-1.png
similarity index 100%
rename from .gitbook/assets/arweave-1.png
rename to .gitbook/assets/hosting/arweave-1.png
diff --git a/.gitbook/assets/arweave-2.png b/.gitbook/assets/hosting/arweave-2.png
similarity index 100%
rename from .gitbook/assets/arweave-2.png
rename to .gitbook/assets/hosting/arweave-2.png
diff --git a/.gitbook/assets/arweave-3.png b/.gitbook/assets/hosting/arweave-3.png
similarity index 100%
rename from .gitbook/assets/arweave-3.png
rename to .gitbook/assets/hosting/arweave-3.png
diff --git a/.gitbook/assets/arweave-4.png b/.gitbook/assets/hosting/arweave-4.png
similarity index 100%
rename from .gitbook/assets/arweave-4.png
rename to .gitbook/assets/hosting/arweave-4.png
diff --git a/using-ocean-market/images/hosting-services/aws-1.png b/.gitbook/assets/hosting/aws-1.png
similarity index 100%
rename from using-ocean-market/images/hosting-services/aws-1.png
rename to .gitbook/assets/hosting/aws-1.png
diff --git a/using-ocean-market/images/hosting-services/aws-10.png b/.gitbook/assets/hosting/aws-10.png
similarity index 100%
rename from using-ocean-market/images/hosting-services/aws-10.png
rename to .gitbook/assets/hosting/aws-10.png
diff --git a/using-ocean-market/images/hosting-services/aws-11.png b/.gitbook/assets/hosting/aws-11.png
similarity index 100%
rename from using-ocean-market/images/hosting-services/aws-11.png
rename to .gitbook/assets/hosting/aws-11.png
diff --git a/using-ocean-market/images/hosting-services/aws-12.png b/.gitbook/assets/hosting/aws-12.png
similarity index 100%
rename from using-ocean-market/images/hosting-services/aws-12.png
rename to .gitbook/assets/hosting/aws-12.png
diff --git a/using-ocean-market/images/hosting-services/aws-2.png b/.gitbook/assets/hosting/aws-2.png
similarity index 100%
rename from using-ocean-market/images/hosting-services/aws-2.png
rename to .gitbook/assets/hosting/aws-2.png
diff --git a/using-ocean-market/images/hosting-services/aws-3.png b/.gitbook/assets/hosting/aws-3.png
similarity index 100%
rename from using-ocean-market/images/hosting-services/aws-3.png
rename to .gitbook/assets/hosting/aws-3.png
diff --git a/using-ocean-market/images/hosting-services/aws-4.png b/.gitbook/assets/hosting/aws-4.png
similarity index 100%
rename from using-ocean-market/images/hosting-services/aws-4.png
rename to .gitbook/assets/hosting/aws-4.png
diff --git a/using-ocean-market/images/hosting-services/aws-5.png b/.gitbook/assets/hosting/aws-5.png
similarity index 100%
rename from using-ocean-market/images/hosting-services/aws-5.png
rename to .gitbook/assets/hosting/aws-5.png
diff --git a/using-ocean-market/images/hosting-services/aws-6.png b/.gitbook/assets/hosting/aws-6.png
similarity index 100%
rename from using-ocean-market/images/hosting-services/aws-6.png
rename to .gitbook/assets/hosting/aws-6.png
diff --git a/using-ocean-market/images/hosting-services/aws-7.png b/.gitbook/assets/hosting/aws-7.png
similarity index 100%
rename from using-ocean-market/images/hosting-services/aws-7.png
rename to .gitbook/assets/hosting/aws-7.png
diff --git a/using-ocean-market/images/hosting-services/aws-8.png b/.gitbook/assets/hosting/aws-8.png
similarity index 100%
rename from using-ocean-market/images/hosting-services/aws-8.png
rename to .gitbook/assets/hosting/aws-8.png
diff --git a/using-ocean-market/images/hosting-services/aws-9.png b/.gitbook/assets/hosting/aws-9.png
similarity index 100%
rename from using-ocean-market/images/hosting-services/aws-9.png
rename to .gitbook/assets/hosting/aws-9.png
diff --git a/building-with-ocean/images/marketplace/publish/azure-1.png b/.gitbook/assets/hosting/azure1.png
similarity index 100%
rename from building-with-ocean/images/marketplace/publish/azure-1.png
rename to .gitbook/assets/hosting/azure1.png
diff --git a/using-ocean-market/images/hosting-services/azure-10.png b/.gitbook/assets/hosting/azure10.png
similarity index 100%
rename from using-ocean-market/images/hosting-services/azure-10.png
rename to .gitbook/assets/hosting/azure10.png
diff --git a/building-with-ocean/images/marketplace/publish/azure-2.png b/.gitbook/assets/hosting/azure2.png
similarity index 100%
rename from building-with-ocean/images/marketplace/publish/azure-2.png
rename to .gitbook/assets/hosting/azure2.png
diff --git a/building-with-ocean/images/marketplace/publish/azure-3.png b/.gitbook/assets/hosting/azure3.png
similarity index 100%
rename from building-with-ocean/images/marketplace/publish/azure-3.png
rename to .gitbook/assets/hosting/azure3.png
diff --git a/building-with-ocean/images/marketplace/publish/azure-4.png b/.gitbook/assets/hosting/azure4.png
similarity index 100%
rename from building-with-ocean/images/marketplace/publish/azure-4.png
rename to .gitbook/assets/hosting/azure4.png
diff --git a/building-with-ocean/images/marketplace/publish/azure-5.png b/.gitbook/assets/hosting/azure5.png
similarity index 100%
rename from building-with-ocean/images/marketplace/publish/azure-5.png
rename to .gitbook/assets/hosting/azure5.png
diff --git a/building-with-ocean/images/marketplace/publish/azure-6.png b/.gitbook/assets/hosting/azure6.png
similarity index 100%
rename from building-with-ocean/images/marketplace/publish/azure-6.png
rename to .gitbook/assets/hosting/azure6.png
diff --git a/building-with-ocean/images/marketplace/publish/azure-7.png b/.gitbook/assets/hosting/azure7.png
similarity index 100%
rename from building-with-ocean/images/marketplace/publish/azure-7.png
rename to .gitbook/assets/hosting/azure7.png
diff --git a/building-with-ocean/images/marketplace/publish/azure-8.png b/.gitbook/assets/hosting/azure8.png
similarity index 100%
rename from building-with-ocean/images/marketplace/publish/azure-8.png
rename to .gitbook/assets/hosting/azure8.png
diff --git a/building-with-ocean/images/marketplace/publish/azure-9.png b/.gitbook/assets/hosting/azure9.png
similarity index 100%
rename from building-with-ocean/images/marketplace/publish/azure-9.png
rename to .gitbook/assets/hosting/azure9.png
diff --git a/.gitbook/assets/image (10).png b/.gitbook/assets/image (10).png
deleted file mode 100644
index a633d8f8..00000000
Binary files a/.gitbook/assets/image (10).png and /dev/null differ
diff --git a/.gitbook/assets/image (2).png b/.gitbook/assets/image (2).png
deleted file mode 100644
index 171eaf81..00000000
Binary files a/.gitbook/assets/image (2).png and /dev/null differ
diff --git a/.gitbook/assets/image (3).png b/.gitbook/assets/image (3).png
deleted file mode 100644
index bd62e17e..00000000
Binary files a/.gitbook/assets/image (3).png and /dev/null differ
diff --git a/.gitbook/assets/image (6).png b/.gitbook/assets/image (6).png
deleted file mode 100644
index a633d8f8..00000000
Binary files a/.gitbook/assets/image (6).png and /dev/null differ
diff --git a/.gitbook/assets/image (7).png b/.gitbook/assets/image (7).png
deleted file mode 100644
index 99eb413e..00000000
Binary files a/.gitbook/assets/image (7).png and /dev/null differ
diff --git a/.gitbook/assets/image (8).png b/.gitbook/assets/image (8).png
deleted file mode 100644
index cab24603..00000000
Binary files a/.gitbook/assets/image (8).png and /dev/null differ
diff --git a/.gitbook/assets/image (9).png b/.gitbook/assets/image (9).png
deleted file mode 100644
index 9967ad31..00000000
Binary files a/.gitbook/assets/image (9).png and /dev/null differ
diff --git a/.gitbook/assets/image.png b/.gitbook/assets/image.png
deleted file mode 100644
index 0059a3bc..00000000
Binary files a/.gitbook/assets/image.png and /dev/null differ
diff --git a/.gitbook/assets/liquidity/read-contract.png b/.gitbook/assets/liquidity/read-contract.png
new file mode 100644
index 00000000..cd6f3165
Binary files /dev/null and b/.gitbook/assets/liquidity/read-contract.png differ
diff --git a/.gitbook/assets/remove-liquidity-2.png b/.gitbook/assets/liquidity/remove-liquidity-2.png
similarity index 100%
rename from .gitbook/assets/remove-liquidity-2.png
rename to .gitbook/assets/liquidity/remove-liquidity-2.png
diff --git a/.gitbook/assets/remove-liquidity-6.png b/.gitbook/assets/liquidity/remove-liquidity-6.png
similarity index 100%
rename from .gitbook/assets/remove-liquidity-6.png
rename to .gitbook/assets/liquidity/remove-liquidity-6.png
diff --git a/.gitbook/assets/liquidity/remove-liquidity.png b/.gitbook/assets/liquidity/remove-liquidity.png
new file mode 100644
index 00000000..37dd44fe
Binary files /dev/null and b/.gitbook/assets/liquidity/remove-liquidity.png differ
diff --git a/.gitbook/assets/liquidity/total-supply.png b/.gitbook/assets/liquidity/total-supply.png
new file mode 100644
index 00000000..3a640906
Binary files /dev/null and b/.gitbook/assets/liquidity/total-supply.png differ
diff --git a/.gitbook/assets/liquidity/write-contract.png b/.gitbook/assets/liquidity/write-contract.png
new file mode 100644
index 00000000..116c7a8f
Binary files /dev/null and b/.gitbook/assets/liquidity/write-contract.png differ
diff --git a/.gitbook/assets/market-forking-1.png b/.gitbook/assets/market-forking-1.png
deleted file mode 100644
index c5b59e35..00000000
Binary files a/.gitbook/assets/market-forking-1.png and /dev/null differ
diff --git a/.gitbook/assets/market-forking-2.png b/.gitbook/assets/market-forking-2.png
deleted file mode 100644
index 68f024d4..00000000
Binary files a/.gitbook/assets/market-forking-2.png and /dev/null differ
diff --git a/.gitbook/assets/market/Access.png b/.gitbook/assets/market/Access.png
new file mode 100644
index 00000000..af31b621
Binary files /dev/null and b/.gitbook/assets/market/Access.png differ
diff --git a/.gitbook/assets/market/Check-Debug-Mode.png b/.gitbook/assets/market/Check-Debug-Mode.png
new file mode 100644
index 00000000..0aea6d94
Binary files /dev/null and b/.gitbook/assets/market/Check-Debug-Mode.png differ
diff --git a/.gitbook/assets/market/Click-Settings.png b/.gitbook/assets/market/Click-Settings.png
new file mode 100644
index 00000000..3ad7c735
Binary files /dev/null and b/.gitbook/assets/market/Click-Settings.png differ
diff --git a/.gitbook/assets/market/Enter-Metadata.png b/.gitbook/assets/market/Enter-Metadata.png
new file mode 100644
index 00000000..26afa886
Binary files /dev/null and b/.gitbook/assets/market/Enter-Metadata.png differ
diff --git a/.gitbook/assets/market/Preview.png b/.gitbook/assets/market/Preview.png
new file mode 100644
index 00000000..9be6db89
Binary files /dev/null and b/.gitbook/assets/market/Preview.png differ
diff --git a/.gitbook/assets/market/Price.png b/.gitbook/assets/market/Price.png
new file mode 100644
index 00000000..1bccc402
Binary files /dev/null and b/.gitbook/assets/market/Price.png differ
diff --git a/.gitbook/assets/market/Publish-Link.png b/.gitbook/assets/market/Publish-Link.png
new file mode 100644
index 00000000..5e8a5a14
Binary files /dev/null and b/.gitbook/assets/market/Publish-Link.png differ
diff --git a/.gitbook/assets/market/Screenshot 2023-06-13 at 14.39.17.png b/.gitbook/assets/market/Screenshot 2023-06-13 at 14.39.17.png
new file mode 100644
index 00000000..d21c0531
Binary files /dev/null and b/.gitbook/assets/market/Screenshot 2023-06-13 at 14.39.17.png differ
diff --git a/.gitbook/assets/market/Screenshot 2023-06-13 at 14.43.25.png b/.gitbook/assets/market/Screenshot 2023-06-13 at 14.43.25.png
new file mode 100644
index 00000000..ebde7c97
Binary files /dev/null and b/.gitbook/assets/market/Screenshot 2023-06-13 at 14.43.25.png differ
diff --git a/.gitbook/assets/market/Screenshot 2023-06-14 at 14.30.59.png b/.gitbook/assets/market/Screenshot 2023-06-14 at 14.30.59.png
new file mode 100644
index 00000000..1f9b8c89
Binary files /dev/null and b/.gitbook/assets/market/Screenshot 2023-06-14 at 14.30.59.png differ
diff --git a/.gitbook/assets/market/Scroll-DDO-Info.png b/.gitbook/assets/market/Scroll-DDO-Info.png
new file mode 100644
index 00000000..71333930
Binary files /dev/null and b/.gitbook/assets/market/Scroll-DDO-Info.png differ
diff --git a/core-concepts/images/change-payment-collector.png b/.gitbook/assets/market/change-payment-collector.png
similarity index 100%
rename from core-concepts/images/change-payment-collector.png
rename to .gitbook/assets/market/change-payment-collector.png
diff --git a/.gitbook/assets/market/connect-wallet.png b/.gitbook/assets/market/connect-wallet.png
new file mode 100644
index 00000000..9d5327b7
Binary files /dev/null and b/.gitbook/assets/market/connect-wallet.png differ
diff --git a/.gitbook/assets/consume-1.png b/.gitbook/assets/market/consume-1.png
similarity index 100%
rename from .gitbook/assets/consume-1.png
rename to .gitbook/assets/market/consume-1.png
diff --git a/.gitbook/assets/consume-2.png b/.gitbook/assets/market/consume-2.png
similarity index 100%
rename from .gitbook/assets/consume-2.png
rename to .gitbook/assets/market/consume-2.png
diff --git a/.gitbook/assets/consume-3.png b/.gitbook/assets/market/consume-3.png
similarity index 100%
rename from .gitbook/assets/consume-3.png
rename to .gitbook/assets/market/consume-3.png
diff --git a/.gitbook/assets/consume-4.png b/.gitbook/assets/market/consume-4.png
similarity index 100%
rename from .gitbook/assets/consume-4.png
rename to .gitbook/assets/market/consume-4.png
diff --git a/.gitbook/assets/consume-5.png b/.gitbook/assets/market/consume-5.png
similarity index 100%
rename from .gitbook/assets/consume-5.png
rename to .gitbook/assets/market/consume-5.png
diff --git a/.gitbook/assets/consume-connect-wallet.png b/.gitbook/assets/market/consume-connect-wallet.png
similarity index 100%
rename from .gitbook/assets/consume-connect-wallet.png
rename to .gitbook/assets/market/consume-connect-wallet.png
diff --git a/.gitbook/assets/market-customisation-10.1.png b/.gitbook/assets/market/market-customisation-10.1.png
similarity index 100%
rename from .gitbook/assets/market-customisation-10.1.png
rename to .gitbook/assets/market/market-customisation-10.1.png
diff --git a/.gitbook/assets/market-customisation-10.2.png b/.gitbook/assets/market/market-customisation-10.2.png
similarity index 100%
rename from .gitbook/assets/market-customisation-10.2.png
rename to .gitbook/assets/market/market-customisation-10.2.png
diff --git a/.gitbook/assets/market-customisation-11.1.png b/.gitbook/assets/market/market-customisation-11.1.png
similarity index 100%
rename from .gitbook/assets/market-customisation-11.1.png
rename to .gitbook/assets/market/market-customisation-11.1.png
diff --git a/.gitbook/assets/market-customisation-12.png b/.gitbook/assets/market/market-customisation-12.png
similarity index 100%
rename from .gitbook/assets/market-customisation-12.png
rename to .gitbook/assets/market/market-customisation-12.png
diff --git a/.gitbook/assets/market-customisation-13.png b/.gitbook/assets/market/market-customisation-13.png
similarity index 100%
rename from .gitbook/assets/market-customisation-13.png
rename to .gitbook/assets/market/market-customisation-13.png
diff --git a/.gitbook/assets/market-customisation-14.png b/.gitbook/assets/market/market-customisation-14.png
similarity index 100%
rename from .gitbook/assets/market-customisation-14.png
rename to .gitbook/assets/market/market-customisation-14.png
diff --git a/.gitbook/assets/market-customisation-15.png b/.gitbook/assets/market/market-customisation-15.png
similarity index 100%
rename from .gitbook/assets/market-customisation-15.png
rename to .gitbook/assets/market/market-customisation-15.png
diff --git a/.gitbook/assets/market-customisation-18.png b/.gitbook/assets/market/market-customisation-18.png
similarity index 100%
rename from .gitbook/assets/market-customisation-18.png
rename to .gitbook/assets/market/market-customisation-18.png
diff --git a/.gitbook/assets/market-customisation-19.png b/.gitbook/assets/market/market-customisation-19.png
similarity index 100%
rename from .gitbook/assets/market-customisation-19.png
rename to .gitbook/assets/market/market-customisation-19.png
diff --git a/.gitbook/assets/market-customisation-20.png b/.gitbook/assets/market/market-customisation-20.png
similarity index 100%
rename from .gitbook/assets/market-customisation-20.png
rename to .gitbook/assets/market/market-customisation-20.png
diff --git a/.gitbook/assets/market-customisation-21.png b/.gitbook/assets/market/market-customisation-21.png
similarity index 100%
rename from .gitbook/assets/market-customisation-21.png
rename to .gitbook/assets/market/market-customisation-21.png
diff --git a/.gitbook/assets/market-customisation-22.png b/.gitbook/assets/market/market-customisation-22.png
similarity index 100%
rename from .gitbook/assets/market-customisation-22.png
rename to .gitbook/assets/market/market-customisation-22.png
diff --git a/.gitbook/assets/market-customisation-23.png b/.gitbook/assets/market/market-customisation-23.png
similarity index 100%
rename from .gitbook/assets/market-customisation-23.png
rename to .gitbook/assets/market/market-customisation-23.png
diff --git a/.gitbook/assets/market-customisation-24.png b/.gitbook/assets/market/market-customisation-24.png
similarity index 100%
rename from .gitbook/assets/market-customisation-24.png
rename to .gitbook/assets/market/market-customisation-24.png
diff --git a/.gitbook/assets/market-customisation-25.png b/.gitbook/assets/market/market-customisation-25.png
similarity index 100%
rename from .gitbook/assets/market-customisation-25.png
rename to .gitbook/assets/market/market-customisation-25.png
diff --git a/.gitbook/assets/market-customisation-3.png b/.gitbook/assets/market/market-customisation-3.png
similarity index 100%
rename from .gitbook/assets/market-customisation-3.png
rename to .gitbook/assets/market/market-customisation-3.png
diff --git a/.gitbook/assets/market-customisation-4.1.png b/.gitbook/assets/market/market-customisation-4.1.png
similarity index 100%
rename from .gitbook/assets/market-customisation-4.1.png
rename to .gitbook/assets/market/market-customisation-4.1.png
diff --git a/.gitbook/assets/market-customisation-4.2.jpg b/.gitbook/assets/market/market-customisation-4.2.jpg
similarity index 100%
rename from .gitbook/assets/market-customisation-4.2.jpg
rename to .gitbook/assets/market/market-customisation-4.2.jpg
diff --git a/.gitbook/assets/market-customisation-4.png b/.gitbook/assets/market/market-customisation-4.png
similarity index 100%
rename from .gitbook/assets/market-customisation-4.png
rename to .gitbook/assets/market/market-customisation-4.png
diff --git a/.gitbook/assets/market-customisation-5.png b/.gitbook/assets/market/market-customisation-5.png
similarity index 100%
rename from .gitbook/assets/market-customisation-5.png
rename to .gitbook/assets/market/market-customisation-5.png
diff --git a/.gitbook/assets/market-customisation-6.1.png b/.gitbook/assets/market/market-customisation-6.1.png
similarity index 100%
rename from .gitbook/assets/market-customisation-6.1.png
rename to .gitbook/assets/market/market-customisation-6.1.png
diff --git a/.gitbook/assets/market-customisation-6.png b/.gitbook/assets/market/market-customisation-6.png
similarity index 100%
rename from .gitbook/assets/market-customisation-6.png
rename to .gitbook/assets/market/market-customisation-6.png
diff --git a/.gitbook/assets/market-customisation-7.1.png b/.gitbook/assets/market/market-customisation-7.1.png
similarity index 100%
rename from .gitbook/assets/market-customisation-7.1.png
rename to .gitbook/assets/market/market-customisation-7.1.png
diff --git a/.gitbook/assets/market-customisation-8.png b/.gitbook/assets/market/market-customisation-8.png
similarity index 100%
rename from .gitbook/assets/market-customisation-8.png
rename to .gitbook/assets/market/market-customisation-8.png
diff --git a/.gitbook/assets/market/marketplace_data.jpg b/.gitbook/assets/market/marketplace_data.jpg
new file mode 100644
index 00000000..539db422
Binary files /dev/null and b/.gitbook/assets/market/marketplace_data.jpg differ
diff --git a/.gitbook/assets/market/network-and-datatoken-address.png b/.gitbook/assets/market/network-and-datatoken-address.png
new file mode 100644
index 00000000..7557121e
Binary files /dev/null and b/.gitbook/assets/market/network-and-datatoken-address.png differ
diff --git a/.gitbook/assets/publish-5.png b/.gitbook/assets/market/publish-5.png
similarity index 100%
rename from .gitbook/assets/publish-5.png
rename to .gitbook/assets/market/publish-5.png
diff --git a/.gitbook/assets/publish-6.png b/.gitbook/assets/market/publish-6.png
similarity index 100%
rename from .gitbook/assets/publish-6.png
rename to .gitbook/assets/market/publish-6.png
diff --git a/.gitbook/assets/publish-7.png b/.gitbook/assets/market/publish-7.png
similarity index 100%
rename from .gitbook/assets/publish-7.png
rename to .gitbook/assets/market/publish-7.png
diff --git a/.gitbook/assets/market/publish-page-2.png b/.gitbook/assets/market/publish-page-2.png
new file mode 100644
index 00000000..0cb13b87
Binary files /dev/null and b/.gitbook/assets/market/publish-page-2.png differ
diff --git a/.gitbook/assets/market/publish-page-before-edit.png b/.gitbook/assets/market/publish-page-before-edit.png
new file mode 100644
index 00000000..caeb4e3c
Binary files /dev/null and b/.gitbook/assets/market/publish-page-before-edit.png differ
diff --git a/.gitbook/assets/marketplace-landing-page.png b/.gitbook/assets/marketplace-landing-page.png
deleted file mode 100644
index e57548bd..00000000
Binary files a/.gitbook/assets/marketplace-landing-page.png and /dev/null differ
diff --git a/.gitbook/assets/marketplace-publish-file-field.png b/.gitbook/assets/marketplace-publish-file-field.png
deleted file mode 100644
index f16eee95..00000000
Binary files a/.gitbook/assets/marketplace-publish-file-field.png and /dev/null differ
diff --git a/.gitbook/assets/new-ramp-on-crypto-ramp-off.webp b/.gitbook/assets/new-ramp-on-crypto-ramp-off.webp
deleted file mode 100644
index 256e4431..00000000
Binary files a/.gitbook/assets/new-ramp-on-crypto-ramp-off.webp and /dev/null differ
diff --git a/.gitbook/assets/ocean-market-homepage.png b/.gitbook/assets/ocean-market-homepage.png
deleted file mode 100644
index 4aa49e30..00000000
Binary files a/.gitbook/assets/ocean-market-homepage.png and /dev/null differ
diff --git a/.gitbook/assets/publish-1.png b/.gitbook/assets/publish-1.png
deleted file mode 100644
index b9d2604e..00000000
Binary files a/.gitbook/assets/publish-1.png and /dev/null differ
diff --git a/.gitbook/assets/publish-2.png b/.gitbook/assets/publish-2.png
deleted file mode 100644
index e47421ab..00000000
Binary files a/.gitbook/assets/publish-2.png and /dev/null differ
diff --git a/.gitbook/assets/publish-3.png b/.gitbook/assets/publish-3.png
deleted file mode 100644
index 371f19e3..00000000
Binary files a/.gitbook/assets/publish-3.png and /dev/null differ
diff --git a/.gitbook/assets/publish-4.png b/.gitbook/assets/publish-4.png
deleted file mode 100644
index d77cece1..00000000
Binary files a/.gitbook/assets/publish-4.png and /dev/null differ
diff --git a/.gitbook/assets/publish-8.png b/.gitbook/assets/publish-8.png
deleted file mode 100644
index 4bee7cee..00000000
Binary files a/.gitbook/assets/publish-8.png and /dev/null differ
diff --git a/.gitbook/assets/publish.png b/.gitbook/assets/publish.png
deleted file mode 100644
index cd41d29a..00000000
Binary files a/.gitbook/assets/publish.png and /dev/null differ
diff --git a/.gitbook/assets/remove-liquidity-1 (1).png b/.gitbook/assets/remove-liquidity-1 (1).png
deleted file mode 100644
index 61bcc1a7..00000000
Binary files a/.gitbook/assets/remove-liquidity-1 (1).png and /dev/null differ
diff --git a/.gitbook/assets/remove-liquidity-1.png b/.gitbook/assets/remove-liquidity-1.png
deleted file mode 100644
index 5288d028..00000000
Binary files a/.gitbook/assets/remove-liquidity-1.png and /dev/null differ
diff --git a/.gitbook/assets/remove-liquidity-3.png b/.gitbook/assets/remove-liquidity-3.png
deleted file mode 100644
index 50bda9bb..00000000
Binary files a/.gitbook/assets/remove-liquidity-3.png and /dev/null differ
diff --git a/.gitbook/assets/remove-liquidity-4.png b/.gitbook/assets/remove-liquidity-4.png
deleted file mode 100644
index fa7e4263..00000000
Binary files a/.gitbook/assets/remove-liquidity-4.png and /dev/null differ
diff --git a/.gitbook/assets/remove-liquidity-5.png b/.gitbook/assets/remove-liquidity-5.png
deleted file mode 100644
index 4780d741..00000000
Binary files a/.gitbook/assets/remove-liquidity-5.png and /dev/null differ
diff --git a/.gitbook/assets/rewards/Rewards-Tab.png b/.gitbook/assets/rewards/Rewards-Tab.png
new file mode 100644
index 00000000..74f7f05d
Binary files /dev/null and b/.gitbook/assets/rewards/Rewards-Tab.png differ
diff --git a/.gitbook/assets/rewards/allocations.png b/.gitbook/assets/rewards/allocations.png
new file mode 100644
index 00000000..ce15abf1
Binary files /dev/null and b/.gitbook/assets/rewards/allocations.png differ
diff --git a/.gitbook/assets/rewards/claim-rewards.png b/.gitbook/assets/rewards/claim-rewards.png
new file mode 100644
index 00000000..0bf6c865
Binary files /dev/null and b/.gitbook/assets/rewards/claim-rewards.png differ
diff --git a/veocean-data-farming/images/df_rewards_page.png b/.gitbook/assets/rewards/df_rewards_page.png
similarity index 100%
rename from veocean-data-farming/images/df_rewards_page.png
rename to .gitbook/assets/rewards/df_rewards_page.png diff --git a/veocean-data-farming/images/emissions_first_20years.png b/.gitbook/assets/rewards/emissions_first_20years.png similarity index 100% rename from veocean-data-farming/images/emissions_first_20years.png rename to .gitbook/assets/rewards/emissions_first_20years.png diff --git a/veocean-data-farming/images/emissions_first_5years.png b/.gitbook/assets/rewards/emissions_first_5years.png similarity index 100% rename from veocean-data-farming/images/emissions_first_5years.png rename to .gitbook/assets/rewards/emissions_first_5years.png diff --git a/veocean-data-farming/images/emissions_lifetime.png b/.gitbook/assets/rewards/emissions_lifetime.png similarity index 100% rename from veocean-data-farming/images/emissions_lifetime.png rename to .gitbook/assets/rewards/emissions_lifetime.png diff --git a/veocean-data-farming/images/example_apys.png b/.gitbook/assets/rewards/example_apys.png similarity index 100% rename from veocean-data-farming/images/example_apys.png rename to .gitbook/assets/rewards/example_apys.png diff --git a/.gitbook/assets/rewards/farms-page.png b/.gitbook/assets/rewards/farms-page.png new file mode 100644 index 00000000..353d3254 Binary files /dev/null and b/.gitbook/assets/rewards/farms-page.png differ diff --git a/veocean-data-farming/images/flow_of_value.png b/.gitbook/assets/rewards/flow_of_value.png similarity index 100% rename from veocean-data-farming/images/flow_of_value.png rename to .gitbook/assets/rewards/flow_of_value.png diff --git a/veocean-data-farming/images/ranked_rewards_study.png b/.gitbook/assets/rewards/ranked_rewards_study.png similarity index 100% rename from veocean-data-farming/images/ranked_rewards_study.png rename to .gitbook/assets/rewards/ranked_rewards_study.png diff --git a/veocean-data-farming/images/reward_schedule.png b/.gitbook/assets/rewards/reward_schedule.png similarity index 100% rename from veocean-data-farming/images/reward_schedule.png rename to .gitbook/assets/rewards/reward_schedule.png diff --git a/.gitbook/assets/rewards/update-allocations.png b/.gitbook/assets/rewards/update-allocations.png new file mode 100644 index 00000000..eb0fcd27 Binary files /dev/null and b/.gitbook/assets/rewards/update-allocations.png differ diff --git a/veocean-data-farming/images/veOCEAN-Delegation.png b/.gitbook/assets/rewards/veOCEAN-Delegation.png similarity index 100% rename from veocean-data-farming/images/veOCEAN-Delegation.png rename to .gitbook/assets/rewards/veOCEAN-Delegation.png diff --git a/.gitbook/assets/token tool b/.gitbook/assets/token tool deleted file mode 100644 index eea3ded5..00000000 Binary files a/.gitbook/assets/token tool and /dev/null differ diff --git a/.gitbook/assets/use case b/.gitbook/assets/use case deleted file mode 100644 index a581963f..00000000 Binary files a/.gitbook/assets/use case and /dev/null differ diff --git a/.gitbook/assets/binance-receive.png b/.gitbook/assets/wallet/binance-receive.png similarity index 100% rename from .gitbook/assets/binance-receive.png rename to .gitbook/assets/wallet/binance-receive.png diff --git a/.gitbook/assets/confirm-backup-phrase.png b/.gitbook/assets/wallet/confirm-backup-phrase.png similarity index 100% rename from .gitbook/assets/confirm-backup-phrase.png rename to .gitbook/assets/wallet/confirm-backup-phrase.png diff --git a/.gitbook/assets/create-new-metamask-wallet.png b/.gitbook/assets/wallet/create-new-metamask-wallet.png similarity index 100% rename from .gitbook/assets/create-new-metamask-wallet.png rename 
to .gitbook/assets/wallet/create-new-metamask-wallet.png diff --git a/.gitbook/assets/wallet/data_nft_open_sea.png b/.gitbook/assets/wallet/data_nft_open_sea.png new file mode 100644 index 00000000..360e0e7f Binary files /dev/null and b/.gitbook/assets/wallet/data_nft_open_sea.png differ diff --git a/.gitbook/assets/manage-tokens.png b/.gitbook/assets/wallet/manage-tokens.png similarity index 100% rename from .gitbook/assets/manage-tokens.png rename to .gitbook/assets/wallet/manage-tokens.png diff --git a/.gitbook/assets/metamask-add-network.png b/.gitbook/assets/wallet/metamask-add-network.png similarity index 100% rename from .gitbook/assets/metamask-add-network.png rename to .gitbook/assets/wallet/metamask-add-network.png diff --git a/.gitbook/assets/metamask-browser-extension.png b/.gitbook/assets/wallet/metamask-browser-extension.png similarity index 100% rename from .gitbook/assets/metamask-browser-extension.png rename to .gitbook/assets/wallet/metamask-browser-extension.png diff --git a/.gitbook/assets/metamask-chrome-extension.png b/.gitbook/assets/wallet/metamask-chrome-extension.png similarity index 100% rename from .gitbook/assets/metamask-chrome-extension.png rename to .gitbook/assets/wallet/metamask-chrome-extension.png diff --git a/.gitbook/assets/polygon-bridge.png b/.gitbook/assets/wallet/polygon-bridge.png similarity index 100% rename from .gitbook/assets/polygon-bridge.png rename to .gitbook/assets/wallet/polygon-bridge.png diff --git a/.gitbook/assets/polygon-explorer.png b/.gitbook/assets/wallet/polygon-explorer.png similarity index 100% rename from .gitbook/assets/polygon-explorer.png rename to .gitbook/assets/wallet/polygon-explorer.png diff --git a/.gitbook/assets/polygon-login.png b/.gitbook/assets/wallet/polygon-login.png similarity index 100% rename from .gitbook/assets/polygon-login.png rename to .gitbook/assets/wallet/polygon-login.png diff --git a/.gitbook/assets/polygon-ocean.png b/.gitbook/assets/wallet/polygon-ocean.png similarity index 100% rename from .gitbook/assets/polygon-ocean.png rename to .gitbook/assets/wallet/polygon-ocean.png diff --git a/.gitbook/assets/polygon-wallet-page.png b/.gitbook/assets/wallet/polygon-wallet-page.png similarity index 100% rename from .gitbook/assets/polygon-wallet-page.png rename to .gitbook/assets/wallet/polygon-wallet-page.png diff --git a/.gitbook/assets/secret-backup-phrase.png b/.gitbook/assets/wallet/secret-backup-phrase.png similarity index 100% rename from .gitbook/assets/secret-backup-phrase.png rename to .gitbook/assets/wallet/secret-backup-phrase.png diff --git a/README.md b/README.md index b692a7cc..5ea64c5b 100644 --- a/README.md +++ b/README.md @@ -1,130 +1,10 @@ --- -description: Ocean Protocol - Tools for the Web3 Data Economy +description: Help for wherever you are on your Ocean Protocol journey. +cover: .gitbook/assets/cover/docs_banner.png +coverY: 0 +layout: landing --- -# Ocean Documentation +# 👋 Welcome -## What is Ocean? - -Ocean provides the next generation of tools to unlock data at a large scale. Ocean makes it easy to publish and consume data services. - -Ocean uses Data NFTs (ERC721) and datatokens (ERC20) as the interface to connect data assets with blockchain and DeFi tools. Crypto wallets become data wallets, crypto exchanges become data marketplaces, DAOs for data co-ops, and more via DeFi composability. 
- -![Creating a New Data Economy](./.gitbook/assets/feature-datascience@2x.webp) - -The following guides are a great place to start if you are new to Ocean: - -* [Architecture Overview](core-concepts/architecture.md) -* [Data NFTs and Datatokens](core-concepts/datanft-and-datatoken.md) -* [Publish a data asset](using-ocean-market/marketplace-publish-data-asset.md) -* [Download a data asset](using-ocean-market/marketplace-download-data-asset.md) - -## What is our Mission? - -**To unlock data, for more equitable outcomes for users of data, using a thoughtful application of both technology and governance.** - -Society is becoming increasingly reliant on data, especially with the advent of AI. However, a small handful of organizations with both massive data assets and AI capabilities have attained worrying levels of control, which is a danger to a free and open society. - -Our team and community are committed to kick-starting a New Data Economy that reaches every single person, company and device, giving power back to data owners and enabling people to capture value from data to better our world. - -Find out more about the people building Ocean on our [site](https://oceanprotocol.com/about). - -## What can you do with Ocean? - -### Buy or Sell Data - -Use Ocean Market to publish and sell data, or browse and buy data. Data is published as interoperable ERC721 data NFTs & ERC20 datatokens. It's a decentralized exchange (DEX), tuned for data. The acts of publishing data, purchasing data, and consuming data are all recorded on the blockchain to make a tamper-proof audit trail. - -As a data scientist or AI practitioner, you can benefit from access to more data (including private data), crypto-secured provenance in data & AI training, and income opportunities for selling data and curating data. - -![Decentralized Exchange Marketplaces](./.gitbook/assets/feature-marketplaces@2x.webp) - -The following guides will help you get started with buying and selling data: - -* [Publish a data asset](using-ocean-market/marketplace-publish-data-asset.md) -* [Download a data asset](using-ocean-market/marketplace-download-data-asset.md) -* [Publishing with hosting services](using-ocean-market/asset-hosting.md) - -### Build Your Own Data Market - -Use Ocean Protocol software tools to build your own data marketplace, by either forking [Ocean Market](https://v4.market.oceanprotocol.com/) code or building up with Ocean components. - -![Ocean Market Homepage](./.gitbook/assets/ocean-market-homepage.png) - -If you're interested in starting your own marketplace, check out the following guides: - -* [Forking Ocean Market](building-with-ocean/build-a-marketplace/forking-ocean-market.md) -* [Customising your market](building-with-ocean/build-a-marketplace/customising-your-market.md) -* [Deploying your market](building-with-ocean/build-a-marketplace/deploying-market.md) - -### Manage datatokens and data NFTs for use in DeFi - -Ocean makes it easy to publish data services (deploy ERC721 data NFTs and ERC20 datatokens), and to consume data services (spend datatokens). Crypto wallets, exchanges, and DAOs become data wallets, exchanges, and DAOs. - -Use Ocean [JavaScript](https://github.com/oceanprotocol/ocean.js) or [Python](https://github.com/oceanprotocol/ocean.py) drivers to manage data NFTs and datatokens: - -Ocean-based apps make data asset on-ramps and off-ramps easy for end users. Ocean smart contracts and libraries make this easy for developers. The data itself does not need to be on-chain, just the access control. 
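To make that workflow concrete, here is a minimal publish sketch in Python using ocean.py. Treat it as a hedged sketch rather than a definitive recipe: `alice` is assumed to be a funded account object, and helper names such as `get_config_dict`, `data_nft_factory.create`, and `create_datatoken` follow ocean.py's documented patterns but may differ between library versions.

```python
# Hedged sketch: deploy a data NFT (ERC721) and a datatoken (ERC20) with ocean.py.
# ASSUMPTIONS: ocean.py is installed, `alice` is a funded account object, and
# the helper names below match your installed ocean.py version.
from ocean_lib.example_config import get_config_dict
from ocean_lib.ocean.ocean import Ocean

config = get_config_dict("https://polygon-rpc.com")  # any supported RPC URL
ocean = Ocean(config)

# One data NFT per asset, then one (or more) datatokens under it
data_nft = ocean.data_nft_factory.create({"from": alice}, "My Data NFT", "DNFT1")
datatoken = data_nft.create_datatoken({"from": alice}, "My Datatoken", "DT1")
print(f"data NFT: {data_nft.address}, datatoken: {datatoken.address}")
```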
- -![New Data on-ramp and off-ramp](./.gitbook/assets/new-ramp-on-crypto-ramp-off.webp) - -Data NFTs are ERC721 tokens representing the unique asset and datatokens are ERC20 tokens to access data services. Each data service gets its own data NFT and one or more types of datatokens. - -To access the dataset, you send 1.0 datatokens to the data provider (running Ocean Provider). To give access to someone else, send them 1.0 datatokens. That's it. - -Since datatokens are ERC20, and live on Ethereum mainnet, there's a whole ecosystem to leverage. - -* _Publish and access data services:_ downloadable files or compute-to-data. Use Ocean to deploy a new [ERC721](https://github.com/ethereum/EIPs/blob/master/EIPS/eip-721.md) and [ERC20](https://github.com/ethereum/EIPs/blob/7f4f0377730f5fc266824084188cc17cf246932e/EIPS/eip-20.md) datatoken contract for each data service, then mint datatokens. -* _Transfer datatokens_ to another owner (or approve & transferFrom). -* _And more._ Use ERC20 support in [web3.js](https://web3js.readthedocs.io/), [web3.py](https://web3py.readthedocs.io/en/stable/examples.html#working-with-an-erc20-token-contract) and Solidity to connect datatokens with crypto wallets and other DeFi services. - - -### Compute-to-Data - -Ocean's "Compute-to-Data" feature enables private data to be bought & sold. You can sell compute access to privately-held data, which never leaves the data owner's premises. Ocean-based marketplaces enable the monetization of private data while preserving privacy. - -Compute-to-data resolves the tradeoff between the benefits of using private data, and the risks of exposing it. It lets the data stay on-premise, yet allows 3rd parties to run specific compute jobs on it to get useful compute results like averaging or building an AI model. - -The most valuable data is private data — using it can improve research and business outcomes. But concerns over privacy and control make it hard to access. With Compute-to-Data, private data isn't directly shared but rather specific access to it is granted. - -![Compute-to-data](./.gitbook/assets/feature-compute@2x.webp) - -It can be used for data sharing in science or technology contexts, or in marketplaces for selling private data while preserving privacy, as an opportunity for companies to monetize their data assets. - -Private data can help research, leading to life-altering innovations in science and technology. For example, more data improves the predictive accuracy of modern Artificial Intelligence (AI) models. Private data is often considered the most valuable data because it's so hard to get at, and using it can lead to potentially big payoffs. - -Check out these guides if you are aiming to get a deeper understanding of how compute-to-data works: - -* [Architecture](building-with-ocean/compute-to-data/compute-to-data-architecture.md) -* [Datasets & Algorithms](building-with-ocean/compute-to-data/compute-to-data-datasets-algorithms.md) -* [Minikube Environment](building-with-ocean/compute-to-data/compute-to-data-minikube.md) -* [Writing Algorithms](building-with-ocean/compute-to-data/compute-to-data-algorithms.md) -* [Private docker registry](building-with-ocean/compute-to-data/compute-to-data-docker-registry.md) -## How does it work? - -In Ocean Protocol, each asset gets its own ERC721 **data NFT** and one (or more) ERC20 **datatokens**. This enables data wallets, data exchanges, and data co-ops by directly leveraging crypto wallets, exchanges, and more. 
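Because a datatoken is a plain ERC20 token, the "send 1.0 datatokens" flow above is literally a standard ERC20 `transfer`. A minimal web3.py sketch follows; the RPC URL, `PRIVATE_KEY`, `DATATOKEN_ADDRESS`, and `RECIPIENT` are placeholders, the ABI is trimmed to the single function used, and the names follow web3.py v6:

```python
# Hedged sketch: grant access by sending 1.0 datatokens (an ERC20 transfer).
# Placeholders: RPC URL, PRIVATE_KEY, DATATOKEN_ADDRESS, RECIPIENT.
from web3 import Web3

ERC20_ABI = [{  # only the transfer() fragment of the ERC20 ABI
    "name": "transfer",
    "type": "function",
    "stateMutability": "nonpayable",
    "inputs": [
        {"name": "to", "type": "address"},
        {"name": "amount", "type": "uint256"},
    ],
    "outputs": [{"name": "", "type": "bool"}],
}]

w3 = Web3(Web3.HTTPProvider("https://rpc.example.com"))  # placeholder RPC
account = w3.eth.account.from_key(PRIVATE_KEY)
datatoken = w3.eth.contract(address=DATATOKEN_ADDRESS, abi=ERC20_ABI)

tx = datatoken.functions.transfer(
    RECIPIENT,
    w3.to_wei(1, "ether"),  # datatokens use 18 decimals, so this is 1.0 tokens
).build_transaction({
    "from": account.address,
    "nonce": w3.eth.get_transaction_count(account.address),
})
signed = account.sign_transaction(tx)
tx_hash = w3.eth.send_raw_transaction(signed.rawTransaction)
receipt = w3.eth.wait_for_transaction_receipt(tx_hash)
print("success" if receipt.status == 1 else "failed")
```

Once the receipt confirms, the recipient holds the 1.0 datatokens and can redeem them against Ocean Provider to consume the asset.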
- -Ocean Protocol provides tools for developers to _build data markets_, and to _manage data NFTs and datatokens_ for use in DeFi. - -If you are new to web3 and blockchain technologies, then we suggest you first read these introductory guides: - -* [Wallet Basics](building-with-ocean/wallets.md) -* [Set Up MetaMask Wallet](orientation/metamask-setup.md) -* [Manage Your OCEAN Tokens](building-with-ocean/wallets-and-ocean-tokens.md) - -If you are looking to get to grips with the inner workings of Ocean, then you'll be interested in the following guides: - -* [Architecture Overview](./core-concepts/architecture.md) -* [Data NFTs and Datatokens](./core-concepts/datanft-and-datatoken.md) -* [Networks](./core-concepts/networks.md) -* [Fees](./core-concepts/fees.md) -* [Asset pricing](./core-concepts/asset-pricing.md) -* [DID & DDO](./core-concepts/did-ddo.md) -* [Roles](./core-concepts/roles.md) -* [Set Up a Marketplace](./building-with-ocean/marketplace.md) -* [Compute-to-Data](./building-with-ocean/compute-to-data/README.md) -* [Deploying components](building-with-ocean/deploying-components/README.md) -* [Contributing](core-concepts/contributing.md) - -## Supporters - -[GitBook](https://www.gitbook.com/) is a supporter of this open source project by providing hosting for this documentation. +
+| Section | Description | Link | Card image |
+| -------------- | ----------- | ---- | ---------- |
+| Discover | Learn how Ocean Protocol transforms data sharing and monetization with its powerful Web3 open source tools. | discover | discover_card.png |
+| User Guides | Follow our step-by-step instructions for a no-code solution to unleash the power of Ocean Protocol technologies! | user-guides | user_guides_card.png |
+| Developers | Find APIs, libraries, and other tools to build awesome dApps or integrate with the Ocean Protocol ecosystem. | developers | developer_card.png |
+| Data Science | We invite and engage all data scientists and machine learning specialists to discover Ocean Protocol. | data-science | data_science_card.png |
+| Infrastructure | For software architects and developers - deploy your own components on the Ocean Protocol network. | infrastructure | infrastructure_card.png |
+| DeFi | Get creative capitalizing with Ocean Protocol tools in a variety of DeFi (Decentralized Finance) applications. | defi | defi_card.png |
+| Rewards | Explore how you can earn OCEAN rewards by Data Farming and publishing your assets on the Ocean Market. | rewards | rewards_card.png |
+| Contribute | Get involved and make a difference! Learn how you can contribute to the growth and development of Ocean Protocol. | contribute | contribute_card.png |
diff --git a/SUMMARY.md b/SUMMARY.md index 0bf7554c..6f8b1f5f 100644 --- a/SUMMARY.md +++ b/SUMMARY.md @@ -1,72 +1,124 @@ # Table of contents -* [Orientation](README.md) - * [Wallet Basics](building-with-ocean/wallets.md) - * [Set Up MetaMask Wallet](orientation/metamask-setup.md) - * [Manage Your OCEAN Tokens](building-with-ocean/wallets-and-ocean-tokens.md) -* [Core Concepts](core-concepts/README.md) - * [Architecture Overview](core-concepts/architecture.md) - * [Data NFTs and Datatokens](core-concepts/datanft-and-datatoken.md) - * [Roles](core-concepts/roles.md) - * [Networks](core-concepts/networks.md) - * [Bridges](core-concepts/networks/bridges.md) - * [Fees](core-concepts/fees.md) - * [Asset Pricing](core-concepts/asset-pricing.md) - * [DID & DDO](core-concepts/did-ddo.md) -* [Using Ocean Market](using-ocean-market/README.md) - * [Publish a Data Asset](using-ocean-market/marketplace-publish-data-asset.md) - * [Download a Data Asset](using-ocean-market/marketplace-download-data-asset.md) - * [Publishing with Hosting Services](using-ocean-market/asset-hosting.md) - * [Liquidity Pools \[deprecated\]](using-ocean-market/remove-liquidity-using-etherscan.md) -* [Building with Ocean](building-with-ocean/README.md) - * [Build a Marketplace](building-with-ocean/build-a-marketplace/README.md) - * [Forking Ocean Market](building-with-ocean/build-a-marketplace/forking-ocean-market.md) - * [Customising a Market](building-with-ocean/build-a-marketplace/customising-your-market.md) - * [Deploying a Market](building-with-ocean/build-a-marketplace/deploying-market.md) - * [Using Ocean Libraries](building-with-ocean/using-ocean-libraries/README.md) - * [Configuration](building-with-ocean/using-ocean-libraries/configuration.md) - * [Creating a data NFT](building-with-ocean/using-ocean-libraries/creating\_dataNFT.md) - * [Publish with Fixed Pricing](building-with-ocean/using-ocean-libraries/create-datatoken-with-fixed-pricing.md) - * [Mint Datatokens](building-with-ocean/using-ocean-libraries/mint-datatoken.md) - * [Update Metadata](building-with-ocean/using-ocean-libraries/update-metadata.md) - * [Compute-to-Data](building-with-ocean/compute-to-data/README.md) - * [Architecture](building-with-ocean/compute-to-data/compute-to-data-architecture.md) - * [Datasets & Algorithms](building-with-ocean/compute-to-data/compute-to-data-datasets-algorithms.md) - * [Minikube Environment](building-with-ocean/compute-to-data/compute-to-data-minikube.md) - * [Writing Algorithms](building-with-ocean/compute-to-data/compute-to-data-algorithms.md) - * [Private Docker Registry](building-with-ocean/compute-to-data/compute-to-data-docker-registry.md) - * [User defined parameters](building-with-ocean/compute-to-data/user-defined-parameters.md) - * [Deploying Components](building-with-ocean/deploying-components/README.md) - * [Setup a Server](building-with-ocean/deploying-components/setup-server.md) - * [Deploying Marketplace](building-with-ocean/deploying-components/deploying-marketplace.md) - * [Deploying Aquarius](building-with-ocean/deploying-components/deploying-aquarius.md) - * [Deploying Provider](building-with-ocean/deploying-components/deploying-provider.md) - * [Deploying Ocean Subgraph](building-with-ocean/deploying-components/deploying-ocean-subgraph.md) - * [Using Ocean Subgraph](building-with-ocean/using-ocean-subgraph/README.md) - * [List data NFTs](building-with-ocean/using-ocean-subgraph/list-data-nfts.md) - * [List all Tokens](building-with-ocean/using-ocean-subgraph/list-datatokens.md) - * [Get Data NFT 
Information](building-with-ocean/using-ocean-subgraph/get-data-nft-information.md) - * [Get Datatoken Information](building-with-ocean/using-ocean-subgraph/get-datatoken-information.md) - * [List Fixed Rate Exchanges](building-with-ocean/using-ocean-subgraph/list-fixed-rate-exchanges.md) - * [Contributing](core-concepts/contributing.md) - * [Contributor Code of Conduct](core-concepts/code-of-conduct.md) - * [Legal Requirements](core-concepts/legal-reqs.md) - * [Partners & Collaborators](building-with-ocean/projects-using-ocean.md) -* [veOCEAN & Data Farming](veocean-data-farming/README.md) - * [veOCEAN](veocean-data-farming/veocean.md) - * [Data Farming 101](veocean-data-farming/df-intro.md) - * [Data Farming Background](veocean-data-farming/df-background.md) - * [Emissions & APYs](veocean-data-farming/emissions-apys.md) - * [Delegation](veocean-data-farming/delegation.md) - * [Rewards Tutorial](rewards/veOcean-Data-Farming-Tutorial.md) -* [API References](api-references/README.md) - * [Aquarius REST API](api-references/aquarius-rest-api.md) - * [Provider REST API](api-references/provider-rest-api.md) -* [FAQ](orientation/faq.md) - -## Community - -* [Medium](https://blog.oceanprotocol.com/) -* [Discord](https://discord.com/invite/TnXjkR5) -* [Telegram](https://t.me/OceanProtocol\_Community) -* [Twitter](https://twitter.com/oceanprotocol) +* [👋 Welcome](README.md) +* [🌊 Discover](discover/README.md) + * [Explore](discover/explore.md) + * [Ocean 101](discover/ocean-101.md) + * [Basic Concepts](discover/basic-concepts.md) + * [Wallets](discover/wallets/README.md) + * [Set Up MetaMask Wallet](discover/wallets/metamask-setup.md) + * [Networks](discover/networks/README.md) + * [Bridges](discover/networks/bridges.md) + * [Manage Your OCEAN Tokens](discover/wallets-and-ocean-tokens.md) + * [Glossary](discover/glossary.md) + * [FAQ](discover/faq.md) +* [📚 User Guides](user-guides/README.md) + * [Guide to the Ocean Market](user-guides/using-ocean-market.md) + * [Publish Data NFTs](user-guides/publish-data-nfts.md) + * [Buy NFT Data](user-guides/buy-data-nfts.md) + * [Sell NFT Computations (Compute-to-Data)](user-guides/compute-to-data/README.md) + * [Make a Boss C2D Algorithm](user-guides/compute-to-data/make-a-boss-c2d-algorithm.md) + * [Publish a C2D Algorithm NFT](user-guides/compute-to-data/publish-a-c2d-algorithm-nft.md) + * [Publish a C2D Data NFT](user-guides/compute-to-data/publish-a-c2d-data-nft.md) + * [Host Assets](user-guides/asset-hosting/README.md) + * [Arweave](user-guides/asset-hosting/arweave.md) + * [AWS](user-guides/asset-hosting/aws.md) + * [Azure Cloud](user-guides/asset-hosting/azure-cloud.md) + * [Google Storage](user-guides/asset-hosting/google-storage.md) + * [Github](user-guides/asset-hosting/github.md) + * [Join a Data Challenge](user-guides/join-a-data-challenge.md) + * [Sponsor a Data Challenge](user-guides/sponsor-a-data-challenge.md) + * [Get Started Data Farming](user-guides/get-started-df.md) + * [Harvest More Yield Data Farming](user-guides/how-to-data-farm.md) + * [Claim Rewards Data Farming](user-guides/claim-ocean-rewards.md) + * [Liquidity Pools \[deprecated\]](user-guides/remove-liquidity-pools.md) +* [💻 Developers](developers/README.md) + * [Architecture Overview](developers/architecture.md) + * [Contracts](developers/contracts/README.md) + * [Data NFTs](developers/contracts/data-nfts.md) + * [Datatokens](developers/contracts/datatokens.md) + * [Data NFTs and Datatokens](developers/contracts/datanft-and-datatoken.md) + * [Datatoken 
Templates](developers/contracts/datatoken-templates.md) + * [Roles](developers/contracts/roles.md) + * [Pricing Schemas](developers/contracts/pricing-schemas.md) + * [Fees](developers/contracts/fees.md) + * [Revenue](developers/contracts/revenue.md) + * [Fractional Ownership](developers/fractional-ownership.md) + * [Community Monetization](developers/community-monetization.md) + * [Metadata](developers/metadata.md) + * [Identifiers (DIDs)](developers/identifiers.md) + * [DDO Specification](developers/ddo-specification.md) + * [Storage Specifications](developers/storage.md) + * [Fine-Grained Permissions](developers/fg-permissions.md) + * [Retrieve datatoken/data NFT addresses & Chain ID](developers/retrieve-datatoken-address.md) + * [Get API Keys for Blockchain Access](developers/get-api-keys-for-blockchain-access.md) + * [Barge](developers/barge/README.md) + * [Local Setup](developers/barge/local-setup-ganache.md) + * [Build a Marketplace](developers/build-a-marketplace/README.md) + * [Forking Ocean Market](developers/build-a-marketplace/forking-ocean-market.md) + * [Customising a Market](developers/build-a-marketplace/customising-your-market.md) + * [Build and host your Data Marketplace](developers/build-a-marketplace/deploying-market.md) + * [Subgraph](developers/subgraph/README.md) + * [Get data NFTs](developers/subgraph/list-data-nfts.md) + * [Get data NFT information](developers/subgraph/get-data-nft-information.md) + * [Get datatokens](developers/subgraph/list-datatokens.md) + * [Get datatoken information](developers/subgraph/get-datatoken-information.md) + * [Get datatoken buyers](developers/subgraph/get-datatoken-buyers.md) + * [Get fixed-rate exchanges](developers/subgraph/list-fixed-rate-exchanges.md) + * [Get veOCEAN stats](developers/subgraph/get-veocean-stats.md) + * [Ocean.py](developers/ocean.py/README.md) + * [Install](developers/ocean.py/install.md) + * [Local Setup](developers/ocean.py/local-setup.md) + * [Remote Setup](developers/ocean.py/remote-setup.md) + * [Publish Flow](developers/ocean.py/publish-flow.md) + * [Consume Flow](developers/ocean.py/consume-flow.md) + * [Compute Flow](developers/ocean.py/compute-flow.md) + * [Ocean Instance Tech Details](developers/ocean.py/technical-details.md) + * [Ocean Assets Tech Details](developers/ocean.py/ocean-assets-tech-details.md) + * [Ocean Compute Tech Details](developers/ocean.py/ocean-compute-tech-details.md) + * [Datatoken Interface Tech Details](developers/ocean.py/datatoken-interface-tech-details.md) + * [Ocean.js](developers/ocean.js/README.md) + * [Configuration](developers/ocean.js/configuration.md) + * [Creating a data NFT](developers/ocean.js/creating-datanft.md) + * [Publish](developers/ocean.js/publish.md) + * [Mint Datatokens](developers/ocean.js/mint-datatoken.md) + * [Update Metadata](developers/ocean.js/update-metadata.md) + * [Asset Visibility](developers/ocean.js/remove-asset.md) + * [Consume Asset](developers/ocean.js/consume-asset.md) + * [Run C2D Jobs](developers/ocean.js/cod-asset.md) + * [Compute to data](developers/compute-to-data/README.md) + * [Architecture](developers/compute-to-data/compute-to-data-architecture.md) + * [Datasets & Algorithms](developers/compute-to-data/compute-to-data-datasets-algorithms.md) + * [Writing Algorithms](developers/compute-to-data/compute-to-data-algorithms.md) + * [Compute Options](developers/compute-to-data/compute-options.md) + * [Aquarius](developers/aquarius/README.md) + * [Asset Requests](developers/aquarius/asset-requests.md) + * [Chain 
Requests](developers/aquarius/chain-requests.md) + * [Other Requests](developers/aquarius/other-requests.md) + * [Provider](developers/provider/README.md) + * [General Endpoints](developers/provider/general-endpoints.md) + * [Encryption / Decryption](developers/provider/encryption-decryption.md) + * [Compute Endpoints](developers/provider/compute-endpoints.md) + * [Authentication Endpoints](developers/provider/authentication-endpoints.md) +* [📊 Data Science](data-science/README.md) + * [Data Value Creation Loop](data-science/the-data-value-creation-loop.md) + * [What data is valuable?](data-science/data-engineers.md) +* [🔨 Infrastructure](infrastructure/README.md) + * [Setup a Server](infrastructure/setup-server.md) + * [Deploying Marketplace](infrastructure/deploying-marketplace.md) + * [Deploying Aquarius](infrastructure/deploying-aquarius.md) + * [Deploying Provider](infrastructure/deploying-provider.md) + * [Deploying Ocean Subgraph](infrastructure/deploying-ocean-subgraph.md) + * [Deploying C2D](infrastructure/compute-to-data-minikube.md) + * [C2D - Private Docker Registry](infrastructure/compute-to-data-docker-registry.md) +* [🤑 DeFi](defi/README.md) +* [💰 Rewards](rewards/README.md) + * [Data Farming 101 (White Belt)](rewards/df-intro.md) + * [DF Basic Actions (Blue Belt)](rewards/df-basic.md) + * [DF Max Out Yield (Purple Belt)](rewards/df-max-out-yield.md) + * [DF "ve" in veOCEAN (Brown Belt)](rewards/veocean.md) + * [DF Emissions & APYs (Black Belt)](rewards/df-emissions-apys.md) +* [🤝 Contribute](contribute/README.md) + * [Partners & Collaborators](contribute/projects-using-ocean.md) + * [Contributor Code of Conduct](contribute/code-of-conduct.md) + * [Legal Requirements](contribute/legal-reqs.md) diff --git a/api-references/README.md b/api-references/README.md deleted file mode 100644 index 7f813ad1..00000000 --- a/api-references/README.md +++ /dev/null @@ -1,2 +0,0 @@ -# API References - diff --git a/api-references/aquarius-rest-api.md b/api-references/aquarius-rest-api.md deleted file mode 100644 index 6696b832..00000000 --- a/api-references/aquarius-rest-api.md +++ /dev/null @@ -1,323 +0,0 @@ -# Aquarius REST API - -## Assets - -### **GET** `/api/aquarius/assets/ddo/` - -* Description - - Get DDO of a particular asset. -* Parameters - - | name | description | type | in | required | - | ----- | ---------------- | ------ | ---- | -------- | - | `did` | DID of the asset | string | path | true | -* Example - - ```bash - curl --location --request GET 'https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/did:op:cd086344c275bc7c560e91d472be069a24921e73a2c3798fb2b8caadf8d245d6' - ``` -* Responses - * 200 - * content-type: json - * description: On successful operation returns DDO information. - * 404 - * content-type: json - * description: This asset DID is not in ES. - * response body: - - ``` - { - "error": "Asset DID not found in Elasticsearch." - } - ``` - -### **GET** `/api/aquarius/assets/metadata/` - -* Description - - Get metadata of a particular asset. -* Parameters - - | name | description | type | in | required | - | ----- | ---------------- | ------ | ---- | -------- | - | `did` | DID of the asset | string | path | true | -* Example - - ```bash - curl --location --request GET 'https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/metadata/did:op:cd086344c275bc7c560e91d472be069a24921e73a2c3798fb2b8caadf8d245d6' - ``` -* Responses - * 200 - * content-type: json - * description: successful operation. 
- * 404 - * content-type: json - * description: This asset DID is not in ES. - * response body: - - ``` - { - "error": "Error encountered while retrieving metadata: NotFoundError(404, '{\"_index\":\"aquarius\",\"_type\":\"_doc\",\"_id\":\"\",\"found\":false}')." - } - ``` - -### **POST** `/api/aquarius/assets/names` - -* Description - - Get names of assets as specified in the payload. -* Parameters - - | name | description | type | in | required | - | --------- | ------------------ | ---- | ---- | -------- | - | `didList` | list of asset DIDs | list | body | true | -* Example - - ```bash - curl --location --request POST 'https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/names' \ - --header 'Content-Type: application/json' \ - --data-raw '{ - "didList" : ["did:op:cd086344c275bc7c560e91d472be069a24921e73a2c3798fb2b8caadf8d245d6"] - }' - ``` -* Responses - * 200 - * content-type: json - * description: successful operation. - * response body: - - ``` - {"did:op:cd086344c275bc7c560e91d472be069a24921e73a2c3798fb2b8caadf8d245d6": "Ocean CEX Aggregator: OHLC history for OCEAN/USDT "} - ``` - * 400 - * content-type: json - * description: The requested didList is empty or missing. - * response body: - - ``` - { - "error": "The requested didList can not be empty." - } - ``` - -### **POST** `/api/aquarius/assets/query` - -* Description - - Run a native ES query. Body must be a valid json object. -* Example - - ```bash - curl --location --request POST 'https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/query' \ - --header 'Content-Type: application/json' \ - --data-raw '{ - "query": { - "match_all": {} - } - }' - ``` -* Responses - * 200 - * content-type: json - * 500 - * description: elasticsearch exception - -### **POST** `/api/aquarius/assets/ddo/validate` - -* Description - - Validate DDO content. Consumes `application/octet-stream` -* Example - - ```bash - curl --location --request POST 'https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/validate' \ - --header 'Content-Type: application/json' \ - --data-raw '' - ``` -* Valid body - - ``` - { - "@context": ["https://w3id.org/did/v1"], - "id": "did:op:56c3d0ac76c02cc5cec98993be2b23c8a681800c08f2ff77d40c895907517280", - "version": "4.1.0", - "chainId": 1337, - "nftAddress": "0xabc", - "metadata": { - "created": "2000-10-31T01:30:00.000-05:00", - "updated": "2000-10-31T01:30:00.000-05:00", - "name": "Ocean protocol white paper", - "type": "dataset", - "description": "Ocean protocol white paper -- description", - "author": "Ocean Protocol Foundation Ltd.", - "license": "CC-BY", - "contentLanguage": "en-US", - "tags": ["white-papers"], - "additionalInformation": {"test-key": "test-value"}, - "links": [ - "http://data.ceda.ac.uk/badc/ukcp09/data/gridded-land-obs/gridded-land-obs-daily/", - "http://data.ceda.ac.uk/badc/ukcp09/data/gridded-land-obs/gridded-land-obs-averages-25km/", - "http://data.ceda.ac.uk/badc/ukcp09/" - ] - }, - "services": [ - { - "id": "test", - "type": "access", - "datatokenAddress": "0xC7EC1970B09224B317c52d92f37F5e1E4fF6B687", - "name": "Download service", - "description": "Download service", - "serviceEndpoint": "http://172.15.0.4:8030/", - "timeout": 0, - "files": "encryptedFiles" - } - ] - } - ``` -* Responses: - * 200 - * description: successful request. 
- * 400 - * description: Invalid DDO format - * 500 - * description: Error - -### **POST** `/api/aquarius/assets/triggerCaching` - -* Description - - Manually triggers DDO caching based on a transactionId containing either MetadataCreated or MetadataUpdated event(s). -* Parameters - - | name | description | type | in | required | - | --------------- | ------------------------------------------------- | ------ | ---- | -------- | - | `transactionId` | id of the transaction emitting the metadata event | string | path | true | - | `logIndex` | custom log index for the transaction | int | path | false | -* Example - - ```bash - curl --location --request POST 'https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/triggerCaching' \ - --header 'Content-Type: application/json' \ - --data-raw '' - ``` -* Valid body - - ``` - { - "transactionId": "0x945596edf2a26d127514a78ed94fea86b199e68e9bed8b6f6d6c8bb24e451f27", - "logIndex": 0 - } - ``` -* Responses: - * 200 - * description: triggering successful, updated asset returned - * 400 - * description: request issues: either log index not found, or neither of MetadataCreated, MetadataUpdated found in tx log - * 500 - * description: Error - -## Chains - -### **GET** `/api/aquarius/chains/list` - -* Description - - Get chains list -* Example - - ```bash - curl --location --request GET 'https://v4.aquarius.oceanprotocol.com/api/aquarius/chains/list' - ``` -* Response - * 200 - * Description: Successful request - * Body - - ``` - { "246": true, "3": true, "137": true, - "2021000": true, "4": true, "1": true, - "56": true, "80001": true, "1287": true - } - ``` - -### **GET** `/api/aquarius/chains/status/{chain_id}` - -* Description - - Get index status for a specific chain\_id -* Example - - ```bash - curl --location --request GET 'https://v4.aquarius.oceanprotocol.com/api/aquarius/chains/status/137' - ``` -* Response - * 200 - * Description: Successful request - * Body - - ``` - {"last_block": 25198729} - ``` - -## Others - -### **GET** `/` - -* Description - - Get version, plugin, and software information. -* Example - - ```bash - curl --location --request GET 'https://v4.aquarius.oceanprotocol.com/' - ``` -* Response - * 200 - * Description: Successful request - * Body - - ``` - { - "plugin": "elasticsearch", - "software": "Aquarius", - "version": "4.2.0" - } - ``` - -### **GET** `/health` - -* Description - - Get health status -* Example - - ```bash - curl --location --request GET 'https://v4.aquarius.oceanprotocol.com/health' - ``` -* Response - * 200 - * Description: Successful request - * Body - - ``` - Elasticsearch connected - ``` - -### **GET** /spec - -* Description - - Get swagger spec -* Example - - ```bash - curl --location --request GET 'https://v4.aquarius.oceanprotocol.com/spec' - ``` -* Response - * 200 - * Description: Successful request - -### Postman documentation - -Click [here](https://documenter.getpostman.com/view/2151723/UVkmQc7r) to explore the documentation and more examples in Postman. diff --git a/api-references/provider-rest-api.md b/api-references/provider-rest-api.md deleted file mode 100644 index 22178f1a..00000000 --- a/api-references/provider-rest-api.md +++ /dev/null @@ -1,633 +0,0 @@ -# Provider REST API - -## Ocean Provider Endpoints Specification - -This document specifies the endpoints for Ocean Provider to be implemented by the core developers. - -If you want to see the provider URLs for our supported networks, kindly -check the `Provider` component on -this [page](https://docs.oceanprotocol.com/core-concepts/networks). 
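Many of the endpoints below authenticate the caller with a `nonce` plus a `signature` over the request's key fields. As a hedged illustration only: the sketch below assumes the signed message is the concatenation of `documentId` and `nonce`, as the download endpoint's parameters suggest; verify the exact message layout against the Provider README before relying on it.

```python
# Hedged sketch: build the nonce + signature pair that Provider endpoints expect.
# ASSUMPTION: the signed message is documentId + nonce (personal-sign style);
# check the Provider README for the authoritative layout per endpoint.
from datetime import datetime, timezone

from eth_account import Account
from eth_account.messages import encode_defunct

PRIVATE_KEY = "0x..."            # placeholder consumer key
document_id = "did:op:0c1849..."  # placeholder DID

nonce = str(datetime.now(timezone.utc).timestamp())  # current UTC timestamp
message = encode_defunct(text=f"{document_id}{nonce}")
signed = Account.sign_message(message, private_key=PRIVATE_KEY)

params = {
    "documentId": document_id,
    "nonce": nonce,
    "signature": signed.signature.hex(),
    "consumerAddress": Account.from_key(PRIVATE_KEY).address,
}
```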
- -For inspecting the errors received from `Provider` and their reasons, please review this -[document](https://github.com/oceanprotocol/provider/blob/main/ocean_provider/routes/README.md). -### nonce endpoint - -#### GET /api/services/nonce - -Parameters - -``` - userAddress: String object containing a user's ethereum address -``` - -Returns: Json object containing the last-used nonce value. -The nonce endpoint is informative only; use the current UTC timestamp as a nonce -where other endpoints require one. - - -Example: - -``` -GET /api/services/nonce?userAddress=0x990922334 -``` - -Response: - -```json -{ - "nonce": 23 -} -``` - -### Encrypt endpoint - -#### POST /api/services/encrypt - -Body: binary application/octet-stream - -Returns: Bytes string containing the encrypted document. - -Example: - -``` -POST /api/services/encrypt -body: b'\xfd7zXZ\x00\x00\x04\xe6\xd6\xb4F\ ... \x00\x04YZ' -``` - -Response: - -``` -b'0x04b2bfab1f4e...7ed0573' -``` - -### Decrypt endpoint - -#### POST /api/services/decrypt - -Parameters - -``` - decrypterAddress: String object containing the address of the decrypter (required) - chainId: the chain id of the network the document is on (required) - transactionId: the transaction id of the encrypted document (optional) - dataNftAddress: the address of the data nft (optional) - encryptedDocument: the encrypted document (optional) - flags: the flags of the encrypted document (optional) - documentHash: the hash of the encrypted document (optional) - nonce: the nonce of the encrypted document (required) - signature: the signature of the encrypted document (required) -``` - -Returns: Bytes string containing the decrypted document. - -Example: - -``` -POST /api/services/decrypt -payload: { - 'decrypterAddress':'0xA78deb2Fa79463945C247991075E2a0e98Ba7A09' - 'chainId':8996 - 'dataNftAddress':'0xBD558814eE914800EbfeF4a1cbE196F5161823d9' - 'encryptedDocument':'0xfd377a585a0...f07afef7dc214' - 'flags': 1 - 'documentHash':'0x0cb38a7bba49758a86f8556642aff655d00e41da28240d5ea0f596b74094d91f' - 'nonce':'1644315615.24195' - 'signature':'0xd6f27047853203824ab9e5acef87d0a501a64aee93f33a83b6f91cbe8fb4489824defceaccde91273f41290cb2a0c15572368e8bea0b456c7a653659cad7de311b' -} -``` - -Response: - -``` -b'{"@context": ["https://w3id.org/did/v1"], "id": "did:op:0c184915b07b44c888d468be85a9b28253e80070e5294b1aaed81c ...' -``` - -### File info endpoint - -#### POST /api/services/fileinfo - -Retrieves Content-Type and Content-Length from the given URL or asset. - -Parameters - -For published assets: -``` -{ - did: String, DID of the dataset - serviceId: String, ID of the service -} -``` -For file objects, see https://docs.oceanprotocol.com/core-concepts/did-ddo#files - -If a checksum is requested, the file size must be lower than MAX_CHECKSUM_LENGTH (see Provider ENVs). -If the file is larger, the checksum WILL NOT be computed. - -Returns: Json document file info object - -Example: - -``` -POST /api/services/fileinfo -payload: -{ - "did":"0x1111", - "serviceId": "0" -} -``` - -Response: - -```json -[ - { - "contentLength":"1161", - "contentType":"application/json", - "index":0, - "valid": true - },... -] -``` - -### Initial service request endpoint - -#### GET /api/services/initialize - -Parameters - -``` - documentId: String object containing document id (e.g. 
a DID) - serviceId: String, ID of the service the datatoken is attached to - consumerAddress: String object containing consumer's address - environment: String representing a compute environment offered by the provider - validUntil: Integer, date of validity of the service (optional) - fileIndex: Integer, the index of the file from the files list in the dataset. If set, provider will validate the file access. (optional) -``` - -Returns: Json document with a quote for amount of tokens to transfer to the provider account. - -Example: - -``` -GET /api/services/initialize -payload: -{ - "documentId":"0x1111", - "serviceId": 0, - "consumerAddress":"0x990922334" -} -payload (with optional parameters): -{ - "documentId":"0x1111", - "serviceId": 0, - "consumerAddress":"0x990922334", - "validUntil": 1578004800, - "fileIndex": 1 -} -``` - -Response: - -```json -{ - "datatoken": "0x21fa3ea32892091...", - "nonce": 23, - "providerFee": { - "providerFeeAddress": "0xabc123...", - "providerFeeToken": "0xabc123...", - "providerFeeAmount": "200", - "providerData": "0xabc123...", - "v": 27, - "r": "0xabc123...", - "s": "0xabc123...", - "validUntil": 123456 - }, - "computeAddress": "0x8123jdf8sdsa..." -} -``` - -### Download endpoint - -#### GET /api/services/download - -Parameters - -``` - documentId: String object containing document id (e.g. a DID) - serviceId: String, ID of the service the datatoken is attached to - transferTxId: Hex string -- the id of on-chain transaction for approval of datatokens transfer - given to the provider's account - fileIndex: integer, the index of the file from the files list in the dataset - nonce: Nonce - consumerAddress: String object containing consumer's address - signature: String object containing user signature (signed message) -``` - -Returns: File stream. Retrieves the attached asset files. - -Example: - -``` -GET /api/services/download -payload: -{ - "documentId":"0x1111", - "serviceId": 0, - "fileIndex": 0, - "datatoken": "", - "consumerAddress":"0x990922334", - "signature":"0x00110011", - "transferTxId": "0xa09fc23421345532e34829" -} -``` - -Response: - -```json -{ - "": "" -} -``` - - -### Compute endpoints - -All compute endpoints respond with an Array of status objects, each object describing a compute job. - -Each status object will contain: - -``` - owner: The owner of this compute job - documentId: String object containing document id (e.g. 
a DID) - jobId: String object containing workflowId - dateCreated: Unix timestamp of job creation - dateFinished: Unix timestamp when job finished (null if job not finished) - status: Int, see below for list - statusText: String, see below - algorithmLogUrl: URL to get the algo log (for user) - resultsUrls: Array of URLs for algo outputs - resultsDid: If published, the DID -``` - -Status description (`statusText`): (see Operator-Service for full status list) - -| status | Description | -| ------ | ----------------------------- | -| 1 | Warming up | -| 10 | Job started | -| 20 | Configuring volumes | -| 30 | Provisioning success | -| 31 | Data provisioning failed | -| 32 | Algorithm provisioning failed | -| 40 | Running algorithm | -| 50 | Filtering results | -| 60 | Publishing results | -| 70 | Job completed | - -### Create new job or restart an existing stopped job - -#### POST /api/services/compute - -Start a new job - -Parameters - -``` - signature: String object containing user signature (signed message) (required) - consumerAddress: String object containing consumer's ethereum address (required) - nonce: Integer, Nonce (required) - environment: String representing a compute environment offered by the provider - dataset: Json object containing dataset information - dataset.documentId: String, object containing document id (e.g. a DID) (required) - dataset.serviceId: String, ID of the service the datatoken is attached to (required) - dataset.transferTxId: Hex string, the id of on-chain transaction for approval of datatokens transfer - given to the provider's account (required) - dataset.userdata: Json, user-defined parameters passed to the dataset service (optional) - algorithm: Json object, containing algorithm information - algorithm.documentId: Hex string, the did of the algorithm to be executed (optional) - algorithm.meta: Json object, defines the algorithm attributes and url or raw code (optional) - algorithm.serviceId: String, ID of the service to use to process the algorithm (optional) - algorithm.transferTxId: Hex string, the id of on-chain transaction of the order to use the algorithm (optional) - algorithm.userdata: Json, user-defined parameters passed to the algorithm running service (optional) - algorithm.algocustomdata: Json object, algorithm custom parameters (optional) - additionalDatasets: Json object containing a list of dataset objects (optional) - - One of `algorithm.documentId` or `algorithm.meta` is required; `algorithm.meta` takes precedence -``` - -Returns: Array of `status` objects as described above, in this case the array will have only one object - -Example: - -``` -POST /api/services/compute -payload: -{ - "signature": "0x00110011", - "consumerAddress": "0x123abc", - "nonce": 1, - "environment": "env", - "dataset": { - "documentId": "did:op:2222...", - "serviceId": "compute", - "transferTxId": "0x0232123..." - } -} -``` - -Response: - -```json -[ - { - "jobId": "0x1111:001", - "status": 1, - "statusText": "Job started", - ... 
- } -] -``` - -### Status and Result - -#### GET /api/services/compute - -Get all jobs and corresponding stats - -Parameters - -``` - signature: String object containing user signature (signed message) - documentId: String object containing document did (optional) - jobId: String object containing workflowID (optional) - consumerAddress: String object containing consumer's address (optional) - - At least one parameter from documentId, jobId and owner is required (can be any of them) -``` - -Returns - -Array of `status` objects as described above - -Example: - -``` -GET /api/services/compute?signature=0x00110011&documentId=did:op:1111&jobId=012023 -``` - -Response: - -```json -[ - { - "owner": "0x1111", - "documentId": "did:op:2222", - "jobId": "3333", - "dateCreated": "2020-10-01T01:00:00Z", - "dateFinished": "2020-10-01T01:00:00Z", - "status": 5, - "statusText": "Job finished", - "algorithmLogUrl": "http://example.net/logs/algo.log", - "resultsUrls": [ - "http://example.net/logs/output/0", - "http://example.net/logs/output/1" - ], - "resultsDid": "did:op:87bdaabb33354d2eb014af5091c604fb4b0f67dc6cca4d18a96547bffdc27bcf" - }, - { - "owner": "0x1111", - "documentId": "did:op:2222", - "jobId": "3334", - "dateCreated": "2020-10-01T01:00:00Z", - "dateFinished": "2020-10-01T01:00:00Z", - "status": 5, - "statusText": "Job finished", - "algorithmLogUrl": "http://example.net/logs2/algo.log", - "resultsUrls": [ - "http://example.net/logs2/output/0", - "http://example.net/logs2/output/1" - ], - "resultsDid": "" - } -] -``` - -#### GET /api/services/computeResult - -Allows download of a compute job result. - -Parameters - -``` - jobId: String object containing workflowId (optional) - index: Integer, index of the result to download (optional) - consumerAddress: String object containing consumer's address (optional) - nonce: Integer, Nonce (required) - signature: String object containing user signature (signed message) -``` - -Returns: Bytes string containing the compute result. - -Example: - -``` -GET /api/services/computeResult?index=0&consumerAddress=0xA78deb2Fa79463945C247991075E2a0e98Ba7A09&jobId=4d32947065bb46c8b87c1f7adfb7ed8b&nonce=1644317370 -``` - -Response: - -``` -b'{"result": "0x0000000000000000000000000000000000000000000000000000000000000001"}' -``` - -### Stop - -#### PUT /api/services/compute - -Stop a running compute job. - -Parameters - -``` - signature: String object containing user signature (signed message) - documentId: String object containing document did (optional) - jobId: String object containing workflowID (optional) - consumerAddress: String object containing consumer's address (optional) - - At least one parameter from documentId, jobId and owner is required (can be any of them) -``` - -Returns - -Array of `status` objects as described above - -Example: - -``` -PUT /api/services/compute?signature=0x00110011&documentId=did:op:1111&jobId=012023 -``` - -Response: - -```json -[ - { - ..., - "status": 7, - "statusText": "Job stopped", - ... - } -] -``` - -### Delete - -#### DELETE /api/services/compute - -Delete a compute job and all resources associated with the job. If the job is running, it will be stopped first. 
- -Parameters - -``` - signature: String object containing user signature (signed message) - documentId: String object containing document did (optional) - jobId: String object containing workflowId (optional) - consumerAddress: String object containing consumer's address (optional) - - At least one parameter from documentId, jobId is required (can be any of them) - in addition to consumerAddress and signature -``` - -Returns - -Array of `status` objects as described above - -Example: - -``` -DELETE /api/services/compute?signature=0x00110011&documentId=did:op:1111&jobId=012023 -``` - -Response: - -```json -[ - { - ..., - "status": 8, - "statusText": "Job deleted successfully", - ... - } -] -``` - -#### GET /api/services/computeEnvironments - -Lists the compute environments offered by the provider. - -Parameters - -``` -``` - -Returns: List of compute environments. - -Example: - -``` -GET /api/services/computeEnvironments -``` - -Response: - -```json -[ - { - "cpuType":"AMD Ryzen 7 5800X 8-Core Processor", - "currentJobs":0, - "desc":"This is a mocked environment", - "diskGB":2, - "gpuType":"AMD RX570", - "id":"ocean-compute", - "maxJobs":10, - "nCPU":2, - "nGPU":0, - "priceMin":2.3, - "ramGB":1 - }, - ... -] -``` - -### Authentication endpoints - -Provider offers an alternative to signing each request, by allowing users to generate auth tokens. -The generated auth token can be used until its expiration in all supported requests. -Simply omit the signature parameter and add the AuthToken request header based on a created token. - -Please note that if a signature parameter exists, it will take precedence over the AuthToken headers. -All routes that support a signature parameter support the replacement, with the exception of auth-related ones -(createAuthToken and deleteAuthToken need to be signed). - -#### GET /api/services/createAuthToken - -Allows the user to create an auth token. - -Parameters - -``` -address: String object containing consumer's address (optional) -nonce: Integer, Nonce (required) -signature: String object containing user signature (signed message) - The signature is based on hashing the following parameters: - address + nonce -expiration: valid future UTC timestamp (required) -``` - -Returns: -Created auth token. - -Example: - -``` -GET /api/services/createAuthToken?address=<address>&nonce=<nonce>&expiration=<expiration>&signature=<signature> -``` -Inside the angle brackets, the user should provide the valid values for the request. - -Response: - -``` -{"token": "eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJleHAiOjE2NjAwNTMxMjksImFkZHJlc3MiOiIweEE3OGRlYjJGYTc5NDYzOTQ1QzI0Nzk5MTA3NUUyYTBlOThCYTdBMDkifQ.QaRqYeSYxZpnFayzPmUkj8TORHHJ_vRY-GL88ZBFM0o"} -``` - - -#### DELETE /api/services/deleteAuthToken - -Allows the user to delete an existing auth token before it naturally expires. - -Parameters - -``` -address: String object containing consumer's address (optional) -nonce: Integer, Nonce (required) -signature: String object containing user signature (signed message) - The signature is based on hashing the following parameters: - address + nonce -token: token to be expired -``` - -Returns: -Success message if token is successfully deleted. -If the token is not found or already expired, returns an error message. - -Example: - -``` -DELETE /api/services/deleteAuthToken?address=<address>&nonce=<nonce>&token=<token>&signature=<signature> -``` -Inside the angle brackets, the user should provide the valid values for the request. 
- -Response: - -``` -{"success": "Token has been deactivated."} -``` - diff --git a/building-with-ocean/README.md b/building-with-ocean/README.md deleted file mode 100644 index 27a9126b..00000000 --- a/building-with-ocean/README.md +++ /dev/null @@ -1,2 +0,0 @@ -# Building with Ocean - diff --git a/building-with-ocean/build-a-marketplace/customising-your-market.md b/building-with-ocean/build-a-marketplace/customising-your-market.md deleted file mode 100644 index a403ec50..00000000 --- a/building-with-ocean/build-a-marketplace/customising-your-market.md +++ /dev/null @@ -1,182 +0,0 @@ ---- -title: Customising Market -order: 2 -hideLanguageSelector: true -description: Step by step guide to customizing your fork of Ocean market -featuredImage: images/creatures/mantaray/mantaray-full@2x.png ---- - -# Customizing your fork of Ocean market - -So you’ve got a fully functioning data marketplace at this point, which is pretty cool. But it doesn’t really look like your data marketplace. Right now, it’s still just a clone of Ocean Market — the same branding, name, logo, etc. The next few steps focus on personalizing your data marketplace. - -- Change your Market Name - -- Change the Logo - -- Change the Styling - -- Edit the Publish Form - -- Change the Fee Address - -- Build and host your Data Marketplace - -## Change your Market Name - -It’s now time to open up your favorite code editor and start getting stuck into the code. The first thing we will be doing is changing the name of your marketplace. A decent code editor (such as VS Code) makes this incredibly simple by searching and replacing all the places where the name appears. - -Let’s start by searching and replacing “Ocean Marketplace”. In VS Code there is a magnifying glass symbol in the left-hand panel (arrow 1 in the image below) that will open up the interface for searching and replacing text. Type “Ocean Marketplace” into the first textbox, and the name of your marketplace into the second textbox (arrow 2). To make things simple, there is a button to the right of the second textbox (arrow 3) that will replace all instances at once. You can take a moment to review all the text you’re changing if you wish, and then click this button. - -![Market Customisation](../../.gitbook/assets/market-customisation-3.png) - -Next up, we need to repeat the process but this time we’ll be searching and replacing “Ocean Market”. As you can see in the screenshot below, we have called our fork “Crypto Photos Market”. - -![Market Customisation](../../.gitbook/assets/market-customisation-4.png) - -![Market Customisation](../../.gitbook/assets/market-customisation-4.1.png) - -![Market Customisation](../../.gitbook/assets/market-customisation-4.2.jpg) - -Now let’s change the tagline of your site. Open up the folder called “content” and then open the file called “site.json”. - -![Market Customisation](../../.gitbook/assets/market-customisation-5.png) - -On line 3 in this file you can enter the tagline that you want for your marketplace. - -![Market Customisation](../../.gitbook/assets/market-customisation-6.png) - -![Market Customisation](../../.gitbook/assets/market-customisation-6.1.png) - -## Change the Logo - -The next important step to personalizing your marketplace is setting your own logo. We highly recommend using your logo in SVG format for this. The site logo is stored in the following location: - -``` -src/@images/logo.svg -``` - -Delete the “logo.svg” file from that folder and paste your own logo in the same folder. 
Then, if you rename your logo "logo.svg" everything will work without any problems. - -At this point, it's a good idea to check how things are looking. First check that you have saved all of your changes, then cancel the build that's running in your terminal (Ctrl + C or Cmd + C) and start it again with `npm start`. Once the build has finished, navigate to http://localhost:8000/ and see how things look. - -![Market Customisation](../../.gitbook/assets/market-customisation-7.1.png) - -Awesome! Our logo is looking great! - -## Change the Styling - -Hopefully, you like our pink and purple branding, but we don't expect you to keep it in your own marketplace. This step focuses on applying your own brand colors and styles. - -### Background - -Let's start with the background. Open up the following CSS file: - -``` -src/components/App/index.module.css -``` - -You'll notice that we set our "wave" background on line 3 of this file. Here, you'll want to use your own background color or image. For this example, we'll use an SVG background from [here](https://www.svgbackgrounds.com/). First, we save the new background image into the src/images/ folder (same folder as the logo), then we change the CSS to the file location of the new background (see line 3 in the image below). - -![Market Customisation](../../.gitbook/assets/market-customisation-8.png) - -If we save this file and view the site at this point, we get a white section at the top (see image below). And you'll also notice that the background doesn't fill all the way down to the bottom of the screen. - -![Market Customisation](../../.gitbook/assets/market-customisation-10.1.png) -![Market Customisation](../../.gitbook/assets/market-customisation-10.2.png) - -To fix this, we need to change the starting position of the background image and change it from no-repeat to repeat. We can do this on line 3. - -When we view our marketplace, we can see that the new background starts at the top and fills the whole page. Perfect! - -![Market Customisation](../../.gitbook/assets/market-customisation-11.1.png) - -## Brand Colors - -Next up, let's change the brand colors to match your individual style. Open up the following file: src/global/\_variables.css. Here you'll see the global style colors that are set. Now is the time to get creative, or consult your brand handbook (if you already have one). - -You can change these colors as much as you wish until you're happy with how everything looks. Each time you save your changes, the site will immediately update so you can see how things look. You can see the styles chosen for this example in the image below. - -![Market Customisation](../../.gitbook/assets/market-customisation-12.png) - -## Change Fonts - -The final part of the styling that we'll alter in this guide is the fonts. This is an important step because the font used in Ocean Market is one of the few elements of the market that are copyright protected. If you want to use the same font, you'll need to purchase a license. The other copyrighted elements are the logo and the name — which we have already changed. - -If you don't already have a brand font, head over to Google Fonts to pick some fonts that suit the brand you're trying to create. Google makes it nice and easy to see how they'll look, and it's simple to import them into your project. - -The global fonts are set in the same file as the colors; scroll down and you'll see them on lines 36 to 41. 
-
-If you are importing fonts, such as from Google Fonts, you need to make sure that you include the import statement at the top of the \_variables.css file.
-
-As with the color changes, it's a good idea to save the file with each change and check that the site is looking the way you expected it to. You can see our eclectic choices below.
-
-![Market Customisation](../../.gitbook/assets/market-customisation-13.png)
-
-## Customize the Publish Form
-
-Let's head to the publish page to see what it looks like with our new styling - so far, so good. But there is one major issue: the publish form is still telling people to publish datasets. On our new marketplace, we want people to publish and sell their photos, so we're going to have to make some changes here.
-
-![Market Customisation](../../.gitbook/assets/market-customisation-14.png)
-
-Open up the index.json file from content/publish/index.json - here we change the text to explain that this form is for publishing photos.
-
-![Market Customisation](../../.gitbook/assets/market-customisation-15.png)
-
-Additionally, the asset type currently says dataset, and we need to change this so that it says photo. The simplest way to do this is to change the title of the asset type without changing anything else. Ocean can handle selling any digital asset that can be accessed via a URL, so no further changes are needed to accommodate selling photos.
-
-Open up src/components/Publish/Metadata/index.tsx and change line 28 so that it says 'Photo'.
-
-![Market Customisation](../../.gitbook/assets/market-customisation-18.png)
-
-Great, now our publish page explains that users should be publishing photos, and photo is provided as an asset type option. We'll also leave algorithm as an option in case some data scientists want to do some analysis or image transformation on the photos.
-
-![Market Customisation](../../.gitbook/assets/market-customisation-19.png)
-
-There is one more thing that is fun to change before we move away from the publish form. You'll notice that Ocean Market V4 now has a cool SVG generation feature that creates the images for the Data NFT. It creates a series of pink waves. Let's change this so that it uses our brand colors in the waves!
-
-Open up /src/@utils/SvgWaves.ts and have a look at lines 27 to 30 where the colors are specified. Currently, the pink color is the one used in the SVG generator. You can replace this with your own brand color:
-
-![Market Customisation](../../.gitbook/assets/market-customisation-21.png)
-
-If you're interested in doing some further customization, take a look at lines 53 to 64. You can change these properties to alter how the image looks. Feel free to play around with it. We've increased the number of layers from 4 to 5.
-
-![Market Customisation](../../.gitbook/assets/market-customisation-22.png)
-
-And now your customized publish page is ready for your customers:
-
-![Market Customisation](../../.gitbook/assets/market-customisation-20.png)
-
-## Change the Fee Address
-
-At this point, we have made a lot of changes and hopefully you're happy with the way that your marketplace is looking. Given that you now have your own awesome photo marketplace, it's about time we talked about monetizing it. Yup, that's right - you will earn a commission when people buy and sell photos in your marketplace. In Ocean V4, there are a whole host of new fees and customization options that you can use (read more about that here).
-
-When someone sets the pricing for their photos in your marketplace, they are informed that a commission will be sent to the owner of the marketplace. You'll see that, at the moment, this fee is set to zero, so you'll want to increase that. In order to receive the fees, you'll need to set the address where you want to receive them.
-
-![Market Customisation](../../.gitbook/assets/market-customisation-23.png)
-
-This important step is the last thing that we will change in this guide. To set the marketplace fees and address, you'll need to save them as environment variables. Create a new file called .env in the root of your repository.
-
-Copy and paste the following into the file:
-
-```
-
-NEXT_PUBLIC_MARKET_FEE_ADDRESS="0x123abc"
-NEXT_PUBLIC_PUBLISHER_MARKET_ORDER_FEE="0.01"
-NEXT_PUBLIC_PUBLISHER_MARKET_FIXED_SWAP_FEE="0.01"
-NEXT_PUBLIC_CONSUME_MARKET_ORDER_FEE="0.01"
-NEXT_PUBLIC_CONSUME_MARKET_FIXED_SWAP_FEE="0.01"
-
-```
-
-You need to replace "0x123abc" with your Ethereum address (this is where the fees will be sent) and alter the fees to the levels that you intend them to be at. If you change your mind, these fees can always be altered later.
-
-Go to the [Fees page](https://docs.oceanprotocol.com/concepts/fees/) for more details about each type of fee and its relevance.
-
-It is important that the file is saved in the right place at the root of your repository; your file structure should look the same as below.
-
-![Market Customisation](../../.gitbook/assets/market-customisation-24.png)
-
-And that's it: you now have a fully functioning photo marketplace that operates over the blockchain. Every time someone uses it, you will receive revenue.
-
-![Market Customisation](../../.gitbook/assets/market-customisation-25.png)
diff --git a/building-with-ocean/compute-to-data/compute-to-data-algorithms.md b/building-with-ocean/compute-to-data/compute-to-data-algorithms.md
deleted file mode 100644
index 4d35907a..00000000
--- a/building-with-ocean/compute-to-data/compute-to-data-algorithms.md
+++ /dev/null
@@ -1,235 +0,0 @@
----
-title: Writing Algorithms for Compute to Data
-description: Learn how to write algorithms for use in Ocean Protocol's Compute-to-Data feature.
----
-
-## Overview
-
-An algorithm in the Ocean Protocol stack is another asset type, in addition to data sets. An algorithm for Compute to Data is composed of the following:
-
-- the algorithm code
-- a Docker image (base image + tag)
-- an entry point
-
-## Environment
-
-When creating an algorithm asset in Ocean Protocol, the additional `algorithm` object needs to be included in its metadata service to define the Docker container environment:
-
-```json
-{
-  "algorithm": {
-    "container": {
-      "entrypoint": "node $ALGO",
-      "image": "node",
-      "tag": "latest"
-    }
-  }
-}
-```
-
-| Variable | Usage |
-| ------------ | --------------------------------------------------------------------------------------------------------------------------------------- |
-| `image` | The Docker image name the algorithm will run with. |
-| `tag` | The Docker image tag that you are going to use. |
-| `entrypoint` | The Docker entrypoint. `$ALGO` is a macro that gets replaced inside the compute job, depending on where your algorithm code is downloaded. |
-
-Define your entrypoint according to your dependencies. E.g. if you have multiple versions of Python installed, use the appropriate command, such as `python3.6 $ALGO`.
-
-### What Docker container should I use?
-
-There are plenty of Docker containers that work out-of-the-box. However, if you have custom dependencies, you may want to configure your own Docker image.
-To do so, create a Dockerfile with the appropriate instructions for dependency management and publish the container, e.g. on Dockerhub.
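-
-As an illustrative sketch, a Dockerfile for a Python algorithm that needs `pandas` and `numpy` (placeholder dependencies; swap in your own) could be as simple as:
-
-```
-FROM python:3.9-slim
-RUN pip install --no-cache-dir pandas numpy
-# No ENTRYPOINT is needed here: the entrypoint is defined in the asset's
-# `algorithm.container` metadata, e.g. "python3.9 $ALGO"
-```
-
-You would then build and push this image to a registry such as Dockerhub, and reference its name and tag in the `container` object shown above.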
-
-We also collect some [example images](https://github.com/oceanprotocol/algo_dockers) which you can also view on Dockerhub.
-
-When publishing an algorithm through the [Ocean Market](https://market.oceanprotocol.com), these properties can be set via the publish UI.
-
-### Environment Examples
-
-Run an algorithm written in JavaScript/Node.js, based on Node.js v14:
-
-```json
-{
-  "algorithm": {
-    "container": {
-      "entrypoint": "node $ALGO",
-      "image": "node",
-      "tag": "14"
-    }
-  }
-}
-```
-
-Run an algorithm written in Python, based on Python v3.9:
-
-```json
-{
-  "algorithm": {
-    "container": {
-      "entrypoint": "python3.9 $ALGO",
-      "image": "python",
-      "tag": "3.9.4-alpine3.13"
-    }
-  }
-}
-```
-
-### Data Storage
-
-As part of a compute job, every algorithm runs in a K8s pod with these volumes mounted:
-
-| Path | Permissions | Usage |
-| --------------- | ----------- | --------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| `/data/inputs` | read | Storage for input data sets, accessible only to the algorithm running in the pod. Contents will be the files themselves, inside indexed folders e.g. `/data/inputs/{did}/{service_id}`. |
-| `/data/ddos` | read | Storage for all DDOs involved in the compute job (input data set + algorithm). Contents will be JSON files containing the DDO structure. |
-| `/data/outputs` | read/write | Storage for all of the algorithm's output files. They are uploaded to some form of cloud storage, and URLs are sent back to the consumer. |
-| `/data/logs/` | read/write | All algorithm output (such as `print`, `console.log`, etc.) is stored in a file located in this folder. They are stored and sent to the consumer as well. |
-
-Please note that when using local Providers or Metadata Caches, the DDOs might not be correctly transferred into C2D, but inputs are still available.
-If your algorithm relies on contents from the DDO JSON structure, make sure to use a public Provider and Metadata Cache (Aquarius instance).
-
-### Environment variables available to algorithms
-
-For every algorithm pod, the Compute to Data environment provides the following environment variables:
-
-| Variable | Usage |
-| -------------------- | ------------------------------------------------------ |
-| `DIDS` | An array of DID strings containing the input datasets. |
-| `TRANSFORMATION_DID` | The DID of the algorithm. |
-
-## Example: JavaScript/Node.js
-
-The following is a simple JavaScript/Node.js algorithm that does a line count for ALL input datasets. The algorithm does not use any environment variables; instead, it scans the `/data/inputs` folder.
-
-```js
-const fs = require('fs')
-
-const inputFolder = '/data/inputs'
-const outputFolder = '/data/outputs'
-
-async function countrows(file) {
-  console.log('Start counting for ' + file)
-  const fileBuffer = fs.readFileSync(file)
-  const toString = fileBuffer.toString()
-  const splitLines = toString.split('\n')
-  const rows = splitLines.length - 1
-  fs.appendFileSync(outputFolder + '/output.log', file + ',' + rows + '\r\n')
-  console.log('Finished. We have ' + rows + ' lines')
-}
-
-async function processfolder(folder) {
-  const files = fs.readdirSync(folder)
-
-  for (let i = 0; i < files.length; i++) {
-    const file = files[i]
-    const fullpath = folder + '/' + file
-    if (fs.statSync(fullpath).isDirectory()) {
-      await processfolder(fullpath)
-    } else {
-      await countrows(fullpath)
-    }
-  }
-}
-
-processfolder(inputFolder)
-```
-
-This snippet will create and expose the following files as compute job results to the consumer:
-
-- `/data/outputs/output.log`
-- `/data/logs/algo.log`
-
-To run this, use the following container object:
-
-```json
-{
-  "algorithm": {
-    "container": {
-      "entrypoint": "node $ALGO",
-      "image": "node",
-      "tag": "12"
-    }
-  }
-}
-```
-
-## Example: Python
-
-A more advanced line-counting example in Python, which relies on environment variables and constructs a job object containing all the input files & DDOs:
-
-```python
-import pandas as pd
-import numpy as np
-import os
-import time
-import json
-
-def get_job_details():
-    """Reads in metadata information about assets used by the algo"""
-    job = dict()
-    job['dids'] = json.loads(os.getenv('DIDS', None))
-    job['metadata'] = dict()
-    job['files'] = dict()
-    job['algo'] = dict()
-    job['secret'] = os.getenv('secret', None)
-    algo_did = os.getenv('TRANSFORMATION_DID', None)
-    if job['dids'] is not None:
-        for did in job['dids']:
-            # get the ddo from disk
-            filename = '/data/ddos/' + did
-            print(f'Reading json from {filename}')
-            with open(filename) as json_file:
-                ddo = json.load(json_file)
-                # search for metadata service
-                for service in ddo['service']:
-                    if service['type'] == 'metadata':
-                        job['files'][did] = list()
-                        index = 0
-                        for file in service['attributes']['main']['files']:
-                            job['files'][did].append(
-                                '/data/inputs/' + did + '/' + str(index))
-                            index = index + 1
-    if algo_did is not None:
-        job['algo']['did'] = algo_did
-        job['algo']['ddo_path'] = '/data/ddos/' + algo_did
-    return job
-
-
-def line_counter(job_details):
-    """Executes the line counter based on inputs"""
-    print('Starting compute job with the following input information:')
-    print(json.dumps(job_details, sort_keys=True, indent=4))
-
-    """ Now, count the lines of the first file in first did """
-    first_did = job_details['dids'][0]
-    filename = job_details['files'][first_did][0]
-    non_blank_count = 0
-    with open(filename) as infp:
-        for line in infp:
-            if line.strip():
-                non_blank_count += 1
-    print('number of non-blank lines found %d' % non_blank_count)
-    """ Print that number to output to generate algo output"""
-    f = open("/data/outputs/result", "w")
-    f.write(str(non_blank_count))
-    f.close()
-
-
-if __name__ == '__main__':
-    line_counter(get_job_details())
-
-```
-
-To run this algorithm, use the following `container` object:
-
-```json
-{
-  "algorithm": {
-    "container": {
-      "entrypoint": "python3.6 $ALGO",
-      "image": "oceanprotocol/algo_dockers",
-      "tag": "python-sql"
-    }
-  }
-}
-```
diff --git a/building-with-ocean/compute-to-data/compute-to-data-architecture.md b/building-with-ocean/compute-to-data/compute-to-data-architecture.md
deleted file mode 100644
index 4f6a849e..00000000
--- a/building-with-ocean/compute-to-data/compute-to-data-architecture.md
+++ /dev/null
@@ -1,75 +0,0 @@
----
-title: Compute-to-Data
-description: Architecture overview
----
-
-# Architecture
-
-### Architecture Overview
-
-Here's the sequence diagram for starting a new compute job.
-
-![Sequence Diagram for computing services](../images/Starting%20New%20Compute%20Job.png)
-
-The Consumer calls the Provider with `start(did, algorithm, additionalDIDs)`. It returns a job id `XXXX`. The Provider oversees the rest of the work. At any point, the Consumer can query the Provider for the job status via `getJobDetails(XXXX)`.
-
-Here's how the Provider works. First, it ensures that the Consumer has sent the appropriate datatokens to get access. Then, it asks the Operator-Service (a microservice) to start the job, which passes on the request to the Operator-Engine (the actual compute system). The Operator-Engine runs Kubernetes compute jobs as needed, and reports to the Operator-Service when the job has finished.
-
-Here are the actors/components:
-
-* Consumers - The end users who need to use some computing services offered by the same Publisher as the data Publisher.
-* Operator-Service - Microservice that handles the compute requests.
-* Operator-Engine - The computing system where the compute will be executed.
-* Kubernetes - a K8s cluster
-
-Before the flow can begin, these pre-conditions must be met:
-
-* The Asset DDO has a `compute` service.
-* The Asset DDO compute service must permit algorithms to run on it.
-* The Asset DDO must specify an Ocean Provider endpoint exposed by the Publisher.
-
-### Access Control using Ocean Provider
-
-As with [the `access` service](../../core-concepts/architecture.md#data-nfts-datatokens-and-access-control-tools), the `compute` service requires the **Ocean Provider** as a component handled by Publishers. Ocean Provider is in charge of interacting with users and managing the basics of a Publisher's infrastructure to integrate this infrastructure into Ocean Protocol. The direct interaction with the infrastructure where the data resides happens through this component only.
-
-Ocean Provider includes the credentials to interact with the infrastructure (initially in cloud providers, but it could be on-premise).
-
-### Compute-to-Data Environment
-
-#### Operator Service
-
-The **Operator Service** is a micro-service in charge of managing the workflows and executing the requests.
-
-The main responsibilities are:
-
-* Expose an HTTP API allowing for the execution of data access and compute endpoints.
-* Interact with the infrastructure (cloud/on-premise) using the Publisher's credentials.
-* Start/stop/execute computing instances with the algorithms provided by users.
-* Retrieve the logs generated during executions.
-
-Typically the Operator Service is integrated with Ocean Provider, but it can be called independently of it.
-
-The Operator Service is in charge of establishing the communication with the K8s cluster, allowing it to:
-
-* Register new compute jobs
-* List the current compute jobs
-* Get a detailed result for a given job
-* Stop a running job
-
-The Operator Service doesn't provide any storage capability; all the state is stored directly in the K8s cluster.
-
-#### Operator Engine
-
-The **Operator Engine** is in charge of orchestrating the compute infrastructure using Kubernetes as the backend, where each compute job runs in an isolated [Kubernetes Pod](https://kubernetes.io/docs/concepts/workloads/pods/). Typically the Operator Engine retrieves the workflows created by the Operator Service in Kubernetes, and manages the infrastructure necessary to complete the execution of the compute workflows.
-
-The Operator Engine is in charge of retrieving all the workflows registered in a K8s cluster, allowing it to:
-
-* Orchestrate the flow of the execution
-* Start the configuration pod in charge of downloading the workflow dependencies (datasets and algorithms)
-* Start the pod including the algorithm to execute
-* Start the publishing pod that publishes the new assets created in the Ocean Protocol network
-
-The Operator Engine doesn't provide any storage capability; all the state is stored directly in the K8s cluster.
-
-#### Pod: Configuration
-
-#### Pod: Publishing
diff --git a/building-with-ocean/compute-to-data/compute-to-data-datasets-algorithms.md b/building-with-ocean/compute-to-data/compute-to-data-datasets-algorithms.md
deleted file mode 100644
index 2191d0d6..00000000
--- a/building-with-ocean/compute-to-data/compute-to-data-datasets-algorithms.md
+++ /dev/null
@@ -1,23 +0,0 @@
----
-title: Compute-to-Data
-description: Datasets and Algorithms
----
-
-# Datasets & Algorithms
-
-With Compute-to-Data, datasets are not allowed to leave the premises of the data holder; only algorithms can be permitted to run on them, under certain conditions, within an isolated and secure environment. Algorithms are an asset type just like datasets and can be priced in the same way.
-
-Algorithms can be public or private by setting the `"attributes.main.type"` value in the DDO as follows:
-
-* `"access"` - public. The algorithm can be downloaded, given the appropriate datatoken.
-* `"compute"` - private. The algorithm is only available to use as part of a compute job, without any way to download it. The algorithm must be published on the same Ocean Provider as the dataset it's targeted to run on.
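-
-For example, a private algorithm would carry a DDO fragment like the following (a minimal sketch; all surrounding DDO fields are omitted):
-
-```json
-{
-  "attributes": {
-    "main": {
-      "type": "compute"
-    }
-  }
-}
-```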
-
-For each dataset, publishers can choose to allow various permission levels for algorithms to run:
-
-* allow selected algorithms, referenced by their DID
-* allow all algorithms published within a network or marketplace
-* allow raw algorithms, for advanced use cases circumventing algorithm as an asset type, but most prone to data escape
-
-All implementations should set permissions to private by default: upon publishing a compute dataset, no algorithms should be allowed to run on it. This is to prevent data escape by a rogue algorithm written in a way to extract all data from a dataset.
diff --git a/building-with-ocean/deploying-components/README.md b/building-with-ocean/deploying-components/README.md
deleted file mode 100644
index c3cbfe3a..00000000
--- a/building-with-ocean/deploying-components/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-# Deploying Components
-
diff --git a/building-with-ocean/deploying-components/deploying-aquarius.md b/building-with-ocean/deploying-components/deploying-aquarius.md
deleted file mode 100644
index 7b6fbf40..00000000
--- a/building-with-ocean/deploying-components/deploying-aquarius.md
+++ /dev/null
@@ -1,204 +0,0 @@
-# Deploying Aquarius
-
-### About Aquarius
-
-Aquarius is an off-chain component which caches the asset metadata published on-chain. By deploying their own Aquarius instance, developers can control which assets are visible in their marketplace. For example, having a custom Aquarius instance allows only assets from specific addresses to be visible on the marketplace. This tutorial provides the steps to deploy Aquarius. Ocean Protocol provides Aquarius Docker images which can be viewed [here](https://hub.docker.com/r/oceanprotocol/aquarius/tags). Visit [this](https://github.com/oceanprotocol/aquarius) page to view the Aquarius source code.
-
-Aquarius consists of two parts:\
-\- **API:** The Aquarius API offers a convenient way to access the metadata without scanning the chain yourself.\
-\- **Event monitor:** Aquarius continually monitors the chains for MetadataCreated and MetadataUpdated events, processes these events and adds them to the database.
-
-### Prerequisites
-
-* A server for hosting Aquarius. See [this guide](setup-server.md) on creating a server.
-* Docker and Docker Compose are installed. Click [here](https://docs.docker.com/engine/install/) to view the guide on installing Docker.
-* [Obtain an API key](../using-ocean-libraries/configuration.md#obtaining-api-key-for-ethereum-node-provider)
-
-### Create a working directory
-
-```
-mkdir Aquarius
-cd Aquarius
-```
-
-### Create a \`.env\` file
-
-Copy the below content into the \`.env\` file and edit the values as needed.
-
-{% code title=".env" %}
-```
-# check the available versions: https://hub.docker.com/repository/docker/oceanprotocol/aquarius
-AQUARIUS_VERSION=latest
-ALLOWED_PUBLISHERS='[""]'
-# Elasticsearch credentials
-DB_USERNAME=username
-DB_PASSWORD=password
-
-# Replace the below values with the API provider of your choice
-EVENTS_RPC_POLYGON=
-EVENTS_RPC_MAINNET=
-```
-{% endcode %}
-
-### Create docker-compose file
-
-{% code title="docker-compose.yml" %}
-```yaml
-version: '3'
-services:
-  elasticsearch:
-    image: elasticsearch:6.8.17
-    container_name: elasticsearch
-    restart: on-failure
-    environment:
-      ES_JAVA_OPTS: "-Xms512m -Xmx512m"
-      MAX_MAP_COUNT: "64000"
-      discovery.type: "single-node"
-    volumes:
-      - data:/usr/share/elasticsearch/data
-    ports:
-      - 9200:9200
-    networks:
-      - ocean_backend
-  aquarius:
-    image: oceanprotocol/aquarius:${AQUARIUS_VERSION}
-    container_name: aquarius
-    restart: on-failure
-    ports:
-      - 5000:5000
-    networks:
-      - ocean_backend
-    depends_on:
-      - elasticsearch
-    environment:
-      DB_MODULE: elasticsearch
-      DB_HOSTNAME: elasticsearch
-      DB_PORT: 9200
-      DB_USERNAME: ${DB_USERNAME}
-      DB_PASSWORD: ${DB_PASSWORD}
-      DB_NAME: aquarius
-      DB_SCHEME: http
-      DB_SSL: "false"
-      LOG_LEVEL: "DEBUG"
-      AQUARIUS_BIND_URL: "http://0.0.0.0:5000"
-      AQUARIUS_WORKERS: "8"
-      RUN_AQUARIUS_SERVER: "1"
-      AQUARIUS_CONFIG_FILE: "config.ini"
-      EVENTS_ALLOW: 0
-      RUN_EVENTS_MONITOR: 0
-      ALLOWED_PUBLISHERS: ${ALLOWED_PUBLISHERS}
-volumes:
-  data:
-    driver: local
-networks:
-  ocean_backend:
-    driver: bridge
-```
-{% endcode %}
-
-### Create events monitor docker compose file
-
-{% tabs %}
-{% tab title="Events monitor - Mainnet" %}
-{% code title="docker-compose-events-mainnet.yml" %}
-```yaml
-version: '3'
-services:
-  aquarius-events-mainnet:
-    image: oceanprotocol/aquarius:${AQUARIUS_VERSION}
-    container_name: aquarius-events-mainnet
-    restart: on-failure
-    networks:
-      - ocean_backend
-    depends_on:
-      - elasticsearch
-    environment:
-      DB_MODULE: elasticsearch
-      DB_HOSTNAME: elasticsearch
-      DB_PORT: 9200
-      DB_USERNAME: ${DB_USERNAME}
-      DB_PASSWORD: ${DB_PASSWORD}
-      DB_NAME: aquarius
-      DB_SCHEME: http
-      DB_SSL: "false"
-      LOG_LEVEL: "DEBUG"
-      AQUARIUS_BIND_URL: "http://0.0.0.0:5000"
-      AQUARIUS_WORKERS: "1"
-      RUN_AQUARIUS_SERVER: "0"
-      AQUARIUS_CONFIG_FILE: "config.ini"
-      NETWORK_NAME: "mainnet"
-      EVENTS_RPC: ${EVENTS_RPC_MAINNET}
-      METADATA_UPDATE_ALL: "0"
-      OCEAN_ADDRESS: "0x967da4048cD07aB37855c090aAF366e4ce1b9F48"
-      EVENTS_ALLOW: 0
-      RUN_EVENTS_MONITOR: 1
-      BLOCKS_CHUNK_SIZE: "5000"
-volumes:
-  data:
-    driver: local
-networks:
-  ocean_backend:
-    driver: bridge
-```
-{% endcode %}
-{% endtab %}
-
-{% tab title="Events monitor - Polygon" %}
-{% code title="docker-compose-events-polygon.yml" %}
title="docker-compose-events-ploygon.yml" %} -```yaml -version: '3' -services: - aquarius-events-polygon: - image: oceanprotocol/aquarius:${AQUARIUS_VERSION} - container_name: aquarius-events-polygon - restart: on-failure - networks: - - ocean_backend - depends_on: - - elasticsearch - environment: - DB_MODULE: elasticsearch - DB_HOSTNAME: elasticsearch - DB_PORT: 9200 - DB_USERNAME: ${DB_USERNAME} - DB_PASSWORD: ${DB_PASSWORD} - DB_NAME: aquarius - DB_SCHEME: http - DB_SSL : "false" - LOG_LEVEL: "DEBUG" - AQUARIUS_BIND_URL: "http://0.0.0.0:5000" - AQUARIUS_WORKERS : "1" - RUN_AQUARIUS_SERVER : "0" - AQUARIUS_CONFIG_FILE: "config.ini" - NETWORK_NAME: "polygon" - EVENTS_RPC: ${EVENTS_RPC_POLYGON} - METADATA_UPDATE_ALL: "0" - OCEAN_ADDRESS: "0x282d8efCe846A88B159800bd4130ad77443Fa1A1" - EVENTS_ALLOW: 0 - RUN_EVENTS_MONITOR: 1 - METADATA_CONTRACT_ADDRESS: "0x80E63f73cAc60c1662f27D2DFd2EA834acddBaa8" - BLOCKS_CHUNK_SIZE: "5000" -volumes: - data: - driver: local -networks: - ocean_backend: - driver: bridge -``` -{% endcode %} -{% endtab %} -{% endtabs %} - -### Start Aquarius - -``` -docker-compose \ --f docker-compose.yml \ --f docker-compose-events-mainnet.yml \ --f docker-compose-events-polygon.yml \ ---env-file .env \ --d \ -up -``` - -After pulling all the asset metadata from the blockchain, Aquarius can be used to query the assets using Elasticsearch query. Aquarius REST API are documented here. diff --git a/building-with-ocean/deploying-components/deploying-marketplace.md b/building-with-ocean/deploying-components/deploying-marketplace.md deleted file mode 100644 index 249aea08..00000000 --- a/building-with-ocean/deploying-components/deploying-marketplace.md +++ /dev/null @@ -1,62 +0,0 @@ -# Deploying Marketplace - -### Prerequisites - -* A server for hosting Ocean Marketplace. See [this guide](setup-server.md) on creating a server. - -#### Create a directory - -``` -mkdir my-marketplace -cd my-marketplace -``` - -### Create file with name \`.env\` - -Copy the below content into the \`.env\` file. - -{% code title=".env" %} -``` -# Update this value if Market should using custom Aquarius -NEXT_PUBLIC_METADATACACHE_URI=https://v4.aquarius.oceanprotocol.com - -#NEXT_PUBLIC_INFURA_PROJECT_ID="xxx" -#NEXT_PUBLIC_MARKET_FEE_ADDRESS="0xxx" -#NEXT_PUBLIC_PUBLISHER_MARKET_ORDER_FEE="1" -#NEXT_PUBLIC_CONSUME_MARKET_ORDER_FEE="1" -#NEXT_PUBLIC_CONSUME_MARKET_FIXED_SWAP_FEE="1" - -# -# ADVANCED SETTINGS -# - -# Toggle pricing options presented during price creation -#NEXT_PUBLIC_ALLOW_FIXED_PRICING="true" -#NEXT_PUBLIC_ALLOW_FREE_PRICING="true" - -# Privacy Preference Center -#NEXT_PUBLIC_PRIVACY_PREFERENCE_CENTER="true" -``` -{% endcode %} - -#### Create a \`Dockerfile\` file and copy the below content into it. - -{% code title="Dockerfile" %} -``` -FROM node:16 -RUN git clone https://github.com/oceanprotocol/market.git /usr/app/market -WORKDIR /usr/app/market -RUN npm ci --legacy-peer-deps -RUN npm run build -EXPOSE 3000 -CMD ["npx", "next", "start"] -``` -{% endcode %} - -Build a docker image - -```bash -docker build . 
diff --git a/building-with-ocean/deploying-components/deploying-marketplace.md b/building-with-ocean/deploying-components/deploying-marketplace.md
deleted file mode 100644
index 249aea08..00000000
--- a/building-with-ocean/deploying-components/deploying-marketplace.md
+++ /dev/null
@@ -1,62 +0,0 @@
-# Deploying Marketplace
-
-### Prerequisites
-
-* A server for hosting Ocean Marketplace. See [this guide](setup-server.md) on creating a server.
-
-#### Create a directory
-
-```
-mkdir my-marketplace
-cd my-marketplace
-```
-
-### Create a file named \`.env\`
-
-Copy the below content into the \`.env\` file.
-
-{% code title=".env" %}
-```
-# Update this value if Market should use a custom Aquarius
-NEXT_PUBLIC_METADATACACHE_URI=https://v4.aquarius.oceanprotocol.com
-
-#NEXT_PUBLIC_INFURA_PROJECT_ID="xxx"
-#NEXT_PUBLIC_MARKET_FEE_ADDRESS="0xxx"
-#NEXT_PUBLIC_PUBLISHER_MARKET_ORDER_FEE="1"
-#NEXT_PUBLIC_CONSUME_MARKET_ORDER_FEE="1"
-#NEXT_PUBLIC_CONSUME_MARKET_FIXED_SWAP_FEE="1"
-
-#
-# ADVANCED SETTINGS
-#
-
-# Toggle pricing options presented during price creation
-#NEXT_PUBLIC_ALLOW_FIXED_PRICING="true"
-#NEXT_PUBLIC_ALLOW_FREE_PRICING="true"
-
-# Privacy Preference Center
-#NEXT_PUBLIC_PRIVACY_PREFERENCE_CENTER="true"
-```
-{% endcode %}
-
-#### Create a \`Dockerfile\` file and copy the below content into it.
-
-{% code title="Dockerfile" %}
-```
-FROM node:16
-RUN git clone https://github.com/oceanprotocol/market.git /usr/app/market
-WORKDIR /usr/app/market
-RUN npm ci --legacy-peer-deps
-RUN npm run build
-EXPOSE 3000
-CMD ["npx", "next", "start"]
-```
-{% endcode %}
-
-Build a Docker image:
-
-```bash
-docker build . -f Dockerfile -t market:latest
-```
-
-### Start the marketplace
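-
-As a sketch, one way to run the image built above (flags are illustrative; adjust ports and env vars to your setup). The Dockerfile exposes port 3000, so:
-
-```bash
-# Run the marketplace container, passing the .env file created earlier
-docker run -d --env-file .env -p 3000:3000 market:latest
-```
-
-The marketplace should then be reachable at http://localhost:3000.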
diff --git a/building-with-ocean/deploying-components/deploying-ocean-subgraph.md b/building-with-ocean/deploying-components/deploying-ocean-subgraph.md
deleted file mode 100644
index 3d531cb9..00000000
--- a/building-with-ocean/deploying-components/deploying-ocean-subgraph.md
+++ /dev/null
@@ -1,83 +0,0 @@
-# Deploying Ocean Subgraph
-
-### About Ocean subgraph
-
-Ocean subgraph allows querying datatoken, data NFT, and all event information using GraphQL. Hosting the Ocean subgraph saves the cost and time otherwise required to query the data directly from the blockchain. The steps in this tutorial explain how to host the Ocean subgraph for the EVM-compatible chains supported by Ocean Protocol.
-
-### Prerequisites
-
-* A server for hosting Ocean subgraph. See [this guide](setup-server.md) on creating a server.
-* Docker and Docker Compose are installed. Click [here](https://docs.docker.com/engine/install/) to view the guide on installing Docker.
-* [Obtain an API key](../using-ocean-libraries/configuration.md#obtaining-api-key-for-ethereum-node-provider)
-
-### Create a working directory
-
-```
-mkdir ocean-subgraph
-cd ocean-subgraph
-```
-
-### Create a \`.env\` file
-
-Copy the below content into the \`.env\` file and edit the values as needed.
-
-{% code title=".env" %}
-```
-ETHEREUM_NODE_PROVIDER_API='mumbai:https://polygon-mumbai.infura.io/v3/${INFURA_PROJECT_ID}'
-```
-{% endcode %}
-
-### Create docker-compose file
-
-{% code title="docker-compose.yml" %}
-```yaml
-version: '3'
-services:
-  graph-node:
-    image: graphprotocol/graph-node:v0.26.0
-    ports:
-      - '9000:8000'
-      - '8001:8001'
-      - '8020:8020'
-      - '8030:8030'
-      - '8040:8040'
-    depends_on:
-      - ipfs
-      - postgres
-    environment:
-      postgres_host: postgres
-      postgres_user: graph-node
-      postgres_pass: let-me-in
-      postgres_db: graph-node
-      ipfs: 'ipfs:5001'
-      ethereum: ${ETHEREUM_NODE_PROVIDER_API}
-      RUST_LOG: info
-  ipfs:
-    image: ipfs/go-ipfs:v0.4.23
-    ports:
-      - '5001:5001'
-    volumes:
-      - ./data/ipfs:/data/ipfs
-  postgres:
-    image: postgres
-    ports:
-      - '5432:5432'
-    command: ['postgres', '-cshared_preload_libraries=pg_stat_statements']
-    environment:
-      POSTGRES_USER: graph-node
-      POSTGRES_PASSWORD: let-me-in
-      POSTGRES_DB: graph-node
-    volumes:
-      - ./data/postgres:/var/lib/postgresql/data
-```
-{% endcode %}
-
-### Start Ocean subgraph
-
-```
-docker-compose \
--f docker-compose.yml \
---env-file .env \
-up -d
-```
diff --git a/building-with-ocean/deploying-components/deploying-provider.md b/building-with-ocean/deploying-components/deploying-provider.md
deleted file mode 100644
index 56516845..00000000
--- a/building-with-ocean/deploying-components/deploying-provider.md
+++ /dev/null
@@ -1,89 +0,0 @@
-# Deploying Provider
-
-### About Provider
-
-Provider encrypts the URL and metadata during publish, and decrypts the URL when the dataset is downloaded or a compute job is started.
-It enables access to data assets by streaming data (and never the URL).
-It performs on-chain checks for buyer permissions and payments. It also provides compute services (connecting to the C2D environment).
-It is a multichain component, meaning that with the proper configuration it can handle these tasks on multiple chains.
-The source code of Provider can be accessed [here](https://github.com/oceanprotocol/provider).
-
-### Prerequisites
-
-* Docker and Docker Compose are installed. Click [here](https://docs.docker.com/engine/install/) to view the guide on installing Docker.
-* [Obtain an API key](../using-ocean-libraries/configuration.md#obtaining-api-key-for-ethereum-node-provider)
-
-### Create a working directory
-
-```
-mkdir Provider
-cd Provider
-```
-
-### Create a \`.env\` file
-
-Copy the below content into the \`.env\` file and edit the values as needed.
-
-{% code title=".env" %}
-```
-# Mandatory variables
-
-# Update the value to the appropriate tag from here: https://hub.docker.com/r/oceanprotocol/provider-py/tags
-PROVIDER_VERSION=latest
-PROVIDER_PRIVATE_KEY=
-NETWORK_URL=
-AQUARIUS_URL=
-```
-{% endcode %}
-
-### Create docker-compose file
-
-{% hint style="info" %}
-Set the value of OCEAN\_PROVIDER\_WORKERS to 2 or more to avoid a race condition when Provider checks whether it should call a remote provider or not.
-{% endhint %}
-
-{% code title="docker-compose.provider.yml" %}
-```yaml
-version: '3'
-services:
-  provider:
-    image: oceanprotocol/provider-py:v1.0.20
-    container_name: provider
-    ports:
-      - 8030:8030
-    networks:
-      - ocean_backend
-    environment:
-      # the NETWORK_URL and PROVIDER_PRIVATE_KEY settings can be defined for multiple chains
-      # as the JSON encoding e.g. {"chain_id1": "network_url_1", "chain_id2": "network_url_2"}
-      NETWORK_URL: '{"8996": "${NETWORK_URL}"}'
-      PROVIDER_PRIVATE_KEY: '{"8996": "${PROVIDER_PRIVATE_KEY}"}'
-      # defines the key to use where no chain id is applicable (e.g. for auth tokens)
-      UNIVERSAL_PRIVATE_KEY: ${PROVIDER_PRIVATE_KEY}
-      LOG_LEVEL: DEBUG
-      OCEAN_PROVIDER_URL: "http://0.0.0.0:8030"
-      OCEAN_PROVIDER_WORKERS: "2"
-      OCEAN_PROVIDER_TIMEOUT: "9000"
-      # Defining OPERATOR_SERVICE_URL is optional. Set the value only if Provider should support Compute-to-Data.
-      OPERATOR_SERVICE_URL: ""
-      # Defining IPFS_GATEWAY is optional. Set the value if Provider should support resolving IPFS urls.
-      IPFS_GATEWAY: ""
-      AQUARIUS_URL: ${AQUARIUS_URL}
-volumes:
-  data:
-    driver: local
-networks:
-  ocean_backend:
-    driver: bridge
-```
-{% endcode %}
-
-### Start Provider
-
-```
-docker-compose \
--f docker-compose.provider.yml \
---env-file .env \
-up -d
-```
diff --git a/building-with-ocean/deploying-components/setup-server.md b/building-with-ocean/deploying-components/setup-server.md
deleted file mode 100644
index 9e172bab..00000000
--- a/building-with-ocean/deploying-components/setup-server.md
+++ /dev/null
@@ -1,64 +0,0 @@
----
-description: >-
-  The following tutorial shows how to create a server ready for hosting Ocean
-  Protocol's components.
----
-
-# Setup a Server
-
-## **Using hosting services**
-
-Ocean Protocol's components can be hosted on any infrastructure provider, such as AWS, Azure, Heroku, DigitalOcean, and many others. This tutorial explains how to create a server using DigitalOcean and install Docker, which is required to host Ocean Protocol's components. Apart from the steps for creating a server, the rest of the tutorial is the same for all hosting providers.
-
-#### Creating account and setting billing
-
-Go to [https://www.digitalocean.com/](https://www.digitalocean.com/) and create an account. Provide the appropriate information for billing and accounting.
-
-#### Create a droplet
-
-Click the **`Create`** button and choose the **`Droplets`** option from the dropdown.
-
-![](../../.gitbook/assets/image.png)
-
-#### Configure droplet
-
-Select Ubuntu OS and choose a plan. The required CPU and memory depend on the number of requests Aquarius is expected to serve.
-
-![Configure droplet](<../../.gitbook/assets/image (8).png>)
-
-Also, select the region where you want Aquarius to be hosted and set a root password.
-
-![](<../../.gitbook/assets/image (10).png>)
-
-![Click Create Droplet](<../../.gitbook/assets/image (7).png>)
-
-Finalize the parameters for the server, then click `Create Droplet`. After the server is ready, select the `Access console` option from the dropdown.
-
-![Click Access Console](<../../.gitbook/assets/image (3).png>)
-
-![Click Launch Droplet Console](<../../.gitbook/assets/image (9).png>)
-
-A window will open with a terminal session. Now, the required infrastructure is ready for hosting Aquarius, Provider or the Subgraph. Let's install Docker and Docker Compose on the server. Follow the installation guide [here](https://docs.docker.com/engine/install/ubuntu/).
-
-The commands below are those executed by following the guide.
-
-```bash
-sudo apt-get update
-sudo apt-get install ca-certificates curl gnupg lsb-release
-sudo mkdir -p /etc/apt/keyrings
-curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
-echo \
-  "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu \
-  $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
-sudo apt-get update
-sudo apt-get install docker-ce docker-ce-cli containerd.io docker-compose-plugin
-
-# Now install docker-compose
-sudo apt-get update
-sudo apt-get install docker-compose-plugin
-```
-
-Now that the server is ready and all the required dependencies for hosting Ocean components are installed, follow the instructions given in the component-specific guides:
-
-* [Deploying Marketplace](deploying-marketplace.md)
-* [Deploying Aquarius](deploying-aquarius.md)
diff --git a/building-with-ocean/images/Starting New Compute Job.png b/building-with-ocean/images/Starting New Compute Job.png
deleted file mode 100644
index 6f0ad176..00000000
Binary files a/building-with-ocean/images/Starting New Compute Job.png and /dev/null differ
diff --git a/building-with-ocean/images/allow-deny-lists/add-allow-list.png b/building-with-ocean/images/allow-deny-lists/add-allow-list.png
deleted file mode 100644
index dd397df1..00000000
Binary files a/building-with-ocean/images/allow-deny-lists/add-allow-list.png and /dev/null differ
diff --git a/building-with-ocean/images/allow-deny-lists/advanced-settings.png b/building-with-ocean/images/allow-deny-lists/advanced-settings.png
deleted file mode 100644
index 98f99895..00000000
Binary files a/building-with-ocean/images/allow-deny-lists/advanced-settings.png and /dev/null differ
diff --git a/building-with-ocean/images/allow-deny-lists/metamask-transaction.png b/building-with-ocean/images/allow-deny-lists/metamask-transaction.png
deleted file mode 100644
index 7f5e1c82..00000000
Binary files a/building-with-ocean/images/allow-deny-lists/metamask-transaction.png and /dev/null differ
diff --git a/building-with-ocean/images/allow-deny-lists/removing-allow-deny.png b/building-with-ocean/images/allow-deny-lists/removing-allow-deny.png
deleted file mode 100644
index bd47913d..00000000
Binary files a/building-with-ocean/images/allow-deny-lists/removing-allow-deny.png and /dev/null differ
diff --git a/building-with-ocean/images/allow-deny-lists/submit.png b/building-with-ocean/images/allow-deny-lists/submit.png
deleted file mode 100644
index 6e429509..00000000
Binary files a/building-with-ocean/images/allow-deny-lists/submit.png
and /dev/null differ diff --git a/building-with-ocean/images/allow-deny-lists/update-success.png b/building-with-ocean/images/allow-deny-lists/update-success.png deleted file mode 100644 index 5d9aa979..00000000 Binary files a/building-with-ocean/images/allow-deny-lists/update-success.png and /dev/null differ diff --git a/building-with-ocean/images/confirm-backup-phrase.png b/building-with-ocean/images/confirm-backup-phrase.png deleted file mode 100644 index 5fa40a1c..00000000 Binary files a/building-with-ocean/images/confirm-backup-phrase.png and /dev/null differ diff --git a/building-with-ocean/images/create-new-metamask-wallet.png b/building-with-ocean/images/create-new-metamask-wallet.png deleted file mode 100644 index f53a81cd..00000000 Binary files a/building-with-ocean/images/create-new-metamask-wallet.png and /dev/null differ diff --git a/building-with-ocean/images/login-options.png b/building-with-ocean/images/login-options.png deleted file mode 100644 index 187f2446..00000000 Binary files a/building-with-ocean/images/login-options.png and /dev/null differ diff --git a/building-with-ocean/images/main-wallet-page.png b/building-with-ocean/images/main-wallet-page.png deleted file mode 100644 index bbae9830..00000000 Binary files a/building-with-ocean/images/main-wallet-page.png and /dev/null differ diff --git a/building-with-ocean/images/manage-tokens.png b/building-with-ocean/images/manage-tokens.png deleted file mode 100644 index 09a6f4c4..00000000 Binary files a/building-with-ocean/images/manage-tokens.png and /dev/null differ diff --git a/building-with-ocean/images/marketplace/publish/azure-10.png b/building-with-ocean/images/marketplace/publish/azure-10.png deleted file mode 100644 index 36cf04aa..00000000 Binary files a/building-with-ocean/images/marketplace/publish/azure-10.png and /dev/null differ diff --git a/building-with-ocean/images/marketplace/publish/marketplace-publish-file-field (1).png b/building-with-ocean/images/marketplace/publish/marketplace-publish-file-field (1).png deleted file mode 100644 index 706aba2d..00000000 Binary files a/building-with-ocean/images/marketplace/publish/marketplace-publish-file-field (1).png and /dev/null differ diff --git a/building-with-ocean/images/marketplace/publish/marketplace-publish-file-field.png b/building-with-ocean/images/marketplace/publish/marketplace-publish-file-field.png deleted file mode 100644 index 706aba2d..00000000 Binary files a/building-with-ocean/images/marketplace/publish/marketplace-publish-file-field.png and /dev/null differ diff --git a/building-with-ocean/images/marketplace/publish/one-drive-1.png b/building-with-ocean/images/marketplace/publish/one-drive-1.png deleted file mode 100644 index 00910484..00000000 Binary files a/building-with-ocean/images/marketplace/publish/one-drive-1.png and /dev/null differ diff --git a/building-with-ocean/images/marketplace/publish/one-drive-2.png b/building-with-ocean/images/marketplace/publish/one-drive-2.png deleted file mode 100644 index 7334561d..00000000 Binary files a/building-with-ocean/images/marketplace/publish/one-drive-2.png and /dev/null differ diff --git a/building-with-ocean/images/marketplace/publish/one-drive-3.png b/building-with-ocean/images/marketplace/publish/one-drive-3.png deleted file mode 100644 index 73c4f043..00000000 Binary files a/building-with-ocean/images/marketplace/publish/one-drive-3.png and /dev/null differ diff --git a/building-with-ocean/images/marketplace/publish/one-drive-4.png 
b/building-with-ocean/images/marketplace/publish/one-drive-4.png deleted file mode 100644 index 61c964e1..00000000 Binary files a/building-with-ocean/images/marketplace/publish/one-drive-4.png and /dev/null differ diff --git a/building-with-ocean/images/marketplace/publish/publish-google-drive (1).png b/building-with-ocean/images/marketplace/publish/publish-google-drive (1).png deleted file mode 100644 index f472a96f..00000000 Binary files a/building-with-ocean/images/marketplace/publish/publish-google-drive (1).png and /dev/null differ diff --git a/building-with-ocean/images/marketplace/publish/publish-google-drive-2 (1).png b/building-with-ocean/images/marketplace/publish/publish-google-drive-2 (1).png deleted file mode 100644 index 9b126c47..00000000 Binary files a/building-with-ocean/images/marketplace/publish/publish-google-drive-2 (1).png and /dev/null differ diff --git a/building-with-ocean/images/marketplace/publish/publish-google-drive-2.png b/building-with-ocean/images/marketplace/publish/publish-google-drive-2.png deleted file mode 100644 index 9b126c47..00000000 Binary files a/building-with-ocean/images/marketplace/publish/publish-google-drive-2.png and /dev/null differ diff --git a/building-with-ocean/images/marketplace/publish/publish-google-drive.png b/building-with-ocean/images/marketplace/publish/publish-google-drive.png deleted file mode 100644 index f472a96f..00000000 Binary files a/building-with-ocean/images/marketplace/publish/publish-google-drive.png and /dev/null differ diff --git a/building-with-ocean/images/matic-bridge.png b/building-with-ocean/images/matic-bridge.png deleted file mode 100644 index 2c35cbdb..00000000 Binary files a/building-with-ocean/images/matic-bridge.png and /dev/null differ diff --git a/building-with-ocean/images/metamask-add-network.png b/building-with-ocean/images/metamask-add-network.png deleted file mode 100644 index 7b756c36..00000000 Binary files a/building-with-ocean/images/metamask-add-network.png and /dev/null differ diff --git a/building-with-ocean/images/metamask-browser-extension.png b/building-with-ocean/images/metamask-browser-extension.png deleted file mode 100644 index 7f590505..00000000 Binary files a/building-with-ocean/images/metamask-browser-extension.png and /dev/null differ diff --git a/building-with-ocean/images/metamask-chrome-extension.png b/building-with-ocean/images/metamask-chrome-extension.png deleted file mode 100644 index af811b08..00000000 Binary files a/building-with-ocean/images/metamask-chrome-extension.png and /dev/null differ diff --git a/building-with-ocean/images/metamask-secret-passcode.png b/building-with-ocean/images/metamask-secret-passcode.png deleted file mode 100644 index 2b9b60af..00000000 Binary files a/building-with-ocean/images/metamask-secret-passcode.png and /dev/null differ diff --git a/building-with-ocean/images/metamask_T_C.png b/building-with-ocean/images/metamask_T_C.png deleted file mode 100644 index 9d5541c3..00000000 Binary files a/building-with-ocean/images/metamask_T_C.png and /dev/null differ diff --git a/building-with-ocean/images/metamask_view-account-empty.png b/building-with-ocean/images/metamask_view-account-empty.png deleted file mode 100644 index 0d1c21a3..00000000 Binary files a/building-with-ocean/images/metamask_view-account-empty.png and /dev/null differ diff --git a/building-with-ocean/images/metamask_view-account-options.png b/building-with-ocean/images/metamask_view-account-options.png deleted file mode 100644 index 755a17e7..00000000 Binary files 
a/building-with-ocean/images/metamask_view-account-options.png and /dev/null differ diff --git a/building-with-ocean/images/rbac/connect-wallet.png b/building-with-ocean/images/rbac/connect-wallet.png deleted file mode 100644 index b52649d7..00000000 Binary files a/building-with-ocean/images/rbac/connect-wallet.png and /dev/null differ diff --git a/building-with-ocean/images/rbac/without-browse-permission.png b/building-with-ocean/images/rbac/without-browse-permission.png deleted file mode 100644 index 9da7a411..00000000 Binary files a/building-with-ocean/images/rbac/without-browse-permission.png and /dev/null differ diff --git a/building-with-ocean/images/rbac/without-consume-permission.png b/building-with-ocean/images/rbac/without-consume-permission.png deleted file mode 100644 index d81988b7..00000000 Binary files a/building-with-ocean/images/rbac/without-consume-permission.png and /dev/null differ diff --git a/building-with-ocean/images/rbac/without-publish-permission.png b/building-with-ocean/images/rbac/without-publish-permission.png deleted file mode 100644 index 7c39bf99..00000000 Binary files a/building-with-ocean/images/rbac/without-publish-permission.png and /dev/null differ diff --git a/building-with-ocean/images/react-app-01.png b/building-with-ocean/images/react-app-01.png deleted file mode 100644 index a299e2ab..00000000 Binary files a/building-with-ocean/images/react-app-01.png and /dev/null differ diff --git a/building-with-ocean/images/react-app-02.png b/building-with-ocean/images/react-app-02.png deleted file mode 100644 index bd48b60c..00000000 Binary files a/building-with-ocean/images/react-app-02.png and /dev/null differ diff --git a/building-with-ocean/images/react-app-03.png b/building-with-ocean/images/react-app-03.png deleted file mode 100644 index f078d84f..00000000 Binary files a/building-with-ocean/images/react-app-03.png and /dev/null differ diff --git a/building-with-ocean/images/react-app-04.png b/building-with-ocean/images/react-app-04.png deleted file mode 100644 index b03578db..00000000 Binary files a/building-with-ocean/images/react-app-04.png and /dev/null differ diff --git a/building-with-ocean/images/react-app-05.png b/building-with-ocean/images/react-app-05.png deleted file mode 100644 index c273a31c..00000000 Binary files a/building-with-ocean/images/react-app-05.png and /dev/null differ diff --git a/building-with-ocean/images/react-app-06.png b/building-with-ocean/images/react-app-06.png deleted file mode 100644 index 1b8e9466..00000000 Binary files a/building-with-ocean/images/react-app-06.png and /dev/null differ diff --git a/building-with-ocean/images/secret-backup-phrase.png b/building-with-ocean/images/secret-backup-phrase.png deleted file mode 100644 index 04a2a278..00000000 Binary files a/building-with-ocean/images/secret-backup-phrase.png and /dev/null differ diff --git a/building-with-ocean/images/transferring-process.png b/building-with-ocean/images/transferring-process.png deleted file mode 100644 index f131fc78..00000000 Binary files a/building-with-ocean/images/transferring-process.png and /dev/null differ diff --git a/building-with-ocean/images/view-metamask-account.png b/building-with-ocean/images/view-metamask-account.png deleted file mode 100644 index 0d1c21a3..00000000 Binary files a/building-with-ocean/images/view-metamask-account.png and /dev/null differ diff --git a/building-with-ocean/marketplace.md b/building-with-ocean/marketplace.md deleted file mode 100644 index a17f7ac2..00000000 --- a/building-with-ocean/marketplace.md 
+++ /dev/null
@@ -1,21 +0,0 @@
----
-title: Set Up a Marketplace
-description:
----
-
-## About marketplace
-
-Ocean Protocol's [marketplace](https://v4.market.oceanprotocol.com/) provides a web interface for accessing assets published on the chain. By default, asset metadata is pulled from Aquarius, a component hosted by Ocean Protocol. To extend the existing features of the marketplace, developers can fork the marketplace.
-
-By doing so, developers can:
-
-- Change the name of the marketplace.
-- Implement their own branding and style.
-- Change the source of the asset information.
-- Customise the fees.
-
-## Forking marketplace
-
-To set up a marketplace, follow the steps here:
-
-[Launch a blockchain-based data marketplace in under 1 hour](https://blog.oceanprotocol.com/launch-a-blockchain-based-data-marketplace-in-under-1-hour-9baa85a65ece)
diff --git a/building-with-ocean/projects-using-ocean.md b/building-with-ocean/projects-using-ocean.md
deleted file mode 100644
index 2bb2ff77..00000000
--- a/building-with-ocean/projects-using-ocean.md
+++ /dev/null
@@ -1,25 +0,0 @@
----
-title: Partners & Collaborators
-description: We work with many partners who understand the value of Ocean
----
-
-We work closely with our collaborators and service partners to iterate on our underlying technology, and to deliver world-class Web3 experiences built on top of Ocean Protocol. An up-to-date list of our partners and collaborators is maintained on our [main site](https://oceanprotocol.com/collaborators).
-
-## Other Useful Information
-
-### OceanDAO projects
-
-[Ocean Pearl](https://oceanpearl.io/projects) is a great way to browse 80+ Ocean projects that came through [OceanDAO](https://oceanprotocol.com/dao). These projects may be building on the Ocean stack, doing outreach, unlocking data, or more.
-
-### Using Ocean Market
-
-[Ocean Market](https://market.oceanprotocol.com) is the best place to find projects that publish datasets. Simply go there and browse the datasets. :)
-
-### Learning about Ocean
-
-The [Ocean Academy](https://oceanacademy.io/) project is a great way to learn more about Ocean beyond [oceanprotocol.com](https://www.oceanprotocol.com) and [docs.oceanprotocol.com](https://docs.oceanprotocol.com).
-
-### Trading OCEAN
-
-The [Coingecko OCEAN markets page](https://www.coingecko.com/en/coins/ocean-protocol#markets) lists venues to exchange OCEAN. Many of them offer liquidity mining and other yield opportunities.
-
diff --git a/building-with-ocean/using-ocean-libraries/README.md b/building-with-ocean/using-ocean-libraries/README.md
deleted file mode 100644
index 725d1f34..00000000
--- a/building-with-ocean/using-ocean-libraries/README.md
+++ /dev/null
@@ -1,14 +0,0 @@
-# Using Ocean libraries
-
-Ocean Protocol officially supports two client libraries:
-
-* ocean.js
-* ocean.py
-
-| ocean.js | ocean.py |
-| -------------------------------------------------------------- | -------------------------------------------------------------- |
-| Written in Javascript | Written in Python |
-| [Source code](https://github.com/oceanprotocol/ocean.js) | [Source code](https://github.com/oceanprotocol/ocean.py) |
-| [Releases](https://github.com/oceanprotocol/ocean.js/releases) | [Releases](https://github.com/oceanprotocol/ocean.py/releases) |
-
-The tutorials in this section will guide you through setting up the required configuration and interacting with Ocean Protocol's smart contracts, Aquarius and Provider using the supported libraries.
diff --git a/building-with-ocean/using-ocean-libraries/configuration.md b/building-with-ocean/using-ocean-libraries/configuration.md
deleted file mode 100644
index e3aee0f0..00000000
--- a/building-with-ocean/using-ocean-libraries/configuration.md
+++ /dev/null
@@ -1,211 +0,0 @@
-# Configuration
-
-### Obtaining API key for Ethereum node provider
-
-Ocean Protocol's smart contracts are deployed on EVM-compatible networks. Using an API key provided by a third-party Ethereum node provider allows you to interact with Ocean Protocol's smart contracts on the supported networks without requiring you to host a local node.
-
-Choose an API provider of your choice. Some commonly used ones are:
-
-* [Infura](https://infura.io/)
-* [Alchemy](https://www.alchemy.com/)
-* [Moralis](https://moralis.io/)
-
-The supported networks are listed [here](../../core-concepts/networks.md).
-
-### Create a directory
-
-Let's start by creating a working directory where we store the environment variable file, the configuration files and the scripts.
-
-```
-mkdir my-ocean-project
-cd my-ocean-project
-```
-
-### Create a `.env` file
-
-In the working directory create a `.env` file. The content of this file will store the values for the following variables:
-
-| Variable name | Description | Required |
-| ----------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -------- |
-| **OCEAN\_NETWORK** | Name of the network where the Ocean Protocol's smart contracts are deployed. | Yes |
-| **OCEAN\_NETWORK\_URL** | The URL of the Ethereum node (along with the API key for non-local networks). | Yes |
-| **PRIVATE\_KEY** | The private key of the account which you want to use. A private key is made up of 64 hex characters. Make sure you have sufficient balance to pay for the transaction fees. | Yes |
-| **AQUARIUS\_URL** | The URL of Aquarius. This value is needed when reading an asset from the off-chain store. | No |
-| **PROVIDER\_URL** | The URL of the Provider. This value is needed when publishing a new asset or updating an existing asset. | No |
-
-{% hint style="info" %}
-Treat this file as a secret and do not commit it to git or share the content publicly. If you are using git, then include this file name in the `.gitignore` file.
-{% endhint %}
-
-The below tabs show partially filled `.env` file content for some of the supported networks.
-
-{% tabs %}
-{% tab title="Mainnet" %}
-{% code title=".env" %}
-```
-# Mandatory environment variables
-
-OCEAN_NETWORK=mainnet
-OCEAN_NETWORK_URL=
-PRIVATE_KEY=
-
-# Optional environment variables
-
-AQUARIUS_URL=https://v4.aquarius.oceanprotocol.com/
-PROVIDER_URL=https://v4.provider.mainnet.oceanprotocol.com
-```
-{% endcode %}
-{% endtab %}
-
-{% tab title="Polygon" %}
-{% code title=".env" %}
-```
-# Mandatory environment variables
-
-OCEAN_NETWORK=polygon
-OCEAN_NETWORK_URL=
-PRIVATE_KEY=
-
-# Optional environment variables
-
-AQUARIUS_URL=https://v4.aquarius.oceanprotocol.com/
-PROVIDER_URL=https://v4.provider.polygon.oceanprotocol.com
-```
-{% endcode %}
-{% endtab %}
-
-{% tab title="Local (using Barge)" %}
-{% code title=".env" %}
-```
-# Mandatory environment variables
-OCEAN_NETWORK=development
-OCEAN_NETWORK_URL=http://172.15.0.3:8545/
-AQUARIUS_URL=http://172.15.0.5:5000
-PROVIDER_URL=http://172.15.0.4:8030
-
-# Replace PRIVATE_KEY if needed
-PRIVATE_KEY=0xc594c6e5def4bab63ac29eed19a134c130388f74f019bc74b8f4389df2837a58
-```
-{% endcode %}
-{% endtab %}
-{% endtabs %}
-
-_NOTE: If using ocean.py, additionally specify the **ADDRESS\_FILE** variable in the `.env` file. Copy the content of this_ [_link_](https://github.com/oceanprotocol/contracts/blob/v4main/addresses/address.json) _locally and set **ADDRESS\_FILE** so that its value is a correct file path._
-
-### Setup dependencies
-
-In this step, the required dependencies will be installed.
-
-{% tabs %}
-{% tab title="ocean.js" %}
-```bash
-npm init
-npm install @oceanprotocol/lib@latest dotenv web3 @truffle/hdwallet-provider
-```
-{% endtab %}
-
-{% tab title="ocean.py" %}
-```bash
-python3 -m venv venv
-source venv/bin/activate
-pip install wheel
-
-# Install the Ocean library
-pip install ocean-lib
-```
-{% endtab %}
-{% endtabs %}
-
-### Create a configuration file
-
-A configuration file will read the content of the `.env` file and initialize the required configuration objects, which will be used in the following tutorials. The script below creates a Web3 wallet instance and an Ocean configuration object.
-
-Create the configuration file in the working directory, i.e. at the same path where the `.env` is located.
-
-{% tabs %}
-{% tab title="ocean.js" %}
-{% code title="config.js" %}
-```javascript
-// Import dependencies
-require('dotenv').config();
-const HDWalletProvider = require('@truffle/hdwallet-provider');
-const fs = require('fs');
-const { homedir } = require('os');
-const { ConfigHelper } = require('@oceanprotocol/lib');
-
-// Get configuration for the given network
-let oceanConfig = new ConfigHelper().getConfig(process.env.OCEAN_NETWORK);
-
-// If using the local development environment, read the addresses from the local file.
-// The local deployment address file can be generated using barge.
-if (process.env.OCEAN_NETWORK === 'development') {
-  const addressData = JSON.parse(
-    fs.readFileSync(
-      process.env.ADDRESS_FILE
-        || `${homedir}/.ocean/ocean-contracts/artifacts/address.json`,
-      'utf8'
-    )
-  );
-  const addresses = addressData[process.env.OCEAN_NETWORK];
-
-  oceanConfig = {
-    ...oceanConfig,
-    oceanTokenAddress: addresses.Ocean,
-    poolTemplateAddress: addresses.poolTemplate,
-    fixedRateExchangeAddress: addresses.FixedPrice,
-    dispenserAddress: addresses.Dispenser,
-    erc721FactoryAddress: addresses.ERC721Factory,
-    sideStakingAddress: addresses.Staking,
-    opfCommunityFeeCollector: addresses.OPFCommunityFeeCollector
-  };
-}
-
-oceanConfig = {
-  ...oceanConfig,
-  nodeUri: process.env.OCEAN_NETWORK_URL,
-  // Set optional properties - Provider URL and Aquarius URL
-  metadataCacheUri: process.env.AQUARIUS_URL || oceanConfig.metadataCacheUri,
-  providerUri: process.env.PROVIDER_URL || oceanConfig.providerUri
-};
-
-const web3Provider = new HDWalletProvider(
-  process.env.PRIVATE_KEY,
-  oceanConfig.nodeUri
-);
-
-module.exports = {
-  web3Provider,
-  oceanConfig
-};
-```
-{% endcode %}
-{% endtab %}
-
-{% tab title="ocean.py" %}
-{% code title="config.py" %}
-```python
-import os
-from dotenv import load_dotenv
-from ocean_lib.example_config import get_config_dict
-from ocean_lib.web3_internal.utils import connect_to_network
-from ocean_lib.ocean.ocean import Ocean
-
-load_dotenv()
-
-# Create Ocean instance
-network_name = os.getenv("OCEAN_NETWORK")
-connect_to_network(network_name)
-
-config = get_config_dict(network_name)
-ocean = Ocean(config)
-
-from brownie.network import accounts
-accounts.clear()
-user_private_key = os.getenv('PRIVATE_KEY')
-wallet = accounts.add(user_private_key)
-```
-{% endcode %}
-{% endtab %}
-{% endtabs %}
-
-Now all the dependencies are ready, and you can proceed with interacting with the Ocean infrastructure using the Ocean libraries.
diff --git a/building-with-ocean/using-ocean-libraries/create-datatoken-with-fixed-pricing.md b/building-with-ocean/using-ocean-libraries/create-datatoken-with-fixed-pricing.md
deleted file mode 100644
index b7f64804..00000000
--- a/building-with-ocean/using-ocean-libraries/create-datatoken-with-fixed-pricing.md
+++ /dev/null
@@ -1,180 +0,0 @@
-# Publish with Fixed Pricing
-
-This tutorial guides you through the process of creating your own data NFT and a datatoken with fixed pricing, using the Ocean libraries. To learn more about data NFTs and datatokens, please refer to [this page](../../core-concepts/datanft-and-datatoken.md). Ocean Protocol supports different pricing schemes which can be set while publishing an asset. Please refer to [this page](../../core-concepts/asset-pricing.md) for more details on pricing schemes.
-
-#### Prerequisites
-
-- [Obtain an API key](configuration.md#obtaining-api-key-for-ethereum-node-provider)
-- [Set up the .env file](configuration.md#create-a-.env-file)
-- [Install the dependencies](configuration.md#setup-dependencies)
-- [Create a configuration file](configuration.md#create-a-configuration-file)
-
-#### Create a script to deploy a data NFT and a datatoken with fixed pricing
-
-Create a new file in the same working directory where the configuration file (`config.py`/`config.js`) and the `.env` file are present, and copy the code listed below.
-
-{% hint style="info" %}
-**Fees**: The code snippets below define fee-related parameters.
-Please refer to the [fees page](../../core-concepts/fees.md) for more details.
-{% endhint %}
-
-{% tabs %}
-{% tab title="ocean.js" %}
-{% code title="create_datatoken_with_fre.js" %}
-```javascript
-// Import dependencies
-const { NftFactory } = require('@oceanprotocol/lib');
-const Web3 = require('web3');
-const { web3Provider, oceanConfig } = require('./config');
-
-// Create a web3 instance
-const web3 = new Web3(web3Provider);
-
-// Define a function createFRE()
-const createFRE = async () => {
-  const Factory = new NftFactory(oceanConfig.erc721FactoryAddress, web3);
-
-  // Get accounts from web3 instance
-  const accounts = await web3.eth.getAccounts();
-  const publisherAccount = accounts[0];
-
-  // data NFT parameters: name, symbol, templateIndex, etc.
-  const nftParams = {
-    name: '72120Bundle',
-    symbol: '72Bundle',
-    templateIndex: 1,
-    tokenURI: 'https://example.com',
-    transferable: true,
-    owner: publisherAccount
-  };
-
-  // datatoken parameters: name, symbol, templateIndex, etc.
-  const erc20Params = {
-    name: "Sample datatoken",
-    symbol: "SDT",
-    templateIndex: 1,
-    cap: '100000',
-    feeAmount: '0',
-    // paymentCollector is the address that collects payments for the asset
-    paymentCollector: '0x0000000000000000000000000000000000000000',
-    feeToken: '0x0000000000000000000000000000000000000000',
-    minter: publisherAccount,
-    mpFeeAddress: '0x0000000000000000000000000000000000000000'
-  };
-
-  const fixedPriceParams = {
-    fixedRateAddress: oceanConfig.fixedRateExchangeAddress,
-    baseTokenAddress: oceanConfig.oceanTokenAddress,
-    owner: publisherAccount,
-    marketFeeCollector: publisherAccount,
-    baseTokenDecimals: 18,
-    datatokenDecimals: 18,
-    fixedRate: '100',
-    marketFee: '0',
-    // Optional parameters
-    // allowedConsumer: publisherAccount, // only this account may consume via the exchange
-    withMint: false // add the FixedPrice contract as a minter if withMint == true
-  }
-
-  // Create data NFT and a datatoken with Fixed Rate exchange
-  const result = await Factory.createNftErc20WithFixedRate(
-    publisherAccount,
-    nftParams,
-    erc20Params,
-    fixedPriceParams
-  );
-
-  // Get the data NFT address and datatoken address from the result
-  const erc721Address = result.events.NFTCreated.returnValues[0];
-  const datatokenAddress = result.events.TokenCreated.returnValues[0];
-
-  return {
-    erc721Address,
-    datatokenAddress
-  };
-};
-
-// Call the createFRE() function
-createFRE()
-  .then(({ erc721Address, datatokenAddress }) => {
-    console.log(`DataNft address ${erc721Address}`);
-    console.log(`Datatoken address ${datatokenAddress}`);
-    process.exit();
-  })
-  .catch((err) => {
-    console.error(err);
-    process.exit(1);
-  });
-```
-{% endcode %}
-
-Execute script
-
-```
-node create_datatoken_with_fre.js
-```
-{% endtab %}
-
-{% tab title="ocean.py" %}
-{% code title="create_datatoken_with_fre.py" %}
-```python
-# Note: Ensure that .env and config.py are correctly set up
-from config import web3_wallet, ocean
-
-data_nft = ocean.create_data_nft(
-    name="NFTToken1",
-    symbol="NFT1",
-    from_wallet=web3_wallet,
-    # Optional parameters
-    token_uri="https://example.com",
-    template_index=1,
-    transferable=True,
-    owner=web3_wallet.address,
-)
-print(f"Created dataNFT. Its address is {data_nft.address}")
-
-# replace the addresses here
-fee_manager = "0x0000000000000000000000000000000000000000"
-publish_market_order_fee_address = "0x0000000000000000000000000000000000000000"
-publish_market_order_fee_token = "0x0000000000000000000000000000000000000000"
-minter = web3_wallet.address
-
-# replace the fee amount
-publish_market_order_fee_amount = 0
-
-datatoken = data_nft.create_datatoken(
-    name="Datatoken 1",
-    symbol="DT1",
-    datatoken_cap="100000",
-    from_wallet=web3_wallet,
-    # Optional parameters below
-    template_index=1,
-    fee_manager=fee_manager,
-    publish_market_order_fee_token=publish_market_order_fee_token,
-    publish_market_order_fee_amount=publish_market_order_fee_amount,
-    minter=minter,
-    publish_market_order_fee_address=publish_market_order_fee_address,
-)
-print(f"Created datatoken. Its address is {datatoken.address}")
-
-
-exchange_id = ocean.create_fixed_rate(
-    datatoken=datatoken,
-    base_token=ocean.OCEAN_token,
-    amount=ocean.to_wei(100),
-    from_wallet=web3_wallet,
-)
-
-print(f"Created fixed rate exchange with ID {exchange_id.hex()}")
-```
-{% endcode %}
-
-#### Execute script
-
-```
-python create_datatoken_with_fre.py
-```
-{% endtab %}
-{% endtabs %}
-
diff --git a/building-with-ocean/using-ocean-libraries/creating_dataNFT.md b/building-with-ocean/using-ocean-libraries/creating_dataNFT.md
deleted file mode 100644
index 9eee6e76..00000000
--- a/building-with-ocean/using-ocean-libraries/creating_dataNFT.md
+++ /dev/null
@@ -1,106 +0,0 @@
-# Creating a dataNFT
-
-This tutorial guides you through the process of creating your own data NFT using the Ocean libraries. To learn more about data NFTs, please refer to [this page](../../core-concepts/datanft-and-datatoken.md).
-
-#### Prerequisites
-
-- [Obtain an API key](configuration.md#obtaining-api-key-for-ethereum-node-provider)
-- [Set up the .env file](configuration.md#create-a-.env-file)
-- [Install the dependencies](configuration.md#setup-dependencies)
-- [Create a configuration file](configuration.md#create-a-configuration-file)
-
-#### Create a script to deploy a dataNFT
-
-Create a new file in the same working directory where the configuration file (`config.py`/`config.js`) and the `.env` file are present, and copy the code listed below.
-
-{% tabs %}
-{% tab title="ocean.js" %}
-{% code title="create_dataNFT.js" %}
-```javascript
-// Import dependencies
-const { NftFactory } = require('@oceanprotocol/lib');
-const Web3 = require('web3');
-
-// Note: Make sure the .env file and config.js are created and set up correctly
-const { web3Provider, oceanConfig } = require('./config');
-
-const web3 = new Web3(web3Provider);
-
-// Define a function which will create a dataNFT using the Ocean.js library
-const createDataNFT = async () => {
-
-  // Create an NFTFactory
-  const Factory = new NftFactory(oceanConfig.erc721FactoryAddress, web3);
-
-  const accounts = await web3.eth.getAccounts();
-  const publisherAccount = accounts[0];
-
-  // Define dataNFT parameters
-  const nftParams = {
-    name: '72120Bundle',
-    symbol: '72Bundle',
-    // Optional parameters
-    templateIndex: 1,
-    tokenURI: 'https://example.com',
-    transferable: true,
-    owner: publisherAccount
-  };
-
-  // Call Factory.createNFT(...), which will create a new dataNFT
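-  // createNFT() sends the deployment transaction and resolves with the
-  // address of the new ERC721 contract once it is mined.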
-  const erc721Address = await Factory.createNFT(
-    publisherAccount,
-    nftParams
-  );
-
-  return {
-    erc721Address
-  };
-};
-
-// Call the createDataNFT() function
-createDataNFT()
-  .then(({ erc721Address }) => {
-    console.log(`DataNft address ${erc721Address}`);
-    process.exit();
-  })
-  .catch((err) => {
-    console.error(err);
-    process.exit(1);
-  });
-```
-{% endcode %}
-
-Execute the script
-
-```bash
-node create_dataNFT.js
-```
-{% endtab %}
-
-{% tab title="ocean.py" %}
-{% code title="create_dataNFT.py" %}
-```python
-# Note: Ensure that .env and config.py are correctly set up
-from config import web3_wallet, ocean
-
-data_nft = ocean.create_data_nft(
-    name="NFTToken1",
-    symbol="NFT1",
-    from_wallet=web3_wallet,
-    # Optional parameters
-    token_uri="https://example.com",
-    template_index=1,
-    transferable=True,
-    owner=web3_wallet.address,
-)
-print(f"Created dataNFT. Its address is {data_nft.address}")
-```
-{% endcode %}
-
-Execute the script
-
-```bash
-python create_dataNFT.py
-```
-{% endtab %}
-{% endtabs %}
diff --git a/building-with-ocean/using-ocean-libraries/mint-datatoken.md b/building-with-ocean/using-ocean-libraries/mint-datatoken.md
deleted file mode 100644
index f91d41af..00000000
--- a/building-with-ocean/using-ocean-libraries/mint-datatoken.md
+++ /dev/null
@@ -1,117 +0,0 @@
-# Mint Datatokens
-
-This tutorial guides you through the process of minting datatokens and sending them to a receiver address. The tutorial assumes that you already have the address of a datatoken contract that you own.
-
-#### Prerequisites
-
-- [Obtain an API key](configuration.md#obtaining-api-key-for-ethereum-node-provider)
-- [Set up the .env file](configuration.md#create-a-.env-file)
-- [Install the dependencies](configuration.md#setup-dependencies)
-- [Create a configuration file](configuration.md#create-a-configuration-file)
-
-#### Create a script to mint datatokens
-
-Create a new file in the same working directory where the configuration file (`config.py`/`config.js`) and the `.env` file are present, and copy the code listed below.
-
-{% tabs %}
-{% tab title="ocean.js" %}
-{% code title="mint_datatoken.js" %}
-```javascript
-// Import dependencies
-const { Datatoken } = require('@oceanprotocol/lib');
-const Web3 = require('web3');
-const { web3Provider, oceanConfig } = require('./config');
-
-// Create a web3 instance
-const web3 = new Web3(web3Provider);
-
-// Change this
-const datatokenAddress = "0xD3542e5F56655fb818F9118CE219e1D10751BC82"
-const receiverAddress = "0xBE5449a6A97aD46c8558A3356267Ee5D2731ab5e"
-
-// Create a function which will take `datatokenAddress` and `receiverAddress` as parameters
-const mintDatatoken = async (datatokenAddress, receiverAddress) => {
-  const accounts = await web3.eth.getAccounts();
-  const publisherAccount = accounts[0];
-
-  // Create datatoken instance
-  const datatoken = new Datatoken(web3);
-
-  // Get current datatoken balance of receiver
-  let receiverBalance = await datatoken.balance(
-    datatokenAddress,
-    receiverAddress
-  );
-  console.log(`Receiver balance before mint: ${receiverBalance}`);
-
-  // Mint datatoken
-  await datatoken.mint(
-    datatokenAddress,
-    publisherAccount,
-    '1',
-    receiverAddress
-  );
-
-  // Get new datatoken balance of receiver
-  receiverBalance = await datatoken.balance(
-    datatokenAddress,
-    receiverAddress
-  );
-  console.log(`Receiver balance after mint: ${receiverBalance}`);
-};
-
-// Call the mintDatatoken(...) function defined above
-mintDatatoken(datatokenAddress, receiverAddress)
-  .then(() => {
-    process.exit();
-  })
-  .catch((err) => {
-    console.error(err);
-    process.exit(1);
-  });
-```
-{% endcode %}
-
-#### Execute script
-
-```
-node mint_datatoken.js
-```
-{% endtab %}
-
-{% tab title="ocean.py" %}
-{% code title="mint_datatoken.py" %}
-```python
-# Note: Ensure that .env and config.py are correctly set up
-from config import web3_wallet, ocean
-
-# Change this
-datatoken_address = "0xD3542e5F56655fb818F9118CE219e1D10751BC82"
-receiver_address = "0xBE5449a6A97aD46c8558A3356267Ee5D2731ab5e"
-
-datatoken = ocean.get_datatoken(datatoken_address)
-
-print(f"Balance before mint: {datatoken.balanceOf(receiver_address)}")
-
-# Mint datatokens
-datatoken.mint(
-    account_address=receiver_address,
-    value=ocean.to_wei("1"),
-    from_wallet=web3_wallet,
-)
-
-print(f"Balance after mint: {datatoken.balanceOf(receiver_address)}")
-```
-{% endcode %}
-
-#### Execute script
-
-```
-python mint_datatoken.py
-```
-{% endtab %}
-{% endtabs %}
diff --git a/building-with-ocean/using-ocean-libraries/update-metadata.md b/building-with-ocean/using-ocean-libraries/update-metadata.md
deleted file mode 100644
index d78e3758..00000000
--- a/building-with-ocean/using-ocean-libraries/update-metadata.md
+++ /dev/null
@@ -1,102 +0,0 @@
-# Update Metadata
-
-This tutorial will guide you through updating an existing asset published on-chain using the Ocean libraries. The tutorial assumes that you already have the `did` of the asset which needs to be updated. In this tutorial, we will update the name, description, and tags of the data NFT. Please refer to [the page on DDO](../../core-concepts/did-ddo.md) to learn more about the additional fields which can be updated.
-
-#### Prerequisites
-
-- [Obtain an API key](configuration.md#obtaining-api-key-for-ethereum-node-provider)
-- [Set up the .env file](configuration.md#create-a-.env-file)
-- [Install the dependencies](configuration.md#setup-dependencies)
-- [Create a configuration file](configuration.md#create-a-configuration-file)
-
-{% hint style="info" %}
-The variables **AQUARIUS\_URL** and **PROVIDER\_URL** should be set correctly in the `.env` file.
-{% endhint %}
-
-#### Create a script to update the metadata
-
-Create a new file in the same working directory where the configuration file (`config.py`/`config.js`) and the `.env` file are present, and copy the code listed below.
-
-{% tabs %}
-{% tab title="ocean.js" %}
-{% code title="updateMetadata.js" %}
-```javascript
-// Import dependencies
-const {
-  Nft,
-  ProviderInstance,
-  getHash,
-  Aquarius
-} = require('@oceanprotocol/lib');
-const Web3 = require('web3');
-const { web3Provider, oceanConfig } = require('./config');
-
-// Create a web3 instance
-const web3 = new Web3(web3Provider);
-
-// Create Aquarius instance
-const aquarius = new Aquarius(oceanConfig.metadataCacheUri);
-const nft = new Nft(web3);
-const providerUrl = oceanConfig.providerUri;
-
-// replace the did here
-const did = "did:op:a419f07306d71f3357f8df74807d5d12bddd6bcd738eb0b461470c64859d6f0f";
-
-// This function takes a did as a parameter and updates the data NFT information
-const setMetadata = async (did) => {
-  const accounts = await web3.eth.getAccounts();
-  const publisherAccount = accounts[0];
-
-  // Fetch ddo from Aquarius
-  const ddo = await aquarius.resolve(did);
-
-  // update the ddo here
-  ddo.metadata.name = "Sample dataset v2";
-  ddo.metadata.description = "Lorem ipsum dolor sit amet, consetetur sadipscing elitr, sed diam";
-  ddo.metadata.tags = ["new tag1", "new tag2"];
-
-  // Encrypt the updated ddo using the Provider
-  const encryptedResponse = await ProviderInstance.encrypt(ddo, providerUrl);
-  const metadataHash = getHash(JSON.stringify(ddo));
-
-  // Update the data NFT metadata
-  await nft.setMetadata(
-    ddo.nftAddress,
-    publisherAccount,
-    0,
-    providerUrl,
-    '',
-    '0x2',
-    encryptedResponse,
-    `0x${metadataHash}`
-  );
-
-  // Check if the ddo is correctly updated in Aquarius
-  await aquarius.waitForAqua(ddo.id);
-
-  console.log(`Resolved asset did [${ddo.id}] from Aquarius.`);
-  console.log(`Updated name: [${ddo.metadata.name}].`);
-  console.log(`Updated description: [${ddo.metadata.description}].`);
-  console.log(`Updated tags: [${ddo.metadata.tags}].`);
-
-};
-
-// Call the setMetadata(...) function defined above
-setMetadata(did).then(() => {
-  process.exit();
-}).catch((err) => {
-  console.error(err);
-  process.exit(1);
-});
-```
-{% endcode %}
-
-Execute the script
-
-```bash
-node updateMetadata.js
-```
-{% endtab %}
-{% endtabs %}
diff --git a/building-with-ocean/using-ocean-subgraph/README.md b/building-with-ocean/using-ocean-subgraph/README.md
deleted file mode 100644
index 5df18f60..00000000
--- a/building-with-ocean/using-ocean-subgraph/README.md
+++ /dev/null
@@ -1,22 +0,0 @@
-# Using Ocean Subgraph
-
-The Ocean subgraph is an off-chain service that provides information about datatokens, users, and balances. It offers a faster way to fetch data than querying the chain directly. The data from the Ocean subgraph can be queried using [GraphQL](https://graphql.org/learn/).
-
-You can use the subgraph instances hosted by Ocean Protocol or host your own instance. The page on Deploying Ocean Subgraph provides a guide on hosting your own Ocean subgraph instance.
-
-An individual Ocean subgraph is deployed for each supported network. The information about supported networks is available on this [page](../../core-concepts/networks.md).
-
-#### Ocean Subgraph GraphiQL
-
-The table below provides the link to the GraphiQL interface for the supported networks. GraphQL queries can be executed directly in the GraphiQL interface without any setup.
- -| Network and link | -| ----------------------------------------------------------------------------------------------------------------------- | -| [Ethereum Mainnet](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) | -| [Polygon Mainnet](https://v4.subgraph.polygon.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) | -| [Binance Smart Chain](https://v4.subgraph.bsc.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) | -| [Energy Web Chain](https://v4.subgraph.energyweb.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) | -| [Moonriver](https://v4.subgraph.moonriver.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) | -| [Mumbai](https://v4.subgraph.mumbai.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) | -| [Görli](https://v4.subgraph.goerli.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) | - diff --git a/building-with-ocean/wallets-and-ocean-tokens.md b/building-with-ocean/wallets-and-ocean-tokens.md deleted file mode 100644 index 5b8f029c..00000000 --- a/building-with-ocean/wallets-and-ocean-tokens.md +++ /dev/null @@ -1,61 +0,0 @@ ---- -description: >- - How to use crypto wallet software to check your Ocean Token balance and to - send Ocean Tokens to others. ---- - -# Use Your Wallet to Manage OCEAN Tokens - -If you don't see any Ocean Tokens in your crypto wallet software (e.g. MetaMask or MyEtherWallet), don't worry! It might not know how to manage Ocean Tokens yet. - -### Token Information - -Almost all ERC-20 wallets require these values for adding a custom token: - -**Mainnet** - -* Contract Address: `0x967da4048cD07aB37855c090aAF366e4ce1b9F48` -* Symbol: `OCEAN` -* Decimals: `18` - -**Polygon Mainnet (previously Matic)** - -* Contract Address: `0x282d8efCe846A88B159800bd4130ad77443Fa1A1` -* Symbol: `mOCEAN` -* Decimals: `18` - -**Binance Smart Chain (BSC)** - -* Contract Address: `0xdce07662ca8ebc241316a15b611c89711414dd1a` -* Symbol: `OCEAN` -* Decimals: `18` - -**Görli** - -* Contract Address: `0xCfDdA22C9837aE76E0faA845354f33C62E03653a` -* Symbol: `OCEAN` -* Decimals: `18` - -**Mumbai** - -* Contract Address: `0xd8992Ed72C445c35Cb4A2be468568Ed1079357c8` -* Symbol: `OCEAN` -* Decimals: `18` - -The [OCEAN Token page](https://oceanprotocol.com/token) at oceanprotocol.com has further details. - -### MetaMask - -1. Make sure MetaMask is connected to the Ethereum Mainnet. -2. Select the account you want to manage. -3. Scroll down until the `Add Token` link is visible, then click on it. -4. Click on `Custom Token`. -5. Paste the Ocean Token contract address listed above into the _Token Contract Address_ field. The other two fields should auto-fill. If not, add `OCEAN` for the symbol and `18` for the precision. -6. Click `Next`. -7. Click `Add Tokens`. - -MetaMask should now show your Ocean Token (OCEAN) balance, and when you're looking at that, there should be a `Send` button to send Ocean Tokens to others. For help with that, see [the MetaMask docs about how to send tokens](https://metamask.zendesk.com/hc/en-us/articles/360015488931-How-to-Send-Tokens). - -### Other Wallet Software - -Do a web search to find out how to add a custom ERC-20 token to the wallet software you're using. 
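If you're building a dApp, you can also prompt MetaMask to track a custom token programmatically via the standard `wallet_watchAsset` RPC call ([EIP-747](https://eips.ethereum.org/EIPS/eip-747)). A minimal browser-side sketch, using the Mainnet OCEAN values from the table above:

```javascript
// Sketch: ask MetaMask to track OCEAN (Mainnet values from the table above).
// Assumes browser code where MetaMask injects window.ethereum.
async function watchOcean() {
  await window.ethereum.request({
    method: 'wallet_watchAsset',
    params: {
      type: 'ERC20',
      options: {
        address: '0x967da4048cD07aB37855c090aAF366e4ce1b9F48', // OCEAN on Ethereum Mainnet
        symbol: 'OCEAN',
        decimals: 18
      }
    }
  });
}
```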
diff --git a/building-with-ocean/wallets.md b/building-with-ocean/wallets.md deleted file mode 100644 index 3a2367d8..00000000 --- a/building-with-ocean/wallets.md +++ /dev/null @@ -1,31 +0,0 @@ -# Wallet Basics - -Ocean users need an ERC-20 compatible wallet to manage their ETH and OCEAN tokens. - -### Recommendations - -* **Easiest:** Use [MetaMask](https://metamask.io/) browser plug-in. -* **Still easy, but more secure:** Get a [Trezor](https://trezor.io/) or [Ledger](https://www.ledger.com/) hardware wallet, and use MetaMask to interact with it. -* The [OCEAN Token page](https://oceanprotocol.com/token) at oceanprotocol.com lists some other possible wallets. - -### The Meaning of "Wallet" - -A wallet usually means "a thing that stores private keys (and maybe signs transactions)" (explained below). Examples include MetaMask, Trezor, and Ledger wallets. - -A wallet can sometimes mean (web3) _software_ for interacting with a thing that stores private keys. Examples include MetaMask, [MyEtherWallet](https://www.myetherwallet.com/), and [MyCrypto](https://www.mycrypto.com/). - -Note how MetaMask is in both lists! - -### Related Terminology - -When you set up a new wallet, it might generate a **seed phrase** for you. Store that seed phrase somewhere secure and non-digital (e.g. on paper in a safe). It's extremely secret and sensitive. Anyone with your wallet's seed phrase could spend all the Ether and Ocean Tokens in all the accounts in your wallet. - -Once your wallet is set up, it will have one or more **accounts**. - -Each account has several **balances**, e.g. an Ether balance, an Ocean Token balance, and maybe other balances. All balances start at zero. - -An account's Ether balance might be 7.1 ETH in the Ethereum Mainnet, 2.39 ETH in Görli testnet. You can move ETH from one network to another only with a specially setup exchange or bridge. Also, you can't transfer tokens from networks holding value such as Ethereum mainnet to networks not holding value, i.e., testnets like Görli. The same is true of OCEAN token balances. - -Each account has one **private key** and one **address**. The address can be calculated from the private key. You must keep the private key secret because it's what's needed to spend/transfer ETH and OCEAN (or to sign transactions of any kind). You can share the address with others. In fact, if you want someone to send some ETH or OCEAN to an account, you give them the account's address. - -Note that unlike traditional pocket wallets, crypto wallets don't actually store ETH or OCEAN. They store private keys. diff --git a/contribute/README.md b/contribute/README.md new file mode 100644 index 00000000..91906867 --- /dev/null +++ b/contribute/README.md @@ -0,0 +1,79 @@ +--- +title: Ways to Contribute +description: Help develop Ocean Protocol software like a superhero +cover: ../.gitbook/assets/cover/contribute_banner.png +coverY: 0 +--- + +# 🤝 Contribute + +
+
+### Report a bug 🐞
+
+Do you think you see a bug in our code? To report a bug that _isn't a vulnerability_, go to the relevant GitHub repository, click on the _Issues_ tab, and select _Bug Report_.
+
+First, make sure that you search existing open + closed issues + PRs to see if your bug has already been reported there. If not, then go ahead and create a new bug report! 🦸
+
+#### Do you see an error in the Ocean Market?
+
+Follow our steps below to properly document your bug, and paste the screenshots into your GitHub issue.
+
+{% embed url="https://app.arcade.software/share/fUNrK6z2eurJ2C1ty2OG" fullWidth="false" %}
+{% endembed %}
+
+### Report vulnerabilities
+
+For all the super sleuths out there, you may be able to earn a bounty for reporting vulnerabilities in sensitive parts of our code. Check out our page on [Immunefi](https://immunefi.com/bounty/oceanprotocol/) for the latest bug bounties available. You can also responsibly disclose flaws by emailing us at [security@oceanprotocol.com](mailto:security@oceanprotocol.com).
+

*Did you find a glitch in our code matrix?*

+
+### Suggest a new feature 🤔💭
+
+Use the _Issues_ section of each repository and select _`Feature request`_ to suggest and discuss any features you would like to see added.
+
+As with bug reports, don't forget to search existing open + closed issues + PRs to see if something has already been suggested.
+
+### Improve core software
+
+It takes a tribe of awesome coders to build our tech stack, and you're invited to pitch in 😊 We'd love to have you contribute to any repository within the `oceanprotocol` [GitHub](https://github.com/oceanprotocol) organization!
+
+Before you start coding, please follow these basic guidelines:
+
+* If no feature request issue for your case is present, **please open one first before starting to work on something, so it can be discussed openly with the Ocean core team**.
+* Make yourself familiar with the repository-specific contribution requirements and code style requirements.
+* Because of the weird world of intellectual property, we need you to follow the [legal requirements](legal-reqs.md) for contributing code.
+* Be excellent to each other in the comments, as outlined in our [Contributor Code of Conduct](code-of-conduct.md).
+
+#### Your contribution workflow
+
+1. As an external developer, fork the respective repo and **push your code changes to your own fork.** Ocean core developers push directly to the repo under the `oceanprotocol` org.
+2. Provide the issue number information when you open a PR, for example: `issue-001-short-feature-description`. The issue number `issue-001` needs to reference the GitHub issue that you are trying to fix. The short feature description helps us quickly distinguish your PR among the other PRs in play.
+3. To get visibility and Continuous Integration feedback as early as possible, open your Pull Request as a `Draft`.
+4. Give it a meaningful title, and at least link to the respective issue in the Pull Request description, like `Fixes #23`. Describe your changes and mention things for reviewers to look out for; for UI changes, screenshots and videos are helpful.
+5. Once your Pull Request is ready, mark it as `Ready for Review`; in most repositories, code owners are automatically notified and asked for review.
+6. Get all CI checks green and address any change requests.
+7. If your PR stays open for a while and merge conflicts are detected, merge or rebase your branch against the current `main` branch.
+8. Once a Pull Request is approved, you can merge it.
+
+Depending on the release management of each repository, your contribution will either be included in the next release or deployed live automatically.
+
+Besides GitHub, you can chat with most Ocean Protocol core developers in our [Discord](https://discord.gg/TnXjkR5) if you have further development questions.
+
+### Develop a dApp or integration on top of Ocean Protocol
+
+We LOVE builders of dApps on Ocean! Nothing makes us feel prouder than seeing you create awesome things with our open-source tools.
+
+If you need ANY help, then we're here to talk with you on [Discord](https://discord.gg/TnXjkR5) to give you advice. We're also consistently improving our docs to help you. And... you're here :)
+
+### Improve our docs
+
+Our docs repo can always be improved. If you find a mistake or have an improvement to make, then follow the steps in our [contribution workflow](./#your-contribution-workflow) to commit your changes.
+
+### Apply for a developer job
+
+Do you REALLY love building on Ocean Protocol? Consider joining us full-time!
Our openings are listed at [https://github.com/oceanprotocol/jobs](https://github.com/oceanprotocol/jobs). + + + +Check our [Community Page](https://www.oceanprotocol.com/community) for our social media links where you can join the buzz around Ocean or chat with us directly 😊 Toodles! diff --git a/contribute/code-of-conduct.md b/contribute/code-of-conduct.md new file mode 100644 index 00000000..cbf44393 --- /dev/null +++ b/contribute/code-of-conduct.md @@ -0,0 +1,32 @@ +--- +title: Contributor Code of Conduct +description: Be excellent to each other. +--- + +# Contributor Code of Conduct + +As contributors and maintainers of this project, and in the interest of fostering an open and welcoming community, we pledge to respect all people who contribute to the project. + +We are committed to making participation in this project a harassment-free experience for everyone, regardless of level of experience, gender, gender identity and expression, sexual orientation, disability, personal appearance, body size, race, ethnicity, age, religion, nationality, or species. + +Examples of unacceptable behavior by participants include: + +* The use of sexualized language or imagery +* Personal attacks +* Trolling or insulting/derogatory comments +* Public or private harassment +* Publishing other's private information, such as physical or electronic addresses, without explicit permission +* Deliberate intimidation +* Other unethical or unprofessional conduct + +Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful. + +By adopting this Code of Conduct, project maintainers commit themselves to fairly and consistently applying these principles to every aspect of managing this project. Project maintainers who do not follow or enforce the Code of Conduct may be permanently removed from the project team. + +This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. + +Instances of abusive, harassing, or otherwise unacceptable behavior directed at yourself or another community member may be reported by contacting a project maintainer at [conduct@oceanprotocol.com](mailto:conduct@oceanprotocol.com). All complaints will be reviewed and investigated and will result in a response that is appropriate to the circumstances. Maintainers are obligated to maintain confidentiality with regard to the reporter of an incident. + + + +This Code of Conduct is adapted from the [Contributor Covenant](http://contributor-covenant.org), version 1.3.0, available at [contributor-covenant.org/version/1/3/0/](http://contributor-covenant.org/version/1/3/0/) diff --git a/core-concepts/legal-reqs.md b/contribute/legal-reqs.md similarity index 100% rename from core-concepts/legal-reqs.md rename to contribute/legal-reqs.md diff --git a/contribute/projects-using-ocean.md b/contribute/projects-using-ocean.md new file mode 100644 index 00000000..45adf7e6 --- /dev/null +++ b/contribute/projects-using-ocean.md @@ -0,0 +1,18 @@ +--- +title: Partners & Collaborators +description: We are so proud of the companies that use Ocean Protocol tools! +--- + +# Partners & Collaborators + +
+
+From startups to full enterprises, we have so many partners and collaborators using Ocean tech. Curious who's working with Ocean tools? Check out our up-to-date list of partners and collaborators on the [Ecosystem page](https://oceanprotocol.com/ecosystem) of our website.
+
+### Show your support by trading OCEAN tokens
+
+Visit [Coingecko's OCEAN markets page](https://www.coingecko.com/en/coins/ocean-protocol#markets) to see all the exchanges that support OCEAN. Here, you can see the most liquid exchanges, and many of them even offer liquidity mining and other yield opportunities.
+
+### Acknowledgements
+
+[GitBook](https://www.gitbook.com/) is a supporter of this open-source project by providing hosting for this documentation.
diff --git a/core-concepts/README.md b/core-concepts/README.md
deleted file mode 100644
index e8a038cc..00000000
--- a/core-concepts/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-# Core Concepts
-
diff --git a/core-concepts/architecture.md b/core-concepts/architecture.md
deleted file mode 100644
index 3bb157ba..00000000
--- a/core-concepts/architecture.md
+++ /dev/null
@@ -1,69 +0,0 @@
----
-title: Architecture Overview
-description: Data NFTs and datatokens architecture
----
-
-# Architecture Overview
-
-### Overview
-
-Here is the Ocean architecture.
-
-![Ocean Protocol tools architecture]()
-
-Here’s an overview of the figure.
-
-* The top layer is **applications** like Ocean Market. With these apps, users can onboard services like data, algorithms, and compute-to-data into crypto (publishing and minting data NFTs and datatokens), hold datatokens as assets (data wallets), discover assets, buy/sell datatokens for a fixed or auto-determined price (data marketplaces), and use data services (spending datatokens).
-* Below are **libraries** used by the applications: Ocean.js (JavaScript library) and Ocean.py (Python library). This also includes middleware to assist discovery:
-  * **Aquarius**: Provides a metadata cache for faster search by caching on-chain data into Elasticsearch.
-  * **Provider**: Facilitates downloading assets, DDO encryption, and communicating with `operator-service` for Compute-to-Data jobs.
-  * **The Graph**: A 3rd-party indexing tool; developers can use it together with the libraries to build their custom applications and marketplaces.
-* The lowest level has the **smart contracts**. The smart contracts are deployed on the Ethereum mainnet and other compatible networks. Libraries encapsulate the calls to these smart contracts and provide features like publishing new assets, facilitating consumption, managing pricing, and much more. To see the supported networks, click [here](networks.md).
-
-### Data NFTs, Datatokens and Access Control Tools
-
-Data NFTs are based on the [ERC721](https://eips.ethereum.org/EIPS/eip-721) standard. The publisher can use a marketplace or the client libraries to deploy a new data NFT contract. To save gas fees, it uses the [ERC1167](https://eips.ethereum.org/EIPS/eip-1167) proxy approach on the **ERC721 template**. The publisher can then assign the manager role to other Ethereum addresses, who can deploy new datatoken contracts and even mint them. Each datatoken contract is associated with one data NFT contract. Click [here](datanft-and-datatoken.md) to read further about data NFTs and datatokens.
-
-ERC721 data NFTs represent holding the copyright/base IP of a data asset, and ERC20 datatokens represent licenses to access the asset by downloading the content or running Compute-to-Data jobs.
-
-A datatoken represents the asset that the publisher wants to monetize.
-The asset can be a dataset or an algorithm. The publisher actor holds the asset in Google Drive, Dropbox, AWS S3, on their phone, on their home server, etc. The publisher can optionally use IPFS for a content-addressable URL. Or instead of a file, the publisher may run a compute-to-data service.
-
-In the **publish** step, the publisher invokes the **Ocean Datatoken Factory** to deploy a new datatoken to the chain. To save gas fees, it uses the [ERC1167](https://eips.ethereum.org/EIPS/eip-1167) proxy approach on the **ERC20 datatoken template**. The publisher then mints datatokens.
-
-The publisher runs their own **Ocean Provider** or can use one deployed by Ocean Protocol. In the **download** step, or while running a C2D job, the Provider software needs to retrieve the data service URL given a datatoken address. One approach would be for the publisher to run a database. However, this adds another dependency. To avoid this, the Provider encrypts the URL, which then gets published on-chain.
-
-To initiate the **download** step, the data buyer sends 1.0 datatokens to the Provider wallet. Then they make a service request to the Provider. The Provider loads the encrypted URL, decrypts it, and provisions the requested service (sending static data, or enabling a compute-to-data job).
-
-Instead of running a Provider themselves, the publisher can have a 3rd party like Ocean Market run it. While more convenient, it means that the 3rd party has custody of the private encryption/decryption key (more centralized). Ocean will support more service types and URL custody options in the future.
-
-**Ocean JavaScript and Python libraries** act as drivers for the lower-level contracts. Each library integrates with Ocean Provider to provision & access data services, and with Ocean Aquarius for metadata.
-
-### Market Tools
-
-Once someone has generated datatokens, they can be used in any ERC20 exchange, centralized or decentralized. In addition, Ocean provides a convenient default marketplace that is tuned for data: **Ocean Market**. It’s a vendor-neutral reference data marketplace for use by the Ocean community.
-
-The marketplaces are decentralized (no single owner or controller), and non-custodial (only the data owner holds the keys for the datatokens).
-
-Ocean Market supports fixed pricing or free pricing. For more details on pricing schemas, refer to [this guide](asset-pricing.md).
-
-Complementary to Ocean Market, Ocean has reference code to ease building **third-party data marketplaces**, such as for logistics ([dexFreight data marketplace](https://blog.oceanprotocol.com/dexfreight-ocean-protocol-partner-to-enable-transportation-logistics-companies-to-monetize-data-7aa839195ac)) or mobility ([Daimler](https://blog.oceanprotocol.com/ocean-protocol-delivers-proof-of-concept-for-daimler-ag-in-collaboration-with-daimler-south-east-564aa7d959ca)).
-
-[This post](https://blog.oceanprotocol.com/ocean-market-an-open-source-community-marketplace-for-data-4b99bedacdc3) elaborates on Ocean marketplace tools.
-
-### Metadata Tools
-
-Marketplaces use the metadata of the asset for discovery. Metadata consists of information like the type of asset, name of the asset, creation date, license, etc. Each data asset can have a [decentralized identifier](https://w3c-ccg.github.io/did-spec/) (DID) that resolves to a DID document (DDO) for associated metadata. The DDO is essentially [JSON](https://www.json.org/) filling in metadata fields. For more details on working with OCEAN DIDs, check out the [DID concept documentation](did-ddo.md).
The [DDO Metadata documentation](did-ddo.md#metadata) goes into more depth regarding metadata structure. - -[OEP8](did-ddo.md) specifies Ocean metadata schema, including fields that must be filled. It’s based on the public [DataSet schema from schema.org](https://schema.org/Dataset). - -Ocean uses the Ethereum mainnet and other compatible networks as an **on-chain metadata store**, i.e. to store both DID and DDO. This means that once the transaction fee is paid, there are no further expenses or devops work needed to ensure metadata availability into the future, aiding in the discoverability of data assets. It also simplifies integration with the rest of the Ocean system, which is Ethereum-based. Storage cost on Ethereum mainnet is not negligible, but not prohibitive and the other benefits are currently worth the trade-off compared to alternatives. - -Due to the permissionless, decentralized nature of data on the Ethereum mainnet, any last mile tool can access metadata. **Ocean Aquarius** supports different metadata fields for each different Ocean-based marketplace. Developers could also use [The Graph](https://www.thegraph.com) to see metadata fields that are common across all marketplaces. - -### Third-Party ERC20 Apps & Tools - -The ERC20 nature of datatokens eases composability with other Ethereum tools and apps, including **MetaMask** and **Trezor** as data wallets, DEXes as data exchanges, and more. [This post](https://blog.oceanprotocol.com/ocean-datatokens-from-money-legos-to-data-legos-4f867cec1837) has details. - -### Actor Identities - -Actors like data providers and buyers have Ethereum addresses, aka web3 accounts. These are managed by crypto wallets, as one would expect. For most use cases, this is all that’s needed. There are cases where the Ocean community could layer on protocols like [Verifiable Credentials](https://www.w3.org/TR/vc-data-model/) or tools like [3Box](https://3box.io/). diff --git a/core-concepts/asset-pricing.md b/core-concepts/asset-pricing.md deleted file mode 100644 index 944415fb..00000000 --- a/core-concepts/asset-pricing.md +++ /dev/null @@ -1,32 +0,0 @@ ---- -title: Asset Pricing -description: Choose the revenue model during asset publishing ---- - -# Asset Pricing - -Ocean Protocol offers two types of pricing options for asset monetization. The publisher can choose a pricing model which best suits their needs while publishing an asset. The pricing model selected cannot be changed once the asset is published. - -The price of an asset is determined by the number of Ocean tokens a buyer must pay to access the asset. When users pay the right amount of Ocean tokens, they get a _datatoken_ in their wallets, a tokenized representation of the access right stored on the blockchain. To read more about datatoken and data NFT click [here](datanft-and-datatoken.md). - -### Fixed pricing - -With the fixed price model, publishers set the price for the data in OCEAN. Ocean Market creates a datatoken in the background with a value equal to the dataset price in OCEAN so that buyers do not have to know about the datatoken. Buyers pay the amount specified in OCEAN for access. The publisher can update the price of the dataset later anytime. - -A [FixedRateExchange](https://github.com/oceanprotocol/contracts/blob/v4main/contracts/pools/fixedRate/FixedRateExchange.sol) smart contract stores the information about the price of the assets published using this model. - -The image below shows how to set the fixed pricing of an asset in the Ocean's Marketplace. 
Here, the price of the asset is set to 10 Ocean tokens. - -![fixed-asset-pricing]() - -### Free pricing - -With the free pricing model, the buyers can access an asset without requiring them to pay for it except for the transaction fees. - -With this pricing model, datatokens are allocated to the [dispenser](https://github.com/oceanprotocol/contracts/blob/v4main/contracts/pools/dispenser/Dispenser.sol) smart contract, which dispenses data tokens to users for free whenever they are accessing an asset. - -Free pricing is suitable for individuals and organizations working in the public domain and want their datasets to be freely available. Publishers can also choose this model if they publish assets with licenses that require them to make them freely available. - -The image below shows how to set free access to an asset in the Ocean's Marketplace. - -![free-asset-pricing]() diff --git a/core-concepts/code-of-conduct.md b/core-concepts/code-of-conduct.md deleted file mode 100644 index 2c766357..00000000 --- a/core-concepts/code-of-conduct.md +++ /dev/null @@ -1,50 +0,0 @@ ---- -title: Contributor Code of Conduct -description: Be excellent to each other. ---- - -As contributors and maintainers of this project, and in the interest of -fostering an open and welcoming community, we pledge to respect all people who -contribute to the project. - -We are committed to making participation in this project a harassment-free -experience for everyone, regardless of level of experience, gender, gender -identity and expression, sexual orientation, disability, personal appearance, -body size, race, ethnicity, age, religion, nationality, or species. - -Examples of unacceptable behavior by participants include: - -- The use of sexualized language or imagery -- Personal attacks -- Trolling or insulting/derogatory comments -- Public or private harassment -- Publishing other's private information, such as physical or electronic - addresses, without explicit permission -- Deliberate intimidation -- Other unethical or unprofessional conduct - -Project maintainers have the right and responsibility to remove, edit, or -reject comments, commits, code, wiki edits, issues, and other contributions -that are not aligned to this Code of Conduct, or to ban temporarily or -permanently any contributor for other behaviors that they deem inappropriate, -threatening, offensive, or harmful. - -By adopting this Code of Conduct, project maintainers commit themselves to -fairly and consistently applying these principles to every aspect of managing -this project. Project maintainers who do not follow or enforce the Code of -Conduct may be permanently removed from the project team. - -This Code of Conduct applies both within project spaces and in public spaces -when an individual is representing the project or its community. - -Instances of abusive, harassing, or otherwise unacceptable behavior directed at yourself or another community member may be reported by contacting a project maintainer at [conduct@oceanprotocol.com](mailto:conduct@oceanprotocol.com). All -complaints will be reviewed and investigated and will result in a response that -is appropriate to the circumstances. Maintainers are obligated to maintain confidentiality with regard to the reporter of an incident. 
- ---- - -This Code of Conduct is adapted from the [Contributor Covenant][homepage], -version 1.3.0, available at [contributor-covenant.org/version/1/3/0/][version] - -[homepage]: http://contributor-covenant.org -[version]: http://contributor-covenant.org/version/1/3/0/ diff --git a/core-concepts/contributing.md b/core-concepts/contributing.md deleted file mode 100644 index 8d0a99a0..00000000 --- a/core-concepts/contributing.md +++ /dev/null @@ -1,83 +0,0 @@ ---- -title: Ways to Contribute -description: Help to improve and develop Ocean core software. ---- - -## Report a bug - -To report a bug that isn't a vulnerability, go to the relevant GitHub repository, click on the _Issues_ tab and select _Bug report_. - -Before reporting a bug, search existing open and closed issues and PRs to see if something has already been reported. If not, then go ahead and create a new bug report, following the structure suggested in the issue template. - -## Report Vulnerabilities - -You may be able to earn a bounty for reporting vulnerabilities in sensitive parts of our code. Check our page on [Immunify](https://immunefi.com/bounty/oceanprotocol/) for the latest bug bounties available. You can also responsibly disclose flaws by emailing us at security@oceanprotocol.com. - -## Suggest a new feature - -Use the _Issues_ section of each repository and select _Feature request_ to suggest and discuss any features you would like to see added. - -As with bug reports, search existing open and closed issues and PRs to see if something has already been reported. - -## Fix or improve core software - -We'd love to have you contribute to any repository within the `oceanprotocol` GitHub organization! - -Before you start coding right away, please follow those basic guidelines: - -- If no issue for your case is present, open one first before starting to work on something, so it can be discussed. -- Make yourself familiar with eventual repository-specific contribution requirements and code style requirements. -- Because of the weird world of intellectual property, we need you to follow the [legal requirements](./legal-reqs.md) for contributing code. -- Be excellent to each other, as outlined in our [Contributor Code of Conduct](./code-of-conduct.md). - -### Workflow - -A typical code contribution in any Ocean Protocol repository would go as follows: - -1. As an external developer, fork the respective repo and push to your own fork. Ocean core developers push directly on the repo under `oceanprotocol` org. -2. You should create a new branch for your changes. The naming convention for branches is: `issue-001-short-feature-description`. The issue number `issue-001` needs to reference the GitHub issue that you are trying to fix. The short feature description helps to quickly distinguish your branch among the other branches in play. -3. To get visibility and Continuous Integration feedback as early as possible, open your Pull Request as a `Draft`. -4. Give it a meaningful title, and at least link to the respective issue in the Pull Request description, like `Fixes #23`. Describe your changes, mention things for reviewers to look out for, and for UI changes screenshots and videos are helpful. -5. Once your Pull Request is ready, mark it as `Ready for Review`, in most repositories code owners are automatically notified and asked for review. -6. Get all CI checks green and address eventual change requests. -7. 
If your PR stays open for longer and merge conflicts are detected, merge or rebase your branch against the current `main` branch. -8. Once a Pull Request is approved, you can merge it. - -Depending on the release management of each repository, your contribution will be either included in a next release, or is put live automatically. - -Except for GitHub, you can find most Ocean Protocol core developers in [Discord](https://discord.gg/TnXjkR5) if you have further development questions. - -## Develop an app or integration on top of Ocean Protocol - -Create an app with one of Ocean Protocol's interface points: - - - - - - -Ocean documentation will help. And... you're here:) - -## Improve these docs - -These docs can always be improved. Every content page has an edit link at its end linking you to the content source on GitHub for simple copy editing. - -If you found a technical bug or have an improvement suggestion, head over to the repo's _Issues_ section: - - - -## Apply for a developer job - -Really love building on Ocean and want to dive deeper? Consider joining us full time. Our openings are listed at https://github.com/oceanprotocol/devjobs. - -## Get Funding - -Funding can be for contributing to the core software, building apps, doing integrations, fixing bugs, community outreach, and more. Checkout our active funding programs for more information: - -- **[Ocean DAO](https://www.oceanprotocol.com/fund)** (grants curated by the community). -- **[Shipyard](https://oceanprotocol.com/shipyard)** (Ocean curated grants). -- **[Data Bounties](https://oceanprotocol.com/bounties)** (rewards for publishing algorithms and datasets). - -## Other ways to get involved - -Please go to the [Ocean Community Page](https://www.oceanprotocol.com/community) for more ideas on how to get involved. diff --git a/core-concepts/datanft-and-datatoken.md b/core-concepts/datanft-and-datatoken.md deleted file mode 100644 index ef1608dc..00000000 --- a/core-concepts/datanft-and-datatoken.md +++ /dev/null @@ -1,117 +0,0 @@ ---- -title: Data NFTs and Datatokens -description: >- - In Ocean Protocol, ERC721 data NFTs represent holding copyright/base IP of a - data asset, and ERC20 datatokens represent licenses to access the assets. ---- - -# Data NFTs and Datatokens - -A non-fungible token stored on the blockchain represents a unique asset. NFTs can represent images, videos, digital art, or any piece of information. NFTs can be traded, and allow transfer of copyright/base IP. [EIP-721](https://eips.ethereum.org/EIPS/eip-721) defines an interface for handling NFTs on EVM-compatible blockchains. The creator of the NFT can deploy a new contract on Ethereum or any Blockchain supporting NFT related interface and also, transfer the ownership of copyright/base IP through transfer transactions. - -Fungible tokens represent fungible assets. If you have 5 ETH and Alice has 5 ETH, you and Alice could swap your ETH and your final holdings remain the same. They're apples-to-apples. Licenses (contracts) to access a copyrighted asset are naturally fungible - they can be swapped with each other. - -![Data NFT and datatoken](../.gitbook/assets/datanft-and-datatoken.png) - -## What is a Data NFT? - -A data NFT represents the copyright (or exclusive license against copyright) for a data asset on the blockchain — we call this the “base IP”. When a user publishes a dataset in OceanOnda V4, they create a new NFT as part of the process. This data NFT is proof of your claim of base IP. 
Assuming a valid claim, you are entitled to the revenue from that asset, just like a title deed gives you the right to receive rent. - -The data NFT smart contract holds metadata about the data asset, stores roles like “who can mint datatokens” or “who controls fees”, and an open-ended key-value store to enable custom fields. - -If you have the private key that controls the NFT, you are the owner of that NFT. The owner has the claim on the base IP and is the default recipient of any revenue. They can also assign another account to receive revenue. This enables the publisher to sell their base IP and the revenues that come with it. When the Data NFT is transferred to another user, all the information about roles and where the revenue should be sent is reset. The default recipient of the revenue is the new owner of the data NFT. - -### Data NFTs Open Up New Possibilities - -With data NFTs, you are able to take advantage of the wider NFT ecosystem and all the tools and possibilities that come with it. As a first example, many leading crypto wallets have first-class support for NFTs, allowing you to manage data NFTs from those wallets. Or, you can post your data NFT for sale on a popular NFT marketplace like [OpenSea](https://www.opensea.io/) or [Rarible](https://www.rarible.com/). As a final example, we’re excited to see [data NFTs linked to physical items via WiseKey chips](https://www.globenewswire.com/news-release/2021/05/19/2232106/0/en/WISeKey-partners-with-Ocean-Protocol-to-launch-TrustedNFT-io-a-decentralized-marketplace-for-objects-of-value-designed-to-empower-artists-creators-and-collectors-with-a-unique-solu.html). - -## High-Level Architecture - -The image above describes how ERC721 data NFTs and ERC20 datatokens relate. - -* Bottom: The publisher deploys an ERC721 data NFT contract representing the base IP for the data asset. They are now the manager of the data NFT. -* Top: The manager then deploys an ERC20 datatoken contract against the data NFT. The ERC20 represents a license with specific terms like "can download for the next 3 days". They could even publish further ERC20 datatoken contracts, to represent different license terms or for compute-to-data. - -### Terminology - -* **Base IP** means the artifact being copyrighted. Represented by the {ERC721 address, tokenId} from the publish transactions. -* **Base IP holder** means the holder of the Base IP. Represented as the actor that did the initial "publish" action. -* **Sub-licensee** is the holder of the sub-license. Represented as the entity that controls address ERC721.\_owners\[tokenId=x]. -* **To Publish**: Claim copyright or exclusive base license. -* **To Sub-license**: Transfer one (of many) sub-licenses to new licensee: ERC20.transfer(to=licensee, value=1.0). - -### Implementation in Ocean Protocol - -We have implemented data NFTs using the [ERC721 standard](https://erc721.org/). Ocean Protocol defines the [ERC721Factory](https://github.com/oceanprotocol/contracts/blob/v4main/contracts/ERC721Factory.sol) contract, allowing **Base IP holders** to create their ERC721 contract instances on any supported networks. The deployed contract stores Metadata, ownership, sub-license information, permissions. The contract creator can also create and mint ERC20 token instances for sub-licensing the **Base IP**. - -ERC721 tokens are non-fungible, thus cannot be used for automatic price discovery like ERC20 tokens. ERC721 and ERC20 combined together can be used for sub-licensing. 
Ocean Protocol's [ERC721Template](https://github.com/oceanprotocol/contracts/blob/v4main/contracts/templates/ERC721Template.sol) solves this problem by using ERC721 to tokenize the **Base IP** and ERC20 to tokenize sub-licenses. - -Our implementation has been built on top of the battle-tested [OpenZeppelin contract library](https://docs.openzeppelin.com/contracts/4.x/erc721). However, there are a bunch of interesting parts of our implementation that go a bit beyond an out-of-the-box NFT. - -Ocean V4’s data NFT factory can deploy different types of data NFTs based on a variety of templates. Some templates could be tuned for data unions, others for DeFi, and others yet for enterprise use cases. - -Something else that we’re super excited about in our data NFTs is a cutting-edge standard called [ERC725](https://github.com/ERC725Alliance/erc725/blob/main/docs/ERC-725.md) being driven by our friends at [Lukso](https://lukso.network/about). The ERC725y feature enables the NFT owner (or a user with the “store updater” role) to input and update information in a key-value store. These values can be viewed externally by anyone. - -ERC725y is incredibly flexible and can be used to store any string; you could use it for anything from additional metadata to encrypted values. This helps future-proof the data NFTs and ensure that they are suitable for a wide range of projects that have not been launched yet. As you can imagine, the inclusion of ERC725y has huge potential and we look forward to seeing the different ways people end up using it. If you’re interested in using this, take a look at [EIP725](https://eips.ethereum.org/EIPS/eip-725#erc725y). - -Continuing the theme of flexibility, for a given data NFT, you can have one or more ERC20 datatoken contracts. Here’s the main idea: holding 1.0 datatokens allows you to consume the corresponding dataset. Put another way, it’s a sub-license from the base IP to be able to use the dataset according to the license terms (when you send it to the publisher). License terms can be set from a “good default”, or by the data NFT owner. The ERC20 fungible token standard is a natural choice for datatokens, because licenses themselves are fungible: one license can be exchanged 1:1 with another. Using the ERC20 standard enables interoperability of datatokens with ERC20-based wallets, DEXes, DAOs, and more. Datatokens can be given (simply transferred), purchased on a marketplace / exchange, airdropped, etc. - -You can publish a data NFT initially with no ERC20 datatoken contracts. This means you simply aren’t ready to grant access to your data asset yet (sub-license it). Then, you can publish one or more ERC20 datatoken contracts against the data NFT. One datatoken contract might grant consume rights for 1 day, another for 1 week, etc. Each different datatoken contract is for different license terms. - -Ocean provides convenient methods to list ERC20 datatokens for sale, at a fixed price (atomic swap), or for free. Like any ERC20 token, datatokens may be listed on many decentralised exchanges (DEXes), centralised exchanges (CEXes), over-the-counter, or otherwise. - -### High-Level Behavior - -![Flow]() - -Here's an example. - -* In step 1, Alice **publishes** her dataset with Ocean: this means deploying an ERC721 data NFT contract (claiming copyright/base IP), then an ERC20 datatoken contract (license against base IP). -* In step 2, she **mints** some ERC20 datatokens and **transfers** 1.0 of them to Bob's wallet; now he has a license to be able to download that dataset. A minimal code sketch of this step follows below.
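- -For illustration, here is a minimal sketch of step 2 using ethers.js v5 (an assumption - any web3 library works). It assumes `datatokenAddress` points to an already-deployed ERC20 datatoken contract, that `signer` is Alice's wallet holding the `MINTER` role on that datatoken, and the helper name `grantAccess` is hypothetical. - -```js -const { ethers } = require("ethers") - -// Only the two datatoken functions this sketch needs -const abi = [ -  "function mint(address account, uint256 value)", -  "function transfer(address to, uint256 amount) returns (bool)" -] - -async function grantAccess(datatokenAddress, signer, bobAddress) { -  const datatoken = new ethers.Contract(datatokenAddress, abi, signer) -  // Step 2a: mint 10 datatokens to Alice's own address -  await datatoken.mint(await signer.getAddress(), ethers.utils.parseEther("10")) -  // Step 2b: transfer 1.0 datatoken to Bob - he now holds a sub-license to the dataset -  await datatoken.transfer(bobAddress, ethers.utils.parseEther("1.0")) -} -```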
- -### Revenue - -By default, revenue is sent to the owner of the data NFT, and this automatically updates when the data NFT is sent to a new owner. Owning a data NFT therefore has an explicit value - the revenue stream associated with selling the digital asset. - -In some situations, you may want the revenue to be sent to another account rather than the owner. This can be done by setting a new **payment collector**. Changing the payment collector can be particularly useful when the data NFT is owned by an organization or enterprise, rather than an individual. - -In order to set a new payment collector, visit the asset detail page, click “Edit Asset”, and scroll down to the field called “Payment Collector Address”. Add the new Ethereum address in this field and click “Submit“. Finally, you will need to sign two transactions to finalize the update. - -![Update Payment Collector]() - -### TemplateIds - -Each data NFT or datatoken is cloned from a pre-defined template contract. The _templateId_ parameter refers to the template from which a data NFT or datatoken is created. The templateId can be set while creating the data NFT/datatoken. The templateId is stored in the code of the smart contract and can be retrieved using the `getId()` function. Currently, Ocean Protocol supports one template type for data NFTs and two template variants for datatokens, namely the **regular template** and the **enterprise template**. Each template supports the same interfaces but differs in the underlying implementation and can have additional features. - -The only data NFT template currently available has templateId `1` and the source code is available [here](https://github.com/oceanprotocol/contracts/blob/v4main/contracts/templates/ERC721Template.sol). - -The details regarding currently supported datatoken templates are as follows: - -* **Regular template**: The regular template allows users to buy/sell/hold datatokens. The datatokens can be minted by an address having the `MINTER` role, making the datatoken supply variable. This template is assigned templateId `1` and the source code is available [here](https://github.com/oceanprotocol/contracts/blob/v4main/contracts/templates/ERC20Template.sol). -* **Enterprise template**: The enterprise template has additional functions apart from the methods in the ERC20 interface. This additional feature allows access to the service by paying in the base token instead of the datatoken. Internally, the smart contract handles the conversion of base token to datatoken, initiating an order to access the service, and minting/burning the datatoken. The total supply of the datatoken effectively remains 0 in the case of the enterprise template. This template is assigned templateId `2` and the source code is available [here](https://github.com/oceanprotocol/contracts/blob/v4main/contracts/templates/ERC20TemplateEnterprise.sol). - -_NOTE: Ocean Protocol might support additional variations of data NFT/datatoken by adding new templates._ - -### Fractional Ownership - -Fractional ownership is an exciting sub-niche of Web3, at the intersection of NFTs and DeFi. It allows co-ownership of data IP. - -Ocean provides two approaches to fractional ownership: - -* Sharded holding of ERC20 datatokens, where each ERC20 holder has the usual datatoken rights as described above, e.g. 1.0 datatokens to consume an asset. This comes out-of-the-box with Ocean. -* Sharding the ERC721 data NFT, where each co-holder has a right to some earnings against the base IP, and co-controls the data NFT.
For example, there’s a DAO whose sole purpose is to hold the data NFT; this DAO has its own ERC20 token; DAO members vote with tokens to update data NFT roles or deploy ERC20 datatokens against the ERC721. - -Note: For (2), one might consider doing sharding with something like Niftex. But then there are questions: what rights do the shard-holders get, exactly? It could be none; for example, Amazon shareholders don’t have the right to walk the hallways of the Amazon offices just because they hold shares. Secondly, how do the shard-holders control the data NFT? These questions get resolved by using a tokenized DAO, as described above. - -Data DAOs are a cool use case whenever you have a group of people who wish to co-manage data, or bundle up data for larger collective bargaining power. The DAO may be a union, co-op, or trust. - -Consider the following mobile app example. You install the app; it has a built-in crypto wallet; you give the app permission to see your location data; the app gets the DAO to sell your (anonymized) location data on your behalf; the DAO sells your data bundled along with that of thousands of other DAO members; as a DAO member you get a cut of the profits. - -This has several variants. Each member’s data feed could be its own data NFT with associated datatokens. Or, there’s simply one data NFT aggregating datafeeds across all members into a single feed, and the feed is fractionalized by sharded holding of ERC20 tokens (1 above) or sharding the ERC721 data NFT (2 above). If you’re interested in starting a data union, we recommend getting in touch with our friends at [Data Union](https://www.dataunion.app/). - -### Other References - -* [Data & NFTs 1: Practical Connections of ERC721 with Intellectual Property](https://blog.oceanprotocol.com/nfts-ip-1-practical-connections-of-erc721-with-intellectual-property-dc216aaf005d) -* [Data & NFTs 2: Leveraging ERC20 Fungibility](https://blog.oceanprotocol.com/nfts-ip-2-leveraging-erc20-fungibility-bcee162290e3) -* [Data & NFTs 3: Combining ERC721 & ERC20](https://blog.oceanprotocol.com/nfts-ip-3-combining-erc721-erc20-b69ea659115e) -* [Fungibility sightings in NFTs](https://blog.oceanprotocol.com/on-difficult-to-explain-fungibility-sightings-in-nfts-26bc18620f70) diff --git a/core-concepts/did-ddo.md b/core-concepts/did-ddo.md deleted file mode 100644 index 71308250..00000000 --- a/core-concepts/did-ddo.md +++ /dev/null @@ -1,873 +0,0 @@ ---- -title: DID & DDO -slug: /concepts/did-ddo/ -section: concepts -description: >- - Specification of decentralized identifiers for assets in Ocean Protocol using - the DID & DDO standards. --- - -# DID & DDO - -**v4.1.0** - -### Overview - -This document describes how Ocean assets follow the DID/DDO specification, such that Ocean assets can inherit DID/DDO benefits and enhance interoperability. DIDs and DDOs follow the [specification defined by the World Wide Web Consortium (W3C)](https://w3c-ccg.github.io/did-spec/). - -Decentralized identifiers (DIDs) are a type of identifier that enables verifiable, decentralized digital identity. Each DID is associated with a unique entity, and DIDs may represent humans, objects, and more. - -A DID Document (DDO) is a JSON blob that holds information about the DID. Given a DID, a _resolver_ will return the DDO of that DID. - -### Rules for DID & DDO - -An _asset_ in Ocean represents a downloadable file, compute service, or similar. Each asset is a _resource_ under the control of a _publisher_.
The Ocean network itself does _not_ store the actual resource (e.g. files). - -An _asset_ has a DID and DDO. The DDO should include [metadata](did-ddo.md#metadata) about the asset, and define access in at least one [service](did-ddo.md#services). Only _owners_ or _delegated users_ can modify the DDO. - -All DDOs are stored on-chain in encrypted form to be fully GDPR-compatible. A metadata cache like _Aquarius_ can help in reading, decrypting, and searching through encrypted DDO data from the chain. Because the file URLs are encrypted separately, on top of the full DDO encryption, it is safe to return unencrypted DDOs (e.g. via an API): the file URLs still stay encrypted. - -### Publishing & Retrieving DDOs - -The DDO is stored on-chain as part of the NFT contract, encrypted using the private key of the _Provider_. To resolve it, a metadata cache like _Aquarius_ must query the Provider to decrypt the DDO. - -Here is the flow: - -![DDO flow](images/ddo-flow.png) - -<details>
- -<summary>UML source</summary> - -``` -title DDO flow - -User(Ocean library) -> User(Ocean library): Prepare DDO -User(Ocean library) -> Provider: encrypt DDO -Provider -> User(Ocean library): encryptedDDO -User(Ocean library) -> ERC721 contract: publish encryptedDDO -Aquarius <-> ERC721 contract: monitors ERC721 contract and gets MetadataCreated Event (contains encryptedDDO) -Aquarius -> ERC721 contract: calls getMetaData() -Aquarius -> Provider: decrypt encryptedDDO, signed request using Aquarius's private key -Provider -> ERC721 contract: checks state using getMetaData() -Provider -> Provider: depending on metadataState (expired, retired) and aquarius address, validates the request -Provider -> Aquarius: DDO -Aquarius -> Aquarius : validate DDO -Aquarius -> Aquarius : cache DDO -Aquarius -> Aquarius : enhance cached DDO in response with additional info like events & stats -```
- -### DID - -In Ocean, a DID is a string that looks like this: - -``` -did:op:0ebed8226ada17fde24b6bf2b95d27f8f05fcce09139ff5cec31f6d81a7cd2ea -``` - -The part after `did:op:` is the ERC721 contract address(in checksum format) and the chainId (expressed as a decimal) the asset has been published to: - -```js -const checksum = sha256(ERC721 contract address + chainId) -console.log(checksum) -// 0ebed8226ada17fde24b6bf2b95d27f8f05fcce09139ff5cec31f6d81a7cd2ea -``` - -It follows [the generic DID scheme](https://w3c-ccg.github.io/did-spec/#the-generic-did-scheme). - -### DDO - -A DDO in Ocean has these required attributes: - -| Attribute | Type | Description | -| ----------------- | ------------------------------------- | -------------------------------------------------------------------------------------------------------------- | -| **`@context`** | Array of `string` | Contexts used for validation. | -| **`id`** | `string` | Computed as `sha256(address of ERC721 contract + chainId)`. | -| **`version`** | `string` | Version information in [SemVer](https://semver.org) notation referring to this DDO spec version, like `4.1.0`. | -| **`chainId`** | `number` | Stores chainId of the network the DDO was published to. | -| **`nftAddress`** | `string` | NFT contract linked to this asset | -| **`metadata`** | [Metadata](did-ddo.md#metadata) | Stores an object describing the asset. | -| **`services`** | [Services](did-ddo.md#services) | Stores an array of services defining access to the asset. | -| **`credentials`** | [Credentials](did-ddo.md#credentials) | Describes the credentials needed to access a dataset in addition to the `services` definition. | - -#### Metadata - -This object holds information describing the actual asset. - -| Attribute | Type | Required | Description | -| --------------------------- | --------------------------------------------------- | --------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -| **`created`** | `ISO date/time string` | | Contains the date of the creation of the dataset content in ISO 8601 format preferably with timezone designators, e.g. `2000-10-31T01:30:00Z`. | -| **`updated`** | `ISO date/time string` | | Contains the date of last update of the dataset content in ISO 8601 format preferably with timezone designators, e.g. `2000-10-31T01:30:00Z`. | -| **`description`** | `string` | **✓** | Details of what the resource is. For a dataset, this attribute explains what the data represents and what it can be used for. | -| **`copyrightHolder`** | `string` | | The party holding the legal copyright. Empty by default. | -| **`name`** | `string` | **✓** | Descriptive name or title of the asset. | -| **`type`** | `string` | **✓** | Asset type. Includes `"dataset"` (e.g. csv file), `"algorithm"` (e.g. Python script). Each type needs a different subset of metadata attributes. | -| **`author`** | `string` | **✓** | Name of the entity generating this data (e.g. Tfl, Disney Corp, etc.). | -| **`license`** | `string` | **✓** | Short name referencing the license of the asset (e.g. Public Domain, CC-0, CC-BY, No License Specified, etc. ). If it's not specified, the following value will be added: "No License Specified". | -| **`links`** | Array of `string` | | Mapping of URL strings for data samples, or links to find out more information. Links may be to either a URL or another asset. 
| **`contentLanguage`** | `string` | | The language of the content. Use one of the language codes from the [IETF BCP 47 standard](https://tools.ietf.org/html/bcp47). | -| **`tags`** | Array of `string` | | Array of keywords or tags used to describe this content. Empty by default. | -| **`categories`** | Array of `string` | | Array of categories associated with the asset. Note: it is recommended to use `tags` instead of this. | -| **`additionalInformation`** | Object | | Stores additional information; this is customizable by the publisher. | -| **`algorithm`** | [Algorithm Metadata](did-ddo.md#algorithm-metadata) | **✓** (for algorithm assets only) | Information about assets of `type` `algorithm`. | - -Example: - -```json -{ -  "metadata": { -    "created": "2020-11-15T12:27:48Z", -    "updated": "2021-05-17T21:58:02Z", -    "description": "Sample description", -    "name": "Sample asset", -    "type": "dataset", -    "author": "OPF", -    "license": "https://market.oceanprotocol.com/terms" -  } -} -``` - -#### Algorithm Metadata - -An asset of type `algorithm` has additional attributes under `metadata.algorithm`, describing the algorithm and the Docker environment it is supposed to be run in. - -| Attribute | Type | Required | Description | -| --------- | ---- | -------- | ----------- | -| **`language`** | `string` | | Language used to implement the software. | -| **`version`** | `string` | | Version of the software, preferably in [SemVer](https://semver.org) notation. E.g. `1.0.0`. | -| **`consumerParameters`** | [Consumer Parameters](did-ddo.md#consumer-parameters) | | An object that defines required consumer input before running the algorithm. | -| **`container`** | `container` | **✓** | Object describing the Docker container image. See below. | - -The `container` object has the following attributes defining the Docker image for running the algorithm: - -| Attribute | Type | Required | Description | -| --------- | ---- | -------- | ----------- | -| **`entrypoint`** | `string` | **✓** | The command to execute, or script to run inside the Docker image. | -| **`image`** | `string` | **✓** | Name of the Docker image. | -| **`tag`** | `string` | **✓** | Tag of the Docker image. | -| **`checksum`** | `string` | **✓** | Digest of the Docker image (i.e. `sha256:xxxxx`). | - -```json -{ -  "metadata": { -    "created": "2020-11-15T12:27:48Z", -    "updated": "2021-05-17T21:58:02Z", -    "description": "Sample description", -    "name": "Sample algorithm asset", -    "type": "algorithm", -    "author": "OPF", -    "license": "https://market.oceanprotocol.com/terms", -    "algorithm": { -      "language": "Node.js", -      "version": "1.0.0", -      "container": { -        "entrypoint": "node $ALGO", -        "image": "ubuntu", -        "tag": "latest", -        "checksum": "sha256:44e10daa6637893f4276bb8d7301eb35306ece50f61ca34dcab550" -      }, -      "consumerParameters": {} -    } -  } -} -``` - -#### Services - -Services define the access for an asset, and each service is represented by its respective datatoken. - -An asset should have at least one service to be actually accessible, and can have as many services as make sense for a specific use case.
- -| Attribute | Type | Required | Description | -| --------- | ---- | -------- | ----------- | -| **`id`** | `string` | **✓** | Unique ID | -| **`type`** | `string` | **✓** | Type of service (`access`, `compute`, `wss`, etc.) | -| **`name`** | `string` | | Service friendly name | -| **`description`** | `string` | | Service description | -| **`datatokenAddress`** | `string` | **✓** | Datatoken address | -| **`serviceEndpoint`** | `string` | **✓** | Provider URL (schema + host) | -| **`files`** | [Files](did-ddo.md#files) | **✓** | Encrypted file URLs. | -| **`timeout`** | `number` | **✓** | Describes how long the service can be used after consumption is initiated. A timeout of `0` represents no time limit. Expressed in seconds. | -| **`compute`** | [Compute](did-ddo.md#compute-options) | **✓** (for compute assets only) | If the service is of `type` `compute`, holds information about the compute-related privacy settings & resources. | -| **`consumerParameters`** | [Consumer Parameters](did-ddo.md#consumer-parameters) | | An object that defines required consumer input before consuming the asset. | -| **`additionalInformation`** | Object | | Stores additional information; this is customizable by the publisher. | - -#### Files - -The `files` field is returned as a `string` which holds the encrypted file URLs. - -Example: - -```json -{ -  "files": "0x044736da6dae39889ff570c34540f24e5e084f4e5bd81eff3691b729c2dd1465ae8292fc721e9d4b1f10f56ce12036c9d149a4dab454b0795bd3ef8b7722c6001e0becdad5caeb2005859642284ef6a546c7ed76f8b350480691f0f6c6dfdda6c1e4d50ee90e83ce3cb3ca0a1a5a2544e10daa6637893f4276bb8d7301eb35306ece50f61ca34dcab550b48181ec81673953d4eaa4b5f19a45c0e9db4cd9729696f16dd05e0edb460623c843a263291ebe757c1eb3435bb529cc19023e0f49db66ef781ca692655992ea2ca7351ac2882bf340c9d9cb523b0cbcd483731dc03f6251597856afa9a68a1e0da698cfc8e81824a69d92b108023666ee35de4a229ad7e1cfa9be9946db2d909735" -} -``` - -During the publish process, file URLs must be encrypted with a respective _Provider_ API call before storing the DDO on-chain. For this, you need to send the following object to Provider: - -```json -{ -  "datatokenAddress": "0x1", -  "nftAddress": "0x2", -  "files": [ -    ... -  ] -} -``` - -where "files" contains one or more storage objects. - -**Types of objects supported:** - -**`URL`** - -Static URLs. - -Parameters: -* `url` - File URL, required -* `method` - The HTTP method, required -* `headers` - Additional HTTP headers, optional - -``` -{ -  "type": "url", -  "url": "https://url.com/file1.csv", -  "method": "GET", -  "headers": -  { -    "Authorization": "Bearer 123", -    "APIKEY": "124" -  } -} -``` - -**`IPFS`** - -The [Interplanetary File System](https://ipfs.tech/) (IPFS) is a distributed file storage protocol that allows computers all over the globe to store and serve files as part of a giant peer-to-peer network. Any computer, anywhere in the world, can download the IPFS software and start hosting and serving files. - -Parameters: -* `hash` - The file hash - -``` -{ -  "type": "ipfs", -  "hash": "XXX" -} -``` - -**`GraphQL`** - -[GraphQL](https://graphql.org/) is a query language for APIs and a runtime for fulfilling those queries with your existing data.
- -Parameters: -* `url` - Server endpoint URL, required -* `query` - The query to be executed, required -* `headers` - Additional HTTP headers, optional - -``` -{ -  "type": "graphql", -  "url": "http://172.15.0.15:8000/subgraphs/name/oceanprotocol/ocean-subgraph", -  "headers": { -    "Authorization": "Bearer 123", -    "APIKEY": "124" -  }, -  "query": """query{ -    nfts(orderBy: createdTimestamp,orderDirection:desc){ -      id -      symbol -      createdTimestamp -    } -  }""" -} -``` - -**`On-Chain`** - -Use a smart contract as a data source. - -Parameters: - -* `chainId` - The chainId used to query the contract, required -* `address` - The smart contract address, required -* `abi` - The function ABI (NOT the entire contract ABI), required - -``` -{ -  "type": "smartcontract", -  "chainId": 1, -  "address": "0x8149276f275EEFAc110D74AFE8AFECEaeC7d1593", -  "abi": { -    "inputs": [], -    "name": "swapOceanFee", -    "outputs": [{"internalType": "uint256", "name": "", "type": "uint256"}], -    "stateMutability": "view", -    "type": "function" -  } -} -``` - -**`Arweave`** - -[Arweave](https://www.arweave.org/) is a decentralized data storage network that allows files to be stored permanently over a distributed network of computers. - -Parameters: -* `transactionId` - The transaction identifier - -``` -{ -  "type": "arweave", -  "transactionId": "a4qJoQZa1poIv5guEzkfgZYSAD0uYm7Vw4zm_tCswVQ" -} -``` - -First-class integrations supported in the future: -**`Filecoin`** -**`Storj`** -**`SQL`** - -A service can contain multiple files, using multiple storage types. - -Example: - -```json -{ -  "datatokenAddress": "0x1", -  "nftAddress": "0x2", -  "files": [ -    { -      "type": "url", -      "url": "https://url.com/file1.csv", -      "method": "GET" -    }, -    { -      "type": "ipfs", -      "hash": "XXXX" -    } -  ] -} -``` - -To get information about the files after encryption, the `/fileinfo` endpoint of _Provider_ returns, for a passed DID, an array of file metadata (based on the file type): - -```json -[ -  { -    "type": "url", -    "contentLength": 100, -    "contentType": "application/json" -  }, -  { -    "type": "ipfs", -    "contentLength": 130, -    "contentType": "application/text" -  } -] -``` - -This only concerns metadata about a file, but never the file URLs. The only way to decrypt them is to exchange at least 1 datatoken based on the respective service pricing scheme. - -#### Compute Options - -An asset with a service of `type` `compute` has the following additional attributes under the `compute` object. This object is required if the asset is of `type` `compute`, but can be omitted for `type` of `access`. - -| Attribute | Type | Required | Description | -| --------- | ---- | -------- | ----------- | -| `allowRawAlgorithm` | `boolean` | **✓** | If `true`, any passed raw text will be allowed to run. Useful for an algorithm drag & drop use case, but increases risk of data escape through malicious user input. Should be `false` by default in all implementations. | -| `allowNetworkAccess` | `boolean` | **✓** | If `true`, the algorithm job will have network access. | -| `publisherTrustedAlgorithmPublishers` | Array of `string` | **✓** | If not defined, then any published algorithm is allowed. If an empty array, then no algorithm is allowed. If not empty, any algorithm published by the defined publishers is allowed. | -| `publisherTrustedAlgorithms` | Array of `publisherTrustedAlgorithms` | **✓** | If not defined, then any published algorithm is allowed. If an empty array, then no algorithm is allowed.
Otherwise, only the algorithms defined in the array are allowed (see below). | - -The `publisherTrustedAlgorithms` is an array of objects with the following structure: - -| Attribute | Type | Required | Description | -| --------- | ---- | -------- | ----------- | -| **`did`** | `string` | **✓** | The DID of the algorithm which is trusted by the publisher. | -| **`filesChecksum`** | `string` | **✓** | Hash of the algorithm's files (as `string`). | -| **`containerSectionChecksum`** | `string` | **✓** | Hash of the algorithm's image details (as `string`). | - -To produce `filesChecksum`, call the Provider FileInfo endpoint with the parameter `withChecksum = True`. If the algorithm has multiple files, `filesChecksum` is the concatenated string of all file checksums (i.e. checksumFile1+checksumFile2, etc.). - -To produce `containerSectionChecksum`: - -```js -sha256(algorithm_ddo.metadata.algorithm.container.entrypoint + algorithm_ddo.metadata.algorithm.container.checksum) -``` - -Example: - -```json -{ -  "services": [ -    { -      "id": "1", -      "type": "access", -      "files": "0x044736da6dae39889ff570c34540f24e5e084f...", -      "name": "Download service", -      "description": "Download service", -      "datatokenAddress": "0x123", -      "serviceEndpoint": "https://myprovider.com", -      "timeout": 0 -    }, -    { -      "id": "2", -      "type": "compute", -      "files": "0x6dd05e0edb460623c843a263291ebe757c1eb3...", -      "name": "Compute service", -      "description": "Compute service", -      "datatokenAddress": "0x124", -      "serviceEndpoint": "https://myprovider.com", -      "timeout": 0, -      "compute": { -        "allowRawAlgorithm": false, -        "allowNetworkAccess": true, -        "publisherTrustedAlgorithmPublishers": ["0x234", "0x235"], -        "publisherTrustedAlgorithms": [ -          { -            "did": "did:op:123", -            "filesChecksum": "100", -            "containerSectionChecksum": "200" -          }, -          { -            "did": "did:op:124", -            "filesChecksum": "110", -            "containerSectionChecksum": "210" -          } -        ] -      } -    } -  ] -} -``` - -#### Consumer Parameters - -Sometimes, the asset needs additional input data before downloading or running a Compute-to-Data job. Examples: - -* The publisher needs to know the sampling interval before the buyer downloads it. Suppose the dataset URL is `https://example.com/mydata`. The publisher defines a field called `sampling` and asks the buyer to enter a value. This parameter is then added to the URL of the published dataset as a query parameter: `https://example.com/mydata?sampling=10`. -* An algorithm that needs to know the number of iterations it should perform. In this case, the algorithm publisher defines a field called `iterations`. The buyer needs to enter a value for the `iterations` parameter. Later, this value is stored in a specific location in the Compute-to-Data pod for the algorithm to read and use. - -The `consumerParameters` is an array of objects. Each object defines a field and has the following structure: - -| Attribute | Type | Required | Description | -| --------- | ---- | -------- | ----------- | -| **`name`** | `string` | **✓** | The parameter name (sent as an HTTP param, or as a key to the algorithm) | -| **`type`** | `string` | **✓** | The field type (text, number, boolean, select) | -| **`label`** | `string` | **✓** | The field label which is displayed | -| **`required`** | `boolean` | **✓** | Whether consumer input for this field is mandatory. | -| **`description`** | `string` | **✓** | The field description. |
| **`default`** | `string`, `number`, or `boolean` | **✓** | The field default value. For select types, the `string` key of the default option. | -| **`options`** | Array of `option` | | For select types, a list of options. | - -Each `option` is an `object` containing a single key:value pair where the key is the option name, and the value is the option value. - -Example: - -```json -[ -  { -    "name": "hometown", -    "type": "text", -    "label": "Hometown", -    "required": true, -    "description": "What is your hometown?", -    "default": "Nowhere" -  }, -  { -    "name": "age", -    "type": "number", -    "label": "Age", -    "required": false, -    "description": "Please fill your age", -    "default": 0 -  }, -  { -    "name": "developer", -    "type": "boolean", -    "label": "Developer", -    "required": false, -    "description": "Are you a developer?", -    "default": false -  }, -  { -    "name": "languagePreference", -    "type": "select", -    "label": "Language", -    "required": false, -    "description": "Do you like NodeJs or Python", -    "default": "nodejs", -    "options": [ -      { -        "nodejs": "I love NodeJs" -      }, -      { -        "python": "I love Python" -      } -    ] -  } -] -``` - -Algorithms will have access to a JSON file located at `/data/inputs/algoCustomData.json`, which contains the required input keys/values. Example: - -```json -{ -  "hometown": "São Paulo", -  "age": 10, -  "developer": true, -  "languagePreference": "nodejs" -} -``` - -#### Credentials - -By default, a consumer can access a resource if they have 1 datatoken. _Credentials_ allow the publisher to optionally specify more fine-grained permissions. - -Consider a medical data use case, where only a credentialed EU researcher can legally access a given dataset. Ocean supports this as follows: a consumer can only access the resource if they have 1 datatoken _and_ one of the specified `"allow"` credentials. - -This is like going to an R-rated movie, where you can only get in if you show both your movie ticket (datatoken) _and_ some identification showing you're old enough (credential). - -Only credentials that can be proven are supported. This includes Ethereum public addresses, and in the future [W3C Verifiable Credentials](https://www.w3.org/TR/vc-data-model/) and more. - -Ocean also supports `"deny"` credentials: if a consumer has any of these credentials, they cannot access the resource. - -Here's an example object with both `"allow"` and `"deny"` entries: - -```json -{ -  "credentials": { -    "allow": [ -      { -        "type": "address", -        "values": ["0x123", "0x456"] -      } -    ], -    "deny": [ -      { -        "type": "address", -        "values": ["0x2222", "0x333"] -      } -    ] -  } -} -``` - -#### DDO Checksum - -In order to ensure the integrity of the DDO, a checksum is computed for each DDO: - -```js -const checksum = sha256(JSON.stringify(ddo)) -``` - -The checksum hash is used when publishing/updating metadata using the `setMetaData` function in the ERC721 contract, and is stored in the event generated by the ERC721 contract: - -```solidity -event MetadataCreated( -    address indexed createdBy, -    uint8 state, -    string decryptorUrl, -    bytes flags, -    bytes data, -    bytes metaDataHash, -    uint256 timestamp, -    uint256 blockNumber -); - -event MetadataUpdated( -    address indexed updatedBy, -    uint8 state, -    string decryptorUrl, -    bytes flags, -    bytes data, -    bytes metaDataHash, -    uint256 timestamp, -    uint256 blockNumber -); -``` - -_Aquarius_ should always verify the checksum after data is decrypted via a _Provider_ API call. - -#### State - -Each asset has a state, which is held by the NFT contract.
The possible states are: - -| State | Description | Discoverable in Ocean Market | Ordering allowed | Listed under profile | -| ----- | ----------- | ---------------------------- | ---------------- | -------------------- | -| **`0`** | Active | Yes | Yes | Yes | -| **`1`** | End-of-life | No | No | No | -| **`2`** | Deprecated (by another asset) | No | No | No | -| **`3`** | Revoked by publisher | No | No | No | -| **`4`** | Ordering is temporarily disabled | Yes | No | Yes | -| **`5`** | Asset unlisted | No | Yes | Yes | - -### Aquarius Enhanced DDO Response - -The following fields are added by _Aquarius_ in its DDO response for convenience; an asset returned by _Aquarius_ inherits all the DDO fields stored on-chain. - -These additional fields are never stored on-chain, and are never taken into consideration when [hashing the DDO](did-ddo.md#ddo-checksum). - -#### NFT - -The `nft` object contains information about the ERC721 NFT contract which represents the intellectual property of the publisher. - -| Attribute | Type | Description | -| --------- | ---- | ----------- | -| **`address`** | `string` | Contract address of the deployed ERC721 NFT contract. | -| **`name`** | `string` | Name of NFT set in contract. | -| **`symbol`** | `string` | Symbol of NFT set in contract. | -| **`owner`** | `string` | ETH account address of the NFT owner. | -| **`state`** | `number` | State of the asset reflecting the NFT contract value. See [State](did-ddo.md#state). | -| **`created`** | `ISO date/time string` | Contains the date of NFT creation. | -| **`tokenURI`** | `string` | The NFT's `tokenURI`. | - -Example: - -```json -{ -  "nft": { -    "address": "0x000000", -    "name": "Ocean Protocol Asset v4", -    "symbol": "OCEAN-A-v4", -    "owner": "0x0000000", -    "state": 0, -    "created": "2000-10-31T01:30:00Z" -  } -} -``` - -#### Datatokens - -The `datatokens` array contains information about the ERC20 datatokens attached to [asset services](did-ddo.md#services). - -| Attribute | Type | Description | -| --------- | ---- | ----------- | -| **`address`** | `string` | Contract address of the deployed ERC20 contract. | -| **`name`** | `string` | Name of the datatoken set in the contract. | -| **`symbol`** | `string` | Symbol of the datatoken set in the contract. | -| **`serviceId`** | `string` | ID of the service the datatoken is attached to. | - -Example: - -```json -{ -  "datatokens": [ -    { -      "address": "0x000000", -      "name": "Datatoken 1", -      "symbol": "DT-1", -      "serviceId": "1" -    }, -    { -      "address": "0x000001", -      "name": "Datatoken 2", -      "symbol": "DT-2", -      "serviceId": "2" -    } -  ] -} -``` - -#### Event - -The `event` section contains information about the last transaction that created or updated the DDO. - -Example: - -```json -{ -  "event": { -    "tx": "0x8d127de58509be5dfac600792ad24cc9164921571d168bff2f123c7f1cb4b11c", -    "block": 12831214, -    "from": "0xAcca11dbeD4F863Bb3bC2336D3CE5BAC52aa1f83", -    "contract": "0x1a4b70d8c9DcA47cD6D0Fb3c52BB8634CA1C0Fdf", -    "datetime": "2000-10-31T01:30:00" -  } -} -``` - -#### Purgatory - -Contains information about an asset's purgatory status, as defined in [`list-purgatory`](https://github.com/oceanprotocol/list-purgatory). Marketplace interfaces are encouraged to prevent certain user actions, like adding liquidity, on assets in purgatory.
- -| Attribute | Type | Description | -| ------------ | --------- | --------------------------------------------------------------------------------------------- | -| **`state`** | `boolean` | If `true`, asset is in purgatory. | -| **`reason`** | `string` | If asset is in purgatory, contains the reason for being there as defined in `list-purgatory`. | - -Example: - -```json -{ - "purgatory": { - "state": true, - "reason": "Copyright violation" - } -} -``` - -```json -{ - "purgatory": { - "state": false - } -} -``` - -#### Statistics - -The `stats` section contains different statistics fields. - -| Attribute | Type | Description | -| ------------ | -------- | ------------------------------------------------------------------------------------------------------------ | -| **`orders`** | `number` | How often an asset was ordered, meaning how often it was either downloaded or used as part of a compute job. | - -Example: - -```json -{ - "stats": { - "orders": 4 - } -} -``` - -### Full Enhanced DDO Example - -```json -{ - "@context": ["https://w3id.org/did/v1"], - "id": "did:op:ACce67694eD2848dd683c651Dab7Af823b7dd123", - "version": "4.1.0", - "chainId": 1, - "nftAddress": "0x123", - "metadata": { - "created": "2020-11-15T12:27:48Z", - "updated": "2021-05-17T21:58:02Z", - "description": "Sample description", - "name": "Sample asset", - "type": "dataset", - "author": "OPF", - "license": "https://market.oceanprotocol.com/terms" - }, - "services": [ - { - "id": "1", - "type": "access", - "files": "0x044736da6dae39889ff570c34540f24e5e084f4e5bd81eff3691b729c2dd1465ae8292fc721e9d4b1f10f56ce12036c9d149a4dab454b0795bd3ef8b7722c6001e0becdad5caeb2005859642284ef6a546c7ed76f8b350480691f0f6c6dfdda6c1e4d50ee90e83ce3cb3ca0a1a5a2544e10daa6637893f4276bb8d7301eb35306ece50f61ca34dcab550b48181ec81673953d4eaa4b5f19a45c0e9db4cd9729696f16dd05e0edb460623c843a263291ebe757c1eb3435bb529cc19023e0f49db66ef781ca692655992ea2ca7351ac2882bf340c9d9cb523b0cbcd483731dc03f6251597856afa9a68a1e0da698cfc8e81824a69d92b108023666ee35de4a229ad7e1cfa9be9946db2d909735", - "name": "Download service", - "description": "Download service", - "datatokenAddress": "0x123", - "serviceEndpoint": "https://myprovider.com", - "timeout": 0, - "consumerParameters": [ - { - "name": "surname", - "type": "text", - "label": "Name", - "required": true, - "default": "NoName", - "description": "Please fill your name" - }, - { - "name": "age", - "type": "number", - "label": "Age", - "required": false, - "default": 0, - "description": "Please fill your age" - } - ] - }, - { - "id": "2", - "type": "compute", - "files": "0x044736da6dae39889ff570c34540f24e5e084f4e5bd81eff3691b729c2dd1465ae8292fc721e9d4b1f10f56ce12036c9d149a4dab454b0795bd3ef8b7722c6001e0becdad5caeb2005859642284ef6a546c7ed76f8b350480691f0f6c6dfdda6c1e4d50ee90e83ce3cb3ca0a1a5a2544e10daa6637893f4276bb8d7301eb35306ece50f61ca34dcab550b48181ec81673953d4eaa4b5f19a45c0e9db4cd9729696f16dd05e0edb460623c843a263291ebe757c1eb3435bb529cc19023e0f49db66ef781ca692655992ea2ca7351ac2882bf340c9d9cb523b0cbcd483731dc03f6251597856afa9a68a1e0da698cfc8e81824a69d92b108023666ee35de4a229ad7e1cfa9be9946db2d909735", - "name": "Compute service", - "description": "Compute service", - "datatokenAddress": "0x124", - "serviceEndpoint": "https://myprovider.com", - "timeout": 3600, - "compute": { - "allowRawAlgorithm": false, - "allowNetworkAccess": true, - "publisherTrustedAlgorithmPublishers": ["0x234", "0x235"], - "publisherTrustedAlgorithms": [ - { - "did": "did:op:123", - "filesChecksum": "100", - "containerSectionChecksum": "200" 
- }, - { - "did": "did:op:124", - "filesChecksum": "110", - "containerSectionChecksum": "210" - } - ] - } - } - ], - "credentials": { - "allow": [ - { - "type": "address", - "values": ["0x123", "0x456"] - } - ], - "deny": [ - { - "type": "address", - "values": ["0x2222", "0x333"] - } - ] - }, - - "nft": { - "address": "0x123", - "name": "Ocean Protocol Asset v4", - "symbol": "OCEAN-A-v4", - "owner": "0x0000000", - "state": 0, - "created": "2000-10-31T01:30:00", - "tokenURI": "xxx" - }, - - "datatokens": [ - { - "address": "0x000000", - "name": "Datatoken 1", - "symbol": "DT-1", - "serviceId": "1" - }, - { - "address": "0x000001", - "name": "Datatoken 2", - "symbol": "DT-2", - "serviceId": "2" - } - ], - - "event": { - "tx": "0x8d127de58509be5dfac600792ad24cc9164921571d168bff2f123c7f1cb4b11c", - "block": 12831214, - "from": "0xAcca11dbeD4F863Bb3bC2336D3CE5BAC52aa1f83", - "contract": "0x1a4b70d8c9DcA47cD6D0Fb3c52BB8634CA1C0Fdf", - "datetime": "2000-10-31T01:30:00" - }, - - "purgatory": { - "state": false - }, - - "stats": { - "orders": 4 - } -} -``` diff --git a/core-concepts/fees.md b/core-concepts/fees.md deleted file mode 100644 index aa16b8f6..00000000 --- a/core-concepts/fees.md +++ /dev/null @@ -1,105 +0,0 @@ ---- -title: Fees -description: The Ocean Protocol defines various fees for creating a sustainability loop. ---- - -# Fees - -### Path to sustainability - -Ocean Protocol achieves sustainability via the [Web3 sustainability loop](https://blog.oceanprotocol.com/the-web3-sustainability-loop-b2a4097a36e). - -* The project grows and improves through the efforts of OceanDAO grant recipients. -* The OceanDAO votes to decide which proposals receive grants. -* Grant funds are sourced from the Ocean Protocol community treasury. -* The Ocean Protocol community collects fees when users interact with the protocol, thus completing the sustainability loop. - -### Fee types - -#### Swap fee - -Swap fees are collected whenever someone swaps a datatoken for base token (e.g., OCEAN) or base token for a datatoken. The swap can be conducted using a a fixed-rate exchange. These are the fees that are applied whenever a user swaps base token or datatoken: - -* Publisher Marketplace swap fee -* Consumer Marketplace swap fee -* Provider Consumption Fees -* [Ocean Community Fee](fees.md#ocean-community-fee) - -#### Publish fee - -Publish fees can be charged to a publisher when they publish an asset. - -Currently, the Ocean marketplace does not charge a publishing fee. Custom marketplaces can charge a publishing fee by adding an extra transaction in the publish flow. - -Based on the use case of the marketplace, the marketplace owner can decide if this fee should be charged or not. - -#### Consume fee - -Consume fees (aka. Order fees) are charged when a user holding a datatoken exchanges it for the right to download an asset or to start a compute job that uses the asset. - -These are the fees that are applied whenever a user pays to access an asset: - -* Consume Market Consumption Fee -* Publisher Market Consumption Fee -* Provider Consumption Fees -* [Ocean Community Fee](fees.md#ocean-community-fee) - -#### Ocean Community fee - -Ocean's smart contracts collect **Ocean Community fees** during swap and order operations. These fees are reinvested in community projects via OceanDAO and other initiatives. - -For swaps involving approved base tokens like OCEAN and H2O, the Ocean Community swap fee is 0.1%. For swaps involving other base tokens, the Ocean Community swap fee is 0.2%. 
The Ocean Community order fee is 0.03 DT per order operation. - -These fees can be updated by the Ocean Protocol Foundation. - -#### Provider fee - -Provider is a component of Ocean Protocol's ecosystem that facilitates data consumption, starts compute jobs, encrypts DDOs, and decrypts DDOs. Provider also validates whether the user can access a particular data asset or service. To learn more about Provider, click [here](https://github.com/oceanprotocol/provider). - -Provider fees are paid to the individual or organization running their Provider instance when the user orders an asset. These fees are set as an absolute amount, not a percentage. The provider can also specify which token the fees must be paid in - it doesn't have to be the same token used in the consuming market. - -Provider fees can also be used to charge for computing resources. Based on the compute resources needed to run an algorithm in the Compute-to-Data environment, a consumer can choose the amount to pay according to their needs. - -These fees incentivize individuals and organizations to run their own Provider instances and charge consumers according to resource usage. - -### Fee values - -The tables below are periodically updated. Users are advised to confirm new values through the [contracts](https://github.com/oceanprotocol/contracts) and the [market](https://github.com/oceanprotocol/market). - -#### Swap fees - -| Market/Type | Value in Ocean Market, using any Provider | Value in Other Markets | -| ----------- | ----------------------------------------- | ---------------------- | -| publishMarket: FixedRate | 0% | <br>Set in the market config, by the publishing market.<br>Min = 0.001%<br>Max = 50%<br> | -| <br>consumeMarket: FixedRate<br>ERC20Template<br> | 0% | 0% | -| <br>consumeMarket: FixedRate<br>EnterpriseTemplate<br> | 0% | Set in market config, by the consuming market. | -| <br>Ocean Community: FixedRate<br>OCEAN, H2O as base token<br> | 0.1% | 0.1% | -| <br>Ocean Community: FixedRate<br>other base token<br> | 0.2% | 0.2% | - -#### Publish fees - -| Market/Type | Value in Ocean Market, using any Provider | Value in Other Markets | -| ----------- | ----------------------------------------- | ---------------------- | -| - | 0% | 0% | - -#### Order fees (1 DT) - -| Market/Type | Value in Ocean Market, using any Provider | Value in Other Markets | -| ----------- | ----------------------------------------- | ---------------------- | -| <br>publishMarket<br>Absolute value, in any token. E.g. 5 USDT<br> | 0 | Set in market config, by the publishing market. | -| <br>consumeMarket<br>Absolute value, in any token. E.g. 2 DAI<br> | 0 | Set in market config, by the consuming market. | -| <br>Ocean Community<br>Fixed price in DT<br> | 0.03 DT | 0.03 DT | - -#### Ocean Provider fees - -| Type | OPF Provider | 3rd party Provider | -| ---- | :----------: | ------------------ | -| Token in which fee is charged: `PROVIDER_FEE_TOKEN` | OCEAN | E.g. USDC | -| Download: `COST_PER_MB` | 0 | Set in Provider envvars. | -| <br>Compute: COST_PER_MIN<br>Environment: 1 CPU, 60 secs max<br> | 0 | Set in OperatorEngine envvars. | -| <br>Compute: COST_PER_MIN<br>Environment: 1 CPU, 1 hour max<br> | 1.0 OCEAN/min | Set in OperatorEngine envvars. | -| Ocean Community | 0% of the Provider fee | 0% of the Provider fee | - -### Further reading - -* [The Web3 Sustainability Loop](https://blog.oceanprotocol.com/the-web3-sustainability-loop-b2a4097a36e) diff --git a/core-concepts/images/architecture (1).png b/core-concepts/images/architecture (1).png deleted file mode 100644 index 97084f85..00000000 Binary files a/core-concepts/images/architecture (1).png and /dev/null differ diff --git a/core-concepts/images/architecture (2).png b/core-concepts/images/architecture (2).png deleted file mode 100644 index 97084f85..00000000 Binary files a/core-concepts/images/architecture (2).png and /dev/null differ diff --git a/core-concepts/images/architecture (3).png b/core-concepts/images/architecture (3).png deleted file mode 100644 index 97084f85..00000000 Binary files a/core-concepts/images/architecture (3).png and /dev/null differ diff --git a/core-concepts/images/architecture.png b/core-concepts/images/architecture.png deleted file mode 100644 index 97084f85..00000000 Binary files a/core-concepts/images/architecture.png and /dev/null differ diff --git a/core-concepts/images/blowfish.png b/core-concepts/images/blowfish.png deleted file mode 100644 index 742b5c49..00000000 Binary files a/core-concepts/images/blowfish.png and /dev/null differ diff --git a/core-concepts/images/datanft-and-datatoken (1).png b/core-concepts/images/datanft-and-datatoken (1).png deleted file mode 100644 index 2a0c79fb..00000000 Binary files a/core-concepts/images/datanft-and-datatoken (1).png and /dev/null differ diff --git a/core-concepts/images/datanft-and-datatoken.png b/core-concepts/images/datanft-and-datatoken.png deleted file mode 100644 index 2a0c79fb..00000000 Binary files a/core-concepts/images/datanft-and-datatoken.png and /dev/null differ diff --git a/core-concepts/images/ddo-flow (1).png b/core-concepts/images/ddo-flow (1).png deleted file mode 100644 index fab9a027..00000000 Binary files a/core-concepts/images/ddo-flow (1).png and /dev/null differ diff --git a/core-concepts/images/ddo-flow (2).png b/core-concepts/images/ddo-flow (2).png deleted file mode 100644 index fab9a027..00000000 Binary files a/core-concepts/images/ddo-flow (2).png and /dev/null differ diff --git a/core-concepts/images/ddo-flow.png b/core-concepts/images/ddo-flow.png deleted file mode 100644 index fab9a027..00000000 Binary files a/core-concepts/images/ddo-flow.png and /dev/null differ diff --git a/core-concepts/images/dynamic-asset-pricing (1).png b/core-concepts/images/dynamic-asset-pricing (1).png deleted file mode 100644 index 156114e3..00000000 Binary files a/core-concepts/images/dynamic-asset-pricing (1).png and /dev/null differ diff --git a/core-concepts/images/dynamic-asset-pricing.png b/core-concepts/images/dynamic-asset-pricing.png deleted file mode 100644 index 156114e3..00000000 Binary files a/core-concepts/images/dynamic-asset-pricing.png and /dev/null differ diff --git a/core-concepts/images/fixed-asset-pricing (1).png b/core-concepts/images/fixed-asset-pricing (1).png deleted file mode 100644 index cc55112a..00000000 Binary files a/core-concepts/images/fixed-asset-pricing (1).png and /dev/null differ diff --git a/core-concepts/images/fixed-asset-pricing (2).png b/core-concepts/images/fixed-asset-pricing (2).png deleted file mode 100644 index cc55112a..00000000 Binary files a/core-concepts/images/fixed-asset-pricing (2).png and /dev/null differ diff --git a/core-concepts/images/fixed-asset-pricing (3).png
b/core-concepts/images/fixed-asset-pricing (3).png deleted file mode 100644 index cc55112a..00000000 Binary files a/core-concepts/images/fixed-asset-pricing (3).png and /dev/null differ diff --git a/core-concepts/images/fixed-asset-pricing (4).png b/core-concepts/images/fixed-asset-pricing (4).png deleted file mode 100644 index cc55112a..00000000 Binary files a/core-concepts/images/fixed-asset-pricing (4).png and /dev/null differ diff --git a/core-concepts/images/fixed-asset-pricing.png b/core-concepts/images/fixed-asset-pricing.png deleted file mode 100644 index cc55112a..00000000 Binary files a/core-concepts/images/fixed-asset-pricing.png and /dev/null differ diff --git a/core-concepts/images/free-asset-pricing (1).png b/core-concepts/images/free-asset-pricing (1).png deleted file mode 100644 index 8876bc31..00000000 Binary files a/core-concepts/images/free-asset-pricing (1).png and /dev/null differ diff --git a/core-concepts/images/free-asset-pricing (2).png b/core-concepts/images/free-asset-pricing (2).png deleted file mode 100644 index 8876bc31..00000000 Binary files a/core-concepts/images/free-asset-pricing (2).png and /dev/null differ diff --git a/core-concepts/images/free-asset-pricing (3).png b/core-concepts/images/free-asset-pricing (3).png deleted file mode 100644 index 8876bc31..00000000 Binary files a/core-concepts/images/free-asset-pricing (3).png and /dev/null differ diff --git a/core-concepts/images/free-asset-pricing (4).png b/core-concepts/images/free-asset-pricing (4).png deleted file mode 100644 index 8876bc31..00000000 Binary files a/core-concepts/images/free-asset-pricing (4).png and /dev/null differ diff --git a/core-concepts/images/free-asset-pricing.png b/core-concepts/images/free-asset-pricing.png deleted file mode 100644 index 8876bc31..00000000 Binary files a/core-concepts/images/free-asset-pricing.png and /dev/null differ diff --git a/core-concepts/images/token-tool.png b/core-concepts/images/token-tool.png deleted file mode 100644 index eea3ded5..00000000 Binary files a/core-concepts/images/token-tool.png and /dev/null differ diff --git a/core-concepts/images/use-case (1).png b/core-concepts/images/use-case (1).png deleted file mode 100644 index a581963f..00000000 Binary files a/core-concepts/images/use-case (1).png and /dev/null differ diff --git a/core-concepts/images/use-case (2).png b/core-concepts/images/use-case (2).png deleted file mode 100644 index a581963f..00000000 Binary files a/core-concepts/images/use-case (2).png and /dev/null differ diff --git a/core-concepts/images/use-case (3).png b/core-concepts/images/use-case (3).png deleted file mode 100644 index a581963f..00000000 Binary files a/core-concepts/images/use-case (3).png and /dev/null differ diff --git a/core-concepts/images/use-case (4).png b/core-concepts/images/use-case (4).png deleted file mode 100644 index a581963f..00000000 Binary files a/core-concepts/images/use-case (4).png and /dev/null differ diff --git a/core-concepts/images/use-case.png b/core-concepts/images/use-case.png deleted file mode 100644 index a581963f..00000000 Binary files a/core-concepts/images/use-case.png and /dev/null differ diff --git a/core-concepts/networks.md b/core-concepts/networks.md deleted file mode 100644 index 2d9b5488..00000000 --- a/core-concepts/networks.md +++ /dev/null @@ -1,205 +0,0 @@ ---- -title: Supported Networks -description: >- - All the public networks the Ocean Protocol contracts are deployed to, and - additional core components deployed to them. 
--- - -# Networks - -Ocean Protocol contracts are deployed on multiple public networks. You can always find the most up-to-date deployment addresses for all individual contracts in the [address.json](https://github.com/oceanprotocol/contracts/blob/v4main/addresses/address.json). - -In each network, you’ll need ETH to pay for gas, and OCEAN for certain Ocean actions. Because the Ethereum mainnet is a network for production settings, ETH and OCEAN tokens have real value there. The ETH and OCEAN tokens in each test network don’t have real value and are used for testing purposes only. They can be obtained from _faucets_ that dole out ETH and OCEAN. - -The universal Aquarius Endpoint is [`https://v4.aquarius.oceanprotocol.com`](https://v4.aquarius.oceanprotocol.com). - -### Ethereum Mainnet - -Ethereum mainnet is a production network. In MetaMask and other ERC20 wallets, click on the network name dropdown, then select _Ethereum mainnet_. - -**Tokens** - -* Mainnet ETH: - * Native token to pay transaction fees -* Mainnet OCEAN: - * Address: [0x967da4048cD07aB37855c090aAF366e4ce1b9F48](https://etherscan.io/token/0x967da4048cD07aB37855c090aAF366e4ce1b9F48) - -**Additional Components** - -| What | URL | -| ------------ | --------------------------------------------------------------------------------- | -| Explorer | https://etherscan.io | -| Ocean Market | Point wallet to Ethereum Mainnet network, at https://v4.market.oceanprotocol.com/ | -| Provider | `https://v4.provider.mainnet.oceanprotocol.com` | -| Subgraph | `https://v4.subgraph.mainnet.oceanprotocol.com` | - -### Polygon Mainnet - -Ocean is deployed to Polygon Mainnet, another production network. Polygon’s native token is MATIC. If you don’t find Polygon as a predefined network in your wallet, you can connect to it manually via Polygon's guide [here](https://docs.polygon.technology/docs/develop/metamask/config-polygon-on-metamask/#add-the-polygon-network-manually). - -**Tokens** - -* Matic: - * Native token to pay transaction fees -* Matic OCEAN: - * Address: [0x282d8efCe846A88B159800bd4130ad77443Fa1A1](https://polygonscan.com/token/0x282d8efce846a88b159800bd4130ad77443fa1a1) - -**Additional Components** - -| What | URL | -| ------------ | -------------------------------------------------------------------------------- | -| Explorer | https://polygonscan.com | -| Ocean Market | Point wallet to Polygon Mainnet network, at https://v4.market.oceanprotocol.com/ | -| Provider | `https://v4.provider.polygon.oceanprotocol.com/` | -| Subgraph | `https://v4.subgraph.polygon.oceanprotocol.com/` | - -**Bridge** - -Check our Polygon Bridge [guide](../core-concepts/networks/bridges.md#polygon-ex-matic-bridge) to learn how you can deposit, withdraw and send tokens. - -### Binance Smart Chain - -Ocean is deployed to Binance Smart Chain (BSC), another production network. BSC’s native token is BNB - the Binance token. - -If you don’t find BSC as a predefined network in your wallet, you can connect to it manually via Binance’s guide [here](https://academy.binance.com/en/articles/connecting-metamask-to-binance-smart-chain). - -**Tokens** - -* BSC BNB: - * Native token to pay transaction fees.
-* BSC OCEAN: - * Address: [0xdce07662ca8ebc241316a15b611c89711414dd1a](https://bscscan.com/token/0xdce07662ca8ebc241316a15b611c89711414dd1a) - -**Additional Components** - -| What | URL | -| ------------ | ------------------------------------------------------------------------------------ | -| Explorer | https://bscscan.com/ | -| Ocean Market | Point wallet to Binance Smart Chain network, at https://v4.market.oceanprotocol.com/ | -| Provider | `https://v4.provider.bsc.oceanprotocol.com` | -| Subgraph | `https://v4.subgraph.bsc.oceanprotocol.com` | - -**Bridge** - -Check our BSC Bridge [guide](../core-concepts/networks/bridges.md#binance-smart-chain-bsc-bridge) to learn how you can deposit, withdraw and send tokens. - -### Energy Web Chain - -Ocean is deployed to [Energy Web Chain](https://energy-web-foundation.gitbook.io/energy-web/technology/the-stack/trust-layer-energy-web-chain), another production network. Energy Web’s native token is EWT. - -If you don’t find Energy Web Chain as a predefined network in your wallet, you can connect to it using the guide [here](https://energy-web-foundation.gitbook.io/energy-web/how-tos-and-tutorials/connect-to-energy-web-chain-main-network-with-metamash). - -**Tokens** - -* Energy Web Chain EWT: - * Native token to pay transaction fees. -* Energy Web Chain OCEAN: - * Address: [0x593122aae80a6fc3183b2ac0c4ab3336debee528](https://explorer.energyweb.org/token/0x593122aae80a6fc3183b2ac0c4ab3336debee528) - -**Additional Components** - -| What | URL | -| ------------ | --------------------------------------------------------------------------------- | -| Explorer | https://explorer.energyweb.org/ | -| Ocean Market | Point wallet to Energy Web Chain network, at https://v4.market.oceanprotocol.com/ | -| Provider | `https://v4.provider.energyweb.oceanprotocol.com/` | -| Subgraph | `https://v4.subgraph.energyweb.oceanprotocol.com` | - -**Bridge** - -Use the link [here](https://bridge.carbonswap.exchange) to bridge the assets between EWC and Ethereum mainnet. - -### Moonriver - -Ocean is deployed to [Moonriver](https://docs.moonbeam.network/builders/get-started/networks/moonriver/), another production network. Moonriver’s native token is MOVR. - -If you don’t find Moonriver as a predefined network in your wallet, you can connect to it using the guide [here](https://docs.moonbeam.network/builders/get-started/networks/moonriver/#connect-metamask). - -**Tokens** - -* Moonriver MOVR: - * Native token to pay transaction fees. -* Moonriver OCEAN: - * Address: [0x99C409E5f62E4bd2AC142f17caFb6810B8F0BAAE](https://blockscout.moonriver.moonbeam.network/token/0x99C409E5f62E4bd2AC142f17caFb6810B8F0BAAE/token-transfers) - -**Additional Components** - -| What | URL | -| ------------ | -------------------------------------------------------------------------- | -| Explorer | https://blockscout.moonriver.moonbeam.network | -| Ocean Market | Point wallet to Moonriver network, at https://v4.market.oceanprotocol.com/ | -| Provider | `https://v4.provider.moonriver.oceanprotocol.com` | -| Subgraph | `https://v4.subgraph.moonriver.oceanprotocol.com` | - -**Bridge** - -Use [Anyswap](https://anyswap.exchange/#/bridge) to bridge between ETH Mainnet and Moonriver. - -### Görli - -Görli is a test network. - -In MetaMask and other ERC20 wallets, click on the network name dropdown, then select _Goerli_. - -**Tokens** - -* Görli ETH: - * Native token to pay transaction fees - * [Faucet](https://goerlifaucet.com/). 
You may find others by [searching](https://www.google.com/search?q=goerli+ether+faucet%5C&oq=goerli+ether+faucet). -* Goerli OCEAN: - * Address: [0xCfDdA22C9837aE76E0faA845354f33C62E03653a](https://goerli.etherscan.io/address/0xcfdda22c9837ae76e0faa845354f33c62e03653a) - * [Faucet](https://faucet.goerli.oceanprotocol.com) - -**Additional Components** - -| What | URL | -| ------------ | -------------------------------------------------------------------- | -| Explorer | https://goerli.etherscan.io/ | -| Ocean Market | Point wallet to Görli network, at https://market.oceanprotocol.com | -| Provider | `https://v4.provider.goerli.oceanprotocol.com` | -| Subgraph | `https://v4.subgraph.goerli.oceanprotocol.com` | - -### Mumbai - -Mumbai is a test network tuned for Matic / Polygon. - -If you don't find Mumbai as a predefined network in your wallet, you can connect to it manually via [Matic's guide](https://docs.polygon.technology/docs/develop/metamask/config-polygon-on-metamask/). - -**Tokens** - -* Mumbai MATIC: - * Native token to pay transaction fees - * [Faucet](https://faucet.matic.network/). You may find others by [searching](https://www.google.com/search?q=mumbai+faucet). -* Mumbai OCEAN: - * Address: [0xd8992Ed72C445c35Cb4A2be468568Ed1079357c8](https://mumbai.polygonscan.com/token/0xd8992Ed72C445c35Cb4A2be468568Ed1079357c8) - * [Faucet](https://faucet.mumbai.oceanprotocol.com/) - -**Additional Components** - -| What | URL | -| ------------ | ------------------------------------------------------------------- | -| Explorer | https://mumbai.polygonscan.com | -| Ocean Market | Point wallet to Mumbai network, at https://market.oceanprotocol.com | -| Provider | `https://v4.provider.mumbai.oceanprotocol.com` | -| Subgraph | `https://v4.subgraph.mumbai.oceanprotocol.com` | - - -### Local / Ganache - -The most straightforward way for local-only development is to use [Barge](https://www.github.com/oceanprotocol/barge), which runs [Ganache](https://www.trufflesuite.com/ganache), Aquarius, and Provider. It is used extensively by the Ocean core devs and for automated integration testing. - -To connect to it from MetaMask, select the network called _Localhost 8545_. - -Alternatively, you can run Ganache independently. Install it according to [the Ganache docs](https://www.trufflesuite.com/ganache). Then deploy Ocean contracts onto Ganache following [docs in Ocean contracts repo](https://www.github.com/oceanprotocol/contracts). Ganache is at the RPC URL [http://localhost:8545](http://localhost:8545). - -**Tokens** - -* Ganache ETH: - * Native token to pay transaction fees - * By default, Ganache creates several Ethereum accounts at launch, gives each some ETH, and makes their private keys available in the logs. You can also instruct Ganache to give ETH to specific Ethereum addresses. -* Ganache OCEAN: - * You can deploy an ERC20 token with label OCEAN. At a minimum, the token needs to be ERC20Detailed and ERC20Capped. You’ll see examples in the quickstarts for the Ocean JavaScript and Python drivers. - -### Other - -Some apps may need `network_id` and `chain_id`. Here's a [list of values for major Ethereum networks](https://medium.com/@piyopiyo/list-of-ethereums-major-network-and-chain-ids-2bc58e928508). 
diff --git a/core-concepts/roles.md b/core-concepts/roles.md deleted file mode 100644 index ce6803a1..00000000 --- a/core-concepts/roles.md +++ /dev/null @@ -1,68 +0,0 @@ ---- -title: Data NFTs and datatoken roles -description: The permissions stored on chain in the contracts control the access to the data NFT (ERC721) and datatoken (ERC20) smart contract functions. ---- - -The permissions are stored in the data NFT (ERC721) smart contract. The data NFT (ERC721) and datatoken (ERC20) smart contracts both use this information to restrict access to the smart contract functions. The tables below list restricted actions that are accessible only to the allowed users. - -## What Roles Can The Data NFT Owner Assign? - -The data NFT is the base IP for the asset and all the datatokens are therefore linked to the data NFT smart contract — this has enabled us to do a bunch of cool new things around role administration. We’ve introduced a host of useful roles which give you flexibility in how you manage your project. This can be a big help for enterprises and startups who are ready to scale up and introduce a level of administration. - -### NFT Owner - -The NFT owner is the owner of the base-IP and is therefore at the highest level. The NFT owner can perform any action or assign any role but crucially, the NFT owner is the only one who can assign the manager role. Upon deployment or transfer of the data NFT, the NFT owner is automatically added as a manager. The NFT owner is also the only role that can’t be assigned to multiple users — the only way to share this role is via multi-sig or a DAO. - -### Manager - -The manager can assign or revoke three main roles (deployer, metadata updater, store updater). The manager is also able to interact with the ERC725 data. - -### ERC20 Deployer - -The Deployer has a bunch of privileges at the ERC20 datatoken level. They can deploy new datatokens with fixed price exchange, or free pricing. They can also update the ERC725Y key-value store and assign roles the ERC20 level. - -### Metadata Updater - -There is also a specific role for updating the metadata. The Metadata updater has the ability to update the information about the data asset (title, description, sample data etc) that is displayed to the user on the asset detail page within the market. - -### Store Updater - -The store updater can store, remove or update any arbitrary key value using the ERC725Y implementation (at the ERC721 level). The use case for this role depends a lot on what data is being stored in the ERC725Y key-value pair — as mentioned above, this is highly flexible. - -### Minter - -The Minter has the ability to mint new datatokens, provided the limit has not been exceeded. In most cases, this role will not be used as the alternative is for the datatokens to be minted by the side-staking bot which has many advantages. We highly recommend taking a read of this article if you’re interested in learning more about safer staking and one-sided staking. - -### Fee Manager - -Finally, we also have a fee manager which has the ability to set a new fee collector — this is the account that will receive the datatokens when a data asset is consumed. If no fee collector account has been set, the datatokens will be sent by default to the NFT Owner. The applicable fees (market and community fees) are automatically deducted from the datatokens that are received. 
-
-## Roles in data NFT (ERC721) smart contract
-
-| Action ↓ / Role →                 | NFT Owner | Manager | ERC20 Deployer | Store Updater | Metadata Updater |
-| --------------------------------- | --------- | ------- | -------------- | ------------- | ---------------- |
-| Set token URI                     |           |         |                |               |                  |
-| Add manager                       | **✓**     |         |                |               |                  |
-| Remove manager                    | **✓**     |         |                |               |                  |
-| Clean permissions                 | **✓**     |         |                |               |                  |
-| Set base URI                      | **✓**     |         |                |               |                  |
-| Set Metadata state                |           |         |                |               | **✓**            |
-| Set Metadata                      |           |         |                |               | **✓**            |
-| Create new datatoken              |           |         | **✓**          |               |                  |
-| Executes any other smart contract |           | **✓**   |                |               |                  |
-| Set new key-value in store        |           |         |                | **✓**         |                  |
-
-## Roles in datatoken (ERC20) smart contract
-
-| Action ↓ / Role →          | ERC20 Deployer | Minter | NFT owner | Fee manager |
-| -------------------------- | -------------- | ------ | --------- | ----------- |
-| Create Fixed Rate exchange | **✓**          |        |           |             |
-| Create Dispenser           | **✓**          |        |           |             |
-| Add minter                 | **✓**          |        |           |             |
-| Remove minter              | **✓**          |        |           |             |
-| Add fee manager            | **✓**          |        |           |             |
-| Remove fee manager         | **✓**          |        |           |             |
-| Set data                   | **✓**          |        |           |             |
-| Clean permissions          |                |        | **✓**     |             |
-| Mint                       |                | **✓**  |           |             |
-| Set fee collector          |                |        |           | **✓**       |
diff --git a/data-science/README.md b/data-science/README.md
new file mode 100644
index 00000000..f37c8497
--- /dev/null
+++ b/data-science/README.md
@@ -0,0 +1,30 @@
+---
+description: Ocean Protocol is built by data scientists, for data scientists.
+cover: ../.gitbook/assets/cover/data_science_banner.png
+coverY: 0
+---
+
+# 📊 Data Science
+
+_Figure: Ocean Protocol - Built to protect your precious._
+### Why should data scientists use Ocean Protocol?
+
+Ocean Protocol is built for data scientists to **monetize data effectively** and solve the ["Data Value Creation Loop"](the-data-value-creation-loop.md). Our [open-source tools](https://github.com/oceanprotocol) tackle some of **the biggest problems for data scientists**: how to sell data anonymously, how to sell compute jobs on datasets, how to control access to data, etc. By using blockchain architecture, Ocean achieves several tactical advantages over Web2 to solve these data-sharing problems.
+
+### What are some use cases for Ocean Protocol?
+
+* Enable trustless transactions (i.e. buy, sell, and transfer data)
+* Trace data provenance and consumption
+* Token gate a website or dApp using datatokens
+* Deploy a decentralized data marketplace
+* Sell algorithmic compute jobs on private datasets
+
+### How to design an ML system using Ocean Protocol?
+
+The first step is to tokenize data into data NFTs and datatokens on the blockchain. We offer a no-code way to tokenize data via the [Ocean Market](https://market.oceanprotocol.com). But we also offer code options for data scientists using the [Ocean.py](../developers/ocean.py/README.md) and [Ocean.js](../developers/ocean.js/README.md) libraries. Data scientists can then build sophisticated ML systems on top of the tokenized data by using composable Ocean Protocol tools. ML models can use a variety of Ocean smart contracts, including Ocean's [Compute-to-Data](../developers/compute-to-data/README.md), to build model outputs all the way to last-mile delivery for businesses.
+
+### **Key Links for Data Scientists:**
+
+* Learn the difference between Ocean Protocol [data NFTs and datatokens](../developers/contracts/datanft-and-datatoken.md), the two types of tokenized data assets you need to start building your ML systems.
+* Discover Ocean's [Compute-to-Data](../developers/compute-to-data/README.md) engine, which can help you solve the difficult problem of selling algorithmic compute jobs on your datasets without revealing the contents of the algorithm or dataset to the consumer.
diff --git a/data-science/data-engineers.md b/data-science/data-engineers.md
new file mode 100644
index 00000000..15a5ecae
--- /dev/null
+++ b/data-science/data-engineers.md
@@ -0,0 +1,26 @@
+---
+description: How to research where supply meets demand... 💰🧑‍🏫
+---
+
+# What data is valuable?
+
+_Figure: When you sell the right data at the right price to meet demand._
+### Simple Truths
+
+A lot of people miss the mark on tokenizing data that actually _sells_. If your goal is to make money, you have to research target audiences, find out who's currently buying data, and _correctly price_ your data to meet that demand.
+
+To figure out which market segments are paying for data, it may help you to **go to the Ocean Market and sort by Sales.**
+
+But even then, it's not enough to just publish useful data on Ocean. **You need to market your data assets** to close sales.
+
+Have you tried all these things and are still having trouble making money? Never fear! You can enter one of our [data challenges](https://oceanprotocol.com/challenges) to earn sweet OCEAN rewards and build your data science skills.
+
+But what if you're a well-heeled company looking to create dApps or source data predictions? You can kickstart the value creation loop by working with Ocean Protocol to [sponsor a data challenge](../user-guides/sponsor-a-data-challenge.md).
+
+### What data could be useful for dApp builders?
+
+* **Government Open Data:** Governments serve as a rich and reliable source of data. However, this data often lacks proper documentation or poses challenges for data scientists to work with effectively. One idea is to clean and organize this data so that others can tap into this wealth of information with ease. For example, in one of our [data challenges](https://desights.ai/shared/challenge/8) we leveraged public real estate data from Dubai to build use cases for understanding and predicting valuations and rents. Local, state, and federal governments around the world provide access to valuable data, so making that data easier to consume helps builders create useful products and supports your local community.
+* **Public APIs:** Data scientists can use free, public APIs to tokenize data in such a way that consumers can easily access it. [This](https://github.com/public-apis/public-apis) is a repository of some public APIs for a wide range of topics, from weather to gaming to finance.
+* **On-Chain Data:** There is consistent demand for good decentralized finance (DeFi) data and an emerging need for decentralized social data. Thus, data scientists can query blockchain data to build and sell valuable datasets for consumers.
+* **Datasets for training AI and foundation models:** Much of the uniqueness and value in your data comes from aggregating and cleaning data from different sources. You can scrape the web or source data elsewhere to present to AI/ML engineers looking for data to train their models.
diff --git a/data-science/the-data-value-creation-loop.md b/data-science/the-data-value-creation-loop.md
new file mode 100644
index 00000000..33b9eb88
--- /dev/null
+++ b/data-science/the-data-value-creation-loop.md
@@ -0,0 +1,27 @@
+---
+description: When you have problems, but then you solve them 💁‍♀️
+---
+
+# The Data Value Creation Loop
+
+_Figure: Tell me more._
+ +### What is the Data Value Creation Loop? + +The Data Value Creation Loop is a journey where **data progresses from a business problem to valuable insights**. It involves collaboration among various skillsets like business stakeholders, data engineers, data scientists, MLOps engineers, and application developers. + +Here's a condensed breakdown of the loop: + +1. Business Problem: Identify the specific problem to solve using data science, such as reducing customer churn or predicting token prices. +2. Raw Data: Gather unprocessed data directly from sources, whether static or dynamic, like user profiles or historical prices. +3. Cleaned Data and Feature Vectors: Transform raw data into organized numerical representations, like cleaned sales data or preprocessed text transformed into word embeddings. +4. Trained Models: Train machine learning models on feature vectors to learn patterns and relationships, such as a random forest predicting coffee sales or GPT-3 trained on a text corpus. +5. Data to Tune Models: Introduce additional data to further refine and enhance model performance, like new sales data for the coffee shop model or domain-specific text data for GPT-3. +6. Tuned Models: Optimize models for high performance, accuracy, and robustness, such as a tuned random forest predicting busy hours for the coffee shop or a fine-tuned GPT-3 generating expert-level text. +7. Model Prediction Inputs: Provide inputs to the models to generate insights, like today's date and weather for the sales model or a text prompt for GPT-3 to generate a blog post. +8. Model Prediction Outputs: Obtain predictions or insights from the models based on the inputs, such as the sales model forecasting a spike in iced coffee sales or GPT-3 generating a blog post on sustainability in business. +9. Application: Package the models into applications that can impact real-world scenarios. Build user experiences around the data and model assets to make them usable and valuable. + +### What is an example of a Data Value Creation Loop? + +Let's explore an example to showcase the process of the data value creation loop. Imagine a healthcare organization seeking to develop a predictive model for early detection of diseases. They collaborate with data engineers to collect and preprocess various medical datasets, including patient demographics, lab results, and medical imaging. These datasets are tokenized and made available on the Ocean Protocol platform for secure computation. Data scientists utilize the tokenized data to train machine learning models that can accurately identify early warning signs of diseases. These models are then published as compute assets on Ocean Market. Application developers work with the healthcare organization to integrate the models into their existing patient management system, allowing doctors to receive automated risk assessments and personalized recommendations for preventive care. As a result, patients benefit from early detection, doctors can make more informed decisions, and the healthcare organization generates insights to improve patient outcomes while fostering data and model asset collaboration. Et voilà! 
diff --git a/defi/README.md b/defi/README.md
new file mode 100644
index 00000000..a080369a
--- /dev/null
+++ b/defi/README.md
@@ -0,0 +1,37 @@
+---
+description: All you need to know about Ocean and decentralized finance
+cover: ../.gitbook/assets/cover/defi_banner.png
+coverY: 0
+---
+
+# 🤑 DeFi
+
+## Capitalize with Ocean Protocol 💸
+
+Ocean Protocol's open-source tools are useful in a variety of DeFi applications. Here we show you how to capitalize with Ocean Protocol tech in several ways using crypto trading!
+
+## Predicting the Price of ETH 📈
+
+We offer a Predict ETH data challenge where participants can win prizes 💰 for making accurate ETH price predictions. Participants use machine learning to extrapolate ETH prices for the next 12 hours as accurately as possible and upload these predictions to the Ocean Market as data NFTs. It's a win-win for both Ocean Protocol and challenge participants! This challenge promotes more valuable data feeds on the Ocean Market, and participants gain cash prizes!
+
+Check our [challenges page](https://oceanprotocol.com/challenges) to see if there is an active Predict ETH challenge!
+
+Wondering how to start predicting the price of ETH? We have a [blogpost](https://blog.oceanprotocol.com/capitalize-with-ocean-protocol-a-predict-eth-tutorial-b2da136633f0?source=search\_post---------0----------------------------) 📖 for that!
+
+## Algorithmic Crypto Trading 🤖📊
+
+There are a variety of NFTs on the Ocean Market that offer profitable algorithms for crypto trading using [Trading View](https://www.tradingview.com) and the open-source [freqtrade](http://freqtrade.io) GitHub library. Not sure how to algorithmically trade crypto? We have a few tutorials for that:
+
+Check out our video [tutorial](https://www.youtube.com/watch?v=c7A4vA8YUyI) on how to begin algorithmically trading crypto with no experience using an EMA crossover strategy on the Ocean Market!
+
+{% embed url="https://www.youtube.com/watch?v=c7A4vA8YUyI" %}
+
+Are you more of a reader? Check out our [blogpost](https://blog.oceanprotocol.com/capitalize-with-ocean-protocol-a-sma-algorithmic-trading-tutorial-a2490661ab85) 📖 on how to use a Python SMA crossover strategy with grid search optimization to begin accurately predicting ETH.
+
+## Create Trading Strategy NFTs 📲
+
+Do you have a valuable trading strategy that you want to share or sell? You can anonymously upload your trading strategies as NFTs on-chain using the Ocean Market!
+
+Check out our video tutorial 🧑‍🏫 on how to [publish trading strategy NFTs](https://youtu.be/Q4jj5ukiTZA) on-chain 🔗 with the Ocean Market!
+
+{% embed url="https://youtu.be/Q4jj5ukiTZA" %}
diff --git a/developers/README.md b/developers/README.md
new file mode 100644
index 00000000..868319a0
--- /dev/null
+++ b/developers/README.md
@@ -0,0 +1,24 @@
+---
+description: >-
+  Your resource hub for diving deep into the core concepts, exploring main
+  components, and accessing practical examples and integration guides to unleash
+  the power of Ocean Protocol in your applications
+cover: ../.gitbook/assets/cover/developer_banner.png
+coverY: 0
+---
+
+# 👨‍💻 Developers
+
+With Ocean, crypto wallets transform into magical data wallets, where your data can roam freely and securely. Crypto exchanges? Well, they've taken on a new role as data marketplaces, where you can showcase and trade your valuable data treasures. And hold on tight, because DAOs are here to create epic data co-ops, where collaboration and innovation reign supreme! 🤝
+
+But that's not all! With Ocean Protocol, you gain access to a treasure trove of tools that will unlock your data scientist superpowers and allow you to unleash your creativity. Whether you're a Python aficionado or a JavaScript maestro, we have you covered with the [ocean.py](ocean.py/README.md) and [ocean.js](ocean.js/README.md) libraries. So, get ready to dive into the depths of data innovation and create the next groundbreaking dApp (that's a decentralized app, by the way) using [ocean.js's](ocean.js/README.md) powerful capabilities, or unleash your skills with [ocean.py](ocean.py/README.md). It's time to shake up the data world like never before! 🌐🚀
+
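+Both libraries are one command away. A minimal sketch, assuming Node.js/npm for ocean.js and Python/pip for ocean.py (the package names below are the ones published on npm and PyPI):
+
```bash
# ocean.js: the JavaScript library, published on npm
npm install @oceanprotocol/lib

# ocean.py: the Python library, published on PyPI as ocean-lib
pip install ocean-lib
```
+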
+_Figure: Ocean Protocol Explorer_
+ +At the core of the Ocean Protocol, you'll find a constellation of [smart contracts](contracts/README.md) that bring extraordinary capabilities to every data asset. Here's where the magic happens! Every asset gets its own cool and unique [**ERC721 data NFT**](contracts/data-nfts.md#what-is-a-data-nft), along with one (or more) [**ERC20 datatokens**](contracts/datanft-and-datatoken.md). It's like giving your data its very own superhero cape! 🦸‍♂️ + +These [smart contracts](contracts/README.md) form the backbone of Ocean Protocol, empowering data assets with unparalleled value and enabling seamless integration with the wider blockchain ecosystem. Through the [contracts](contracts/README.md), data becomes not only valuable but also tradable, allowing you to unleash the true potential of your data treasures. + +
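+For a glimpse of how approachable these contracts are from the outside, here is a sketch that lists a few recently deployed data NFTs via the Ocean [subgraph](subgraph/README.md). The endpoint path and the `nfts` entity/field names are assumptions based on the ocean-subgraph deployment for Ethereum mainnet, so double-check them against the Subgraph pages of these docs:
+
```bash
# Ask the Ocean v4 subgraph (Ethereum mainnet) for the 3 newest data NFTs
curl -s -X POST \
  'https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph' \
  -H 'Content-Type: application/json' \
  -d '{"query": "{ nfts(first: 3, orderBy: createdTimestamp, orderDirection: desc) { id name symbol } }"}'
```
+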
+_Figure: Smart Contracts_
+ +Now, if you're new to the world of web3 and blockchain technologies, fear not! We've got you covered. Before diving into the depths of Ocean Protocol, we recommend starting with some introductory guides. These [guides](../user-guides/README.md) will gently introduce you to the magical world of [web3](../discover/wallets/README.md) and help you understand the [basics](../discover/wallets-and-ocean-tokens.md) before you embark on your epic data-driven adventure. diff --git a/developers/aquarius/README.md b/developers/aquarius/README.md new file mode 100644 index 00000000..9691b788 --- /dev/null +++ b/developers/aquarius/README.md @@ -0,0 +1,37 @@ +# Aquarius + +### What is Aquarius? + +Aquarius is a tool that tracks and caches the metadata from each chain where the Ocean Protocol smart contracts are deployed. It operates off-chain, running an Elasticsearch database. This makes it easy to query the metadata generated on-chain. + +The core job of Aquarius is to continually look out for new metadata being created or updated on the blockchain. Whenever such events occur, Aquarius takes note of them, processes this information, and adds it to its database. This allows it to keep an up-to-date record of the metadata activity on the chains. + +Aquarius has its own interface (API) that allows you to easily query this metadata. With Aquarius, you don't need to do the time-consuming task of scanning the data chains yourself. It offers you a convenient shortcut to the information you need. It's ideal for when you need a search feature within your dApp. + + + +
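+For a quick feel of that convenience, here is a one-line request against the public Aquarius instance (the DID below is borrowed from the examples later in these docs; substitute the DID of any published asset):
+
```bash
# Fetch the cached metadata record (DDO) for a single asset by its DID
curl 'https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/did:op:cd086344c275bc7c560e91d472be069a24921e73a2c3798fb2b8caadf8d245d6'
```
+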
+_Figure: Aquarius high level overview_
+### What does Aquarius do?
+
+1. Acts as a cache: It stores metadata from multiple blockchains off-chain in an Elasticsearch database.
+2. Monitors events: It continually checks for MetadataCreated and MetadataUpdated events, processing these events and updating them in the database.
+3. Offers easy query access: The Aquarius API provides a convenient method to access metadata without needing to scan the blockchain.
+4. Serves as an API: It provides a REST API that fetches data from the off-chain datastore.
+5. Features an EventsMonitor: This component runs continually to retrieve and index on-chain metadata, saving results into an Elasticsearch database.
+6. Configurable components: The EventsMonitor has customizable features like the MetadataContract, the Decryptor class, allowed publishers, purgatory settings, VeAllocate, start blocks, and more.
+
+### How to run Aquarius?
+
+We recommend checking the README in the [Aquarius GitHub repository](https://github.com/oceanprotocol/aquarius) for the steps to run Aquarius. If you see any errors in the instructions, please open an issue within the GitHub repository.
+
+### What technology does Aquarius use?
+
+* Python: This is the main programming language used in Aquarius.
+* Flask: This Python framework is used to construct the Aquarius API.
+* Elasticsearch: This is a search and analytics engine used for efficient data indexing and retrieval.
+* REST API: Aquarius uses this software architectural style to provide interoperability between computer systems on the internet.
+
+### Postman documentation
+
+Click [here](https://documenter.getpostman.com/view/2151723/UVkmQc7r) to explore the documentation and more examples in Postman.
diff --git a/developers/aquarius/asset-requests.md b/developers/aquarius/asset-requests.md
new file mode 100644
index 00000000..14708ed6
--- /dev/null
+++ b/developers/aquarius/asset-requests.md
@@ -0,0 +1,275 @@
+# Asset Requests
+
+The universal Aquarius Endpoint is [`https://v4.aquarius.oceanprotocol.com`](https://v4.aquarius.oceanprotocol.com).
+
+### **DDO**
+
+A method for retrieving all information about the asset using a unique identifier known as a Decentralized Identifier (DID).
+
+* **Endpoint**: `GET /api/aquarius/assets/ddo/<did>`
+* **Purpose**: This endpoint is used to fetch the Decentralized Document (DDO) of a particular asset. A DDO is a detailed information package about a specific asset, including its ID, metadata, and other necessary data.
+* **Parameters**: The `<did>` in the URL is a placeholder for the DID, a unique identifier for the asset you want to retrieve the DDO for.
+
+| Name | Description      | Type   | Within | Required |
+| ---- | ---------------- | ------ | ------ | -------- |
+| did  | DID of the asset | string | path   | true     |
+ +Here are some typical responses you might receive from the API: + +* **200**: This is a successful HTTP response code. In this case, it means the server successfully found and returned the DDO for the given DID. The returned data is formatted in JSON. +* **404**: This is an HTTP response code that signifies the requested resource couldn't be found on the server. In this context, it means the asset DID you requested isn't found in Elasticsearch, the database Aquarius uses. The server responds with a JSON-formatted message stating that the asset DID wasn't found. + +#### Curl Example + +{% code overflow="wrap" %} +```bash +curl --location --request GET 'https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/did:op:cd086344c275bc7c560e91d472be069a24921e73a2c3798fb2b8caadf8d245d6' +``` +{% endcode %} + +#### Javascript Example + +```runkit nodeVersion="18.x.x" +const axios = require('axios') +const did = 'did:op:ce3f161fb98c64a2ded37fd34e25f28343f2c88d0c8205242df9c621770d4b3b' +const response = await axios(`https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/${did}`) +console.log(response.status) +console.log(response.data.nftAddress) +console.log(response.data.metadata.name) +console.log(response.data.metadata.description) + +``` + +### **Metadata** + +A method for retrieving the metadata about the asset using the Decentralized Identifier (DID). + +* **Endpoint**: `GET /api/aquarius/assets/metadata/` +* **Purpose**: This endpoint is used to fetch the metadata of a particular asset. It includes details about the asset such as its name, description, creation date, owner, etc. +* **Parameters**: The `` in the URL is a placeholder for the DID, a unique identifier for the asset you want to retrieve the metadata for. + +Here are some typical responses you might receive from the API: + +* **200**: This is a successful HTTP response code. In this case, it means the server successfully found and returned the metadata for the given DID. The returned data is formatted in JSON. +* **404**: This is an HTTP response code that signifies the requested resource couldn't be found on the server. In this context, it means the asset DID you requested isn't found in the database. The server responds with a JSON-formatted message stating that the asset DID wasn't found. + +#### Parameters + +
+| Name | Description      | Type   | Within | Required |
+| ---- | ---------------- | ------ | ------ | -------- |
+| did  | DID of the asset | string | path   | true     |
+ +#### Curl Example + +{% code overflow="wrap" %} +```bash +curl --location --request GET 'https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/metadata/did:op:cd086344c275bc7c560e91d472be069a24921e73a2c3798fb2b8caadf8d245d6' +``` +{% endcode %} + +#### Javascript Example + +```runkit nodeVersion="18.x.x" +const axios = require('axios') +const did = 'did:op:ce3f161fb98c64a2ded37fd34e25f28343f2c88d0c8205242df9c621770d4b3b' +const response = await axios(`https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/metadata/${did}`) +console.log(response.status) +console.log(response.data.name) +console.log(response.data.description) + +``` + +### **Asset Names** + +Used to retrieve the names of a group of assets using a list of unique identifiers known as Decentralized Identifiers (DIDs). + +Here's a more detailed explanation: + +* **Endpoint**: `POST /api/aquarius/assets/names` +* **Purpose**: This endpoint is used to fetch the names of specific assets. These assets are identified by a list of DIDs provided in the request payload. The returned asset names are those specified in the assets' metadata. +* **Parameters**: The parameters are sent in the body of the POST request, formatted as JSON. Specifically, an array of DIDs (named "didList") should be provided. + +Here are some typical responses you might receive from the API: + +* **200**: This is a successful HTTP response code. In this case, it means the server successfully found and returned the names for the assets corresponding to the provided DIDs. The returned data is formatted in JSON, mapping each DID to its respective asset name. +* **400**: This is an HTTP response code that signifies a client error in the request. In this context, it means that the "didList" provided in the request payload was empty. The server responds with a JSON-formatted message indicating that the requested "didList" cannot be empty. + +#### Parameters + +
+| Name    | Description        | Type | Within | Required |
+| ------- | ------------------ | ---- | ------ | -------- |
+| didList | list of asset DIDs | list | body   | true     |
+ +#### Curl Example + +```bash +curl --location --request POST 'https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/names' \ +--header 'Content-Type: application/json' \ +--data-raw '{ + "didList" : ["did:op:cd086344c275bc7c560e91d472be069a24921e73a2c3798fb2b8caadf8d245d6"] +}' +``` + +#### Javascript Example + +```runkit nodeVersion="18.x.x" +const axios = require('axios') + +const body = {didList : ["did:op:cd086344c275bc7c560e91d472be069a24921e73a2c3798fb2b8caadf8d245d6", "did:op:ce3f161fb98c64a2ded37fd34e25f28343f2c88d0c8205242df9c621770d4b3b"]} + +const response = await axios.post('https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/names', body) +console.log(response.status) +for (let key in response.data) { + console.log(key + ': ' + response.data[key]); +} + +``` + +### Query Assets + +Used to run a custom search query on the assets using Elasticsearch's native query syntax. We recommend reading the [Elasticsearch documentation](https://www.elastic.co/guide/index.html) to understand their syntax. + +* **Endpoint**: `POST /api/aquarius/assets/query` +* **Purpose**: This endpoint is used to execute a native Elasticsearch (ES) query against the stored assets. This allows for highly customizable searches and can be used to filter and sort assets based on complex criteria. The body of the request should contain a valid JSON object that defines the ES query. +* **Parameters**: The parameters for this endpoint are provided in the body of the POST request as a valid JSON object that conforms to the Elasticsearch query DSL (Domain Specific Language). + +Here are some typical responses you might receive from the API: + +* **200**: This is a successful HTTP response code. It means the server successfully ran your ES query and returned the results. The results are returned as a JSON object. +* **500**: This HTTP status code represents a server error. In this context, it typically means there was an error with Elasticsearch while trying to execute the query. It could be due to an invalid or malformed query, an issue with the Elasticsearch service, or some other server-side problem. The specific details of the error are typically included in the response body. + +#### Curl Example + +{% code overflow="wrap" %} +```bash +curl --location --request POST 'https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/query' \ +--header 'Content-Type: application/json' \ +--data-raw '{ + "query": { + "match_all": {} + } +}' +``` +{% endcode %} + +#### Javascript Example + +```runkit nodeVersion="18.x.x" +const axios = require('axios') + +const body = { "query": { "match_all": { } } } + + +const response = await axios.post('https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/query', body) +console.log(response.status) +console.log(response.data.hits.hits[0]) +for (const value of response.data.hits.hits) { + console.log(value); +} + +``` + +### Validate DDO + +Used to validate the content of a DDO (Decentralized Identifier Document). + +* **Endpoint**: `POST /api/aquarius/assets/ddo/validate` +* **Purpose**: This endpoint is used to verify the validity of a DDO. This could be especially helpful prior to submitting a DDO to ensure it meets the necessary criteria and avoid any issues or errors. The endpoint consumes `application/octet-stream`, which means the data sent should be in binary format, often used for handling different data types. 
+* **Parameters**: The parameters for this endpoint are provided in the body of the POST request as a valid JSON object, which represents the DDO that needs to be validated.
+
+Here are some typical responses you might receive from the API:
+
+* **200**: This is a successful HTTP response code. It means the server successfully validated your DDO content and it meets the necessary criteria.
+* **400**: This HTTP status code indicates a client error. In this context, it means that the submitted DDO format is invalid. You will need to revise the DDO content according to the required specifications and resubmit it.
+* **500**: This HTTP status code represents a server error. This indicates an internal server error while processing your request. The specific details of the error are typically included in the response body.
+
+#### Curl Example
+
+{% code overflow="wrap" %}
```bash
curl --location --request POST 'https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/validate' \
--header 'Content-Type: application/json' \
--data-raw ''
```
+{% endcode %}
+
+#### Javascript Example
+
```runkit nodeVersion="18.x.x"
const axios = require('axios')

const body = {
  "@context": ["https://w3id.org/did/v1"],
  "id": "did:op:56c3d0ac76c02cc5cec98993be2b23c8a681800c08f2ff77d40c895907517280",
  "version": "4.1.0",
  "chainId": 1337,
  "nftAddress": "0xabc",
  "metadata": {
    "created": "2000-10-31T01:30:00.000-05:00Z",
    "updated": "2000-10-31T01:30:00.000-05:00",
    "name": "Ocean protocol white paper",
    "type": "dataset",
    "description": "Ocean protocol white paper -- description",
    "author": "Ocean Protocol Foundation Ltd.",
    "license": "CC-BY",
    "contentLanguage": "en-US",
    "tags": ["white-papers"],
    "additionalInformation": {"test-key": "test-value"},
    "links": [
      "http://data.ceda.ac.uk/badc/ukcp09/data/gridded-land-obs/gridded-land-obs-daily/",
      "http://data.ceda.ac.uk/badc/ukcp09/data/gridded-land-obs/gridded-land-obs-averages-25km/",
      "http://data.ceda.ac.uk/badc/ukcp09/"
    ]
  },
  "services": [
    {
      "id": "test",
      "type": "access",
      "datatokenAddress": "0xC7EC1970B09224B317c52d92f37F5e1E4fF6B687",
      "name": "Download service",
      "description": "Download service",
      "serviceEndpoint": "http://172.15.0.4:8030/",
      "timeout": 0,
      "files": "encryptedFiles"
    }
  ]
}

const response = await axios.post('https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/validate', body)
console.log(response.status)
console.log(response.data)

```
+
+### Trigger Caching
+
+Used to manually initiate the process of DDO caching based on a transaction ID. This transaction ID should include either MetadataCreated or MetadataUpdated events.
+
+* **Endpoint**: `POST /api/aquarius/assets/triggerCaching`
+* **Purpose**: This endpoint is used to manually trigger the caching process of a DDO (Decentralized Identifier Document). This process is initiated based on a specific transaction ID, which should include either MetadataCreated or MetadataUpdated events. This can be particularly useful in situations where immediate caching of metadata changes is required.
+* **Parameters**: The parameters for this endpoint are provided in the body of the POST request as a valid JSON object. This includes the transaction ID and log index that are associated with the metadata event.
+
+| Name          | Description                                  | Type   | Within | Required |
+| ------------- | -------------------------------------------- | ------ | ------ | -------- |
+| transactionId | transaction ID containing the metadata event | string | body   | true     |
+| logIndex      | custom log index for the transaction         | int    | body   | false    |
+ +Here are some typical responses you might receive from the API: + +* **200**: This is a successful HTTP response code. It means the server successfully initiated the DDO caching process and the updated asset is returned. +* **400**: This HTTP status code indicates a client error. In this context, it suggests issues with the request: either the log index was not found, or the transaction log did not contain MetadataCreated or MetadataUpdated events. You should revise your input parameters and try again. +* **500**: This HTTP status code represents a server error. This indicates an internal server error while processing your request. The specific details of the error are typically included in the response body. + +#### Curl Example + +{% code overflow="wrap" %} +```bash +curl --location --request POST 'https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/triggerCaching' \ +--header 'Content-Type: application/json' \ +--data-raw '' +``` +{% endcode %} + +#### Javascript Example + +```runkit nodeVersion="18.x.x" +const axios = require('axios') + +const body = { "transactionId": "0x945596edf2a26d127514a78ed94fea86b199e68e9bed8b6f6d6c8bb24e451f27", "logIndex": 0} +const response = await axios.post( 'https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/triggerCaching', body) +console.log(response.status) +console.log(response.data) + +``` + diff --git a/developers/aquarius/chain-requests.md b/developers/aquarius/chain-requests.md new file mode 100644 index 00000000..2668536e --- /dev/null +++ b/developers/aquarius/chain-requests.md @@ -0,0 +1,82 @@ +# Chain Requests + +The universal Aquarius Endpoint is [`https://v4.aquarius.oceanprotocol.com`](https://v4.aquarius.oceanprotocol.com). + +### Chain List + +Retrieves a list of chains that are currently supported or recognized by the Aquarius service. + +* **Endpoint**: `GET /api/aquarius/chains/list` +* **Purpose**: This endpoint provides a list of the chain IDs that are recognized by the Aquarius service. Each chain ID represents a different blockchain network, and the boolean value indicates if the chain is currently active (true) or not (false). +* **Parameters**: This endpoint does not require any parameters. You simply send a GET request to it. + +Here are some typical responses you might receive from the API: + +* **200**: This is a successful HTTP response code. It means the server has successfully processed the request and returns a JSON object containing chain IDs as keys and their active status as values. + +Example response: + +```json +{ "246": true, "3": true, "137": true, + "2021000": true, "4": true, "1": true, + "56": true, "80001": true, "1287": true +} +``` + +#### Curl Example + +{% code overflow="wrap" %} +```bash +curl --location --request GET 'https://v4.aquarius.oceanprotocol.com/api/aquarius/chains/list' +``` +{% endcode %} + +#### Javascript Example + +```runkit nodeVersion="18.x.x" +const axios = require('axios') + +const response = await axios( 'https://v4.aquarius.oceanprotocol.com/api/aquarius/chains/list') +console.log(response.status) +console.log(response.data) + +``` + +### **Chain Status** + +Retrieves the index status for a specific chain\_id from the Aquarius service. + +* **Endpoint**: `GET /api/aquarius/chains/status/{chain_id}` +* **Purpose**: This endpoint is used to fetch the index status for a specific blockchain chain, identified by its chain\_id. The status, expressed as the "last\_block", gives the most recent block that Aquarius has processed on this chain. 
+* **Parameters**: This endpoint requires a chain\_id as a parameter in the path. This chain\_id represents the specific chain you want to get the index status for.
+
+Here are some typical responses you might receive from the API:
+
+* **200**: This is a successful HTTP response code. It means the server has successfully processed the request and returns a JSON object containing the "last\_block", which is the most recent block that Aquarius has processed on this chain. In the example response below, "25198729" is the last block processed on the chain with chain\_id "137".
+
+Example response:
+
```json
{"last_block": 25198729}
```
+
+#### Curl Example
+
+{% code overflow="wrap" %}
```bash
curl --location --request GET 'https://v4.aquarius.oceanprotocol.com/api/aquarius/chains/status/137'
```
+{% endcode %}
+
+#### Javascript Example
+
```runkit nodeVersion="18.x.x"
const axios = require('axios')
const chainId = 1

const response = await axios(`https://v4.aquarius.oceanprotocol.com/api/aquarius/chains/status/${chainId}`)
console.log(response.status)
console.log(response.data)

```
diff --git a/developers/aquarius/other-requests.md b/developers/aquarius/other-requests.md
new file mode 100644
index 00000000..40d9eede
--- /dev/null
+++ b/developers/aquarius/other-requests.md
@@ -0,0 +1,100 @@
+# Other Requests
+
+The universal Aquarius Endpoint is [`https://v4.aquarius.oceanprotocol.com`](https://v4.aquarius.oceanprotocol.com).
+
+### **Info**
+
+Retrieves version, plugin, and software information from the Aquarius service.
+
+* **Endpoint**: `GET /`
+* **Purpose**: This endpoint is used to fetch key information about the Aquarius service, including its current version, the plugin it's using, and the name of the software itself.
+
+Here are some typical responses you might receive from the API:
+
+* **200**: This is a successful HTTP response code. It means the server has successfully processed the request and returns a JSON object containing the `plugin`, `software`, and `version`.
+
+Example response:
+
```json
{
  "plugin": "elasticsearch",
  "software": "Aquarius",
  "version": "4.2.0"
}
```
+
+#### Curl Example
+
```bash
curl --location --request GET 'https://v4.aquarius.oceanprotocol.com/'
```
+
+#### Javascript Example
+
```runkit nodeVersion="18.x.x"
const axios = require('axios')

const response = await axios('https://v4.aquarius.oceanprotocol.com/')
console.log(response.status)
console.log(response.data)

```
+
+### **Health**
+
+Retrieves the health status of the Aquarius service.
+
+* **Endpoint**: `GET /health`
+* **Purpose**: This endpoint is used to fetch the current health status of the Aquarius service. This can be helpful for monitoring and ensuring that the service is running properly.
+
+Here are some typical responses you might receive from the API:
+
+* **200**: This is a successful HTTP response code. It means the server has successfully processed the request and returns a message indicating the health status. For example, "Elasticsearch connected" indicates that the Aquarius service is able to connect to Elasticsearch, which is a good sign of its health.
+#### Curl Example
+
```bash
curl --location --request GET 'https://v4.aquarius.oceanprotocol.com/health'
```
+
+#### Javascript Example
+
```runkit nodeVersion="18.x.x"
const axios = require('axios')

const response = await axios('https://v4.aquarius.oceanprotocol.com/health')
console.log(response.status)
console.log(response.data)

```
+
+### **Spec**
+
+Retrieves the Swagger specification for the Aquarius service.
+
+* **Endpoint**: `GET /spec`
+* **Purpose**: This endpoint is used to fetch the Swagger specification of the Aquarius service. Swagger is a set of rules (in other words, a specification) for a format describing REST APIs. This endpoint returns a document that describes the entire API, including the available endpoints, their methods, parameters, and responses.
+
+Here are some typical responses you might receive from the API:
+
+* **200**: This is a successful HTTP response code. It means the server has successfully processed the request and returns the Swagger specification.
+
+#### Curl Example
+
```bash
curl --location --request GET 'https://v4.aquarius.oceanprotocol.com/spec'
```
+
+#### Javascript Example
+
```runkit nodeVersion="18.x.x"
const axios = require('axios')

const response = await axios('https://v4.aquarius.oceanprotocol.com/spec')
console.log(response.status)
console.log(response.data.info)

```
diff --git a/developers/architecture.md b/developers/architecture.md
new file mode 100644
index 00000000..e43d5250
--- /dev/null
+++ b/developers/architecture.md
@@ -0,0 +1,50 @@
+---
+description: Ocean Protocol Architecture Adventure!
+---
+
+# Architecture Overview
+
+Embark on an exploration of the innovative realm of Ocean Protocol, where data flows seamlessly and AI achieves new heights. Dive into the intricately layered architecture that converges data and services, fostering a harmonious collaboration. Let us delve deep and uncover the profound design of Ocean Protocol. 🐬
+
+_Figure: Overview of the Ocean Protocol Architecture_
+ +### Layer 1: The Foundational Blockchain Layer + +At the core of Ocean Protocol lies the robust [Blockchain Layer](contracts/README.md). Powered by blockchain technology, this layer ensures secure and transparent transactions. It forms the bedrock of decentralized trust, where data providers and consumers come together to trade valuable assets. + +The [smart contracts](contracts/README.md) are deployed on the Ethereum mainnet and other compatible [networks](../discover/networks/README.md). The libraries encapsulate the calls to these smart contracts and provide features like publishing new assets, facilitating consumption, managing pricing, and much more. To explore the contracts in more depth, go ahead to the [contracts](contracts/README.md) section. + +### Layer 2: The Empowering Middle Layer + +Above the smart contracts, you'll find essential [libraries](architecture.md#libraries) employed by applications within the Ocean Protocol ecosystem, the [middleware components](architecture.md#middleware-components), and [Compute-to-Data](architecture.md#compute-to-data). + +#### Libraries + +These libraries include [Ocean.js](ocean.js/README.md), a JavaScript library, and [Ocean.py](ocean.py/README.md), a Python library. They serve as powerful tools for developers, enabling integration and interaction with the protocol. + +1. [Ocean.js](ocean.js/README.md): Ocean.js is a JavaScript library that serves as a powerful tool for developers looking to integrate their applications with the Ocean Protocol ecosystem. Designed to facilitate interaction with the protocol, Ocean.js provides a comprehensive set of functionalities, including data tokenization, asset management, and smart contract interaction. Ocean.js simplifies the process of implementing data access controls, building dApps, and exploring data sets within a decentralized environment. +2. [Ocean.py](ocean.py/README.md): Ocean.py is a Python library that empowers developers to integrate their applications with the Ocean Protocol ecosystem. With its rich set of functionalities, Ocean.py provides a comprehensive toolkit for interacting with the protocol. Developers and [data scientists](../data-science/README.md) can leverage Ocean.py to perform a wide range of tasks, including data tokenization, asset management, and smart contract interactions. This library serves as a bridge between Python and the decentralized world of Ocean Protocol, enabling you to harness the power of decentralized data. + +#### Middleware components + +Additionally, in supporting the discovery process, middleware components come into play: + +1. [Aquarius](aquarius/README.md): Aquarius acts as a metadata cache, enhancing search efficiency by caching on-chain data into Elasticsearch. By accelerating metadata retrieval, Aquarius enables faster and more efficient data discovery. +2. [Provider](provider/README.md): The Provider component plays a crucial role in facilitating various operations within the ecosystem. It assists in asset downloading, handles [DDO](ddo-specification.md) (Decentralized Data Object) encryption, and establishes communication with the operator-service for Compute-to-Data jobs. This ensures secure and streamlined interactions between different participants. +3. [Subgraph](subgraph/README.md): The Subgraph is an off-chain service that utilizes GraphQL to offer efficient access to information related to datatokens, users, and balances. By leveraging the subgraph, data retrieval becomes faster compared to an on-chain query. 
This enhances the overall performance and responsiveness of applications that rely on accessing this information. + +#### Compute-to-Data + +[Compute-to-Data](compute-to-data/README.md) (C2D) represents a groundbreaking paradigm within the Ocean Protocol ecosystem, revolutionizing the way data is processed and analyzed. With C2D, the traditional approach of moving data to the computation is inverted, ensuring privacy and security. Instead, algorithms are securely transported to the data sources, enabling computation to be performed locally, without the need to expose sensitive data. This innovative framework facilitates collaborative data analysis while preserving data privacy, making it ideal for scenarios where data owners want to retain control over their valuable assets. C2D provides a powerful tool for enabling secure and privacy-preserving data analysis and encourages collaboration among data providers, ensuring the utilization of valuable data resources while maintaining strict privacy protocols. + +### Layer 3: The Accessible Application Layer + +Here, the ocean comes alive with a vibrant ecosystem of dApps, marketplaces, and more. This layer hosts a variety of user-friendly interfaces, applications, and tools, inviting data scientists and curious explorers alike to access, explore, and contribute to the ocean's treasures. + +Prominently featured within this layer is [Ocean Market](../user-guides/using-ocean-market.md), a hub where data enthusiasts and industry stakeholders converge to discover, trade, and unlock the inherent value of data assets. Beyond Ocean Market, the Application Layer hosts a diverse ecosystem of specialized applications and marketplaces, each catering to unique use cases and industries. Empowered by the capabilities of Ocean Protocol, these applications facilitate advanced data exploration, analytics, and collaborative ventures, revolutionizing the way data is accessed, shared, and monetized. + +### Layer 4: The Friendly Wallets + +At the top of the Ocean Protocol ecosystem, we find the esteemed [Web 3 Wallets](../discover/wallets/README.md), the gateway for users to immerse themselves in the world of decentralized data transactions. These wallets serve as trusted companions, enabling users to seamlessly transact within the ecosystem, purchase and sell data NFTs, and acquire valuable datatokens. For a more detailed exploration of Web 3 Wallets and their capabilities, you can refer to the [wallet intro page](../discover/wallets/README.md). + +With the layers of the architecture clearly delineated, the stage is set for a comprehensive exploration of their underlying logic and intricate design. By examining each individually, we can gain a deeper understanding of their unique characteristics and functionalities. diff --git a/developers/barge/README.md b/developers/barge/README.md new file mode 100644 index 00000000..491fc775 --- /dev/null +++ b/developers/barge/README.md @@ -0,0 +1,22 @@ +--- +description: 🧑🏽‍💻 Local Development Environment for Ocean Protocol +--- + +# Barge + +The Barge component of Ocean Protocol is a powerful tool designed to simplify the development process by providing Docker Compose files for running the full Ocean Protocol stack locally. It allows developers to set up and configure the various services required by Ocean Protocol for local testing and development purposes. 
+
+By using the Barge component, developers can spin up an environment that includes default versions of [Aquarius](../aquarius/README.md), [Provider](../provider/README.md), [Subgraph](../subgraph/README.md), and [Compute-to-Data](../compute-to-data/README.md). Additionally, it deploys all the [smart contracts](../contracts/README.md) from the ocean-contracts repository, ensuring a complete and functional local setup. The Barge component also starts additional services like [Ganache](https://trufflesuite.com/ganache/), a local blockchain simulator used for smart contract development, and [Elasticsearch](https://www.elastic.co/elasticsearch/), a powerful search and analytics engine required by Aquarius for efficient indexing and querying of data sets. A full list of components and exposed ports is available in the GitHub [repository](https://github.com/oceanprotocol/barge#component-versions-and-exposed-ports).
+
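+Once the stack is up, a quick sanity check is to hit the local services directly. A minimal sketch, assuming the default ports from the Barge component table linked above (adjust if you have overridden them):
+
```bash
# Aquarius metadata cache (Barge default port 5000)
curl http://localhost:5000/health

# Provider (Barge default port 8030)
curl http://localhost:8030/

# Ganache JSON-RPC (default port 8545): request the latest block number
curl -X POST http://localhost:8545 \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","method":"eth_blockNumber","params":[],"id":1}'
```
+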
*Figure: Load Ocean components locally by using Barge*
To explore all the available options and gain a deeper understanding of how to utilize the Barge component, you can visit the official GitHub [repository](https://github.com/oceanprotocol/barge#all-options) of Ocean Protocol.

By utilizing the Barge component, developers gain the freedom to experiment with, customize, and fine-tune their local development environment. Barge also offers the flexibility to override the Docker image tag associated with specific components: by setting the appropriate environment variable before executing the `start_ocean.sh` command, developers can customize the versions of various components according to their requirements. For instance, developers can set the `AQUARIUS_VERSION`, `PROVIDER_VERSION`, `CONTRACTS_VERSION`, `RBAC_VERSION`, and `ELASTICSEARCH_VERSION` environment variables to specify the desired Docker image tags for each respective component.

{% hint style="warning" %}
⚠️ We've got an important heads-up about Barge that we want to share with you. Brace yourself, because **Barge is not for the faint-hearted**! Here's the deal: Barge works great on Linux, but we need to be honest about its limitations on macOS. And, well, it doesn't work at all on Windows. Sorry, Windows users!

To make things easier for everyone, we **strongly** recommend trying a **testnet** first. Everything there is already configured, so it should be sufficient for your needs as well. Visit our [networks](../../discover/networks/README.md) page for clarity on the available test networks. ⚠️
{% endhint %}

diff --git a/developers/barge/local-setup-ganache.md b/developers/barge/local-setup-ganache.md new file mode 100644 index 00000000..75a13945 --- /dev/null +++ b/developers/barge/local-setup-ganache.md @@ -0,0 +1,46 @@
---
description: 🧑🏽‍💻 Your Local Development Environment for Ocean Protocol
---

# Local Setup

**Functionalities of Barge**

Barge offers several functionalities that enable developers to create and test the Ocean Protocol infrastructure efficiently. Here are its key components:
| Functionality | Description |
| ------------- | ----------- |
| Aquarius | A metadata storage and retrieval service for Ocean Protocol. Allows indexing and querying of metadata. |
| Provider | A service that facilitates interaction between users and the Ocean Protocol network. |
| Ganache | A local Ethereum blockchain network for testing and development purposes. |
| TheGraph | A decentralized indexing and querying protocol used for building subgraphs in Ocean Protocol. |
| ocean-contracts | Smart contracts repository for Ocean Protocol. Deploys and manages the necessary contracts for local development. |
| Customization and Options | Barge provides various options to customize component versions, log levels, and enable/disable specific blocks. |

Barge helps developers to get started with Ocean Protocol by providing a local development environment. With its modular and user-friendly design, developers can focus on building and testing their applications without worrying about the intricacies of the underlying infrastructure.

To use Barge, you can follow the instructions in the [Barge repository](https://github.com/oceanprotocol/barge).

Before getting started, make sure you have the following prerequisites:

* **Linux** or **macOS** operating system. Barge does not currently support Windows, but you can run it inside a Linux virtual machine or use the Windows Subsystem for Linux (WSL).
* Docker installed on your system. You can download and install Docker from the [Docker website](https://www.docker.com/get-started). On Linux, you may need to allow non-root users to run Docker. On Windows or macOS, it is recommended to increase the memory allocated to Docker to 4 GB (default is 2 GB).
* Docker Compose, which is used to manage the Docker containers. You can find installation instructions in the [Docker Compose documentation](https://docs.docker.com/compose/).

Once you have the prerequisites set up, you can clone the Barge repository and navigate to the repository folder using the command line:

```bash
git clone git@github.com:oceanprotocol/barge.git
cd barge
```

The repository contains a shell script called `start_ocean.sh` that you can run to start the Ocean Protocol stack locally for development. To start Barge with the default configurations, simply run the following command:

```bash
./start_ocean.sh
```

This command will start the default versions of Aquarius, Provider, and Ganache, along with the Ocean contracts deployed to Ganache.

For more advanced options and customization, you can refer to the README file in the Barge repository. It provides detailed information about the available startup options, component versions, log levels, and more.

To clean up your environment and stop all the Barge-related containers, volumes, and networks, you can run the following command:

```bash
./cleanup.sh
```

Please refer to the Barge repository's README for more comprehensive instructions, examples, and details on how to use Barge for local development with the Ocean Protocol stack.
diff --git a/building-with-ocean/build-a-marketplace/README.md b/developers/build-a-marketplace/README.md similarity index 62% rename from building-with-ocean/build-a-marketplace/README.md rename to developers/build-a-marketplace/README.md index 62af90a7..6924c520 100644 --- a/building-with-ocean/build-a-marketplace/README.md +++ b/developers/build-a-marketplace/README.md @@ -1,6 +1,5 @@
---
title: Forking Ocean Market
-featuredImage: images/creatures/mantaray/mantaray-full@2x.png
description: Forking and customizing Ocean Market (Frontend)
---

@@ -20,9 +19,9 @@ Using Ocean Market is already a big improvement on the alternatives that are out

The tutorial covers:

-* Forking and running Ocean Market locally
-* Customising your fork of Ocean market
-* Quick deployment of Ocean Market
+* [Forking and running Ocean Market locally](forking-ocean-market.md)
+* [Customizing your fork of Ocean Market](customising-your-market.md)
+* [Quick deployment of Ocean Market](deploying-market.md)

## Preparation

@@ -30,14 +29,18 @@

If you’re completely unfamiliar with Ocean Market or web3 applications in general, you will benefit from reading these guides first:

-* To use your clone of Ocean Market, you’ll need a [wallet](../wallets.md). We recommend [getting set up with metamask](../../orientation/metamask-setup.md).
-* You’ll also need some [Ocean tokens on a testnet](../wallets-and-ocean-tokens.md) to use your marketplace.
-* When you have the testnet tokens, have a go at [publishing a data asset](../../using-ocean-market/marketplace-publish-data-asset.md) on Ocean Market.
-* Run through the process of [consuming a data asset](../../using-ocean-market/marketplace-download-data-asset.md) on Ocean Market.
+* To use your clone of Ocean Market, you’ll need a [wallet](../../discover/wallets/README.md). We recommend [getting set up with MetaMask](../../discover/wallets/metamask-setup.md).
+* You’ll also need some [Ocean tokens on a testnet](../../discover/wallets-and-ocean-tokens.md) to use your marketplace.
+* When you have the testnet tokens, have a go at [publishing a data NFT](../../user-guides/publish-data-nfts.md) on Ocean Market.
+* Run through the process of [consuming a data asset](../../user-guides/buy-data-nfts.md) on Ocean Market.

**Required Prerequisites**

* Git. Instructions for installing Git can be found [here](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git).
-* Node.js can be downloaded from [here](https://nodejs.org/en/download/) (we’re using version 16 in this guide)
+* Node.js can be downloaded from [here](https://nodejs.org/en/download/) (we’re using version 18 in this guide)
* A decent code editor, such as [Visual Studio Code](https://code.visualstudio.com/).
-* You’ll need a Github account to fork Ocean market via [Github](https://github.com/).
+* You’ll need a GitHub account to fork Ocean Market via [GitHub](https://github.com/).
+
+{% hint style="warning" %}
+Let's emphasize an important aspect of building dApps. Keep in mind that practically anything can be written to the blockchain 😵 When you integrate with our components, it is therefore **essential** that you, as a developer, ensure **proper sanitization** of the responses on your end. This means you should carefully **validate and filter** the data received to **prevent** any potential vulnerabilities or security risks in your applications.
{% endhint %} diff --git a/developers/build-a-marketplace/customising-your-market.md b/developers/build-a-marketplace/customising-your-market.md new file mode 100644 index 00000000..ee587733 --- /dev/null +++ b/developers/build-a-marketplace/customising-your-market.md @@ -0,0 +1,211 @@
---
title: Customising Market
description: Step by step guide to customizing your fork of Ocean market
---

# Customising a Market

So you’ve got a fully functioning data marketplace at this point, which is pretty cool. But it doesn’t really look like your data marketplace. Right now, it’s still just a clone of Ocean Market — the same branding, name, logo, etc. The next few steps focus on personalizing your data marketplace.

* Change your Market Name
* Change the Logo
* Change the Styling
* Edit the Publish Form
* Advanced customization

## Change your Market Name

It’s now time to open up your favorite code editor and start getting stuck into the code. The first thing we will be doing is changing the name of your marketplace. A decent code editor (such as VS Code) makes this incredibly simple by searching and replacing all the places where the name appears.

Let’s start by searching and replacing `Ocean Marketplace`. In VS Code there is a magnifying glass symbol in the left-hand panel (arrow 1 in the image below) that will open up the interface for searching and replacing text. Type “Ocean Marketplace” into the first textbox, and the name of your marketplace into the second textbox (arrow 2). To make things simple, there is a button to the right of the second textbox (arrow 3) that will replace all instances at once. You can take a moment to review all the text you’re changing if you wish, and then click this button.

![Market Customisation](../../.gitbook/assets/market/market-customisation-3.png)

Next up, we need to repeat the process but this time we’ll be searching for and replacing `Ocean Market`. As you can see in the screenshot below, we have called our fork `Crypto Photos Market`.

![Market Customisation](../../.gitbook/assets/market/market-customisation-4.png)

![Market Customisation](../../.gitbook/assets/market/market-customisation-4.1.png)

![Market Customisation](../../.gitbook/assets/market/market-customisation-4.2.jpg)

Now let’s change the tagline of your site. Open up the folder called `content` and then open the file called `site.json`.

![Market Customisation](../../.gitbook/assets/market/market-customisation-5.png)

On line 3 in this file, you can enter the tagline that you want for your marketplace.

![Market Customisation](../../.gitbook/assets/market/market-customisation-6.png)

![Market Customisation](../../.gitbook/assets/market/market-customisation-6.1.png)

## Change the Logo

The next important step to personalizing your marketplace is setting up your own logo. We highly recommend using your logo in SVG format for this. The site logo is stored in the following location:

```
src/@images/logo.svg
```

Delete the `logo.svg` file from that folder and paste your own logo into the same folder. Then rename your own file to `logo.svg`, and everything will work without any problems.

At this point, it’s a good idea to check how things are looking. First, check that you have saved all of your changes, then cancel the build that’s running in your terminal (Ctrl + C or Cmd + C) and start it again with `npm start`. Once the build has finished, navigate to http://localhost:8000/ and see how things look.
+ +![Market Customisation](../../.gitbook/assets/market/market-customisation-7.1.png) + +Awesome! Our logo is looking great! + +## Change the Styling + +Hopefully, you like our pink and purple branding, but we don’t expect you to keep it in your own marketplace. This step focuses on applying your own brand colors and styles. + +### Background + +Let’s start with the background. Open up the following CSS file: + +``` +src/components/App/index.module.css +``` + +You’ll notice in the screenshot above that we are setting our `wave` background on line 3. Here, you’ll want to use your own background color or image. For this example, we’ll use an SVG background from [here](https://www.svgbackgrounds.com/). First, we save the new background image into the src/images/ folder (same folder as the logo), then we change the CSS to the file location of the new background (see line 3 in the image below). + +![Market Customisation](../../.gitbook/assets/market/market-customisation-8.png) + +If we save this file and view the site at this point, we get a white section at the top (see image below). And you’ll also notice that the background doesn’t fill all the way down to the bottom of the screen. + +![Market Customisation](../../.gitbook/assets/market/market-customisation-10.1.png) ![Market Customisation](../../.gitbook/assets/market/market-customisation-10.2.png) + +To fix this, we need to change the starting position of the background image and change it from no-repeat to repeat. We can do this on line 3. + +When we view our marketplace, we can see that the new background starts at the top and fills the whole page. Perfect! + +![Market Customisation](../../.gitbook/assets/market/market-customisation-11.1.png) + +### Brand Colors + +Next up, let’s change the background colors to match your individual style. Open up the following file: `src/global/_variables.css`. Here you’ll see the global style colors that are set. Now is the time to get creative, or consult your brand handbook (if you already have one). + +You can change these colors as much as you wish until you’re happy with how everything looks. Each time you save your changes, the site will immediately update so you can see how things look. You can see the styles chosen for this example in the image below. + +![Market Customisation](../../.gitbook/assets/market/market-customisation-12.png) + +### Change Fonts + +The final part of the styling that we’ll alter in this guide is the fonts. This is an important step because the font used in Ocean Market is one of the few elements of the market that are **copyright protected**. If you want to use the same font you’ll need to purchase a license. The other copyrighted elements are the logo and the name — which we have already changed. + +If you don’t already have a brand font, head over to Google Fonts to pick some fonts that suit the brand you’re trying to create. Google makes it nice and easy to see how they’ll look, and it’s simple to import them into your project. + +The global fonts are set in the same file as the colors, scroll down and you’ll see them on lines 36 to 41. + +If you are importing fonts, such as from Google Fonts, you need to make sure that you include the import statement at the top of the `_variables.css` file. + +As with the color changes, it’s a good idea to save the file with each change and check if the site is looking the way that you expected it to. You can see our eclectic choices below. 
![Market Customisation](../../.gitbook/assets/market/market-customisation-13.png)

## Customize the Publish Form

Let’s head to the publish page to see what it looks like with our new styling - so far, so good. But there is one major issue: the publish form is still telling people to publish datasets. On our new marketplace, we want people to publish and sell their photos, so we’re going to have to make some changes here.

![Market Customisation](../../.gitbook/assets/market/publish-page-before-edit.png)

Open up the `index.json` file from `content/publish/index.json` - here we change the text to explain that this form is for publishing photos.

![Market Customisation](../../.gitbook/assets/market/market-customisation-15.png)

Additionally, the asset type currently says dataset, and we need to change this so that it says photo. The simplest way to do this is to change the title of the asset type without changing anything else. Ocean can handle selling any digital asset that can be accessed via a URL, so no further changes are needed to accommodate selling photos.

Open up `src/components/Publish/Metadata/index.tsx` and change line 33 so that it says `Photo`.

![Market Customisation](../../.gitbook/assets/market/market-customisation-18.png)

Great, now our publish page explains that users should be publishing photos, and the photo is provided as an asset type option. We’ll also leave the algorithm as an option in case some data scientists want to do some analysis or image transformation on the photos.

![Market Customisation](../../.gitbook/assets/market/publish-page-2.png)

There is one more thing that is fun to change before we move away from the publish form. You’ll notice that Ocean Market V4 now has a cool SVG generation feature that creates the images for the Data NFT. It creates a series of pink waves. Let’s change this so that it uses our brand colors in the waves!

Open up `/src/@utils/SvgWaves.ts` and have a look at lines 27 to 30 where the colors are specified. Currently, the pink color is the one used in the SVG generator. You can replace this with your own brand color:

![Market Customisation](../../.gitbook/assets/market/market-customisation-21.png)

If you’re interested in doing some further customization, take a look at lines 53 to 64. You can change these properties to alter how the image looks. Feel free to play around with it. We’ve increased the number of layers from 4 to 5.

![Market Customisation](../../.gitbook/assets/market/market-customisation-22.png)

And now your customized publish page is ready for your customers:

![Market Customisation](../../.gitbook/assets/market/market-customisation-20.png)

## Advanced customization

This important step is the last thing that we will change in this guide. To set the marketplace fees and address, you’ll need to save them as environment variables. You'll also need to set the environment variables if you customized services like Aquarius, Provider, or Subgraph.

First, we are going to create a new file called `.env` in the root of your repository.
Copy and paste the following into the file:

```bash

NEXT_PUBLIC_MARKET_FEE_ADDRESS="0x123abc"
NEXT_PUBLIC_PUBLISHER_MARKET_ORDER_FEE="0.01"
NEXT_PUBLIC_PUBLISHER_MARKET_FIXED_SWAP_FEE="0.01"
NEXT_PUBLIC_CONSUME_MARKET_ORDER_FEE="0.01"
NEXT_PUBLIC_CONSUME_MARKET_FIXED_SWAP_FEE="0.01"

#
# ADVANCED SETTINGS
#

# Toggle pricing options presented during price creation
#NEXT_PUBLIC_ALLOW_FIXED_PRICING="true"
#NEXT_PUBLIC_ALLOW_FREE_PRICING="true"

# Privacy Preference Center
#NEXT_PUBLIC_PRIVACY_PREFERENCE_CENTER="true"

# Development Preference Center
#NEXT_PUBLIC_PROVIDER_URL="http://xxx:xxx"
#NEXT_PUBLIC_SUBGRAPH_URI="http://xxx:xxx"
#NEXT_PUBLIC_METADATACACHE_URI="http://xxx:xxx"
#NEXT_PUBLIC_RPC_URI="http://xxx:xxx"

```

### Change the Fee Address

At this point, we have made a lot of changes and hopefully you’re happy with the way that your marketplace is looking. Given that you now have your own awesome photo marketplace, it’s about time we talked about monetizing it. Yup, that’s right - you will earn a [commission](../contracts/fees.md) when people buy and sell photos in your marketplace. In Ocean V4, there are a whole host of new [fees](../contracts/fees.md) and customization options that you can use. To receive the fees, you’ll need to set the address that they should be sent to.

When someone sets the pricing for their photos in your marketplace, they are informed that a commission will be sent to the owner of the marketplace. You’ll see that, at the moment, this fee is set to zero, so you’ll want to increase that.

You need to replace “0x123abc” with your Ethereum address (this is where the fees will be sent).

### Change the Fees Values

You can also alter the fees to the levels that you want them to be. If you change your mind, these fees can always be altered later.

Go to the [Fees page](../contracts/fees.md) for more details about each type of fee and its relevance.

![Market Customisation](../../.gitbook/assets/market/market-customisation-23.png)

It is important that the file is saved in the right place at the root of your repository; your file structure should look the same as below.

![Market Customisation](../../.gitbook/assets/market/market-customisation-24.png)

And that’s it: you now have a fully functioning photo marketplace that operates over the blockchain. Every time someone uses it, you will receive revenue.

![Market Customisation](../../.gitbook/assets/market/market-customisation-25.png)

### Using a custom Provider

You have the flexibility to tailor Ocean Market according to your preferences by directing it to a predetermined custom [provider](https://github.com/oceanprotocol/provider/) deployment. This customization option allows you to choose a specific default provider, in addition to the option of manually specifying it when publishing an asset. To make use of this feature, you need to uncomment the designated line and modify the URL for your custom provider in the previously generated `.env` file. Look for the key labeled `NEXT_PUBLIC_PROVIDER_URL` and update its associated URL accordingly.

### Using a custom MetadataCache

If you intend to utilize Ocean Market with a custom [Aquarius](../aquarius/README.md) deployment, you can also set a custom MetadataCache. To do this, you will need to update the same file mentioned earlier. However, instead of modifying the `NEXT_PUBLIC_PROVIDER_URL` key, you should update the `NEXT_PUBLIC_METADATACACHE_URI` key.
By updating this key, you can specify the URI for your custom Aquarius deployment, enabling you to use Ocean Market with your preferred metadata cache setup.

### Using a custom subgraph

Using a custom subgraph with Ocean Market requires additional steps because of how subgraphs are deployed. Unlike the Provider and Aquarius services, where a single deployment can serve multiple networks, each network supported by Ocean Market has a separate subgraph deployment, so the subgraph requires specific handling for each network.

To utilize a custom subgraph, implement the additional logic within the `getOceanConfig` function located in `src/utils/ocean.ts`. By modifying this function, you can make the market use the desired custom subgraph for each selected network. This is particularly relevant if your market supports multiple networks and you do not want to enforce the same subgraph across all of them. If that scenario doesn't apply to you, there is a simpler approach: as in the previous examples, update the `NEXT_PUBLIC_SUBGRAPH_URI` key in the `.env` file. This configures the market to use your preferred subgraph deployment without any per-network logic. diff --git a/building-with-ocean/build-a-marketplace/deploying-market.md b/developers/build-a-marketplace/deploying-market.md similarity index 68% rename from building-with-ocean/build-a-marketplace/deploying-market.md rename to developers/build-a-marketplace/deploying-market.md index 9a95a548..b002249b 100644 --- a/building-with-ocean/build-a-marketplace/deploying-market.md +++ b/developers/build-a-marketplace/deploying-market.md @@ -1,41 +1,42 @@
---
title: Deployment of Ocean Market
-order: 3
-hideLanguageSelector: true
-featuredImage: images/creatures/mantaray/mantaray-full@2x.png
description: Step by step guide to a quick deployment of Ocean Market
---

-# Deploying Market
+# Build and host your Data Marketplace

All that’s left is for you to host your data marketplace and start sharing it with your future users.

-## **Build and host your Data Marketplace**
+## **Build and host your marketplace using surge**

To host your data marketplace, you need to run the build command:

-```
+```bash
npm run build:static
```

-This takes a few minutes to run. While this is running, you can get prepared to host your new data marketplace. You have many options for hosting your data marketplace (including AWS S3, Vercel, Netlify and many more). In this guide, we will demonstrate how to host it with surge, which is completely free and very easy to use.
+This takes a few minutes to run. While this is running, you can get prepared to host your new data marketplace. You have many options for hosting your data marketplace (including AWS S3, Vercel, Netlify and many more). In this guide, we will demonstrate how to host it with surge, which is completely free and very easy to use.
You can also refer to this [tutorial](../../infrastructure/deploying-marketplace.md) from the infrastructure section if you want to deploy your market on your own infrastructure using Docker.

Open up a new terminal window and run the following command to install surge:

-```
+```bash
npm install --global surge
```

When this is complete, navigate back to the terminal window that is building your finished data marketplace. Once the build is completed, enter the following commands to enter the public directory and host it:

-```
+```bash
cd out
```

-```
+```bash
surge
```

-If this is your first time using surge, you will be prompted to enter an email address and password to create a free account. It will ask you to confirm the directory that it is about to publish, check that you are in the market/public/ directory and press enter to proceed. Now it gives you the option to choose the domain that you want your project to be available on. We have chosen https://crypto-photos.surge.sh which is a free option. You can also set a CNAME value in your DNS to make use of your own custom domain.
+If this is your first time using surge, you will be prompted to enter an email address and password to create a free account. It will ask you to confirm the directory that it is about to publish; check that you are in the `out` directory and press enter to proceed. Now it gives you the option to choose the domain that you want your project to be available on.
*Figure: surge interaction*
+ +We have chosen https://crypto-photos.surge.sh which is a free option. You can also set a CNAME value in your DNS to make use of your own custom domain. After a few minutes, your upload will be complete, and you’re ready to share your data marketplace. You can view the version we created in this guide [here](https://crypto-photos.surge.sh/). diff --git a/building-with-ocean/build-a-marketplace/forking-ocean-market.md b/developers/build-a-marketplace/forking-ocean-market.md similarity index 92% rename from building-with-ocean/build-a-marketplace/forking-ocean-market.md rename to developers/build-a-marketplace/forking-ocean-market.md index c9647603..6a4864af 100644 --- a/building-with-ocean/build-a-marketplace/forking-ocean-market.md +++ b/developers/build-a-marketplace/forking-ocean-market.md @@ -1,8 +1,5 @@ --- title: Forking Ocean Market -order: 1 -hideLanguageSelector: true -featuredImage: images/creatures/mantaray/mantaray-full@2x.png description: Forking and running Ocean Market locally. --- @@ -31,7 +28,7 @@ Installing the dependencies is a vital step for running the market. It’s a sup Enter the following command to install the dependencies: -``` +```bash npm install ``` @@ -41,18 +38,18 @@ This command will take a few minutes to complete and you’ll see some warnings At this point, you are ready to run your data marketplace for the first time. This is another straightforward step that requires just one command: -``` +```bash npm start ``` The above command will build the development bundle and run it locally. -![Forking Ocean Market](../../.gitbook/assets/market-forking-1.png) +
*Figure: Forking Ocean Market*
Great news - your marketplace has successfully been built and is now running locally. Let’s check it out! Open your browser and navigate to http://localhost:8000/. You’ll see that you have a full-on clone of Ocean Market running locally. Give it a go and test out publishing and consuming assets - everything works!

That’s all that’s required to get a clone of Ocean Market working. The whole process is made simple because your clone can happily use all the smart contracts and backend components that are maintained by the Ocean Protocol Foundation.

-![Forking Ocean Market](../../.gitbook/assets/market-forking-2.png)
*Figure: Forking Ocean Market*
So you’ve got a fully functioning marketplace at this point, which is pretty cool. But it doesn’t really look like your marketplace. Right now, it’s still just a clone of Ocean Market - the same branding, name, logo, etc. The next few steps focus on personalizing your marketplace. diff --git a/developers/community-monetization.md b/developers/community-monetization.md new file mode 100644 index 00000000..a3ceea85 --- /dev/null +++ b/developers/community-monetization.md @@ -0,0 +1,47 @@ +--- +description: How can you build a self sufficient project? +--- + +# Community Monetization + +Our intentions with all of the V4 updates are to ensure that your project is able to become self-sufficient and profitable in the long run (if that’s your aim). We love projects that are built on top of Ocean and we want to ensure that you are able to generate enough income to keep your project running well into the future. + +### 1. Publishing & Selling Data + +**Do you have data that you can monetize?** :thinking: + +Ocean V3 introduced the new crypto primitives of “data on-ramp” and “data off-ramp” via datatokens. The publisher creates ERC20 datatokens for a dataset (on-ramp). Then, anyone can access that dataset by acquiring and sending datatokens to the publisher via Ocean handshaking (data off-ramp). As a publisher, it’s in your best interest to create and publish useful data — datasets that people want to consume — because the more they consume the more you can **earn**. This is the heart of Ocean utility: connecting data publishers with data consumers :people\_hugging: + +The datasets can take one of many shapes. For AI use cases, they may be raw datasets, cleaned-up datasets, feature-engineered **data**, **AI models**, **AI model predictions**, or otherwise. (They can even be other forms of copyright-style IP such as **photos**, **videos**, or **music**!) Algorithms themselves may be sold as part of Ocean’s Compute-to-Data feature. + +The first opportunity of data NFTs is the potential to sell the base intellectual property (IP) as an exclusive license to others. This is akin to EMI selling the Beatles’ master tapes to Universal Music: whoever owns the masters has the right to create records, CDs, and digital [sub-licenses](../discover/glossary.md). It’s the same for data: as the data NFT owner you have the **exclusive right** to create ERC20 datatoken sub-licenses. With Ocean V4, this right is now transferable as a data NFT. You can sell these data NFTs in **OpenSea** and other NFT marketplaces. + +If you’re part of an established organization or a growing startup, you’ll also love the new role structure that comes with data NFTs. For example, you can specify a different address to collect [revenue](contracts/revenue.md) compared to the address that owns the NFT. It’s now possible to fully administer your project through these [roles](contracts/roles.md). + +**In short, if you have data to sell, then Ocean V4 gives you superpowers to scale up and manage your data project. We hope this enables you to bring your data to new audiences and increase your profits.** + +### 2. Running Your Own Data dApp + +We have always been super encouraging of anyone who wishes to build a dApp on top of Ocean or to fork Ocean Market and make their own data marketplace. With the V4 release, we have taken this to the next level and introduced more opportunities and even more fee customization options. 
Unlike in V3, where the fee collection was limited to the consume action with a fixed value of 0.1%, V4 empowers dApp operators like yourself to have greater flexibility and control over the fees you can charge. This means you can tailor the fee structure to suit your specific needs and ensure the sustainability of your project. **V4 smart contracts enable you to collect a fee not only on consume, but also on fixed-rate exchanges, and you can set the fee value yourself.** For more detailed information regarding the fees, we invite you to visit the [fees](contracts/fees.md) page.

Another new opportunity is using your own **ERC20** token in your dApp, where it’s used as the unit of exchange. This is fully supported and can be a great way to ensure the sustainability of your project.

### 3. Running Your Own Provider

Now this is a completely brand new opportunity to start generating [revenue](contracts/revenue.md) — running your own [provider](https://github.com/oceanprotocol/provider). We have been aware for a while now that many of you haven’t taken up the opportunity to run your own provider, and the reason seems obvious — there aren’t strong enough incentives to do so.

For those that aren’t aware, [Ocean Provider](provider/README.md) is the proxy service that’s responsible for encrypting/decrypting the data and streaming it to the consumer. It also validates if the user is allowed to access a particular data asset or service. It’s a crucial component in Ocean’s architecture.

As mentioned above, fees are now paid to the individual or organization running the provider whenever a user downloads a data asset. The fees for downloading an asset are set as a cost per MB. In addition, there is also a provider fee that is paid whenever a compute job is run, which is set as a price per minute.

The download and compute fees can both be set to any absolute amount, and you can also decide which token you want to receive the fees in — they don’t have to be in the same currency used in the consuming market. So for example, the provider fee could be a fixed rate of 5 USDT per 1000 MB of data downloaded, and this fee remains fixed in USDT even if the marketplace is using a completely different currency.

Additionally, provider fees are not limited to data consumption — they can also be used to charge for compute resources. So, for example, this means a provider can charge a fixed fee of 15 DAI to reserve compute resources for 1 hour. This has a huge upside for both the user and the provider host. From the user’s perspective, this means that they can now reserve a suitable amount of compute resources according to what they require. For the host of the provider, this presents another great opportunity to create an income.

**Benefits to the Ocean Community**

We’re always looking to give back to the Ocean community and collecting fees is an important part of that. As mentioned above, the Ocean Protocol Foundation retains the ability to implement community fees on data consumption. The tokens that we receive will either be burned or invested in the community via projects that they are building. These investments will take place either through [Data Farming](../rewards/df-intro.md), [Ocean Shipyard](https://oceanprotocol.com/shipyard), or Ocean Ventures.

Projects that utilize the Ocean token or H2O are subject to a 0.1% fee. In the case of projects that opt to use different tokens, an additional 0.1% fee will be applied.
We want to support marketplaces that use other tokens but we also recognize that they don’t bring the same wider benefit to the Ocean community, so we feel this small additional fee is proportionate. diff --git a/developers/compute-to-data/README.md b/developers/compute-to-data/README.md new file mode 100644 index 00000000..e2d1e526 --- /dev/null +++ b/developers/compute-to-data/README.md @@ -0,0 +1,44 @@ +--- +description: Monetise your data while preserving privacy +--- + +# Compute to data + +### Introduction + +Certain datasets, such as health records and personal information, are too sensitive to be directly sold. However, Compute-to-Data offers a solution that allows you to monetize these datasets while keeping the data private. Instead of selling the raw data itself, you can offer compute access to the private data. This means you have control over which algorithms can be run on your dataset. For instance, if you possess sensitive health records, you can permit an algorithm to calculate the average age of a patient without revealing any other details. + +Compute-to-Data effectively resolves the tradeoff between leveraging the benefits of private data and mitigating the risks associated with data exposure. It enables the data to remain on-premise while granting third parties the ability to perform specific compute tasks on it, yielding valuable results like statistical analysis or AI model development. + +Private data holds immense value as it can significantly enhance research and business outcomes. However, concerns regarding privacy and control often impede its accessibility. Compute-to-Data addresses this challenge by granting specific access to the private data without directly sharing it. This approach finds utility in various domains, including scientific research, technological advancements, and marketplaces where private data can be securely sold while preserving privacy. Companies can seize the opportunity to monetize their data assets while ensuring the utmost protection of sensitive information. + +Private data has the potential to drive groundbreaking discoveries in science and technology, with increased data improving the predictive accuracy of modern AI models. Due to its scarcity and the challenges associated with accessing it, private data is often regarded as the most valuable. By utilizing private data through Compute-to-Data, significant rewards can be reaped, leading to transformative advancements and innovative breakthroughs. + +{% hint style="info" %} +The Ocean Protocol provides a compute environment that you can access at the following address: [https://stagev4.c2d.oceanprotocol.com/](https://stagev4.c2d.oceanprotocol.com/). Feel free to explore and utilize this platform for your needs. 
+{% endhint %} + +We suggest reading these guides to get an understanding of how compute-to-data works: + +### Architecture & Overview Guides + +* [Architecture](compute-to-data-architecture.md) +* [Datasets & Algorithms](compute-to-data-datasets-algorithms.md) +* [Writing Algorithms](compute-to-data-algorithms.md) +* [Compute options](compute-options.md) + +### User Guides + +* [How to write compute to data algorithms](../../user-guides/compute-to-data/make-a-boss-c2d-algorithm.md) +* [How to publish a compute-to-data algorithm](../../user-guides/compute-to-data/publish-a-c2d-algorithm-nft.md) +* [How to publish a dataset for compute to data](../../user-guides/compute-to-data/publish-a-c2d-data-nft.md) + +### Developer Guides + +* [How to use compute to data with ocean.js](../ocean.js/cod-asset.md) +* [How to use compute to data with ocean.py](../ocean.py/compute-flow.md) + +### Infrastructure Deployment Guides + +* [Minikube Environment](../../infrastructure/compute-to-data-minikube.md) +* [Private docker registry](../../infrastructure/compute-to-data-docker-registry.md) diff --git a/developers/compute-to-data/compute-options.md b/developers/compute-to-data/compute-options.md new file mode 100644 index 00000000..ec097945 --- /dev/null +++ b/developers/compute-to-data/compute-options.md @@ -0,0 +1,169 @@ +--- +title: Compute Options +section: developers +description: Specification of compute options for assets in Ocean Protocol. +--- + +# Compute Options + +### Compute Options + +An asset categorized as a `compute type` incorporates additional attributes under the `compute object`. + +These attributes are specifically relevant to assets that fall within the compute category and are not required for assets classified under the `access type`. However, if an asset is designated as `compute`, it is essential to include these attributes to provide comprehensive information about the compute service associated with the asset. + +
| Attribute | Type | Description |
| --------- | ---- | ----------- |
| allowRawAlgorithm\* | boolean | If true, any passed raw text will be allowed to run. Useful for an algorithm drag & drop use case, but increases the risk of data escape through malicious user input. Should be false by default in all implementations. |
| allowNetworkAccess\* | boolean | If true, the algorithm job will have network access. |
| publisherTrustedAlgorithmPublishers\* | Array of string | If not defined, then any published algorithm is allowed. If an empty array, then no algorithm is allowed. If not empty, any algorithm published by one of the defined publishers is allowed. |
| publisherTrustedAlgorithms\* | Array of publisherTrustedAlgorithms | If not defined, then any published algorithm is allowed. If an empty array, then no algorithm is allowed. Otherwise, only the algorithms defined in the array are allowed (see below). |

\* Required

### Trusted Algorithms

The `publisherTrustedAlgorithms` is an array of objects that specifies which algorithms are permitted to run on the dataset. If it is not defined, any published algorithm is allowed. If the array is empty, no algorithms are allowed. If the array is not empty, only the algorithms listed in it are permitted.

The structure of each object within the `publisherTrustedAlgorithms` array is as follows:
| Attribute | Type | Description |
| --------- | ---- | ----------- |
| did | string | The DID of the algorithm which is trusted by the publisher. |
| filesChecksum | string | Hash of the algorithm's files (as a string). |
| containerSectionChecksum | string | Hash of the algorithm's image details (as a string). |

To produce `filesChecksum`, call the Provider FileInfoEndpoint with the parameter `withChecksum=True`. If the algorithm has multiple files, `filesChecksum` is a concatenated string of all the files' checksums (i.e. checksumFile1+checksumFile2, etc.).

To produce `containerSectionChecksum`:

{% code overflow="wrap" %}
```js
sha256(algorithm_ddo.metadata.algorithm.container.entrypoint + algorithm_ddo.metadata.algorithm.container.checksum);
```
{% endcode %}
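For illustration, here is a minimal Node.js sketch of that computation (the `container` argument stands in for `algorithm_ddo.metadata.algorithm.container`; the digest value shown is a placeholder):

```js
// Minimal sketch: derive containerSectionChecksum from a DDO's container
// section using Node's built-in crypto module.
const crypto = require('crypto')

function containerSectionChecksum(container) {
  return crypto
    .createHash('sha256')
    .update(container.entrypoint + container.checksum)
    .digest('hex')
}

// Example usage with a container object shaped like the ones on this page:
console.log(
  containerSectionChecksum({
    entrypoint: 'node $ALGO',
    checksum: 'sha256:xxxxx' // placeholder digest
  })
)
```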
+ +Compute Options Example + +Example: + +```json +{ + "services": [ + { + "id": "1", + "type": "access", + "files": "0x044736da6dae39889ff570c34540f24e5e084f...", + "name": "Download service", + "description": "Download service", + "datatokenAddress": "0x123", + "serviceEndpoint": "https://myprovider.com", + "timeout": 0 + }, + { + "id": "2", + "type": "compute", + "files": "0x6dd05e0edb460623c843a263291ebe757c1eb3...", + "name": "Compute service", + "description": "Compute service", + "datatokenAddress": "0x124", + "serviceEndpoint": "https://myprovider.com", + "timeout": 0, + "compute": { + "allowRawAlgorithm": false, + "allowNetworkAccess": true, + "publisherTrustedAlgorithmPublishers": ["0x234", "0x235"], + "publisherTrustedAlgorithms": [ + { + "did": "did:op:123", + "filesChecksum": "100", + "containerSectionChecksum": "200" + }, + { + "did": "did:op:124", + "filesChecksum": "110", + "containerSectionChecksum": "210" + } + ] + } + } + ] +} +``` + +
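To make the permission rules above concrete, here is a small sketch of how a dApp might decide whether an algorithm is allowed to run against a compute service. This is illustrative only (not part of any Ocean library), and a complete check would also compare `filesChecksum` and `containerSectionChecksum` against the algorithm's current state:

```js
// Sketch: apply the publisherTrustedAlgorithmPublishers /
// publisherTrustedAlgorithms rules described above.
function isAlgorithmAllowed(compute, algoDid, algoPublisher) {
  const publishers = compute.publisherTrustedAlgorithmPublishers
  const algos = compute.publisherTrustedAlgorithms

  // undefined => anything is allowed; empty array => nothing is allowed
  const publisherOk =
    publishers === undefined || publishers.includes(algoPublisher)
  const algoOk = algos === undefined || algos.some((a) => a.did === algoDid)

  return publisherOk && algoOk
}
```

With the example service above, only `did:op:123` and `did:op:124`, published by `0x234` or `0x235`, would pass this check.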
+ +### Consumer Parameters + +Sometimes, the asset needs additional input data before downloading or running a Compute-to-Data job. Examples: + +* The publisher needs to know the sampling interval before the buyer downloads it. Suppose the dataset URL is `https://example.com/mydata`. The publisher defines a field called `sampling` and asks the buyer to enter a value. This parameter is then added to the URL of the published dataset as query parameters: `https://example.com/mydata?sampling=10`. +* An algorithm that needs to know the number of iterations it should perform. In this case, the algorithm publisher defines a field called `iterations`. The buyer needs to enter a value for the `iterations` parameter. Later, this value is stored in a specific location in the Compute-to-Data pod for the algorithm to read and use it. + +The `consumerParameters` is an array of objects. Each object defines a field and has the following structure: + +
| Attribute | Type | Description |
| --------- | ---- | ----------- |
| name\* | string | The parameter name (this is sent as HTTP param or key towards algo) |
| type\* | string | The field type (text, number, boolean, select) |
| label\* | string | The field label which is displayed |
| required\* | boolean | If customer input for this field is mandatory. |
| description\* | string | The field description. |
| default\* | string, number, or boolean | The field default value. For select types, string key of default option. |
| options | Array of option | For select types, a list of options. |

\* **Required**

Each `option` is an `object` containing a single key: value pair where the key is the option name, and the value is the option value.
+ +Consumer Parameters Example + +```json +[ + { + "name": "hometown", + "type": "text", + "label": "Hometown", + "required": true, + "description": "What is your hometown?", + "default": "Nowhere" + }, + { + "name": "age", + "type": "number", + "label": "Age", + "required": false, + "description": "Please fill your age", + "default": 0 + }, + { + "name": "developer", + "type": "boolean", + "label": "Developer", + "required": false, + "description": "Are you a developer?", + "default": false + }, + { + "name": "languagePreference", + "type": "select", + "label": "Language", + "required": false, + "description": "Do you like NodeJs or Python", + "default": "nodejs", + "options": [ + { + "nodejs": "I love NodeJs" + }, + { + "python": "I love Python" + } + ] + } +] +``` + +
Algorithms will have access to a JSON file located at `/data/inputs/algoCustomData.json`, which contains the key/value input data defined by the consumer parameters. Example:
Key Value Example
```json
{
  "hometown": "São Paulo",
  "age": 10,
  "developer": true,
  "languagePreference": "nodejs"
}
```
+ +
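For example, a Node.js algorithm could read these values as follows (a minimal sketch, assuming the job was started with the consumer parameters shown above):

```js
// Minimal sketch: read the consumer parameters provided to the algorithm pod.
const fs = require('fs')

const params = JSON.parse(
  fs.readFileSync('/data/inputs/algoCustomData.json', 'utf8')
)

console.log(params.hometown) // "São Paulo"
console.log(params.languagePreference) // "nodejs"
```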
diff --git a/developers/compute-to-data/compute-to-data-algorithms.md b/developers/compute-to-data/compute-to-data-algorithms.md new file mode 100644 index 00000000..f9d14c64 --- /dev/null +++ b/developers/compute-to-data/compute-to-data-algorithms.md @@ -0,0 +1,285 @@ +--- +title: Writing Algorithms for Compute to Data +description: >- + Learn how to write algorithms for use in Ocean Protocol's Compute-to-Data + feature. +--- + +In the Ocean Protocol stack, algorithms are recognized as distinct asset types, alongside datasets. When it comes to Compute-to-Data, an algorithm comprises the following key components: + +* **Algorithm Code**: The algorithm code refers to the specific instructions and logic that define the computational steps to be executed on a dataset. It encapsulates the algorithms' functionalities, calculations, and transformations. +* **Docker Image**: A Docker image plays a crucial role in encapsulating the algorithm code and its runtime dependencies. It consists of a base image, which provides the underlying environment for the algorithm, and a corresponding tag that identifies a specific version or variant of the image. +* **Entry Point**: The entry point serves as the starting point for the algorithm's execution within the compute environment. It defines the initial actions to be performed when the algorithm is invoked, such as loading necessary libraries, setting up configurations, or calling specific functions. + +Collectively, these components form the foundation of an algorithm in the context of Compute-to-Data. + +### Environment + +When creating an algorithm asset in Ocean Protocol, it is essential to include the additional algorithm object in its metadata service. This algorithm object plays a crucial role in defining the Docker container environment associated with the algorithm. By specifying the necessary details within the algorithm object, such as the base image, tags, runtime configurations, and dependencies, the metadata service ensures that the algorithm asset is properly configured for execution within a Docker container. + +
+ +Environment Object Example + +{% code overflow="wrap" %} +```json +{ "algorithm": { "container": { "entrypoint": "node $ALGO", "image": "node", "tag": "latest" } } } +``` +{% endcode %} + +
+ +
| Variable | Usage |
| -------- | ----- |
| image | The Docker image name the algorithm will run with. |
| tag | The Docker image tag that you are going to use. |
| entrypoint | The Docker entrypoint. `$ALGO` is a macro that gets replaced inside the compute job, depending on where your algorithm code is downloaded. |

Define your entry point according to your dependencies. E.g. if you have multiple versions of Python installed, use the appropriate command, such as `python3.6 $ALGO`.

#### What Docker container should I use?

There are plenty of Docker containers that work out of the box. However, if you have custom dependencies, you may want to configure your own Docker image. To do so, create a Dockerfile with the appropriate instructions for dependency management and publish the container, e.g. using Docker Hub.

We also collect some [example images](https://github.com/oceanprotocol/algo_dockers), which you can also view on Docker Hub.

When publishing an algorithm through the [Ocean Market](https://market.oceanprotocol.com), these properties can be set via the publish UI.
+ +Environment Examples + +Run an algorithm written in JavaScript/Node.js, based on Node.js v14: + +```json +{ + "algorithm": { + "container": { + "entrypoint": "node $ALGO", + "image": "node", + "tag": "14" + } + } +} +``` + +Run an algorithm written in Python, based on Python v3.9: + +```json +{ + "algorithm": { + "container": { + "entrypoint": "python3.9 $ALGO", + "image": "python", + "tag": "3.9.4-alpine3.13" + } + } +} +``` + +
#### Data Storage

As part of a compute job, every algorithm runs in a K8s pod with these volumes mounted:

| Path | Permissions | Usage |
| --------------- | ----------- | ----------- |
| `/data/inputs` | read | Storage for input datasets, accessible only to the algorithm running in the pod. Contents will be the files themselves, inside indexed folders e.g. `/data/inputs/{did}/{service_id}`. |
| `/data/ddos` | read | Storage for all DDOs involved in the compute job (input datasets + algorithm). Contents will be JSON files containing the DDO structure. |
| `/data/outputs` | read/write | Storage for all of the algorithm's output files. They are uploaded on some form of cloud storage, and URLs are sent back to the consumer. |
| `/data/logs/` | read/write | All algorithm output (such as `print`, `console.log`, etc.) is stored in a file located in this folder. They are stored and sent to the consumer as well. |

Please note that when using local Providers or Metadata Caches, the DDOs might not be correctly transferred into C2D, but inputs are still available. If your algorithm relies on contents from the DDO JSON structure, make sure to use a public Provider and Metadata Cache (Aquarius instance).

#### Environment variables available to algorithms

For every algorithm pod, the Compute to Data environment provides the following environment variables:

| Variable | Usage |
| -------- | ----- |
| `DIDS` | An array of DID strings containing the input datasets. |
| `TRANSFORMATION_DID` | The DID of the algorithm. |
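For instance, an algorithm can read these variables as follows (a minimal Node.js sketch; `DIDS` arrives as a JSON-encoded array of strings):

```js
// Minimal sketch: read the environment variables provided to the algorithm pod.
const dids = JSON.parse(process.env.DIDS || '[]') // DIDs of the input datasets
const algoDid = process.env.TRANSFORMATION_DID // DID of this algorithm

console.log(`Running ${algoDid} over ${dids.length} input dataset(s)`)
```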
+ +
Example: JavaScript/Node.js

The following is a simple JavaScript/Node.js algorithm, doing a line count for ALL input datasets. The algorithm is not using any environment variables, but instead it's scanning the `/data/inputs` folder.

```js
const fs = require('fs')

const inputFolder = '/data/inputs'
const outputFolder = '/data/outputs'

async function countrows(file) {
  console.log('Start counting for ' + file)
  const fileBuffer = fs.readFileSync(file)
  const toString = fileBuffer.toString()
  const splitLines = toString.split('\n')
  const rows = splitLines.length - 1
  fs.appendFileSync(outputFolder + '/output.log', file + ',' + rows + '\r\n')
  console.log('Finished. We have ' + rows + ' lines')
}

async function processfolder(folder) {
  const files = fs.readdirSync(folder)

  // NOTE: `let` is required here — `const` would throw on the `i++` reassignment
  for (let i = 0; i < files.length; i++) {
    const file = files[i]
    const fullpath = folder + '/' + file
    if (fs.statSync(fullpath).isDirectory()) {
      await processfolder(fullpath)
    } else {
      await countrows(fullpath)
    }
  }
}

processfolder(inputFolder)
```

This snippet will create and expose the following files as compute job results to the consumer:

* `/data/outputs/output.log`
* `/data/logs/algo.log`

To run this, use the following container object:

```json
{
  "algorithm": {
    "container": {
      "entrypoint": "node $ALGO",
      "image": "node",
      "tag": "12"
    }
  }
}
```
+ +
+ +Example: Python + +A more advanced line counting in Python, which relies on environment variables and constructs a job object, containing all the input files & DDOs + +```python +import pandas as pd +import numpy as np +import os +import time +import json + +def get_job_details(): + """Reads in metadata information about assets used by the algo""" + job = dict() + job['dids'] = json.loads(os.getenv('DIDS', None)) + job['metadata'] = dict() + job['files'] = dict() + job['algo'] = dict() + job['secret'] = os.getenv('secret', None) + algo_did = os.getenv('TRANSFORMATION_DID', None) + if job['dids'] is not None: + for did in job['dids']: + # get the ddo from disk + filename = '/data/ddos/' + did + print(f'Reading json from {filename}') + with open(filename) as json_file: + ddo = json.load(json_file) + # search for metadata service + for service in ddo['service']: + if service['type'] == 'metadata': + job['files'][did] = list() + index = 0 + for file in service['attributes']['main']['files']: + job['files'][did].append( + '/data/inputs/' + did + '/' + str(index)) + index = index + 1 + if algo_did is not None: + job['algo']['did'] = algo_did + job['algo']['ddo_path'] = '/data/ddos/' + algo_did + return job + + +def line_counter(job_details): + """Executes the line counter based on inputs""" + print('Starting compute job with the following input information:') + print(json.dumps(job_details, sort_keys=True, indent=4)) + + """ Now, count the lines of the first file in first did """ + first_did = job_details['dids'][0] + filename = job_details['files'][first_did][0] + non_blank_count = 0 + with open(filename) as infp: + for line in infp: + if line.strip(): + non_blank_count += 1 + print ('number of non-blank lines found %d' % non_blank_count) + """ Print that number to output to generate algo output""" + f = open("/data/outputs/result", "w") + f.write(str(non_blank_count)) + f.close() + + +if __name__ == '__main__': + line_counter(get_job_details()) + +``` + +To run this algorithm, use the following `container` object: + +```json +{ + "algorithm": { + "container": { + "entrypoint": "python3.6 $ALGO", + "image": "oceanprotocol/algo_dockers", + "tag": "python-sql" + } + } +} +``` + +
+ +#### Algorithm Metadata + +An asset of type `algorithm` has additional attributes under `metadata.algorithm`, describing the algorithm and the Docker environment it is supposed to be run under. + +
| Attribute | Type | Description |
| --------- | ---- | ----------- |
| language | string | Language used to implement the software. |
| version | string | Version of the software, preferably in SemVer notation. E.g. 1.0.0. |
| consumerParameters | Consumer Parameters | An array of objects that defines required consumer input before running the algorithm. |
| container\* | container | Object describing the Docker container image. See below. |

\* Required

The `container` object has the following attributes defining the Docker image for running the algorithm:

| Attribute | Type | Description |
| --------- | ---- | ----------- |
| entrypoint\* | string | The command to execute, or script to run inside the Docker image. |
| image\* | string | Name of the Docker image. |
| tag\* | string | Tag of the Docker image. |
| checksum\* | string | Digest of the Docker image (e.g. sha256:xxxxx). |
+ +\* Required + +
+ +Algorithm Metadata Example + +{% code overflow="wrap" %} +```json +{ + "metadata": { + "created": "2020-11-15T12:27:48Z", + "updated": "2021-05-17T21:58:02Z", + "description": "Sample description", + "name": "Sample algorithm asset", + "type": "algorithm", + "author": "OPF", + "license": "https://market.oceanprotocol.com/terms", + "algorithm": { "language": "Node.js", "version": "1.0.0", + "container": { + "entrypoint": "node $ALGO", + "image": "ubuntu", + "tag": "latest", + "checksum": "sha256:44e10daa6637893f4276bb8d7301eb35306ece50f61ca34dcab550" + }, + "consumerParameters": {} + } + } +} +``` +{% endcode %} + +
diff --git a/developers/compute-to-data/compute-to-data-architecture.md b/developers/compute-to-data/compute-to-data-architecture.md
new file mode 100644
index 00000000..e95085d3
--- /dev/null
+++ b/developers/compute-to-data/compute-to-data-architecture.md
@@ -0,0 +1,113 @@
+---
+title: Compute-to-Data
+description: Architecture overview
+---
+
+# Architecture
+
+Compute-to-Data (C2D) is a cutting-edge data processing paradigm that enables secure and privacy-preserving computation on sensitive datasets.
+
+In the C2D workflow, the following steps are performed:
+
+1. The consumer initiates a compute-to-data job by selecting the desired data asset and algorithm; the orders are then validated by the dApp being used.
+2. A dedicated and isolated execution pod is created for the C2D job.
+3. The execution pod loads the specified algorithm into its environment.
+4. The execution pod securely loads the selected dataset for processing.
+5. The algorithm is executed on the loaded dataset within the isolated execution pod.
+6. The results and logs generated by the algorithm are securely returned to the user.
+7. The execution pod deletes the dataset, algorithm, and itself to ensure data privacy and security.
+

Compute architecture overview

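+
+Before looking at each component, here is a rough consumer-side sketch of this flow using ocean.py. This is a hypothetical sketch: `ocean`, `bob`, `DATA_asset`, `ALGO_asset` and `compute_env_id` are placeholders, and exact function signatures may differ between releases, so consult the ocean.py compute flow for the authoritative API.
+
+```python
+# Hypothetical consumer-side sketch of the C2D flow above, loosely following
+# the ocean.py compute flow. All names are placeholders.
+from ocean_lib.models.compute_input import ComputeInput
+
+compute_service = DATA_asset.services[0]  # the dataset's compute service
+
+# Conceptually: start(did, algorithm, additionalDIDs)
+job_id = ocean.compute.start(
+    consumer_wallet=bob,
+    dataset=ComputeInput(DATA_asset, compute_service),
+    compute_environment=compute_env_id,
+    algorithm=ComputeInput(ALGO_asset, ALGO_asset.services[0]),
+)
+
+# Conceptually: getJobDetails(XXXX)
+status = ocean.compute.status(DATA_asset, compute_service, job_id, bob)
+print(status)
+```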
+
+The interaction between the Consumer and the Provider follows a specific workflow. To initiate the process, the Consumer contacts the Provider by invoking the `start(did, algorithm, additionalDIDs)` function with parameters such as the data identifier (DID), algorithm, and additional DIDs if required.
+
+Upon receiving this request, the Provider generates a unique job identifier (`XXXX`) and returns it to the Consumer. The Provider then assumes the responsibility of overseeing the remaining steps.
+
+Throughout the computation process, the Consumer has the ability to check the status of the job by making a query to the Provider using the `getJobDetails(XXXX)` function, providing the job identifier (`XXXX`) as a reference.
+
+{% hint style="info" %}
+You have the option to initiate a compute job using one or more data assets. You can explore this functionality by utilizing the [ocean.py](../ocean.py/README.md) and [ocean.js](../ocean.js/README.md) libraries.
+{% endhint %}
+
+Now, let's delve into the inner workings of the Provider. Initially, it verifies whether the Consumer has sent the appropriate datatokens to gain access to the desired data. Once validated, the Provider interacts with the Operator-Service, a microservice responsible for coordinating the job execution. The Provider submits a request to the Operator-Service, which subsequently forwards the request to the Operator-Engine, the actual compute system in operation.
+
+The Operator-Engine, equipped with functionalities like running Kubernetes compute jobs, carries out the necessary computations as per the requirements. Throughout the computation process, the Operator-Engine informs the Operator-Service of the job's progress. Finally, when the job reaches completion, the Operator-Engine signals the Operator-Service, ensuring that the Provider receives notification of the job's successful conclusion.
+
+Here are the actors/components:
+
+* Consumers - The end users who consume the computing services offered by the data's Publisher.
+* Operator-Service - Micro-service that handles the compute requests.
+* Operator-Engine - The computing system where the compute jobs are executed.
+* Kubernetes - a K8s cluster
+
+Before the flow can begin, these pre-conditions must be met:
+
+* The Asset DDO has a `compute` service.
+* The Asset DDO compute service must permit algorithms to run on it.
+* The Asset DDO must specify an Ocean Provider endpoint exposed by the Publisher.
+
+### Access Control using Ocean Provider
+
+Similar to the `access service`, the `compute service` within Ocean Protocol relies on the [Ocean Provider](../provider/README.md), which is a crucial component managed by the asset Publishers. The role of the Ocean Provider is to facilitate interactions with users and handle the fundamental aspects of a Publisher's infrastructure, enabling seamless integration into the Ocean Protocol ecosystem. It serves as the primary interface for direct interaction with the infrastructure where the data is located.
+
+The [Ocean Provider](../provider/README.md) encompasses the necessary credentials to establish secure and authorized interactions with the underlying infrastructure. Initially, this infrastructure may be hosted in cloud providers, although it also has the flexibility to extend to on-premise environments if required.
By encompassing the necessary credentials, the Ocean Provider ensures smooth and controlled access to the infrastructure, allowing Publishers to effectively leverage the compute service within Ocean Protocol.
+
+### Operator Service
+
+The **Operator Service** is a micro-service in charge of managing the workflows that execute compute requests.
+
+The main responsibilities are:
+
+* Expose an HTTP API allowing for the execution of data access and compute endpoints.
+* Interact with the infrastructure (cloud/on-premise) using the Publisher's credentials.
+* Start/stop/execute computing instances with the algorithms provided by users.
+* Retrieve the logs generated during executions.
+
+Typically the Operator Service is integrated with the Ocean Provider, but it can also be called independently of it.
+
+The Operator Service is in charge of establishing communication with the K8s cluster, allowing it to:
+
+* Register new compute jobs
+* List the current compute jobs
+* Get a detailed result for a given job
+* Stop a running job
+
+The Operator Service doesn't provide any storage capability; all state is stored directly in the K8s cluster.
+
+### Operator Engine
+
+The **Operator Engine** is in charge of orchestrating the compute infrastructure using Kubernetes as its backend, where each compute job runs in an isolated [Kubernetes Pod](https://kubernetes.io/docs/concepts/workloads/pods/). Typically, the Operator Engine retrieves the workflows created by the Operator Service in Kubernetes and manages the infrastructure necessary to complete the execution of the compute workflows.
+
+The Operator Engine is in charge of retrieving all the workflows registered in a K8s cluster, allowing it to:
+
+* Orchestrate the flow of the execution
+* Start the configuration pod in charge of downloading the workflow dependencies (datasets and algorithms)
+* Start the pod containing the algorithm to execute
+* Start the publishing pod that publishes the new assets created in the Ocean Protocol network.
+
+Like the Operator Service, the Operator Engine doesn't provide any storage capability; all state is stored directly in the K8s cluster.
+
+### Pod Configuration
+
+The Pod-Configuration repository works hand in hand with the Operator Engine, playing a vital role in the initialization phase of a job. It carries out essential functions that establish the environment for job execution.
+
+At the core of the Pod-Configuration is a Node.js script that manages the setup process when a job begins within the Operator Engine. Its primary responsibility is fetching and preparing the required assets and files so that the job can execute smoothly:
+
+1. **Fetching Dataset Assets**: It fetches the files corresponding to datasets and saves them in the location `/data/inputs/DID/`. The files are named based on their array index, ranging from 0 to X, depending on the total number of files associated with the dataset.
+2. **Fetching Algorithm Files**: The script then retrieves the algorithm files and stores them in the `/data/transformations/` directory. The first file is named 'algorithm', and the subsequent files are indexed from 1 to X, based on the number of files present for the algorithm.
+3. **Fetching DDOs**: Additionally, the Pod-Configuration fetches the DDOs (Decentralized Data Objects) of the datasets and the algorithm and saves them to disk at the location `/data/ddos/`.
+4. **Error Handling**: In case of any provisioning failures, whether during data fetching or algorithm processing, the script updates the job status in a PostgreSQL database and logs the relevant error messages.
+
+Upon successful completion of its tasks, the Pod-Configuration concludes its operations and signals the Operator Engine to start the algorithm pod for the subsequent steps. By efficiently managing assets and algorithm files, and by handling provisioning errors, the Pod-Configuration lays the foundation for the smooth execution of the rest of the workflow.
+
+### Pod Publishing
+
+Pod Publishing is a command-line utility that integrates with the Operator Service and Operator Engine within a Kubernetes-based compute infrastructure. It processes, logs, and uploads workflow outputs, enabling easy and reliable handling of the output data generated during computation tasks.
+
+The primary functionality of Pod Publishing can be divided into three key areas:
+
+1. **Interaction with Operator Service**: Pod Publishing uploads the outputs of compute workflows initiated by the Operator Service to a designated AWS S3 bucket or the InterPlanetary File System (IPFS). It logs all processing steps and updates a PostgreSQL database.
+2. **Role in Publishing Pod**: Within the compute infrastructure orchestrated by the Operator Engine on Kubernetes (K8s), Pod Publishing is integral to the Publishing Pod. The Publishing Pod handles the creation of new assets in the Ocean Protocol network after a workflow execution.
+3. **Workflow Outputs Management**: Pod Publishing manages the storage of workflow outputs. Depending on configuration, it interacts with IPFS or AWS S3, and logs the processing steps.
+
+{% hint style="info" %}
+* Pod Publishing does not provide storage capabilities; all state information is stored directly in the K8s cluster or the respective data storage solution (AWS S3 or IPFS).
+* The utility works in close coordination with the Operator Service and Operator Engine, but does not have standalone functionality.
+{% endhint %}
diff --git a/developers/compute-to-data/compute-to-data-datasets-algorithms.md b/developers/compute-to-data/compute-to-data-datasets-algorithms.md
new file mode 100644
index 00000000..3d1a2f17
--- /dev/null
+++ b/developers/compute-to-data/compute-to-data-datasets-algorithms.md
@@ -0,0 +1,27 @@
+---
+title: Compute-to-Data
+description: Datasets and Algorithms
+---
+
+# Datasets & Algorithms
+
+### Datasets & Algorithms
+
+Compute-to-Data introduces a paradigm where datasets remain securely within the premises of the data holder, ensuring strict data privacy and control.
Only authorized algorithms are granted access to operate on these datasets, subject to specific conditions, within a secure and isolated environment. In this context, algorithms are treated as valuable assets, comparable to datasets, and can be priced accordingly. This approach enables data holders to maintain control over their sensitive data while allowing valuable computations to be performed on them, fostering a balanced and secure data ecosystem.
+
+To define the accessibility of algorithms, their classification as either public or private can be specified by setting the `attributes.main.type` value in the Decentralized Data Object (DDO):
+
+* `"access"` - public. The algorithm can be downloaded, given the appropriate datatoken.
+* `"compute"` - private. The algorithm is only available to use as part of a compute job, without any way to download it. The algorithm must be published on the same Ocean Provider as the dataset it's targeted to run on.
+
+This flexibility allows for fine-grained control over algorithm usage, ensuring data privacy and enabling fair pricing mechanisms within the Compute-to-Data framework.
+
+For each dataset, Publishers have the flexibility to define permission levels for algorithms to execute on their datasets, offering granular control over data access.
+
+There are several options available for publishers to configure these permissions (see the sketch below):
+
+* allow selected algorithms, referenced by their DID
+* allow all algorithms published within a network or marketplace
+* allow raw algorithms, for advanced use cases that bypass publishing the algorithm as an asset; this option is the most prone to data escape
+
+All implementations default to `private`, meaning that no algorithms are allowed to run on a compute dataset upon publishing. This precautionary measure helps prevent data leakage by thwarting rogue algorithms that could be designed to extract all data from a dataset. By establishing private permissions as the default setting, publishers ensure a robust level of protection for their data assets and mitigate the risk of unauthorized data access.
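+
+For illustration, these permissions end up in the DDO's `compute` service, roughly along the following lines. This is a sketch only: field names follow the compute options schema and all values are placeholders.
+
+```python
+# Illustrative sketch of a DDO compute service with publisher permissions.
+# Field names follow Ocean's compute-options schema; values are placeholders.
+compute_service = {
+    "type": "compute",
+    "serviceEndpoint": "https://provider.example.com",  # the Publisher's Provider
+    "compute": {
+        "allowRawAlgorithm": False,       # raw algorithms: most prone to data escape
+        "allowNetworkAccess": True,
+        # allow all algorithms from selected publishers...
+        "publisherTrustedAlgorithmPublishers": ["0x..."],
+        # ...and/or only selected algorithms, referenced by their DID
+        "publisherTrustedAlgorithms": [
+            {
+                "did": "did:op:...",
+                "filesChecksum": "...",
+                "containerSectionChecksum": "...",
+            }
+        ],
+    },
+}
+```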
diff --git a/building-with-ocean/compute-to-data/README.md b/developers/compute-to-data/compute-to-data.md similarity index 93% rename from building-with-ocean/compute-to-data/README.md rename to developers/compute-to-data/compute-to-data.md index fce9f52a..d3bfff50 100644 --- a/building-with-ocean/compute-to-data/README.md +++ b/developers/compute-to-data/compute-to-data.md @@ -25,7 +25,7 @@ The most basic scenario for a Publisher is to provide access to the datasets the * [Compute-to-Data architecture](compute-to-data-architecture.md) * [Tutorial: Writing Algorithms](compute-to-data-algorithms.md) -* [Tutorial: Set Up a Compute-to-Data Environment](compute-to-data-minikube.md) +* [Tutorial: Set Up a Compute-to-Data Environment](../../infrastructure/compute-to-data-minikube.md) * [Use Compute-to-Data in Ocean Market](https://blog.oceanprotocol.com/compute-to-data-is-now-available-in-ocean-market-58868be52ef7) * [Build ML models via Ocean Market or Python](https://medium.com/ravenprotocol/machine-learning-series-using-logistic-regression-for-classification-in-oceans-compute-to-data-18df49b6b165) * [Compute-to-Data Python Quickstart](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/c2d-flow.md) diff --git a/building-with-ocean/compute-to-data/user-defined-parameters.md b/developers/compute-to-data/user-defined-parameters.md similarity index 87% rename from building-with-ocean/compute-to-data/user-defined-parameters.md rename to developers/compute-to-data/user-defined-parameters.md index c7f510d3..4ae86f47 100644 --- a/building-with-ocean/compute-to-data/user-defined-parameters.md +++ b/developers/compute-to-data/user-defined-parameters.md @@ -12,8 +12,8 @@ Ocean Protocol allows dataset buyers to provide custom parameters that can be us There 2 are types of parameters that asset publishers can support: -* User defined parameters -* Algorithm custom parameters +- User defined parameters +- Algorithm custom parameters ### Publish a dataset that uses custom parameters @@ -25,8 +25,8 @@ For example, if the publisher has published an URL `https://example.com` which s Suppose the publisher defines the following 2 parameters: -* `location`: A string indicating region code -* `type`: A string indicating the type of weather data. It can be temperature/humidity/pressure. +- `location`: A string indicating region code +- `type`: A string indicating the type of weather data. It can be temperature/humidity/pressure. Suppose the buyer wants to download the temperature data in the region code `XYZ`. While downloading the data, the buyer enters the desired parameter values using ocean.py. @@ -63,9 +63,9 @@ def serve_content(): The publisher now must provide the file URL as `https://example.org` while publishing the asset, as shown in the below image. -![](../../.gitbook/assets/compute-to-data-parameters-publish-dataset.png) +![Compute to data parameters](../../.gitbook/assets/c2d/compute-to-data-parameters-publish-dataset.png) -For a complete tutorial on publishing asset using Ocean Marketplace read [our guide on publishing with Ocean Market](../../using-ocean-market/marketplace-publish-data-asset.md). +For a complete tutorial on publishing asset using Ocean Marketplace read [our guide on publishing with Ocean Market](../../user-guides//publish-data-nfts.md). 
### Publish an algorithm that uses custom parameters @@ -105,23 +105,22 @@ with open(os.path.join(output_dir, "result"), "w") as f: The publisher now must provide the file URL as `https://example.org` while publishing the algorithm asset, as shown in the below image. -![](../../.gitbook/assets/compute-to-data-parameters-publish-algorithm.png) +![Publish algorithm asset](../../.gitbook/assets/c2d/compute-to-data-parameters-publish-algorithm.png) -For a complete tutorial on publishing asset using Ocean Marketplace read [our guide on publishing with Ocean Market](../../using-ocean-market/marketplace-publish-data-asset.md). +For a complete tutorial on publishing asset using Ocean Marketplace read [our guide on publishing with Ocean Market](../../user-guides/publish-data-nfts.md). ### Starting compute job with custom parameters In this example, the buyer wants to run the algorithm with certain parameters on a selected dataset. The code snippet below shows how the buyer can start the compute job with custom parameter values. Before embarking on this tutorial you should familiarize yourself with how to: -* Search for a dataset using [Ocean market](https://market.oceanprotocol.com/) or [Aquarius API](../../api-references/aquarius-rest-api.md) -* [Allow an algorithm to run on the dataset](https://github.com/oceanprotocol/ocean.py/blob/6eb068df338abc7376430cc5ba7fe2d381508328/READMEs/c2d-flow.md#5-alice-allows-the-algorithm-for-c2d-for-that-data-asset) -* Buy datatokens using [Ocean market](https://market.oceanprotocol.com/) or [ocean.py](https://github.com/oceanprotocol/ocean.py) -* [Set up ocean.py](../using-ocean-libraries/configuration.md) - -For configuring ocean.py/ocean.js, please refer this [guide](../using-ocean-libraries/configuration.md). Copy the below code snippet to a file locally after completing required configurations and execute the script. +- Search for a dataset using [Ocean market](https://market.oceanprotocol.com/) or [Aquarius](../aquarius/README.md) +- [Allow an algorithm to run on the dataset](https://github.com/oceanprotocol/ocean.py/blob/6eb068df338abc7376430cc5ba7fe2d381508328/READMEs/c2d-flow.md#5-alice-allows-the-algorithm-for-c2d-for-that-data-asset) +- Buy datatokens using [Ocean market](https://market.oceanprotocol.com/) or [ocean.py](https://github.com/oceanprotocol/ocean.py) +- [Set up ocean.py](../ocean.py/install.md) {% tabs %} {% tab title="Python" %} +
# Import dependencies
 from config import web3_wallet, ocean, config
 from ocean_lib.models.compute_input import ComputeInput
@@ -182,5 +181,6 @@ Execute script
 ```bash
 python start_compute.py
 ```
+
 {% endtab %}
 {% endtabs %}
diff --git a/developers/contracts/README.md b/developers/contracts/README.md
new file mode 100644
index 00000000..3c911f83
--- /dev/null
+++ b/developers/contracts/README.md
@@ -0,0 +1,47 @@
+---
+description: Empowering the Decentralised Data Economy
+---
+
+# Contracts
+
+The [V4 release](https://blog.oceanprotocol.com/ocean-v4-overview-1ccd4a7ce150) of Ocean Protocol introduces a comprehensive and enhanced suite of [smart contracts](https://github.com/oceanprotocol/contracts/tree/main/contracts) that serve as the backbone of the decentralized data economy. These contracts facilitate secure, transparent, and efficient interactions among data providers, consumers, and ecosystem participants. With the introduction of V4 contracts, Ocean Protocol propels itself forward, delivering substantial functionality, scalability, and flexibility advancements.
+
+The V4 smart contracts have been deployed across multiple [networks](../../discover/networks/README.md) and are readily accessible through the GitHub [repository](https://github.com/oceanprotocol/contracts/tree/main/contracts). V4 introduces significant enhancements that encompass the following key **features**:
+
+### [**Data NFTs**](data-nfts.md) **for Enhanced Data IP Management**
+
+In Ocean V3, the publication of a dataset involved deploying an ERC20 "datatoken" contract along with relevant [metadata](../metadata.md). This process allowed the dataset publisher to claim copyright or exclusive rights to the underlying Intellectual Property (IP). Upon obtaining 1.0 ERC20 datatokens for a particular dataset, users were granted a license to consume that dataset, utilizing the Ocean infrastructure by spending the obtained datatokens.
+
+However, Ocean V3 faced limitations in terms of flexibility. It lacked support for different licenses associated with the same base IP, such as 1-day versus 1-month access, and the transferability of the base IP was not possible. Additionally, the ERC20 datatoken template was hardcoded, restricting customization options.
+
+Ocean V4 effectively tackles these challenges by adopting **ERC721** **tokens** to explicitly represent the **base IP** as "data NFTs" (Non-Fungible Tokens). [**Data NFT**](data-nfts.md) owners can now deploy ERC20 "datatoken" contracts specific to their data NFTs, with each datatoken contract offering its own distinct licensing terms.
+
+By utilizing ERC721 tokens, Ocean V4 **grants data creators greater flexibility and control over licensing arrangements**. The introduction of data NFTs allows for the representation of [base IP](../../discover/glossary.md) and the creation of customized ERC20 datatoken contracts tailored to individual licensing requirements.
+
+

Ocean Protocol V4 Smart Contracts

+
+
+### [**Community monetization**](../community-monetization.md), to help the community create sustainable businesses.
+
+Ocean V4 brings forth enhanced opportunities for marketplace operators, creating a conducive environment for the emergence of a thriving market of **third-party Providers**.
+
+With Ocean V4, marketplace operators can unlock additional benefits. Firstly, the V4 smart contracts empower marketplace operators to collect [fees](fees.md) not only during **data consumption** but also through **fixed-rate exchanges**. This expanded revenue model allows operators to derive more value from the ecosystem. Moreover, in Ocean V4, the marketplace operator has the authority to determine the fee value, providing them with **increased control** over their pricing strategies.
+
+In addition to empowering marketplace operators, Ocean V4 facilitates the participation of third-party [Providers](../provider/README.md) who can offer compute services in exchange for a fee. This paves the way for the development of a diverse marketplace of Providers. This model supports both centralized trusted providers, where data publishers and consumers have established trust relationships, as well as trustless providers that leverage decentralization or other privacy-preserving mechanisms.
+
+By enabling a marketplace of [Providers](../provider/README.md), Ocean V4 fosters competition, innovation, and choice. It creates an ecosystem where various providers can offer their compute services, catering to the diverse needs of data publishers and consumers. Whether based on trust or privacy-preserving mechanisms, this expansion in provider options enhances the overall functionality and accessibility of the Ocean Protocol ecosystem.
+
+Key features of the V4 smart contracts:
+
+* Base IP is now represented by a data [NFT](data-nfts.md), from which a data publisher can create multiple ERC20 [datatokens](datatokens.md) representing different types of access for the same dataset.
+* Interoperability with the NFT ecosystem (and DeFi & DAO tools).
+* Allows new data [NFT & datatoken templates](datatoken-templates.md), for flexibility and future-proofing.
+* Besides base data IP, you can use data NFTs to **implement comments & ratings, verifiable claims, identity credentials, and social media posts**. They can point to parent data NFTs, enabling the nesting of comments on comments, or replies to tweets. All on-chain, GDPR-compliant, easily searched, with js & py drivers 🤯
+* Introduces an advanced [fee](fees.md) structure, both for Marketplace and Provider runners 💰
+* [Roles](roles.md) administration: there are now multiple roles for a more flexible administration, both at the [NFT](data-nfts.md) and [ERC20](datatokens.md) levels 👥
+* When the NFT is transferred, it auto-updates all permissions, e.g. who receives payment, or who can mint derivative ERC20 datatokens.
+* Key-value store in the NFT contract: the NFT contract can be used to store custom key-value pairs (ERC725Y standard), enabling applications like soulbound tokens and Sybil protection approaches 🗃️
+* Multiple NFT template support: the Factory can deploy different types of NFT templates 🖼️
+* Multiple datatoken template support: the Factory can deploy different types of [datatoken templates](datatoken-templates.md).
+
+In the forthcoming pages, you will discover more information about the key features.
If you have any inquiries or find anything missing, feel free to contact the core team on [Discord](https://discord.com/invite/TnXjkR5) 💬 diff --git a/developers/contracts/architecture.md b/developers/contracts/architecture.md new file mode 100644 index 00000000..1eb6c012 --- /dev/null +++ b/developers/contracts/architecture.md @@ -0,0 +1,52 @@ +--- +title: Architecture Overview +description: Ocean Protocol Architecture Adventure! +--- + +# Architecture Overview + +Embark on an exploration of the innovative realm of Ocean Protocol, where data flows seamlessly and AI achieves new heights. Dive into the intricately layered architecture that converges data and services, fostering a harmonious collaboration. Let us delve deep and uncover the profound design of Ocean Protocol.🐬 + +

Overview of the Ocean Protocol Architecture

+ +### Layer 1: The Foundational Blockchain Layer + +At the core of Ocean Protocol lies the robust [Blockchain Layer](../contracts/README.md). Powered by blockchain technology, this layer ensures secure and transparent transactions. It forms the bedrock of decentralized trust, where data providers and consumers come together to trade valuable assets. + +The [smart contracts](../contracts/README.md) are deployed on the Ethereum mainnet and other compatible [networks](../../discover/networks/README.md). The libraries encapsulate the calls to these smart contracts and provide features like publishing new assets, facilitating consumption, managing pricing, and much more. To explore the contracts in more depth, go ahead to the [contracts](../contracts/README.md) section. + +### Layer 2: The Empowering Middle Layer + +Above the smart contracts, you'll find essential [libraries](architecture.md#libraries) employed by applications within the Ocean Protocol ecosystem, the [middleware components](architecture.md#middleware-components), and [Compute-to-Data](architecture.md#compute-to-data). + +#### Libraries + +These libraries include [Ocean.js](../ocean.js/README.md), a JavaScript library, and [Ocean.py](../ocean.py/README.md), a Python library. They serve as powerful tools for developers, enabling integration and interaction with the protocol. + +1. [Ocean.js](../ocean.js/README.md): Ocean.js is a JavaScript library that serves as a powerful tool for developers looking to integrate their applications with the Ocean Protocol ecosystem. Designed to facilitate interaction with the protocol, Ocean.js provides a comprehensive set of functionalities, including data tokenization, asset management, and smart contract interaction. Ocean.js simplifies the process of implementing data access controls, building dApps, and exploring data sets within a decentralized environment. +2. [Ocean.py](../ocean.py/README.md): Ocean.py is a Python library that empowers developers to integrate their applications with the Ocean Protocol ecosystem. With its rich set of functionalities, Ocean.py provides a comprehensive toolkit for interacting with the protocol. Developers and [data scientists](../../data-science/README.md) can leverage Ocean.py to perform a wide range of tasks, including data tokenization, asset management, and smart contract interactions. This library serves as a bridge between Python and the decentralized world of Ocean Protocol, enabling you to harness the power of decentralized data. + +#### Middleware components + +Additionally, in supporting the discovery process, middleware components come into play: + +1. [Aquarius](../aquarius/README.md): Aquarius acts as a metadata cache, enhancing search efficiency by caching on-chain data into Elasticsearch. By accelerating metadata retrieval, Aquarius enables faster and more efficient data discovery. +2. [Provider](../provider/README.md): The Provider component plays a crucial role in facilitating various operations within the ecosystem. It assists in asset downloading, handles [DDO](../ddo-specification.md) (Decentralized Data Object) encryption, and establishes communication with the operator-service for Compute-to-Data jobs. This ensures secure and streamlined interactions between different participants. +3. [Subgraph](../subgraph/README.md): The Subgraph is an off-chain service that utilizes GraphQL to offer efficient access to information related to datatokens, users, and balances. 
By leveraging the subgraph, data retrieval becomes faster compared to an on-chain query. This enhances the overall performance and responsiveness of applications that rely on accessing this information. + +#### Compute-to-Data + +[Compute-to-Data](../compute-to-data/README.md) (C2D) represents a groundbreaking paradigm within the Ocean Protocol ecosystem, revolutionizing the way data is processed and analyzed. With C2D, the traditional approach of moving data to the computation is inverted, ensuring privacy and security. Instead, algorithms are securely transported to the data sources, enabling computation to be performed locally, without the need to expose sensitive data. This innovative framework facilitates collaborative data analysis while preserving data privacy, making it ideal for scenarios where data owners want to retain control over their valuable assets. C2D provides a powerful tool for enabling secure and privacy-preserving data analysis and encourages collaboration among data providers, ensuring the utilization of valuable data resources while maintaining strict privacy protocols. + +### Layer 3: The Accessible Application Layer + +Here, the ocean comes alive with a vibrant ecosystem of dApps, marketplaces, and more. This layer hosts a variety of user-friendly interfaces, applications, and tools, inviting data scientists and curious explorers alike to access, explore, and contribute to the ocean's treasures. + +Prominently featured within this layer is [Ocean Market](../../user-guides/using-ocean-market.md), a hub where data enthusiasts and industry stakeholders converge to discover, trade, and unlock the inherent value of data assets. Beyond Ocean Market, the Application Layer hosts a diverse ecosystem of specialized applications and marketplaces, each catering to unique use cases and industries. Empowered by the capabilities of Ocean Protocol, these applications facilitate advanced data exploration, analytics, and collaborative ventures, revolutionizing the way data is accessed, shared, and monetized. + +### Layer 4: The Friendly Wallets + +At the top of the Ocean Protocol ecosystem, we find the esteemed [Web 3 Wallets](../../discover/wallets/README.md), the gateway for users to immerse themselves in the world of decentralized data transactions. These wallets serve as trusted companions, enabling users to seamlessly transact within the ecosystem, purchase and sell data NFTs, and acquire valuable datatokens. For a more detailed exploration of Web 3 Wallets and their capabilities, you can refer to the [wallet intro page](../../discover/wallets/README.md). + + +With the layers of the architecture clearly delineated, the stage is set for a comprehensive exploration of their underlying logic and intricate design. By examining each individually, we can gain a deeper understanding of their unique characteristics and functionalities. diff --git a/developers/contracts/data-nfts.md b/developers/contracts/data-nfts.md new file mode 100644 index 00000000..42f2a04b --- /dev/null +++ b/developers/contracts/data-nfts.md @@ -0,0 +1,46 @@ +--- +description: ERC721 data NFTs represent holding the copyright/base IP of a data asset. +--- + +# Data NFTs + +A non-fungible token stored on the blockchain represents a unique asset. NFTs can represent images, videos, digital art, or any piece of information. NFTs can be traded, and allow the transfer of copyright/base IP. [EIP-721](https://eips.ethereum.org/EIPS/eip-721) defines an interface for handling NFTs on EVM-compatible blockchains. 
The creator of the NFT can deploy a new contract on Ethereum, or on any blockchain supporting the NFT interface, and can transfer the ownership of the copyright/base IP through transfer transactions.
+
+## What is a Data NFT?
+
+A data NFT represents the **copyright** (or **exclusive license** against copyright) for a data asset on the blockchain — we call this the “**base IP**”. When a user publishes a dataset in Ocean V4, they create a new NFT as part of the process. This data NFT is proof of your claim of base IP. Assuming a valid claim, you are entitled to the revenue from that asset, just like a title deed gives you the right to receive rent.
+
+The data NFT smart contract holds metadata about the data asset, stores roles like “who can mint datatokens” or “who controls fees”, and an open-ended key-value store to enable custom fields.
+
+If you have the private key that controls the NFT, you own that NFT. The owner has the claim on the base IP and is the default recipient of any revenue. They can also assign another account to receive revenue. This enables the publisher to sell their base IP and the revenues that come with it. When the data NFT is transferred to another user, all the information about roles and where the revenue should be sent is reset. The default recipient of the revenue is the new owner of the data NFT.
+
+### Key Features and Functionality
+
+Data NFTs offer several key features and functionalities within the Ocean Protocol ecosystem:
+
+1. **Ownership and Transferability**: Data NFTs establish ownership rights, enabling data owners to transfer or sell their data assets to other participants in the network.
+2. **Metadata and Descriptions**: Each data NFT contains metadata that describes the associated dataset, providing essential information such as title, description, creator, and licensing terms.
+3. **Access Control and Permissions**: Data NFTs can include access control mechanisms, allowing data owners to define who can access and utilize their datasets, as well as the conditions and terms of usage.
+4. **Interoperability**: Data NFTs conform to the ERC721 token standard, ensuring interoperability across various platforms, wallets, and marketplaces within the Ethereum ecosystem.
+
+#### Data NFTs Open Up New Possibilities
+
+By tokenizing data assets into data NFTs, data owners can establish clear ownership rights and enable seamless transferability of the associated datasets. Data NFTs serve as digital certificates of authenticity, enabling data consumers to trust the origin and integrity of the data they access.
+
+With data NFTs, you are able to take advantage of the broader NFT ecosystem and all the tools and possibilities that come with it. As a first example, many leading crypto wallets have first-class support for NFTs, allowing you to manage data NFTs from those wallets. Or, you can post your data NFT for sale on a popular NFT marketplace like [OpenSea](https://www.opensea.io/) or [Rarible](https://www.rarible.com/). As a final example, we’re excited to see [data NFTs linked to physical items via WiseKey chips](https://www.globenewswire.com/news-release/2021/05/19/2232106/0/en/WISeKey-partners-with-Ocean-Protocol-to-launch-TrustedNFT-io-a-decentralized-marketplace-for-objects-of-value-designed-to-empower-artists-creators-and-collectors-with-a-unique-solu.html).
+
+### Implementation in Ocean Protocol
+
+We have implemented data NFTs using the [ERC721 standard](https://erc721.org/).
Ocean Protocol defines the [ERC721Factory](https://github.com/oceanprotocol/contracts/blob/v4main/contracts/ERC721Factory.sol) contract, allowing **Base IP holders** to create their ERC721 contract instances on any supported network. The deployed contract stores Metadata, ownership, sub-license information, and permissions. The contract creator can also create and mint ERC20 token instances for sub-licensing the **Base IP**.
+
+ERC721 tokens are non-fungible, and thus cannot be used for automatic price discovery like ERC20 tokens. ERC721 and ERC20 combined together can be used for sub-licensing. Ocean Protocol's [ERC721Template](https://github.com/oceanprotocol/contracts/blob/v4main/contracts/templates/ERC721Template.sol) solves this problem by using ERC721 for tokenizing the **Base IP** and tokenizing sub-licenses by using ERC20. To save gas fees, it uses the [ERC1167](https://eips.ethereum.org/EIPS/eip-1167) proxy approach on the **ERC721 template**.
+
+Our implementation has been built on top of the battle-tested [OpenZeppelin contract library](https://docs.openzeppelin.com/contracts/4.x/erc721). However, there are a number of interesting parts of our implementation that go a bit beyond an out-of-the-box NFT. The data NFTs can be easily managed from any NFT marketplace like [OpenSea](https://opensea.io/).
+

Data NFT on Open Sea

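+
+Beyond standard NFT operations, the template also exposes a key-value store, discussed below. A hypothetical sketch of using it: `data_nft` and `alice` are placeholders, and the `set_data`/`get_data` helper names are illustrative rather than the confirmed ocean.py API, so verify against the library before use.
+
+```python
+# Hypothetical sketch of the ERC725Y-style key-value store on a data NFT.
+# `data_nft` and `alice` are placeholders; helper names are illustrative.
+from web3 import Web3
+
+key = Web3.keccak(text="category")          # keys are bytes32 values
+data_nft.set_data(key, b"weather-data", {"from": alice})
+assert data_nft.get_data(key) == b"weather-data"
+```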
+ +Ocean V4’s data NFT factory can deploy different types of data NFTs based on a variety of templates. Some templates could be tuned for data unions, others for DeFi, and others yet for enterprise use cases. + +Something else that we’re super excited about in our data NFTs is a cutting-edge standard called [ERC725](https://github.com/ERC725Alliance/erc725/blob/main/docs/ERC-725.md) being driven by our friends at [Lukso](https://lukso.network/about). The ERC725y feature enables the NFT owner (or a user with the “store updater” role) to input and update information in a key-value store. These values can be viewed externally by anyone. + +ERC725y is incredibly flexible and can be used to store any string; you could use it for anything from additional metadata to encrypted values. This helps future-proof the data NFTs and ensure that they are suitable for a wide range of projects that have not been launched yet. As you can imagine, the inclusion of ERC725y has huge potential and we look forward to seeing the different ways people end up using it. If you’re interested in using this, take a look at [EIP725](https://eips.ethereum.org/EIPS/eip-725#erc725y). diff --git a/developers/contracts/datanft-and-datatoken.md b/developers/contracts/datanft-and-datatoken.md new file mode 100644 index 00000000..ed0eb505 --- /dev/null +++ b/developers/contracts/datanft-and-datatoken.md @@ -0,0 +1,65 @@ +--- +title: Data NFTs and Datatokens +description: >- + In Ocean Protocol, ERC721 data NFTs represent holding the copyright/base IP of + a data asset, and ERC20 datatokens represent licenses to access the assets. +--- + +# Data NFTs and Datatokens + +

Data NFTs and Datatokens

+
+In summary: A [**data NFT**](data-nfts.md) serves as a **representation of the copyright** or exclusive license for a data asset on the blockchain, known as the [**base IP**](../../discover/glossary.md). **Datatokens**, on the other hand, function as a crucial mechanism for **decentralized access** to data assets.
+
+For a specific data NFT, multiple ERC20 datatoken contracts can exist. Here's the main concept: Owning 1.0 datatokens grants you the ability to **consume** the corresponding dataset. Essentially, it acts as a **sub-license** from the [base IP](../../discover/glossary.md), allowing you to utilize the dataset according to the specified license terms (when provided by the publisher). License terms can be established with a "good default" or by the Data NFT owner.
+
+The choice to employ the ERC20 fungible token standard for datatokens is logical, as licenses themselves are fungible. This standard ensures compatibility and interoperability of datatokens with ERC20-based wallets, decentralized exchanges (DEXes), decentralized autonomous organizations (DAOs), and other relevant platforms. Datatokens can be transferred, acquired through marketplaces or exchanges, distributed via airdrops, and more.
+
+You can [publish](../../discover/glossary.md) a data NFT initially with no ERC20 datatoken contracts. This means you simply aren’t ready to grant access to your data asset yet (sub-license it). Then, you can publish one or more ERC20 datatoken contracts against the data NFT. One datatoken contract might grant consume rights for **1 day**, another for **1 week**, etc. Each different datatoken contract is for **different** license terms.
+
+For a more comprehensive exploration of intellectual property and its practical connections with ERC721 and ERC20, you can read the blog post written by [Trent McConaghy](http://www.trent.st/), co-founder of Ocean Protocol. It delves into the subject matter in detail and provides valuable insights.
+
+{% embed url="https://blog.oceanprotocol.com/nfts-ip-1-practical-connections-of-erc721-with-intellectual-property-dc216aaf005d" %}
+
+**Data NFTs and Datatokens example:**
+
+* In step 1, Alice **publishes** her dataset with Ocean: this means deploying an ERC721 data NFT contract (claiming copyright/base IP), then an ERC20 datatoken contract (license against base IP). Then Alice mints ERC20 datatokens.
+* In step 2, Alice **transfers** 1.0 of them to Bob's wallet; now he has a license to be able to download that dataset.

Data NFT & Datatokens flow

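+
+In ocean.py, steps 1 and 2 can be sketched roughly as follows. This is a minimal, hypothetical sketch: `ocean`, `alice` and `bob` are placeholders for a configured Ocean instance and two funded wallets, and exact helper names may differ between releases, so treat the publish flows linked later on this page as the authoritative reference.
+
+```python
+# Minimal sketch of the Alice -> Bob flow above, in the style of ocean.py.
+# `ocean`, `alice` and `bob` are placeholders; signatures may differ.
+
+# Step 1: Alice publishes - a data NFT (base IP), a datatoken contract
+# (license against the base IP), and some freshly minted datatokens.
+data_nft = ocean.data_nft_factory.create({"from": alice}, "My dataset", "DATA1")
+datatoken = data_nft.create_datatoken({"from": alice}, "1-day access", "ACC1")
+datatoken.mint(alice.address, 10 * 10**18, {"from": alice})  # 10.0 datatokens
+
+# Step 2: Alice transfers 1.0 datatoken to Bob; Bob can now download the asset.
+datatoken.transfer(bob.address, 1 * 10**18, {"from": alice})
+```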
+
+What happens under the hood? 🤔
+
+Publishing with V4 smart contracts in Ocean Protocol involves a well-defined process that streamlines the publishing of data assets. It provides a systematic approach to ensure efficient management and exchange of data within the Ocean Protocol ecosystem. By leveraging smart contracts, publishers can securely create and deploy data NFTs, allowing them to tokenize and represent their data assets. Additionally, the flexibility of V4 smart contracts enables publishers to define pricing schemas for datatokens, facilitating fair and transparent transactions. This publishing framework empowers data publishers by providing them with greater control and access to a global marketplace, while ensuring trust, immutability, and traceability of their published data assets.
+
+The V4 smart contracts publishing includes the following steps:
+
+1. The data publisher initiates the creation of a new data NFT.
+2. The data NFT factory deploys the template for the new data NFT.
+3. The data NFT template creates the data NFT contract.
+4. The address of the newly created data NFT is available to the data publisher.
+5. The publisher is now able to create datatokens with a pricing schema for the data NFT. To accomplish this, the publisher initiates a call to the data NFT contract, specifically requesting the creation of a new datatoken with a fixed-rate schema.
+6. The data NFT contract deploys a new datatoken and a fixed-rate schema by interacting with the datatoken template contract.
+7. The datatoken contract is created (Datatoken-1 contract).
+8. The datatoken template generates a new fixed-rate schema for Datatoken-1.
+9. The address of Datatoken-1 is now available to the data publisher.
+10. Optionally, the publisher can create a new datatoken (Datatoken-2) with a free price schema.
+11. The data NFT contract interacts with the datatoken template contract to create a new datatoken and a dispenser schema.
+12. The datatoken template deploys the Datatoken-2 contract.
+13. The datatoken template creates a dispenser for the Datatoken-2 contract.
+
+Below is a visual representation that illustrates the flow:

Data NFT & Datatokens flow

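+
+A condensed, hypothetical ocean.py sketch of the two pricing paths above. `data_nft`, `alice` and `OCEAN` are placeholders, and argument names and order are illustrative only, so consult the ocean.py publish flow for the exact API.
+
+```python
+# Hypothetical sketch mirroring steps 5-13 above. All names are placeholders
+# and exact signatures vary between ocean.py releases.
+
+# Datatoken-1 with a fixed-rate exchange (steps 5-9)
+datatoken1 = data_nft.create_datatoken({"from": alice}, "Datatoken 1", "DT1")
+exchange = datatoken1.create_exchange({"from": alice}, 10**18, OCEAN.address)  # 1.0 OCEAN per datatoken
+
+# Datatoken-2 with a free dispenser (steps 10-13)
+datatoken2 = data_nft.create_datatoken({"from": alice}, "Datatoken 2", "DT2")
+datatoken2.create_dispenser({"from": alice})
+```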
+
+We have some awesome hands-on experience when it comes to publishing a data NFT and minting datatokens.
+
+* Publish using [ocean.py](../ocean.py/publish-flow.md)
+* Publish using [ocean.js](../ocean.js/publish.md)
+
+### Other References
+
+* [Data & NFTs 1: Practical Connections of ERC721 with Intellectual Property](https://blog.oceanprotocol.com/nfts-ip-1-practical-connections-of-erc721-with-intellectual-property-dc216aaf005d)
+* [Data & NFTs 2: Leveraging ERC20 Fungibility](https://blog.oceanprotocol.com/nfts-ip-2-leveraging-erc20-fungibility-bcee162290e3)
+* [Data & NFTs 3: Combining ERC721 & ERC20](https://blog.oceanprotocol.com/nfts-ip-3-combining-erc721-erc20-b69ea659115e)
+* [Fungibility sightings in NFTs](https://blog.oceanprotocol.com/on-difficult-to-explain-fungibility-sightings-in-nfts-26bc18620f70)
diff --git a/developers/contracts/datatoken-templates.md b/developers/contracts/datatoken-templates.md
new file mode 100644
index 00000000..2403835f
--- /dev/null
+++ b/developers/contracts/datatoken-templates.md
@@ -0,0 +1,102 @@
+---
+description: Discover all about the extensible & flexible smart contract templates.
+---
+
+# Datatoken Templates
+
+Each [data NFT](data-nfts.md) or [datatoken](datatokens.md) within Ocean Protocol is generated from pre-defined [template](https://github.com/oceanprotocol/contracts/tree/main/contracts/templates) contracts. The _**templateId**_ parameter specifies the template used for creating a data NFT or datatoken, which can be set during the creation process. The _**templateId**_ is stored within the smart contract code and can be accessed using the [_**getId**_](https://github.com/oceanprotocol/contracts/blob/9e29194d910f28a4f0ef17ce6dc8a70741f63309/contracts/interfaces/IERC20Template.sol#L134)_**()**_ function, as this JavaScript snippet from the contracts' test suite shows:
+
+```javascript
+it("#getId - should return templateId", async () => {
+  const templateId = 1;
+  assert((await erc20Token.getId()) == templateId);
+});
+```
+
+Currently, Ocean Protocol supports **1** [template](https://github.com/oceanprotocol/contracts/blob/main/contracts/templates/ERC721Template.sol) type for data NFTs and **2** template variants for datatokens: the [**regular template**](https://github.com/oceanprotocol/contracts/blob/main/contracts/templates/ERC20Template.sol) and the [**enterprise template**](https://github.com/oceanprotocol/contracts/blob/main/contracts/templates/ERC20TemplateEnterprise.sol). While these templates share the same interfaces, they differ in their underlying implementation and may offer additional features.
+
+The details regarding currently supported **datatoken templates** are as follows:
+
+### **Regular template**
+
+The regular template allows users to buy/sell/hold datatokens. The datatokens can be minted by the address having a [`MINTER`](roles.md#minter) role, making the supply of the datatoken variable. This template is assigned _**`templateId =`**_`1` and the source code is available [here](https://github.com/oceanprotocol/contracts/blob/v4main/contracts/templates/ERC20Template.sol).
+
+### **Enterprise template**
+
+The enterprise template has additional functions apart from the methods in the ERC20 interface. This additional feature allows access to the service by paying in the basetoken instead of the datatoken. Internally, the smart contract handles the conversion of basetoken to datatoken, initiating an order to access the service, and minting/burning the datatoken. The total supply of the datatoken effectively remains 0 in the case of the enterprise template.
This template is assigned _**`templateId =`**_`2` and the source code is available [here](https://github.com/oceanprotocol/contracts/blob/v4main/contracts/templates/ERC20TemplateEnterprise.sol). + +#### Set the template + +When you're creating an ERC20 datatoken, you can specify the desired template by passing on the template index. + +{% tabs %} +{% tab title="Ocean.js" %} +To specify the datatoken template via ocean.js, you need to customize the [DatatokenCreateParams](https://github.com/oceanprotocol/ocean.js/blob/ae2ff1ccde53ace9841844c316a855de271f9a3f/src/%40types/Datatoken.ts#L3) with your desired `templateIndex`. + +The default template used is 1. + +```typescript +export interface DatatokenCreateParams { + templateIndex: number + minter: string + paymentCollector: string + mpFeeAddress: string + feeToken: string + feeAmount: string + cap: string + name?: string + symbol?: string +} +``` +{% endtab %} + +{% tab title="Ocean.py" %} +To specify the datatoken template via ocean.py, you need to customize the [DatatokenArguments](https://github.com/oceanprotocol/ocean.py/blob/bad11fb3a4cb00be8bab8febf3173682e1c091fd/ocean_lib/models/datatoken_base.py#L64) with your desired template\_index. + +The default template used is 1. + +```python +class DatatokenArguments: + def __init__( + self, + name: Optional[str] = "Datatoken 1", + symbol: Optional[str] = "DT1", + template_index: Optional[int] = 1, + minter: Optional[str] = None, + fee_manager: Optional[str] = None, + publish_market_order_fees: Optional = None, + bytess: Optional[List[bytes]] = None, + services: Optional[list] = None, + files: Optional[List[FilesType]] = None, + consumer_parameters: Optional[List[Dict[str, Any]]] = None, + cap: Optional[int] = None, + ): +``` +{% endtab %} +{% endtabs %} + +{% hint style="info" %} +By default, all assets published through the Ocean Market use the Enterprise Template. +{% endhint %} + +#### Retrieve the template + +To identify the template used for a specific asset, you can easily retrieve this information using the network explorer. Here are the steps to follow: + +1. Visit the network explorer where the asset was published. +2. Search for the datatoken address :mag: +3. Once you have located the datatoken address, click on the contract tab to access more details. +4. Within the contract details, we can identify and determine the template used for the asset. + + + +We like making things easy :sunglasses: so here is an even easier way to retrieve the info for [this](https://market.oceanprotocol.com/asset/did:op:cd086344c275bc7c560e91d472be069a24921e73a2c3798fb2b8caadf8d245d6) asset published in the Ocean Market: + +{% embed url="https://app.arcade.software/share/wxBPSc42eSYUiawSY8rC" fullWidth="false" %} +{% endembed %} + +{% hint style="info" %} +_It's important to note that Ocean Protocol may introduce new templates to support additional variations of data NFTs and datatokens in the future._ +{% endhint %} diff --git a/developers/contracts/datatokens.md b/developers/contracts/datatokens.md new file mode 100644 index 00000000..749bb0b6 --- /dev/null +++ b/developers/contracts/datatokens.md @@ -0,0 +1,29 @@ +--- +description: ERC20 datatokens represent licenses to access the assets. +--- + +# Datatokens + +Fungible tokens are a type of digital asset that are identical and interchangeable with each other. Each unit of a fungible token holds the same value and can be exchanged on a one-to-one basis. 
This means that one unit of a fungible token is indistinguishable from another unit of the same token. Examples of fungible tokens include cryptocurrencies like Bitcoin (BTC) and Ethereum (ETH), where each unit of the token is equivalent to any other unit of the same token. Fungible tokens are widely used for transactions, trading, and as a means of representing value within blockchain-based ecosystems. + +## What is a Datatoken? + +Datatokens are fundamental within Ocean Protocol, representing a key mechanism to **access** data assets in a decentralized manner. In simple terms, a datatoken is an **ERC20-compliant token** that serves as **access control** for a data/service represented by a [data NFT](data-nfts.md). + +Datatokens enable data assets to be tokenized, allowing them to be easily traded, shared, and accessed within the Ocean Protocol ecosystem. Each datatoken is associated with a particular data asset, and its value is derived from the underlying dataset's availability, scarcity, and demand. + +By using datatokens, data owners can retain ownership and control over their data while still enabling others to access and utilize it based on predefined license terms. These license terms define the conditions under which the data can be accessed, used, and potentially shared by data consumers. + +### Understanding Datatokens and Licenses + +Each datatoken represents a [**sub-license**](../../discover/glossary.md) from the base intellectual property (IP) owner, enabling users to access and consume the associated dataset. The license terms can be set by the data NFT owner or default to a predefined "good default" license. The fungible nature of ERC20 tokens aligns perfectly with the fungibility of licenses, facilitating seamless exchangeability and interoperability between different datatokens. + +By adopting the ERC20 standard for datatokens, Ocean Protocol ensures compatibility and interoperability with a wide array of ERC20-based wallets, [decentralized exchanges (DEXes)](https://blog.oceanprotocol.com/ocean-datatokens-will-be-tradeable-on-decentrs-dex-41715a166a1f), decentralized autonomous organizations (DAOs), and other blockchain-based platforms. This standardized approach enables users to effortlessly transfer, purchase, exchange, or receive datatokens through various means such as marketplaces, exchanges, or airdrops. + +### Utilizing Datatokens + +Data owners and consumers can engage with datatokens in numerous ways. Datatokens can be acquired through transfers or obtained by purchasing them on dedicated marketplaces or exchanges. Once in possession of the datatokens, users gain access to the corresponding dataset, enabling them to utilize the data within the boundaries set by the associated license terms. + +Once someone has generated datatokens, they can be used in any ERC20 exchange, centralized or decentralized. In addition, Ocean provides a convenient default marketplace that is tuned for data: **Ocean Market**. It’s a vendor-neutral reference data marketplace for use by the Ocean community. + +You can publish a [data NFT](data-nfts.md) initially with no ERC20 datatoken contracts. This means you simply aren’t ready to grant access to your data asset yet (sub-license it). Then, you can publish one or more ERC20 datatoken contracts against the data NFT. One datatoken contract might grant consume rights for 1 day, another for 1 week, etc. Each different datatoken contract is for different license terms. 
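+
+Because a datatoken is a standard ERC20 token, the "own 1.0 datatokens to consume" rule can be checked with ordinary ERC20 tooling. A minimal web3.py sketch, assuming placeholder addresses and RPC URL:
+
+```python
+# Minimal web3.py sketch: datatokens are plain ERC20 tokens, so a holding
+# check is an ordinary balanceOf call. The RPC URL and addresses below are
+# placeholders.
+from web3 import Web3
+
+ERC20_ABI = [{
+    "name": "balanceOf",
+    "type": "function",
+    "stateMutability": "view",
+    "inputs": [{"name": "owner", "type": "address"}],
+    "outputs": [{"name": "", "type": "uint256"}],
+}]
+
+w3 = Web3(Web3.HTTPProvider("https://rpc.example.com"))  # placeholder RPC
+datatoken = w3.eth.contract(
+    address="0x0000000000000000000000000000000000000001",  # placeholder datatoken
+    abi=ERC20_ABI,
+)
+
+balance = datatoken.functions.balanceOf(
+    "0x0000000000000000000000000000000000000002"  # placeholder consumer
+).call()
+print("holds a full license:", balance >= 10**18)  # 1.0 datatokens (18 decimals)
+```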
diff --git a/developers/contracts/fees.md b/developers/contracts/fees.md
new file mode 100644
index 00000000..16a18065
--- /dev/null
+++ b/developers/contracts/fees.md
@@ -0,0 +1,109 @@
+---
+description: The Ocean Protocol defines various fees for creating a sustainability loop.
+---
+
+# Fees
+
+One transaction may have fees going to several entities, such as the market where the asset was published, or the Ocean Community. Here are all of them:
+
+* Publish Market: the market where the asset was published.
+* Consume Market: the market where the asset was consumed.
+* Provider: the entity facilitating asset consumption. May serve up data, run compute, etc.
+* Ocean Community: the Ocean Community Wallet.
+
+### Publish fee
+
+When you publish an asset on the Ocean marketplace, there are currently no charges for publishing fees :tada:
+
+However, if you're building a custom marketplace, you have the flexibility to include a publishing fee by adding an extra transaction in the publish flow. Depending on your marketplace's unique use case, you, as the marketplace owner, can decide whether or not to implement this fee. We believe in giving you the freedom to tailor your marketplace to your specific needs and preferences.
+
+| Value in Ocean Market | Value in Other Markets         |
+| --------------------- | ------------------------------ |
+| 0%                    | Customizable in market config. |
+
+### Consume (a.k.a. Order) fee
+
+When a user exchanges a [datatoken](datatokens.md) for the privilege of downloading an asset or initiating a compute job that utilizes the asset, consume fees come into play. These fees are associated with accessing an asset and include:
+
+1. **Publisher Market** Consumption Fee
+   * Defined during the ERC20 [creation](https://github.com/oceanprotocol/contracts/blob/b937a12b50dc4bdb7a6901c33e5c8fa136697df7/contracts/templates/ERC721Template.sol#L334).
+   * Defined as Address, Token, Amount. The amount is an absolute value (not a percentage).
+   * A marketplace can charge a specified amount per order.
+   * E.g. a market can set a fixed fee of 10 USDT per order, no matter what pricing schemas are used (fixed rate with ETH, BTC, dispenser, etc.).
+2. **Consume Market** Consumption Fee
+   * A market can specify what fee it wants on the order function.
+3. **Provider Consumption** Fees
+   * Defined by the [Provider](../provider/README.md) for any consumption.
+   * Expressed in: Address, Token, Amount (absolute), Timeout.
+   * You can retrieve them when calling the initialize endpoint.
+   * E.g. a provider can charge a fixed fee of 10 USDT per consume, irrespective of the pricing schema used (e.g., fixed rate with ETH, BTC, dispenser).
+4. **Ocean Community** Fee
+   * Ocean's smart contracts collect **Ocean Community fees** during order operations. These fees are reinvested in community projects and distributed to the veOcean holders through Data Farming.
+   * This fee is set at the [smart contract](https://github.com/oceanprotocol/contracts/blob/main/contracts/communityFee/OPFCommunityFeeCollector.sol) level.
+   * It can be updated by the Ocean Protocol Foundation. See details in our [smart contracts](https://github.com/oceanprotocol/contracts/blob/main/contracts/pools/FactoryRouter.sol#L391-L407).
+
+
+ +Update Ocean Community Fees + +The Ocean Protocol Foundation can [change](https://github.com/oceanprotocol/contracts/blob/main/contracts/pools/FactoryRouter.sol#L391-L407) the Ocean community fees. + +```solidity +/** +* @dev updateOPCFee + * Updates OP Community Fees + * @param _newSwapOceanFee Amount charged for swapping with ocean approved tokens + * @param _newSwapNonOceanFee Amount charged for swapping with non ocean approved tokens + * @param _newConsumeFee Amount charged from consumeFees + * @param _newProviderFee Amount charged for providerFees + */ +function updateOPCFee(uint256 _newSwapOceanFee, uint256 _newSwapNonOceanFee, + uint256 _newConsumeFee, uint256 _newProviderFee) external onlyRouterOwner { + + swapOceanFee = _newSwapOceanFee; + swapNonOceanFee = _newSwapNonOceanFee; + consumeFee = _newConsumeFee; + providerFee = _newProviderFee; + emit OPCFeeChanged(msg.sender, _newSwapOceanFee, _newSwapNonOceanFee, _newConsumeFee, _newProviderFee); +} +``` + +
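To make the order-fee flow concrete, here is a small illustrative sketch; the fee values are assumptions for the example (matching the "10 USDT" examples above), not protocol constants:

```typescript
// Illustrative only: summing the per-order fees described above.
const publisherMarketFee = 10;  // USDT, absolute amount set at ERC20 creation
const consumeMarketFee = 0;     // USDT, set by the consume market on the order call
const providerFee = 10;         // USDT, returned by the Provider's initialize endpoint
const oceanCommunityFee = 0.03; // DT, set at the smart-contract level

const totalUsdtFees = publisherMarketFee + consumeMarketFee + providerFee;
console.log(`Order fees: ${totalUsdtFees} USDT + ${oceanCommunityFee} DT to the Ocean community`);
```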
Each of these fees plays a role in ensuring fair compensation and supporting the Ocean community.

| Fee | Value in Ocean Market | Value in Other Markets |
| ---------------- | :-------------------: | --------------------------------------------------------- |
| Publisher Market | 0 | Customizable in market config. |
| Consume Market | 0 | Customizable in market config. |
| Provider | 0 | Customizable. See details [below](fees.md#provider-fee). |
| Ocean Community | 0.03 DT | 0.03 DT |

### Provider fee

[Providers](../provider/README.md) facilitate data consumption, initiate compute jobs, encrypt and decrypt DDOs, and verify user access to specific data assets or services.

Provider fees serve as [compensation](../community-monetization.md#3.-running-your-own-provider) to the individuals or organizations operating their own provider instances when users request assets.

* Defined by the [Provider](../provider/README.md) for any consumption.
* Expressed in: Address, Token, Amount (absolute), Timeout.
* You can retrieve them when calling the initialize endpoint.
* These fees can be set as a **fixed amount** rather than a percentage.
* Providers have the flexibility to specify the token in which the fees must be paid, which can differ from the token used in the consuming market.
* Provider fees can be utilized to charge for [computing](../compute-to-data/README.md) resources. Consumers can select the desired payment amount based on the compute resources required to execute an algorithm within the [Compute-to-Data](../compute-to-data/README.md) environment, aligning with their specific needs.
* E.g.: A provider can charge a fixed fee of 10 USDT per consume, irrespective of the pricing schema used (e.g., fixed rate with ETH, BTC, dispenser).
* E.g.: A provider may impose a fixed fee of 15 DAI to reserve compute resources for 1 hour, enabling the initiation of compute jobs.

These fees play a crucial role in incentivizing individuals and organizations to operate provider instances and charge consumers based on their resource usage. By doing so, they contribute to the growth and sustainability of the Ocean Protocol ecosystem.

| Type | OPF Provider | 3rd party Provider |
| ---------------------------------------------------------- | :--------------------: | ----------------------------------------------- |
| Token to charge the fee: `PROVIDER_FEE_TOKEN` | OCEAN | Customizable by the Provider Owner (e.g. USDC). |
| Download: `COST_PER_MB` | 0 | Customizable in the Provider `envvars`. |
| Compute: `COST_PER_MIN` (environment: 1 CPU, 60 secs max) | 0 | Customizable in the OperatorEngine `envvars`. |
| Compute: `COST_PER_MIN` (environment: 1 CPU, 1 hour max) | 1.0 OCEAN/min | Customizable in the OperatorEngine `envvars`. |
| Ocean Community | 0% of the Provider fee | 0% of the Provider fee. |

{% hint style="info" %}
Stay up-to-date with the latest information! The values within the system are regularly updated. We recommend verifying the most recent values directly from the [contracts](https://github.com/oceanprotocol/contracts) and the [market](https://github.com/oceanprotocol/market).
{% endhint %}

diff --git a/developers/contracts/pricing-schemas.md b/developers/contracts/pricing-schemas.md
new file mode 100644
index 00000000..5ad3f76d
--- /dev/null
+++ b/developers/contracts/pricing-schemas.md
@@ -0,0 +1,189 @@
---
description: Choose the revenue model during asset publishing.
---

# Pricing Schemas

Ocean Protocol offers you flexible and customizable pricing options to monetize your valuable data assets. You have two main pricing models to choose from:

* [Fixed pricing](pricing-schemas.md#fixed-pricing)
* [Free pricing](pricing-schemas.md#free-pricing)

These models are designed to cater to your specific needs and ensure a smooth experience for data consumers.

The price of an asset is determined by the number of tokens (this can be Ocean Tokens or any ERC20 token configured when publishing the asset) a buyer must pay to access the data. When users pay the tokens, they get a _datatoken_ in their wallets, a tokenized representation of the access right stored on the blockchain. To read more about data NFTs and datatokens, click [here](datanft-and-datatoken.md).

To provide you with even greater flexibility in monetizing your data assets, Ocean Protocol allows you to customize the pricing schema by configuring your own ERC20 token when publishing the asset. This means that instead of using Ocean Tokens as the pricing currency, you can utilize your own token, aligning the pricing structure with your specific requirements and preferences.

You can customize your token this way:

{% tabs %}
{% tab title="Ocean Market" %}
```javascript
NEXT_PUBLIC_OCEAN_TOKEN_ADDRESS='0x00000' // YOUR TOKEN'S ADDRESS
```
{% endtab %}

{% tab title="Ocean.js" %}
```typescript
// https://github.com/oceanprotocol/ocean.js/blob/main/CodeExamples.md#61-publish-a-dataset-create-nft--datatoken-with-a-fixed-rate-exchange
const freParams: FreCreationParams = {
    fixedRateAddress: addresses.FixedPrice,
    baseTokenAddress: addresses.Ocean, // you can customize this with any ERC20 token
    owner: await publisherAccount.getAddress(),
    marketFeeCollector: await publisherAccount.getAddress(),
    baseTokenDecimals: 18,
    datatokenDecimals: 18,
    fixedRate: '1',
    marketFee: '0.001',
    allowedConsumer: ZERO_ADDRESS,
    withMint: true
}
```
+{% endtab %} + +{% tab title="Ocean.py" %} +```python +exchange_args = ExchangeArguments( + rate=to_wei(1), # you can customize this with any price + base_token_addr=OCEAN.address, # you can customize this with any ERC20 token + owner_addr=publisher_wallet.address, + publish_market_fee_collector=ZERO_ADDRESS, + publish_market_fee=0, + with_mint=True, + allowed_swapper=ZERO_ADDRESS, + full_info=False, + dt_decimals=datatoken.decimals() +) +``` +{% endtab %} +{% endtabs %} + +Furthermore, Ocean Protocol recognizes that different data assets may have distinct pricing needs. That's why the platform supports multiple pricing schemas, allowing you to implement various pricing models for different datasets or use cases. This flexibility ensures that you can tailor the pricing strategy to each specific asset, maximizing its value and potential for monetization. + +
_Figure: Pricing Schemas_
### Fixed pricing

With the fixed pricing model, you have the power to set a specific price for your data assets. This means that buyers interested in accessing your data will need to pay the designated amount of configured tokens. To make things even easier, Ocean automatically creates a special token called a "datatoken" behind the scenes.

This datatoken represents the access right to your data, so buyers don't have to worry about the technical details. If you ever want to adjust the price of your dataset, you have the flexibility to do so whenever you need.

The fixed pricing model relies on the [createNftWithErc20WithFixedRate](https://github.com/oceanprotocol/contracts/blob/main/contracts/ERC721Factory.sol#LL674C14-L674C45) function in our smart contract, which securely stores the pricing information for assets published using this model.
+ +Create NFT with Fixed Rate Pricing + +```javascript +/** + * @dev createNftWithErc20WithFixedRate + * Creates a new NFT, then a ERC20, then a FixedRateExchange, all in one call + * Use this carefully, because if Fixed Rate creation fails, you are still going to pay a lot of gas + * @param _NftCreateData input data for NFT Creation + * @param _ErcCreateData input data for ERC20 Creation + * @param _FixedData input data for FixedRate Creation + */ +function createNftWithErc20WithFixedRate( +NftCreateData calldata _NftCreateData, +ErcCreateData calldata _ErcCreateData, +FixedData calldata _FixedData +) external nonReentrant returns (address erc721Address, address erc20Address, bytes32 exchangeId){ +//we are adding ourselfs as a ERC20 Deployer, because we need it in order to deploy the fixedrate +erc721Address = deployERC721Contract( + _NftCreateData.name, + _NftCreateData.symbol, + _NftCreateData.templateIndex, + address(this), + address(0), + _NftCreateData.tokenURI, + _NftCreateData.transferable, + _NftCreateData.owner); +erc20Address = IERC721Template(erc721Address).createERC20( + _ErcCreateData.templateIndex, + _ErcCreateData.strings, + _ErcCreateData.addresses, + _ErcCreateData.uints, + _ErcCreateData.bytess +); +exchangeId = IERC20Template(erc20Address).createFixedRate( + _FixedData.fixedPriceAddress, + _FixedData.addresses, + _FixedData.uints + ); +// remove our selfs from the erc20DeployerRole +IERC721Template(erc721Address).removeFromCreateERC20List(address(this)); +} +``` + +
{% hint style="info" %}
There are two templates available: [ERC20Template](datatoken-templates.md#regular-template) and [ERC20TemplateEnterprise](datatoken-templates.md#enterprise-template).

In the case of [ERC20TemplateEnterprise](datatoken-templates.md#enterprise-template), when you deploy a fixed rate exchange, the funds generated as revenue are automatically sent to the owner's address. The owner receives the revenue without any manual intervention.

On the other hand, with [ERC20Template](datatoken-templates.md#regular-template), for a fixed rate exchange, the revenue is available at the fixed rate exchange level. The owner or the payment collector has the authority to manually retrieve the revenue.
{% endhint %}

### Free pricing

On the other hand, the free pricing model gives data consumers access to your asset without requiring them to make a direct payment. Users can freely access your data, with the only cost being the transaction fees associated with the blockchain network.

In this model, datatokens are allocated to a dispenser smart contract, which dispenses datatokens to users at no charge when they access your asset. This is perfect if you want to make your data widely available and encourage collaboration. It's particularly suitable for individuals and organizations working in the public domain or for assets that need to comply with open-access licenses.

The free pricing model relies on the [createNftWithErc20WithDispenser](https://github.com/oceanprotocol/contracts/blob/main/contracts/ERC721Factory.sol#LL713C14-L713C45) function in our smart contract, which securely stores the pricing information for assets published using this model.
+ +Create NFT with Free Pricing + +```javascript +/** + * @dev createNftWithErc20WithDispenser + * Creates a new NFT, then a ERC20, then a Dispenser, all in one call + * Use this carefully + * @param _NftCreateData input data for NFT Creation + * @param _ErcCreateData input data for ERC20 Creation + * @param _DispenserData input data for Dispenser Creation + */ +function createNftWithErc20WithDispenser( + NftCreateData calldata _NftCreateData, + ErcCreateData calldata _ErcCreateData, + DispenserData calldata _DispenserData +) external nonReentrant returns (address erc721Address, address erc20Address){ + //we are adding ourselfs as a ERC20 Deployer, because we need it in order to deploy the fixedrate + erc721Address = deployERC721Contract( + _NftCreateData.name, + _NftCreateData.symbol, + _NftCreateData.templateIndex, + address(this), + address(0), + _NftCreateData.tokenURI, + _NftCreateData.transferable, + _NftCreateData.owner); + erc20Address = IERC721Template(erc721Address).createERC20( + _ErcCreateData.templateIndex, + _ErcCreateData.strings, + _ErcCreateData.addresses, + _ErcCreateData.uints, + _ErcCreateData.bytess + ); + IERC20Template(erc20Address).createDispenser( + _DispenserData.dispenserAddress, + _DispenserData.maxTokens, + _DispenserData.maxBalance, + _DispenserData.withMint, + _DispenserData.allowedSwapper + ); + // remove our selfs from the erc20DeployerRole + IERC721Template(erc721Address).removeFromCreateERC20List(address(this)); +} +``` + +
To make the most of these pricing models, you can rely on user-friendly libraries such as [Ocean.js](../ocean.js/README.md) and [Ocean.py](../ocean.py/README.md), specifically developed for interacting with Ocean Protocol.

With Ocean.js, you can use the [createFRE()](../ocean.js/publish.md) function to effortlessly deploy a data NFT (non-fungible token) and datatoken with a fixed-rate exchange pricing model. Similarly, in Ocean.py, the [create\_url\_asset()](../ocean.py/publish-flow.md#create-an-asset--pricing-schema-simultaneously) function allows you to create an asset with fixed pricing. These libraries simplify the process of interacting with Ocean Protocol, managing pricing, and handling asset creation.

By taking advantage of Ocean Protocol's pricing options and leveraging the capabilities of [Ocean.js](../ocean.js/README.md) and [Ocean.py](../ocean.py/README.md) (or by using the [Market](../../user-guides/using-ocean-market.md)), you can effectively monetize your data assets while ensuring transparent and seamless access for data consumers.

diff --git a/developers/contracts/revenue.md b/developers/contracts/revenue.md
new file mode 100644
index 00000000..84c84572
--- /dev/null
+++ b/developers/contracts/revenue.md
@@ -0,0 +1,49 @@
---
description: Explore and manage the revenue generated from your data NFTs.
---

# Revenue

Having a [data NFT](data-nfts.md) that generates revenue continuously, even when you're not actively involved, is an excellent source of income. This revenue stream allows you to earn consistently without actively dedicating your time and effort. Each time someone buys access to your NFT, you receive money, further enhancing the financial benefits. This steady income allows you to enjoy the rewards of your asset while minimizing the need for constant engagement :moneybag:
_Figure: Make it rain_
By default, the revenue generated from a [data NFT](data-nfts.md) is directed to the [owner](roles.md#nft-owner) of the NFT. This arrangement automatically updates whenever the data NFT is transferred to a new owner.

However, there are scenarios where you may prefer the revenue to be sent to a different account instead of the owner. This can be accomplished by designating a new payment collector. This feature becomes particularly beneficial when the data NFT is owned by an organization or enterprise rather than an individual.

{% hint style="info" %}
There are two templates available: [ERC20Template](datatoken-templates.md#regular-template) and [ERC20TemplateEnterprise](datatoken-templates.md#enterprise-template).

In the case of [ERC20TemplateEnterprise](datatoken-templates.md#enterprise-template), when you deploy a fixed rate exchange, the funds generated as revenue are automatically sent to the owner's address. The owner receives the revenue without any manual intervention.

On the other hand, with [ERC20Template](datatoken-templates.md#regular-template), for a fixed rate exchange, the revenue is available at the fixed rate exchange level. The owner or the payment collector has the authority to manually retrieve the revenue.
{% endhint %}

There are several methods available for establishing a new **payment collector**. You have the option to utilize the ERC20Template/ERC20TemplateEnterprise contract directly. Another approach is to leverage the [ocean.py](../ocean.py/README.md) and [ocean.js](../ocean.js/README.md) libraries. Alternatively, you can employ the network explorer associated with your asset. Lastly, you can directly set it up within the Ocean Market.

Here are some examples of how to set up a new payment collector using the mentioned methods:

1. Using [Ocean.js](https://github.com/oceanprotocol/ocean.js/blob/ae2ff1ccde53ace9841844c316a855de271f9a3f/src/contracts/Datatoken.ts#L393).

```typescript
const datatokenAddress = 'Your datatoken address'
const callerAddress = 'Your wallet address' // the account sending the update
const paymentCollectorAddress = 'New payment collector address'

await datatoken.setPaymentCollector(datatokenAddress, callerAddress, paymentCollectorAddress)
```

2. Using [Ocean.py](https://github.com/oceanprotocol/ocean.py/blob/bad11fb3a4cb00be8bab8febf3173682e1c091fd/ocean_lib/models/test/test_datatoken.py#L39).

```python
payment_collector_address = 'New payment collector address'

datatoken.setPaymentCollector(payment_collector_address, {"from": publisher_wallet})
```

3. Using the [Ocean Market](https://market.oceanprotocol.com/).

Go to the asset detail page, click "Edit Asset", and scroll down to the field called "Payment Collector Address". Add the new Ethereum address in this field and click "Submit". Finally, you will need to sign two transactions to finalize the update.
_Figure: Update payment collector_
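For the direct-contract route mentioned above, a minimal sketch (assuming ethers v6; `setPaymentCollector(address)` is the datatoken template function referenced in the examples) could look like this:

```typescript
import { ethers } from "ethers";

// Minimal ABI fragment for the single call we need.
const DATATOKEN_ABI = ["function setPaymentCollector(address _newPaymentCollector)"];

async function updatePaymentCollector(
  signer: ethers.Signer,    // must be authorized, e.g. the NFT owner
  datatokenAddress: string,
  newCollector: string
): Promise<void> {
  const datatoken = new ethers.Contract(datatokenAddress, DATATOKEN_ABI, signer);
  const tx = await datatoken.setPaymentCollector(newCollector);
  await tx.wait(); // wait until the update is mined
}
```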
diff --git a/developers/contracts/roles.md b/developers/contracts/roles.md new file mode 100644 index 00000000..2380c57b --- /dev/null +++ b/developers/contracts/roles.md @@ -0,0 +1,353 @@ +--- +title: Data NFTs and datatoken roles +description: >- + The permissions stored on chain in the contracts control the access to the + data NFT (ERC721) and datatoken (ERC20) smart contract functions. +--- + +# Roles + +The permissions governing access to the smart contract functions are stored within the [data NFT](data-nfts.md) (ERC721) smart contract. Both the [data NFT](data-nfts.md) (ERC721) and [datatoken](datatokens.md) (ERC20) smart contracts utilize this information to enforce restrictions on certain actions, limiting access to authorized users. The tables below outline the specific actions that are restricted and can only be accessed by allowed users. + +The [data NFT](data-nfts.md) serves as the foundational intellectual property (IP) for the asset, and all datatokens are inherently linked to the data NFT smart contract. This linkage has enabled the introduction of various exciting capabilities related to role administration. + +### NFT Owner + +The NFT owner is the owner of the base-IP and is therefore at the highest level. The NFT owner can perform any action or assign any role but crucially, the NFT owner is the only one who can assign the manager role. Upon deployment or transfer of the data NFT, the NFT owner is automatically added as a manager. The NFT owner is also the only role that can’t be assigned to multiple users — the only way to share this role is via multi-sig or a DAO. + +## Roles-NFT level + +
_Figure: Roles at the data NFT level_
+ +{% hint style="info" %} +With the exception of the NFT owner role, all other roles can be assigned to multiple users. +{% endhint %} + +There are several methods available to assign roles and permissions. One option is to utilize the [ocean.py](../ocean.py/README.md) and [ocean.js](../ocean.js/README.md) libraries that we provide. These libraries offer a streamlined approach for assigning roles and permissions programmatically. + +Alternatively, for a more straightforward solution that doesn't require coding, you can utilize the network explorer of your asset's network. By accessing the network explorer, you can directly interact with the contracts associated with your asset. Below, we provide a few examples to help guide you through the process. + +### Manager + +The ability to add or remove Managers is exclusive to the **NFT Owner**. If you are the NFT Owner and wish to add/remove a new manager, simply call the [addManager](https://github.com/oceanprotocol/contracts/blob/9e29194d910f28a4f0ef17ce6dc8a70741f63309/contracts/templates/ERC721Template.sol#L426)/[removeManager](https://github.com/oceanprotocol/contracts/blob/9e29194d910f28a4f0ef17ce6dc8a70741f63309/contracts/templates/ERC721Template.sol#L438) function within the ERC721Template contract. This function enables you to grant managerial permissions to the designated individual. + +
+ +Add/Remove Manager Contract functions + +```solidity +/** +* @dev addManager +* Only NFT Owner can add a new manager (Roles admin) +* There can be multiple minters +* @param _managerAddress new manager address +*/ + +function addManager(address _managerAddress) external onlyNFTOwner { + _addManager(_managerAddress); +} + +/** +* @dev removeManager +* Only NFT Owner can remove a manager (Roles admin) +* There can be multiple minters +* @param _managerAddress new manager address +*/ +function removeManager(address _managerAddress) external onlyNFTOwner { + _removeManager(_managerAddress); +} +``` + +
+ +The **manager** can assign or revoke three main roles (**deployer, metadata updater, and store updater**). The manager is also able to call any other contract (ERC725X implementation). + +{% embed url="https://app.arcade.software/share/qC8QpkLsFIQk3NxPzB8p" fullWidth="false" %} +{% endembed %} + +### Metadata Updater + +There is also a specific role for updating the metadata. The [Metadata](../metadata.md) updater has the ability to update the information about the data asset (title, description, sample data etc) that is displayed to the user on the asset detail page within the market. + +To add/remove a metadata updater, the manager can use the [addToMetadataList](https://github.com/oceanprotocol/contracts/blob/9e29194d910f28a4f0ef17ce6dc8a70741f63309/contracts/utils/ERC721RolesAddress.sol#L164)/[removeFromMetadataList](https://github.com/oceanprotocol/contracts/blob/9e29194d910f28a4f0ef17ce6dc8a70741f63309/contracts/utils/ERC721RolesAddress.sol#L183) functions from the ERC721RolesAddress. + +
+ +Add/Remove Metadata Updater Contract functions + +```solidity +/** +* @dev addToMetadataList +* Adds metadata role to an user. +* It can be called only by a manager +* @param _allowedAddress user address +*/ +function addToMetadataList(address _allowedAddress) public onlyManager { + _addToMetadataList(_allowedAddress); +} + + +/** +* @dev removeFromMetadataList +* Removes metadata role from an user. +* It can be called by a manager or by the same user, if he already has metadata role +* @param _allowedAddress user address +*/ +function removeFromMetadataList(address _allowedAddress) public { + if(permissions[msg.sender].manager == true || + (msg.sender == _allowedAddress && permissions[msg.sender].updateMetadata == true) + ){ + Roles storage user = permissions[_allowedAddress]; + user.updateMetadata = false; + emit RemovedFromMetadataList(_allowedAddress,msg.sender,block.timestamp,block.number); + _SafeRemoveFromAuth(_allowedAddress); + } + else{ + revert("ERC721RolesAddress: Not enough permissions to remove from metadata list"); + } +} +``` + +
+ +### Store Updater + +The store updater can store, remove or update any arbitrary key value using the ERC725Y implementation (at the ERC721 level). The use case for this role depends a lot on what data is being stored in the ERC725Y key-value pair — as mentioned above, this is highly flexible. + +To add/remove a store updater, the manager can use the [addTo725StoreList](https://github.com/oceanprotocol/contracts/blob/9e29194d910f28a4f0ef17ce6dc8a70741f63309/contracts/utils/ERC721RolesAddress.sol#L61)/[removeFrom725StoreList](https://github.com/oceanprotocol/contracts/blob/9e29194d910f28a4f0ef17ce6dc8a70741f63309/contracts/utils/ERC721RolesAddress.sol#L76) functions from the ERC721RolesAddress. + +
+ +Add/Remove Store Updater Contract functions + +```solidity +/** +* @dev addTo725StoreList +* Adds store role to an user. +* It can be called only by a manager +* @param _allowedAddress user address +*/ +function addTo725StoreList(address _allowedAddress) public onlyManager { + if(_allowedAddress != address(0)){ + Roles storage user = permissions[_allowedAddress]; + user.store = true; + _pushToAuth(_allowedAddress); + emit AddedTo725StoreList(_allowedAddress,msg.sender,block.timestamp,block.number); + } +} + +/** +* @dev removeFrom725StoreList +* Removes store role from an user. +* It can be called by a manager or by the same user, if he already has store role +* @param _allowedAddress user address +*/ +function removeFrom725StoreList(address _allowedAddress) public { + if(permissions[msg.sender].manager == true || + (msg.sender == _allowedAddress && permissions[msg.sender].store == true) + ){ + Roles storage user = permissions[_allowedAddress]; + user.store = false; + emit RemovedFrom725StoreList(_allowedAddress,msg.sender,block.timestamp,block.number); + _SafeRemoveFromAuth(_allowedAddress); + } + else{ + revert("ERC721RolesAddress: Not enough permissions to remove from 725StoreList"); + } +} +``` + +
+ +### ERC20 Deployer + +The Deployer has a bunch of privileges at the ERC20 datatoken level. They can deploy new datatokens with fixed price exchange, or free pricing. They can also update the ERC725Y key-value store and **assign** **roles** at the ERC20 level(datatoken level). + +To add/remove an ERC20 deployer, the manager can use the [addToCreateERC20List](https://github.com/oceanprotocol/contracts/blob/9e29194d910f28a4f0ef17ce6dc8a70741f63309/contracts/utils/ERC721RolesAddress.sol#L111)/[removeFromCreateERC20List](https://github.com/oceanprotocol/contracts/blob/9e29194d910f28a4f0ef17ce6dc8a70741f63309/contracts/utils/ERC721RolesAddress.sol#L129) functions from the ERC721RolesAddress. + +
+ +Add/Remove ERC20 Deployer Contract functions + +```solidity +/** +* @dev addToCreateERC20List +* Adds deployERC20 role to an user. +* It can be called only by a manager +* @param _allowedAddress user address +*/ +function addToCreateERC20List(address _allowedAddress) public onlyManager { + _addToCreateERC20List(_allowedAddress); +} + +/** +* @dev removeFromCreateERC20List +* Removes deployERC20 role from an user. +* It can be called by a manager or by the same user, if he already has deployERC20 role +* @param _allowedAddress user address +*/ +function removeFromCreateERC20List(address _allowedAddress) public { + if(permissions[msg.sender].manager == true || + (msg.sender == _allowedAddress && permissions[msg.sender].deployERC20 == true) + ){ + Roles storage user = permissions[_allowedAddress]; + user.deployERC20 = false; + emit RemovedFromCreateERC20List(_allowedAddress,msg.sender,block.timestamp,block.number); + _SafeRemoveFromAuth(_allowedAddress); + } + else{ + revert("ERC721RolesAddress: Not enough permissions to remove from ERC20List"); + } +} +``` + +
+ +{% hint style="info" %} +To assign/remove all the above roles(ERC20 Deployer, Metadata Updater, or Store Updater), the manager can use the [**addMultipleUsersToRoles**](https://github.com/oceanprotocol/contracts/blob/9e29194d910f28a4f0ef17ce6dc8a70741f63309/contracts/utils/ERC721RolesAddress.sol#L268) function from the ERC721RolesAddress. +{% endhint %} + +
Assign multiple roles at once Contract function

```solidity
/**
* @dev addMultipleUsersToRoles
* Add multiple users to multiple roles
* @param addresses Array of addresses
* @param roles Array of corresponding roles
*/
function addMultipleUsersToRoles(address[] memory addresses, RolesType[] memory roles) external onlyManager {
    require(addresses.length == roles.length && roles.length>0 && roles.length<50, "Invalid array size");
    uint256 i;
    for(i=0; i<roles.length; i++){
        // grant the role in roles[i] to the address in addresses[i]
        if(roles[i] == RolesType.Manager) _addManager(addresses[i]);
        if(roles[i] == RolesType.DeployERC20) _addToCreateERC20List(addresses[i]);
        if(roles[i] == RolesType.UpdateMetadata) _addToMetadataList(addresses[i]);
        if(roles[i] == RolesType.Store) _addTo725StoreList(addresses[i]);
    }
}
```

### Roles & permissions in data NFT (ERC721) smart contract
At the data NFT (ERC721) level, the permission-gated actions (across the roles NFT Owner, Manager, ERC20 Deployer, Store Updater, and Metadata Updater) are:

* Set token URI
* Add manager
* Remove manager
* Clean permissions
* Set base URI
* Set Metadata state
* Set Metadata
* Create new datatoken
* Execute any other smart contract
* Set new key-value in store

The sections above describe which role holds each of these permissions; see the [contracts](https://github.com/oceanprotocol/contracts) for the full matrix.
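Since the role functions quoted above live on the data NFT contract, they can also be called directly from a script. A minimal sketch (assuming ethers v6 and the `addManager`/`addToCreateERC20List` signatures shown earlier):

```typescript
import { ethers } from "ethers";

// ABI fragments for the role functions quoted above.
const NFT_ABI = [
  "function addManager(address _managerAddress)",
  "function addToCreateERC20List(address _allowedAddress)"
];

async function grantRoles(
  nftOwner: ethers.Signer, // the NFT owner is automatically a manager too
  nftAddress: string,
  newManager: string,
  newDeployer: string
): Promise<void> {
  const nft = new ethers.Contract(nftAddress, NFT_ABI, nftOwner);
  await (await nft.addManager(newManager)).wait();            // only the NFT owner
  await (await nft.addToCreateERC20List(newDeployer)).wait(); // only a manager
}
```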
+ +## Roles-datatokens level + +
_Figure: Roles at the datatokens level_
+ +### Minter + +The Minter has the ability to mint new datatokens, provided the limit has not been exceeded. + +To add/remove a minter, the ERC20 deployer can use the [addMinter](https://github.com/oceanprotocol/contracts/blob/9e29194d910f28a4f0ef17ce6dc8a70741f63309/contracts/templates/ERC20Template.sol#L617)/[removeMinter](https://github.com/oceanprotocol/contracts/blob/9e29194d910f28a4f0ef17ce6dc8a70741f63309/contracts/templates/ERC20Template.sol#L628) functions from the ERC20Template. + +
+ +Add/Remove Minter Contract functions + +```solidity +/** +* @dev addMinter +* Only ERC20Deployer (at 721 level) can update. +* There can be multiple minters +* @param _minter new minter address +*/ + +function addMinter(address _minter) external onlyERC20Deployer { + _addMinter(_minter); +} + +/** +* @dev removeMinter +* Only ERC20Deployer (at 721 level) can update. +* There can be multiple minters +* @param _minter minter address to remove +*/ + +function removeMinter(address _minter) external onlyERC20Deployer { + _removeMinter(_minter); +} +``` + +
+ +{% embed url="https://app.arcade.software/share/OHlwsPbf29S1PLh03FM7" fullWidth="false" %} +{% endembed %} + +### Fee Manager + +Finally, we also have a fee manager which has the ability to set a new fee collector — this is the account that will receive the datatokens when a data asset is consumed. If no fee collector account has been set, the **datatokens will be sent by default to the NFT Owner**. + +{% hint style="info" %} +The applicable fees (market and community fees) are automatically deducted from the datatokens that are received. +{% endhint %} + +To add/remove a fee manager, the ERC20 deployer can use the [addPaymentManager](https://github.com/oceanprotocol/contracts/blob/9e29194d910f28a4f0ef17ce6dc8a70741f63309/contracts/templates/ERC20Template.sol#L639)/[removePaymentManager](https://github.com/oceanprotocol/contracts/blob/9e29194d910f28a4f0ef17ce6dc8a70741f63309/contracts/templates/ERC20Template.sol#L653) functions from the ERC20Template. + +
+ +Add/Remove Fee Manager Contract functions + +```solidity +/** +* @dev addPaymentManager (can set who's going to collect fee when consuming orders) +* Only ERC20Deployer (at 721 level) can update. +* There can be multiple paymentCollectors +* @param _paymentManager new minter address +*/ +function addPaymentManager(address _paymentManager) external onlyERC20Deployer +{ + _addPaymentManager(_paymentManager); +} + +/** +* @dev removePaymentManager +* Only ERC20Deployer (at 721 level) can update. +* There can be multiple paymentManagers +* @param _paymentManager _paymentManager address to remove +*/ + +function removePaymentManager(address _paymentManager) external onlyERC20Deployer +{ + _removePaymentManager(_paymentManager); +} +``` + +
{% hint style="info" %}
When the NFT ownership is transferred to another wallet address, all the roles and permissions are [cleared](https://github.com/oceanprotocol/contracts/blob/9e29194d910f28a4f0ef17ce6dc8a70741f63309/contracts/templates/ERC721Template.sol#L511).

```solidity
function cleanPermissions() external onlyNFTOwner {
    _cleanPermissions();
    //Make sure that owner still has permissions
    _addManager(ownerOf(1));
}
```
+{% endhint %} + +### Roles & permission in datatoken (ERC20) smart contract + +
At the datatoken (ERC20) level, the permission-gated actions (across the roles ERC20 Deployer, Minter, NFT Owner, and Fee Manager) are:

* Create Fixed Rate exchange
* Create Dispenser
* Add minter
* Remove minter
* Add fee manager
* Remove fee manager
* Set data
* Clean permissions
* Mint
* Set fee collector

The sections above describe which role holds each of these permissions; see the [contracts](https://github.com/oceanprotocol/contracts) for the full matrix.
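As a minimal illustration of the minter role in action, here is a sketch (assuming ethers v6; the `mint(account, value)` signature is an assumption based on the standard ERC20Template, not quoted from this page):

```typescript
import { ethers } from "ethers";

// Assumed ERC20Template fragment; datatokens use 18 decimals by default.
const DT_ABI = ["function mint(address account, uint256 value)"];

async function mintDatatokens(
  minter: ethers.Signer,    // an account holding the minter role
  datatokenAddress: string,
  to: string,
  amount: string            // e.g. "10"
): Promise<void> {
  const dt = new ethers.Contract(datatokenAddress, DT_ABI, minter);
  const tx = await dt.mint(to, ethers.parseEther(amount));
  await tx.wait();
}
```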
diff --git a/developers/ddo-specification.md b/developers/ddo-specification.md new file mode 100644 index 00000000..ac5417e5 --- /dev/null +++ b/developers/ddo-specification.md @@ -0,0 +1,588 @@ +--- +title: DDO +slug: /developers/ddo/ +section: developers +description: >- + Specification of decentralized identifiers for assets in Ocean Protocol using + the DDO standard. +--- + +# DDO Specification + +### DDO Schema - High Level + +The below diagram shows the high-level DDO schema depicting the content of each data structure and the relations between them. + +Please note that some data structures apply only on certain types of services or assets. + +```mermaid +--- +title: DDO High Level Diagram +--- +classDiagram + + class DDO{ + } + + class Metadata{ + } + class Credentials{ + } + + class AlgorithmMetadata["AlgorithmMetadata\n(for algorithm type)"] { + } + + class Container{ + } + class Service{ + } + class ConsumerParameters["Consumer\nParameters"]{ + } + class Compute{ + } +DDO "1" --> "1" Metadata +DDO "1" --> "0..n" Credentials +DDO "1" --> "1..*" Service + +Metadata "1" --> "0..1" AlgorithmMetadata +AlgorithmMetadata "1" --> "1..*" Container +AlgorithmMetadata "1" --> "1..*" ConsumerParameters + +Service "1" --> "0..n" Compute +Service "1" --> "0..n" ConsumerParameters +``` + +### Required Attributes + +A DDO in Ocean has these required attributes: + +
| Attribute | Type | Description |
| --- | --- | --- |
| `@context` | Array of `string` | Contexts used for validation. |
| `id` | `string` | Computed as `sha256(address of ERC721 contract + chainId)`. |
| `version` | `string` | Version information in SemVer notation referring to this DDO spec version, like `4.1.0`. |
| `chainId` | `number` | Stores the chainId of the network the DDO was published to. |
| `nftAddress` | `string` | NFT contract linked to this asset. |
| `metadata` | Metadata | Stores an object describing the asset. |
| `services` | Services | Stores an array of services defining access to the asset. |
| `credentials` | Credentials | Describes the credentials needed to access a dataset in addition to the services definition. |
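For orientation, the required attributes can be sketched as a TypeScript interface (the nested `Metadata`, `Service`, and `Credentials` shapes are stubbed here; their full fields are specified in the sections below):

```typescript
// Stubs — see the Metadata, Services and Credentials sections for the full shapes.
type Metadata = Record<string, unknown>;
type Service = Record<string, unknown>;
interface Credentials {
  allow?: { type: string; values: string[] }[];
  deny?: { type: string; values: string[] }[];
}

interface DDO {
  "@context": string[]; // contexts used for validation
  id: string;           // e.g. "did:op:...", derived from the ERC721 address + chainId
  version: string;      // DDO spec version, e.g. "4.1.0"
  chainId: number;      // network the DDO was published to
  nftAddress: string;   // NFT contract linked to this asset
  metadata: Metadata;
  services: Service[];
  credentials?: Credentials;
}
```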
+ +
+ +Full Enhanced DDO Example + +{% code overflow="wrap" %} +```json +{ + "@context": ["https://w3id.org/did/v1"], + "id": "did:op:ACce67694eD2848dd683c651Dab7Af823b7dd123", + "version": "4.1.0", + "chainId": 1, + "nftAddress": "0x123", + "metadata": { + "created": "2020-11-15T12:27:48Z", + "updated": "2021-05-17T21:58:02Z", + "description": "Sample description", + "name": "Sample asset", + "type": "dataset", + "author": "OPF", + "license": "https://market.oceanprotocol.com/terms" + }, + "services": [ + { + "id": "1", + "type": "access", + "files": "0x044736da6dae39889ff570c34540f24e5e084f4e5bd81eff3691b729c2dd1465ae8292fc721e9d4b1f10f56ce12036c9d149a4dab454b0795bd3ef8b7722c6001e0becdad5caeb2005859642284ef6a546c7ed76f8b350480691f0f6c6dfdda6c1e4d50ee90e83ce3cb3ca0a1a5a2544e10daa6637893f4276bb8d7301eb35306ece50f61ca34dcab550b48181ec81673953d4eaa4b5f19a45c0e9db4cd9729696f16dd05e0edb460623c843a263291ebe757c1eb3435bb529cc19023e0f49db66ef781ca692655992ea2ca7351ac2882bf340c9d9cb523b0cbcd483731dc03f6251597856afa9a68a1e0da698cfc8e81824a69d92b108023666ee35de4a229ad7e1cfa9be9946db2d909735", + "name": "Download service", + "description": "Download service", + "datatokenAddress": "0x123", + "serviceEndpoint": "https://myprovider.com", + "timeout": 0, + "consumerParameters": [ + { + "name": "surname", + "type": "text", + "label": "Name", + "required": true, + "default": "NoName", + "description": "Please fill your name" + }, + { + "name": "age", + "type": "number", + "label": "Age", + "required": false, + "default": 0, + "description": "Please fill your age" + } + ] + }, + { + "id": "2", + "type": "compute", + "files": "0x044736da6dae39889ff570c34540f24e5e084f4e5bd81eff3691b729c2dd1465ae8292fc721e9d4b1f10f56ce12036c9d149a4dab454b0795bd3ef8b7722c6001e0becdad5caeb2005859642284ef6a546c7ed76f8b350480691f0f6c6dfdda6c1e4d50ee90e83ce3cb3ca0a1a5a2544e10daa6637893f4276bb8d7301eb35306ece50f61ca34dcab550b48181ec81673953d4eaa4b5f19a45c0e9db4cd9729696f16dd05e0edb460623c843a263291ebe757c1eb3435bb529cc19023e0f49db66ef781ca692655992ea2ca7351ac2882bf340c9d9cb523b0cbcd483731dc03f6251597856afa9a68a1e0da698cfc8e81824a69d92b108023666ee35de4a229ad7e1cfa9be9946db2d909735", + "name": "Compute service", + "description": "Compute service", + "datatokenAddress": "0x124", + "serviceEndpoint": "https://myprovider.com", + "timeout": 3600, + "compute": { + "allowRawAlgorithm": false, + "allowNetworkAccess": true, + "publisherTrustedAlgorithmPublishers": ["0x234", "0x235"], + "publisherTrustedAlgorithms": [ + { + "did": "did:op:123", + "filesChecksum": "100", + "containerSectionChecksum": "200" + }, + { + "did": "did:op:124", + "filesChecksum": "110", + "containerSectionChecksum": "210" + } + ] + } + } + ], + "credentials": { + "allow": [ + { + "type": "address", + "values": ["0x123", "0x456"] + } + ], + "deny": [ + { + "type": "address", + "values": ["0x2222", "0x333"] + } + ] + }, + + "nft": { + "address": "0x123", + "name": "Ocean Protocol Asset v4", + "symbol": "OCEAN-A-v4", + "owner": "0x0000000", + "state": 0, + "created": "2000-10-31T01:30:00", + "tokenURI": "xxx" + }, + + "datatokens": [ + { + "address": "0x000000", + "name": "Datatoken 1", + "symbol": "DT-1", + "serviceId": "1" + }, + { + "address": "0x000001", + "name": "Datatoken 2", + "symbol": "DT-2", + "serviceId": "2" + } + ], + + "event": { + "tx": "0x8d127de58509be5dfac600792ad24cc9164921571d168bff2f123c7f1cb4b11c", + "block": 12831214, + "from": "0xAcca11dbeD4F863Bb3bC2336D3CE5BAC52aa1f83", + "contract": "0x1a4b70d8c9DcA47cD6D0Fb3c52BB8634CA1C0Fdf", + "datetime": 
"2000-10-31T01:30:00" + }, + + "purgatory": { + "state": false + }, + + "stats": { + "orders": 4 + } +} +``` +{% endcode %} + +
+ +### Metadata + +This object holds information describing the actual asset. + +
| Attribute | Type | Description |
| --- | --- | --- |
| `created` | ISO date/time `string` | Contains the date of the creation of the dataset content in ISO 8601 format, preferably with timezone designators, e.g. `2000-10-31T01:30:00Z`. |
| `updated` | ISO date/time `string` | Contains the date of last update of the dataset content in ISO 8601 format, preferably with timezone designators, e.g. `2000-10-31T01:30:00Z`. |
| `description`\* | `string` | Details of what the resource is. For a dataset, this attribute explains what the data represents and what it can be used for. |
| `copyrightHolder` | `string` | The party holding the legal copyright. Empty by default. |
| `name`\* | `string` | Descriptive name or title of the asset. |
| `type`\* | `string` | Asset type. Includes `"dataset"` (e.g. csv file), `"algorithm"` (e.g. Python script). Each type needs a different subset of metadata attributes. |
| `author`\* | `string` | Name of the entity generating this data (e.g. Tfl, Disney Corp, etc.). |
| `license`\* | `string` | Short name referencing the license of the asset (e.g. Public Domain, CC-0, CC-BY, No License Specified, etc.). If it's not specified, the following value will be added: "No License Specified". |
| `links` | Array of `string` | Mapping of URL strings for data samples, or links to find out more information. Links may be to either a URL or another asset. |
| `contentLanguage` | `string` | The language of the content. Use one of the language codes from the IETF BCP 47 standard. |
| `tags` | Array of `string` | Array of keywords or tags used to describe this content. Empty by default. |
| `categories` | Array of `string` | Array of categories associated to the asset. Note: it is recommended to use `tags` instead of this. |
| `additionalInformation` | Object | Stores additional information; this is customizable by the publisher. |
| `algorithm`\*\* | Algorithm Metadata | Information about an asset of type algorithm. |
+ +\* Required + +\*\* Required for algorithms only + +
+ +Metadata Example + +```json +{ + "metadata": { + "created": "2020-11-15T12:27:48Z", + "updated": "2021-05-17T21:58:02Z", + "description": "Sample description", + "name": "Sample asset", + "type": "dataset", + "author": "OPF", + "license": "https://market.oceanprotocol.com/terms" + } +} +``` + +
#### Services

Services define the access for an asset, and each service is represented by its respective datatoken.

An asset should have at least one service to be actually accessible, and can have as many services as make sense for a specific use case.
| Attribute | Type | Description |
| --- | --- | --- |
| `id`\* | `string` | Unique ID. |
| `type`\* | `string` | Type of service: `access`, `compute`, `wss`, etc. |
| `name` | `string` | Service friendly name. |
| `description` | `string` | Service description. |
| `datatokenAddress`\* | `string` | Datatoken address. |
| `serviceEndpoint`\* | `string` | Provider URL (schema + host). |
| `files`\* | Files | Encrypted file. |
| `timeout`\* | `number` | Describes how long the service can be used after consumption is initiated. A timeout of 0 represents no time limit. Expressed in seconds. |
| `compute`\*\* | Compute | If service is of type `compute`, holds information about the compute-related privacy settings & resources. |
| `consumerParameters` | Consumer Parameters | An object that defines required consumer input before consuming the asset. |
| `additionalInformation` | Object | Stores additional information; this is customizable by the publisher. |
+ +\* Required + +\*\* Required for compute assets only + +#### Files + +The `files` field is returned as a `string` which holds the encrypted file URLs. + +
+ +Files Example + +{% code overflow="wrap" %} +```json +{ + "files": "0x044736da6dae39889ff570c34540f24e5e084f4e5bd81eff3691b729c2dd1465ae8292fc721e9d4b1f10f56ce12036c9d149a4dab454b0795bd3ef8b7722c6001e0becdad5caeb2005859642284ef6a546c7ed76f8b350480691f0f6c6dfdda6c1e4d50ee90e83ce3cb3ca0a1a5a2544e10daa6637893f4276bb8d7301eb35306ece50f61ca34dcab550b48181ec81673953d4eaa4b5f19a45c0e9db4cd9729696f16dd05e0edb460623c843a263291ebe757c1eb3435bb529cc19023e0f49db66ef781ca692655992ea2ca7351ac2882bf340c9d9cb523b0cbcd483731dc03f6251597856afa9a68a1e0da698cfc8e81824a69d92b108023666ee35de4a229ad7e1cfa9be9946db2d909735" +} +``` +{% endcode %} + +
+ +#### Credentials + +By default, a consumer can access a resource if they have 1 datatoken. _Credentials_ allow the publisher to optionally specify more fine-grained permissions. + +Consider a medical data use case, where only a credentialed EU researcher can legally access a given dataset. Ocean supports this as follows: a consumer can only access the resource if they have 1 datatoken _and_ one of the specified `"allow"` credentials. + +This is like going to an R-rated movie, where you can only get in if you show both your movie ticket (datatoken) _and_ some identification showing you're old enough (credential). + +Only credentials that can be proven are supported. This includes Ethereum public addresses and in the future [W3C Verifiable Credentials](https://www.w3.org/TR/vc-data-model/) and more. + +Ocean also supports `deny` credentials: if a consumer has any of these credentials, they can not access the resource. + +Here's an example object with both `allow` and `deny` entries: + +
+ +Credentials Example + +```json +{ + "credentials": { + "allow": [ + { + "type": "address", + "values": ["0x123", "0x456"] + } + ], + "deny": [ + { + "type": "address", + "values": ["0x2222", "0x333"] + } + ] + } +} +``` + +
+ +#### DDO Checksum + +In order to ensure the integrity of the DDO, a checksum is computed for each DDO: + +```js +const checksum = sha256(JSON.stringify(ddo)); +``` + +The checksum hash is used when publishing/updating metadata using the `setMetaData` function in the ERC721 contract, and is stored in the event generated by the ERC721 contract. + +
+ +MetadataCreated and MetadataUpdated smart contract events + +```solidity +event MetadataCreated( + address indexed createdBy, + uint8 state, + string decryptorUrl, + bytes flags, + bytes data, + bytes metaDataHash, + uint256 timestamp, + uint256 blockNumber +); + +event MetadataUpdated( + address indexed updatedBy, + uint8 state, + string decryptorUrl, + bytes flags, + bytes data, + bytes metaDataHash, + uint256 timestamp, + uint256 blockNumber +); +``` + +
+ +_Aquarius_ should always verify the checksum after data is decrypted via a _Provider_ API call. + +#### State + +Each asset has a state, which is held by the NFT contract. The possible states are: + +
| State | Description | Discoverable in Ocean Market | Ordering allowed | Listed under profile |
| :---: | --- | :---: | :---: | :---: |
| 0 | Active | Yes | Yes | Yes |
| 1 | End-of-life | No | No | No |
| 2 | Deprecated (by another asset) | No | No | No |
| 3 | Revoked by publisher | No | No | No |
| 4 | Ordering is temporarily disabled | Yes | No | Yes |
| 5 | Asset unlisted | No | Yes | Yes |
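These states can be mirrored in client code; a small sketch:

```typescript
// The on-chain asset states listed in the table above.
enum AssetState {
  Active = 0,
  EndOfLife = 1,
  Deprecated = 2,
  RevokedByPublisher = 3,
  OrderingTemporarilyDisabled = 4,
  Unlisted = 5,
}
```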
+ +### Aquarius Enhanced DDO Response + +The following fields are added by _Aquarius_ in its DDO response for convenience reasons, where an asset returned by _Aquarius_ inherits the DDO fields stored on-chain. + +These additional fields are never stored on-chain and are never taken into consideration when [hashing the DDO](ddo-specification.md#ddo-checksum). + +#### NFT + +The `nft` object contains information about the ERC721 NFT contract which represents the intellectual property of the publisher. + +
| Attribute | Type | Description |
| --- | --- | --- |
| `address` | `string` | Contract address of the deployed ERC721 NFT contract. |
| `name` | `string` | Name of NFT set in contract. |
| `symbol` | `string` | Symbol of NFT set in contract. |
| `owner` | `string` | ETH account address of the NFT owner. |
| `state` | `number` | State of the asset reflecting the NFT contract value. See [State](ddo-specification.md#state). |
| `created` | ISO date/time `string` | Contains the date of NFT creation. |
| `tokenURI` | `string` | tokenURI |
+ +
+ +NFT Object Example + +```json +{ + "nft": { + "address": "0x000000", + "name": "Ocean Protocol Asset v4", + "symbol": "OCEAN-A-v4", + "owner": "0x0000000", + "state": 0, + "created": "2000-10-31T01:30:00Z" + } +} +``` + +
+ +#### Datatokens + +The `datatokens` array contains information about the ERC20 datatokens attached to [asset services](ddo-specification.md#services). + +
| Attribute | Type | Description |
| --- | --- | --- |
| `address` | `string` | Contract address of the deployed ERC20 contract. |
| `name` | `string` | Name of datatoken set in contract. |
| `symbol` | `string` | Symbol of datatoken set in contract. |
| `serviceId` | `string` | ID of the service the datatoken is attached to. |
+ +
+ +Datatokens Array Example + +```json +{ + "datatokens": [ + { + "address": "0x000000", + "name": "Datatoken 1", + "symbol": "DT-1", + "serviceId": "1" + }, + { + "address": "0x000001", + "name": "Datatoken 2", + "symbol": "DT-2", + "serviceId": "2" + } + ] +} +``` + +
+ +#### Event + +The `event` section contains information about the last transaction that created or updated the DDO. + +
+ +Event Example + +{% code overflow="wrap" %} +```json +{ + "event": { + "tx": "0x8d127de58509be5dfac600792ad24cc9164921571d168bff2f123c7f1cb4b11c", + "block": 12831214, + "from": "0xAcca11dbeD4F863Bb3bC2336D3CE5BAC52aa1f83", + "contract": "0x1a4b70d8c9DcA47cD6D0Fb3c52BB8634CA1C0Fdf", + "datetime": "2000-10-31T01:30:00" + } +} +``` +{% endcode %} + +
+ +#### Purgatory + +Contains information about an asset's purgatory status defined in [`list-purgatory`](https://github.com/oceanprotocol/list-purgatory). Marketplace interfaces are encouraged to prevent certain user actions like adding liquidity on assets in purgatory. + +
| Attribute | Type | Description |
| --- | --- | --- |
| `state` | `boolean` | If `true`, asset is in purgatory. |
| `reason` | `string` | If asset is in purgatory, contains the reason for being there as defined in `list-purgatory`. |
+ +
+ +Purgatory Example + +```json +{ + "purgatory": { + "state": true, + "reason": "Copyright violation" + } + +} +``` + +```json +{ + "purgatory": { + "state": false + } +} +``` + +
+ +#### Statistics + +The `stats` section contains different statistics fields. + +
| Attribute | Type | Description |
| --- | --- | --- |
| `orders` | `number` | How often an asset was ordered, meaning how often it was either downloaded or used as part of a compute job. |
+ +
+ +Statistics Example + +```json +{ + "stats": { + "orders": 4 + } +} +``` + +
+ +### Compute to data + +For algorithms and datasets that are used for compute to data, there are additional fields and objects within the DDO structure that you need to consider. These include: + +* `compute` attributes +* `publisherTrustedAlgorithms` +* `consumerParameters` + +Details for each of these are explained on the [Compute Options page](compute-to-data/compute-options.md). + +### DDO Schema - Detailed + +The below diagram shows the detailed DDO schema depicting the content of each data structure and the relations between them. + +Please note that some data structures apply only on certain types of services or assets. + +```mermaid +--- +title: DDO Detailed Diagram +--- +classDiagram + + class DDO{ + +@context + +id + +version + +chainId + +nftAddress + +Metadata + +Credentials + +Service + } + + class Metadata{ + +created + +updated + +description + +name + +type ["dataset"/"algorithm"] + +author + +license + +tags + +links + +contentLanguage + +categories + +copyrightHolder + +additionalInformation + +AlgorithmMetadata [for "algorithm" type] + } + class Credentials{ + +allow + +deny + } + + class AlgorithmMetadata["AlgorithmMetadata (for algorithm)"] { + +language + +version + +ConsumerParameters + +Container + } + + class Container{ + +entrypoint + +image + +tag + +checksum + } + class Service{ + +id + +type ["access"/"compute"] + +files + +name + +description + +datatokenAddress + +serviceEndpoint + +timeout + +additionalInformation + +ConsumerParameters + +Compute + } + class ConsumerParameters{ + +type + +name + +label + +required + +description + +default + +options + } + class Compute{ + +publisherTrustedAlgorithms + +publisherTrustedAlgorithmPublishers + } +DDO "1" --> "1" Metadata +DDO "1" --> "1..n" Service +DDO "1" --> "*" Credentials + + +Metadata "1" --> "0..1" AlgorithmMetadata +AlgorithmMetadata "1" --> "1..*" Container +AlgorithmMetadata "1" --> "1..*" ConsumerParameters + +Service "1" --> "0..n" Compute +Service "1" --> "0..n" ConsumerParameters +``` + diff --git a/developers/fg-permissions.md b/developers/fg-permissions.md new file mode 100644 index 00000000..5a0d3e42 --- /dev/null +++ b/developers/fg-permissions.md @@ -0,0 +1,139 @@ +--- +title: Fine-Grained Permissions +description: >- + Fine-Grained Permissions Using Role-Based Access Control. You can Control who + can publish, buy or browse data +--- + +# Fine-Grained Permissions + +A large part of Ocean is about access control, which is primarily handled by datatokens. Users can access a resource (e.g. a file) by redeeming datatokens for that resource. We recognize that enterprises and other users often need more precise ways to specify and manage access, and we have introduced fine-grained permissions for these use cases. Fine-grained permissions mean that access can be controlled precisely at two levels: + +* [Marketplace-level permissions](fg-permissions.md#market-level-permissions) for browsing, downloading or publishing within a marketplace frontend. +* [Asset-level permissions](fg-permissions.md#asset-level-restrictions) on downloading a specific asset. + +The fine-grained permissions features are designed to work in forks of Ocean Market. We have not enabled them in Ocean Market itself, to keep Ocean Market open for everyone to use. On the front end, the permissions features are easily enabled by setting environment variables. + +### Introduction + +Some datasets need to be restricted to appropriately credentialed users. In this situation there is tension: + +1. 
Datatokens on their own aren't enough: the datatokens can be exchanged without any restrictions, which means anyone can acquire them and access the data.
2. We want to retain the datatokens approach, since they enable Ocean users to leverage existing crypto infrastructure, e.g. wallets, exchanges, etc.

We can resolve this tension by drawing on the following analogy:

> Imagine going to an age 18+ rock concert. You can only get in if you show both (a) your concert ticket and (b) an ID showing that you're old enough.

We can port this model into Ocean, where (a) is a datatoken, and (b) is a credential. The datatoken is the baseline access control. It's fungible, and something that you've paid for or had shared to you. It's independent of your identity. The credential is something that's a function of your identity.

The credential-based restrictions are implemented in two ways: at the market level and at the asset level. Access to the market is restricted on a role basis: the user's identity is attached to a role via the role-based access control (RBAC) server. Access to individual assets is restricted via allow and deny lists, which list the Ethereum addresses of the users who can and cannot access the asset within the DDO.

### Asset-Level Restrictions

For asset-level restrictions, Ocean supports allow and deny lists. Allow and deny lists are advanced features that allow publishers to control access to individual data assets. Publishers can restrict assets so that they can only be accessed by approved users (allow lists) or so that they can be accessed by anyone except certain users (deny lists).

When an allow list is in place, a consumer can only access the resource if they have a datatoken and one of the credentials in the "allow" list of the DDO. Ocean also has complementary deny functionality: if a consumer is on the "deny" list, they will not be allowed to access the resource.

Initially, the only credential supported is Ethereum public addresses. To be fair, it's more a pointer to an individual than a credential; but it has a low-complexity implementation, so it makes a good starting point. For extensibility, the Ocean metadata schema enables specification of other types of credentials like W3C Verifiable Credentials and more. When this gets implemented, asset-level permissions will be properly RBAC too. Since asset-level permissions are in the DDO, and the DDO is controlled by the publisher, asset-level restrictions are controlled by the publisher.

### Market-Level Permissions

For market-level permissions, Ocean implements a role-based access control server (RBAC server). It implements restrictions at the user level, based on the user's role (credentials). The RBAC server is run and controlled by the marketplace owner. Therefore, permissions at this level are at the discretion of the marketplace owner.

The RBAC server is the primary mechanism for restricting your users' ability to publish, buy, or browse assets in the market.

#### Roles

The RBAC server defines four different roles:

* Admin
* Publisher
* Consumer
* User

**Admin / Publisher**

Currently, users with either the admin or publisher role will be able to use the Market without any restrictions. They can publish, buy, and browse datasets.

**Consumer**

A user with the consumer role is able to browse datasets, purchase them, trade datatokens, and also contribute to data pools. However, they are not able to publish datasets.
**Users**

Users are able to browse and search datasets, but they are not able to purchase datasets, trade datatokens, or contribute to data pools. They are also not able to publish datasets.

**Address without a role**

If a user attempts to view the data market without a role, or without a wallet connected, they will not be able to view or search any of the datasets.

**No wallet connected**

When the RBAC server is enabled on the market, users are required to have a wallet connected to browse the datasets.

#### Mapping roles to addresses

Currently there are two ways that the RBAC server can be configured to map user roles to Ethereum addresses. The RBAC server is also built in such a way that it is easy for you to add your own authorization service. The two existing methods are:

1. Keycloak

If you already have a [Keycloak](https://www.keycloak.org/) identity and access management server running, you can configure the RBAC server to use it by adding the URL of your Keycloak server to the `KEYCLOAK_URL` environmental variable in the RBAC `.env` file.

2. JSON

Alternatively, if you are not already using Keycloak, the easiest way to map user roles to Ethereum addresses is in a JSON object that is saved as the `JSON_DATA` environmental variable in the RBAC `.env` file. There is an example of the format required for this JSON object in `.example.env`.

It is possible to configure both of these methods of mapping user roles to Ethereum addresses. In this case, the requests to your RBAC server should specify which auth service they are using, e.g. `"authService": "json"` or `"authService": "keycloak"`.

**Default Auth service**

Additionally, you can also set an environmental variable within the RBAC server that specifies the default authorization method that will be used, e.g. `DEFAULT_AUTH_SERVICE = "json"`. When this variable is specified, requests sent to your RBAC server don't need to include an `authService`, and they will automatically use the default authorization method.

#### Running the RBAC server locally

You can start running the RBAC server by following these steps:

1. Clone this repository:

```bash
git clone https://github.com/oceanprotocol/RBAC-Server.git
cd RBAC-Server
```

2. Install the dependencies:

```bash
npm install
```

3. Build the service:

```bash
npm run build
```

4. Start the server:

```bash
npm run start
```

#### Running the RBAC server in Docker

When you are ready to deploy the RBAC server in Docker:

1. Replace the `KEYCLOAK_URL` in the Dockerfile with the correct URL for your hosting of [Keycloak](https://www.keycloak.org/).
2. Run the following command to build the RBAC service in a Docker container:

```bash
npm run build:docker
```

3. Next, run the following command to start running the RBAC service in the Docker container:

```bash
npm run start:docker
```

4. Now you are ready to send requests to the RBAC server via Postman. Make sure to use the URL `http://localhost:49160` in your requests.

diff --git a/developers/fractional-ownership.md b/developers/fractional-ownership.md
new file mode 100644
index 00000000..9112c8bc
--- /dev/null
+++ b/developers/fractional-ownership.md
@@ -0,0 +1,30 @@
---
description: >-
  Exploring fractional ownership in Web3, combining NFTs and DeFi for
  co-ownership of data IP and tokenized DAOs for collective data management.
+---
+
+# Fractional Ownership
+
+Fractional ownership is an exciting subset of Web3, combining NFTs and DeFi. It introduces the concept of co-owning data intellectual property (IP).
+
+Ocean offers two approaches to facilitate fractional ownership:
+
+1. Sharded holding of ERC20 datatokens: Under this approach, each holder of the ERC20 tokens possesses the typical datatoken rights outlined earlier. For instance, owning 1.0 datatokens allows consumption of a particular asset. Ocean conveniently provides this feature out of the box (see the sketch below).
+2. Sharding the ERC721 data NFT: This method involves dividing the ownership of an ERC721 data NFT among multiple individuals, granting each co-owner the right to a portion of the earnings generated from the underlying IP. Moreover, these co-owners collectively control the data NFT. For instance, a dedicated DAO may be established to hold the data NFT, featuring its own ERC20 token. DAO members use their tokens to vote on updates to data NFT roles or on the deployment of ERC20 datatokens associated with the ERC721.
+
+It's worth noting that for the second approach, one might consider utilizing platforms like Niftex for sharding. However, important questions arise in this context:
+
+* What specific rights do shard-holders possess?
+* It's possible that they have limited rights, just as Amazon shareholders don't have the authority to roam the hallways of Amazon's offices simply because they own shares.
+* Additionally, how do shard-holders exercise control over the data NFT?
+
+These concerns are effectively addressed by employing a tokenized DAO, as previously described.
+
+_Figure: DAO_
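+
+For approach 1, no special tooling is needed: a datatoken is a standard ERC20 token, so fractional co-ownership simply means holding fractions of it. Below is a minimal, hypothetical sketch using ethers v5 (which the tutorials in this documentation already install); the RPC URL, private key, and datatoken address are placeholders you would substitute:
+
+{% code overflow="wrap" %}
+```javascript
+const { Contract, Wallet, providers, utils } = require('ethers');
+
+// Hypothetical setup: replace with your own RPC URL, private key, and datatoken address
+const provider = new providers.JsonRpcProvider('https://rpc.example.com');
+const ownerWallet = new Wallet(process.env.PRIVATE_KEY, provider);
+const erc20Abi = ['function transfer(address to, uint256 amount) returns (bool)'];
+const datatoken = new Contract('0x0000000000000000000000000000000000000000', erc20Abi, ownerWallet);
+
+// Send 0.25 datatokens to each co-owner; consuming the asset still requires 1.0 datatokens
+const shardDatatokens = async (coOwners) => {
+  for (const coOwner of coOwners) {
+    const tx = await datatoken.transfer(coOwner, utils.parseEther('0.25'));
+    await tx.wait();
+  }
+};
+```
+{% endcode %}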
+
+Data DAOs present a fascinating use case whenever a group of individuals wants to collectively manage data or consolidate data for increased bargaining power. Such DAOs can take the form of unions, cooperatives, or trusts.
+
+Consider the following example involving a mobile app: you install the app, which includes an integrated crypto wallet. After granting permission for the app to access your location data, it leverages the DAO to sell your anonymized location data on your behalf. The DAO bundles your data with that of thousands of other DAO members, and as a member, you receive a portion of the generated profits.
+
+This use case can manifest in several variations. Each member's data feed could be represented by their own data NFT, accompanied by corresponding datatokens. Alternatively, a single data NFT could aggregate data feeds from all members into a unified feed, which is then fractionally owned through sharded ERC20 tokens (as described in approach 1) or by sharding the ERC721 data NFT (as explained in approach 2). If you're interested in establishing a data union, we recommend reaching out to our associates at [Data Union](https://www.dataunion.app/).
diff --git a/developers/get-api-keys-for-blockchain-access.md b/developers/get-api-keys-for-blockchain-access.md
new file mode 100644
index 00000000..2f2445ba
--- /dev/null
+++ b/developers/get-api-keys-for-blockchain-access.md
@@ -0,0 +1,21 @@
+---
+description: 🧑🏽‍💻 Remote Development Environment for Ocean Protocol
+---
+
+# Get API Keys for Blockchain Access
+
+This article describes an alternative way to configure the Ocean Protocol components (the libraries, Provider, Aquarius, and Subgraph) for remote networks, without using Barge services.
+
+### Get API key for Ethereum node provider
+
+Ocean Protocol's smart contracts are deployed on EVM-compatible networks. Using an API key provided by a third-party Ethereum node provider allows you to interact with Ocean Protocol's smart contracts on the supported networks without requiring you to host a local node.
+
+Choose any API provider of your choice; some commonly used ones are:
+
+* [Infura](https://infura.io/)
+* [Alchemy](https://www.alchemy.com/)
+* [Moralis](https://moralis.io/)
+
+The supported networks are listed [here](../discover/networks/README.md).
+
+Let's configure the remote setup for the mentioned components in the following sections.
diff --git a/developers/identifiers.md b/developers/identifiers.md
new file mode 100644
index 00000000..10224f9c
--- /dev/null
+++ b/developers/identifiers.md
@@ -0,0 +1,42 @@
+---
+description: >-
+  Specification of decentralized identifiers for assets in Ocean Protocol using
+  the DID & DDO standards.
+---
+
+# Identifiers (DIDs)
+
+### Identifiers
+
+In Ocean, we use decentralized identifiers (DIDs) to identify your asset within the network. Decentralized identifiers (DIDs) are a type of identifier that enables verifiable, decentralized digital identity. In contrast to typical, centralized identifiers, DIDs have been designed so that they may be decoupled from centralized registries, identity providers, and certificate authorities. Specifically, while other parties might be used to help enable the discovery of information related to a DID, the design enables the controller of a DID to prove control over it without requiring permission from any other party. DIDs are URIs that associate a DID subject with a DID document, allowing trustable interactions associated with that subject.
+
+{% embed url="https://www.youtube.com/watch?t=95s&v=I06AUNt7ee8" %}
+What is a DID and DDO?
+{% endembed %}
+
+### Examples
+
+DIDs in Ocean follow [the generic DID scheme](https://w3c-ccg.github.io/did-spec/#the-generic-did-scheme); they look like this:
+
+```
+did:op:0ebed8226ada17fde24b6bf2b95d27f8f05fcce09139ff5cec31f6d81a7cd2ea
+```
+
+The part after `did:op:` is the SHA-256 hash of the ERC721 contract address (in checksum format) concatenated with the chain ID (expressed in decimal). The following JavaScript example shows how to calculate the DID for an asset:
+
+```runkit nodeVersion="18.x.x"
+const CryptoJS = require('crypto-js')
+
+const dataNftAddress = '0xa331155197F70e5e1EA0CC2A1f9ddB1D49A9C1De'
+const chainId = 1
+const checksum = CryptoJS.SHA256(dataNftAddress + chainId.toString(10))
+const did = 'did:op:' + checksum
+
+console.log(did)
+```
+
+Before creating a DID, you should first publish a data NFT. We suggest reading the following sections so you are familiar with the process:
+
+* [Creating a data NFT with ocean.js](ocean.js/creating-datanft.md)
+* [Publish flow with ocean.py](ocean.py/publish-flow.md)
diff --git a/developers/metadata.md b/developers/metadata.md
new file mode 100644
index 00000000..7479fdec
--- /dev/null
+++ b/developers/metadata.md
@@ -0,0 +1,86 @@
+---
+description: How can you enhance data discovery?
+---
+
+# Metadata
+
+Metadata plays a **crucial role** in asset **discovery**, providing essential information such as **asset type, name, creation date, and licensing details**. Each data asset can have a [decentralized identifier (DID)](identifiers.md) that resolves to a DID document ([DDO](ddo-specification.md)) containing associated metadata. The DDO is essentially a collection of fields in a [JSON](https://www.json.org/) object. To understand how to work with Ocean DIDs, you can refer to the [DID documentation](identifiers.md). For a more comprehensive understanding of the metadata structure, the [DDO Specification](ddo-specification.md) documentation provides in-depth information.
+
+_Figure: Data discovery_
+
+In general, any dApp within the Ocean ecosystem is required to store metadata for every listed dataset. The metadata is used to determine which datasets are the most relevant.
+
+So, for example, imagine you're searching for data on Spanish almond production in an Ocean-powered dApp. You might find a large number of datasets, making it difficult to identify the most relevant one. What can we do about it? :thinking: This is where metadata is useful! The metadata provides valuable information that helps you identify the most relevant dataset. This information can include:
+
+* **name**, e.g. "Largueta Almond Production: 1995 to 2005"
+* **dateCreated**, e.g. "2007-01-20"
+* **datePublished**, e.g. "2022-11-10T12:32:15Z"
+* **author**, e.g. "Spanish Almond Board"
+* **license**, e.g. "SAB Data License v4"
+* technical information about the **files**, such as the content type.
+
+Other metadata might also be available. For example:
+
+* **categories**, e.g. \["agriculture", "economics"]
+* **tags**, e.g. \["Europe", "Spain", "nuts", "almonds"]
+* **description**, e.g. "2002 Spanish almond production statistics for 14 varieties and 20 regions."
+* **additionalInformation** can be used to store any other facts about the asset.
+
+### **Overview**
+
+DIDs and DDOs follow the [specification defined by the World Wide Web Consortium (W3C)](https://w3c-ccg.github.io/did-spec/).
+
+[**Decentralized identifiers**](identifiers.md) (DIDs) are a type of identifier that enables verifiable, decentralized digital identity. Each DID is associated with a unique entity, and DIDs may represent humans, objects, and more. A **DID Document** (DDO) is a JSON blob that holds information about the DID. Given a DID, a _resolver_ will return the DDO of that DID.
+
+#### Rules for DID & DDO
+
+An _asset_ in Ocean represents a downloadable file, compute service, or similar. Each asset is a _resource_ under the control of a _publisher_. The Ocean network itself does _not_ store the actual resource (e.g. files).
+
+An _asset_ has a DID and DDO. The DDO should include metadata about the asset, and define access in at least one [service](ddo-specification.md#services). Only _owners_ or _delegated users_ can modify the DDO.
+
+All DDOs are stored on-chain in encrypted form to be fully GDPR-compatible. A metadata cache like [_Aquarius_](aquarius/README.md) can help in reading, decrypting, and searching through encrypted DDO data from the chain. Because the file URLs are encrypted on top of the full DDO encryption, returning decrypted DDOs, e.g. via an API, is safe to do, as the file URLs will still stay encrypted.
+
+#### Publishing & Retrieving DDOs
+
+The DDO is stored on-chain as part of the NFT contract and stored in encrypted form using the private key of the [_Provider_](provider/README.md). To resolve it, a metadata cache like [_Aquarius_](aquarius/README.md) must query the [Provider](provider/README.md) to decrypt the DDO.
+
+Here is the flow:
+
+_Figure: DDO Flow_
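+
+Here is a small, hypothetical example of the metadata that ends up inside a DDO. The field names follow the list earlier on this page; the values are illustrative only (see the [DDO Specification](ddo-specification.md) for the authoritative schema):
+
+{% code overflow="wrap" %}
+```javascript
+// Illustrative metadata fragment only; consult the DDO Specification for the required fields
+const metadata = {
+  name: 'Largueta Almond Production: 1995 to 2005',
+  description: '2002 Spanish almond production statistics for 14 varieties and 20 regions.',
+  author: 'Spanish Almond Board',
+  license: 'SAB Data License v4',
+  dateCreated: '2007-01-20',
+  datePublished: '2022-11-10T12:32:15Z',
+  categories: ['agriculture', 'economics'],
+  tags: ['Europe', 'Spain', 'nuts', 'almonds'],
+  additionalInformation: { varieties: 14, regions: 20 }
+};
+```
+{% endcode %}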
+
+To set up the metadata for an asset, you'll need to call the [**setMetaData**](https://github.com/oceanprotocol/contracts/blob/9e29194d910f28a4f0ef17ce6dc8a70741f63309/contracts/templates/ERC721Template.sol#L247) function at the contract level. Its parameters are:
+
+* [**\_metaDataState**](ddo-specification.md#state) - Each asset has a state, which is held by the NFT contract. One of the following: active (0), end-of-life (1), deprecated (2), revoked (3), ordering temporarily disabled (4), and asset unlisted (5).
+* **\_metaDataDecryptorUrl** - You create the DDO and then the Provider encrypts it with its private key. Only that Provider can decrypt it.
+* **\_metaDataDecryptorAddress** - The decryptor address.
+* **flags** - Additional information about how the data is represented. One of the following values: 0 (plain text), 1 (compressed), 2 (encrypted). Used by Aquarius.
+* **data** - The [DDO](ddo-specification.md) of the asset. You create the DDO as JSON, send it to the [Provider](provider/README.md) that encrypts it, and then you set it up at the contract level.
+* **\_metaDataHash** - Hash of the clear data, **generated before the encryption**. It is used by the Provider to check the validity of the data after decryption.
+* **\_metadataProofs** - Array with signatures of entities who validated the data (before the encryption). Pass an empty array if you don't have any.
+
+{% code overflow="wrap" %}
+```solidity
+function setMetaData(
+    uint8 _metaDataState,
+    string calldata _metaDataDecryptorUrl,
+    string calldata _metaDataDecryptorAddress,
+    bytes calldata flags,
+    bytes calldata data,
+    bytes32 _metaDataHash,
+    metaDataProof[] memory _metadataProofs
+) external {
+    require(
+        permissions[msg.sender].updateMetadata,
+        "ERC721Template: NOT METADATA_ROLE"
+    );
+    _setMetaData(
+        _metaDataState,
+        _metaDataDecryptorUrl,
+        _metaDataDecryptorAddress,
+        flags,
+        data,
+        _metaDataHash,
+        _metadataProofs
+    );
+}
+```
+{% endcode %}
+
+{% hint style="info" %}
+While we utilize a specific DDO structure, you have the flexibility to customize it according to your unique requirements. However, to enable seamless processing, it is essential to have your own Aquarius instance that can handle your modified DDO.
+{% endhint %}
+
+{% hint style="info" %}
+As developers, we understand that you eat, breathe, and live code. That's why we invite you to explore our [ocean.py](ocean.py/publish-flow.md#publishing-alternatives) and [ocean.js](ocean.js/update-metadata.md) pages, where you'll find practical examples of how to set up and update metadata for an asset :computer:
+{% endhint %}
+
+You'll find more information about DIDs on the [Identifiers](identifiers.md) page.
diff --git a/developers/obtaining-api-keys-for-blockchain-access.md b/developers/obtaining-api-keys-for-blockchain-access.md
new file mode 100644
index 00000000..a2f9be36
--- /dev/null
+++ b/developers/obtaining-api-keys-for-blockchain-access.md
@@ -0,0 +1,21 @@
+---
+description: 🧑🏽‍💻 Remote Development Environment for Ocean Protocol
+---
+
+# Obtaining API Keys for Blockchain Access
+
+This article describes an alternative way to configure the Ocean Protocol components (the libraries, Provider, Aquarius, and Subgraph) for remote networks, without using Barge services.
+
+### Obtaining API key for Ethereum node provider
+
+Ocean Protocol's smart contracts are deployed on EVM-compatible networks. Using an API key provided by a third-party Ethereum node provider allows you to interact with Ocean Protocol's smart contracts on the supported networks without requiring you to host a local node.
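+
+Once you have an API key, the endpoint URL your provider gives you (which typically embeds the key) is what you place in your environment configuration. As a quick, hypothetical sanity check using ethers v5 (already used throughout the tutorials in this documentation; the endpoint below is a placeholder you must substitute):
+
+{% code overflow="wrap" %}
+```javascript
+const { providers } = require('ethers');
+
+// Placeholder endpoint: your provider's dashboard gives you the real URL + API key
+const provider = new providers.JsonRpcProvider('https://mainnet.infura.io/v3/YOUR_API_KEY');
+
+// Print the latest block number to confirm the connection works
+provider.getBlockNumber().then((block) => console.log(`Latest block: ${block}`));
+```
+{% endcode %}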
+
+Choose any API provider of your choice; some commonly used ones are:
+
+* [Infura](https://infura.io/)
+* [Alchemy](https://www.alchemy.com/)
+* [Moralis](https://moralis.io/)
+
+The supported networks are listed [here](../discover/networks/README.md).
+
+Let's configure the remote setup for the mentioned components in the following sections.
diff --git a/developers/ocean.js/README.md b/developers/ocean.js/README.md
new file mode 100644
index 00000000..1e9770e7
--- /dev/null
+++ b/developers/ocean.js/README.md
@@ -0,0 +1,31 @@
+---
+description: >-
+  JavaScript library to privately & securely publish, exchange, and consume
+  data.
+---
+
+# Ocean.js
+
+With ocean.js, you can:
+
+* **Publish** data services: downloadable files or compute-to-data. Create an ERC721 **data NFT** for each service, and an ERC20 **datatoken** for access (1.0 datatokens per access).
+* **Sell** datatokens for a fixed price. Sell data NFTs.
+* **Transfer** data NFTs & datatokens.
+
+Ocean.js is part of the [Ocean Protocol](https://oceanprotocol.com) toolset.
+
+{% embed url="https://www.youtube.com/watch?v=lqGXPkPUCqI" %}
+Introducing Ocean.js
+{% endembed %}
+
+The Ocean.js library adopts the module architectural pattern, ensuring clear separation and organization of code units. Utilizing ES6 modules simplifies development by allowing you to import only the module you need for a specific task.
+
+Our module structure follows this format:
+
+* Types
+* Config
+* Contracts
+* Services
+* Utils
+
+When working with a particular module, you will need to provide different parameters. To instantiate classes from the contracts module, you must pass objects such as a Signer (the wallet instance) or the contract address you wish to use, depending on the scenario. For the services modules, you will need to provide the provider URI or the metadata cache URI.
diff --git a/developers/ocean.js/cod-asset.md b/developers/ocean.js/cod-asset.md
new file mode 100644
index 00000000..a1853c8e
--- /dev/null
+++ b/developers/ocean.js/cod-asset.md
@@ -0,0 +1,182 @@
+# Run C2D Jobs
+
+**Overview**
+
+Compute-to-Data is a powerful feature of Ocean Protocol that enables privacy-preserving data analysis and computation. With Compute-to-Data, data owners can maintain control over their data while allowing external parties to perform computations on it.
+
+This documentation provides an overview of Compute-to-Data in Ocean Protocol and explains how to use it with Ocean.js. For detailed code examples and implementation details, please refer to the official [Ocean.js](https://github.com/oceanprotocol/ocean.js) GitHub repository.
+
+**Getting Started**
+
+To get started with Compute-to-Data using Ocean.js, follow these steps:
+
+1. **Environment Setup**: Ensure that you have the necessary dependencies and libraries installed to work with Ocean.js. Refer to the Ocean.js documentation for detailed instructions on setting up your development environment.
+2. **Connecting to the Ocean Protocol Network**: Establish a connection to the Ocean Protocol network using Ocean.js. This connection will enable you to interact with the various components of Ocean Protocol, including Compute-to-Data.
+3. **Registering a Compute-to-Data Service**: As a data provider, you can register a Compute-to-Data service using Ocean.js. This process involves specifying the data you want to expose and defining the computation tasks that can be performed on it.
+4. **Searching and Consuming Compute-to-Data Services**: As a data consumer, you can search for Compute-to-Data services available on the Ocean Protocol network. Utilize Ocean.js to discover services based on data types, pricing, and other parameters.
+5. **Executing Computations on Data**: Once you have identified a suitable Compute-to-Data service, use Ocean.js to execute computations on the provided data. The actual computation is performed by the service provider, and the results are securely returned to you.
+
+Please note that the implementation details of Compute-to-Data can vary depending on your specific use case. The code examples available in the Ocean.js GitHub repository provide comprehensive illustrations of working with Compute-to-Data in Ocean Protocol. Visit [ComputeExamples.md](https://github.com/oceanprotocol/ocean.js/blob/main/ComputeExamples.md) for detailed code snippets and explanations that guide you through leveraging Compute-to-Data capabilities.
+
+#### Prerequisites
+
+* [Obtain an API key](../get-api-keys-for-blockchain-access.md)
+* [Set up the .env file](configuration.md#create-a-env-file)
+* [Install the dependencies](configuration.md#setup-dependencies)
+* [Create a configuration file](configuration.md#create-a-configuration-file)
+
+{% hint style="info" %}
+The variables **AQUARIUS\_URL** and **PROVIDER\_URL** should be set correctly in the `.env` file
+{% endhint %}
+
+#### Create a script that starts a compute-to-data job using an already published dataset and algorithm
+
+Create a new file in the same working directory where the configuration file (`config.js`) and `.env` file are present, and copy the code as listed below.
+
+{% code overflow="wrap" %}
+```javascript
+// Note: Make sure the .env file and config.js are created and set up correctly
+const { oceanConfig } = require('./config.js');
+const {
+  ProviderInstance,
+  FixedRateExchange,
+  Datatoken,
+  approve,
+  approveWei
+} = require('@oceanprotocol/lib');
+
+// Replace the DIDs here with your own published dataset and algorithm
+const datasetDid = "did:op:a419f07306d71f3357f8df74807d5d12bddd6bcd738eb0b461470c64859d6f0f";
+const algorithmDid = "did:op:a419f07306d71f3357f8df74807d5d12bddd6bcd738eb0b461470c64859d6f0f";
+
+// Replace these with the fixed-rate exchange ids of the dataset and the algorithm
+const datasetFreId = "0x...";
+const algoFreId = "0x...";
+
+// Helper adapted from ocean.js' ComputeExamples.md: pays any provider fees,
+// then starts (or reuses) an order and returns the order transaction hash
+const handleOrder = async (order, datatokenAddress, payerAccount, consumerAddress, serviceIndex, config) => {
+  const datatoken = new Datatoken(payerAccount);
+  // Pay the provider fee, if any, in the fee token
+  if (order.providerFee && order.providerFee.providerFeeAmount) {
+    await approveWei(
+      payerAccount,
+      config,
+      await payerAccount.getAddress(),
+      order.providerFee.providerFeeToken,
+      datatokenAddress,
+      order.providerFee.providerFeeAmount
+    );
+  }
+  // If a valid previous order exists, reuse it instead of paying again
+  if (order.validOrder) {
+    if (!order.providerFee) return order.validOrder;
+    const reuseTx = await datatoken.reuseOrder(datatokenAddress, order.validOrder, order.providerFee);
+    return (await reuseTx.wait()).transactionHash;
+  }
+  const orderTx = await datatoken.startOrder(datatokenAddress, consumerAddress, serviceIndex, order.providerFee);
+  return (await orderTx.wait()).transactionHash;
+};
+
+// This function takes the dataset and algorithm DIDs as parameters
+// and starts a compute job for them
+const startComputeJob = async (datasetDid, algorithmDid) => {
+
+  const config = await oceanConfig();
+  const consumerAccount = config.consumerAccount;
+  const providerUrl = config.providerUri;
+
+  // Fetch the dataset and the algorithm from Aquarius
+  const dataset = await config.aquarius.resolve(datasetDid);
+  const algorithm = await config.aquarius.resolve(algorithmDid);
+
+  // Fetch the compute environments and choose the free one
+  const computeEnvs = await ProviderInstance.getComputeEnvironments(providerUrl);
+  const computeEnv = computeEnvs[dataset.chainId].find(
+    (ce) => ce.priceMin === 0
+  );
+
+  // Request five minutes of compute access
+  const mytime = new Date();
+  const computeMinutes = 5;
+  mytime.setMinutes(mytime.getMinutes() + computeMinutes);
+  const computeValidUntil = Math.floor(mytime.getTime() / 1000);
+
+  // Describe the dataset and the algorithm for the compute job
+  const assets = [{
+    documentId: dataset.id,
+    serviceId: dataset.services[0].id
+  }];
+
+  const algo = {
+    documentId: algorithm.id,
+    serviceId: algorithm.services[0].id
+  };
+
+  // Initialize the provider for the compute job
+  const providerInitializeComputeResults = await ProviderInstance.initializeCompute(
+    assets,
+    algo,
+    computeEnv.id,
+    computeValidUntil,
+    providerUrl,
+    await consumerAccount.getAddress()
+  );
+
+  // Allow the fixed-rate exchange to spend OCEAN, then buy one datatoken
+  // each for the dataset and the algorithm
+  await approve(
+    consumerAccount,
+    config,
+    await consumerAccount.getAddress(),
+    config.oceanTokenAddress,
+    config.fixedRateExchangeAddress,
+    '100'
+  );
+
+  const fixedRate = new FixedRateExchange(config.fixedRateExchangeAddress, consumerAccount);
+  await (await fixedRate.buyDatatokens(datasetFreId, '1', '2')).wait();
+  await (await fixedRate.buyDatatokens(algoFreId, '1', '2')).wait();
+
+  // We now order both the dataset and the algorithm
+  algo.transferTxId = await handleOrder(
+    providerInitializeComputeResults.algorithm,
+    algorithm.services[0].datatokenAddress,
+    consumerAccount,
+    computeEnv.consumerAddress,
+    0,
+    config
+  );
+
+  assets[0].transferTxId = await handleOrder(
+    providerInitializeComputeResults.datasets[0],
+    dataset.services[0].datatokenAddress,
+    consumerAccount,
+    computeEnv.consumerAddress,
+    0,
+    config
+  );
+
+  // Start the compute job for the given dataset and algorithm
+  const computeJobs = await ProviderInstance.computeStart(
+    providerUrl,
+    consumerAccount,
+    computeEnv.id,
+    assets[0],
+    algo
+  );
+
+  return computeJobs[0].jobId;
+};
+
+// Poll the job status until the job has finished (status 70 means finished)
+const checkIfJobFinished = async (jobId) => {
+  const config = await oceanConfig();
+  const jobStatus = await ProviderInstance.computeStatus(
+    config.providerUri,
+    await config.consumerAccount.getAddress(),
+    jobId,
+    datasetDid
+  );
+  if (jobStatus?.status === 70) return true;
+  // Wait five seconds before polling again
+  await new Promise((resolve) => setTimeout(resolve, 5000));
+  return checkIfJobFinished(jobId);
+};
+
+// Fetch the URL from which the first compute result can be downloaded
+const downloadComputeResults = async (jobId) => {
+  const config = await oceanConfig();
+  const downloadURL = await ProviderInstance.getComputeResultUrl(
+    config.providerUri,
+    config.consumerAccount,
+    jobId,
+    0
+  );
+  console.log(`Compute results URL: ${downloadURL}`);
+};
+
+// Call the startComputeJob(...), checkIfJobFinished(...) and
+// downloadComputeResults(...) functions defined above, in that order
+startComputeJob(datasetDid, algorithmDid).then((jobId) => {
+  checkIfJobFinished(jobId).then(() => {
+    downloadComputeResults(jobId).then(() => {
+      process.exit();
+    });
+  });
+}).catch((err) => {
+  console.error(err);
+  process.exit(1);
+});
+```
+{% endcode %}
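+
+Run the script (assuming you saved it as, e.g., `start_compute_job.js`):
+
+```bash
+node start_compute_job.js
+```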
+
diff --git a/developers/ocean.js/configuration.md b/developers/ocean.js/configuration.md
new file mode 100644
index 00000000..ea98009e
--- /dev/null
+++ b/developers/ocean.js/configuration.md
@@ -0,0 +1,174 @@
+# Configuration
+
+For obtaining the API keys for blockchain access and setting the correct environment variables, please consult the [Get API Keys for Blockchain Access](../get-api-keys-for-blockchain-access.md) section first, and then proceed with the next steps.
+
+### Create a directory
+
+Let's start with creating a working directory where we store the environment variable file, configuration files, and the scripts.
+
+```bash
+mkdir my-ocean-project
+cd my-ocean-project
+```
+
+### Create a `.env` file
+
+In the working directory create a `.env` file. The content of this file will store the values for the following variables:
+
+| Variable name | Description | Required |
+| ------------- | ----------- | -------- |
+| `OCEAN_NETWORK` | Name of the network where the Ocean Protocol's smart contracts are deployed. | Yes |
+| `OCEAN_NETWORK_URL` | The URL of the Ethereum node (along with the API key for non-local networks). | Yes |
+| `PRIVATE_KEY` | The private key of the account which you want to use. A private key is made up of 64 hex characters. Make sure you have sufficient balance to pay for the transaction fees. | Yes |
+| `AQUARIUS_URL` | The URL of Aquarius. This value is needed when reading an asset from the off-chain store. | No |
+| `PROVIDER_URL` | The URL of the Provider. This value is needed when publishing a new asset or updating an existing asset. | No |
+
+{% hint style="info" %}
+Treat this file as a secret and do not commit it to git or share its content publicly. If you are using git, include this file name in your `.gitignore` file.
+{% endhint %}
+
+The tabs below show partially filled `.env` file content for some of the supported networks.
+
+{% tabs %}
+{% tab title="Mainnet" %}
+{% code title=".env" %}
+```bash
+# Mandatory environment variables
+
+OCEAN_NETWORK=mainnet
+OCEAN_NETWORK_URL=
+PRIVATE_KEY=
+
+# Optional environment variables
+
+AQUARIUS_URL=https://v4.aquarius.oceanprotocol.com/
+PROVIDER_URL=https://v4.provider.oceanprotocol.com
+```
+{% endcode %}
+{% endtab %}
+
+{% tab title="Polygon" %}
+{% code title=".env" %}
+```bash
+# Mandatory environment variables
+
+OCEAN_NETWORK=polygon
+OCEAN_NETWORK_URL=
+PRIVATE_KEY=
+
+# Optional environment variables
+
+AQUARIUS_URL=https://v4.aquarius.oceanprotocol.com/
+PROVIDER_URL=https://v4.provider.oceanprotocol.com
+```
+{% endcode %}
+{% endtab %}
+
+{% tab title="Local (using Barge)" %}
+{% code title=".env" %}
+```bash
+# Mandatory environment variables
+OCEAN_NETWORK=development
+OCEAN_NETWORK_URL=http://172.15.0.3:8545/
+AQUARIUS_URL=http://172.15.0.5:5000
+PROVIDER_URL=http://172.15.0.4:8030
+
+# Replace PRIVATE_KEY if needed
+PRIVATE_KEY=0xc594c6e5def4bab63ac29eed19a134c130388f74f019bc74b8f4389df2837a58
+```
+{% endcode %}
+{% endtab %}
+{% endtabs %}
+
+Fill in the empty values with the ones appropriate for your setup. You can see the configuration for all networks in ocean.js' [config helper](https://github.com/oceanprotocol/ocean.js/blob/main/src/config/ConfigHelper.ts#L42).
+
+### Setup dependencies
+
+In this step, all required dependencies will be installed.
+
+### Installation & Usage
+
+Let's install the ocean.js library into your current project by running:
+
+{% tabs %}
+{% tab title="Terminal" %}
+{% code overflow="wrap" %}
+```bash
+npm init
+npm i @oceanprotocol/lib@latest dotenv crypto-js ethers@5.7.4 @truffle/hdwallet-provider
+```
+{% endcode %}
+{% endtab %}
+{% endtabs %}
+
+### Create a configuration file
+
+A configuration file reads the content of the `.env` file and initializes the required configuration objects, which will be used in the further tutorials. The script below creates a wallet instance and an Ocean configuration object.
+
+Create the configuration file in the working directory, i.e. at the same path where the `.env` file is located.
+
+{% tabs %}
+{% tab title="config.js" %}
+{% code title="config.js" %}
+```javascript
+require('dotenv').config();
+const fs = require('fs');
+const { homedir } = require('os');
+const { Aquarius, ConfigHelper, configHelperNetworks } = require('@oceanprotocol/lib');
+const { providers } = require('ethers');
+const ethers = require('ethers');
+
+async function oceanConfig(){
+
+  // Get configuration for the given network
+  const provider = new providers.JsonRpcProvider(
+    process.env.OCEAN_NETWORK_URL || configHelperNetworks[1].nodeUri
+  )
+
+  // Create a wallet (signer) from the private key, connected to the provider
+  const wallet = new ethers.Wallet(
+    process.env.PRIVATE_KEY,
+    provider
+  );
+
+  const publisherAccount = wallet;
+  // These tutorials use the same account for publishing and consuming
+  const consumerAccount = wallet;
+
+  let oceanConfig = new ConfigHelper().getConfig(
+    parseInt(String((await publisherAccount.provider.getNetwork()).chainId))
+  )
+  const aquarius = new Aquarius(process.env.AQUARIUS_URL || oceanConfig?.metadataCacheUri)
+
+  // If using the local development environment, read the addresses from the local file.
+  // The local deployment address file can be generated using Barge.
+  if (process.env.OCEAN_NETWORK === 'development') {
+    const addresses = JSON.parse(
+      // eslint-disable-next-line security/detect-non-literal-fs-filename
+      fs.readFileSync(
+        process.env.ADDRESS_FILE ||
+          `${homedir()}/.ocean/ocean-contracts/artifacts/address.json`,
+        'utf8'
+      )
+    ).development
+
+    oceanConfig = {
+      ...oceanConfig,
+      oceanTokenAddress: addresses.Ocean,
+      poolTemplateAddress: addresses.poolTemplate,
+      fixedRateExchangeAddress: addresses.FixedPrice,
+      dispenserAddress: addresses.Dispenser,
+      nftFactoryAddress: addresses.ERC721Factory,
+      sideStakingAddress: addresses.Staking,
+      opfCommunityFeeCollector: addresses.OPFCommunityFeeCollector
+    };
+  }
+
+  oceanConfig = {
+    ...oceanConfig,
+    // Let the .env values override the defaults where provided
+    metadataCacheUri: process.env.AQUARIUS_URL || oceanConfig.metadataCacheUri,
+    providerUri: process.env.PROVIDER_URL || oceanConfig.providerUri,
+    publisherAccount: publisherAccount,
+    consumerAccount: consumerAccount,
+    aquarius: aquarius
+  };
+
+  return oceanConfig
+};
+
+module.exports = {
+  oceanConfig
+}
+```
+{% endcode %}
+{% endtab %}
+{% endtabs %}
+
+Now you have set up the necessary files and configurations to interact with Ocean Protocol's smart contracts using ocean.js. You can proceed with the further tutorials or development using these configurations.
diff --git a/developers/ocean.js/consume-asset.md b/developers/ocean.js/consume-asset.md
new file mode 100644
index 00000000..85fe56eb
--- /dev/null
+++ b/developers/ocean.js/consume-asset.md
@@ -0,0 +1,115 @@
+# Consume Asset
+
+Consuming an asset involves a two-step process: **placing an order** and then **using the order** transaction to **download** and **access** the asset's files. Let's delve into each step in more detail.
+
+To initiate the ordering process, there are two scenarios, depending on the pricing schema of the asset. First, if the asset has a fixed-rate pricing schema configured, you need to acquire the corresponding datatoken by purchasing it. Once you have obtained the datatoken, you send it to the publisher to place the order for the asset.
+
+The second scenario applies when the asset follows a free pricing schema. In this case, you can obtain a free datatoken from the dispenser service provided by Ocean Protocol. Using the acquired free datatoken, you can place the order for the desired asset.
+
+However, it's crucial to note that even when utilizing free assets, network gas fees still apply. These fees cover the costs associated with executing transactions on the blockchain network.
+
+Additionally, the specific type of datatoken associated with an asset influences the ordering process. There are two common datatoken templates: Template 1 (regular template) and Template 2 (enterprise template). The type of template determines the sequence of method calls required before placing an order.
+
+For assets utilizing Template 1, prior to ordering you need to perform two separate method calls. First, you call the `approve` method to grant the fixed-rate exchange contract permission to spend the required amount of base tokens (e.g. OCEAN). Then, you call the `buyDatatokens` method on the fixed-rate exchange contract. This process ensures that you have the necessary datatokens in your possession to successfully place the order. Alternatively, if the asset follows a free pricing schema, you can employ the `dispenser.dispense` method to obtain the free datatoken before proceeding with the order.
+
+On the other hand, assets utilizing Template 2 offer bundled methods for a more streamlined approach. For ordering such assets, you can use methods like `buyFromFreAndOrder` or `buyFromDispenserAndOrder`. These bundled methods handle the acquisition of the necessary datatokens and the subsequent ordering process in a single step, simplifying the workflow for enterprise-template assets.
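+
+To make the contrast concrete, here is a rough, hypothetical sketch of the bundled Template 2 call via ocean.js' `Datatoken` class. The parameter shapes are simplified assumptions and may differ between ocean.js versions, so treat this as orientation only and consult ocean.js' [CodeExamples.md](https://github.com/oceanprotocol/ocean.js/blob/main/CodeExamples.md) for the exact structures (`initializeData` is the result of `ProviderInstance.initialize`, and `exchangeId` is the asset's fixed-rate exchange id):
+
+{% code overflow="wrap" %}
+```javascript
+const { Datatoken } = require('@oceanprotocol/lib');
+
+// Rough sketch only: order a Template 2 asset in one call.
+// All parameter values here are illustrative assumptions, not a definitive implementation.
+const orderEnterpriseAsset = async (consumerAccount, config, asset, initializeData, exchangeId) => {
+  const datatoken = new Datatoken(consumerAccount);
+  const zero = '0x0000000000000000000000000000000000000000';
+  const orderParams = {
+    consumer: await consumerAccount.getAddress(),
+    serviceIndex: 0,
+    _providerFee: initializeData.providerFee,
+    _consumeMarketFee: {
+      consumeMarketFeeAddress: zero,
+      consumeMarketFeeToken: zero,
+      consumeMarketFeeAmount: '0'
+    }
+  };
+  const freParams = {
+    exchangeContract: config.fixedRateExchangeAddress,
+    exchangeId: exchangeId,
+    maxBaseTokenAmount: '2',
+    baseTokenAddress: config.oceanTokenAddress,
+    baseTokenDecimals: 18,
+    swapMarketFee: '0',
+    marketFeeAddress: zero
+  };
+  // Buys 1 datatoken from the fixed-rate exchange and starts the order in one transaction
+  const tx = await datatoken.buyFromFreAndOrder(asset.services[0].datatokenAddress, orderParams, freParams);
+  return (await tx.wait()).transactionHash;
+};
+```
+{% endcode %}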
+
+Placing the order produces an order transaction identifier. Later on, when working with the ocean.js library, you use that identifier to call the `getDownloadUrl` method of the provider service class, which retrieves the download URL for accessing the asset's files.
+
+#### Prerequisites
+
+* [Obtain an API key](../get-api-keys-for-blockchain-access.md)
+* [Set up the .env file](configuration.md#create-a-env-file)
+* [Install the dependencies](configuration.md#setup-dependencies)
+* [Create a configuration file](configuration.md#create-a-configuration-file)
+
+{% hint style="info" %}
+The variables **AQUARIUS\_URL** and **PROVIDER\_URL** should be set correctly in the `.env` file
+{% endhint %}
+
+#### Create a script to consume an asset
+
+Create a new file in the same working directory where the configuration file (`config.js`) and `.env` file are present, and copy the code as listed below.
+
+{% code overflow="wrap" %}
+```javascript
+// Note: Make sure the .env file and config.js are created and set up correctly
+const { oceanConfig } = require('./config.js');
+const {
+  ProviderInstance,
+  FixedRateExchange,
+  Datatoken,
+  approve,
+  getEventFromTx
+} = require('@oceanprotocol/lib');
+
+// Replace the DID here with your own published asset
+const did = "did:op:a419f07306d71f3357f8df74807d5d12bddd6bcd738eb0b461470c64859d6f0f";
+
+// Replace this with the fixed-rate exchange id of the asset's datatoken
+const fixedRateId = "0x...";
+
+// This function takes a DID as a parameter, orders the asset and
+// fetches a download URL for its files
+const consumeAsset = async (did) => {
+
+  const config = await oceanConfig();
+  const consumerAccount = config.consumerAccount;
+  const providerUrl = config.providerUri;
+
+  // Fetch the DDO from Aquarius
+  const asset = await config.aquarius.resolve(did);
+
+  // Allow the fixed-rate exchange to spend 1 OCEAN, then buy 1 datatoken
+  await approve(
+    consumerAccount,
+    config,
+    await consumerAccount.getAddress(),
+    config.oceanTokenAddress,
+    config.fixedRateExchangeAddress,
+    '1'
+  );
+
+  const fixedRate = new FixedRateExchange(config.fixedRateExchangeAddress, consumerAccount);
+  const buyTx = await fixedRate.buyDatatokens(fixedRateId, '1', '2');
+  await buyTx.wait();
+
+  // Initialize the provider to obtain the provider fees for the order
+  const initializeData = await ProviderInstance.initialize(
+    asset.id,
+    asset.services[0].id,
+    0,
+    await consumerAccount.getAddress(),
+    providerUrl
+  );
+
+  const providerFees = {
+    providerFeeAddress: initializeData.providerFee.providerFeeAddress,
+    providerFeeToken: initializeData.providerFee.providerFeeToken,
+    providerFeeAmount: initializeData.providerFee.providerFeeAmount,
+    v: initializeData.providerFee.v,
+    r: initializeData.providerFee.r,
+    s: initializeData.providerFee.s,
+    providerData: initializeData.providerFee.providerData,
+    validUntil: initializeData.providerFee.validUntil
+  };
+
+  // Place the order by spending 1 datatoken
+  const datatoken = new Datatoken(consumerAccount);
+
+  const orderTx = await datatoken.startOrder(
+    asset.services[0].datatokenAddress,
+    await consumerAccount.getAddress(),
+    0,
+    providerFees
+  );
+
+  const orderReceipt = await orderTx.wait();
+  const orderStartedTx = getEventFromTx(orderReceipt, 'OrderStarted');
+
+  // Use the order transaction to fetch the download URL from the Provider
+  const downloadURL = await ProviderInstance.getDownloadUrl(
+    asset.id,
+    asset.services[0].id,
+    0,
+    orderStartedTx.transactionHash,
+    providerUrl,
+    consumerAccount
+  );
+
+  console.log(`Resolved asset did [${asset.id}] from Aquarius.`);
+  console.log(`Download URL: [${downloadURL}].`);
+
+};
+
+// Call the consumeAsset(...) function defined above
+consumeAsset(did).then(() => {
+  process.exit();
+}).catch((err) => {
+  console.error(err);
+  process.exit(1);
+});
+```
+{% endcode %}
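+
+Run the script (assuming you saved it as, e.g., `consume_asset.js`):
+
+```bash
+node consume_asset.js
+```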
diff --git a/developers/ocean.js/creating-datanft.md b/developers/ocean.js/creating-datanft.md
new file mode 100644
index 00000000..f2675963
--- /dev/null
+++ b/developers/ocean.js/creating-datanft.md
@@ -0,0 +1,78 @@
+# Creating a data NFT
+
+This tutorial guides you through the process of creating your own data NFT using the Ocean libraries. To learn more about data NFTs, please refer to [this page](../contracts/data-nfts.md).
+
+#### Prerequisites
+
+* [Obtain an API key](../get-api-keys-for-blockchain-access.md)
+* [Set up the .env file](configuration.md#create-a-env-file)
+* [Install the dependencies](configuration.md#setup-dependencies)
+* [Create a configuration file](configuration.md#create-a-configuration-file)
+
+#### Create a script to deploy a data NFT
+
+The provided script demonstrates how to create a data NFT using ocean.js.
+
+First, create a new file in the working directory, alongside the `config.js` and `.env` files. Name it `create_dataNFT.js` (or any appropriate name). Then, copy the following code into the newly created file:
+
+{% tabs %}
+{% tab title="create_dataNFT.js" %}
+{% code title="create_dataNFT.js" overflow="wrap" %}
+```javascript
+// Note: Make sure the .env file and config.js are created and set up correctly
+const { oceanConfig } = require('./config.js');
+const { NftFactory, getEventFromTx } = require('@oceanprotocol/lib');
+
+// Define a function which will create a data NFT using the ocean.js library
+const createDataNFT = async () => {
+  let config = await oceanConfig();
+  // Create an NftFactory
+  const factory = new NftFactory(config.nftFactoryAddress, config.publisherAccount);
+
+  const publisherAddress = await config.publisherAccount.getAddress();
+
+  // Define data NFT parameters
+  const nftParams = {
+    name: '72120Bundle',
+    symbol: '72Bundle',
+    // Optional parameters
+    templateIndex: 1,
+    tokenURI: 'https://example.com',
+    transferable: true,
+    owner: publisherAddress
+  };
+
+  const bundleNFT = await factory.createNFT(nftParams);
+
+  const trxReceipt = await bundleNFT.wait();
+
+  // Extract the address of the newly created data NFT from the NFTCreated event
+  const nftCreatedEvent = getEventFromTx(trxReceipt, 'NFTCreated');
+  const nftAddress = nftCreatedEvent.args.newTokenAddress;
+
+  return {
+    nftAddress,
+    trxReceipt
+  };
+};
+
+// Call the createDataNFT() function
+createDataNFT()
+  .then(({ nftAddress }) => {
+    console.log(`DataNft address ${nftAddress}`);
+    process.exit();
+  })
+  .catch((err) => {
+    console.error(err);
+    process.exit(1);
+  });
+```
+{% endcode %}
+
+Run script:
+
+```bash
+node create_dataNFT.js
+```
+{% endtab %}
+{% endtabs %}
+
+* Check out our [code examples](https://github.com/oceanprotocol/ocean.js/blob/main/CodeExamples.md#L0-L1) or [compute-to-data examples](https://github.com/oceanprotocol/ocean.js/blob/main/ComputeExamples.md#L417) to see how you can use ocean.js.
+* If you have any difficulties or further questions about how to use ocean.js, please reach out to us on [Discord](https://discord.gg/TnXjkR5).
+* If you notice any bugs or issues with ocean.js, please [open an issue on GitHub](https://github.com/oceanprotocol/ocean.js/issues/new?assignees=\&labels=bug\&template=bug\_report.md\&title=).
+* Visit the [Ocean Protocol website](https://docs.oceanprotocol.com/) for general information about Ocean Protocol.
diff --git a/developers/ocean.js/mint-datatoken.md b/developers/ocean.js/mint-datatoken.md
new file mode 100644
index 00000000..971794b1
--- /dev/null
+++ b/developers/ocean.js/mint-datatoken.md
@@ -0,0 +1,85 @@
+# Mint Datatokens
+
+This tutorial guides you through the process of minting datatokens and sending them to a receiver address. The tutorial assumes that you already have the address of a datatoken contract that you own.
+
+#### Prerequisites
+
+* [Obtain an API key](../get-api-keys-for-blockchain-access.md)
+* [Set up the .env file](configuration.md#create-a-env-file)
+* [Install the dependencies](configuration.md#setup-dependencies)
+* [Create a configuration file](configuration.md#create-a-configuration-file)
+
+#### Create a script to mint datatokens
+
+Create a new file in the same working directory where the configuration file (`config.js`) and `.env` file are present, and copy the code as listed below.
+
+{% tabs %}
+{% tab title="mint_datatoken.js" %}
+{% code title="mint_datatoken.js" overflow="wrap" %}
+```javascript
+// Note: Make sure the .env file and config.js are created and set up correctly
+const { oceanConfig } = require('./config.js');
+const { amountToUnits, sendTx } = require('@oceanprotocol/lib');
+const ethers = require('ethers');
+
+// Define a function createMINT() which mints tokens to the publisher
+const createMINT = async () => {
+
+  let config = await oceanConfig();
+  const publisher = config.publisherAccount;
+  const publisherAddress = await config.publisherAccount.getAddress();
+
+  // Minimal ABI containing only the mint function
+  const minAbi = [
+    {
+      constant: false,
+      inputs: [
+        { name: 'to', type: 'address' },
+        { name: 'value', type: 'uint256' }
+      ],
+      name: 'mint',
+      outputs: [{ name: '', type: 'bool' }],
+      payable: false,
+      stateMutability: 'nonpayable',
+      type: 'function'
+    }
+  ];
+
+  // Replace this with the address of the datatoken contract you control.
+  // (On Barge, the mock OCEAN token at config.oceanTokenAddress is also mintable.)
+  const tokenAddress = config.oceanTokenAddress;
+
+  const tokenContract = new ethers.Contract(tokenAddress, minAbi, publisher);
+  const estGasPublisher = await tokenContract.estimateGas.mint(
+    publisherAddress,
+    await amountToUnits(null, null, '1000', 18)
+  );
+  const trxReceipt = await sendTx(
+    estGasPublisher,
+    publisher,
+    1,
+    tokenContract.mint,
+    publisherAddress,
+    await amountToUnits(null, null, '1000', 18)
+  );
+
+  return {
+    trxReceipt
+  };
+};
+
+// Call the createMINT() function
+createMINT()
+  .then(({ trxReceipt }) => {
+    console.log(`TX Receipt ${trxReceipt}`);
+    process.exit();
+  })
+  .catch((err) => {
+    console.error(err);
+    process.exit(1);
+  });
+```
+{% endcode %}
+
+**Execute script**
+
+```bash
+node mint_datatoken.js
+```
+{% endtab %}
+{% endtabs %}
diff --git a/developers/ocean.js/publish.md b/developers/ocean.js/publish.md
new file mode 100644
index 00000000..e0efeb6e
--- /dev/null
+++ b/developers/ocean.js/publish.md
@@ -0,0 +1,190 @@
+# Publish
+
+This tutorial guides you through the process of creating your own data NFT and datatoken using the Ocean libraries. To learn more about data NFTs and datatokens, please refer to [this page](../contracts/datanft-and-datatoken.md). Ocean Protocol supports different pricing schemas which can be set while publishing an asset. Please refer to [this page](../contracts/pricing-schemas.md) for more details on pricing schemas.
+
+#### Prerequisites
+
+* [Obtain an API key](../get-api-keys-for-blockchain-access.md)
+* [Set up the .env file](configuration.md#create-a-env-file)
+* [Install the dependencies](configuration.md#setup-dependencies)
+* [Create a configuration file](configuration.md#create-a-configuration-file)
+
+#### Create a script to deploy a data NFT and datatoken with the pricing schema of your choice
+
+Create a new file in the same working directory where the configuration file (`config.js`) and `.env` file are present, and copy the code as listed below.
+
+{% hint style="info" %}
+**Fees**: The code snippets below define fee-related parameters. Please refer to the [fees page](../contracts/fees.md) for more details.
+{% endhint %}
+
+The code utilizes methods such as `NftFactory` and `Datatoken` from the Ocean libraries to enable you to interact with Ocean Protocol and perform various operations related to data NFTs and datatokens.
+
+The `createFRE()` function performs the following steps:
+
+1. Creates a web3 instance and imports the Ocean configuration.
+2. Retrieves the accounts from the web3 instance and sets the publisher.
+3. Defines parameters for the data NFT, including name, symbol, template index, token URI, transferability, and owner.
+4. Defines parameters for the datatoken, including name, symbol, template index, cap, fee amount, payment collector address, fee token address, minter, and multi-party fee address.
+5. Defines parameters for the pricing schema, including the fixed rate address, base token address, owner, market fee collector, base token decimals, datatoken decimals, fixed rate, market fee, and optional parameters.
+6. Uses the NftFactory to create a data NFT and datatoken with a fixed rate exchange, using the specified parameters.
+7. Waits for the transaction to be mined and obtains the receipt.
+8. Returns the transaction receipt.
+
+{% tabs %}
+{% tab title="create_datatoken_with_fre.js" %}
+{% code title="create_datatoken_with_fre.js" overflow="wrap" %}
+```javascript
+// Note: Make sure the .env file and config.js are created and set up correctly
+const { oceanConfig } = require('./config.js');
+const { ZERO_ADDRESS, NftFactory } = require('@oceanprotocol/lib');
+
+// Define a function createFRE() which publishes with a fixed-rate pricing schema
+const createFRE = async () => {
+
+  const FRE_NFT_NAME = 'Datatoken 2'
+  const FRE_NFT_SYMBOL = 'DT2'
+
+  let config = await oceanConfig();
+
+  // Create an NftFactory
+  const factory = new NftFactory(config.nftFactoryAddress, config.publisherAccount);
+
+  const nftParams = {
+    name: FRE_NFT_NAME,
+    symbol: FRE_NFT_SYMBOL,
+    templateIndex: 1,
+    tokenURI: '',
+    transferable: true,
+    owner: await config.publisherAccount.getAddress()
+  }
+
+  const datatokenParams = {
+    templateIndex: 1,
+    cap: '100000',
+    feeAmount: '0',
+    paymentCollector: ZERO_ADDRESS,
+    feeToken: ZERO_ADDRESS,
+    minter: await config.publisherAccount.getAddress(),
+    mpFeeAddress: ZERO_ADDRESS
+  }
+
+  const freParams = {
+    fixedRateAddress: config.fixedRateExchangeAddress,
+    baseTokenAddress: config.oceanTokenAddress,
+    owner: await config.publisherAccount.getAddress(),
+    marketFeeCollector: await config.publisherAccount.getAddress(),
+    baseTokenDecimals: 18,
+    datatokenDecimals: 18,
+    fixedRate: '1',
+    marketFee: '0.001',
+    allowedConsumer: ZERO_ADDRESS,
+    withMint: true
+  }
+
+  const bundleNFT = await factory.createNftWithDatatokenWithFixedRate(
+    nftParams,
+    datatokenParams,
+    freParams
+  )
+
+  const trxReceipt = await bundleNFT.wait()
+
+  return {
+    trxReceipt
+  };
+};
+
+// Call the createFRE() function
+createFRE()
+  .then(({ trxReceipt }) => {
+    console.log(`TX Receipt ${trxReceipt}`);
+    process.exit();
+  })
+  .catch((err) => {
+    console.error(err);
+    process.exit(1);
+  });
+```
+{% endcode %}
+
+Execute script
+
+```bash
+node create_datatoken_with_fre.js
+```
+{% endtab %}
+
+{% tab title="create_datatoken_with_free.js" %}
+{% code title="create_datatoken_with_free.js" overflow="wrap" %}
+```javascript
+// Note: Make sure the .env file and config.js are created and set up correctly
+const { oceanConfig } = require('./config.js');
+const { ZERO_ADDRESS, NftFactory } = require('@oceanprotocol/lib');
+
+// Define a function createDispenser() which publishes with a free (dispenser) pricing schema
+const createDispenser = async () => {
+
+  const DISP_NFT_NAME = 'Datatoken 3'
+  const DISP_NFT_SYMBOL = 'DT3'
+
+  let config = await oceanConfig();
+
+  // Create an NftFactory
+  const factory = new NftFactory(config.nftFactoryAddress, config.publisherAccount);
+
+  const nftParams = {
+    name: DISP_NFT_NAME,
+    symbol: DISP_NFT_SYMBOL,
+    templateIndex: 1,
+    tokenURI: '',
+    transferable: true,
+    owner: await config.publisherAccount.getAddress()
+  }
+
+  const datatokenParams = {
+    templateIndex: 1,
+    cap: '100000',
+    feeAmount: '0',
+    paymentCollector: ZERO_ADDRESS,
+    feeToken: ZERO_ADDRESS,
+    minter: await config.publisherAccount.getAddress(),
+    mpFeeAddress: ZERO_ADDRESS
+  }
+
+  const dispenserParams = {
+    dispenserAddress: config.dispenserAddress,
+    maxTokens: '1',
+    maxBalance: '1',
+    withMint: true,
+    allowedSwapper: ZERO_ADDRESS
+  }
+
+  const bundleNFT = await factory.createNftWithDatatokenWithDispenser(
+    nftParams,
+    datatokenParams,
+    dispenserParams
+  )
+
+  const trxReceipt = await bundleNFT.wait()
+
+  return {
+    trxReceipt
+  };
+};
+
+// Call the createDispenser() function
+createDispenser()
+  .then(({ trxReceipt }) => {
+    console.log(`TX Receipt ${trxReceipt}`);
+    process.exit();
+  })
+  .catch((err) => {
+    console.error(err);
+    process.exit(1);
+  });
+```
+{% endcode %}
+{% endtab %}
+{% endtabs %}
+
+By utilizing these dependencies and configuration settings, the scripts can leverage the functionalities provided by the Ocean libraries and interact with the Ocean Protocol ecosystem effectively.
diff --git a/developers/ocean.js/remove-asset.md b/developers/ocean.js/remove-asset.md
new file mode 100644
index 00000000..73e052ae
--- /dev/null
+++ b/developers/ocean.js/remove-asset.md
@@ -0,0 +1,82 @@
+# Asset Visibility
+
+In the Ocean Protocol ecosystem, each asset is associated with a state that is maintained by the NFT (non-fungible token) contract. The [state of an asset](../ddo-specification.md#state) determines its visibility and availability for different actions on platforms like Ocean Market, as well as its appearance in user profiles. The following table outlines the possible states and their characteristics:
+
+| State | Description | Discoverable in Ocean Market | Ordering Allowed | Listed Under Profile |
+| ----- | ----------- | ---------------------------- | ---------------- | -------------------- |
+| 0 | Active | Yes | Yes | Yes |
+| 1 | End-of-life | No | No | No |
+| 2 | Deprecated (by another asset) | No | No | No |
+| 3 | Revoked by publisher | No | No | No |
+| 4 | Ordering is temporarily disabled | Yes | No | Yes |
+| 5 | Asset unlisted | No | Yes | Yes |
+
+Now let's explain each state in more detail:
+
+1. **Active**: Assets in the "Active" state are fully functional and available for discovery in Ocean Market and other components. Users can search for, view, and interact with these assets. Ordering is allowed, which means users can place orders to purchase or access the asset's services.
+2. **End-of-life**: Assets in the "End-of-life" state are no longer discoverable. They are typically deprecated or outdated and are no longer actively promoted or maintained. Users cannot place orders or interact with these assets, and they are not listed under the owner's profile.
+3. **Deprecated (by another asset)**: This state indicates that another asset has deprecated the current asset. Deprecated assets are not discoverable, and ordering is not allowed. Similar to the "End-of-life" state, deprecated assets are not listed under the owner's profile.
+4. **Revoked by publisher**: When an asset is revoked by its publisher, it means that the publisher has explicitly revoked access or ownership rights to the asset. Revoked assets are not discoverable, and ordering is not allowed.
+5. **Ordering is temporarily disabled**: Assets in this state are still discoverable, but ordering functionality is temporarily disabled. Users can view the asset and gather information, but they cannot place orders at that moment. However, these assets are still listed under the owner's profile.
+6. **Asset unlisted**: Assets in the "Asset unlisted" state are not discoverable. However, users can still place orders for these assets, making them accessible. Unlisted assets are listed under the owner's profile, allowing users to view and access them.
+
+By assigning specific states to assets, Ocean Protocol enables a structured approach to asset management and visibility. These states help regulate asset discoverability, ordering permissions, and the representation of assets in user profiles, ensuring a controlled and reliable asset ecosystem.
+
+It is possible to remove assets from Ocean Protocol by modifying the state of the asset. Each asset has a state, which is stored in the NFT contract. Additional details regarding asset states can be found at this [link](../ddo-specification.md#state). There is also an asset purgatory that contains information about the purgatory status of an asset, as defined in the list-purgatory repository. For more information about the purgatory, please refer to: [https://docs.oceanprotocol.com/core-concepts/did-ddo#purgatory](https://docs.oceanprotocol.com/core-concepts/did-ddo#purgatory).
+
+We can utilize a portion of the previous tutorial on updating metadata and incorporate the steps to update the asset's state in the asset DDO.
+
+#### Prerequisites
+
+* [Obtain an API key](../get-api-keys-for-blockchain-access.md)
+* [Set up the .env file](configuration.md#create-a-env-file)
+* [Install the dependencies](configuration.md#setup-dependencies)
+* [Create a configuration file](configuration.md#create-a-configuration-file)
+
+{% hint style="info" %}
+The variables **AQUARIUS\_URL** and **PROVIDER\_URL** should be set correctly in the `.env` file
+{% endhint %}
+
+#### Create a script to update the state of an asset by updating the asset's metadata
+
+Create a new file in the same working directory where the configuration file (`config.js`) and `.env` file are present, and copy the code as listed below.
+
+{% code overflow="wrap" %}
+```javascript
+// Note: Make sure the .env file and config.js are created and set up correctly
+const { oceanConfig } = require('./config.js');
+const { Nft } = require('@oceanprotocol/lib');
+
+// Replace the DID here with your own published asset
+const did = "did:op:a419f07306d71f3357f8df74807d5d12bddd6bcd738eb0b461470c64859d6f0f";
+
+// This function takes a DID as a parameter and updates the asset's state
+const updateAssetState = async (did) => {
+
+  const config = await oceanConfig();
+  const publisherAddress = await config.publisherAccount.getAddress();
+
+  // Fetch the DDO from Aquarius
+  const asset = await config.aquarius.resolve(did);
+
+  const nft = new Nft(config.publisherAccount);
+
+  // Update the metadata state and bring the asset to the end-of-life state ("1")
+  await nft.setMetadataState(
+    asset?.nft?.address,
+    publisherAddress,
+    1
+  )
+
+  // Check if the DDO is correctly updated in Aquarius
+  await config.aquarius.waitForAqua(did);
+
+  // Fetch the updated asset from Aquarius
+  const updatedAsset = await config.aquarius.resolve(did);
+
+  console.log(`Resolved asset did [${updatedAsset.id}] from Aquarius.`);
+  console.log(`Updated asset state: [${updatedAsset.nft.state}].`);
+
+};
+
+// Call the updateAssetState(...) function defined above
+updateAssetState(did).then(() => {
+  process.exit();
+}).catch((err) => {
+  console.error(err);
+  process.exit(1);
+});
+```
+{% endcode %}
diff --git a/developers/ocean.js/update-metadata.md b/developers/ocean.js/update-metadata.md
new file mode 100644
index 00000000..9b74247b
--- /dev/null
+++ b/developers/ocean.js/update-metadata.md
@@ -0,0 +1,96 @@
+# Update Metadata
+
+This tutorial guides you through updating an existing asset published on-chain using the Ocean libraries. The tutorial assumes that you already have the `did` of the asset which needs to be updated. In this tutorial, we will update the name, description, and tags of the data NFT. Please refer to [the page on DDO](../ddo-specification.md) to learn more about the additional fields that can be updated.
+
+#### Prerequisites
+
+* [Obtain an API key](../get-api-keys-for-blockchain-access.md)
+* [Set up the .env file](configuration.md#create-a-env-file)
+* [Install the dependencies](configuration.md#setup-dependencies)
+* [Create a configuration file](configuration.md#create-a-configuration-file)
+
+{% hint style="info" %}
+The variables **AQUARIUS\_URL** and **PROVIDER\_URL** should be set correctly in the `.env` file
+{% endhint %}
+
+#### Create a script to update the metadata
+
+Create a new file in the same working directory where the configuration file (`config.js`) and `.env` file are present, and copy the code as listed below.
+
+{% tabs %}
+{% tab title="ocean.js" %}
+{% code title="updateMetadata.js" overflow="wrap" %}
+```javascript
+// Note: Make sure the .env file and config.js are created and set up correctly
+const { oceanConfig } = require('./config.js');
+const { getHash, Nft, ProviderInstance } = require('@oceanprotocol/lib');
+
+// Replace the DID here with your own published asset
+const did = "did:op:a419f07306d71f3357f8df74807d5d12bddd6bcd738eb0b461470c64859d6f0f";
+
+// This function takes a DID as a parameter and updates the data NFT information
+const setMetadata = async (did) => {
+
+  const config = await oceanConfig();
+  const publisherAddress = await config.publisherAccount.getAddress();
+
+  // Fetch the DDO from Aquarius
+  const ddo = await config.aquarius.resolve(did);
+
+  const nft = new Nft(config.publisherAccount);
+
+  // Update the DDO here
+  ddo.metadata.name = "Sample dataset v2";
+  ddo.metadata.description = "Lorem ipsum dolor sit amet, consetetur sadipscing elitr, sed diam";
+  ddo.metadata.tags = ["new tag1", "new tag2"];
+
+  // The Provider encrypts the updated DDO
+  const encryptedResponse = await ProviderInstance.encrypt(ddo, ddo.chainId, process.env.PROVIDER_URL);
+  const metadataHash = getHash(JSON.stringify(ddo));
+
+  // Update the data NFT metadata
+  await nft.setMetadata(
+    ddo.nftAddress,
+    publisherAddress,
+    0,
+    process.env.PROVIDER_URL,
+    '',
+    '0x02',
+    encryptedResponse,
+    `0x${metadataHash}`
+  );
+
+  // Check if the DDO is correctly updated in Aquarius
+  await config.aquarius.waitForAqua(ddo.id);
+
+  console.log(`Resolved asset did [${ddo.id}] from Aquarius.`);
+  console.log(`Updated name: [${ddo.metadata.name}].`);
+  console.log(`Updated description: [${ddo.metadata.description}].`);
+  console.log(`Updated tags: [${ddo.metadata.tags}].`);
+
+};
+
+// Call the setMetadata(...) function defined above
+setMetadata(did).then(() => {
+  process.exit();
+}).catch((err) => {
+  console.error(err);
+  process.exit(1);
+});
+```
+{% endcode %}
+
+Execute the script
+
+```bash
+node updateMetadata.js
+```
+{% endtab %}
+{% endtabs %}
+
+We provided several code examples using the ocean.js library for interacting with Ocean Protocol. Some highlights from our [code examples](https://github.com/oceanprotocol/ocean.js/blob/main/CodeExamples.md) ([compute examples](https://github.com/oceanprotocol/ocean.js/blob/main/ComputeExamples.md)) are:
+
+1. **Minting an NFT** - This example demonstrates how to mint an NFT (non-fungible token) using the ocean.js library. It shows the necessary steps, including creating an NftFactory instance, defining NFT parameters, and calling the `create()` method to mint the NFT.
+2. **Publishing a dataset** - This example explains how to publish a dataset on the Ocean Protocol network. It covers steps such as creating a DDO, signing the DDO, and publishing the dataset.
+3. **Consuming a dataset** - This example demonstrates how to consume a published dataset. It shows how to search for available assets, retrieve the DDO for a specific asset, order the asset using a specific datatoken, and then download the asset.
+
+You can explore more detailed code examples and explanations in the ocean.js [readme](https://github.com/oceanprotocol/ocean.js#readme).
diff --git a/developers/ocean.py/README.md b/developers/ocean.py/README.md
new file mode 100644
index 00000000..274382a1
--- /dev/null
+++ b/developers/ocean.py/README.md
@@ -0,0 +1,55 @@
+# Ocean.py
+
+
+
+Attention all data enthusiasts! Are you an inquisitive data scientist intrigued by the world of Web3 and blockchain, but unsure of where to begin? Have you developed a groundbreaking AI algorithm and want to turn it into a profitable success? Perhaps you're training a highly lucrative large language model (LLM) and want to define precise licensing terms for your valuable data. Or maybe you simply wish to sell your data while maintaining utmost privacy and control.
Have you developed a groundbreaking AI algorithm and want to transform it into a profitable success? Perhaps you're engaged in training a large language model (LLM) and seek to define precise licensing terms for your valuable data. Or maybe you simply wish to sell your data while maintaining utmost privacy and control.
+
+Well, brace yourselves for some exhilarating news! Introducing ocean.py, a Python library that possesses a touch of magic. 🎩🐍 It empowers you to discreetly and securely publish, exchange, and effortlessly consume data. 🐙💦 Collaborating with the Ocean Protocol 🌊, it unlocks the plethora of advantages mentioned earlier. So get ready to take the plunge into the vast ocean of data with a resounding splash of excitement! 💦🌊
+
+_Figure: ocean.py library_
+
+### Overview
+
+ocean.py serves as a connection layer bridging the V4 smart contracts and various components such as [Provider](https://github.com/oceanprotocol/provider), [Aquarius](https://github.com/oceanprotocol/aquarius), and the [Compute-to-Data engine](https://github.com/oceanprotocol/operator-service) within Ocean Protocol. This pythonic library brings all these elements together, facilitating seamless integration and interaction. By acting as an intermediary, ocean.py lets developers easily leverage the functionality offered by Ocean Protocol, simplifying the process of connecting to the smart contracts and to the services provided by Provider, Aquarius, and the Compute-to-Data engine. This makes it a valuable tool for building applications and solutions on decentralized data marketplaces.
+
+#### Architectural point of view
+
+ocean.py is like the conductor of an underwater orchestra, guiding different marine creatures (modules) to work together harmoniously. It's an open-source library that makes swimming in the vast sea of data a breeze! 🌊
+
+The head of our library is the "[Ocean](technical-details.md)" class. It oversees everything and keeps track of the data flow.
+
+Now, let's take a closer look at those amazing branches:
+
+1. **Data Discovery Branch**: This branch discovers & creates valuable datasets stored in the Ocean Protocol ecosystem. It navigates through metadata and identifies the hidden treasures of the data assets.
+2. **Data Access Branch**: Just like a skilled locksmith, this branch unlocks the doors to the datasets, facilitating access and content retrieval. It interacts with Ocean Protocol's smart contracts to securely fetch the desired data.
+3. **Data Transformation Branch**: Transforming data is like wielding magic, and this arm is the magician! It performs enchanting operations on the data, such as reformatting, reorganizing, or even enriching it, making it ready for the next steps.
+4. **Model Deployment Branch**: This branch deploys models wrapped as Ocean smart contract objects using [Brownie](https://github.com/eth-brownie/brownie), making them accessible for use within the library.
+5. **Model Training Branch**: This branch collaborates with the Compute-to-Data engine to run algorithms and train models using the transformed data.
+6. **Model Monitoring Branch**: This branch monitors the algorithm result logs received from the Compute-to-Data engine, tracking their performance.
+
+So, in the realm of ocean.py's integration with Ocean Protocol's smart contracts, the six versatile branches embark on an exciting journey. Together, they form a powerful team, navigating the depths of the Ocean ecosystem. 🌊🐙
+
+### ocean.py Strengths 💪
+
+ocean.py lets you do the following things:
+
+* Publish data services: downloadable files or compute-to-data. Create an ERC721 data NFT for each service, and an ERC20 datatoken for access (1.0 datatokens per access).
+* Sell datatokens for a fixed price. Sell data NFTs.
+* Transfer data NFTs & datatokens to another owner, and perform all other ERC721 & ERC20 actions using Brownie.
+
+If you prefer video format, please check the video below; otherwise, let's move forward.
+
+{% embed url="https://youtu.be/8uZC6PC9PBM" %}
+
+### ocean.py Quickstart 🚀
+
+To kickstart your adventure with ocean.py, we set out the following steps to get you zooming ahead in no time!
+
+1. [Install Ocean](install.md) 📥
+2. 
Setup 🛠️: [Remote](remote-setup.md) (Win, MacOS, Linux) or [Local](local-setup.md) (Linux only)
+3. [Publish asset](publish-flow.md), post for free / for sale, dispense it / buy it, and [consume](consume-flow.md) it
+4. Run algorithms through the [Compute-to-Data flow](compute-flow.md) using the Ocean environment.
+
+After these quickstart steps, the main [README](https://github.com/oceanprotocol/ocean.py/blob/main/README.md) points to several other use cases, such as [Predict-ETH](https://github.com/oceanprotocol/predict-eth), [Data Farming](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/df.md), on-chain key-value stores ([public](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/key-value-public.md) or [private](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/key-value-private.md)), and other types of data assets ([REST API](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/publish-flow-restapi.md), [GraphQL](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/publish-flow-graphql.md), [on-chain](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/publish-flow-onchain.md)).
diff --git a/developers/ocean.py/compute-flow.md b/developers/ocean.py/compute-flow.md
new file mode 100644
index 00000000..ed2ec2cd
--- /dev/null
+++ b/developers/ocean.py/compute-flow.md
@@ -0,0 +1,219 @@
+---
+description: This page shows how you run a compute flow.
+---
+
+# Compute Flow
+
+On this page, we provide the steps for publishing a dataset and an algorithm asset, running the algorithm on the Ocean Compute-to-Data (C2D) environment, and retrieving the result logs, using ocean.py.
+
+We assume that you have completed the installation with your preferred setup.
+
+Here are the steps:
+
+1. Alice publishes dataset
+2. Alice publishes algorithm
+3. Alice allows the algorithm for C2D for that data asset
+4. Bob acquires datatokens for data and algorithm
+5. Bob starts a compute job using a free C2D environment (no provider fees)
+6. Bob monitors logs / algorithm output
+
+Let's go through each step.
+
+### 1. Alice publishes dataset
+
+In the same Python console:
+
+{% code overflow="wrap" %}
+```python
+# Publish data NFT, datatoken, and asset for dataset based on url
+
+# ocean.py offers multiple file object types. A simple url file is enough here
+from ocean_lib.structures.file_objects import UrlFile
+DATA_url_file = UrlFile(
+    url="https://raw.githubusercontent.com/oceanprotocol/c2d-examples/main/branin_and_gpr/branin.arff"
+)
+
+name = "Branin dataset"
+(DATA_data_nft, DATA_datatoken, DATA_ddo) = ocean.assets.create_url_asset(name, DATA_url_file.url, {"from": alice}, with_compute=True, wait_for_aqua=True)
+print(f"DATA_data_nft address = '{DATA_data_nft.address}'")
+print(f"DATA_datatoken address = '{DATA_datatoken.address}'")
+
+print(f"DATA_ddo did = '{DATA_ddo.did}'")
+```
+{% endcode %}
+
+To customise the privacy and accessibility of your compute service, add the `compute_values` argument to `create_url_asset` to set values according to the [DDO specs](https://docs.oceanprotocol.com/core-concepts/did-ddo). Otherwise, the function assumes the documented defaults.
+
+### 2. 
Alice publishes an algorithm
+
+In the same Python console:
+
+{% code overflow="wrap" %}
+```python
+# Publish data NFT & datatoken for algorithm
+ALGO_url = "https://raw.githubusercontent.com/oceanprotocol/c2d-examples/main/branin_and_gpr/gpr.py"
+
+name = "gpr"
+(ALGO_data_nft, ALGO_datatoken, ALGO_ddo) = ocean.assets.create_algo_asset(name, ALGO_url, {"from": alice}, wait_for_aqua=True)
+
+print(f"ALGO_data_nft address = '{ALGO_data_nft.address}'")
+print(f"ALGO_datatoken address = '{ALGO_datatoken.address}'")
+print(f"ALGO_ddo did = '{ALGO_ddo.did}'")
+```
+{% endcode %}
+
+### 3. Alice allows the algorithm for C2D for that data asset
+
+In the same Python console:
+
+{% code overflow="wrap" %}
+```python
+compute_service = DATA_ddo.services[1]
+compute_service.add_publisher_trusted_algorithm(ALGO_ddo)
+DATA_ddo = ocean.assets.update(DATA_ddo, {"from": alice})
+```
+{% endcode %}
+
+### 4. Bob acquires datatokens for data and algorithm
+
+In the same Python console:
+
+```python
+# Alice mints DATA datatokens and ALGO datatokens to Bob.
+# Alternatively, Bob might have bought these in a market.
+from ocean_lib.ocean.util import to_wei
+DATA_datatoken.mint(bob, to_wei(5), {"from": alice})
+ALGO_datatoken.mint(bob, to_wei(5), {"from": alice})
+```
+
+You can choose any of the methods for getting access described in the [consume flow approaches](consume-flow.md).
+
+### 5. Bob starts a compute job using a free C2D environment
+
+The only inputs needed are DATA\_did and ALGO\_did; everything else can be computed as needed. For demo purposes, we will use the free C2D environment, which requires no provider fees.
+
+In the same Python console:
+
+{% code overflow="wrap" %}
+```python
+# Convenience variables
+DATA_did = DATA_ddo.did
+ALGO_did = ALGO_ddo.did
+
+# Operate on updated and indexed assets
+DATA_ddo = ocean.assets.resolve(DATA_did)
+ALGO_ddo = ocean.assets.resolve(ALGO_did)
+
+compute_service = DATA_ddo.services[1]
+algo_service = ALGO_ddo.services[0]
+free_c2d_env = ocean.compute.get_free_c2d_environment(compute_service.service_endpoint, DATA_ddo.chain_id)
+
+from datetime import datetime, timedelta, timezone
+from ocean_lib.models.compute_input import ComputeInput
+
+DATA_compute_input = ComputeInput(DATA_ddo, compute_service)
+ALGO_compute_input = ComputeInput(ALGO_ddo, algo_service)
+
+# Pay for dataset and algo for 1 day
+datasets, algorithm = ocean.assets.pay_for_compute_service(
+    datasets=[DATA_compute_input],
+    algorithm_data=ALGO_compute_input,
+    consume_market_order_fee_address=bob.address,
+    tx_dict={"from": bob},
+    compute_environment=free_c2d_env["id"],
+    valid_until=int((datetime.now(timezone.utc) + timedelta(days=1)).timestamp()),
+    consumer_address=free_c2d_env["consumerAddress"],
+)
+assert datasets, "pay for dataset unsuccessful"
+assert algorithm, "pay for algorithm unsuccessful"
+
+# Start compute job
+job_id = ocean.compute.start(
+    consumer_wallet=bob,
+    dataset=datasets[0],
+    compute_environment=free_c2d_env["id"],
+    algorithm=algorithm,
+)
+print(f"Started compute job with id: {job_id}")
+```
+{% endcode %}
+
+### 6. 
Bob monitors logs / algorithm output
+
+In the same Python console, you can check the job status as many times as needed:
+
+```python
+# Wait until the job is done
+import time
+from decimal import Decimal
+succeeded = False
+for _ in range(0, 200):
+    status = ocean.compute.status(DATA_ddo, compute_service, job_id, bob)
+    if status.get("dateFinished") and Decimal(status["dateFinished"]) > 0:
+        succeeded = True
+        break
+    time.sleep(5)
+```
+
+This will output the status of the current job. Here is a list of possible results: [Operator Service Status description](https://github.com/oceanprotocol/operator-service/blob/main/API.md#status-description).
+
+Once the returned status dictionary contains the `dateFinished` key, Bob can retrieve the job results using `ocean.compute.result` or, more specifically, just the output if the job was successful. For the purpose of this tutorial, let's choose the second option.
+
+```python
+# Retrieve algorithm output and log files
+output = ocean.compute.compute_job_result_logs(
+    DATA_ddo, compute_service, job_id, bob
+)[0]
+
+import pickle
+model = pickle.loads(output)  # the gaussian model result
+assert len(model) > 0, "unpickle result unsuccessful"
+```
+
+You can use the result however you like. For the purpose of this example, let's plot it.
+
+Make sure you have the `matplotlib` package installed in your virtual environment.
+
+{% code overflow="wrap" %}
+```python
+import numpy
+from matplotlib import pyplot
+
+X0_vec = numpy.linspace(-5., 10., 15)
+X1_vec = numpy.linspace(0., 15., 15)
+X0, X1 = numpy.meshgrid(X0_vec, X1_vec)
+b, c, t = 0.12918450914398066, 1.5915494309189535, 0.039788735772973836
+u = X1 - b * X0 ** 2 + c * X0 - 6
+r = 10. * (1. - t) * numpy.cos(X0) + 10
+Z = u ** 2 + r
+
+fig, ax = pyplot.subplots(subplot_kw={"projection": "3d"})
+ax.scatter(X0, X1, model, c="r", label="model")
+pyplot.title("Data + model")
+pyplot.show()  # or pyplot.savefig("test.png") to save the plot as a .png file instead
+```
+{% endcode %}
+
+You should see something like this:
+
+_Figure: "Data + model" 3D scatter plot._
+
+### Appendix. Tips & tricks
+
+This tutorial used a simple ML algorithm. However, Ocean C2D is not limited to ML use cases. The file [c2d-flow-more-examples.md](https://github.com/oceanprotocol/ocean.py/blob/v4main/READMEs/c2d-flow-more-examples.md) has examples from vision and other fields.
+
+In the "publish algorithm" step, to replace the sample algorithm with another one:
+
+* Use one of the standard [Ocean algo_dockers images](https://github.com/oceanprotocol/algo_dockers) or publish a custom docker image.
+* Use the image name and tag in the `container` part of the algorithm metadata.
+* The image must have basic support for installing dependencies, e.g. `pip` for Python. You can use other languages, of course.
+* More info: [https://docs.oceanprotocol.com/tutorials/compute-to-data-algorithms/](../compute-to-data/compute-to-data-algorithms.md)
+
+The `pay_for_compute_service` function automates order starting and order reuse, and performs all the necessary Provider and on-chain requests. It modifies the contents of the given ComputeInput as follows:
+
+* If the dataset/algorithm contains a `transfer_tx_id` property, it will try to reuse that previous transfer id. If provider fees have expired but the order is still valid, then the order is reused on-chain.
+* If the dataset/algorithm does not contain a `transfer_tx_id` or the order has expired (based on the Provider's response), then a new order will be created.
+
+This means you can reuse the same ComputeInput, and you don't need to regenerate it every time it is sent to `pay_for_compute_service`. This step makes sure you are not paying unnecessary or duplicated fees.
+
+If you wish to upgrade the compute resources, you can use any (paid) C2D environment. Inspect the results of `ocean.ocean_compute.get_c2d_environments(service.service_endpoint, DATA_ddo.chain_id)` and `ocean.retrieve_provider_fees_for_compute(datasets, algorithm_data, consumer_address, compute_environment, duration)` for a preview of what you will pay. Don't forget to handle any minting, allowance, or approvals on the desired token to ensure transactions pass.
diff --git a/developers/ocean.py/consume-flow.md b/developers/ocean.py/consume-flow.md
new file mode 100644
index 00000000..007c535c
--- /dev/null
+++ b/developers/ocean.py/consume-flow.md
@@ -0,0 +1,109 @@
+---
+description: This page shows how you can get datatokens & download an asset
+---
+
+# Consume Flow
+
+The consume flow highlights the methods for getting a datatoken to access an asset from Ocean Market, and for downloading the content of the asset.
+
+We assume that you have completed the publish flow presented previously.
+
+Now let's see how Bob can get access to Alice's asset in order to download/consume it.
+
+### Get access for a dataset 🔑
+
+Below, we show four possible approaches:
+
+* A & B are when Alice is in contact with Bob. She can mint directly to him, or mint to herself and transfer to him.
+* C is when Alice wants to share access for free, to anyone
+* D is when Alice wants to sell access
+
+In the same Python console:
+
+```python
+from ocean_lib.ocean.util import to_wei
+
+# Approach A: Alice mints datatokens to Bob
+datatoken.mint(bob, to_wei(1), {"from": alice})
+
+# Approach B: Alice mints for herself, and transfers to Bob
+datatoken.mint(alice, to_wei(1), {"from": alice})
+datatoken.transfer(bob, to_wei(1), {"from": alice})
+
+# Approach C: Alice posts for free, via a dispenser / faucet; Bob requests & gets
+datatoken.create_dispenser({"from": alice})
+datatoken.dispense(to_wei(1), {"from": bob})
+
+# Approach D: Alice posts for sale; Bob buys
+# D.1 Alice creates exchange
+price = to_wei(100)
+exchange = datatoken.create_exchange({"from": alice}, price, ocean.OCEAN_address)
+
+# D.2 Alice makes 100 datatokens available on the exchange
+datatoken.mint(alice, to_wei(100), {"from": alice})
+datatoken.approve(exchange.address, to_wei(100), {"from": alice})
+
+# D.3 Bob lets the exchange pull the OCEAN needed
+OCEAN_needed = exchange.BT_needed(to_wei(1), consume_market_fee=0)
+ocean.OCEAN_token.approve(exchange.address, OCEAN_needed, {"from": bob})
+
+# D.4 Bob buys a datatoken
+exchange.buy_DT(to_wei(1), consume_market_fee=0, tx_dict={"from": bob})
+```
+
+For more info, check the [Technical Details](technical-details.md) about ocean.py's most used functions, and also the smart contracts for the [Dispenser](https://github.com/oceanprotocol/contracts/blob/main/contracts/pools/dispenser/Dispenser.sol) & [Fixed Rate Exchange](https://github.com/oceanprotocol/contracts/blob/main/contracts/pools/fixedRate/FixedRateExchange.sol).
+
+### Consume the asset ⬇️
+
+To "consume" an asset typically means placing an "order", where you pass in 1.0 datatokens and get back a URL. Then, you typically download the asset from the URL.
+
+Bob now has the datatoken for the dataset! Time to download the dataset and use it.
+
+In the same Python console:
+
+```python
+# Bob sends a datatoken to the service to get access
+order_tx_id = ocean.assets.pay_for_access_service(ddo, {"from": bob})
+
+# Bob downloads the file. If the connection breaks, Bob can try again
+asset_dir = ocean.assets.download_asset(ddo, bob, './', order_tx_id)
+
+import os
+file_name = os.path.join(asset_dir, "file0")
+```
+
+Let's check that the file is downloaded. In a new console:
+
+```bash
+cd my_project/datafile.did:op:*
+cat file0
+```
+
+The _beginning_ of the file should contain the following contents:
+
+```bash
+% 1. Title: Branin Function
+% 3. Number of instances: 225
+% 6. Number of attributes: 2
+
+@relation branin
+
+@attribute 'x0' numeric
+@attribute 'x1' numeric
+@attribute 'y' numeric
+
+@data
+-5.0000,0.0000,308.1291
+-3.9286,0.0000,206.1783
+...
+```
+
+Here's a video version of this flow 👇
+
+{% embed url="https://www.youtube.com/watch?v=JQF-5oRvq9w" %}
+Main Flow Video
+{% endembed %}
diff --git a/developers/ocean.py/datatoken-interface-tech-details.md b/developers/ocean.py/datatoken-interface-tech-details.md
new file mode 100644
index 00000000..e32a4d6e
--- /dev/null
+++ b/developers/ocean.py/datatoken-interface-tech-details.md
@@ -0,0 +1,586 @@
+---
+description: Technical details about Datatoken functions
+---
+
+# Datatoken Interface Tech Details
+
+The `Datatoken contract interface` is like the superhero that kicks off the action-packed adventure of contract calls! It's here to save the day by empowering us to unleash the mighty powers of dispensers, fixed rate exchanges, and order initialization. On this page, we present the utility functions that start you on your Ocean journey.
+
+### Create Dispenser
+
+* **create\_dispenser**(`self`, `tx_dict: dict`, `max_tokens: Optional[Union[int, str]] = None`, `max_balance: Optional[Union[int, str]] = None`, `with_mint: Optional[bool] = True`)
+
+Through the datatoken, you can deploy a new dispenser schema, which is used for creating free assets; it behaves like a faucet. ⛲
+
+It is implemented in `DatatokenBase` and inherited by `Datatoken2`, so it can be called on both instances.
+
+**Parameters**
+
+* `tx_dict` - the configuration `dictionary` for that specific transaction. Usually for `development` we include just the `from` wallet, but for remote networks, you can provide gas fees, required confirmations for that block, etc. For more info, check the [Brownie docs](https://eth-brownie.readthedocs.io/en/stable/).
+* `max_tokens` - maximum amount of tokens to dispense in wei. The default is a large number.
+* `max_balance` - maximum balance of the requester in wei. The default is a large number.
+* `with_mint` - boolean; `True` (the default) allows the dispenser to mint datatokens as needed.
+
+**Returns**
+
+`str`
+
+The return value is a hex string which denotes the transaction hash of the dispenser deployment.
+
+**Defined in**
+
+[models/datatoken.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#LL336C5-L377C18)
+
+<details>
+
+Source code
+
+```python
+    @enforce_types
+    def create_dispenser(
+        self,
+        tx_dict: dict,
+        max_tokens: Optional[Union[int, str]] = None,
+        max_balance: Optional[Union[int, str]] = None,
+        with_mint: Optional[bool] = True,
+    ):
+        """
+        For this datatoken, create a dispenser faucet for free tokens.
+
+        This wraps the smart contract method Datatoken.createDispenser()
+        with a simpler interface.
+
+        :param: max_tokens - max # tokens to dispense, in wei
+        :param: max_balance - max balance of requester
+        :tx_dict: e.g. {"from": alice_wallet}
+        :return: tx
+        """
+        # already created, so nothing to do
+        if self.dispenser_status().active:
+            return
+
+        # set max_tokens, max_balance if needed
+        max_tokens = max_tokens or MAX_UINT256
+        max_balance = max_balance or MAX_UINT256
+
+        # args for contract tx
+        dispenser_addr = get_address_of_type(self.config_dict, "Dispenser")
+        with_mint = with_mint  # True -> can always mint more
+        allowed_swapper = ZERO_ADDRESS  # 0 -> so anyone can call dispense
+
+        # do contract tx
+        tx = self.createDispenser(
+            dispenser_addr,
+            max_tokens,
+            max_balance,
+            with_mint,
+            allowed_swapper,
+            tx_dict,
+        )
+        return tx
+```
+
+</details>
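+
+A minimal usage sketch, assuming a deployed `datatoken` object and a funded `alice` wallet as in the Consume Flow and Local Setup pages:
+
+```python
+from ocean_lib.ocean.util import to_wei
+
+# Deploy a faucet for this datatoken. Without the optional caps,
+# max_tokens and max_balance both default to a very large number.
+tx = datatoken.create_dispenser(
+    {"from": alice},
+    max_tokens=to_wei(10),   # at most 10 tokens per dispense call, in wei
+    max_balance=to_wei(50),  # a requester's balance may not exceed 50 tokens
+)
+```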
+
+### Dispense Datatokens
+
+* **dispense**(`self`, `amount: Union[int, str]`, `tx_dict: dict`)
+
+This function dispenses free datatokens from the faucet to a user, e.g. one who wants to start an order.
+
+It is implemented in `DatatokenBase`, so it can be called within the `Datatoken` class.
+
+**Parameters**
+
+* `amount` - amount of datatokens to be dispensed in wei (int or string format)
+* `tx_dict` - the configuration `dictionary` for that specific transaction. Usually for `development` we include just the `from` wallet, but for remote networks, you can provide gas fees, required confirmations for that block, etc. For more info, check the [Brownie docs](https://eth-brownie.readthedocs.io/en/stable/).
+
+**Returns**
+
+`str`
+
+The return value is a hex string which denotes the transaction hash of the dispense, serving as proof.
+
+**Defined in**
+
+[models/datatoken.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#LL379C5-L400C18)
+
+<details>
+ +Source code + +```python + @enforce_types + def dispense(self, amount: Union[int, str], tx_dict: dict): + """ + Dispense free tokens via the dispenser faucet. + + :param: amount - number of tokens to dispense, in wei + :tx_dict: e.g. {"from": alice_wallet} + :return: tx + """ + # args for contract tx + datatoken_addr = self.address + from_addr = ( + tx_dict["from"].address + if hasattr(tx_dict["from"], "address") + else tx_dict["from"] + ) + + # do contract tx + tx = self._ocean_dispenser().dispense( + datatoken_addr, amount, from_addr, tx_dict + ) + return tx +``` + +
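+
+A minimal usage sketch, mirroring approach C of the Consume Flow page (it assumes the `datatoken` and `bob` objects from there, with a dispenser already created by Alice):
+
+```python
+from ocean_lib.ocean.util import to_wei
+
+# Bob requests 1.0 datatokens (denominated in wei) from the faucet
+tx = datatoken.dispense(to_wei(1), {"from": bob})
+```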
+
+### Dispense Datatokens & Order
+
+* **dispense\_and\_order**(`self`, `consumer: str`, `service_index: int`, `provider_fees: dict`, `transaction_parameters: dict`, `consume_market_fees=None`) -> `str`
+
+This function dispenses free datatokens to a user (if their balance is below 1.0) and then starts an order.
+
+It is implemented in `Datatoken2`, so it can be called within the `Datatoken2` class (using the enterprise template).
+
+**Parameters**
+
+* `consumer` - address of the consumer wallet that needs funding
+* `service_index` - service index as int for identifying the service for which you want to further call [`start_order`](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#LL169C5-L197C10).
+* `provider_fees` - dictionary which includes the provider fees generated when the `initialize` endpoint from `Provider` was called.
+* `transaction_parameters` - the configuration `dictionary` for that specific transaction. Usually for `development` we include just the `from` wallet, but for remote networks, you can provide gas fees, required confirmations for that block, etc. For more info, check the [Brownie docs](https://eth-brownie.readthedocs.io/en/stable/).
+* `consume_market_fees` - [`TokenFeeInfo` ](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#L31)object which contains the consume market fee amount, address & token address. If it is not explicitly specified, it defaults to an empty `TokenFeeInfo` object.
+
+**Returns**
+
+`str`
+
+The return value is a hex string which denotes the transaction hash of the started order, serving as proof.
+
+**Defined in**
+
+[models/datatoken.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#LL439C5-L483C1)
+
+<details>
+ +Source code + +{% code overflow="wrap" %} +```python +def dispense_and_order( + self, + consumer: str, + service_index: int, + provider_fees: dict, + transaction_parameters: dict, + consume_market_fees=None, + ) -> str: + if not consume_market_fees: + consume_market_fees = TokenFeeInfo() + + buyer_addr = ( + transaction_parameters["from"].address + if hasattr(transaction_parameters["from"], "address") + else transaction_parameters["from"] + ) + + bal = from_wei(self.balanceOf(buyer_addr)) + if bal < 1.0: + dispenser_addr = get_address_of_type(self.config_dict, "Dispenser") + from ocean_lib.models.dispenser import Dispenser # isort: skip + + dispenser = Dispenser(self.config_dict, dispenser_addr) + + # catch key failure modes + st = dispenser.status(self.address) + active, allowedSwapper = st[0], st[6] + if not active: + raise ValueError("No active dispenser for datatoken") + if allowedSwapper not in [ZERO_ADDRESS, buyer_addr]: + raise ValueError(f"Not allowed. allowedSwapper={allowedSwapper}") + + # Try to dispense. If other issues, they'll pop out + dispenser.dispense( + self.address, "1 ether", buyer_addr, transaction_parameters + ) + + return self.start_order( + consumer=ContractBase.to_checksum_address(consumer), + service_index=service_index, + provider_fees=provider_fees, + consume_market_fees=consume_market_fees, + transaction_parameters=transaction_parameters, + ) +``` +{% endcode %} + +
+
+### Dispenser Status
+
+* **dispenser\_status**(`self`) -> `DispenserStatus`
+
+**Returns**
+
+`DispenserStatus`
+
+Returns a `DispenserStatus` object, built from the tuple returned by `Dispenser.sol::status(dt_addr)`, which is composed of:
+
+* bool active
+* address owner
+* bool isMinter
+* uint256 maxTokens
+* uint256 maxBalance
+* uint256 balance
+* address allowedSwapper
+
+These are Solidity return values & types; in Python, `uint256` maps to `int` and `address` to a `string` instance.
+
+For tips & tricks, check [this section](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/READMEs/main-flow.md#faucet-tips--tricks) from the [README](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/READMEs/main-flow.md).
+
+It is implemented in `DatatokenBase` and inherited by `Datatoken2`, so it can be called on both instances.
+
+**Defined in**
+
+[models/datatoken.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#LL402C1-L409C43)
+
+<details>
+
+Source code
+
+```python
+    @enforce_types
+    def dispenser_status(self):
+        """:return: DispenserStatus object"""
+        # import here to avoid circular import
+        from ocean_lib.models.dispenser import DispenserStatus
+
+        status_tup = self._ocean_dispenser().status(self.address)
+        return DispenserStatus(status_tup)
+```
+
+</details>
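+
+A minimal usage sketch (assuming a `datatoken` object with a dispenser already created, as above):
+
+```python
+st = datatoken.dispenser_status()
+
+# `active` is the same attribute that create_dispenser() checks internally
+if st.active:
+    print("This datatoken already has a live dispenser")
+```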
+
+### Create Fixed Rate Exchange
+
+* **create\_exchange**(`self`, `rate: Union[int, str]`, `base_token_addr: str`, `tx_dict: dict`, `owner_addr: Optional[str] = None`, `publish_market_fee_collector: Optional[str] = None`, `publish_market_fee: Union[int, str] = 0`, `with_mint: bool = False`, `allowed_swapper: str = ZERO_ADDRESS`, `full_info: bool = False`) -> `Union[OneExchange, tuple]`
+
+It is implemented in `DatatokenBase` and inherited by `Datatoken2`, so it can be called on both instances.
+
+For this datatoken, create a single fixed-rate exchange (`OneExchange`).
+
+This wraps the smart contract method `Datatoken.createFixedRate()` with a simpler interface.
+
+**Parameters**
+
+* `rate` - how many base tokens does 1 datatoken cost? In wei or string
+* `base_token_addr` - e.g. the OCEAN address
+* `tx_dict` - the configuration `dictionary` for that specific transaction. Usually for `development` we include just the `from` wallet, but for remote networks, you can provide gas fees, required confirmations for that block, etc. For more info, check the [Brownie docs](https://eth-brownie.readthedocs.io/en/stable/).
+
+**Optional parameters**
+
+* `owner_addr` - owner of the datatoken
+* `publish_market_fee_collector` - address that collects the publish market fee
+* `publish_market_fee` - in wei or string, e.g. `int(1e15)` or `"0.001 ether"`
+* `with_mint` - should the exchange mint datatokens as needed (`True`), or do they need to be supplied/allowed by participants like the base token (`False`)?
+* `allowed_swapper` - if `ZERO_ADDRESS`, anyone can swap
+* `full_info` - return just the `OneExchange` (default), or the full tuple `(OneExchange, tx_receipt)`
+
+**Returns**
+
+* `exchange` - OneExchange
+* (maybe) `tx_receipt`
+
+**Defined in**
+
+[models/datatoken.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#LL236C4-L310C1)
+
+<details>
+
+Source code
+
+{% code overflow="wrap" %}
+```python
+    @enforce_types
+    def create_exchange(
+        self,
+        rate: Union[int, str],
+        base_token_addr: str,
+        tx_dict: dict,
+        owner_addr: Optional[str] = None,
+        publish_market_fee_collector: Optional[str] = None,
+        publish_market_fee: Union[int, str] = 0,
+        with_mint: bool = False,
+        allowed_swapper: str = ZERO_ADDRESS,
+        full_info: bool = False,
+    ) -> Union[OneExchange, tuple]:
+
+        # import now, to avoid circular import
+        from ocean_lib.models.fixed_rate_exchange import OneExchange
+
+        FRE_addr = get_address_of_type(self.config_dict, "FixedPrice")
+        from_addr = (
+            tx_dict["from"].address
+            if hasattr(tx_dict["from"], "address")
+            else tx_dict["from"]
+        )
+        BT = Datatoken(self.config_dict, base_token_addr)
+        owner_addr = owner_addr or from_addr
+        publish_market_fee_collector = publish_market_fee_collector or from_addr
+
+        tx = self.contract.createFixedRate(
+            checksum_addr(FRE_addr),
+            [
+                checksum_addr(BT.address),
+                checksum_addr(owner_addr),
+                checksum_addr(publish_market_fee_collector),
+                checksum_addr(allowed_swapper),
+            ],
+            [
+                BT.decimals(),
+                self.decimals(),
+                rate,
+                publish_market_fee,
+                with_mint,
+            ],
+            tx_dict,
+        )
+
+        exchange_id = tx.events["NewFixedRate"]["exchangeId"]
+        FRE = self._FRE()
+        exchange = OneExchange(FRE, exchange_id)
+        if full_info:
+            return (exchange, tx)
+        return exchange
+```
+{% endcode %}
+
+</details>
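+
+A minimal usage sketch, following the signature documented above (note that the Consume Flow page shows a variant that passes the tx dict first; the argument order can differ between ocean.py versions):
+
+```python
+from ocean_lib.ocean.util import to_wei
+
+# Price each datatoken at 100 OCEAN; the rate is in wei
+exchange = datatoken.create_exchange(to_wei(100), ocean.OCEAN_address, {"from": alice})
+
+# With the default with_mint=False, the exchange needs datatokens to sell,
+# so Alice supplies and approves some
+datatoken.mint(alice, to_wei(100), {"from": alice})
+datatoken.approve(exchange.address, to_wei(100), {"from": alice})
+```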
+
+### Buy Datatokens & Order
+
+* **buy\_DT\_and\_order**(`self`, `consumer: str`, `service_index: int`, `provider_fees: dict`, `exchange: Any`, `transaction_parameters: dict`, `consume_market_fees=None`) -> `str`
+
+This function buys 1.0 datatokens from the given fixed-rate exchange for a user who wants to start an order, and then starts the order.
+
+It is implemented in the `Datatoken` class and inherited by the `Datatoken2` class.
+
+**Parameters**
+
+* `consumer` - address of the consumer wallet that needs funding
+* `service_index` - service index as int for identifying the service for which you want to further call [`start_order`](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#LL169C5-L197C10).
+* `provider_fees` - dictionary which includes the provider fees generated when the `initialize` endpoint from `Provider` was called.
+* `exchange` - the `OneExchange` object (or exchange ID) to buy the datatoken from.
+* `transaction_parameters` - the configuration `dictionary` for that specific transaction. Usually for `development` we include just the `from` wallet, but for remote networks, you can provide gas fees, required confirmations for that block, etc. For more info, check the [Brownie docs](https://eth-brownie.readthedocs.io/en/stable/).
+* `consume_market_fees` - [`TokenFeeInfo` ](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#L31)object which contains the consume market fee amount, address & token address. If it is not explicitly specified, it defaults to an empty `TokenFeeInfo` object.
+
+**Returns**
+
+`str`
+
+The return value is a hex string for the transaction hash which denotes the proof of starting the order.
+
+**Defined in**
+
+[models/datatoken.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#LL484C4-L518C10)
+
+<details>
+ +Source code + +```python + @enforce_types + def buy_DT_and_order( + self, + consumer: str, + service_index: int, + provider_fees: dict, + exchange: Any, + transaction_parameters: dict, + consume_market_fees=None, + ) -> str: + fre_address = get_address_of_type(self.config_dict, "FixedPrice") + + # import now, to avoid circular import + from ocean_lib.models.fixed_rate_exchange import OneExchange + + if not consume_market_fees: + consume_market_fees = TokenFeeInfo() + + if not isinstance(exchange, OneExchange): + exchange = OneExchange(fre_address, exchange) + + exchange.buy_DT( + datatoken_amt=to_wei(1), + consume_market_fee_addr=consume_market_fees.address, + consume_market_fee=consume_market_fees.amount, + tx_dict=transaction_parameters, + ) + + return self.start_order( + consumer=ContractBase.to_checksum_address(consumer), + service_index=service_index, + provider_fees=provider_fees, + consume_market_fees=consume_market_fees, + transaction_parameters=transaction_parameters, + ) + + +``` + +
+
+### Get Exchanges
+
+* **get\_exchanges**(`self`) -> `list`
+
+**Returns**
+
+`list`
+
+Returns `List[OneExchange]` - all the exchanges for this datatoken.
+
+It is implemented in `DatatokenBase` and inherited by `Datatoken2`, so it can be called on both instances.
+
+**Defined in**
+
+[models/datatoken.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#LL311C4-L322C25)
+
+<details>
+
+Source code
+
+{% code overflow="wrap" %}
+```python
+    @enforce_types
+    def get_exchanges(self) -> list:
+        """return List[OneExchange] - all the exchanges for this datatoken"""
+        # import now, to avoid circular import
+        from ocean_lib.models.fixed_rate_exchange import OneExchange
+
+        FRE = self._FRE()
+        addrs_and_exchange_ids = self.getFixedRates()
+        exchanges = [
+            OneExchange(FRE, exchange_id) for _, exchange_id in addrs_and_exchange_ids
+        ]
+        return exchanges
+```
+{% endcode %}
+
+</details>
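+
+A minimal usage sketch (assuming the `datatoken` from the examples above):
+
+```python
+exchanges = datatoken.get_exchanges()
+print(f"This datatoken has {len(exchanges)} fixed-rate exchange(s)")
+```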
+
+### Start Order
+
+* **start\_order**(`self`, `consumer: str`, `service_index: int`, `provider_fees: dict`, `transaction_parameters: dict`, `consume_market_fees=None`) -> `str`
+
+Starts an order for a certain datatoken.
+
+It is implemented in the `Datatoken` class and inherited by the `Datatoken2` class.
+
+**Parameters**
+
+* `consumer` - address of the consumer wallet that receives access to the service
+* `service_index` - service index as int for identifying the service to which you want to apply `start_order`.
+* `provider_fees` - dictionary which includes the provider fees generated when the `initialize` endpoint from `Provider` was called.
+* `transaction_parameters` - the configuration `dictionary` for that specific transaction. Usually for `development` we include just the `from` wallet, but for remote networks, you can provide gas fees, required confirmations for that block, etc. For more info, check the [Brownie docs](https://eth-brownie.readthedocs.io/en/stable/).
+* `consume_market_fees` - [`TokenFeeInfo` ](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#L31)object which contains the consume market fee amount, address & token address. If it is not explicitly specified, it defaults to an empty `TokenFeeInfo` object.
+
+**Returns**
+
+`str`
+
+The return value is a hex string for the transaction hash which denotes the proof of starting the order.
+
+**Defined in**
+
+[models/datatoken.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#LL169C5-L197C10)
+
+<details>
+
+Source code
+
+```python
+    @enforce_types
+    def start_order(
+        self,
+        consumer: str,
+        service_index: int,
+        provider_fees: dict,
+        transaction_parameters: dict,
+        consume_market_fees=None,
+    ) -> str:
+
+        if not consume_market_fees:
+            consume_market_fees = TokenFeeInfo()
+
+        return self.contract.startOrder(
+            checksum_addr(consumer),
+            service_index,
+            (
+                checksum_addr(provider_fees["providerFeeAddress"]),
+                checksum_addr(provider_fees["providerFeeToken"]),
+                int(provider_fees["providerFeeAmount"]),
+                provider_fees["v"],
+                provider_fees["r"],
+                provider_fees["s"],
+                provider_fees["validUntil"],
+                provider_fees["providerData"],
+            ),
+            consume_market_fees.to_tuple(),
+            transaction_parameters,
+        )
+```
+
+</details>
+
+### Reuse Order
+
+* **reuse\_order**(`self`, `order_tx_id: Union[str, bytes]`, `provider_fees: dict`, `transaction_parameters: dict`) -> `str`
+
+Reuses an existing order for a certain datatoken.
+
+It is implemented in the `Datatoken` class and inherited by the `Datatoken2` class.
+
+**Parameters**
+
+* `order_tx_id` - transaction hash of a previous order, in string or bytes format.
+* `provider_fees` - dictionary which includes the provider fees generated when the `initialize` endpoint from `Provider` was called.
+* `transaction_parameters` - the configuration `dictionary` for that specific transaction. Usually for `development` we include just the `from` wallet, but for remote networks, you can provide gas fees, required confirmations for that block, etc. For more info, check the [Brownie docs](https://eth-brownie.readthedocs.io/en/stable/).
+
+**Returns**
+
+`str`
+
+The return value is a hex string for the transaction hash which denotes the proof of reusing the order.
+
+**Defined in**
+
+[models/datatoken.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#LL199C5-L219C10)
+
+<details>
+ +Source code + +```python + @enforce_types + def reuse_order( + self, + order_tx_id: Union[str, bytes], + provider_fees: dict, + transaction_parameters: dict, + ) -> str: + return self.contract.reuseOrder( + order_tx_id, + ( + checksum_addr(provider_fees["providerFeeAddress"]), + checksum_addr(provider_fees["providerFeeToken"]), + int(provider_fees["providerFeeAmount"]), + provider_fees["v"], + provider_fees["r"], + provider_fees["s"], + provider_fees["validUntil"], + provider_fees["providerData"], + ), + transaction_parameters, + ) +``` + +
diff --git a/developers/ocean.py/install.md b/developers/ocean.py/install.md
new file mode 100644
index 00000000..e775203c
--- /dev/null
+++ b/developers/ocean.py/install.md
@@ -0,0 +1,61 @@
+# Install
+
+Let's start interacting with the Python library by first installing it and its prerequisites.
+
+From the adventurous `Python 3.8.5` all the way up to `Python 3.10.4`, ocean.py has got your back! 🚀
+
+While `ocean.py` can join you on your `Python 3.11` journey, a few manual tweaks may be required. But worry not, brave explorers, we've got all the juicy details for you below! 📚✨
+
+⚠️ Make sure that you have `autoconf`, `pkg-config` and `build-essential` or their equivalents installed on your host.
+
+### Installing ocean.py
+
+ocean.py is a Python library [on pypi as ocean-lib](https://pypi.org/project/ocean-lib/). So after you have completed the prerequisites step, let's create a new console for library installation:
+
+```bash
+# Create your working directory
+mkdir my_project
+cd my_project
+
+# Initialize virtual environment and activate it. Install artifacts.
+# Make sure your Python version inside the venv is >=3.8.
+# Anaconda is not fully supported for now, please use venv
+python3 -m venv venv
+source venv/bin/activate
+
+# Avoid errors for the step that follows
+pip install wheel
+
+# Install Ocean library.
+pip install ocean-lib
+```
+
+### Potential issues & workarounds
+
+Issue: M1 \* `coincurve` or `cryptography`
+
+* If you have an Apple M1 processor, `coincurve` and `cryptography` installation may fail due to missing packages, which come pre-packaged in other operating systems.
+* Workaround: ensure you have `autoconf`, `automake` and `libtool` installed as mentioned in the prerequisites, e.g. using Homebrew or MacPorts.
+
+Issue: MacOS “Unsupported Architecture”
+
+* If you run MacOS, you may encounter an “Unsupported Architecture” issue.
+* Workaround: install with ARCHFLAGS set: `ARCHFLAGS="-arch x86_64" pip install ocean-lib`. [Details](https://github.com/oceanprotocol/ocean.py/issues/486).
+
+To install ocean-lib using Python 3.11, run `pip install vyper==0.3.7 --ignore-requires-python` and `sudo apt-get install python3.11-dev` before installing ocean-lib. Since the parsimonious dependency does not support Python 3.11, you need to edit `parsimonious/expressions.py` to `import getfullargspec as getargsspec` instead of the regular import. These are temporary fixes until all dependencies fully support Python 3.11. We do not directly use Vyper in ocean-lib.
+
+### ocean.py uses Brownie
+
+Let's dive deeper into the Ocean world! 💙 Did you know that Ocean and Brownie are like best buddies? When you installed Ocean (the ocean-lib pypi package) earlier, it automatically took care of installing Brownie (the eth-brownie package) too. Talk about a dynamic duo! 🦸‍♀️🦸‍♂️
+
+`ocean.py` treats each Ocean smart contract as a Python class, and each deployed smart contract as a Python object. We love this feature, because it means Python programmers can treat Solidity code as Python code! 🤯
+
+### Helpful resources
+
+Oh, buoy! 🌊🐙 When it comes to installation, ocean.py has you covered with a special README called ["install.md"](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/install.md). It's like a trusty guide that helps you navigate all the nitty-gritty details. So, let's dive in and ride the waves of installation together! 
🏄‍♂️🌊
+
+Or, if you prefer a video format, you can check this tutorial on YouTube:
+
+{% embed url="https://www.youtube.com/watch?v=mbniGPNHE_M" %}
+Install ocean.py
+{% endembed %}
diff --git a/developers/ocean.py/local-setup.md b/developers/ocean.py/local-setup.md
new file mode 100644
index 00000000..db206dd0
--- /dev/null
+++ b/developers/ocean.py/local-setup.md
@@ -0,0 +1,123 @@
+---
+description: Local setup for running & testing ocean.py
+---
+
+# Local Setup
+
+On this page, we continue our journey from the [installation part](install.md) and set up for local testing. Local setup means that we will use Ganache as the local blockchain where we can execute transactions, and all the services point to this network.
+
+⚠️ Ocean local setup uses Docker, which is fine for Linux/Ubuntu but plays badly with MacOS and Windows. If you are on one of these, you'll want the [remote setup](remote-setup.md).
+
+Here are the steps for configuring ocean.py on the Ganache network using Barge.
+
+### Prerequisites
+
+Ahoy there, matey! 🌊⚓️ When it comes to setting up ocean.py locally, we're diving into the world of Docker containers. These clever containers hold our trusty local blockchain nodes (Ganache) and the mighty Ocean middleware (the Aquarius metadata cache and the Provider to aid in consuming data assets). But fear not, for a smooth sailing experience, you'll need to ensure the following Docker components are shipshape and ready to go:
+
+1. [Docker](https://docs.docker.com/engine/install/) 🐳
+2. [Docker Compose](https://docs.docker.com/compose/install/) 🛠️
+3. Oh, and don't forget to [allow those non-root users](https://www.thegeekdiary.com/run-docker-as-a-non-root-user/) to join in on the fun! 🙅‍♂️
+
+So hoist the anchor, prepare your Docker crew, and let's embark on an exciting voyage with ocean.py! 🚢⛵️
+
+### 1. Download barge and run services
+
+Ocean `barge` runs Ganache (local blockchain), Provider (data service), and Aquarius (metadata cache).
+
+Barge helps you quickly become familiar with Ocean, because the local blockchain has low latency and no transaction fees.
+
+In a new console:
+
+```bash
+# Grab repo
+git clone https://github.com/oceanprotocol/barge
+cd barge
+
+# Clean up old containers (to be sure)
+docker system prune -a --volumes
+
+# Run barge: start Ganache, Provider, Aquarius; deploy contracts; update ~/.ocean
+./start_ocean.sh
+```
+
+Let Barge do its magic and wait until the blockchain is fully synced, i.e. until you start to see `eth_blockNumber` being logged continuously.
+
+### 2. Brownie local network configuration
+
+(You don't need to do anything in this step, it's just useful to understand.)
+
+Brownie's network configuration file is at `~/.brownie/network-config.yaml`.
+
+When running locally, Brownie will use the chain listed under `development`, having id `development`. This refers to Ganache, which is running in Barge.
+
+### 3. Set envvars
+
+From here on, use a console different from the Barge one. (E.g. the console where you installed Ocean, or a new one.)
+
+First, ensure that you're in the working directory, with venv activated:
+
+```bash
+cd my_project
+source venv/bin/activate
+```
+
+For this tutorial, Alice is the publisher of the dataset and Bob is the consumer. As a Linux user, you'll use "`export`" for setting the private keys. 
In the same console:
+
+```bash
+# keys for alice and bob
+export TEST_PRIVATE_KEY1=0x8467415bb2ba7c91084d932276214b11a3dd9bdb2930fefa194b666dd8020b99
+export TEST_PRIVATE_KEY2=0x1d751ded5a32226054cd2e71261039b65afb9ee1c746d055dd699b1150a5befc
+
+# key for minting fake OCEAN
+export FACTORY_DEPLOYER_PRIVATE_KEY=0xc594c6e5def4bab63ac29eed19a134c130388f74f019bc74b8f4389df2837a58
+```
+
+### 4. Setup in Python
+
+In the same console, run the Python console:
+
+```bash
+python
+```
+
+In the Python console:
+
+```python
+# Create Ocean instance
+from ocean_lib.web3_internal.utils import connect_to_network
+connect_to_network("development")
+
+from ocean_lib.example_config import get_config_dict
+config = get_config_dict("development")
+
+from ocean_lib.ocean.ocean import Ocean
+ocean = Ocean(config)
+
+# Create OCEAN object. Barge auto-deployed OCEAN, and the ocean instance knows its address
+OCEAN = ocean.OCEAN_token
+
+# Mint fake OCEAN to Alice & Bob
+from ocean_lib.ocean.mint_fake_ocean import mint_fake_OCEAN
+mint_fake_OCEAN(config)
+
+# Create Alice's wallet
+import os
+from brownie.network import accounts
+accounts.clear()
+
+alice_private_key = os.getenv("TEST_PRIVATE_KEY1")
+alice = accounts.add(alice_private_key)
+assert alice.balance() > 0, "Alice needs ETH"
+assert OCEAN.balanceOf(alice) > 0, "Alice needs OCEAN"
+
+# Create additional wallets. While some flows just use Alice's wallet, it's simpler to do all here.
+bob_private_key = os.getenv('TEST_PRIVATE_KEY2')
+bob = accounts.add(bob_private_key)
+assert bob.balance() > 0, "Bob needs ETH"
+assert OCEAN.balanceOf(bob) > 0, "Bob needs OCEAN"
+
+# Compact wei <> eth conversion
+from ocean_lib.ocean.util import to_wei, from_wei
+```
diff --git a/developers/ocean.py/ocean-assets-tech-details.md b/developers/ocean.py/ocean-assets-tech-details.md
new file mode 100644
index 00000000..831b14b9
--- /dev/null
+++ b/developers/ocean.py/ocean-assets-tech-details.md
@@ -0,0 +1,981 @@
+---
+description: Technical details about OceanAssets functions
+---
+
+# Ocean Assets Tech Details
+
+Through this class we can publish different types of assets & consume them to make 💲💲💲
+
+### Creates URL Asset
+
+* **create\_url\_asset**(`self`, `name: str`, `url: str`, `publisher_wallet`, `wait_for_aqua: bool = True`) -> `tuple`
+
+It is one of the most used functions in all the READMEs.
+
+Creates an asset of type "dataset", having `UrlFiles`, with good defaults.
+
+It can be called after instantiating the Ocean object.
+
+**Parameters**
+
+* `name` - name of the asset, `string`
+* `url` - url that is stored in the asset, `string`
+* `publisher_wallet` - wallet of the asset publisher/owner, `Brownie account`
+* `wait_for_aqua` - boolean whose default is `True`; waiting for Aquarius to index the asset takes additional time, but if you want to be sure that your asset is indexed, keep the default value.
+
+**Returns**
+
+`tuple`
+
+A tuple which contains the data NFT, the datatoken and the data asset.
+
+**Defined in**
+
+[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL178C1-L185C82)
+
+<details>
+ +Source code + +{% code overflow="wrap" %} +```python + @enforce_types + def create_url_asset( + self, name: str, url: str, publisher_wallet, wait_for_aqua: bool = True + ) -> tuple: + """Create asset of type "data", having UrlFiles, with good defaults""" + metadata = self._default_metadata(name, publisher_wallet) + files = [UrlFile(url)] + return self._create_1dt(metadata, files, publisher_wallet, wait_for_aqua) +``` +{% endcode %} + +
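+
+A minimal usage sketch, mirroring the Compute Flow page (it assumes the `ocean` instance and `alice` wallet from the setup pages):
+
+```python
+(data_nft, datatoken, ddo) = ocean.assets.create_url_asset(
+    "Branin dataset",
+    "https://raw.githubusercontent.com/oceanprotocol/c2d-examples/main/branin_and_gpr/branin.arff",
+    {"from": alice},
+)
+print(f"Published asset, did = '{ddo.did}'")
+```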
+
+### Creates Algorithm Asset
+
+* **create\_algo\_asset**(`self`, `name: str`, `url: str`, `publisher_wallet`, `image: str = "oceanprotocol/algo_dockers"`, `tag: str = "python-branin"`, `checksum: str = "sha256:8221d20c1c16491d7d56b9657ea09082c0ee4a8ab1a6621fa720da58b09580e4"`, `wait_for_aqua: bool = True`) -> `tuple`
+
+Creates an asset of type "algorithm", having `UrlFiles`, with good defaults.
+
+It can be called after instantiating the Ocean object.
+
+**Parameters**
+
+* `name` - name of the asset, `string`
+* `url` - url that is stored in the asset, `string`
+* `publisher_wallet` - wallet of the asset publisher/owner, `Brownie account`
+* `image` - docker image of that algorithm, `string`
+* `tag` - docker tag for that algorithm image, `string`
+* `checksum` - docker checksum for the algorithm's image, `string`
+* `wait_for_aqua` - boolean whose default is `True`; waiting for Aquarius to index the asset takes additional time, but if you want to be sure that your asset is indexed, keep the default value.
+
+**Returns**
+
+`tuple`
+
+A tuple which contains the algorithm NFT, the algorithm datatoken and the algorithm asset.
+
+**Defined in**
+
+[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL146C4-L176C82)
+
+<details>
+
+Source code
+
+{% code overflow="wrap" %}
+```python
+    @enforce_types
+    def create_algo_asset(
+        self,
+        name: str,
+        url: str,
+        publisher_wallet,
+        image: str = "oceanprotocol/algo_dockers",
+        tag: str = "python-branin",
+        checksum: str = "sha256:8221d20c1c16491d7d56b9657ea09082c0ee4a8ab1a6621fa720da58b09580e4",
+        wait_for_aqua: bool = True,
+    ) -> tuple:
+        """Create asset of type "algorithm", having UrlFiles, with good defaults"""
+
+        if image == "oceanprotocol/algo_dockers" or tag == "python-branin":
+            assert image == "oceanprotocol/algo_dockers" and tag == "python-branin"
+
+        metadata = self._default_metadata(name, publisher_wallet, "algorithm")
+        metadata["algorithm"] = {
+            "language": "python",
+            "format": "docker-image",
+            "version": "0.1",
+            "container": {
+                "entrypoint": "python $ALGO",
+                "image": image,
+                "tag": tag,
+                "checksum": checksum,
+            },
+        }
+
+        files = [UrlFile(url)]
+        return self._create_1dt(metadata, files, publisher_wallet, wait_for_aqua)
+```
+{% endcode %}
+
+</details>
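+
+A minimal usage sketch, mirroring the Compute Flow page (the default image, tag, and checksum match the sample `gpr.py` algorithm):
+
+```python
+(algo_nft, algo_datatoken, algo_ddo) = ocean.assets.create_algo_asset(
+    "gpr",
+    "https://raw.githubusercontent.com/oceanprotocol/c2d-examples/main/branin_and_gpr/gpr.py",
+    {"from": alice},
+)
+print(f"Algorithm asset did = '{algo_ddo.did}'")
+```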
+
+### Creates Arweave Asset
+
+* **create\_arweave\_asset**(`self`, `name: str`, `transaction_id: str`, `publisher_wallet`, `wait_for_aqua: bool = True`) -> `tuple`
+
+Creates an asset of type "data", having `ArweaveFile`, with good defaults.
+
+It can be called after instantiating the Ocean object.
+
+**Parameters**
+
+* `name` - name of the asset, `string`
+* `transaction_id` - the Arweave transaction id of the file, `string`
+* `publisher_wallet` - wallet of the asset publisher/owner, `Brownie account`
+* `wait_for_aqua` - boolean whose default is `True`; waiting for Aquarius to index the asset takes additional time, but if you want to be sure that your asset is indexed, keep the default value.
+
+**Returns**
+
+`tuple`
+
+A tuple which contains the data NFT, the datatoken and the data asset.
+
+**Defined in**
+
+[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL187C5-L198C82)
+
+<details>
+
+Source code
+
+{% code overflow="wrap" %}
+```python
+    @enforce_types
+    def create_arweave_asset(
+        self,
+        name: str,
+        transaction_id: str,
+        publisher_wallet,
+        wait_for_aqua: bool = True,
+    ) -> tuple:
+        """Create asset of type "data", having ArweaveFiles, with good defaults"""
+        metadata = self._default_metadata(name, publisher_wallet)
+        files = [ArweaveFile(transaction_id)]
+        return self._create_1dt(metadata, files, publisher_wallet, wait_for_aqua)
+```
+{% endcode %}
+
+</details>
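+
+A minimal usage sketch; the transaction id below is a placeholder you would replace with a real Arweave transaction id:
+
+```python
+transaction_id = "<your-arweave-transaction-id>"  # placeholder
+(data_nft, datatoken, ddo) = ocean.assets.create_arweave_asset(
+    "My Arweave dataset", transaction_id, {"from": alice}
+)
+```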
+
+### Creates GraphQL Asset
+
+* **create\_graphql\_asset**(`self`, `name: str`, `url: str`, `query: str`, `publisher_wallet`, `wait_for_aqua: bool = True`) -> `tuple`
+
+Creates an asset of type "data", having `GraphqlQuery` files, with good defaults.
+
+It can be called after instantiating the Ocean object.
+
+**Parameters**
+
+* `name` - name of the asset, `string`
+* `url` - url of the subgraph that you are using, `string`
+* `query` - the GraphQL query, `string`
+* `publisher_wallet` - wallet of the asset publisher/owner, `Brownie account`
+* `wait_for_aqua` - boolean whose default is `True`; waiting for Aquarius to index the asset takes additional time, but if you want to be sure that your asset is indexed, keep the default value.
+
+**Returns**
+
+`tuple`
+
+A tuple which contains the data NFT, the datatoken and the data asset.
+
+**Defined in**
+
+[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL200C5-L212C82)
+
+<details>
+
+Source code
+
+{% code overflow="wrap" %}
+```python
+    @enforce_types
+    def create_graphql_asset(
+        self,
+        name: str,
+        url: str,
+        query: str,
+        publisher_wallet,
+        wait_for_aqua: bool = True,
+    ) -> tuple:
+        """Create asset of type "data", having GraphqlQuery files, w good defaults"""
+        metadata = self._default_metadata(name, publisher_wallet)
+        files = [GraphqlQuery(url, query)]
+        return self._create_1dt(metadata, files, publisher_wallet, wait_for_aqua)
+```
+{% endcode %}
+
+</details>
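+
+A minimal usage sketch; the subgraph endpoint below is a hypothetical example, and the query simply lists a few data NFT ids:
+
+```python
+url = "https://v4.subgraph.sepolia.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph"  # example endpoint
+query = """
+query {
+  nfts(first: 3) {
+    id
+  }
+}
+"""
+(data_nft, datatoken, ddo) = ocean.assets.create_graphql_asset(
+    "My GraphQL dataset", url, query, {"from": alice}
+)
+```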
+
+### Creates Onchain Asset
+
+* **create\_onchain\_asset**(`self`, `name: str`, `contract_address: str`, `contract_abi: dict`, `publisher_wallet`, `wait_for_aqua: bool = True`) -> `tuple`
+
+Creates an asset of type "data", having `SmartContractCall` files, with good defaults.
+
+It can be called after instantiating the Ocean object.
+
+**Parameters**
+
+* `name` - name of the asset, `string`
+* `contract_address` - the contract address that should be stored in the asset, `string`
+* `contract_abi` - ABI of the contract function to call, `dict`
+* `publisher_wallet` - wallet of the asset publisher/owner, `Brownie account`
+* `wait_for_aqua` - boolean whose default is `True`; waiting for Aquarius to index the asset takes additional time, but if you want to be sure that your asset is indexed, keep the default value.
+
+**Returns**
+
+`tuple`
+
+A tuple which contains the data NFT, the datatoken and the data asset.
+
+**Defined in**
+
+[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL214C5-L229C1)
+
+<details>
+
+Source code
+
+{% code overflow="wrap" %}
+```python
+    @enforce_types
+    def create_onchain_asset(
+        self,
+        name: str,
+        contract_address: str,
+        contract_abi: dict,
+        publisher_wallet,
+        wait_for_aqua: bool = True,
+    ) -> tuple:
+        """Create asset of type "data", having SmartContractCall files, w defaults"""
+        chain_id = self._chain_id
+        onchain_data = SmartContractCall(contract_address, chain_id, contract_abi)
+        files = [onchain_data]
+        metadata = self._default_metadata(name, publisher_wallet)
+        return self._create_1dt(metadata, files, publisher_wallet, wait_for_aqua)
+```
+{% endcode %}
+
+</details>
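+
+A minimal usage sketch; the contract address is a placeholder, and the single-function ABI shown (a `view` function assumed to be named `swapOceanFee`) is only an illustrative example:
+
+```python
+contract_address = "<your-contract-address>"  # placeholder
+contract_abi = {
+    "inputs": [],
+    "name": "swapOceanFee",  # assumed example function
+    "outputs": [{"internalType": "uint256", "name": "", "type": "uint256"}],
+    "stateMutability": "view",
+    "type": "function",
+}
+(data_nft, datatoken, ddo) = ocean.assets.create_onchain_asset(
+    "My on-chain dataset", contract_address, contract_abi, {"from": alice}
+)
+```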
+
+### Creates Asset (for advanced skills)
+
+* **create**(`self`, `metadata: dict`, `publisher_wallet`, `credentials: Optional[dict] = None`, `data_nft_address: Optional[str] = None`, `data_nft_args: Optional[DataNFTArguments] = None`, `deployed_datatokens: Optional[List[Datatoken]] = None`, `services: Optional[list] = None`, `datatoken_args: Optional[List["DatatokenArguments"]] = None`, `encrypt_flag: Optional[bool] = True`, `compress_flag: Optional[bool] = True`, `wait_for_aqua: bool = True`) -> `tuple`
+
+Registers an asset on-chain. Asset = {data\_NFT, >=0 datatokens, DDO}. It creates/deploys a data NFT contract and publishes the DDO to the metadata store (Aquarius).
+
+**Parameters**
+
+* `metadata`: `dictionary` conforming to the Metadata accepted by Ocean Protocol.
+* `publisher_wallet` - `Brownie account` of the publisher registering this asset.
+* `credentials` - credentials `dictionary` necessary for the asset, which establishes who can consume the asset and who cannot.
+* `data_nft_address` - hex string, the address of the data NFT. The new asset will be associated with this data NFT address.
+* `data_nft_args` - object of `DataNFTArguments` type if creating a new data NFT.
+* `deployed_datatokens` - list of datatokens which are already deployed.
+* `services` - list of `Service` objects if you want to run multiple services for a datatoken, or you have multiple datatokens with a single service each.
+* `datatoken_args` - list of objects of `DatatokenArguments` type if creating new datatokens.
+* `encrypt_flag` - bool for encryption of the DDO.
+* `compress_flag` - bool for compression of the DDO.
+* `wait_for_aqua` - bool for waiting until the DDO is indexed in Aquarius.
+
+**Returns**
+
+`tuple`
+
+A tuple which contains the data NFT, the datatoken and the data asset.
+
+**Defined in**
+
+[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL259C5-L390C43)
+
+<details>
+
+Source code
+
+{% code overflow="wrap" %}
+```python
+def create(
+    self,
+    metadata: dict,
+    publisher_wallet,
+    credentials: Optional[dict] = None,
+    data_nft_address: Optional[str] = None,
+    data_nft_args: Optional[DataNFTArguments] = None,
+    deployed_datatokens: Optional[List[Datatoken]] = None,
+    services: Optional[list] = None,
+    datatoken_args: Optional[List["DatatokenArguments"]] = None,
+    encrypt_flag: Optional[bool] = True,
+    compress_flag: Optional[bool] = True,
+    wait_for_aqua: bool = True,
+) -> Optional[DDO]:
+
+    self._assert_ddo_metadata(metadata)
+
+    provider_uri = DataServiceProvider.get_url(self._config_dict)
+
+    if not data_nft_address:
+        data_nft_args = data_nft_args or DataNFTArguments(
+            metadata["name"], metadata["name"]
+        )
+        data_nft = data_nft_args.deploy_contract(
+            self._config_dict, publisher_wallet
+        )
+        # register on-chain
+        if not data_nft:
+            logger.warning("Creating new NFT failed.")
+            return None, None, None
+        logger.info(f"Successfully created NFT with address {data_nft.address}.")
+    else:
+        data_nft = DataNFT(self._config_dict, data_nft_address)
+
+    # Create DDO object
+    ddo = DDO()
+
+    # Generate the did, add it to the ddo.
+    ddo.did = data_nft.calculate_did()
+    # Check if it's already registered first!
+    if self._aquarius.ddo_exists(ddo.did):
+        raise AquariusError(
+            f"Asset id {ddo.did} is already registered to another asset."
+        )
+    ddo.chain_id = self._chain_id
+    ddo.metadata = metadata
+
+    ddo.credentials = credentials if credentials else {"allow": [], "deny": []}
+
+    ddo.nft_address = data_nft.address
+    datatokens = []
+
+    if not deployed_datatokens:
+        services = []
+        for datatoken_arg in datatoken_args:
+            new_dt = datatoken_arg.create_datatoken(
+                data_nft, publisher_wallet, with_services=True
+            )
+            datatokens.append(new_dt)
+
+            services.extend(datatoken_arg.services)
+
+        for service in services:
+            ddo.add_service(service)
+    else:
+        if not services:
+            logger.warning("services required with deployed_datatokens.")
+            return None, None, None
+
+        datatokens = deployed_datatokens
+        dt_addresses = []
+        for datatoken in datatokens:
+            if deployed_datatokens[0].address not in data_nft.getTokensList():
+                logger.warning(
+                    "some deployed_datatokens don't belong to the given data nft."
+                )
+                return None, None, None
+
+            dt_addresses.append(datatoken.address)
+
+        for service in services:
+            if service.datatoken not in dt_addresses:
+                logger.warning("Datatoken services mismatch.")
+                return None, None, None
+
+            ddo.add_service(service)
+
+    # Validation by Aquarius
+    _, proof = self.validate(ddo)
+    proof = (
+        proof["publicKey"],
+        proof["v"],
+        proof["r"][0],
+        proof["s"][0],
+    )
+
+    document, flags, ddo_hash = self._encrypt_ddo(
+        ddo, provider_uri, encrypt_flag, compress_flag
+    )
+
+    data_nft.setMetaData(
+        0,
+        provider_uri,
+        Web3.toChecksumAddress(publisher_wallet.address.lower()).encode("utf-8"),
+        flags,
+        document,
+        ddo_hash,
+        [proof],
+        {"from": publisher_wallet},
+    )
+
+    # Fetch the ddo on chain
+    if wait_for_aqua:
+        ddo = self._aquarius.wait_for_ddo(ddo.did)
+
+    return (data_nft, datatokens, ddo)
+```
+{% endcode %}
+
+**Publishing Alternatives**
+
+Here are some examples similar to the `create()` call above, but they expose more fine-grained control.
+
+In the same Python console:
+
+```python
+# Specify metadata and services, using the Branin test dataset
+date_created = "2021-12-28T10:55:11Z"
+metadata = {
+    "created": date_created,
+    "updated": date_created,
+    "description": "Branin dataset",
+    "name": "Branin dataset",
+    "type": "dataset",
+    "author": "Trent",
+    "license": "CC0: PublicDomain",
+}
+
+# Use "UrlFile" asset type. (There are other options)
+from ocean_lib.structures.file_objects import UrlFile
+url_file = UrlFile(
+    url="https://raw.githubusercontent.com/trentmc/branin/main/branin.arff"
+)
+
+# Publish data asset
+from ocean_lib.models.datatoken_base import DatatokenArguments
+_, _, ddo = ocean.assets.create(
+    metadata,
+    {"from": alice},
+    datatoken_args=[DatatokenArguments(files=[url_file])],
+)
+```
+
+**DDO Encryption or Compression**
+
+The DDO is stored on-chain. It's encrypted and compressed by default. Therefore it supports GDPR "right-to-be-forgotten" compliance rules by default.
+
+You can control this during `create()`:
+
+* To disable encryption, use `ocean.assets.create(..., encrypt_flag=False)`.
+* To disable compression, use `ocean.assets.create(..., compress_flag=False)`.
+* To disable both, use `ocean.assets.create(..., encrypt_flag=False, compress_flag=False)`.
+
+**Create **_**just**_** a data NFT**
+
+Calling `create()` like above generates a data NFT, a datatoken for that NFT, and a ddo. This is the most common case. However, sometimes you may want _just_ the data NFT, e.g. if using a data NFT as a simple key-value store. Here's how:
+
+```python
+data_nft = ocean.data_nft_factory.create({"from": alice}, 'NFT1', 'NFT1')
+```
+
+If you call `create()` after this, you can pass in an argument `data_nft_address:string` and it will use that NFT rather than creating a new one.
+
+**Create a datatoken from a data NFT**
+
+Calling `create()` like above generates a data NFT, a datatoken for that NFT, and a ddo object. However, we may want a second datatoken. Or, we may have started with _just_ the data NFT, and want to add a datatoken to it. Here's how:
+
+```python
+datatoken = data_nft.create_datatoken({"from": alice}, "Datatoken 1", "DT1")
+```
+
+If you call `create()` after this, you can pass in an argument `deployed_datatokens:List[Datatoken1]` and it will use those datatokens during creation.
+
+**Create an asset & pricing schema simultaneously**
+
+Ocean Assets allows you to bundle several common scenarios as a single transaction, thus lowering gas fees.
+
+Any of the `ocean.assets.create_*_asset()` functions can also take an optional parameter that describes a bundled pricing schema (Dispenser or Fixed Rate Exchange).
+
+Here is an example involving an exchange:
+
+{% code overflow="wrap" %}
+```python
+from ocean_lib.models.fixed_rate_exchange import ExchangeArguments
+(data_nft, datatoken, ddo) = ocean.assets.create_url_asset(
+    name,
+    url,
+    {"from": alice},
+    pricing_schema_args=ExchangeArguments(rate=to_wei(3), base_token_addr=ocean.OCEAN_address, dt_decimals=18)
+)
+
+assert len(datatoken.get_exchanges()) == 1
+```
+{% endcode %}
+
+
+### Updates Asset
+
+* **update**(`self`, `ddo: DDO`, `publisher_wallet`, `provider_uri: Optional[str] = None`, `encrypt_flag: Optional[bool] = True`, `compress_flag: Optional[bool] = True`) -> `Optional[DDO]`
+
+Updates a DDO on-chain.
+
+**Parameters**
+
+* `ddo` - DDO to update
+* `publisher_wallet` - wallet of the user who published this DDO
+* `provider_uri` - URL of the service provider. This will be used as the base to construct the serviceEndpoint for the `access` (download) service.
+* `encrypt_flag` - boolean value for encrypting the DDO
+* `compress_flag` - boolean value for compressing the DDO
+
+**Returns**
+
+`DDO` or `None`
+
+The updated DDO, or `None` if the updated DDO is not found in Aquarius.
+
+**Defined in**
+
+[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL392C5-L454C19)
+
+ +Source code + +{% code overflow="wrap" %} +```python +@enforce_types + def update( + self, + ddo: DDO, + publisher_wallet, + provider_uri: Optional[str] = None, + encrypt_flag: Optional[bool] = True, + compress_flag: Optional[bool] = True, + ) -> Optional[DDO]: + + self._assert_ddo_metadata(ddo.metadata) + + if not provider_uri: + provider_uri = DataServiceProvider.get_url(self._config_dict) + + assert ddo.nft_address, "need nft address to update a ddo" + data_nft = DataNFT(self._config_dict, ddo.nft_address) + + assert ddo.chain_id == self._chain_id + + for service in ddo.services: + service.encrypt_files(ddo.nft_address) + + # Validation by Aquarius + validation_result, errors_or_proof = self.validate(ddo) + if not validation_result: + msg = f"DDO has validation errors: {errors_or_proof}" + logger.error(msg) + raise ValueError(msg) + + document, flags, ddo_hash = self._encrypt_ddo( + ddo, provider_uri, encrypt_flag, compress_flag + ) + + proof = ( + errors_or_proof["publicKey"], + errors_or_proof["v"], + errors_or_proof["r"][0], + errors_or_proof["s"][0], + ) + + tx_result = data_nft.setMetaData( + 0, + provider_uri, + Web3.toChecksumAddress(publisher_wallet.address.lower()).encode("utf-8"), + flags, + document, + ddo_hash, + [proof], + {"from": publisher_wallet}, + ) + + ddo = self._aquarius.wait_for_ddo_update(ddo, tx_result.txid) + + return ddo +``` +{% endcode %} + +
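+
+A minimal usage sketch (assuming the `ocean` instance and `alice` wallet from the setup flows, and a previously published `ddo`):
+
+```python
+# Fetch the latest DDO from Aquarius, tweak its metadata, then push the update
+ddo = ocean.assets.resolve(ddo.did)
+ddo.metadata["description"] = "Branin dataset, revised description"
+ddo = ocean.assets.update(ddo, alice)
+```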
+
+### Resolves Asset
+
+* **resolve**(`self`, `did: str`) -> `"DDO"`
+
+Resolves the asset from the Metadata Cache store (Aquarius).
+
+**Parameter**
+
+* `did` - identifier of the DDO to be searched for & resolved in Aquarius
+
+**Returns**
+
+`DDO`
+
+The resolved DDO instance.
+
+**Defined in**
+
+[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL456C5-L458C43)
+
+ +Source code + +```python +@enforce_types + def resolve(self, did: str) -> "DDO": + return self._aquarius.get_ddo(did) +``` + +
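+
+A minimal usage sketch (the DID below is a hypothetical placeholder):
+
+```python
+did = "did:op:..."  # hypothetical DID of a published asset
+ddo = ocean.assets.resolve(did)
+print(ddo.metadata["name"])
+```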
+
+### Searches Assets by Text
+
+* **search**(`self`, `text: str`) -> `list`
+
+Searches for DDOs containing a specific text.
+
+**Parameter**
+
+* `text` - the string to search for in assets.
+
+**Returns**
+
+`list`
+
+A list of DDOs that match the provided text.
+
+**Defined in**
+
+[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL460C4-L475C10)
+
+ +Source code + +```python +@enforce_types + def search(self, text: str) -> list: + """ + Search for DDOs in aquarius that contain the target text string + :param text - target string + :return - List of DDOs that match with the query + """ + logger.info(f"Search for DDOs containing text: {text}") + text = text.replace(":", "\\:").replace("\\\\:", "\\:") + return [ + DDO.from_dict(ddo_dict["_source"]) + for ddo_dict in self._aquarius.query_search( + {"query": {"query_string": {"query": text}}} + ) + if "_source" in ddo_dict + ] +``` + +
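+
+A minimal usage sketch:
+
+```python
+ddos = ocean.assets.search("Branin")
+print(f"Found {len(ddos)} matching assets")
+```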
+
+### Searches Assets by Query
+
+* **query**(`self`, `query: dict`) -> `list`
+
+Searches for DDOs using an Elasticsearch-style query dictionary, as accepted by the Aquarius REST API.
+
+**Parameter**
+
+* `query` - `dictionary` type query to search assets by.
+
+**Returns**
+
+`list`
+
+A list of DDOs that match the provided query.
+
+**Defined in**
+
+[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL477C4-L490C10)
+
+ +Source code + +{% code overflow="wrap" %} +```python + @enforce_types + def query(self, query: dict) -> list: + """ + Search for DDOs in aquarius with a search query dict + :param query - dict with query parameters + More info at: https://docs.oceanprotocol.com/api-references/aquarius-rest-api + :return - List of DDOs that match the query. + """ + logger.info(f"Search for DDOs matching query: {query}") + return [ + DDO.from_dict(ddo_dict["_source"]) + for ddo_dict in self._aquarius.query_search(query) + if "_source" in ddo_dict + ] +``` +{% endcode %} + +
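+
+A minimal usage sketch, using an Elasticsearch-style query (the field names below follow the DDO structure and are an assumption; adjust them to your needs):
+
+```python
+query = {
+    "query": {
+        "bool": {
+            "filter": [
+                {"term": {"chainId": 80001}},           # Mumbai
+                {"match": {"metadata.name": "Branin"}},
+            ]
+        }
+    }
+}
+ddos = ocean.assets.query(query)
+print(f"Found {len(ddos)} matching assets")
+```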
+
+### Downloads Asset
+
+* **download\_asset**(`self`, `ddo: DDO`, `consumer_wallet`, `destination: str`, `order_tx_id: Union[str, bytes]`, `service: Optional[Service] = None`, `index: Optional[int] = None`, `userdata: Optional[dict] = None`) -> `str`
+
+Downloads the files of an asset.
+
+**Parameters**
+
+* `ddo` - DDO to be downloaded.
+* `consumer_wallet` - Brownie account for the wallet that "ordered" the asset.
+* `destination` - destination path, as string, where the asset will be downloaded.
+* `order_tx_id` - transaction ID of the placed order; string and bytes formats are accepted.
+
+**Optional parameters**
+
+* `service` - the `Service` object through which to download the asset. Defaults to the asset's first service.
+* `index` - if you want to download a single file rather than the whole asset, the index of that file, as a non-negative `integer`.
+* `userdata` - `dictionary` of additional data from the user.
+
+**Returns**
+
+`str`
+
+The full path to the downloaded file, as `string`.
+
+**Defined in**
+
+[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL492C5-L516C20)
+
+ +Source code + +{% code overflow="wrap" %} +```python +@enforce_types + def download_asset( + self, + ddo: DDO, + consumer_wallet, + destination: str, + order_tx_id: Union[str, bytes], + service: Optional[Service] = None, + index: Optional[int] = None, + userdata: Optional[dict] = None, + ) -> str: + service = service or ddo.services[0] # fill in good default + + if index is not None: + assert isinstance(index, int), logger.error("index has to be an integer.") + assert index >= 0, logger.error("index has to be 0 or a positive integer.") + + assert ( + service and service.type == ServiceTypes.ASSET_ACCESS + ), f"Service with type {ServiceTypes.ASSET_ACCESS} is not found." + + path: str = download_asset_files( + ddo, service, consumer_wallet, destination, order_tx_id, index, userdata + ) + return path +``` +{% endcode %} + +
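+
+A minimal usage sketch (assuming `bob` ordered access and holds `order_tx_id` from `pay_for_access_service`):
+
+```python
+file_path = ocean.assets.download_asset(ddo, bob, "./downloads", order_tx_id)
+print(f"Downloaded to {file_path}")
+```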
+
+### Pays for Access Service
+
+* **pay\_for\_access\_service**(`self`, `ddo: DDO`, `wallet`, `service: Optional[Service] = None`, `consume_market_fees: Optional[TokenFeeInfo] = None`, `consumer_address: Optional[str] = None`, `userdata: Optional[dict] = None`)
+
+Pays for an access service by calling the `initialize` endpoint on the Provider and then starting the order.
+
+**Parameters**
+
+* `ddo` - DDO of the asset being paid for.
+* `wallet` - Brownie account for the wallet that pays for the asset.
+
+**Optional parameters**
+
+* `service` - the `Service` object to pay for. Defaults to the asset's first service.
+* `consume_market_fees` - `TokenFeeInfo` object which contains the consume market fee address, amount and token address.
+* `consumer_address` - address of the consumer who pays for the access.
+* `userdata` - `dictionary` of additional data from the user.
+
+**Returns**
+
+`str`
+
+A hex string transaction hash, which is the proof of the started order.
+
+**Defined in**
+
+[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL518C5-L571C28)
+
+ +Source code + +{% code overflow="wrap" %} +```python +@enforce_types + def pay_for_access_service( + self, + ddo: DDO, + wallet, + service: Optional[Service] = None, + consume_market_fees: Optional[TokenFeeInfo] = None, + consumer_address: Optional[str] = None, + userdata: Optional[dict] = None, + ): + # fill in good defaults as needed + service = service or ddo.services[0] + consumer_address = consumer_address or wallet.address + + # main work... + dt = Datatoken(self._config_dict, service.datatoken) + balance = dt.balanceOf(wallet.address) + + if balance < to_wei(1): + raise InsufficientBalance( + f"Your token balance {balance} {dt.symbol()} is not sufficient " + f"to execute the requested service. This service " + f"requires 1 wei." + ) + + consumable_result = is_consumable( + ddo, + service, + {"type": "address", "value": wallet.address}, + userdata=userdata, + ) + if consumable_result != ConsumableCodes.OK: + raise AssetNotConsumable(consumable_result) + + data_provider = DataServiceProvider + + initialize_args = { + "did": ddo.did, + "service": service, + "consumer_address": consumer_address, + } + + initialize_response = data_provider.initialize(**initialize_args) + provider_fees = initialize_response.json()["providerFee"] + + receipt = dt.start_order( + consumer=consumer_address, + service_index=ddo.get_index_of_service(service), + provider_fees=provider_fees, + consume_market_fees=consume_market_fees, + transaction_parameters={"from": wallet}, + ) + + return receipt.txid +``` +{% endcode %} + +
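+
+A minimal usage sketch (assuming `bob` holds at least 1 datatoken of the asset, e.g. after buying it or being sent it):
+
+```python
+order_tx_id = ocean.assets.pay_for_access_service(ddo, bob)
+print(f"Order tx: {order_tx_id}")
+```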
+
+### Pays for Compute Service
+
+* **pay\_for\_compute\_service**(`self`, `datasets: List[ComputeInput]`, `algorithm_data: Union[ComputeInput, AlgorithmMetadata]`, `compute_environment: str`, `valid_until: int`, `consume_market_order_fee_address: str`, `wallet`, `consumer_address: Optional[str] = None`)
+
+Pays for a compute service by calling the `initializeCompute` endpoint on the Provider to retrieve the provider fees, and then starting the order.
+
+**Parameters**
+
+* `datasets` - list of `ComputeInput` objects, each of which must include the DDO and service.
+* `algorithm_data` - either a `ComputeInput` object which contains the whole DDO and service, or just the algorithm metadata as `AlgorithmMetadata`.
+* `compute_environment` - `string` that represents the ID of the chosen C2D environment.
+* `valid_until` - `UNIX timestamp` until which the algorithm can be used/run.
+* `consume_market_order_fee_address` - string address which denotes the consume market fee address for that order; it can be the wallet address itself.
+* `wallet` - the `Brownie account` which pays for the compute service.
+
+**Optional parameters**
+
+* `consumer_address` - the string address of the C2D environment consumer.
+
+**Returns**
+
+`tuple`
+
+A tuple composed of the list of datasets and the algorithm data (if present in the result): `(datasets, algorithm_data)`.
+
+**Defined in**
+
+[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL573C5-L627C30)
+
+ +Source code + +```python + @enforce_types + def pay_for_compute_service( + self, + datasets: List[ComputeInput], + algorithm_data: Union[ComputeInput, AlgorithmMetadata], + compute_environment: str, + valid_until: int, + consume_market_order_fee_address: str, + wallet, + consumer_address: Optional[str] = None, + ): + data_provider = DataServiceProvider + + if not consumer_address: + consumer_address = wallet.address + + initialize_response = data_provider.initialize_compute( + [x.as_dictionary() for x in datasets], + algorithm_data.as_dictionary(), + datasets[0].service.service_endpoint, + consumer_address, + compute_environment, + valid_until, + ) + + result = initialize_response.json() + for i, item in enumerate(result["datasets"]): + self._start_or_reuse_order_based_on_initialize_response( + datasets[i], + item, + TokenFeeInfo( + consume_market_order_fee_address, + datasets[i].consume_market_order_fee_token, + datasets[i].consume_market_order_fee_amount, + ), + wallet, + consumer_address, + ) + + if "algorithm" in result: + self._start_or_reuse_order_based_on_initialize_response( + algorithm_data, + result["algorithm"], + TokenFeeInfo( + address=consume_market_order_fee_address, + token=algorithm_data.consume_market_order_fee_token, + amount=algorithm_data.consume_market_order_fee_amount, + ), + wallet, + consumer_address, + ) + + return datasets, algorithm_data + + return datasets, None +``` + +
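+
+A minimal end-to-end sketch (assuming the `ocean` instance and `bob` wallet from the setup flows; `data_did` and `algo_did` are hypothetical DIDs of a published compute dataset & algorithm):
+
+```python
+from datetime import datetime, timedelta
+from ocean_lib.models.compute_input import ComputeInput
+
+DATA_ddo = ocean.assets.resolve(data_did)   # hypothetical dataset DID
+ALGO_ddo = ocean.assets.resolve(algo_did)   # hypothetical algorithm DID
+compute_service = DATA_ddo.services[0]
+algo_service = ALGO_ddo.services[0]
+
+free_c2d_env = ocean.compute.get_free_c2d_environment(
+    compute_service.service_endpoint, DATA_ddo.chain_id
+)
+
+datasets, algorithm = ocean.assets.pay_for_compute_service(
+    datasets=[ComputeInput(DATA_ddo, compute_service)],
+    algorithm_data=ComputeInput(ALGO_ddo, algo_service),
+    consume_market_order_fee_address=bob.address,
+    wallet=bob,
+    compute_environment=free_c2d_env["id"],
+    valid_until=int((datetime.utcnow() + timedelta(days=1)).timestamp()),
+    consumer_address=free_c2d_env["consumerAddress"],
+)
+```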
diff --git a/developers/ocean.py/ocean-compute-tech-details.md b/developers/ocean.py/ocean-compute-tech-details.md
new file mode 100644
index 00000000..b647e2b7
--- /dev/null
+++ b/developers/ocean.py/ocean-compute-tech-details.md
@@ -0,0 +1,381 @@
+---
+description: Technical details about OceanCompute functions
+---
+
+# Ocean Compute Tech Details
+
+Using this class, we are able to manage a compute job: run it in an Ocean C2D environment and retrieve the results after the execution finishes.
+
+### Start Compute Job
+
+* **start**(`self`, `consumer_wallet`, `dataset: ComputeInput`, `compute_environment: str`, `algorithm: Optional[ComputeInput] = None`, `algorithm_meta: Optional[AlgorithmMetadata] = None`, `algorithm_algocustomdata: Optional[dict] = None`, `additional_datasets: List[ComputeInput] = []`) -> `str`
+
+Starts a compute job.
+
+It can be called within the Ocean Compute class.
+
+**Parameters**
+
+* `consumer_wallet` - the `Brownie account` of the consumer who pays for & starts the compute job.
+* `dataset` - `ComputeInput` object, which must include the DDO and service.
+* `compute_environment` - `string` that represents the ID of the chosen C2D environment.
+* `additional_datasets` - list of `ComputeInput` objects for additional datasets, in case the compute job runs on multiple datasets.
+
+**Optional parameters**
+
+* `algorithm` - `ComputeInput` object, which must include the algorithm's DDO and service.
+* `algorithm_meta` - just the algorithm metadata, as `AlgorithmMetadata`.
+* `algorithm_algocustomdata` - additional user data for the algorithm, as a dictionary.
+
+**Returns**
+
+`str`
+
+Returns a string type job ID.
+
+**Defined in**
+
+[ocean/ocean_compute.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean_compute.py#LL32C4-L70C33)
+
+ +Source code + +```python + @enforce_types + def start( + self, + consumer_wallet, + dataset: ComputeInput, + compute_environment: str, + algorithm: Optional[ComputeInput] = None, + algorithm_meta: Optional[AlgorithmMetadata] = None, + algorithm_algocustomdata: Optional[dict] = None, + additional_datasets: List[ComputeInput] = [], + ) -> str: + metadata_cache_uri = self._config_dict.get("METADATA_CACHE_URI") + ddo = Aquarius.get_instance(metadata_cache_uri).get_ddo(dataset.did) + service = ddo.get_service_by_id(dataset.service_id) + assert ( + ServiceTypes.CLOUD_COMPUTE == service.type + ), "service at serviceId is not of type compute service." + + consumable_result = is_consumable( + ddo, + service, + {"type": "address", "value": consumer_wallet.address}, + with_connectivity_check=True, + ) + if consumable_result != ConsumableCodes.OK: + raise AssetNotConsumable(consumable_result) + + # Start compute job + job_info = self._data_provider.start_compute_job( + dataset_compute_service=service, + consumer=consumer_wallet, + dataset=dataset, + compute_environment=compute_environment, + algorithm=algorithm, + algorithm_meta=algorithm_meta, + algorithm_custom_data=algorithm_algocustomdata, + input_datasets=additional_datasets, + ) + return job_info["jobId"] +``` + +
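+
+A minimal usage sketch, continuing from the `pay_for_compute_service()` sketch above:
+
+```python
+job_id = ocean.compute.start(
+    consumer_wallet=bob,
+    dataset=datasets[0],
+    compute_environment=free_c2d_env["id"],
+    algorithm=algorithm,
+)
+print(f"Started compute job with id: {job_id}")
+```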
+
+### Compute Job Status
+
+* **status**(`self`, `ddo: DDO`, `service: Service`, `job_id: str`, `wallet`) -> `Dict[str, Any]`
+
+Gets the status of a compute job.
+
+It can be called within the Ocean Compute class.
+
+**Parameters**
+
+* `ddo` - DDO offering the compute service of this job
+* `service` - Service object of compute
+* `job_id` - ID of the compute job
+* `wallet` - Brownie account which initiated the compute job
+
+**Returns**
+
+`Dict[str, Any]`
+
+A dictionary which contains the status for an existing compute job, keys are `(ok, status, statusText)`.
+
+**Defined in**
+
+[ocean/ocean_compute.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean_compute.py#LL72C5-L88C24)
+
+ +Source code + +{% code overflow="wrap" %} +```python +@enforce_types + def status(self, ddo: DDO, service: Service, job_id: str, wallet) -> Dict[str, Any]: + """ + Gets job status. + + :param ddo: DDO offering the compute service of this job + :param service: compute service of this job + :param job_id: str id of the compute job + :param wallet: Wallet instance + :return: dict the status for an existing compute job, keys are (ok, status, statusText) + """ + job_info = self._data_provider.compute_job_status( + ddo.did, job_id, service, wallet + ) + job_info.update({"ok": job_info.get("status") not in (31, 32, None)}) + + return job_info +``` +{% endcode %} + +
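+
+A minimal usage sketch, continuing from the compute sketches above:
+
+```python
+status = ocean.compute.status(DATA_ddo, compute_service, job_id, bob)
+print(f"Job status: {status['statusText']}")
+```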
+
+### Compute Job Result
+
+* **result**(`self`, `ddo: DDO`, `service: Service`, `job_id: str`, `index: int`, `wallet`) -> `Dict[str, Any]`
+
+Gets the compute job result.
+
+It can be called within the Ocean Compute class.
+
+**Parameters**
+
+* `ddo` - DDO offering the compute service of this job
+* `service` - Service object of compute
+* `job_id` - ID of the compute job
+* `index` - compute result index
+* `wallet` - Brownie account which initiated the compute job
+
+**Returns**
+
+`Dict[str, Any]`
+
+A dictionary which contains the results/logs URLs for an existing compute job, keys are `(did, urls, logs)`.
+
+**Defined in**
+
+[ocean/ocean_compute.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean_compute.py#LL90C5-L106C22)
+
+ +Source code + +{% code overflow="wrap" %} +```python +@enforce_types + def result( + self, ddo: DDO, service: Service, job_id: str, index: int, wallet + ) -> Dict[str, Any]: + """ + Gets job result. + + :param ddo: DDO offering the compute service of this job + :param service: compute service of this job + :param job_id: str id of the compute job + :param index: compute result index + :param wallet: Wallet instance + :return: dict the results/logs urls for an existing compute job, keys are (did, urls, logs) + """ + result = self._data_provider.compute_job_result(job_id, index, service, wallet) + + return result +``` +{% endcode %} + +
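+
+A minimal usage sketch, fetching the first result of the job above:
+
+```python
+result = ocean.compute.result(DATA_ddo, compute_service, job_id, 0, bob)
+print(result)
+```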
+
+### Compute Job Result Logs
+
+* **compute\_job\_result\_logs**(`self`, `ddo: DDO`, `service: Service`, `job_id: str`, `wallet`, `log_type="output"`) -> `Dict[str, Any]`
+
+Gets the job output, if it exists.
+
+It can be called within the Ocean Compute class.
+
+**Parameters**
+
+* `ddo` - DDO offering the compute service of this job
+* `service` - Service object of compute
+* `job_id` - ID of the compute job
+* `wallet` - Brownie account which initiated the compute job
+* `log_type` - string which selects what kind of logs to display. Defaults to "output".
+
+**Returns**
+
+`Dict[str, Any]`
+
+A dictionary which includes the results/logs URLs for an existing compute job, keys are `(did, urls, logs)`.
+
+**Defined in**
+
+[ocean/ocean_compute.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean_compute.py#LL108C5-L130C22)
+
+ +Source code + +{% code overflow="wrap" %} +```python +@enforce_types + def compute_job_result_logs( + self, + ddo: DDO, + service: Service, + job_id: str, + wallet, + log_type="output", + ) -> Dict[str, Any]: + """ + Gets job output if exists. + + :param ddo: DDO offering the compute service of this job + :param service: compute service of this job + :param job_id: str id of the compute job + :param wallet: Wallet instance + :return: dict the results/logs urls for an existing compute job, keys are (did, urls, logs) + """ + result = self._data_provider.compute_job_result_logs( + ddo, job_id, service, wallet, log_type + ) + + return result +``` +{% endcode %} + +
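+
+A minimal usage sketch, retrieving the job's output once it has finished:
+
+```python
+output = ocean.compute.compute_job_result_logs(
+    DATA_ddo, compute_service, job_id, bob
+)[0]
+```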
+
+### Stop Compute Job
+
+* **stop**(`self`, `ddo: DDO`, `service: Service`, `job_id: str`, `wallet`) -> `Dict[str, Any]`
+
+Attempts to stop the running compute job.
+
+It can be called within the Ocean Compute class.
+
+**Parameters**
+
+* `ddo` - DDO offering the compute service of this job
+* `service` - Service object of compute
+* `job_id` - ID of the compute job
+* `wallet` - Brownie account which initiated the compute job
+
+**Returns**
+
+`Dict[str, Any]`
+
+A dictionary which contains the status for the stopped compute job, keys are `(ok, status, statusText)`.
+
+**Defined in**
+
+[ocean/ocean_compute.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean_compute.py#LL132C5-L146C24)
+
+ +Source code + +{% code overflow="wrap" %} +```python +@enforce_types + def stop(self, ddo: DDO, service: Service, job_id: str, wallet) -> Dict[str, Any]: + """ + Attempt to stop the running compute job. + + :param ddo: DDO offering the compute service of this job + :param job_id: str id of the compute job + :param wallet: Wallet instance + :return: dict the status for the stopped compute job, keys are (ok, status, statusText) + """ + job_info = self._data_provider.stop_compute_job( + ddo.did, job_id, service, wallet + ) + job_info.update({"ok": job_info.get("status") not in (31, 32, None)}) + return job_info +``` +{% endcode %} + +
+
+### Get Priced C2D Environments
+
+* **get\_c2d\_environments**(`self`, `service_endpoint: str`, `chain_id: int`)
+
+Gets the list of compute environments.
+
+It can be called within the Ocean Compute class.
+
+**Parameters**
+
+* `service_endpoint` - string Provider URL that is stored in the compute service.
+* `chain_id` - since Provider supports multiple chains, `chain_id` (an `int`) is required to specify the network of your environment.
+
+**Returns**
+
+`list`
+
+A list of objects containing information about each compute environment. For each compute environment, these are the keys: `(id, feeToken, priceMin, consumerAddress, lastSeen, namespace, status)`.
+
+**Defined in**
+
+[ocean/ocean_compute.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean_compute.py#LL148C4-L150C84)
+
+ +Source code + +{% code overflow="wrap" %} +```python + @enforce_types + def get_c2d_environments(self, service_endpoint: str, chain_id: int): + return DataServiceProvider.get_c2d_environments(service_endpoint, chain_id) +``` +{% endcode %} + +
+
+### Get Free C2D Environment
+
+* **get\_free\_c2d\_environment**(`self`, `service_endpoint: str`, `chain_id`)
+
+Gets a free compute environment.
+
+Note that not all Providers offer free environments (`priceMin = 0`).
+
+It can be called within the Ocean Compute class.
+
+**Parameters**
+
+* `service_endpoint` - string Provider URL that is stored in the compute service.
+* `chain_id` - since Provider supports multiple chains, `chain_id` (an `int`) is required to specify the network of your environment.
+
+**Returns**
+
+`dict`
+
+The first free compute environment found, as an object with the keys `(id, feeToken, priceMin, consumerAddress, lastSeen, namespace, status)`.
+
+**Defined in**
+
+[ocean/ocean_compute.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean_compute.py#LL152C5-L155C87)
+
+ +Source code + +{% code overflow="wrap" %} +```python +@enforce_types + def get_free_c2d_environment(self, service_endpoint: str, chain_id): + environments = self.get_c2d_environments(service_endpoint, chain_id) + return next(env for env in environments if float(env["priceMin"]) == float(0)) +``` +{% endcode %} + +
diff --git a/developers/ocean.py/publish-flow.md b/developers/ocean.py/publish-flow.md
new file mode 100644
index 00000000..cd6509af
--- /dev/null
+++ b/developers/ocean.py/publish-flow.md
@@ -0,0 +1,133 @@
+---
+description: >-
+  This page shows how you can publish a data NFT, a datatoken & a data asset all
+  at once in different scenarios.
+---
+
+# Publish Flow
+
+On this page, we provide some tips & tricks for publishing an asset on Ocean Market using ocean.py.
+
+We assume you've already (a) [installed Ocean](install.md), and (b) done [local setup](local-setup.md) or [remote setup](remote-setup.md). This flow works for either one, without any changes between them.
+
+In the Python console:
+
+```python
+#data info
+name = "Branin dataset"
+url = "https://raw.githubusercontent.com/trentmc/branin/main/branin.arff"
+
+#create data asset
+(data_nft, datatoken, ddo) = ocean.assets.create_url_asset(name, url, {"from": alice})
+
+#print
+print("Just published asset:")
+print(f" data_nft: symbol={data_nft.symbol()}, address={data_nft.address}")
+print(f" datatoken: symbol={datatoken.symbol()}, address={datatoken.address}")
+print(f" did={ddo.did}")
+```
+
+You've now published an Ocean asset!
+
+* [`data_nft`](../contracts/data-nfts.md) is the base (base IP)
+* [`datatoken`](../contracts/datatokens.md) for access by others (licensing)
+* [`ddo`](../ddo-specification.md) holding metadata
+
+
+### Appendix
+
+For more information regarding the data NFT & datatoken interfaces and how they are implemented in Solidity, we suggest reading this [article](../contracts/datanft-and-datatoken.md) and the [contracts repo](https://github.com/oceanprotocol/contracts) on GitHub.
+
+To explore the DDO specs, structure & meaning further, consult the [DDO Specification](../ddo-specification.md) section.
+
+#### Publishing Alternatives
+
+Here's an example similar to the `create()` step above, but it exposes more parameters to interact with and requires deeper knowledge of ocean.py usage. The example below creates an asset and a datatoken as well, with the files specified via the `DatatokenArguments` class. You have the freedom to customize the data NFT, the datatoken and also DDO fields, such as:
+
+* services
+* metadata
+* credentials
+
+In the same Python console:
+
+```python
+# Specify metadata and services, using the Branin test dataset
+date_created = "2021-12-28T10:55:11Z"
+metadata = {
+    "created": date_created,
+    "updated": date_created,
+    "description": "Branin dataset",
+    "name": "Branin dataset",
+    "type": "dataset",
+    "author": "Trent",
+    "license": "CC0: PublicDomain",
+}
+
+# Use "UrlFile" asset type. (There are other options)
+from ocean_lib.structures.file_objects import UrlFile
+url_file = UrlFile(
+    url="https://raw.githubusercontent.com/trentmc/branin/main/branin.arff"
+)
+
+# Publish data asset
+from ocean_lib.models.datatoken_base import DatatokenArguments
+_, _, ddo = ocean.assets.create(
+    metadata,
+    {"from": alice},
+    datatoken_args=[DatatokenArguments(files=[url_file])],
+)
+```
+
+#### DDO Encryption or Compression
+
+The DDO is stored on-chain. It's encrypted and compressed by default. Therefore it supports GDPR "right-to-be-forgotten" compliance rules by default.
+
+You can control this during `create()`:
+
+* To disable encryption, use [`ocean.assets.create(..., encrypt_flag=False)`](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean_assets.py#L425).
+* To disable compression, use [`ocean.assets.create(..., compress_flag=False)`](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean_assets.py#L426).
+* To disable both, use [`ocean.assets.create(..., encrypt_flag=False, compress_flag=False)`](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean_assets.py#LL425C8-L426C46).
+
+#### Create a data NFT
+
+Calling `create()` like above generates a data NFT, a datatoken for that NFT, and a ddo. This is the most common case. However, sometimes you may want _just_ the data NFT, e.g. if using a data NFT as a simple key-value store. Here's how:
+
+```python
+data_nft = ocean.data_nft_factory.create({"from": alice}, 'NFT1', 'NFT1')
+```
+
+If you call `create()` after this, you can pass in an argument `data_nft_address:string` and it will use that NFT rather than creating a new one.
+
+#### Create a datatoken from a data NFT
+
+Calling `create()` like above generates a data NFT, a datatoken for that NFT, and a ddo object. However, we may want a second datatoken. Or, we may have started with _just_ the data NFT, and want to add a datatoken to it. Here's how:
+
+```python
+datatoken = data_nft.create_datatoken({"from": alice}, "Datatoken 1", "DT1")
+```
+
+If you call `create()` after this, you can pass in an argument `deployed_datatokens:List[Datatoken1]` and it will use those datatokens during creation.
+
+#### Create an asset & pricing schema simultaneously
+
+Ocean Assets allows you to bundle several common scenarios as a single transaction, thus lowering gas fees.
+
+Any of the `ocean.assets.create_*_asset()` functions can also take an optional parameter that describes a bundled [pricing schema](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#LL199C5-L219C10) (Dispenser or Fixed Rate Exchange).
+
+Here is an example involving an exchange:
+
+{% code overflow="wrap" %}
+```python
+from ocean_lib.models.fixed_rate_exchange import ExchangeArguments
+(data_nft, datatoken, ddo) = ocean.assets.create_url_asset(
+    name,
+    url,
+    {"from": alice},
+    pricing_schema_args=ExchangeArguments(rate=to_wei(3), base_token_addr=ocean.OCEAN_address, dt_decimals=18)
+)
+
+assert len(datatoken.get_exchanges()) == 1
+```
+{% endcode %}
+
diff --git a/developers/ocean.py/remote-setup.md b/developers/ocean.py/remote-setup.md
new file mode 100644
index 00000000..fd740f4a
--- /dev/null
+++ b/developers/ocean.py/remote-setup.md
@@ -0,0 +1,238 @@
+---
+description: Remote setup for running & testing ocean.py
+---
+
+# Remote Setup
+
+This setup does not use barge; instead, it uses a remote chain to do the transactions. When the network URL is specified & configured, ocean.py will use components (such as Provider, Aquarius, C2D) according to the expected blockchain.
+
+Here, we do setup for Mumbai, the testnet for Polygon. It's similar for other remote chains.
+
+Here, we will:
+
+1. Configure Brownie networks
+2. Create two accounts - `REMOTE_TEST_PRIVATE_KEY1` and `2`
+3. Get test MATIC on Mumbai
+4. Get test OCEAN on Mumbai
+5. Set envvars
+6. Set up Alice and Bob wallets in Python
+
+Let's go!
+
+### 1. Configure Brownie Networks (One-Time)
+
+#### 1.1 Network config file
+
+Brownie's network config file is `network-config.yaml`. It is located in the `.brownie/` subfolder of your home folder.
+
+* For Linux & MacOS, it's: `~/.brownie/network-config.yaml`
+* For Windows users, it's: `C:\Users\<user_name>\.brownie\network-config.yaml`
+
+#### 1.2 Generate network config file (if needed)
+
+If you already see the config file, skip this section.
+
+If you don't, you can auto-generate it by calling any Brownie function from a Python console. Here's an example.
+
+First, in a new or existing console, run Python:
+
+```bash
+python
+```
+
+In the Python console:
+
+```python
+from ocean_lib.example_config import get_config_dict
+```
+
+It will generate the file in the target location. You can check the target location to confirm.
+
+#### 1.3 Contents of network config file
+
+The network configuration file has settings for each network, e.g. development (ganache), Ethereum mainnet, Polygon, and Mumbai.
+
+Each network gets specifications for:
+
+* `host` - the RPC URL, i.e. what URL do we pass through to talk to the chain
+* `required_confs` - the number of confirmations before a tx is done
+* `id` - e.g. `polygon-main` (Polygon), `polygon-test` (Mumbai)
+
+`development` chains run locally; `live` chains run remotely.
+
+The example `network-config.yaml` in Brownie's GitHub repo is [here](https://github.com/eth-brownie/brownie/blob/master/brownie/data/network-config.yaml). It can serve as a comparison to your local copy.
+
+ocean.py uses the exact network `id` from the default Brownie configuration file, so make sure your target network name matches the corresponding Brownie `id`.
+
+#### 1.4 Networks Supported
+
+All [Ocean-deployed](https://docs.oceanprotocol.com/core-concepts/networks) chains (Eth mainnet, Polygon, etc) should be in Brownie's default `network-config.yaml`, except Energy Web Chain (EWC).
+
+For Windows users: it's possible that your `network-config.yaml` doesn't have all the network entries. In this case, just replace your local file's content with the `network-config.yaml` in Brownie's GitHub repo, [here](https://github.com/eth-brownie/brownie/blob/master/brownie/data/network-config.yaml).
+
+For all users: to use EWC, add the following to network-config.yaml:
+
+```yaml
+- name: energyweb
+  networks:
+  - chainid: 246
+    host: https://rpc.energyweb.org
+    id: energyweb
+    name: energyweb
+```
+
+#### 1.5 RPCs and Infura
+
+To obtain API keys for blockchain access, follow [this document](http://127.0.0.1:5000/o/mTcjMqA4ylf55anucjH8/s/zQlpIJEeu8x5yl0OLuXn/) for tips & tricks.
+
+The config file's default RPCs point to Infura, which requires you to have an Infura account with its corresponding `WEB3_INFURA_PROJECT_ID` token.
+
+**If you do have an Infura account**
+
+* Linux & MacOS users: in console: `export WEB3_INFURA_PROJECT_ID=<your Infura project id>`
+* Windows: in console: `set WEB3_INFURA_PROJECT_ID=<your Infura project id>`
+
+**If you do **_**not**_** have an Infura account**
+
+One option is to get an Infura account. A simpler option is to _bypass the need_ for an Infura account: just change to RPCs that don't need Infura.
+
+You can bypass manually: just edit your brownie network config file.
+
+Or you can bypass via the command line. The following command replaces Infura RPCs with public ones in `network-config.yaml`:
+
+* Linux users: in the console:
+
+{% code overflow="wrap" %}
+```bash
+sed -i 's#https://polygon-mainnet.infura.io/v3/$WEB3_INFURA_PROJECT_ID#https://polygon-rpc.com/#g; s#https://polygon-mumbai.infura.io/v3/$WEB3_INFURA_PROJECT_ID#https://rpc-mumbai.maticvigil.com#g' ~/.brownie/network-config.yaml
+```
+{% endcode %}
+
+* MacOS users: you can achieve the same thing with `gnu-sed` and the `gsed` command. (Or just manually edit the file.)
+* For Windows: you might need something similar to [powershell](https://www.marek.tokyo/2020/01/remove-string-from-file-in-windows-10.html). (Or just manually edit the file.)
+
+#### 1.6 Network config file wrapup
+
+Congrats, you've now configured your Brownie network file! You rarely need to worry about it from now on.
+
+### 2. Create EVM Accounts (One-Time)
+
+An EVM account is uniquely defined by its private key. Its address is a function of that key. Let's generate two accounts!
+
+In a new or existing console, run Python.
+
+```bash
+python
+```
+
+In the Python console:
+
+```python
+from eth_account.account import Account
+account1 = Account.create()
+account2 = Account.create()
+
+print(f"""
+REMOTE_TEST_PRIVATE_KEY1={account1.key.hex()}, ADDRESS1={account1.address}
+REMOTE_TEST_PRIVATE_KEY2={account2.key.hex()}, ADDRESS2={account2.address}
+""")
+```
+
+Then, hit Ctrl-C to exit the Python console.
+
+Now, you have two EVM accounts (address & private key). Save them somewhere safe, like a local file or a password manager.
+
+These accounts will work on any EVM-based chain: production chains like Eth mainnet and Polygon, and testnets like Goerli and Mumbai. Here, we'll use them for Mumbai.
+
+### 3. Get (test) MATIC on Mumbai
+
+We need the network's native token to pay for transactions on the network. [ETH](https://ethereum.org/en/get-eth/) is the native token for Ethereum mainnet; [MATIC](https://polygon.technology/matic-token/) is the native token for Polygon, and [(test) MATIC](https://faucet.polygon.technology/) is the native token for Mumbai.
+
+To get free (test) MATIC on Mumbai:
+
+1. Go to the faucet [https://faucet.polygon.technology/](https://faucet.polygon.technology/). Ensure you've selected "Mumbai" network and "MATIC" token.
+2. Request funds for ADDRESS1
+3. Request funds for ADDRESS2
+
+You can confirm receiving funds by going to the following URL, and seeing your reported MATIC balance: `https://mumbai.polygonscan.com/address/<ADDRESS1 or ADDRESS2>`
+
+### 4. Get (test) OCEAN on Mumbai
+
+[OCEAN](https://oceanprotocol.com/token) can be used as a data payment token, and locked into veOCEAN for Data Farming / curation. The READMEs show how to use OCEAN in both cases.
+
+* OCEAN is an ERC20 token with a finite supply, rooted in Ethereum mainnet at address [`0x967da4048cD07aB37855c090aAF366e4ce1b9F48`](https://etherscan.io/token/0x967da4048cD07aB37855c090aAF366e4ce1b9F48).
+* OCEAN on other production chains derives from the Ethereum mainnet OCEAN. OCEAN on Polygon (mOCEAN) is at [`0x282d8efce846a88b159800bd4130ad77443fa1a1`](https://polygonscan.com/token/0x282d8efce846a88b159800bd4130ad77443fa1a1).
+* (Test) OCEAN is on each testnet. Test OCEAN on Mumbai is at [`0xd8992Ed72C445c35Cb4A2be468568Ed1079357c8`](https://mumbai.polygonscan.com/token/0xd8992Ed72C445c35Cb4A2be468568Ed1079357c8).
+
+To get free (test) OCEAN on Mumbai:
+
+1. Go to the faucet [https://faucet.mumbai.oceanprotocol.com/](https://faucet.mumbai.oceanprotocol.com/)
+2. Request funds for ADDRESS1
+3. Request funds for ADDRESS2
+
+You can confirm receiving funds by going to the following URL, and seeing your reported OCEAN balance: `https://mumbai.polygonscan.com/token/0xd8992Ed72C445c35Cb4A2be468568Ed1079357c8?a=<ADDRESS1 or ADDRESS2>`
+
+### 5. Set envvars
+
+As usual, Linux/MacOS needs "`export`" and Windows needs "`set`". In the console:
+
+**Linux & MacOS users:**
+
+```bash
+# For accounts: set private keys
+export REMOTE_TEST_PRIVATE_KEY1=<your REMOTE_TEST_PRIVATE_KEY1>
+export REMOTE_TEST_PRIVATE_KEY2=<your REMOTE_TEST_PRIVATE_KEY2>
+```
+
+**Windows users:**
+
+```powershell
+# For accounts: set private keys
+set REMOTE_TEST_PRIVATE_KEY1=<your REMOTE_TEST_PRIVATE_KEY1>
+set REMOTE_TEST_PRIVATE_KEY2=<your REMOTE_TEST_PRIVATE_KEY2>
+```
+
+### 6. Setup in Python
+
+In your working console, run Python:
+
+```bash
+python
+```
+
+In the Python console:
+
+```python
+# Create Ocean instance
+from ocean_lib.web3_internal.utils import connect_to_network
+connect_to_network("polygon-test") # mumbai is "polygon-test"
+
+import os
+from ocean_lib.example_config import get_config_dict
+from ocean_lib.ocean.ocean import Ocean
+config = get_config_dict("polygon-test")
+ocean = Ocean(config)
+
+# Create OCEAN object. ocean_lib knows where OCEAN is on all remote networks
+OCEAN = ocean.OCEAN_token
+
+# Create Alice's wallet
+from brownie.network import accounts
+accounts.clear()
+
+alice_private_key = os.getenv('REMOTE_TEST_PRIVATE_KEY1')
+alice = accounts.add(alice_private_key)
+assert alice.balance() > 0, "Alice needs MATIC"
+assert OCEAN.balanceOf(alice) > 0, "Alice needs OCEAN"
+
+# Create Bob's wallet. While some flows just use Alice's wallet, it's simpler to do all here.
+
+bob_private_key = os.getenv('REMOTE_TEST_PRIVATE_KEY2')
+bob = accounts.add(bob_private_key)
+assert bob.balance() > 0, "Bob needs MATIC"
+assert OCEAN.balanceOf(bob) > 0, "Bob needs OCEAN"
+
+# Compact wei <> eth conversion
+from ocean_lib.ocean.util import to_wei, from_wei
+```
+
+If you get a gas-related error like `transaction underpriced`, you'll need to change the `priority_fee` or `max_fee`. See details in the [Brownie docs](https://eth-brownie.readthedocs.io/en/stable/core-gas.html), or check the dedicated [README](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/gas-strategy-remote.md) which explains how to customize your gas strategy.
diff --git a/developers/ocean.py/technical-details.md b/developers/ocean.py/technical-details.md
new file mode 100644
index 00000000..0920e569
--- /dev/null
+++ b/developers/ocean.py/technical-details.md
@@ -0,0 +1,531 @@
+---
+description: Technical details about most used ocean.py functions
+---
+
+# Ocean Instance Tech Details
+
+At the beginning of most flows, we create an `ocean` object, which is an instance of class [`Ocean`](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean.py). It exposes useful information, including the following:
+
+* properties for config & OCEAN token
+* contract objects retrieval
+* users' orders
+* provider fees
+
+### Constructor
+
+* **\_\_init\_\_**(`self`, `config_dict: Dict`, `data_provider: Optional[Type] = None`)
+
+The Ocean class is the entry point into Ocean Protocol.
+
+To initialize an Ocean object, you must provide `config_dict`, which is a `dict`, and optionally a `DataServiceProvider` instance.
+
+**Parameters**
+
+* `config_dict`: mandatory `dict` which contains the configuration in dictionary format.
+* `data_provider`: `Optional[DataProvider]` with a default value of `None`. If it is not provided, the constructor will instantiate a new one from scratch.
+
+**Returns**
+
+`None`
+
+**Defined in**
+
+[ocean/ocean.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean.py#L43)
+
+ +Source code + +{% code overflow="wrap" %} +```python +class Ocean: + """The Ocean class is the entry point into Ocean Protocol.""" + + @enforce_types + def __init__(self, config_dict: Dict, data_provider: Optional[Type] = None) -> None: + """Initialize Ocean class. + + Usage: Make a new Ocean instance + + `ocean = Ocean({...})` + + This class provides the main top-level functions in ocean protocol: + 1. Publish assets metadata and associated services + - Each asset is assigned a unique DID and a DID Document (DDO) + - The DDO contains the asset's services including the metadata + - The DID is registered on-chain with a URL of the metadata store + to retrieve the DDO from + + `ddo = ocean.assets.create(metadata, publisher_wallet)` + + 2. Discover/Search ddos via the current configured metadata store (Aquarius) + + - Usage: + `ddos_list = ocean.assets.search('search text')` + + An instance of Ocean is parameterized by a `Config` instance. + + :param config_dict: variable definitions + :param data_provider: `DataServiceProvider` instance + """ + config_errors = {} + for key, value in config_defaults.items(): + if key not in config_dict: + config_errors[key] = "required" + continue + + if not isinstance(config_dict[key], type(value)): + config_errors[key] = f"must be {type(value).__name__}" + + if config_errors: + raise Exception(json.dumps(config_errors)) + + self.config_dict = config_dict + + network_name = config_dict["NETWORK_NAME"] + check_network(network_name) + + if not data_provider: + data_provider = DataServiceProvider + + self.assets = OceanAssets(self.config_dict, data_provider) + self.compute = OceanCompute(self.config_dict, data_provider) + + logger.debug("Ocean instance initialized: ") +``` +{% endcode %} + +
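+
+A minimal construction sketch (mirroring the Remote Setup page; `"polygon-test"` is Mumbai):
+
+```python
+from ocean_lib.example_config import get_config_dict
+from ocean_lib.ocean.ocean import Ocean
+
+config = get_config_dict("polygon-test")
+ocean = Ocean(config)
+```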
+
+### Config Getter
+
+* **config**(`self`) -> `dict`
+
+It is a helper property for retrieving the user's ocean.py configuration.\
+It can be called only on the Ocean object and returns a Python dictionary.
+
+**Returns**
+
+`dict`
+
+Configuration fields as a dictionary.
+
+**Defined in**
+
+[ocean/ocean.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean.py#LL265C1-L268C32)
+
+ +Source code + +```python +@property + @enforce_types + def config(self) -> dict: # alias for config_dict + return self.config_dict +``` + +
+
+### OCEAN Token Address
+
+* **OCEAN\_address**(`self`) -> `str`
+
+It is a helper property for retrieving the OCEAN token address.\
+It can be called only on the Ocean object and returns the address as a `string`.
+
+**Returns**
+
+`str`
+
+The OCEAN token address for that network.
+
+**Defined in**
+
+[ocean/ocean.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean.py#LL100C1-L103C52)
+
+
+Source code
+
+```python
+    @property
+    @enforce_types
+    def OCEAN_address(self) -> str:
+        return get_ocean_token_address(self.config)
+```
+
+[`get_ocean_token_address`](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/util.py#LL31C1-L38C89) is a utility function which gets the address from the `address.json` file:
+
+{% code overflow="wrap" %}
+```python
+@enforce_types
+def get_ocean_token_address(config_dict: dict) -> str:
+    """Returns the Ocean token address for given network or web3 instance
+    Requires either network name or web3 instance.
+    """
+    addresses = get_contracts_addresses(config_dict)
+
+    return Web3.toChecksumAddress(addresses.get("Ocean").lower()) if addresses else None
+```
+{% endcode %}
+
+
+### OCEAN Token Object
+
+* **OCEAN\_token**(`self`) -> `DatatokenBase`
+* **OCEAN**(`self`) -> `DatatokenBase`, an alias for the above
+
+It is a helper property for retrieving the OCEAN token object (`Datatoken` class).\
+It can be called within the Ocean class and returns the OCEAN Datatoken.
+
+**Returns**
+
+`DatatokenBase`
+
+The OCEAN token as a `DatatokenBase` object.
+
+**Defined in**
+
+[ocean/ocean.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean.py#LL105C1-L113C32)
+
+ +Source code + +```python + @property + @enforce_types + def OCEAN_token(self) -> DatatokenBase: + return DatatokenBase.get_typed(self.config, self.OCEAN_address) + + @property + @enforce_types + def OCEAN(self): # alias for OCEAN_token + return self.OCEAN_token +``` + +
+
+### Data NFT Factory
+
+* **data\_nft\_factory**(`self`) -> `DataNFTFactoryContract`
+
+It is a property for getting the `Data NFT Factory` object of the singleton smart contract.\
+It can be called within the Ocean class and returns the `DataNFTFactoryContract` instance.
+
+**Returns**
+
+`DataNFTFactoryContract`
+
+The Data NFT Factory contract object, which provides access to all the contract's functionality from Python.
+
+**Defined in**
+
+[ocean/ocean.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean.py#LL117C1-L120C80)
+
+ +Source code + +{% code overflow="wrap" %} +```python +@property + @enforce_types + def data_nft_factory(self) -> DataNFTFactoryContract: + return DataNFTFactoryContract(self.config, self._addr("ERC721Factory")) +``` +{% endcode %} + +
+
+### Dispenser
+
+* **dispenser**(`self`) -> `Dispenser`
+
+The `Dispenser` is a faucet for free data.\
+It is a property for getting the `Dispenser` object of the singleton smart contract.\
+It can be called within the Ocean class and returns the `Dispenser` instance.
+
+**Returns**
+
+`Dispenser`
+
+The Dispenser contract object, which provides access to all the contract's functionality from Python.
+
+**Defined in**
+
+[ocean/ocean.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean.py#LL122C1-L125C63)
+
+ +Source code + +```python + @property + @enforce_types + def dispenser(self) -> Dispenser: + return Dispenser(self.config, self._addr("Dispenser")) +``` + +
+
+### Fixed Rate Exchange
+
+* **fixed\_rate\_exchange**(`self`) -> `FixedRateExchange`
+
+A fixed rate exchange is used for priced data.\
+It is a property for getting the `FixedRateExchange` object of the singleton smart contract.\
+It can be called within the Ocean class and returns the `FixedRateExchange` instance.
+
+**Returns**
+
+`FixedRateExchange`
+
+The Fixed Rate Exchange contract object, which provides access to all the contract's functionality from Python.
+
+**Defined in**
+
+[ocean/ocean.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean.py#LL127C1-L130C72)
+
+ +Source code + +```python + @property + @enforce_types + def fixed_rate_exchange(self) -> FixedRateExchange: + return FixedRateExchange(self.config, self._addr("FixedPrice")) +``` + +
+
+### NFT Token Getter
+
+* **get\_nft\_token**(`self`, `token_address: str`) -> `DataNFT`
+
+It is a getter for a specific data NFT object, based on its checksummed address.\
+It can be called within the Ocean class and returns the `DataNFT` instance for the string `token_address` given as parameter.
+
+**Parameters**
+
+* `token_address` - string checksummed address of the NFT token that you are searching for.
+
+**Returns**
+
+`DataNFT`
+
+The data NFT object, which provides access to all the ERC721 template functionality from Python.
+
+**Defined in**
+
+[ocean/ocean.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean.py#LL139C5-L145C51)
+
+ +Source code + +```python + @enforce_types + def get_nft_token(self, token_address: str) -> DataNFT: + """ + :param token_address: Token contract address, str + :return: `DataNFT` instance + """ + return DataNFT(self.config, token_address) +``` + +
+
+### Datatoken Getter
+
+* **get\_datatoken**(`self`, `token_address: str`) -> `DatatokenBase`
+
+It is a getter for a specific `datatoken` object, based on its checksummed address.\
+It can be called within the Ocean class with a string `token_address` as parameter, and returns the `DatatokenBase` instance matching the datatoken's template index.
+
+**Parameters**
+
+* `token_address` - string checksummed address of the datatoken that you are searching for.
+
+**Returns**
+
+`DatatokenBase`
+
+The datatoken object, which provides access to all the ERC20 template functionality from Python.
+
+**Defined in**
+
+[ocean/ocean.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean.py#LL147C5-L153C67)
+
+ +Source code + +```python +@enforce_types + def get_datatoken(self, token_address: str) -> DatatokenBase: + """ + :param token_address: Token contract address, str + :return: `Datatoken1` or `Datatoken2` instance + """ + return DatatokenBase.get_typed(self.config, token_address) + +``` + +
+
+### User Orders Getter
+
+* **get\_user\_orders**(`self`, `address: str`, `datatoken: str`) -> `List[AttributeDict]`
+
+Returns the list of orders that were made by a certain user on a specific datatoken.
+
+It can be called within the Ocean class.
+
+**Parameters**
+
+* `address` - the wallet address of that user
+* `datatoken` - the datatoken address
+
+**Returns**
+
+`List[AttributeDict]`
+
+A list of all the orders on that `datatoken` made by the specified user.
+
+**Defined in**
+
+[ocean/ocean.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean.py#LL157C5-L173C23)
+
+ +Source code + +{% code overflow="wrap" %} +```python + @enforce_types + def get_user_orders(self, address: str, datatoken: str) -> List[AttributeDict]: + """ + :return: List of orders `[Order]` + """ + dt = DatatokenBase.get_typed(self.config_dict, datatoken) + _orders = [] + for log in dt.get_start_order_logs(address): + a = dict(log.args.items()) + a["amount"] = int(log.args.amount) + a["address"] = log.address + a["transactionHash"] = log.transactionHash + a = AttributeDict(a.items()) + + _orders.append(a) + + return _orders +``` +{% endcode %} + + + +
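+
+A minimal usage sketch, counting Alice's orders on a given datatoken:
+
+```python
+orders = ocean.get_user_orders(alice.address, datatoken.address)
+print(f"Alice made {len(orders)} orders on this datatoken")
+```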
+ +### Provider Fees + +* **retrieve\_provider\_fees**( `self`, `ddo: DDO`, `access_service: Service`, `publisher_wallet` ) -> `dict` + +Calls the Provider to compute the provider fees for an access service and returns them as a dictionary. + +**Parameters** + +* `ddo` - the DDO object of the data asset +* `access_service` - Service instance for the service that needs the provider fees +* `publisher_wallet` - Wallet instance of the user that wants to retrieve the provider fees + +**Returns** + +`dict` + +A dictionary which contains the following keys (`providerFeeAddress`, `providerFeeToken`, `providerFeeAmount`, `providerData`, `v`, `r`, `s`, `validUntil`). + +**Defined in** + +[ocean/ocean.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean.py#LL177C4-L189C1) + +
+ +Source code + +{% code overflow="wrap" %} +```python + @enforce_types + def retrieve_provider_fees( + self, ddo: DDO, access_service: Service, publisher_wallet + ) -> dict: + + initialize_response = DataServiceProvider.initialize( + ddo.did, access_service, consumer_address=publisher_wallet.address + ) + initialize_data = initialize_response.json() + provider_fees = initialize_data["providerFee"] + + return provider_fees +``` +{% endcode %} + +
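A hedged usage sketch; `ocean.assets.resolve` and the `wallet` account are assumptions based on the wider ocean.py API, and the DID is a placeholder:

```python
# Hedged sketch: fetch provider fees for the first service of a published asset.
ddo = ocean.assets.resolve("did:op:<asset-did>")  # placeholder DID
access_service = ddo.services[0]

fees = ocean.retrieve_provider_fees(ddo, access_service, publisher_wallet=wallet)
print(fees["providerFeeAmount"], fees["validUntil"])
```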
+ +### Compute Provider Fees + +* **retrieve\_provider\_fees\_for\_compute**(`self`, `datasets: List[ComputeInput]`, `algorithm_data: Union[ComputeInput, AlgorithmMetadata]`, `consumer_address: str`, `compute_environment: str`, `valid_until: int`) -> `dict` + +Calls the Provider to generate the provider fees for a compute service and returns them as a dictionary. + +**Parameters** + +* `datasets` - list of `ComputeInput` objects which contain the data assets +* `algorithm_data` - the data needed for the algorithm; it can be either a `ComputeInput` object or just the algorithm metadata, `AlgorithmMetadata` +* `consumer_address` - address of the compute consumer wallet which is requesting the provider fees +* `compute_environment` - id of the compute environment, provided as a `string` +* `valid_until` - UNIX timestamp (in milliseconds) until which the provider fees for the compute service remain valid. + +**Returns** + +`dict` + +A dictionary which contains the following keys (`providerFeeAddress`, `providerFeeToken`, `providerFeeAmount`, `providerData`, `v`, `r`, `s`, `validUntil`). + +**Defined in** + +[ocean/ocean.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean.py#LL190C4-L210C1) + +
+ +Source code + +{% code overflow="wrap" %} +```python + @enforce_types + def retrieve_provider_fees_for_compute( + self, + datasets: List[ComputeInput], + algorithm_data: Union[ComputeInput, AlgorithmMetadata], + consumer_address: str, + compute_environment: str, + valid_until: int, + ) -> dict: + + initialize_compute_response = DataServiceProvider.initialize_compute( + [x.as_dictionary() for x in datasets], + algorithm_data.as_dictionary(), + datasets[0].service.service_endpoint, + consumer_address, + compute_environment, + valid_until, + ) + + return initialize_compute_response.json() +``` +{% endcode %} + +
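A hedged usage sketch. `ComputeInput` and its constructor arguments are taken from the ocean.py models; the DDOs, services, order transaction ids, environment id, and `wallet` are all placeholders:

```python
# Hedged sketch: fetch provider fees for a compute job on one dataset + one algorithm.
from datetime import datetime, timedelta, timezone
from ocean_lib.models.compute_input import ComputeInput

dataset = ComputeInput(data_ddo, compute_service, transfer_tx_id="0x<order-tx>")
algo = ComputeInput(algo_ddo, algo_service, transfer_tx_id="0x<order-tx>")
valid_until = int((datetime.now(timezone.utc) + timedelta(days=1)).timestamp())

fees = ocean.retrieve_provider_fees_for_compute(
    datasets=[dataset],
    algorithm_data=algo,
    consumer_address=wallet.address,
    compute_environment="<environment-id>",  # an id from the /computeEnvironments endpoint
    valid_until=valid_until,
)
```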
diff --git a/developers/provider/README.md b/developers/provider/README.md new file mode 100644 index 00000000..7b228237 --- /dev/null +++ b/developers/provider/README.md @@ -0,0 +1,49 @@ +--- +description: An integral part of the Ocean Protocol stack +--- + +# Provider + +### What is Provider? + +It is a REST API designed specifically for the provision of data services. It essentially acts as a proxy that encrypts and decrypts the metadata and access information for the data asset. + +Constructed using the Python Flask HTTP server, the Provider service is the only component in the Ocean Protocol stack with the ability to access your data, making it an important layer of security for your information. + +The Provider service has several key functions. Firstly, it performs on-chain checks to ensure the buyer has permission to access the asset. Secondly, it encrypts the URL and metadata during the publication phase, providing security for your data during the initial upload. + +The Provider decrypts the URL when a dataset is downloaded and streams the data directly to the buyer; it never reveals the asset URL to the buyer. This provides a layer of security and ensures that access is only provided when necessary. + +Additionally, the Provider service offers compute services by establishing a connection to the C2D environment. This enables users to compute and manipulate data within the Ocean Protocol stack, adding a new level of utility and function to this data services platform. + +### What does the Provider do? + +* The only component that can access your data +* Performs checks on-chain for buyer permissions and payments +* Encrypts the URL and metadata during publish +* Decrypts the URL when the dataset is downloaded or a compute job is started +* Provides access to data assets by streaming data (and never the URL) +* Provides compute services (connects to C2D environment) +* Typically run by the Data owner + +

Ocean Provider - publish & consume

+ +In the publishing process, the provider plays a crucial role by encrypting the DDO using its private key. Then, the encrypted DDO is stored on the blockchain. + +During the consumption flow, after a consumer obtains access to the asset by purchasing a datatoken, the provider takes responsibility for decrypting the DDO and fetching data from the source used by the data publisher. + +### What technology is used? + +* Python: This is the main programming language used in Provider. +* Flask: This Python framework is used to construct the Provider API. +* HTTP Server: Provider responds to HTTP requests from clients (like web browsers), facilitating the exchange of data and information over the internet. + +### How to run the provider? + +We recommend checking the README in the Provider [GitHub repository](https://github.com/oceanprotocol/provider) for the steps to run the Provider. If you see any errors in the instructions, please open an issue within the GitHub repository. + +### Ocean Provider Endpoints Specification + +The following pages in this section specify the endpoints for Ocean Provider that have been implemented by the core developers. + +For inspecting the errors received from `Provider` and their reasons, please review this [document](https://github.com/oceanprotocol/provider/blob/main/ocean_provider/routes/README.md). diff --git a/developers/provider/authentication-endpoints.md b/developers/provider/authentication-endpoints.md new file mode 100644 index 00000000..49282177 --- /dev/null +++ b/developers/provider/authentication-endpoints.md @@ -0,0 +1,114 @@ +# Authentication Endpoints + +Provider offers an alternative to signing each request by allowing users to generate auth tokens. The generated auth token can be used until its expiration in all supported requests. Simply omit the signature parameter and add the AuthToken request header based on a created token. + +Please note that if a signature parameter exists, it will take precedence over the AuthToken headers. All routes that support a signature parameter support the replacement, with the exception of auth-related ones (createAuthToken and deleteAuthToken need to be signed). + +### Create Auth Token + +**Endpoint:** `GET /api/services/createAuthToken` + +**Description:** Allows the user to create an authentication token that can be used to authenticate requests to the provider API, instead of signing each request. The generated auth token can be used until its expiration in all supported requests. + +**Parameters:** + +* `address`: The Ethereum address of the consumer (Optional). +* `nonce`: A unique identifier for this request, to prevent replay attacks (Required). +* `signature`: A digital signature proving ownership of the `address`. The signature should be generated by signing the hashed concatenation of the `address` and `nonce` parameters (Required). +* `expiration`: A valid future UTC timestamp representing when the auth token will expire (Required). + +**Curl Example:** + +{% code overflow="wrap" %} +``` +GET /api/services/createAuthToken?address=<address>&nonce=<nonce>&expiration=<expiration>&signature=<signature> +``` +{% endcode %} + +Inside the angle brackets, the user should provide the valid values for the request.
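For clarity, here is a hedged Python equivalent of the request above. All values are placeholders you must supply; in particular, the signature must be produced by the consumer's wallet over the hashed `address` + `nonce`, as described in the parameter list. On success, the response has the shape shown below.

```python
# Hedged sketch: request an auth token from a Provider instance.
import requests

provider_url = "https://v4.provider.oceanprotocol.com"  # any Provider instance
params = {
    "address": "<consumer-address>",
    "nonce": "<nonce>",
    "expiration": "<future-utc-timestamp>",
    "signature": "<signature>",  # produced by the consumer's wallet
}
response = requests.get(f"{provider_url}/api/services/createAuthToken", params=params)
print(response.json())  # e.g. {"token": "eyJ0eXAiOiJKV1QiLCJhbGciOi..."}
```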
+ +Response: + +{% code overflow="wrap" %} +``` +{"token": "eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJleHAiOjE2NjAwNTMxMjksImFkZHJlc3MiOiIweEE3OGRlYjJGYTc5NDYzOTQ1QzI0Nzk5MTA3NUUyYTBlOThCYTdBMDkifQ.QaRqYeSYxZpnFayzPmUkj8TORHHJ_vRY-GL88ZBFM0o"} +``` +{% endcode %} + +#### Javascript Example: + +```runkit nodeVersion="18.x.x" +const axios = require('axios'); +const address = "0x7e2a2FA2a064F693f0a55C5639476d913Ff12D05" +const nonce = "1" +const expiration = "" // replace with a valid future UTC timestamp +const signature = "" // replace with a valid signature over the hashed address + nonce +const url = `http://provider.oceanprotocol.com/api/services/createAuthToken?address=${address}&nonce=${nonce}&expiration=${expiration}&signature=${signature}`; +axios.get(url).then(response => { + console.log(response.data); +}).catch(error => { + console.error(error); +}); + +``` + +#### Delete Auth Token + +#### DELETE /api/services/deleteAuthToken + +Allows the user to delete an existing auth token before it naturally expires. + +Parameters + +``` +address: String object containing consumer's address (optional) +nonce: Integer, Nonce (required) +signature: String object containing user signature (signed message) + The signature is based on hashing the following parameters: + address + nonce +token: token to be expired +``` + +Returns: Success message if the token is successfully deleted. If the token is not found or already expired, returns an error message. + +#### Javascript Example: + +{% code overflow="wrap" %} +```javascript +const axios = require('axios'); + +// Define the address, token, and signature +const address = ''; // Replace with your address +const token = ''; // Replace with your token +const signature = ''; // Replace with your signature + +// Define the URL for the deleteAuthToken endpoint +const deleteAuthTokenURL = 'http://<provider-url>/api/services/deleteAuthToken'; // Replace with your provider's URL + +// Make the DELETE request +axios.delete(deleteAuthTokenURL, { + data: { + address: address, + token: token + }, + headers: { + 'Content-Type': 'application/json', + 'signature': signature + } +}) +.then(response => { + console.log(response.data); +}) +.catch(error => { + console.log('Error:', error); +}); + +``` +{% endcode %} + +Replace `<address>`, `<token>`, `<signature>`, and `<provider-url>` with actual values. This script sends a DELETE request to the `deleteAuthToken` endpoint and logs the response. Please ensure that `axios` is installed in your environment (`npm install axios`). + +#### Example Response: + +``` +{"success": "Token has been deactivated."} +``` diff --git a/developers/provider/compute-endpoints.md b/developers/provider/compute-endpoints.md new file mode 100644 index 00000000..ff467faf --- /dev/null +++ b/developers/provider/compute-endpoints.md @@ -0,0 +1,319 @@ +# Compute Endpoints + +All compute endpoints respond with an Array of status objects, each object describing a compute job. + +Each status object will contain: + +``` + owner: The owner of this compute job + documentId: String object containing document id (e.g.
a DID) + jobId: String object containing workflowId + dateCreated: Unix timestamp of job creation + dateFinished: Unix timestamp when job finished (null if job not finished) + status: Int, see below for list + statusText: String, see below + algorithmLogUrl: URL to get the algo log (for user) + resultsUrls: Array of URLs for algo outputs + resultsDid: If published, the DID +``` + +Status description (`statusText`): (see Operator-Service for full status list) + +| status | Description | | ------ | ----------------------------- | | 1 | Warming up | | 10 | Job started | | 20 | Configuring volumes | | 30 | Provisioning success | | 31 | Data provisioning failed | | 32 | Algorithm provisioning failed | | 40 | Running algorithm | | 50 | Filtering results | | 60 | Publishing results | | 70 | Job completed | + +### Create or restart compute job + +**Endpoint:** POST /api/services/compute + +Start a new job + +Parameters + +{% code overflow="wrap" %} +``` + signature: String object containing user signature (signed message) (required) + consumerAddress: String object containing consumer's ethereum address (required) + nonce: Integer, Nonce (required) + environment: String representing a compute environment offered by the provider + dataset: Json object containing dataset information + dataset.documentId: String, object containing document id (e.g. a DID) (required) + dataset.serviceId: String, ID of the service the datatoken is attached to (required) + dataset.transferTxId: Hex string, the id of on-chain transaction for approval of datatokens transfer + given to the provider's account (required) + dataset.userdata: Json, user-defined parameters passed to the dataset service (optional) + algorithm: Json object, containing algorithm information + algorithm.documentId: Hex string, the did of the algorithm to be executed (optional) + algorithm.meta: Json object, defines the algorithm attributes and url or raw code (optional) + algorithm.serviceId: String, ID of the service to use to process the algorithm (optional) + algorithm.transferTxId: Hex string, the id of on-chain transaction of the order to use the algorithm (optional) + algorithm.userdata: Json, user-defined parameters passed to the algorithm running service (optional) + algorithm.algocustomdata: Json object, algorithm custom parameters (optional) + additionalDatasets: Json object containing a list of dataset objects (optional) + + One of `algorithm.documentId` or `algorithm.meta` is required; `algorithm.meta` takes precedence +``` +{% endcode %} + +Returns: Array of `status` objects as described above; in this case the array will have only one object + +Example: + +```json +POST /api/services/compute +payload: +{ + "signature": "0x00110011", + "consumerAddress": "0x123abc", + "nonce": 1, + "environment": "env", + "dataset": { + "documentId": "did:op:2222...", + "serviceId": "compute", + "transferTxId": "0x0232123..." + } +} +``` + +Response: + +```json +[ + { + "jobId": "0x1111:001", + "status": 1, + "statusText": "Job started", + ...
+ } +] +``` + +### Status and Result + +#### GET /api/services/compute + +Get all jobs and corresponding stats + +Parameters + +{% code overflow="wrap" %} +``` + signature: String object containing user signature (signed message) + documentId: String object containing document did (optional) + jobId: String object containing workflowID (optional) + consumerAddress: String object containing consumer's address (optional) + + At least one parameter from documentId, jobId and consumerAddress is required (can be any of them) +``` +{% endcode %} + +Returns + +Array of `status` objects as described above + +Example: + +``` +GET /api/services/compute?signature=0x00110011&documentId=did:op:1111&jobId=012023 +``` + +Response: + +```json +[ + { + "owner": "0x1111", + "documentId": "did:op:2222", + "jobId": "3333", + "dateCreated": "2020-10-01T01:00:00Z", + "dateFinished": "2020-10-01T01:00:00Z", + "status": 5, + "statusText": "Job finished", + "algorithmLogUrl": "http://example.net/logs/algo.log", + "resultsUrls": [ + "http://example.net/logs/output/0", + "http://example.net/logs/output/1" + ], + "resultsDid": "did:op:87bdaabb33354d2eb014af5091c604fb4b0f67dc6cca4d18a96547bffdc27bcf" + }, + { + "owner": "0x1111", + "documentId": "did:op:2222", + "jobId": "3334", + "dateCreated": "2020-10-01T01:00:00Z", + "dateFinished": "2020-10-01T01:00:00Z", + "status": 5, + "statusText": "Job finished", + "algorithmLogUrl": "http://example.net/logs2/algo.log", + "resultsUrls": [ + "http://example.net/logs2/output/0", + "http://example.net/logs2/output/1" + ], + "resultsDid": "" + } +] +``` + +#### GET /api/services/computeResult + +Allows download of the compute job result. + +Parameters + +``` + jobId: String object containing workflowId (optional) + index: Integer, index of the result to download (optional) + consumerAddress: String object containing consumer's address (optional) + nonce: Integer, Nonce (required) + signature: String object containing user signature (signed message) +``` + +Returns: Bytes string containing the compute result. + +Example: + +{% code overflow="wrap" %} +``` +GET /api/services/computeResult?index=0&consumerAddress=0xA78deb2Fa79463945C247991075E2a0e98Ba7A09&jobId=4d32947065bb46c8b87c1f7adfb7ed8b&nonce=1644317370 +``` +{% endcode %} + +Response: + +``` +b'{"result": "0x0000000000000000000000000000000000000000000000000000000000000001"}' +``` + +### Stop + +#### PUT /api/services/compute + +Stop a running compute job. + +Parameters + +{% code overflow="wrap" %} +``` + signature: String object containing user signature (signed message) + documentId: String object containing document did (optional) + jobId: String object containing workflowID (optional) + consumerAddress: String object containing consumer's address (optional) + + At least one parameter from documentId, jobId and consumerAddress is required (can be any of them) +``` +{% endcode %} + +Returns + +Array of `status` objects as described above + +Example: + +``` +PUT /api/services/compute?signature=0x00110011&documentId=did:op:1111&jobId=012023 +``` + +Response: + +```json +[ + { + ..., + "status": 7, + "statusText": "Job stopped", + ... + } +] +``` + +### Delete + +#### DELETE /api/services/compute + +Delete a compute job and all resources associated with the job. If the job is running, it will be stopped first.
+ +Parameters + +``` + signature: String object containing user signature (signed message) + documentId: String object containing document did (optional) + jobId: String object containing workflowId (optional) + consumerAddress: String object containing consumer's address (optional) + + At least one parameter from documentId and jobId is required (can be either of them) + in addition to consumerAddress and signature +``` + +Returns + +Array of `status` objects as described above + +Example: + +``` +DELETE /api/services/compute?signature=0x00110011&documentId=did:op:1111&jobId=012023 +``` + +Response: + +```json +[ + { + ..., + "status": 8, + "statusText": "Job deleted successfully", + ... + } +] +``` + +#### GET /api/services/computeEnvironments + +Returns the list of compute environments offered by the Provider. + +Parameters + +{% code overflow="wrap" %} +``` +chainID: Int object representing the chain ID that the Provider is connected to (mandatory) +``` +{% endcode %} + +Returns: List of compute environments. + +Example: + +``` +GET /api/services/computeEnvironments?chainId=8996 +``` + +Response: + +```json +[ + { + "cpuType":"AMD Ryzen 7 5800X 8-Core Processor", + "currentJobs":0, + "desc":"This is a mocked environment", + "diskGB":2, + "gpuType":"AMD RX570", + "id":"ocean-compute", + "maxJobs":10, + "nCPU":2, + "nGPU":0, + "priceMin":2.3, + "ramGB":1 + }, + ... +] +``` diff --git a/developers/provider/encryption-decryption.md b/developers/provider/encryption-decryption.md new file mode 100644 index 00000000..9ccdde2d --- /dev/null +++ b/developers/provider/encryption-decryption.md @@ -0,0 +1,101 @@ +# Encryption / Decryption + +### Encrypt endpoint + +* **Endpoint**: `POST /api/services/encrypt` +* **Parameters**: The body of the request should contain a binary application/octet-stream. +* **Purpose**: This endpoint is used to encrypt a document. It accepts binary data and returns an encrypted bytes string. +* **Responses**: + * **200**: This is a successful HTTP response code. It returns a bytes string containing the encrypted document. For example: `b'0x04b2bfab1f4e...7ed0573'` + +Example response: + +```python +b'0x04b2bfab1f4e...7ed0573' +``` + +#### Javascript Example + +```runkit nodeVersion="18.x.x" +const fetch = require('cross-fetch') + +const data = "test" +const response = await fetch('https://v4.provider.oceanprotocol.com/api/services/encrypt?chainId=1', { + method: 'POST', + body: JSON.stringify(data), + headers: { 'Content-Type': 'application/octet-stream' } + }) +console.log(response) + +``` + +### Decrypt endpoint + +* **Endpoint**: `POST /api/services/decrypt` +* **Parameters**: The body of the request should contain a JSON object with the following properties: + * `decrypterAddress`: A string containing the address of the decrypter (required). + * `chainId`: The chain ID of the network the document is on (required). + * `transactionId`: The transaction ID of the encrypted document (optional). + * `dataNftAddress`: The address of the data non-fungible token (optional). + * `encryptedDocument`: The encrypted document (optional). + * `flags`: The flags of the encrypted document (optional). + * `documentHash`: The hash of the encrypted document (optional). + * `nonce`: The nonce of the encrypted document (required). + * `signature`: The signature of the encrypted document (required). +* **Purpose**: This endpoint is used to decrypt a document. It accepts the decrypter address, chain ID, and other optional parameters, and returns the decrypted document.
+* **Responses**: + * **200**: This is a successful HTTP response code. It returns a bytes string containing the decrypted document. + +#### Javascript Example + +{% code overflow="wrap" %} +```javascript +const axios = require('axios'); + +async function decryptAsset(payload) { + // Define the base URL of the services. + const SERVICES_URL = ""; // Replace with your base services URL. + + // Define the endpoint. + const endpoint = `${SERVICES_URL}/api/services/decrypt`; + + try { + // Send a POST request to the endpoint with the payload in the request body. + const response = await axios.post(endpoint, payload); + + // Check the response. + if (response.status !== 200) { + throw new Error(`Response status code is not 200: ${response.data}`); + } + + // Use the response data here. + console.log(response.data); + + } catch (error) { + console.error(error); + } +} + +// Define the payload. +let payload = { + "decrypterAddress": "", // Replace with your decrypter address. + "chainId": "", // Replace with your chain ID. + "transactionId": "", // Replace with your transaction ID. + "dataNftAddress": "", // Replace with your Data NFT Address. + "nonce": "", // Replace with your nonce (required). + "signature": "" // Replace with your signature (required). +}; + +// Run the function. +decryptAsset(payload); + +``` +{% endcode %} + + + +Example response: + +{% code overflow="wrap" %} +```python +b'{"@context": ["https://w3id.org/did/v1"], "id": "did:op:0c184915b07b44c888d468be85a9b28253e80070e5294b1aaed81c ...' +``` +{% endcode %} diff --git a/developers/provider/general-endpoints.md b/developers/provider/general-endpoints.md new file mode 100644 index 00000000..2d1fa7be --- /dev/null +++ b/developers/provider/general-endpoints.md @@ -0,0 +1,218 @@ +# General Endpoints + +### Nonce + +Retrieves the last-used nonce value for a specific user's Ethereum address. + +* **Endpoint**: `GET /api/services/nonce` +* **Parameters**: `userAddress`: This is a string that should contain the Ethereum address of the user. It is passed as a query parameter in the URL. +* **Purpose**: This endpoint is used to fetch the last-used nonce value for a user's Ethereum address. A nonce is a number that can only be used once, and it's typically used in cryptography to prevent replay attacks. While this endpoint provides the last-used nonce, it's recommended to use the current UTC timestamp as a nonce, where required in other endpoints. + +Here are some typical responses you might receive from the API: + +* **200**: This is a successful HTTP response code. It means the server has successfully processed the request and returns a JSON object containing the nonce value. + +Example response: + +```json +{ + "nonce": 23 +} +``` + +#### Javascript Example + +```runkit nodeVersion="18.x.x" +const axios = require('axios') + +const response = await axios( `https://v4.provider.oceanprotocol.com/api/services/nonce?userAddress=0x0db823218e337a6817e6d7740eb17635deadafaf`) + +console.log(response.status) +console.log(response.data) + +``` + +### File Info + +Retrieves Content-Type and Content-Length from the given URL or asset. + +* **Endpoint**: `POST /api/services/fileinfo` +* **Parameters**: The body of the request should contain a JSON object with the following properties: + * `did`: This is a string representing the Decentralized Identifier (DID) of the dataset. + * `serviceId`: This is a string representing the ID of the service. +* **Purpose**: This endpoint is used to retrieve the `Content-Type` and `Content-Length` from a given URL or asset. For published assets, `did` and `serviceId` should be provided.
It also accepts file objects (as described in the Ocean Protocol documentation) and can compute a checksum if the file size is less than `MAX_CHECKSUM_LENGTH`. For larger files, the checksum will not be computed. +* **Responses**: + * **200**: This is a successful HTTP response code. It returns a JSON object containing the file info. + +Example response: + +```json +[ + { + "contentLength":"1161", + "contentType":"application/json", + "index":0, + "valid": true + },... +] +``` + +#### Javascript Example + +```runkit nodeVersion="18.x.x" +const fetch = require('cross-fetch') + +// For a published asset, pass its DID and the service ID. +const payload = { + did: "", // Replace with the asset's DID. + serviceId: "" // Replace with the service ID. +} +const response = await fetch('https://v4.provider.oceanprotocol.com/api/services/fileinfo', { + method: 'POST', + body: JSON.stringify(payload), + headers: { 'Content-Type': 'application/json' } + }) +console.log(await response.json()) + +``` + +### Download + +* **Endpoint**: `GET /api/services/download` +* **Parameters**: The query parameters for this endpoint should contain the following properties: + * `documentId`: A string containing the document id (e.g., a DID). + * `serviceId`: A string representing the ID of the service the datatoken is attached to. + * `transferTxId`: A hex string representing the ID of the on-chain transaction for approval of data tokens transfer given to the provider's account. + * `fileIndex`: An integer representing the index of the file from the files list in the dataset. + * `nonce`: The nonce. + * `consumerAddress`: A string containing the consumer's Ethereum address. + * `signature`: A string containing the user's signature (signed message). +* **Purpose**: This endpoint is used to retrieve the attached asset files. It returns a file stream of the requested file. +* **Responses**: + * **200**: This is a successful HTTP response code. It means the server has successfully processed the request and returned the file stream. + +#### Javascript Example + +Before calling the `/download` endpoint, you need to follow these steps: + +1. You need to set up and connect a wallet for the consumer. The consumer needs to have purchased the datatoken for the asset that you are trying to download. Libraries such as ocean.js or ocean.py can be used for this. +2. Get the nonce. This can be done by calling the `/nonce` endpoint above. +3. Sign a message from the account that has purchased the datatoken. +4. Add the nonce and signature to the payload. + +```javascript +const axios = require('axios'); + +async function downloadAsset(payload) { + // Define the base URL of the services. + const SERVICES_URL = ""; // Replace with your base services URL. + + // Define the endpoint. + const endpoint = `${SERVICES_URL}/api/services/download`; + + try { + // Send a GET request to the endpoint with the payload as query parameters. + const response = await axios.get(endpoint, { params: payload }); + + // Check the response. + if (response.status !== 200) { + throw new Error(`Response status code is not 200: ${response.data}`); + } + + // Use the response data here. + console.log(response.data); + + } catch (error) { + console.error(error); + } +} + +// Define the payload. +let payload = { + "documentId": "", // Replace with your document ID. + "serviceId": "", // Replace with your service ID. + "consumerAddress": "", // Replace with your consumer address. + "transferTxId": "", // Replace with your transfer transaction ID. + "fileIndex": 0 +}; + +// Run the function.
+downloadAsset(payload); + +``` + +### Initialize + +In order to consume a data service, the user is required to send one datatoken to the provider. + +The datatoken is transferred on the blockchain by requesting the user to sign an ERC20 approval transaction where the approval is given to the provider's account for the number of tokens required by the service. + +* **Endpoint**: `GET /api/services/initialize` +* **Parameters**: The query parameters for this endpoint should contain the following properties: + * `documentId`: A string containing the document id (e.g., a DID). + * `serviceId`: A string representing the ID of the service the data token is attached to. + * `consumerAddress`: A string containing the consumer's Ethereum address. + * `environment`: A string representing a compute environment offered by the provider. + * `validUntil`: An integer representing the date of validity of the service (optional). + * `fileIndex`: An integer representing the index of the file from the files list in the dataset. If set, the provider will validate the file access (optional). +* **Purpose**: This endpoint is used to initialize a service and return a quote for the number of tokens to transfer to the provider's account. +* **Responses**: + * **200**: This is a successful HTTP response code. It returns a JSON object containing information about the quote for tokens to be transferred. + +#### Javascript Example + +```javascript +const axios = require('axios'); + +async function initializeServiceAccess(payload) { + // Define the base URL of the services. + const SERVICES_URL = ""; // Replace with your base services URL. + + // Define the endpoint. + const endpoint = `${SERVICES_URL}/api/services/initialize`; + + try { + // Send a GET request to the endpoint with the payload in the request query. + const response = await axios.get(endpoint, { params: payload }); + + // Check the response. + if (response.status !== 200) { + throw new Error(`Response status code is not 200: ${response.data}`); + } + + // Use the response data here. + console.log(response.data); + + } catch (error) { + console.error(error); + } +} + +// Define the payload. +let payload = { + "documentId": "", // Replace with your document ID. + "consumerAddress": "", // Replace with your consumer address. + "serviceId": "", // Replace with your service ID. + // Add other necessary parameters as needed. +}; + +// Run the function. +initializeServiceAccess(payload); + +``` + +Example response: + +```json +{ + "datatoken": "0x21fa3ea32892091...", + "nonce": 23, + "providerFee": { + "providerFeeAddress": "0xabc123...", + "providerFeeToken": "0xabc123...", + "providerFeeAmount": "200", + "providerData": "0xabc123...", + "v": 27, + "r": "0xabc123...", + "s": "0xabc123...", + "validUntil": 123456 + }, + "computeAddress": "0x8123jdf8sdsa..." +} +``` diff --git a/developers/retrieve-datatoken-address.md b/developers/retrieve-datatoken-address.md new file mode 100644 index 00000000..855ed174 --- /dev/null +++ b/developers/retrieve-datatoken-address.md @@ -0,0 +1,57 @@ +--- +description: >- + Use these steps to reveal the information contained within an asset's DID and + list the buyers of a datatoken +--- + +# Retrieve datatoken/data NFT addresses & Chain ID + +### How to find the network, datatoken address, and data NFT address from an Ocean Market link? + +If you are given an Ocean Market link, then the network and datatoken address for the asset are visible on the Ocean Market webpage.
For example, given this asset's Ocean Market link: [https://odc.oceanprotocol.com/asset/did:op:1b26eda361c6b6d307c8a139c4aaf36aa74411215c31b751cad42e59881f92c1](https://odc.oceanprotocol.com/asset/did:op:1b26eda361c6b6d307c8a139c4aaf36aa74411215c31b751cad42e59881f92c1), the webpage shows that the asset is hosted on the Mumbai network. Simply click the datatoken's hyperlink to reveal its address, as shown in the screenshot below:

See the Network and Datatoken Address for an Ocean Market asset by visiting the asset's Ocean Market page.

+ +#### More Detailed Info: + +You can also access all the information for the Ocean Market asset by **enabling Debug mode**. To do this, follow these steps: + +**Step 1** - Click the Settings button in the top right corner of the Ocean Market + +

Click the Settings button

+ +**Step 2** - Check the Activate Debug Mode box in the dropdown menu + +

Check 'Activate Debug Mode'

+ +**Step 3** - Go to the page for the asset you would like to examine, and scroll through the DDO information to find the NFT address, datatoken address, chain ID, and other information. + +
+ +### How to use Aquarius to find the chainID and datatoken address from a DID? + +If you know the DID:op but you don't know the source link, then you can use Ocean Aquarius to resolve the metadata for the DID:op to find the `chainId` + `datatoken address` of the asset. Simply enter "[https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/](https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/did:op:1b26eda361c6b6d307c8a139c4aaf36aa74411215c31b751cad42e59881f92c1)" followed by the DID:op in your browser to fetch the metadata. + +For example, for the following DID:op: "did:op:1b26eda361c6b6d307c8a139c4aaf36aa74411215c31b751cad42e59881f92c1" the Ocean Aquarius URL can be modified to add the DID:op and resolve its metadata. Simply add "[https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/](https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/did:op:1b26eda361c6b6d307c8a139c4aaf36aa74411215c31b751cad42e59881f92c1)" to the beginning of the DID:op and enter the link in your browser like this: [https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/did:op:1b26eda361c6b6d307c8a139c4aaf36aa74411215c31b751cad42e59881f92c1](https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/did:op:1b26eda361c6b6d307c8a139c4aaf36aa74411215c31b751cad42e59881f92c1) + +

The metadata printout for this DID:op with the network's Chain ID and datatoken address circled in red

+ +Here are the networks and their corresponding chain IDs: + +```json +{ + "mumbai": 80001, + "polygon": 137, + "bsc": 56, + "energyweb": 246, + "moonriver": 1285, + "mainnet": 1, + "goerli": 5, + "polygonedge": 81001, + "gaiaxtestnet": 2021000, + "alfajores": 44787, + "gen-x-testnet": 100, + "filecointestnet": 3141, + "oasis_saphire_testnet": 23295, + "development": 8996 +} +``` + diff --git a/developers/storage.md b/developers/storage.md new file mode 100644 index 00000000..f8e33156 --- /dev/null +++ b/developers/storage.md @@ -0,0 +1,174 @@ +--- +title: Storage Specifications +description: Specification of storage options for assets in Ocean Protocol. +--- + +# Storage Specifications + +Ocean does not handle the actual storage of files directly. The files are stored via other services which are then specified within the DDO. + +During the publish process, file URLs must be encrypted with a respective _Provider_ API call before storing the DDO on-chain. For this, you need to send the following object to Provider (where "files" contains one or more storage objects): + +```json +{ + "datatokenAddress":"0x1", + "nftAddress": "0x2", + "files": [ + ... + ] +} +``` + +The remainder of this document specifies the different types of storage objects that are supported: + +## Static URLs + +Parameters: + +* `url` - File _URL_, **required** +* `method` - The HTTP _method_, **required** +* `headers` - Additional HTTP _headers_, **optional** + +```json +{ + "type": "url", + "url": "https://url.com/file1.csv", + "method": "GET", + "headers": + { + "Authorization": "Bearer 123", + "APIKEY": "124" + } +} +``` + +## Interplanetary File System + +**`IPFS`** + +The [Interplanetary File System](https://ipfs.tech/) (IPFS) is a distributed file storage protocol that allows computers all over the globe to store and serve files as part of a giant peer-to-peer network. Any computer, anywhere in the world, can download the IPFS software and start hosting and serving files. + +Parameters: + +* `hash` - The file _hash_, **required** +
```json
{
    "type": "ipfs",
    "hash": "XXX"
}
```
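Tying these storage objects back to the publish flow described at the top of this page, here is a hedged Python sketch of submitting a "files" object (one URL file and one IPFS file) for encryption. The endpoint and header usage mirror the Provider encryption docs; the addresses and hash are placeholders:

```python
# Hedged sketch: encrypt a "files" object before publishing the DDO on-chain.
import json
import requests

payload = {
    "datatokenAddress": "0x1",  # placeholder
    "nftAddress": "0x2",        # placeholder
    "files": [
        {"type": "url", "url": "https://url.com/file1.csv", "method": "GET"},
        {"type": "ipfs", "hash": "XXX"},
    ],
}
response = requests.post(
    "https://v4.provider.oceanprotocol.com/api/services/encrypt?chainId=1",
    data=json.dumps(payload),
    headers={"Content-Type": "application/octet-stream"},
)
print(response.content)  # encrypted bytes string, e.g. b'0x04b2bfab...'
```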
+ +## GraphQL + +**`GraphQL`** + +[GraphQL](https://graphql.org/) is a query language for APIs and a runtime for fulfilling those queries with your existing data. + +Parameters: + +* `url` - Server endpoint _URL_, **required** +* `query` - The _query_ to be executed, **required** +* `headers` - Additional HTTP headers, **optional** + +```json +{ + "type": "graphql", + "url": "http://172.15.0.15:8000/subgraphs/name/oceanprotocol/ocean-subgraph", + "headers":{ + "Authorization": "Bearer 123", + "APIKEY": "124" + }, + "query": """query{ + nfts(orderBy: createdTimestamp,orderDirection:desc){ + id + symbol + createdTimestamp + } + }""" +} +``` + +## Smart Contract Data + +Use a smart contract as a data source. + +Parameters: + +* `chainId` - The _chainId_ used to query the contract, **required** +* `address` - The smart contract _address_, **required** +* `abi` - The function _abi_ (NOT the entire contract abi), **required** + +{% code overflow="wrap" %} +```json +{ +"type": "smartcontract", +"chainId": 1, +"address": "0x8149276f275EEFAc110D74AFE8AFECEaeC7d1593", +"abi": { + "inputs": [], + "name": "swapOceanFee", + "outputs": [{"internalType": "uint256", "name": "", "type": "uint256"}], + "stateMutability": "view", + "type": "function" + } +} +``` +{% endcode %} + +## Arweave + +[Arweave](https://www.arweave.org/) is a decentralized data storage network that allows files to be stored permanently over a distributed network of computers. + +Parameters: + +* `transactionId` - The _transaction identifier_, **required** + +```json +{ + "type": "arweave", + "transactionId": "a4qJoQZa1poIv5guEzkfgZYSAD0uYm7Vw4zm_tCswVQ" +} +``` + +First-class integrations supported in the future: **`Filecoin`** **`Storj`** **`SQL`** + +A service can contain multiple files, using multiple storage types. + +Example: + +```json +{ + "datatokenAddress": "0x1", + "nftAddress": "0x2", + "files": [ + { + "type": "url", + "url": "https://url.com/file1.csv", + "method": "GET" + }, + { + "type": "ipfs", + "hash": "XXXX" + } + ] +} +``` + +To get information about the files after encryption, the `/fileinfo` endpoint of the [_Provider_](provider/README.md) returns, based on a passed DID, an array of file metadata (based on the file type): + +```json +[ + { + "type": "url", + "contentLength": 100, + "contentType": "application/json" + }, + { + "type": "ipfs", + "contentLength": 130, + "contentType": "application/text" + } +] +``` + +This only concerns metadata about a file, but never the file URLs. The only way to decrypt them is to exchange at least 1 datatoken based on the respective service pricing scheme. diff --git a/developers/subgraph/README.md b/developers/subgraph/README.md new file mode 100644 index 00000000..f96b3f32 --- /dev/null +++ b/developers/subgraph/README.md @@ -0,0 +1,49 @@ +--- +description: >- + Unlocking the Speed: Subgraph - Bringing Lightning-Fast Retrieval to On-Chain + Data. +--- + +# Subgraph + +### What is the Subgraph? + +The [Ocean Subgraph](https://github.com/oceanprotocol/ocean-subgraph) is built on top of [The Graph](https://thegraph.com/) (the popular :sunglasses: indexing and querying protocol for blockchain data). It is an essential component of the Ocean Protocol ecosystem. It provides an off-chain service that utilizes GraphQL to offer efficient access to information related to datatokens, users, and balances. By leveraging the subgraph, data retrieval becomes faster compared to an on-chain query.
The data sourced from the Ocean subgraph can be accessed through [GraphQL](https://graphql.org/learn/) queries. + +Imagine this 💭: if you were to always fetch data on-chain, you'd start to feel a little...old :older\_woman: Like your queries are stuck in a time warp. But fear not! When you embrace the power of the subgraph, data becomes your elixir of youth. + +

Ocean Subgraph

+ +The subgraph reads data from the blockchain, extracting relevant information. Additionally, it indexes events emitted from the Ocean smart contracts. This collected data is then made accessible to any decentralized applications (dApps) that require it, through GraphQL queries. The subgraph organizes and presents the data in a JSON format, facilitating efficient and structured access for dApps. + +### How to use the Subgraph? + +You can utilize the Subgraph instances provided by Ocean Protocol or deploy your own instance. Deploying your own instance allows you to have more control and customization options for your specific use case. To learn how to host your own Ocean Subgraph instance, refer to the guide available on the [Deploying Ocean Subgraph](../../infrastructure/deploying-ocean-subgraph.md) page. + +If you're eager to use the Ocean Subgraph, here's some important information for you: We've deployed an Ocean Subgraph for each of the supported networks. Take a look at the table below, where you'll find handy links to both the subgraph instance and GraphiQL for each network. With the user-friendly GraphiQL interface, you can execute GraphQL queries directly, without any additional setup. It's a breeze! :ocean: + +{% hint style="info" %} +When it comes to fetching valuable information about [Data NFTs](../contracts/data-nfts.md) and [datatokens](../contracts/datatokens.md), the subgraph queries play a crucial role. They retrieve numerous details and information, but the Subgraph cannot decrypt the DDO. Worry not, though: we have a dedicated component for that—[Aquarius](../aquarius/)! 🐬 Aquarius communicates with the provider and decrypts the encrypted information, making it readily available for queries. +{% endhint %} + +### Ocean Subgraph deployments + +| Network | Subgraph URL | GraphiQL URL | | ------------------- | ----------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------- | | Ethereum | [Subgraph](https://v4.subgraph.mainnet.oceanprotocol.com) | [GraphiQL](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) | | Polygon | [Subgraph](https://v4.subgraph.polygon.oceanprotocol.com/) | [GraphiQL](https://v4.subgraph.polygon.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) | | Binance Smart Chain | [Subgraph](https://v4.subgraph.bsc.oceanprotocol.com) | [GraphiQL](https://v4.subgraph.bsc.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) | | Energy Web Chain | [Subgraph](https://v4.subgraph.energyweb.oceanprotocol.com) | [GraphiQL](https://v4.subgraph.energyweb.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) | | Moonriver | [Subgraph](https://v4.subgraph.moonriver.oceanprotocol.com) | [GraphiQL](https://v4.subgraph.moonriver.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) | | Mumbai | [Subgraph](https://v4.subgraph.mumbai.oceanprotocol.com) | [GraphiQL](https://v4.subgraph.mumbai.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) | | Görli | [Subgraph](https://v4.subgraph.goerli.oceanprotocol.com) | [GraphiQL](https://v4.subgraph.goerli.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) | + +{% hint style="warning" %} +When making subgraph queries, please remember that the parameters you send, such as a datatoken address or a data NFT address, should be in **lowercase**.
This is an essential requirement to ensure accurate processing of the queries. We kindly request your attention to this detail to facilitate a seamless query experience. +{% endhint %} + +In the following pages, we've prepared a few examples just for you. From running queries to exploring data, you'll have the chance to dive right into the Ocean Subgraph data. There, you'll find a wide range of additional code snippets and examples that showcase the power and versatility of the Ocean Subgraph. So, grab a virtual snorkel, and let's explore together! 🤿 + +{% hint style="info" %} +For more examples, visit the subgraph GitHub [repository](https://github.com/oceanprotocol/ocean-subgraph), where you'll discover an extensive collection of code snippets and examples that highlight the Subgraph's capabilities and adaptability. +{% endhint %} diff --git a/building-with-ocean/using-ocean-subgraph/get-data-nft-information.md b/developers/subgraph/get-data-nft-information.md similarity index 56% rename from building-with-ocean/using-ocean-subgraph/get-data-nft-information.md rename to developers/subgraph/get-data-nft-information.md index 0a95b7f2..4972981b 100644 --- a/building-with-ocean/using-ocean-subgraph/get-data-nft-information.md +++ b/developers/subgraph/get-data-nft-information.md @@ -1,16 +1,30 @@ -# Get Data NFT Information +--- +description: >- + Explore the Power of Querying: Unveiling In-Depth Details of Individual Data + NFTs +--- -The result of following GraphQL query returns the information about a particular datatoken. Here, `0x1c161d721e6d99f58d47f709cdc77025056c544c` is the address of the dataNFT. +# Get data NFT information -{% hint style="info" %} -Copy the query in the [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) to fetch the results from the mainnet. For other networks use [this table](./#ocean-subgraph-graphiql). -{% endhint %} +Now that you are familiar with the process of retrieving a list of data NFTs 😎, let's explore how to obtain more specific details about a particular NFT through querying. By utilizing the knowledge you have gained, you can customize your GraphQL query to include additional parameters such as the NFT's metadata, creator information, template, or any other relevant data points. This will enable you to delve deeper into the intricacies of a specific NFT and gain a comprehensive understanding of its attributes. With this newfound capability, you can unlock valuable insights and make informed decisions based on the specific details retrieved. So, let's dive into the fascinating world of querying and unravel the unique characteristics of individual data NFTs. -#### Query -```graphql -{ - nft (id:"0x1c161d721e6d99f58d47f709cdc77025056c544c", subgraphError:deny){ + + +The result of the following GraphQL query returns the information about a particular data NFT. In this example, the data NFT's address is `0x1c161d721e6d99f58d47f709cdc77025056c544c`. + +_PS: In this example, the query is executed on the Ocean subgraph deployed on the mainnet. If you want to change the network, please refer to_ [_this table_](README.md#ocean-subgraph-deployments)_._ + +{% tabs %} +{% tab title="Javascript" %} +The JavaScript below can be used to run the query and fetch the information of a data NFT. If you wish to change the network, change the value of the `network` variable as needed. Change the value of the `datanftAddress` variable to the address of your choice.
+ +```runkit nodeVersion="18.x.x" +var axios = require('axios'); + +const datanftAddress = "0x1c161d721e6d99f58d47f709cdc77025056c544c"; + +const query = `{ + nft (id:"${datanftAddress}", subgraphError:deny){ id name symbol @@ -31,16 +45,32 @@ Copy the query in the [GraphiQL interface](https://v4.subgraph.mainnet.oceanprot template orderCount } -} +}` + +const network = "mainnet" +var config = { + method: 'post', + url: `https://v4.subgraph.${network}.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph`, + headers: { "Content-Type": "application/json" }, + data: JSON.stringify({ "query": query }) +}; + +axios(config) + .then(function (response) { + let result = JSON.stringify(response.data) + console.log(result) + }) + .catch(function (error) { + console.log(error); + }); + +``` +{% endtab %} -#### Code - -{% tabs %} {% tab title="Python" %} -The python script below can be used to run the the query. If you wish to change the network, then replace the value of variable `base_url` as needed. Change the value of the variable dataNFT\_address with the address of the datatoken of your choice. +The Python script below can be used to run the query and fetch the details about an NFT. If you wish to change the network, change the value of the `base_url` variable as needed. Change the value of the `dataNFT_address` variable to the address of the data NFT of your choice. -#### Create script +**Create script** {% code title="dataNFT_information.py" %} ```python @@ -91,22 +121,16 @@ print(json.dumps(result, indent=4, sort_keys=True)) **Execute script** -
python dataNFT_information.py
+
python dataNFT_information.py
+
{% endtab %} -{% tab title="Javascript" %} -The javascript below can be used to run the the query. If you wish to change the network, then replace the value of variable `baseUrl` as needed. Change the value of the variable `datanftAddress` with the address of the datatoken of your choice. +{% tab title="Query" %} +Copy the query to fetch the information about a data NFT in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql). If you want to fetch the information about another NFT, replace the `id` with the address of your choice. -#### Create script - -{% code title="dataNFTInfo.js" %} -```javascript -var axios = require('axios'); - -const datanftAddress = "0x1c161d721e6d99f58d47f709cdc77025056c544c"; - -const query = `{ - nft (id:"${datanftAddress}", subgraphError:deny){ +```graphql +{ + nft (id:"0x1c161d721e6d99f58d47f709cdc77025056c544c", subgraphError:deny){ id name symbol @@ -127,41 +151,11 @@ const query = `{ template orderCount } -}` - -const baseUrl = "https://v4.subgraph.mainnet.oceanprotocol.com" -const route = "/subgraphs/name/oceanprotocol/ocean-subgraph" - -const url = `${baseUrl}${route}` - -var config = { - method: 'post', - url: url, - headers: { "Content-Type": "application/json" }, - data: JSON.stringify({ "query": query }) -}; - -axios(config) - .then(function (response) { - console.log(JSON.stringify(response.data)); - }) - .catch(function (error) { - console.log(error); - }); - -``` -{% endcode %} - -#### Execute script - -```bash -node dataNFTInfo.js +} ``` {% endtab %} {% endtabs %} -#### Response -
Sample response diff --git a/developers/subgraph/get-datatoken-buyers.md b/developers/subgraph/get-datatoken-buyers.md new file mode 100644 index 00000000..6eca5845 --- /dev/null +++ b/developers/subgraph/get-datatoken-buyers.md @@ -0,0 +1,373 @@ +--- +description: Query the Subgraph to see the buyers of a datatoken. +--- + +# Get datatoken buyers + +The result of the following GraphQL query returns the list of buyers for a particular datatoken. Here, `0xc22bfd40f81c4a28c809f80d05070b95a11829d9` is the address of the datatoken. + +_PS: In this example, the query is executed on the Ocean subgraph deployed on the **Mumbai** network. If you want to change the network, please refer to_ [_this table_](README.md#ocean-subgraph-deployments)_._ + +{% tabs %} +{% tab title="JavaScript" %} +The JavaScript below can be used to run the query and fetch the list of buyers for a datatoken. If you wish to change the network, change the value of the `network` variable as needed. Change the value of the `datatoken` variable to the address of your choice. + +```runkit nodeVersion="18.x.x" +const axios = require('axios') + +const datatoken = "0xc22bfd40f81c4a28c809f80d05070b95a11829d9".toLowerCase() + +const query = `{ + token(id : "${datatoken}") { + id, + orders( + orderBy: createdTimestamp + orderDirection: desc + first: 1000 + ) { + id + consumer { + id + } + payer { + id + } + reuses { + id + } + block + createdTimestamp + amount + } + } +}` + +const network = "mumbai" +var config = { + method: 'post', + url: `https://v4.subgraph.${network}.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph`, + headers: { "Content-Type": "application/json" }, + data: JSON.stringify({ "query": query }) +}; + +axios(config) + .then(function (response) { + const orders = response.data.data.token.orders + console.log(orders) + for (let order of orders) { + console.log('id:' + order.id + ' consumer: ' + order.consumer.id + ' payer: ' + order.payer.id) + } + }) + .catch(function (error) { + console.log(error); +}); + +``` +{% endtab %} + +{% tab title="Python" %} +The Python script below can be used to run the query and fetch the list of buyers for a datatoken. If you wish to change the network, change the value of the `base_url` variable as needed. Change the value of the `datatoken_address` variable to the address of the datatoken of your choice.
+ +**Create Script** + +{% code title="datatoken_buyers.py" %} +```python +import requests +import json + +datatoken_address = "0xc22bfd40f81c4a28c809f80d05070b95a11829d9" +query = """ +{{ + token(id:"{0}"){{ + id, + orders( + orderBy: createdTimestamp + orderDirection: desc + first: 1000 + ){{ + id + consumer{{ + id + }} + payer{{ + id + }} + reuses{{ + id + }} + block + createdTimestamp + amount + }} + }} +}}""".format( + datatoken_address +) + +base_url = "https://v4.subgraph.mumbai.oceanprotocol.com" +route = "/subgraphs/name/oceanprotocol/ocean-subgraph" + +url = base_url + route + +headers = {"Content-Type": "application/json"} +payload = json.dumps({"query": query}) +response = requests.request("POST", url, headers=headers, data=payload) +result = json.loads(response.text) + +print(json.dumps(result, indent=4, sort_keys=True)) +``` +{% endcode %} + +**Execute Script** + +``` +python datatoken_buyers.py +``` +{% endtab %} + +{% tab title="Query" %} +Copy the query to fetch the list of buyers for a datatoken in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mumbai.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph). + +```graphql +{ + token(id : "0xc22bfd40f81c4a28c809f80d05070b95a11829d9") { + id, + orders( + orderBy: createdTimestamp + orderDirection: desc + first: 1000 + ) { + id + consumer { + id + } + payer { + id + } + reuses { + id + } + block + createdTimestamp + amount + } + } +} +``` +{% endtab %} +{% endtabs %} + +
+ +Sample response + +{% code overflow="wrap" %} +```json +{ + "data": { + "token": { + "id": "0xc22bfd40f81c4a28c809f80d05070b95a11829d9", + "orders": [ + { + "amount": "1", + "block": 36669814, + "consumer": { + "id": "0x0b58857708a6f84e7ee04beaef069a7e6d1d4a0b" + }, + "createdTimestamp": 1686386048, + "id": "0xd65c927af039bed60be4bfcb00a75eebe7db695598350ba9bc6cb5d6a6180062-0xc22bfd40f81c4a28c809f80d05070b95a11829d9-0x0b58857708a6f84e7ee04beaef069a7e6d1d4a0b-38.0", + "payer": { + "id": "0x0b58857708a6f84e7ee04beaef069a7e6d1d4a0b" + }, + "reuses": [] + }, + { + "amount": "1", + "block": 35582325, + "consumer": { + "id": "0x027bfbe29df80bde49845b6fecf5e4ed14518f1f" + }, + "createdTimestamp": 1684067341, + "id": "0x118317568256f457a6ac29ba03875ad83815d5d8ec834c721ea20d80643d8629-0xc22bfd40f81c4a28c809f80d05070b95a11829d9-0x027bfbe29df80bde49845b6fecf5e4ed14518f1f-0.0", + "payer": { + "id": "0x027bfbe29df80bde49845b6fecf5e4ed14518f1f" + }, + "reuses": [] + }, + { + "amount": "1", + "block": 35578590, + "consumer": { + "id": "0x86874bf84f0d27dcfc6c4c34ab99aad8ced8d892" + }, + "createdTimestamp": 1684059403, + "id": "0xe9668b60b5fe7cbfacf0311ae4dc93c50c43484c0a8cf94db783ffbee1be7cd5-0xc22bfd40f81c4a28c809f80d05070b95a11829d9-0x86874bf84f0d27dcfc6c4c34ab99aad8ced8d892-1.0", + "payer": { + "id": "0x86874bf84f0d27dcfc6c4c34ab99aad8ced8d892" + }, + "reuses": [] + }, + { + "amount": "1", + "block": 35511102, + "consumer": { + "id": "0xb62e762af637b49eb4870bce8fe21bfff189e495" + }, + "createdTimestamp": 1683915991, + "id": "0x047a7ce1b3c69a5fc4c2c8078a2cc356164519077ef095265e4bcba1e0baf6c9-0xc22bfd40f81c4a28c809f80d05070b95a11829d9-0xb62e762af637b49eb4870bce8fe21bfff189e495-0.0", + "payer": { + "id": "0xb62e762af637b49eb4870bce8fe21bfff189e495" + }, + "reuses": [] + }, + { + "amount": "1", + "block": 35331127, + "consumer": { + "id": "0x85c1bbdc1b6a199e0964cb849deb59aef3045edd" + }, + "createdTimestamp": 1683533500, + "id": "0x8cbfb5a85d43f5a5b4aff4a2d657fe7dac4528a86cc78f21897fdd0169d3b3c3-0xc22bfd40f81c4a28c809f80d05070b95a11829d9-0x85c1bbdc1b6a199e0964cb849deb59aef3045edd-0.0", + "payer": { + "id": "0x85c1bbdc1b6a199e0964cb849deb59aef3045edd" + }, + "reuses": [] + }, + { + "amount": "1", + "block": 35254580, + "consumer": { + "id": "0xf9df381272afc2d1bd8fbbc0061cdb1d387c2032" + }, + "createdTimestamp": 1683370838, + "id": "0x246637f9a410664c6880e7768880696763e7fd66aa7cc286fdc62d5d8589481c-0xc22bfd40f81c4a28c809f80d05070b95a11829d9-0xf9df381272afc2d1bd8fbbc0061cdb1d387c2032-3.0", + "payer": { + "id": "0xf9df381272afc2d1bd8fbbc0061cdb1d387c2032" + }, + "reuses": [] + }, + { + "amount": "1", + "block": 35110175, + "consumer": { + "id": "0x726ab53c8da3efed40a32fe6ab5daa65b9da7ede" + }, + "createdTimestamp": 1683063962, + "id": "0xed9bcc6149cab8ee67a38d6b423a05ca328533d43ff83aff140fe9c424e449ee-0xc22bfd40f81c4a28c809f80d05070b95a11829d9-0x726ab53c8da3efed40a32fe6ab5daa65b9da7ede-9.0", + "payer": { + "id": "0x726ab53c8da3efed40a32fe6ab5daa65b9da7ede" + }, + "reuses": [] + }, + { + "amount": "1", + "block": 35053093, + "consumer": { + "id": "0x56e08babb8bf928bd8571d2a2a78235ae57ae5bd" + }, + "createdTimestamp": 1682942664, + "id": "0xa97fa2c99f8e5f16ba7245989830c552bace1f72476f5dee4da01c0d56ada7be-0xc22bfd40f81c4a28c809f80d05070b95a11829d9-0x56e08babb8bf928bd8571d2a2a78235ae57ae5bd-12.0", + "payer": { + "id": "0x56e08babb8bf928bd8571d2a2a78235ae57ae5bd" + }, + "reuses": [] + }, + { + "amount": "1", + "block": 34985052, + "consumer": { + "id": "0x56e08babb8bf928bd8571d2a2a78235ae57ae5bd" + }, + 
"createdTimestamp": 1682798076, + "id": "0xb9b72efad41ded4fcb7e23f14a7caa3ebc4fdfbb710318cbf25d92068c8a650d-0xc22bfd40f81c4a28c809f80d05070b95a11829d9-0x56e08babb8bf928bd8571d2a2a78235ae57ae5bd-0.0", + "payer": { + "id": "0x56e08babb8bf928bd8571d2a2a78235ae57ae5bd" + }, + "reuses": [] + }, + { + "amount": "1", + "block": 34984847, + "consumer": { + "id": "0x3f0cc2ad70839e2b684f173389f7dd71fe5186ff" + }, + "createdTimestamp": 1682797640, + "id": "0x9d616c85fdfe8655640bf77ecea0e42a7a9d331c5f51975f2a56b4f5ac8ec955-0xc22bfd40f81c4a28c809f80d05070b95a11829d9-0x3f0cc2ad70839e2b684f173389f7dd71fe5186ff-0.0", + "payer": { + "id": "0x3f0cc2ad70839e2b684f173389f7dd71fe5186ff" + }, + "reuses": [] + }, + { + "amount": "1", + "block": 34982389, + "consumer": { + "id": "0x3f0cc2ad70839e2b684f173389f7dd71fe5186ff" + }, + "createdTimestamp": 1682792418, + "id": "0x16eee832f9e85ca8ac8f82aecb8861e5bb5378c2771bf9abd3930b9438dbbc01-0xc22bfd40f81c4a28c809f80d05070b95a11829d9-0x3f0cc2ad70839e2b684f173389f7dd71fe5186ff-9.0", + "payer": { + "id": "0x3f0cc2ad70839e2b684f173389f7dd71fe5186ff" + }, + "reuses": [] + }, + { + "amount": "1", + "block": 34980112, + "consumer": { + "id": "0x3f0cc2ad70839e2b684f173389f7dd71fe5186ff" + }, + "createdTimestamp": 1682787580, + "id": "0x5264d4694fc78d9211a658363d98571f8d455dfcf89f3450520909416a103c2c-0xc22bfd40f81c4a28c809f80d05070b95a11829d9-0x3f0cc2ad70839e2b684f173389f7dd71fe5186ff-0.0", + "payer": { + "id": "0x3f0cc2ad70839e2b684f173389f7dd71fe5186ff" + }, + "reuses": [] + }, + { + "amount": "1", + "block": 34969169, + "consumer": { + "id": "0x616b5249aaf1c924339f8b8e94474e64ceb22af3" + }, + "createdTimestamp": 1682764326, + "id": "0x7222faab923d80218b242aec2670c1a775c77a254a28782e04aed5cb36c395d3-0xc22bfd40f81c4a28c809f80d05070b95a11829d9-0x616b5249aaf1c924339f8b8e94474e64ceb22af3-18.0", + "payer": { + "id": "0x616b5249aaf1c924339f8b8e94474e64ceb22af3" + }, + "reuses": [] + }, + { + "amount": "1", + "block": 34938635, + "consumer": { + "id": "0x71eb23e03d3005803db491639a7ebb717810bd04" + }, + "createdTimestamp": 1682699439, + "id": "0x3eae9d33fe3223e25ca058955744c98ba8aa211b1e3e1bf62eb653c0d0441b79-0xc22bfd40f81c4a28c809f80d05070b95a11829d9-0x71eb23e03d3005803db491639a7ebb717810bd04-0.0", + "payer": { + "id": "0x71eb23e03d3005803db491639a7ebb717810bd04" + }, + "reuses": [] + }, + { + "amount": "1", + "block": 34938633, + "consumer": { + "id": "0x726ab53c8da3efed40a32fe6ab5daa65b9da7ede" + }, + "createdTimestamp": 1682699435, + "id": "0x8dfe458aa689a29ceea3208f55856420dbfd80ed777fd01103581cff9d7d76b7-0xc22bfd40f81c4a28c809f80d05070b95a11829d9-0x726ab53c8da3efed40a32fe6ab5daa65b9da7ede-0.0", + "payer": { + "id": "0x726ab53c8da3efed40a32fe6ab5daa65b9da7ede" + }, + "reuses": [] + } + ] + } + } +} +``` +{% endcode %} + +
diff --git a/building-with-ocean/using-ocean-subgraph/get-datatoken-information.md b/developers/subgraph/get-datatoken-information.md similarity index 64% rename from building-with-ocean/using-ocean-subgraph/get-datatoken-information.md rename to developers/subgraph/get-datatoken-information.md index 3fb61ec1..9b3846c9 100644 --- a/building-with-ocean/using-ocean-subgraph/get-datatoken-information.md +++ b/developers/subgraph/get-datatoken-information.md @@ -1,16 +1,30 @@ +--- +description: >- + Explore the Power of Querying: Unveiling In-Depth Details of Individual + Datatokens +--- + # Get datatoken information -The result of following GraphQL query returns the information about a particular datatoken. Here, `0x122d10d543bc600967b4db0f45f80cb1ddee43eb` is the address of the datatoken. +To fetch detailed information about a specific datatoken, you can utilize the power of GraphQL queries. By constructing a query tailored to your needs, you can access key parameters such as the datatoken's ID, name, symbol, total supply, creator, and associated dataTokenAddress. This allows you to gain a deeper understanding of the datatoken's characteristics and properties. With this information at your disposal, you can make informed decisions, analyze market trends, and explore the vast potential of datatokens within the Ocean ecosystem. Harness the capabilities of GraphQL and unlock a wealth of datatoken insights. -{% hint style="info" %} -Copy the query in the [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) to fetch the results from the mainnet. For other networks use [this table](./#ocean-subgraph-graphiql). -{% endhint %} -#### Query -```graphql -{ - token(id:"0x122d10d543bc600967b4db0f45f80cb1ddee43eb", subgraphError: deny){ +The result of the following GraphQL query returns the information about a particular datatoken. Here, `0x122d10d543bc600967b4db0f45f80cb1ddee43eb` is the address of the datatoken. + +_PS: In this example, the query is executed on the Ocean subgraph deployed on the mainnet. If you want to change the network, please refer to_ [_this table_](README.md#ocean-subgraph-deployments)_._ + +{% tabs %} +{% tab title="Javascript" %} +The javascript below can be used to run the query and fetch the information of a datatoken. If you wish to change the network, replace the variable's value `network` as needed. Change the value of the variable `datatokenAddress` with the address of your choice. + +```runkit nodeVersion="18.x.x" +var axios = require('axios'); + +const datatokenAddress = "0x122d10d543bc600967b4db0f45f80cb1ddee43eb"; + +const query = `{ + token(id:"${datatokenAddress}", subgraphError: deny){ id symbol nft { @@ -43,14 +57,30 @@ Copy the query in the [GraphiQL interface](https://v4.subgraph.mainnet.oceanprot price active } -} +}` + +const network = "mainnet" +var config = { + method: 'post', + url: `https://v4.subgraph.${network}.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph`, + headers: { "Content-Type": "application/json" }, + data: JSON.stringify({ "query": query }) +}; + +axios(config) + .then(function (response) { + let result = JSON.stringify(response.data) + console.log(result); + }) + .catch(function (error) { + console.log(error); + }); + ``` +{% endtab %} -#### Code - -{% tabs %} {% tab title="Python" %} -The python script below can be used to run the the query. If you wish to change the network, then replace the value of variable `base_url` as needed. 
Change the value of the variable `datatoken_address` with the address of the datatoken of your choice.
+The Python script below can be used to run the query and fetch a datatoken's information. If you wish to change the network, replace the value of the `base_url` variable as needed. Change the value of the variable `datatoken_address` with the address of the datatoken of your choice.

**Create script**

@@ -115,22 +145,16 @@ print(json.dumps(result, indent=4, sort_keys=True))

**Execute script**

-```bash
-python datatoken_information.py
-```
+```
+python datatoken_information.py
+```
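+The response mirrors the shape of the query. As a small sketch (assuming the `result` dictionary produced by the script above), you can pull individual fields out directly:
+
+```python
+# Hypothetical post-processing: a few of the fields requested in the query
+token = result["data"]["token"]
+print(f"Datatoken {token['symbol']} ({token['id']}) belongs to data NFT {token['nft']['address']}")
+```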
{% endtab %} -{% tab title="Javascript" %} -The javascript below can be used to run the the query. If you wish to change the network, then replace the value of variable `baseUrl` as needed. Change the value of the variable `datatokenAddress` with the address of the datatoken of your choice. +{% tab title="Query" %} +Copy the query to fetch the information of a datatoken in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql). -**Create script** - -{% code title="datatokenInfo.js" %} -```javascript -var axios = require('axios'); - -const datatokenAddress = "0x122d10d543bc600967b4db0f45f80cb1ddee43eb"; - -const query = `{ - token(id:"${datatokenAddress}", subgraphError: deny){ +``` +{ + token(id:"0x122d10d543bc600967b4db0f45f80cb1ddee43eb", subgraphError: deny){ id symbol nft { @@ -163,41 +187,11 @@ const query = `{ price active } -}` - -const baseUrl = "https://v4.subgraph.mainnet.oceanprotocol.com" -const route = "/subgraphs/name/oceanprotocol/ocean-subgraph" - -const url = `${baseUrl}${route}` - -var config = { - method: 'post', - url: url, - headers: { "Content-Type": "application/json" }, - data: JSON.stringify({ "query": query }) -}; - -axios(config) - .then(function (response) { - console.log(JSON.stringify(response.data)); - }) - .catch(function (error) { - console.log(error); - }); - -``` -{% endcode %} - -**Execute script** - -```bash -node datatokenInfo.js +} ``` {% endtab %} {% endtabs %} -#### Response -
Sample response
diff --git a/developers/subgraph/get-veocean-stats.md b/developers/subgraph/get-veocean-stats.md
new file mode 100644
index 00000000..cdc6650b
--- /dev/null
+++ b/developers/subgraph/get-veocean-stats.md
@@ -0,0 +1,725 @@
+---
+description: 'Discover the World of veOCEAN: Retrieving the Stats'
+---
+
+# Get veOCEAN stats
+
+If you are already familiar with veOCEAN, you're off to a great start. However, if you need a refresher, we recommend visiting the [veOCEAN](../../rewards/veocean.md) page for a quick overview :mag:
+
+On this page, you'll find a few examples of how to fetch veOCEAN stats from the Ocean Subgraph. These examples serve as a valuable starting point to help you retrieve essential information about veOCEAN. However, if you're eager to delve deeper into the topic, we invite you to visit our [GitHub](https://github.com/oceanprotocol/ocean-subgraph/blob/main/test/integration/VeOcean.test.ts) repository. There, you'll discover a wealth of additional examples, which provide comprehensive insights. Feel free to explore and expand your knowledge! :books:
+
+{% hint style="info" %}
+The veOCEAN contract is deployed on the Ethereum mainnet, along with two test networks, namely Mumbai and Goerli. The statistical data available is limited to these networks.
+{% endhint %}
+
+### Get the total amount of locked Ocean tokens
+
+{% tabs %}
+{% tab title="JavaScript" %}
+You can utilize the following JavaScript code snippet to execute the query and retrieve the total amount of locked Ocean tokens:
+
+```runkit nodeVersion="18.x.x"
+var axios = require('axios');
+
+const query = `query{
+  globalStatistics{
+    totalOceanLocked
+  }
+}`
+
+var config = {
+  method: 'post',
+  url: `https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph`,
+  headers: { "Content-Type": "application/json" },
+  data: JSON.stringify({ "query": query })
+};
+
+axios(config)
+  .then(function (response) {
+    console.log(response.data.data.globalStatistics)
+  })
+  .catch(function (error) {
+    console.log(error);
+  });
+
+```
+{% endtab %}
+
+{% tab title="Python" %}
+You can employ the following Python script to execute the query and retrieve the total amount of locked Ocean tokens from the subgraph:
+
+**Create script**
+
+{% code title="get_ocean_locked.py" %}
+```python
+import requests
+import json
+
+query = """
+{
+  globalStatistics {
+    totalOceanLocked
+  }
+}"""
+
+base_url = "https://v4.subgraph.mainnet.oceanprotocol.com"
+route = "/subgraphs/name/oceanprotocol/ocean-subgraph"
+
+url = base_url + route
+
+headers = {"Content-Type": "application/json"}
+payload = json.dumps({"query": query})
+response = requests.request("POST", url, headers=headers, data=payload)
+result = response.json()
+
+print(json.dumps(result, indent=4, sort_keys=True))
+```
+{% endcode %}
+
+**Execute script**
+
+```
+python get_ocean_locked.py
+```
+{% endtab %}
+
+{% tab title="Query" %}
+To fetch the total amount of Ocean locked in the Ocean Subgraph [GraphiQL](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) interface, you can use the following query:
+
+```graphql
+query {
+  globalStatistics {
+    totalOceanLocked
+  }
+}
+```
+{% endtab %}
+{% endtabs %}
+
+ +Sample response + +```json +{ + "data": { + "globalStatistics": [ + { + "totalOceanLocked": "38490790.606836146522318627" + } + ] + } +} +``` + +
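+Note that the subgraph returns `totalOceanLocked` as a decimal string, as the sample response shows. A minimal Python sketch (assuming the `result` dictionary from the script above) to work with it numerically:
+
+```python
+# totalOceanLocked arrives as a string; convert it before doing any math
+total_locked = float(result["data"]["globalStatistics"][0]["totalOceanLocked"])
+print(f"Total OCEAN locked: {total_locked:,.2f}")
+```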
+ +### Get the veOCEAN holders list + +{% tabs %} +{% tab title="JavaScript" %} +You can utilize the following JavaScript code snippet to execute the query and fetch the list of veOCEAN holders. + +```runkit nodeVersion="18.x.x" +var axios = require('axios'); + +const query = `query { + veOCEANs { + id, + lockedAmount + unlockTime + } +}` + +var config = { + method: 'post', + url: `https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph`, + headers: { "Content-Type": "application/json" }, + data: JSON.stringify({ "query": query }) +}; + +axios(config) + .then(function (response) { + for (let veHolder of response.data.data.veOCEANs) { + console.log(veHolder) + } + }) + .catch(function (error) { + console.log(error); + }); + +``` +{% endtab %} + +{% tab title="Python" %} +You can employ the following Python script to execute the query and fetch the list of veOCEAN holders from the subgraph. + +{% code title="get_veOcean_holders.py" %} +```python +import requests +import json + +query = """ +{ + veOCEANs { + id, + lockedAmount + unlockTime + } +}""" + +base_url = "https://v4.subgraph.mainnet.oceanprotocol.com" +route = "/subgraphs/name/oceanprotocol/ocean-subgraph" + +url = base_url + route + +headers = {"Content-Type": "application/json"} +payload = json.dumps({"query": query}) +response = requests.request("POST", url, headers=headers, data=payload) +result = json.loads(response.text) + +print(json.dumps(result, indent=4, sort_keys=True)) +``` +{% endcode %} + +**Execute script** + +```bash +python get_veOcean_holders.py +``` +{% endtab %} + +{% tab title="Query" %} +To fetch the list of veOCEAN holders in the Ocean Subgraph [GraphiQL](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) interface, you can use the following query: + +```graphql +query { + veOCEANs { + id, + lockedAmount + unlockTime + } +} +``` +{% endtab %} +{% endtabs %} + +
+ +Sample response + +{% code overflow="wrap" %} +```json +{ + "data": { + "veOCEANs": [ + { + "id": "0x000afce0e19523ca2566b142bd12968fe1e44fe8", + "lockedAmount": "1011", + "unlockTime": "1727913600" + }, + { + "id": "0x001b71fad769b3cd47fd4c9849c704fdfabf6096", + "lockedAmount": "8980", + "unlockTime": "1790208000" + }, + { + "id": "0x002570980aa53893c6981765698b6ebab8ae7ea1", + "lockedAmount": "126140", + "unlockTime": "1790208000" + }, + { + "id": "0x006d0f31a00e1f9c017ab039e9d0ba699433a28c", + "lockedAmount": "75059", + "unlockTime": "1812585600" + }, + { + "id": "0x006d559fc29090589d02fb71d4142aa58b030013", + "lockedAmount": "100", + "unlockTime": "1793232000" + }, + { + "id": "0x008ed443f31a4b3aee02fbfe61c7572ddaf3a679", + "lockedAmount": "1100", + "unlockTime": "1795651200" + }, + { + "id": "0x009ec7d76febecabd5c73cb13f6d0fb83e45d450", + "lockedAmount": "11200", + "unlockTime": "1790812800" + }, + { + "id": "0x01d5595949fdbe521fbc39eaf09192dffb3bfc17", + "lockedAmount": "28576", + "unlockTime": "1675900800" + }, + { + "id": "0x02535d7bab47a83d33623c9a4ca854a1b1192121", + "lockedAmount": "0", + "unlockTime": "0" + }, + { + "id": "0x02a6ab92964309e0d8a739e0252b3acfd3a58972", + "lockedAmount": "1178", + "unlockTime": "1712188800" + }, + { + "id": "0x02aa319b5ce28294b7207bdce3bbcf4bf514c05b", + "lockedAmount": "300", + "unlockTime": "1736985600" + }, + { + "id": "0x02ae6dfaffc2c1f410fcad1f36885f6cc8b677d5", + "lockedAmount": "1009", + "unlockTime": "1730937600" + }, + { + "id": "0x034e1f7a66b582b68e511b325ed0ccb71bb4bc12", + "lockedAmount": "15919", + "unlockTime": "1727913600" + }, + { + "id": "0x035a209abf018e4f94173fdeabe5abe69f1efbed", + "lockedAmount": "1907", + "unlockTime": "1714003200" + }, + { + "id": "0x03d4682823c33995184a6a85a97f4ca1715c9d5c", + "lockedAmount": "0", + "unlockTime": "0" + }, + { + "id": "0x04aa87fa73238b563417d17ca7e57fd91ccd521e", + "lockedAmount": "9435", + "unlockTime": "1801699200" + }, + { + "id": "0x04c697561092c9cc56be6ff5b8e2789b0ca5837c", + "lockedAmount": "226", + "unlockTime": "1681948800" + }, + { + "id": "0x051f12380b842104391a0f9c55b32f6636cc7a0f", + "lockedAmount": "24900", + "unlockTime": "1685577600" + }, + { + "id": "0x054e061f1e1c1d775a2e5f20304aab83af7dab63", + "lockedAmount": "5000", + "unlockTime": "1701907200" + }, + { + "id": "0x054efb6d55466ba2ffb4133f39ae67985a314bed", + "lockedAmount": "33083", + "unlockTime": "1697068800" + }, + { + "id": "0x05a79e69c0dcb9335cbfa5b579635cbbd60f70ba", + "lockedAmount": "15837", + "unlockTime": "1728518400" + }, + { + "id": "0x05b2716d750f50c4fcd2110c5bff3f74bf0910e6", + "lockedAmount": "744", + "unlockTime": "1796256000" + }, + { + "id": "0x05b93ddd5a0ecfbdda3ccccd11882820f9cf7454", + "lockedAmount": "0", + "unlockTime": "0" + }, + { + "id": "0x05c01104bd6c4c099fe4d13b0faf0a8c94f11082", + "lockedAmount": "106026", + "unlockTime": "1723680000" + }, + { + "id": "0x06a2006ca85813e652506b865e590f44eae3928a", + "lockedAmount": "3100", + "unlockTime": "1727308800" + }, + { + "id": "0x0705adac1869aa2648ddcf00da24b0ab6b76ede1", + "lockedAmount": "0", + "unlockTime": "0" + }, + { + "id": "0x07dee7fb11086d543ed943bf075ad6ac2007aada", + "lockedAmount": "34", + "unlockTime": "1665014400" + }, + { + "id": "0x0848db7cb495e7b9ada1d4dc972b9a526d014d84", + "lockedAmount": "0", + "unlockTime": "0" + }, + { + "id": "0x0861fcabe37a5ce396a8d85cd816e0cc6b4633ff", + "lockedAmount": "500", + "unlockTime": "1738800000" + }, + { + "id": "0x08c26d09393dc0adc7349c0c8d1bdae63555c312", + "lockedAmount": "0", + "unlockTime": 
"0" + }, + { + "id": "0x0a8162d91d6bf4530950e539068c75f7ddf972bc", + "lockedAmount": "534", + "unlockTime": "1791417600" + }, + { + "id": "0x0abe9b7740686cbf24b9f206e7d4e8ec25519476", + "lockedAmount": "230", + "unlockTime": "1690416000" + }, + { + "id": "0x0aef715335d0a19b870ca20fb540e16a6e606fbd", + "lockedAmount": "210", + "unlockTime": "1696464000" + }, + { + "id": "0x0b5665d637f45d6fff6c4afd4ea4191904ef38bb", + "lockedAmount": "10000", + "unlockTime": "1710979200" + }, + { + "id": "0x0bc1e0d21e3806056eeca20b69dd3f33bb49d0c7", + "lockedAmount": "690", + "unlockTime": "1738195200" + }, + { + "id": "0x0bc9cd548cc04bfcf8ef2fca50c13b9b4f62f6d4", + "lockedAmount": "1250", + "unlockTime": "1796256000" + }, + { + "id": "0x0bdf0d54e6f64da97728051e702fa0b9f61d2375", + "lockedAmount": "1024", + "unlockTime": "1701302400" + }, + { + "id": "0x0be1b7f1a2eacde1cf5b48a4a1034c70dac06a70", + "lockedAmount": "19982", + "unlockTime": "1800489600" + }, + { + "id": "0x0c16b6d59a9d242f9cf6ca1999e372dd89a098a2", + "lockedAmount": "1000", + "unlockTime": "1723075200" + }, + { + "id": "0x0c21d79f460f7cacf3fd35172151bdbc5d61d9c1", + "lockedAmount": "10", + "unlockTime": "1676505600" + }, + { + "id": "0x0c4f299cce0e56004a6e3a30f43146a205bd2b9d", + "lockedAmount": "250", + "unlockTime": "1690416000" + }, + { + "id": "0x0c59aeeb4f82bbb7e38958900df5bf499c3e9e4f", + "lockedAmount": "0", + "unlockTime": "0" + }, + { + "id": "0x0c6415489a8cc61ca7d32a29f7cdc1e980af16f1", + "lockedAmount": "3788", + "unlockTime": "1725494400" + }, + { + "id": "0x0ca0c241a45a9e8abad30a632df1a9a09a4eb692", + "lockedAmount": "24987", + "unlockTime": "1729123200" + }, + { + "id": "0x0cf776d57e0223f47ed3a101927bb78d41ad8a13", + "lockedAmount": "16967", + "unlockTime": "1790208000" + }, + { + "id": "0x0d04e73d950ff53e586da588c43bb3ac5ae53872", + "lockedAmount": "19517", + "unlockTime": "1703721600" + }, + { + "id": "0x0daefc5251f8f7f5a5dc987e8a6c96d9deb84559", + "lockedAmount": "3000", + "unlockTime": "1727308800" + }, + { + "id": "0x0e0bab764f38d63abf08680a50b33718c98b90e6", + "lockedAmount": "13782", + "unlockTime": "1797465600" + }, + { + "id": "0x0ed8063fcc5b44f664333b59a12d187de6551088", + "lockedAmount": "265", + "unlockTime": "1804118400" + }, + { + "id": "0x0ed8486119b992258a3754decaa36bf8bed543e8", + "lockedAmount": "25881", + "unlockTime": "1697068800" + }, + { + "id": "0x0efbdc4e858cbb269545d48f7b30ab260a3e5d10", + "lockedAmount": "0", + "unlockTime": "0" + }, + { + "id": "0x0f1107f97af6ae6eb37a9d35060aaa21cdaa109f", + "lockedAmount": "15000", + "unlockTime": "1790812800" + }, + { + "id": "0x0f84452c0dcda0c9980a0a802eb8b8dbaaf52c54", + "lockedAmount": "25", + "unlockTime": "1687392000" + }, + { + "id": "0x1019b7e639234c589c34385955adfbe0af8d8453", + "lockedAmount": "2121", + "unlockTime": "1706140800" + }, + { + "id": "0x104e9bce2d1a6fb449c14272f0157422a00adaa5", + "lockedAmount": "7300", + "unlockTime": "1744243200" + }, + { + "id": "0x111849a4943891b071f7cdb1babebcb74415204a", + "lockedAmount": "0", + "unlockTime": "0" + }, + { + "id": "0x11300251b903ba70f51262f3e49aa7c22f81e1b2", + "lockedAmount": "1504", + "unlockTime": "1794441600" + }, + { + "id": "0x119b6e8c6b258b2b93443e949ef5066a85d75e44", + "lockedAmount": "30000", + "unlockTime": "1748476800" + }, + { + "id": "0x11e43d79e4193dfc1247697cb0ae15b17d27fc5b", + "lockedAmount": "0", + "unlockTime": "0" + }, + { + "id": "0x1215fed867ad6eb5f078fc8b477a1a32eb59d75d", + "lockedAmount": "18752", + "unlockTime": "1730332800" + }, + { + "id": 
"0x126bc064dbd1d0205fc608c3178a60c9706b482c", + "lockedAmount": "0", + "unlockTime": "0" + }, + { + "id": "0x1280cfea89a214b490c202fa22688813df8d8c04", + "lockedAmount": "26000", + "unlockTime": "1727913600" + }, + { + "id": "0x13203b4fef73f05b3db709c41c96179b37bf01eb", + "lockedAmount": "293", + "unlockTime": "1738195200" + }, + { + "id": "0x1479a4884dee82dc8471e0006102f9d400445332", + "lockedAmount": "13009", + "unlockTime": "1698883200" + }, + { + "id": "0x149756907221491eca8c5816a6b5d6b60fcd7d60", + "lockedAmount": "4985", + "unlockTime": "1701907200" + }, + { + "id": "0x153785d85dffe5b92083e30003aa58f18344d032", + "lockedAmount": "50", + "unlockTime": "1802304000" + }, + { + "id": "0x15558eb2aeb93ed561515a47441bf49250933ba9", + "lockedAmount": "500000", + "unlockTime": "1804118400" + }, + { + "id": "0x15a919e499d88a71e94d34ab76986799f69b4ff2", + "lockedAmount": "4940", + "unlockTime": "1733961600" + }, + { + "id": "0x15abf18f424cd2755e9d680eeeaa02bc00c1f00e", + "lockedAmount": "0", + "unlockTime": "0" + }, + { + "id": "0x15f311af257d6e8520ebf29eae5ba76c4dd45c6a", + "lockedAmount": "1420", + "unlockTime": "1796860800" + }, + { + "id": "0x1609665376e39e9d9cdfdc75e44f80bb899e9d21", + "lockedAmount": "8016", + "unlockTime": "1699488000" + }, + { + "id": "0x1694ab8e597e90fcb4cd637bafa3e553fc1d0083", + "lockedAmount": "364", + "unlockTime": "1734566400" + }, + { + "id": "0x175437b00da09f18d89571b95a41a15aa8415eba", + "lockedAmount": "88050", + "unlockTime": "1798675200" + }, + { + "id": "0x1758bc68a87abfede6a213666d15c028f2708b2b", + "lockedAmount": "1494", + "unlockTime": "1731542400" + }, + { + "id": "0x1789bf2df0fffa3ab5d235b41ecb72f48294d955", + "lockedAmount": "920", + "unlockTime": "1701302400" + }, + { + "id": "0x1843c3d1dd3e2564fada8ea50bb73819c6b53047", + "lockedAmount": "3354", + "unlockTime": "1793836800" + }, + { + "id": "0x184f19323defce76af86bb5a63aa976cd9f256d7", + "lockedAmount": "0", + "unlockTime": "0" + }, + { + "id": "0x18559e7f5d87f5c607a34ed45453d62832804c97", + "lockedAmount": "3275", + "unlockTime": "1687996800" + }, + { + "id": "0x1891c8d948bc041b5e7c1a35185cc593a33b4a6c", + "lockedAmount": "7436", + "unlockTime": "1790208000" + }, + { + "id": "0x1a0d80e1bd429127bc9a4acee880426b818764ee", + "lockedAmount": "420", + "unlockTime": "1807747200" + }, + { + "id": "0x1a2409444f2f349c2e539eb013eed985b9d54e2f", + "lockedAmount": "500", + "unlockTime": "1687996800" + }, + { + "id": "0x1a9a6198c28d4dd5b9ab58e84677520ec741cb29", + "lockedAmount": "2565", + "unlockTime": "1683158400" + }, + { + "id": "0x1ab21891e9230e4a8c3e09d88e3c0b48d54f1a86", + "lockedAmount": "980", + "unlockTime": "1734566400" + }, + { + "id": "0x1bafc574581ea4b938dcfe0d0d93778303cb3fb7", + "lockedAmount": "0", + "unlockTime": "0" + }, + { + "id": "0x1c175ce4f8f3e8a16df7165f15057a82a88c025c", + "lockedAmount": "953", + "unlockTime": "1692230400" + }, + { + "id": "0x1c7b100cc8a2966d35ac6cc0ccaf4d5cba463b94", + "lockedAmount": "0", + "unlockTime": "0" + }, + { + "id": "0x1cd1b778cdc329292d196e490b65b7950bee1c97", + "lockedAmount": "301", + "unlockTime": "1700092800" + }, + { + "id": "0x1d11c308464f09228f7c81daa253ff9f415ea4f7", + "lockedAmount": "21908", + "unlockTime": "1697068800" + }, + { + "id": "0x1d3c2dc18ca3da0406cfb3634faab589c769215b", + "lockedAmount": "625", + "unlockTime": "1689811200" + }, + { + "id": "0x1dc865705a03d63953e7df83caefc8928e555b6c", + "lockedAmount": "5245", + "unlockTime": "1812585600" + }, + { + "id": "0x1ddb98275a09552b5be11e8e3118684ed6a809fc", + "lockedAmount": "10000", + 
"unlockTime": "1725494400" + }, + { + "id": "0x1e180d121eff6cd1b376af9318d4128093c46032", + "lockedAmount": "0", + "unlockTime": "0" + }, + { + "id": "0x1e2394b6b88f9329127d98347f6e696e4af33e13", + "lockedAmount": "0", + "unlockTime": "0" + }, + { + "id": "0x1e38e305126bfe9b6329f5fdce28d72fdf9d5647", + "lockedAmount": "183844", + "unlockTime": "1801699200" + }, + { + "id": "0x1f130be1f04e159ef98c54f677b9b980b012417b", + "lockedAmount": "10663", + "unlockTime": "1745452800" + }, + { + "id": "0x1f3bcd409b2b2d88259aca77115e858ea3c65e9c", + "lockedAmount": "2000", + "unlockTime": "1732147200" + }, + { + "id": "0x1fac06467b7d9c3a9361f42ab7bd09e6a5719ec7", + "lockedAmount": "81285", + "unlockTime": "1802908800" + }, + { + "id": "0x1fba4f4446859ab451cb7f3b8fbce9bcdc97fdb9", + "lockedAmount": "560", + "unlockTime": "1689206400" + }, + { + "id": "0x200fa3e7e3fbfeb15b76e53f2810faec71a5336d", + "lockedAmount": "2375", + "unlockTime": "1805932800" + }, + { + "id": "0x2017ade0a289de891ca7e733513b264cfec2c8ce", + "lockedAmount": "9119", + "unlockTime": "1703721600" + } + ] + } +} +``` +{% endcode %} + +
+
+
diff --git a/building-with-ocean/using-ocean-subgraph/list-data-nfts.md b/developers/subgraph/list-data-nfts.md
similarity index 68%
rename from building-with-ocean/using-ocean-subgraph/list-data-nfts.md
rename to developers/subgraph/list-data-nfts.md
index 69e31081..c947f590 100644
--- a/building-with-ocean/using-ocean-subgraph/list-data-nfts.md
+++ b/developers/subgraph/list-data-nfts.md
@@ -1,36 +1,64 @@
-# List data NFTs
+---
+description: 'Discover the World of NFTs: Retrieving a List of Data NFTs'
+---
-The result of following GraphQL query returns the information about data nfts.
+# Get data NFTs
-{% hint style="info" %}
-Copy the query in the [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) to fetch the results from the mainnet. For other networks use [this table](./#ocean-subgraph-graphiql).
-{% endhint %}
+If you are already familiar with the concept of NFTs, you're off to a great start. However, if you require a refresher, we recommend visiting the [data NFTs and datatokens page](../contracts/datanft-and-datatoken.md) for a quick overview.
-#### Query
+Now, let us delve into the realm of utilizing the subgraph to extract a list of data NFTs that have been published using the Ocean contracts. By employing GraphQL queries, we can seamlessly retrieve the desired information from the subgraph. You'll see how simple it is :sunglasses:
-```graphql
-{
-  nfts (skip:0, first: 10, subgraphError:deny){
-    id
-    name
-    symbol
-    owner
-    address
-    assetState
-    tx
-    block
-    transferable
-  }
-}
-```
+You'll find below an example of a GraphQL query that retrieves the first 10 data NFTs from the subgraph. The query accesses the "nfts" route, takes the first 10 elements, and for each item returns the `id`, `name`, `symbol`, `owner`, `address`, `assetState`, `tx`, `block` and `transferable` parameters.
-#### Code snippets
+There are several options available to see this query in action. Below, you will find three:
+
+1. Run the GraphQL query in the GraphiQL interface.
+2. Execute the query in Python by following the code snippet.
+3. Execute the query in JavaScript by clicking on the "Run" button of the Javascript tab.
+
+_PS: In these examples, the query is executed on the Ocean subgraph deployed on the mainnet. If you want to change the network, please refer to_ [_this table_](README.md#ocean-subgraph-deployments)_._
 {% tabs %}
-{% tab title="Python" %}
-The python script below can be used to run the the query. If you wish to change the network, then replace the value of variable `base_url` as needed.
+{% tab title="Javascript" %}
+The JavaScript below can be used to run the query and retrieve a list of NFTs. If you wish to change the network, then replace the value of the `network` variable as needed.
-#### Create script +```runkit nodeVersion="18.x.x" +const axios = require('axios') + +const query = `{ + nfts (skip:0, first: 10, subgraphError:deny){ + id + name + symbol + owner + address + assetState + tx + block + transferable + } +}` + +const network = "mainnet" +const config = { + method: 'post', + url: `https://v4.subgraph.${network}.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph`, + headers: { 'Content-Type': 'application/json' }, + data: JSON.stringify({ query: query }) +} + +const response = await axios(config) +for (let nft of response.data.data.nfts) { + console.log(' id:' + nft.id + ' name: ' + nft.name + ' address: ' + nft.address) +} + +``` +{% endtab %} + +{% tab title="Python" %} +The Python script below can be used to run the query to fetch a list of data NFTs from the subgraph. If you wish to change the network, replace the value of the variable `base_url` as needed. + +**Create script** {% code title="list_dataNFTs.py" %} ```python @@ -75,61 +103,27 @@ python list_dataNFTs.py ``` {% endtab %} -{% tab title="Javascript" %} -The javascript below can be used to run the the query. If you wish to change the network, then replace the value of variable `baseUrl` as needed. +{% tab title="Query" %} +Copy the query to fetch a list of data NFTs in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql). -#### Create script - -{% code title="listDatatoken.js" %} -```javascript -var axios = require('axios'); - -const query = `{ - nfts (skip:0, first: 10, subgraphError:deny){ - id - name - symbol - owner - address - assetState - tx - block - transferable - } -}` - -const baseUrl = "https://v4.subgraph.mainnet.oceanprotocol.com" -const route = "/subgraphs/name/oceanprotocol/ocean-subgraph" - -const url = `${baseUrl}${route}` - -var config = { - method: 'post', - url: url, - headers: { "Content-Type": "application/json" }, - data: JSON.stringify({ "query": query }) -}; - -axios(config) - .then(function (response) { - console.log(JSON.stringify(response.data)); - }) - .catch(function (error) { - console.log(error); - }); -``` -{% endcode %} - -#### Execute script - -```bash -node listDatatoken.js +```graphql +{ + nfts (skip:0, first: 10, subgraphError:deny){ + id + name + symbol + owner + address + assetState + tx + block + transferable + } +} ``` {% endtab %} {% endtabs %} -#### Response -
Sample response diff --git a/building-with-ocean/using-ocean-subgraph/list-datatokens.md b/developers/subgraph/list-datatokens.md similarity index 66% rename from building-with-ocean/using-ocean-subgraph/list-datatokens.md rename to developers/subgraph/list-datatokens.md index 36475b98..a93fefeb 100644 --- a/building-with-ocean/using-ocean-subgraph/list-datatokens.md +++ b/developers/subgraph/list-datatokens.md @@ -1,51 +1,75 @@ -# List all Tokens +--- +description: 'Discover the World of datatokens: Retrieving a List of datatokens' +--- -The result of following GraphQL query returns the information about datatokens. +# Get datatokens -{% hint style="info" %} -Copy the query in the [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) to fetch the results from the mainnet. For other networks use [this table](./#ocean-subgraph-graphiql). -{% endhint %} +With your newfound knowledge of fetching data NFTs and retrieving the associated information, fetching a list of datatokens will be a breeze :ocean:. Building upon your understanding, let's now delve into the process of retrieving a list of datatokens. By applying similar techniques and leveraging the power of GraphQL queries, you'll be able to effortlessly navigate the landscape of datatokens and access the wealth of information they hold. So, let's dive right in and unlock the potential of exploring datatokens with ease and efficiency. -#### Query -```graphql -{ - tokens(skip:0, first: 2, subgraphError: deny){ - id - symbol - nft { - name - symbol - address - } - name - symbol - cap - isDatatoken - holderCount - orderCount - orders(skip:0,first:1){ - amount - serviceIndex - payer { - id - } - consumer{ - id - } - estimatedUSDValue - lastPriceToken - lastPriceValue - } - } -} -``` -#### Code +_PS: In this example, the query is executed on the Ocean subgraph deployed on the mainnet. If you want to change the network, please refer to_ [_this table_](README.md#ocean-subgraph-deployments)_._ {% tabs %} +{% tab title="Javascript" %} +The javascript below can be used to run the query. If you wish to change the network, replace the variable's value `network` as needed. + +```runkit nodeVersion="18.x.x" +var axios = require('axios'); + +const query = `{ + tokens(skip:0, first: 2, subgraphError: deny){ + id + symbol + nft { + name + symbol + address + } + name + symbol + cap + isDatatoken + holderCount + orderCount + orders(skip:0,first:1){ + amount + serviceIndex + payer { + id + } + consumer{ + id + } + estimatedUSDValue + lastPriceToken + lastPriceValue + } + } +}` + +const network = "mainnet" +var config = { + method: 'post', + url: `https://v4.subgraph.${network}.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph`, + headers: { "Content-Type": "application/json" }, + data: JSON.stringify({ "query": query }) +}; + +axios(config) + .then(function (response) { + let result = JSON.stringify(response.data) + console.log(result); + }) + .catch(function (error) { + console.log(error); + }); + +``` +{% endtab %} + {% tab title="Python" %} -The python script below can be used to run the the query. If you wish to change the network, then replace the value of variable `base_url` as needed. +The Python script below can be used to run the query and fetch a list of datatokens. If you wish to change the network, then replace the value of the variable `base_url` as needed. 
**Create script** @@ -109,78 +133,44 @@ python list_all_tokens.py ``` {% endtab %} -{% tab title="Javascript" %} -The javascript below can be used to run the the query. If you wish to change the network, then replace the value of variable `baseUrl` as needed. +{% tab title="Query" %} +Copy the query to fetch a list of datatokens in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql). -#### Create script - -{% code title="listAllTokens.js" %} -```javascript -var axios = require('axios'); - -const query = `{ - tokens(skip:0, first: 2, subgraphError: deny){ - id - symbol - nft { - name - symbol - address - } +```graphql +{ + tokens(skip:0, first: 2, subgraphError: deny){ + id + symbol + nft { name symbol - cap - isDatatoken - holderCount - orderCount - orders(skip:0,first:1){ - amount - serviceIndex - payer { - id - } - consumer{ - id - } - estimatedUSDValue - lastPriceToken - lastPriceValue - } + address } -}` - -const baseUrl = "https://v4.subgraph.mainnet.oceanprotocol.com" -const route = "/subgraphs/name/oceanprotocol/ocean-subgraph" - -const url = `${baseUrl}${route}` - -var config = { - method: 'post', - url: url, - headers: { "Content-Type": "application/json" }, - data: JSON.stringify({ "query": query }) -}; - -axios(config) - .then(function (response) { - console.log(JSON.stringify(response.data)); - }) - .catch(function (error) { - console.log(error); - }); -``` -{% endcode %} - -#### Execute script - -```bash -node listAllTokens.js + name + symbol + cap + isDatatoken + holderCount + orderCount + orders(skip:0,first:1){ + amount + serviceIndex + payer { + id + } + consumer{ + id + } + estimatedUSDValue + lastPriceToken + lastPriceValue + } + } +} ``` {% endtab %} {% endtabs %} -#### Response -
Sample Response
diff --git a/building-with-ocean/using-ocean-subgraph/list-fixed-rate-exchanges.md b/developers/subgraph/list-fixed-rate-exchanges.md
similarity index 71%
rename from building-with-ocean/using-ocean-subgraph/list-fixed-rate-exchanges.md
rename to developers/subgraph/list-fixed-rate-exchanges.md
index 25c1151f..93416c94 100644
--- a/building-with-ocean/using-ocean-subgraph/list-fixed-rate-exchanges.md
+++ b/developers/subgraph/list-fixed-rate-exchanges.md
@@ -1,15 +1,23 @@
-# List Fixed Rate Exchanges
+---
+description: 'Discover the World of Fixed-Rate Exchanges: Retrieving a List of Fixed-rate Exchanges'
+---
-The result of following GraphQL query returns the information about the Fixed Rate Exchanges.
+# Get fixed-rate exchanges
-{% hint style="info" %}
-Copy the query in the [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql) to fetch the results from the mainnet. For other networks use [this table](./#ocean-subgraph-graphiql).
-{% endhint %}
+Having gained knowledge about fetching lists of data NFTs and datatokens and extracting specific information about each, let's now explore the process of retrieving the information of fixed-rate exchanges. A fixed-rate exchange refers to a mechanism where data assets can be traded at a predetermined rate or price. These exchanges offer stability and predictability in data transactions, enabling users to securely and reliably exchange data assets based on fixed rates. If you need a refresher on fixed-rate exchanges, visit the [asset pricing](../contracts/pricing-schemas.md#fixed-pricing) page.
-#### Query
-```graphql
-{
+
+_PS: In this example, the query is executed on the Ocean subgraph deployed on the mainnet. If you want to change the network, please refer to_ [_this table_](README.md#ocean-subgraph-deployments)_._
+
+{% tabs %}
+{% tab title="Javascript" %}
+The JavaScript below can be used to run the query and fetch a list of fixed-rate exchanges. If you wish to change the network, replace the value of the `network` variable as needed.
+
+```runkit nodeVersion="18.x.x"
+var axios = require('axios');
+
+const query = `{
 fixedRateExchanges(skip:0, first:2, subgraphError:deny){
   id
   contract
@@ -41,14 +49,30 @@ Copy the query in the [GraphiQL interface](https://v4.subgraph.mainnet.oceanprot
     tx
   }
 }
-}
+}`
+
+const network = "mainnet"
+var config = {
+  method: 'post',
+  url: `https://v4.subgraph.${network}.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph`,
+  headers: { "Content-Type": "application/json" },
+  data: JSON.stringify({ "query": query })
+};
+
+axios(config)
+  .then(function (response) {
+    let result = JSON.stringify(response.data)
+    console.log(result)
+  })
+  .catch(function (error) {
+    console.log(error);
+  });
+
+```
+{% endtab %}
-#### Code
-
-{% tabs %}
 {% tab title="Python" %}
-The python script below can be used to run the the query. If you wish to change the network, then replace the value of variable `base_url` as needed.
+The Python script below can be used to run the query and retrieve a list of fixed-rate exchanges. If you wish to change the network, then replace the value of the variable `base_url` as needed.

**Create script**

@@ -115,16 +139,11 @@ python list_fixed_rate_exchanges.py
 ```
 {% endtab %}

-{% tab title="Javascript" %}
-The javascript below can be used to run the the query. If you wish to change the network, then replace the value of variable `baseUrl` as needed.
+{% tab title="Query" %} +Copy the query to fetch a list of fixed-rate exchanges in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql). -#### Create script - -{% code title="listFRE.js" %} -```javascript -var axios = require('axios'); - -const query = `{ +``` +{ fixedRateExchanges(skip:0, first:2, subgraphError:deny){ id contract @@ -156,40 +175,11 @@ const query = `{ tx } } -}` - -const baseUrl = "https://v4.subgraph.mainnet.oceanprotocol.com" -const route = "/subgraphs/name/oceanprotocol/ocean-subgraph" - -const url = `${baseUrl}${route}` - -var config = { - method: 'post', - url: url, - headers: { "Content-Type": "application/json" }, - data: JSON.stringify({ "query": query }) -}; - -axios(config) - .then(function (response) { - console.log(JSON.stringify(response.data)); - }) - .catch(function (error) { - console.log(error); - }); -``` -{% endcode %} - -#### Execute script - -```bash -node listFRE.js +} ``` {% endtab %} {% endtabs %} -#### Response -
Sample response diff --git a/discover/README.md b/discover/README.md new file mode 100644 index 00000000..1dfa1d39 --- /dev/null +++ b/discover/README.md @@ -0,0 +1,45 @@ +--- +description: Why Ocean Protocol? +cover: ../.gitbook/assets/cover/discover_banner.png +coverY: 7.413145539906106 +--- + +# 🌊 Discover + +{% embed url="https://youtu.be/4P72ZelkEpQ" %} + +Society is increasingly reliant on data as AI becomes more popular. However, a small handful of organizations possess and control massive amounts of our personal data, posing a threat to a free and open society ☢️ + +The concentration of vast datasets in the hands of a few organizations can lead to significant negative consequences for society. These include: + +* 📛 **Monopolistic Control**: When a small number of organizations control large amounts of data and AI tools, they gain a significant advantage over competitors. This can lead to monopolistic behavior, where these organizations dominate the market and stifle competition. As a result, innovation may suffer, prices can be inflated, and consumer choice becomes limited. +* 📛 **Single Point of Failure**: Concentrating data in the hands of a few organizations creates a single point of failure. If a breach or data leak occurs within one of these organizations, the impact can be far-reaching, affecting a significant portion of the population whose data is held by that organization. The potential scale of such breaches can be much larger than if data were distributed across multiple entities, making the consequences more severe. +* 📛 **Algorithmic Bias and Discrimination**: AI tools rely on data to make decisions and predictions. If the datasets used for training are biased or incomplete, AI systems can perpetuate and amplify existing biases and discrimination. The concentration of datasets in the hands of a few organizations can exacerbate this issue, as their AI models may reflect the biases present in their data, leading to unfair or discriminatory outcomes in various domains, such as hiring, lending, and criminal justice. +* 📛 **Lack of Transparency and Accountability**: The complex nature of AI algorithms and the concentration of power make it difficult to understand and scrutinize the decisions made by these systems. When only a few organizations control AI tools, it can limit transparency and accountability. This lack of visibility can lead to distrust in AI systems, as people are unable to understand how decisions are being made or to challenge them when they are unfair or erroneous. The desire to extract value from data can create a conflict between the need to protect individual privacy and the pursuit of business interests. +* 📛 **Lack of Privacy**: In today's digital age, traditional data-sharing methods often compromise privacy, raising concerns among individuals and organizations alike. With the increasing amount of personal and sensitive information being collected, stored, and shared, it has become essential to address it. +* 📛 **Limited Data Monetization**: Many data owners struggle to monetize their data assets effectively due to various factors. Firstly, they often lack data-driven business models that can translate their data into valuable insights and actionable opportunities. Additionally, data quality and trust play a critical role, as inaccurate or unreliable data hinders the development of data-driven products and erodes trust among potential buyers. 
Limited market access is another challenge, with the fragmented data economy and the absence of standardized platforms or marketplaces making it difficult for data owners to connect with the right audience. +* 📛 **Intermediaries and Inefficiencies**: In the current data economy, intermediaries play a significant role in facilitating data transactions between data owners and data consumers. However, the reliance on intermediaries introduces additional costs and complexities that can hinder efficient data monetization. + +If we don't enable personal **sovereignty** over our data, then we could be at the mercy of Big Tech's decisions again in the future of Web3. What's worse, we could even be excluded from developing AI innovations entirely. ☢️☠️ + +{% embed url="https://giphy.com/clips/spongebob-PPEmM68bHy2zEZUcs7" %} + +That's why we made the decision to take action, and it led to the creation of the **Ocean Protocol**. + +**Ocean Protocol aims to kick-start a new data economy by giving power back to people and giving researchers the data they need. Ocean Protocol technology enables data to become an asset.** + +By providing a decentralized and open platform and the tools for data sharing, **Ocean Protocol** aims to address these concerns and create a more equitable and collaborative data economy. + +We believe that data is a valuable resource that should be accessible to **everyone** :people\_holding\_hands:, not just a select few large corporations. We aim to empower data providers to monetize their data and enable data consumers to access high-quality data for their projects. Ultimately, we seek to establish a more equitable and collaborative data economy that benefits businesses, researchers, and communities worldwide. + +To find out more about the amazing team behind Ocean, you can visit our [website](https://oceanprotocol.com/about). + +Now that we've made you curious about our mission and how we're making a difference in the world, you won't want to miss this video featuring our co-founder, [Trent McConaghy](http://www.trent.st/). He'll share some fascinating insights into what we're doing and why it matters. + +{% embed url="https://youtu.be/XN_PHg1K61w" fullWidth="false" %} +A new data economy with power to the people - Trent McConaghy +{% endembed %} + +### Ocean Protocol Whitepaper + +If you'd like to explore the details of our technology, feel free to dive into our [whitepaper](https://oceanprotocol.com/tech-whitepaper.pdf)! It's a comprehensive resource that explains all the technical details and the core concepts that drive Ocean Protocol. It's a great way to get a deeper understanding of what we're all about. diff --git a/discover/basic-concepts.md b/discover/basic-concepts.md new file mode 100644 index 00000000..f2ce7fdf --- /dev/null +++ b/discover/basic-concepts.md @@ -0,0 +1,33 @@ +--- +description: Learn the Web3 concepts backing up Ocean Protocol tech +--- + +# Basic Concepts + +You'll need to know a thing or two about **Web3** to understand Ocean Protocol's tech... Let's get started with the basics 🧑‍🏫 + +

+_Figure: Prepare yourself, my friend_

+ +### Blockchain: The backbone of Ocean Protocol + +Blockchain is a revolutionary technology that enables the decentralized nature of Ocean Protocol. At its core, blockchain is a **distributed ledger** that securely **records and verifies transactions across a network of computers**. It operates on the following key concepts that ensure trust and immutability: + +* **Decentralization**: Blockchain eliminates the need for intermediaries by enabling a peer-to-peer network where transactions are validated collectively. This decentralized structure reduces reliance on centralized authorities, enhances transparency, and promotes a more inclusive data economy. +* **Immutability**: Once a transaction is recorded on the blockchain, it becomes virtually impossible to alter or tamper with. The data is stored in blocks, which are cryptographically linked together, forming an unchangeable chain of information. Immutability ensures the integrity and reliability of data, providing a foundation of trust in the Ocean Protocol ecosystem. Furthermore, it enables reliable traceability of historical transactions. +* **Consensus Mechanisms**: Blockchain networks employ consensus mechanisms to validate and agree upon the state of the ledger. These mechanisms ensure that all participants validate transactions without relying on a central authority, crucially maintaining a reliable view of the blockchain's history. The consensus mechanisms make it difficult for malicious actors to manipulate the blockchain's history or conduct fraudulent transactions. Popular consensus mechanisms include Proof of Work (PoW) and Proof of Stake (PoS). + +Ocean Protocol harnesses the power of blockchain to facilitate secure and auditable data exchange. This ensures that data transactions are transparent, verifiable, and tamper-proof. Here's how blockchain is utilized in the Ocean Protocol ecosystem: + +* **Data Asset Representation**: Data assets in Ocean Protocol are represented as non-fungible tokens (NFTs) on the blockchain. NFTs provide a unique identifier for each data asset, allowing for seamless tracking, ownership verification, and access control. Through NFTs and datatokens, data assets become easily tradable and interoperable within the Ocean ecosystem. +* **Smart Contracts**: Ocean Protocol utilizes smart contracts to automate and enforce the terms of data exchange. Smart contracts act as self-executing agreements that facilitate the transfer of data assets between parties based on predefined conditions - they are the exact mechanisms of decentralization. This enables cyber-secure data transactions and eliminates the need for intermediaries. +* **Tamper-Proof Audit Trail**: Every data transaction on Ocean Protocol is recorded on the blockchain, creating an immutable and tamper-proof audit trail. This ensures the transparency and traceability of data usage, providing data scientists with a verifiable record of the data transaction history. Data scientists can query addresses of data transfers on-chain to understand data usage. + +By integrating blockchain technology, Ocean Protocol establishes a trusted infrastructure for data exchange. It empowers individuals and organizations to securely share, monetize, and leverage data assets while maintaining control and privacy. + +### **OCEAN Tokens: Empowering Data Ownership and Monetization** + +Ocean tokens (**OCEAN**) are the native cryptocurrency of the Ocean Protocol ecosystem. 
They serve as the medium of exchange for data services, enabling data owners, consumers, and service providers to participate in the data economy. Here's how Ocean tokens are used within the ecosystem: + +1. **Data Ownership**: Ocean tokens empower data owners by providing them with control over their data assets. Through the use of smart contracts, data owners can define access permissions, usage rights, and pricing terms for their data. By holding and staking Ocean tokens, data owners can exercise even greater control over their data assets. +2. **Data Monetization and Consumption**: Ocean tokens facilitate seamless and secure transactions between data providers and consumers, fostering a thriving new data economy. Data owners can set a price in Ocean tokens for consumers to access and utilize their data. This creates opportunities for unlocking value from siloed or otherwise unused data. +3. **Stake for veOcean and Curate Datasets**: Through the Data Farming initiative, you are incentivized to lock Ocean tokens for [veOCEAN](../rewards/veocean.md). By staking your OCEAN, you not only support the growth and sustainability of the ecosystem but also earn a share of data asset sales 💰. The Data Farming initiative offers participants a unique opportunity to earn [rewards](../rewards/README.md) while making a meaningful impact in the data marketplace. diff --git a/discover/explore.md b/discover/explore.md new file mode 100644 index 00000000..1856cc34 --- /dev/null +++ b/discover/explore.md @@ -0,0 +1,150 @@ +--- +description: What is Ocean Protocol? +--- + +# Explore + +Ocean Protocol is a **decentralized data exchange protocol** that aims to unlock the value of data and enable secure and privacy-preserving on-chain data sharing and monetization ⛓️ It provides a framework for building data-centric applications that facilitate the exchange of data assets while ensuring data privacy, security, and compliance. + +Ocean Protocol is used for a variety of purposes, including: + +1. **Data Sharing**: Ocean Protocol allows individuals and organizations to share data securely and privately, enabling data owners to monetize their data assets while maintaining control over their data. +2. **Data Monetization**: Data owners can monetize their data by offering it for sale or by providing data services through compute-to-data (C2D) capabilities. Data consumers can access and utilize data assets. +3. **Decentralized Data Marketplaces**: Ocean Protocol facilitates the creation of decentralized data marketplaces where data providers can list their data assets and data consumers can discover and access them. These marketplaces operate on a peer-to-peer basis, eliminating the need for intermediaries and providing more efficient and transparent data transactions. +4. **AI Development**: Ocean Protocol supports the development of AI models by providing access to diverse and high-quality datasets. Data scientists and AI developers can leverage these datasets to train and improve their models, leading to more accurate and robust AI systems. +5. **Access control:** Ocean Protocol incorporates token-gating mechanisms that grant or restrict access to specific data assets based on predefined criteria, ensuring controlled and regulated data sharing within the ecosystem. 
+ +By leveraging **blockchain technology** and **smart contracts**, Ocean Protocol offers **open-source tools** to securely publish [NFTs](../developers/contracts/data-nfts.md) of your data and algorithms to seamlessly collaborate, trade, and innovate with others. + +

+_Figure: A vast ocean of data awaits you..._

+ +Get a glimpse into some of the things you can do with Ocean Protocol. The opportunities with our protocol to leverage an "Ocean of data" are great and ever-evolving. Together, we'll dive deeper and uncover even more ways to harness the power of decentralized data. + +
+<details>
+
+<summary>Build Your dApp</summary>
+
+Building a dApp using Ocean Protocol can be a great way to bring next generation data sharing tools to your users. By utilizing Ocean's technology stack, you will leverage Web3 technological advantages that distinguish your application from others and can tap into a robust network of Ocean core data engineers/AI scientists supporting dApp developers.
+ +When building a dApp on top of Ocean Protocol, you gain access to a wide range of features and functionalities: + +1. **Data access and discovery**: Utilize Ocean's data marketplace infrastructure to access diverse and valuable data sets. Leverage the data discovery mechanisms to help users find relevant data assets for their applications. +2. **Data interoperability**: Seamlessly integrate and interact with various data sources using Ocean's standardized data representation formats. Ensure compatibility and easy data integration within your dApp. +3. **Data privacy and security**: Leverage the cryptographic capabilities of Ocean Protocol to ensure privacy and security of sensitive data. Implement access controls, encryption, and secure data-sharing mechanisms within your dApp. +4. **Provenance and transparency**: Leverage the transparency and immutability of the blockchain to establish data provenance. Build trust among users by providing an auditable record of data sources, usage, and transactions. +5. **Tokenized incentives**: Utilize datatokens (ERC20) within your dApp to incentivize data providers and consumers. Design token economies that align with the specific requirements of your application, encouraging participation and value creation. +6. **Community participation**: Leverage the community-driven nature of Ocean Protocol to foster collaboration, feedback, and innovation. Engage with the Ocean community to share ideas, contribute to the ecosystem, and gather insights to enhance your dApp. + +These are a few examples of what can be built on top of Ocean. + +1. [Ocean Waves](https://waves.oceanprotocol.com/) - Decentralized music [marketplace](https://github.com/oceanprotocol/waves) +2. [Ocean Market](https://market.oceanprotocol.com) - Decentralized data [marketplace](https://github.com/oceanprotocol/market) +3. [Autobot](https://autobotocean.com/) - Tokengated [data farming](https://df.oceandao.org/) intelligence app +4. [Ocean Token Gate](https://tokengate.oceanprotocol.com/) - Tokengated [content](https://github.com/oceanprotocol/token-gating-template) +5. [Acentrik Market](https://market.acentrik.io/) - Enterprise decentralized data marketplace + +
+ +
+ +Build Your Data Marketplace

With Ocean Protocol, you have the flexibility to create and launch your own data marketplace tailored to your specific requirements. Utilize the protocol's infrastructure and tools to establish a platform where data providers and consumers can connect and transact.
+ +You can choose from two options: + +1. **Fork the** [**Ocean Marketplace**](https://github.com/oceanprotocol/market) **and customize it**: You have the ability to fork the existing [Ocean Marketplace](https://github.com/oceanprotocol/market) codebase and customize it according to your needs. This allows you to leverage the foundational infrastructure and functionality already built by Ocean Protocol while tailoring the user interface, features, and branding to align with your marketplace vision. Follow this [tutorial](../developers/build-a-marketplace/) to learn how to do it. +2. **Build your marketplace with Ocean components**: Alternatively, you can build your data marketplace from scratch using Ocean Protocol's modular components. Ocean provides a comprehensive set of building blocks, such as the [**Aquarius**](https://github.com/oceanprotocol/aquarius), [**Provider**](https://github.com/oceanprotocol/provider), Ocean [contracts](https://github.com/oceanprotocol/contracts), and Ocean libraries ([**ocean.js**](https://github.com/oceanprotocol/ocean.js) & [**ocean.py**](https://github.com/oceanprotocol/ocean.py)), which you can integrate into your own marketplace development. This empowers you to create a unique and customized data marketplace experience while leveraging the underlying capabilities and standards provided by Ocean Protocol. + +
+ +
+ +Tokengate Your dApp or Content

Are you interested in token gating your dApp or content using an Ocean data NFT? We offer you all the code and support that you need to make this happen.
+ +Feel free to fork the [Ocean Token Gate template](https://github.com/oceanprotocol/token-gating-template) code and customize it to start building your dApp from scratch. If you already have a dApp, you can also modify it to gate access with an Ocean data NFT; a minimal sketch of such a check follows below. The [Ocean Token Gate repo](https://github.com/oceanprotocol/token-gating-template) is a helpful reference for this, and we also explain the [smart contract mechanics](../developers/contracts/) of data NFTs and datatokens [in our docs](../developers/contracts/) so you can understand the code better. Remember, we're always here to help guide you with any coding questions on [Discord](https://discord.gg/TnXjkR5). + +
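To see the core mechanic in code, here is a minimal token-gating sketch using web3.py (v6 API assumed). The RPC URL and data NFT address are placeholders, and the ABI fragment covers only the standard ERC721 `balanceOf` call; a production dApp would perform this check server-side after verifying a signed message from the user.

```python
# Minimal token-gating sketch (web3.py v6 assumed; RPC URL and NFT address are placeholders).
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://polygon-rpc.com"))  # any supported network RPC

# Only the ERC721 balanceOf fragment is needed for a gating check.
ERC721_ABI = [{
    "inputs": [{"name": "owner", "type": "address"}],
    "name": "balanceOf",
    "outputs": [{"name": "", "type": "uint256"}],
    "stateMutability": "view",
    "type": "function",
}]

DATA_NFT = "0x0000000000000000000000000000000000000000"  # replace with your data NFT address
data_nft = w3.eth.contract(address=Web3.to_checksum_address(DATA_NFT), abi=ERC721_ABI)

def has_access(user_address: str) -> bool:
    """True if the wallet holds at least one token of the gating data NFT."""
    return data_nft.functions.balanceOf(Web3.to_checksum_address(user_address)).call() > 0
```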
+ +
+ +Buy or Sell Data

Users can publish, purchase, and sell data within the Ocean ecosystem. Data is tokenized in the form of interoperable ERC721 data NFTs and ERC20 datatokens. The platform acts as a decentralized exchange (DEX) specifically designed for data transactions. Every action, including publishing, purchasing, and consuming data, is securely recorded on the blockchain, creating a tamper-proof audit trail.
+ +For data scientists and AI practitioners, Ocean presents opportunities such as increased access to a broader range of data, including private data, crypto-secured provenance for data and AI training, and potential income streams from selling and curating data. + +To showcase these capabilities, Ocean developed a demonstrator marketplace known as the [Ocean Market](https://market.oceanprotocol.com/). + +The following guides will help you get started with buying and selling data: + +* [Publish an NFT](../user-guides/publish-data-nfts.md) +* [Download an NFT](../user-guides/buy-data-nfts.md) +* [Host Assets](../user-guides/asset-hosting/README.md) + +
+ +
+ +Manage datatokens and data NFTs for use in DeFi

Ocean makes it easy to publish data services (deploy ERC721 data NFTs and ERC20 datatokens), and to consume data services (spend datatokens). Crypto wallets, exchanges, and DAOs become data wallets, exchanges, and DAOs.
+ +Use Ocean [JavaScript](../developers/ocean.js/README.md) or [Python](../developers/ocean.py/README.md) drivers to manage data NFTs and datatokens. + +Ocean-based apps make data asset on-ramps and off-ramps easy for end users. Ocean smart contracts and libraries make this easy for developers. The data itself does not need to be on-chain, just the access control. + +Data NFTs are ERC721 tokens representing the unique asset, and datatokens are ERC20 tokens to access data services. Each data service gets its own data NFT and one or more types of datatokens. + +To access the dataset, you send 1.0 datatokens to the data provider (running Ocean Provider). To give access to someone else, send them 1.0 datatokens. That's it. + +Since datatokens are ERC20 and live on Ethereum mainnet, there's a whole ecosystem to leverage. + +* _Publish and access data services:_ downloadable files or compute-to-data. Use Ocean to deploy a new [ERC721](https://github.com/ethereum/EIPs/blob/master/EIPS/eip-721.md) and [ERC20](https://github.com/ethereum/EIPs/blob/7f4f0377730f5fc266824084188cc17cf246932e/EIPS/eip-20.md) datatoken contract for each data service, then mint datatokens. +* _Transfer datatokens_ to another owner (or approve & transferFrom). +* _And more._ Use ERC20 support in [web3.js](https://web3js.readthedocs.io/), [web3.py](https://web3py.readthedocs.io/en/stable/examples.html#working-with-an-erc20-token-contract), and Solidity to connect datatokens with crypto wallets and other DeFi services; a minimal transfer sketch follows below. + +
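As an illustration of how little machinery this needs, here is a hedged sketch of sending 1.0 datatokens with web3.py (v6 API assumed). The RPC URL, private key, datatoken address, and recipient are placeholders; any generic ERC20 tooling works the same way.

```python
# Hedged sketch: sending 1.0 datatokens is a plain ERC20 transfer (web3.py v6 assumed).
import os
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://polygon-rpc.com"))       # placeholder RPC
account = w3.eth.account.from_key(os.environ["PRIVATE_KEY"])  # your funded key

ERC20_ABI = [{
    "inputs": [{"name": "to", "type": "address"}, {"name": "amount", "type": "uint256"}],
    "name": "transfer",
    "outputs": [{"name": "", "type": "bool"}],
    "stateMutability": "nonpayable",
    "type": "function",
}]

DATATOKEN = "0x0000000000000000000000000000000000000000"  # replace with your datatoken
RECIPIENT = "0x0000000000000000000000000000000000000000"  # consumer or new owner

datatoken = w3.eth.contract(address=DATATOKEN, abi=ERC20_ABI)
tx = datatoken.functions.transfer(
    RECIPIENT,
    Web3.to_wei(1, "ether"),  # datatokens use 18 decimals, so 1.0 datatoken == 10**18 base units
).build_transaction({
    "from": account.address,
    "nonce": w3.eth.get_transaction_count(account.address),
})
signed = account.sign_transaction(tx)
print(w3.eth.send_raw_transaction(signed.rawTransaction).hex())
```

Granting access to someone else is the same call with a different recipient address.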
+ +
+ +Run Your Own Provider

You have the option to generate revenue by running your own provider. Historically, few community members have pursued this opportunity, primarily because the incentives were not compelling. That has changed: providers now earn fees, as described below.
+ +If you're not familiar with it, the Ocean [Provider](../developers/provider/README.md) serves as the proxy service responsible for encrypting/decrypting data and streaming it to the consumer. It also verifies user access privileges for specific data assets or services. It plays a vital role in the Ocean architecture. + +Fees are now paid to the individual or organization running the provider when a user downloads a data asset. The download fees are set based on the cost per MB, and there is also a provider fee for compute jobs, which is priced per minute. + +Both the download and compute fees can be set to any absolute amount, and you have the flexibility to choose the token in which you want to receive these fees. They do not have to be in the same currency used in the marketplace. For instance, the provider fee could be a fixed rate of 5 USDT per 1000 MB of data downloaded, and this fee will remain fixed in USDT even if the marketplace uses a different currency. + +Furthermore, provider fees are not restricted to data consumption; they can also be utilized to charge for compute resources. For example, a provider can charge a fixed fee of 15 DAI to reserve compute resources for one hour. This presents an advantage for both the user and the provider host. Users can now reserve the appropriate amount of computing resources according to their requirements. For provider hosts, this creates an additional income opportunity. + +
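To make the examples above concrete, here is a small back-of-envelope sketch; the rates are the illustrative figures from this section, not defaults of any kind.

```python
# Back-of-envelope sketch of the provider fee examples above (illustrative rates only).
DOWNLOAD_FEE_USDT_PER_1000_MB = 5.0
COMPUTE_FEE_DAI_PER_HOUR = 15.0

def download_fee(size_mb: float) -> float:
    """Fee in USDT for downloading `size_mb` megabytes."""
    return size_mb / 1000.0 * DOWNLOAD_FEE_USDT_PER_1000_MB

def compute_fee(hours: float) -> float:
    """Fee in DAI for reserving compute resources for `hours` hours."""
    return hours * COMPUTE_FEE_DAI_PER_HOUR

print(download_fee(2500))  # 12.5 USDT for a 2.5 GB dataset
print(compute_fee(2))      # 30.0 DAI for two hours of reserved compute
```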
+ +
+ +Earn Rewards

The Data Farming initiative is a key feature of Ocean Protocol that empowers participants to earn rewards while contributing to a decentralized data economy. By staking Ocean tokens and actively participating in data markets, users play a vital role in enhancing the availability and accessibility of valuable data assets.
+ +Through the Data Farming initiative, you are incentivized to lock Ocean tokens for [veOCEAN](../rewards/veocean.md). By staking your tokens, you not only support the growth and sustainability of the ecosystem but also earn a share of the generated incentives 💰. The Data Farming initiative offers participants a unique opportunity to earn [rewards](../rewards/README.md) while making a meaningful impact in the data marketplace. + +Participating in the Data Farming initiative demonstrates a commitment to the principles of **fairness**, **transparency**, and **collaboration** that underpin Ocean Protocol. It allows you to actively engage with the ecosystem, promote innovation, and drive the evolution of the decentralized data economy. + +
+ +
+ +Data Challenges

Ocean Data Challenges offer you a unique opportunity to showcase your skills and creativity in the world of data. These organized events and competitions encourage data scientists, researchers, and developers like yourself to explore and innovate using Ocean Protocol.
+ +By participating in Ocean Data Challenges, you can tackle real-world problems, leverage data assets, and utilize the technologies within the Ocean ecosystem. Not only do you have the chance to compete for recognition and prizes, but you also contribute to driving innovation, fostering collaboration, and making a positive impact in the data space. + +
+ +
+ +Become an Ambassador

Becoming an Ocean Ambassador presents a unique opportunity to actively contribute to the growth and adoption of Ocean Protocol while being at the forefront of the decentralized data revolution.
+ +As an Ocean Ambassador, you become an advocate for the protocol, promoting its vision of democratizing data and empowering individuals. By sharing your knowledge and enthusiasm, you can educate others about the benefits and potential of Ocean Protocol, inspiring them to join the ecosystem. As part of a global community of like-minded individuals, you gain access to exclusive resources, networking opportunities, and collaborations that further enhance your expertise in the data economy. As an Ambassador, you play a vital role in shaping the future of data by driving awareness, fostering innovation, and helping to build a more open and equitable data ecosystem. + +
+ +
+ +Contribute to Ocean Code Development

Make a positive impact in the Web3 data economy by contributing to Ocean's open source code on Github! From feature requests to pull requests, contributions of all kinds are appreciated.
+ +To begin, [visit our Github page](https://github.com/oceanprotocol) where you can see the repos and contributors. If you're going to contribute code to a repo, we ask that you fork the code first, make your changes, and then create a pull request for us to review. If you are reporting an issue, please first search the existing issues to see if it is already documented. If not, open a new issue, describing your problem as clearly as possible and including screenshots. +We also welcome you to join our [Discord developer community](https://discord.gg/TnXjkR5) where you can get rapid, practical advice on using Ocean tech and get to know the Ocean core team more personally! + +
+ +This is just the beginning of what Ocean Protocol has to offer. Join us as we explore, innovate, and push the boundaries of what's possible with decentralized data. Together, we can shape a future where data is accessible, secure and empowers individuals and organizations alike. Let's dive in and discover the endless possibilities of Ocean Protocol :ocean: diff --git a/orientation/faq.md b/discover/faq.md similarity index 74% rename from orientation/faq.md rename to discover/faq.md index 38bbd137..1e40b3ee 100644 --- a/orientation/faq.md +++ b/discover/faq.md @@ -1,8 +1,5 @@ --- title: FAQs -order: 5 -hideLanguageSelector: true -featuredImage: images/creatures/mantaray/mantaray-full@2x.png description: Frequently Asked Questions about Ocean Protocol --- @@ -36,17 +33,9 @@ Once a user has Metamask installed and an Ethereum address, they can register, c
-How is Ocean different from other data marketplaces? - -Ocean Protocol is a decentralized data marketplace which gives users complete control of their data. The Ocean Protocol technology is built on smart contracts, decentralized computer scripts with no intermediary that are triggered by the users. The Ocean Market exposes the functionality of the smart contracts in a browser-friendly interface. Data providers and consumers can discover one another and transact in a peer-to-peer manner with the minimal amount of intermediary involvement. - -
- -
- How do I price my data? -Ocean gives you two different options for pricing your data - fixed price or free. You need to decide what your dataset is worth and how you want to price it. You can change the price but you can’t change the price format (e.g. from fixed to free). +Ocean gives you two different options for pricing your data - [fixed price](../developers/contracts/pricing-schemas.md#fixed-pricing) or [free](../developers/contracts/pricing-schemas.md#free-pricing). You need to decide what your dataset is worth and how you want to price it. You can change the price but you can’t change the price format (e.g. from fixed to free).
@@ -54,7 +43,7 @@ Ocean gives you two different options for pricing your data - fixed price or fre Is my data secure? -Yes. Ocean Protocol understands that some data is too sensitive to be shared — potentially due to GDPR or other reasons. For these types of datasets, we offer a unique service called compute-to-data. This enables you to monetise the dataset that sits behind a firewall without ever revealing the raw data to the consumer. For example, researchers and data scientists pay to run their algorithms on the data set and the computation is performed behind a firewall; all the researchers or data scientists receive is the results generated by their algorithm. +Yes. Ocean Protocol understands that some data is too sensitive to be shared — potentially due to GDPR or other reasons. For these types of datasets, we offer a unique service called [compute-to-data](../developers/compute-to-data/README.md). This enables you to monetize the dataset that sits behind a firewall without ever revealing the raw data to the consumer. For example, researchers and data scientists pay to run their algorithms on the data set, and the computation is performed behind a firewall; all the researchers or data scientists receive is the results generated by their algorithm.
@@ -62,7 +51,7 @@ Yes. Ocean Protocol understands that some data is too sensitive to be shared — Where is my data stored? -Ocean does not provide data storage. Users have the choice to store their data on their own servers, cloud or decentralized storage. Users need only to provide a URL to the dataset, which is then encrypted as a means to protect the access to the dataset. +Ocean does not provide data storage. Users have the choice to [store](../user-guides/asset-hosting/README.md) their data on their own servers, cloud, or decentralized storage. Users need only to provide a URL, an IPFS hash, an Arweave CID, or the on-chain information to the dataset. This is then encrypted as a means to protect access to the dataset.
@@ -70,7 +59,7 @@ Ocean does not provide data storage. Users have the choice to store their data o How do I control who accesses my data? -Ocean provides tools for access control, fine grained permissions, passlisting and blocklisting addresses. Data and AI services can be shared under the conditions set by the owner of data. There is no central intermediary, which ensures no one can interfere with the transaction and both the publisher and user have transparency. +Ocean provides tools for access control, [fine-grained permissions](../developers/fg-permissions.md), passlisting, and blocklisting addresses. Data and AI services can be shared under the conditions set by the owner of the data. There is no central intermediary, which ensures no one can interfere with the transaction and both the publisher and user have transparency.
@@ -78,15 +67,8 @@ Ocean provides tools for access control, fine grained permissions, passlisting a Can I restrict who is able to access my dataset? -Yes - Ocean has implemented fine grained permissions. This means that you can create allow and deny lists that restrict access from certain individuals or limit access to particular organizations. - -
- -
- -What is the reach of Ocean Market - how many data buyers can I sell to? - -Hundreds of unique datasets are available that are sourced from private individuals, research institutions, commercial enterprises and government. Publishing data on Ocean offers data providers and algorithm owners an exciting new channel to connect with a rapidly growing community of Web3 enthusiasts and data science professionals around the world. +Yes - Ocean has implemented [fine-grained permissions](../developers/fg-permissions.md). This means that you can create allow and deny lists that restrict access from certain individuals or limit access to particular organizations. \ +PS: [Fine-grained permissions](../developers/fg-permissions.md) are not integrated into the Ocean Marketplace.
@@ -120,7 +102,7 @@ The blockchain can do more than just store information - it can also run code. A What is a datatoken? -A datatoken is an access token to datasets and services published in the Ocean ecosystem. Datatokens can be purchased via the Ocean Market or on a decentralized crypto exchange. . If a consumer wishes to access a dataset, they must acquire the datatoken and then exchange the datatoken for access to the dataset. +A datatoken is an access token to datasets and services published in the Ocean ecosystem. Datatokens can be purchased via the Ocean Market or on a decentralized crypto exchange. If a consumer wishes to access a dataset, they must acquire the datatoken and then exchange the datatoken for access to the dataset.
@@ -224,7 +206,7 @@ Checkout our [roadmap](https://oceanprotocol.com/technology/roadmap) to see what What assets are eligible for Data Farming? -The data asset may be of any type — dataset (for static URIs), algorithm for Compute-to-Data, or any other Datatoken token-gated system. The data asset may be fixed price or free price. You can find more details in the [DF Background page](df-background.md#data-assets-that-qualify-for-df) +The data asset may be of any type — dataset (for static URIs), algorithm for Compute-to-Data, or any other Datatoken token-gated system. The data asset may be fixed price or free price. You can find more details in the [DF Background page](../rewards/df-max-out-yield.md#assets-that-qualify-for-data-farming)
@@ -238,14 +220,6 @@ The counting starts at 12.01am on Thursday, and ends at 11.59pm on the following
-I staked for just one day. What rewards might I expect? - -At least 50 snapshots are randomly taken throughout the week. If you’ve staked just one day, and all else being equal, you should expect 1/7 the rewards compared to the full 7 days. - -
- -
- The datatoken price may change throughout the week. What price is taken in the DCV calculation? The price is taken at the same time as each consume. E.g. if a data asset has three consumes, where price was 1 OCEAN when the first consume happened, and the price was 10 OCEAN when the other consumes happened, then the total DCV for the asset is 1 + 10 + 10 = 21. @@ -268,7 +242,8 @@ Caveat: it’s no at least in theory! Sometimes there may be tweaks if there is What is the official formula for the Linear Decay? -The Linear Decay formula for veOCEAN can be expressed as follows in python. +The Linear Decay formula for veOCEAN can be expressed as follows in python. + ```python FOUR_YEARS = 60 * 60 * 24 * 7 * 52 @@ -285,7 +260,7 @@ To learn more about systems driving veOCEAN and Data Farming, please [visit our What about passive stakers — people who just want to stake in one place and be done? -Earnings are passive by default +Earnings are passive by default.
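To make the linear decay quoted above concrete, here is a hedged sketch of ve-style decay. The constants and function are illustrative, not the official implementation (see the official Python formula in the answer above): veOCEAN voting power falls linearly to zero as the unlock time approaches, with a four-year maximum lock.

```python
# Illustrative ve-style linear decay (not the official implementation).
SECONDS_PER_WEEK = 60 * 60 * 24 * 7
MAX_LOCK = 4 * 52 * SECONDS_PER_WEEK  # four years, counted as 52 weeks per year

def ve_balance(locked_ocean: float, seconds_until_unlock: int) -> float:
    """veOCEAN voting power decays linearly to zero as the unlock time approaches."""
    remaining = max(0, min(seconds_until_unlock, MAX_LOCK))
    return locked_ocean * remaining / MAX_LOCK

print(ve_balance(100.0, MAX_LOCK))       # 100.0 -> a fresh 4-year lock
print(ve_balance(100.0, MAX_LOCK // 2))  # 50.0  -> two years remaining
print(ve_balance(100.0, 0))              # 0.0   -> lock expired
```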
@@ -293,7 +268,7 @@ Earnings are passive by default What about active stakers — people who want to do extra work and get rewarded? -Ot works. Half the DF revenue goes to veOCEAN stake that users can allocate. Allocate well → more $$ +Half the DF revenue goes to veOCEAN stake that users can allocate. Allocate well → more \$$. @@ -303,7 +278,7 @@ Ot works. Half the DF revenue goes to veOCEAN stake that users can allocate. All In this scheme, can people stake on fixed-price datasets? -Yes. They allocate their veOCEAN to datasets. Then DF rewards follow the usual DF formula: DCV * veOCEAN stake. +Yes. They allocate their veOCEAN to datasets. Then DF rewards follow the usual DF formula: DCV \* veOCEAN stake. @@ -311,7 +286,7 @@ Yes. They allocate their veOCEAN to datasets. Then DF rewards follow the usual D In this scheme, can people stake on free datasets? -Yes. They allocate their veOCEAN to datasets. Then DF rewards follow the usual DF formula: DCV * veOCEAN stake. Except in this case although DCV is 0, the gas fees will still count towards calculating rewards. +Yes. They allocate their veOCEAN to datasets. Then DF rewards follow the usual DF formula: DCV \* veOCEAN stake. Except in this case although DCV is 0, the gas fees will still count towards calculating rewards. @@ -336,9 +311,10 @@ Yes, from the get-go! It doesn’t matter how data is priced, this works for all
Which networks are eligible for Data Farming? + + Data assets for DF may be published on any network where Ocean is deployed in production: Eth Mainnet, Polygon, BSC, and more. -You can find a list of [all supported chains here](https://docs.oceanprotocol.com/core-concepts/networks). +You can find a list of [all supported chains here](networks/README.md).
@@ -354,15 +330,7 @@ They are deployed on Ethereum mainnet, alongside other Ocean contract deployment What is the official veOCEAN epoch start_time? -veFeeDistributor has a start_time of 1663804800 (Thu Sep 22 2022 00:00:00) - - - -
- -Will the Market still need to be multi-chain? - -Yes, Ocean Market still needs to be multi-chain: all the reasons that we went multi-chain for are as valid as ever. +veFeeDistributor has a start\_time of 1663804800 (Thu Sep 22 2022 00:00:00).
diff --git a/discover/glossary.md b/discover/glossary.md new file mode 100644 index 00000000..bd9cfc0b --- /dev/null +++ b/discover/glossary.md @@ -0,0 +1,458 @@ +--- +description: >- + A comprehensive list of key terms, concepts, and acronyms used in the Ocean + Protocol ecosystem +--- + +# Glossary + +### Ocean Protocol Concepts + +
+ +Ocean Protocol + +Ocean Protocol is a decentralized data exchange protocol that enables individuals and organizations to share, sell, and consume data in a secure, transparent, and privacy-preserving manner. The protocol is designed to address the current challenges in data sharing, such as data silos, lack of interoperability, and data privacy concerns. Ocean Protocol uses blockchain technology, smart contracts, and cryptographic techniques to create a network where data providers can offer their data assets for sale, data consumers can purchase and access the data, and developers can build data-driven applications and services on top of the protocol. + +
+ +
+ +$OCEAN + +The Ocean Protocol token (OCEAN) is a utility token used in the Ocean Protocol ecosystem. It serves as a medium of exchange and a unit of value for data services in the network. Participants in the Ocean ecosystem can use OCEAN tokens to buy and sell data, stake on data assets, and participate in the governance of the protocol. + +
+ +
+ +Data Consume Volume (DCV) + +Data consume volume (DCV) is a key metric that refers to the amount of money ($) spent to buy data assets that are subsequently consumed. + +
+ +
+ +Transaction Volume (TV) + +Transaction volume (TV) is a key metric that refers to the value of transactions within the ecosystem. + +Transaction volume (TV) is often used interchangeably with data consume volume (DCV). DCV is a more refined metric that excludes activities like wash trading. DCV measures the actual consumption or processing of data within the protocol, which is a more accurate measure of the value generated by the ecosystem. + +
+ +
+ +Base IP + +**Base IP** means the artifact being copyrighted. Represented by the {ERC721 address, tokenId} from the publish transactions. + +
+ +
+ +Base IP holder + +**Base IP holder** means the holder of the Base IP. Represented as the actor that did the initial "publish" action. + +
+ +
+ +Sub-licensee + +**Sub-licensee** is the holder of the sub-license. Represented as the entity that controls address ERC721.\_owners\[tokenId=x]. + +
+ +
+ +To Publish + +Claim copyright or exclusive base license. + +
+ +
+ +To Sub-license + +Transfer one (of many) sub-licenses to new licensee: ERC20.transfer(to=licensee, value=1.0). + +
+ +
+ +Ocean Data Challenges + +[Ocean Data Challenges](https://oceanprotocol.com/challenges) is a program organized by Ocean Protocol that seeks to expedite the shift into a New Data Economy by incentivizing data-driven insights and the building of algorithms geared toward solving complex business challenges. The challenges encourage the Ocean community and other data enthusiasts to collaborate and leverage the capabilities of Ocean Protocol to produce data-driven insights and design algorithms tailored to intricate business problems. + +Ocean Data Challenges typically involve a specific data problem or use case, for which participants are asked to develop a solution. The challenges are open to many participants, including data scientists, developers, researchers, and entrepreneurs. Participants are given access to relevant data sets, tools, and resources and invited to submit their solutions. + +
+ +
+ +Ocean Market + +The [Ocean Market](http://market.oceanprotocol.com) is a decentralized data marketplace built on top of the Ocean Protocol. It is a platform where data providers can list their data assets for sale, and data consumers can browse and purchase data that meets their specific needs. The Ocean Market supports a wide range of data types, including but not limited to, text, images, videos, and sensor data. + +While the Ocean Market is a vital part of the Ocean Protocol ecosystem and is anticipated to facilitate the unlocking of data value and stimulate data-driven innovation, it is important to note that it is primarily a **technology demonstrator**. As a decentralized data marketplace built on top of the Ocean Protocol, the Ocean Market **showcases** the capabilities and features of the protocol, including secure and transparent data exchange, flexible access control, and token-based incentivization. It serves as a testbed for the development and refinement of the protocol's components and provides a sandbox environment for experimentation and innovation. As such, the Ocean Market is a powerful tool for demonstrating the potential of the Ocean Protocol and inspiring the creation of new data-driven applications and services. + +
+ +
+ +Ocean Shipyard + +[Ocean Shipyard](https://oceanprotocol.com/shipyard) is an early-stage grant program established to fund the next generation of Web3 dApps built on Ocean Protocol. It is made for entrepreneurs looking to build open-source Web3 solutions on Ocean, make valuable data available, build innovations, and create value for the Ocean ecosystem. + +In Shipyard, the Ocean core team curates project proposals that are set up to deliver according to clear delivery milestone timelines and bring particular strategic value for the future development of Ocean. + +
+ +
+ +veOCEAN + +_ve_ tokens have been introduced by several projects such as [Curve](https://curve.fi/) and [Balancer](https://balancer.fi/). These tokens require users to lock _project tokens_ in return for _ve_ tokens. + +[veOCEAN](https://df.oceandao.org/veocean) gives token holders the ability to lock OCEAN to earn yield and curate data. + +In exchange for locking tokens, users can earn rewards. The amount of reward depends on how long the tokens are locked. Furthermore, veTokens can be used for asset curation. + +
+ +
+ +Ocean Data Farming (DF) + +[Ocean Data Farming (DF)](https://df.oceandao.org/) incentivizes growth of Data Consume Volume (DCV) in the Ocean ecosystem. [DF](../rewards/df-intro.md) is like DeFi liquidity mining, but tuned for DCV. DF emits OCEAN for passive rewards and active rewards. + +* As a veOCEAN holder, you get _passive_ rewards by default. +* If you _actively_ curate data by allocating veOCEAN towards data assets with high Data Consume Volume (DCV), then you can earn more. + +
+ +
+ +DF Passive Rewards + +When a user locks their OCEAN tokens for a finite period of time, they get veOCEAN tokens in return. Based on the quantity of veOCEAN, the user accumulates weekly OCEAN [rewards](../rewards/df-intro.md#what-are-passive-rewards). Because rewards are generated without human intervention, these are called "passive" OCEAN rewards. OCEAN rewards are claimable every Thursday on the [Rewards page](https://df.oceandao.org/rewards). + +
+ +
+ +DF Active Rewards + +When a user allocates veOCEAN tokens to Ocean Market projects, weekly OCEAN rewards are given to the user based on the sales of those projects. Since these rewards depend on human intervention to decide the allocations, they are categorized as "active" [rewards](../rewards/df-intro.md#what-are-active-rewards) instead of passive rewards. OCEAN rewards are claimable every Thursday on the [Rewards page](https://df.oceandao.org/rewards). + +
+ +
+ +H2O + +[H2O](https://www.h2odata.xyz/) is a decentralized protocol that introduced the first non-pegged stable asset, $H2O. Initially, it is backed by the OCEAN token, but there are plans for it to be backed by other data tokens. + +The H2O non-pegged stable asset is a friendly fork of RAI. Whereas RAI uses ether (ETH) as its collateral asset, H2O uses OCEAN. The price of H2O is managed by an algorithm that rebalances to bring the redemption price close to the market price, and participants are incentivized to aid in this process. Traditional stable assets are pegged to a price such as 1 USD. In contrast, RAI (and soon H2O) are free-floating but typically settle around a price; for RAI this has been \~$3. + +
+ +
+ +$POSEIDON + +The [POSEIDON token](https://docs.h2odata.xyz/protocol-overview/poseidon-mechanics) is the governance token of the H2O protocol. It has the following function inside the protocol: + +* Ungovernance: once governance minimization is finalized, POSEIDON holders will be able to remove control from any remaining components in H2O or, if needed, continue to manage components that may be challenging to ungovern (such as oracles or any other component interacting with other protocols). + +
+ +
+ +$psdnOCEAN + +[psdnOCEAN](https://docs.h2odata.xyz/protocol-overview/psdnocean-veocean-liquid-staking) is the liquid staking wrapper for veOCEAN. + +* Convert OCEAN to psdnOCEAN at a 1:1 ratio. +* Gain access to a liquid asset and receive a share of the revenue of veOCEAN. + +
+ + + +### Web3 Fundamentals + +
+ +Web3 + +Web3 (also known as Web 3.0 or the decentralized web) is a term used to describe the next evolution of the internet, where decentralized technologies are used to enable greater privacy, security, and user control over data and digital assets. + +While the current version of the web (Web 2.0) is characterized by centralized platforms and services that collect and control user data, Web3 aims to create a more decentralized and democratized web by leveraging technologies such as blockchain, peer-to-peer networking, and decentralized file storage. + +Ocean Protocol is designed to be a Web3-compatible platform that allows users to create and operate decentralized data marketplaces. This means that data providers and consumers can transact directly with each other, without the need for intermediaries or centralized authorities. + +
+ +
+ +Blockchain + +A distributed ledger technology (DLT) that enables secure, transparent, and decentralized transactions. Blockchains use cryptography to maintain the integrity and security of the data they store. + +By using blockchain technology, Ocean Protocol provides a transparent and secure way to share and monetize data, while also protecting the privacy and ownership rights of data providers. Additionally, blockchain technology enables the creation of immutable and auditable records of data transactions, which can be used for compliance, auditing, and other purposes. + +
+ +
+ +Decentralization + +Decentralization is the distribution of power, authority, or control away from a central authority or organization, towards a network of distributed nodes or participants. Decentralized systems are often characterized by their ability to operate without a central point of control, and their ability to resist censorship and manipulation. + +In the context of Ocean Protocol, decentralization refers to the use of blockchain technology to create a decentralized data exchange protocol. Ocean Protocol leverages decentralization to enable the sharing and monetization of data while preserving privacy and data ownership. + +
+ +
+ +Block Explorer + +A tool that allows users to view information about transactions, blocks, and addresses on a blockchain network. Block explorers provide a [graphical interface](https://etherscan.io/token/0x967da4048cD07aB37855c090aAF366e4ce1b9F48) for interacting with a blockchain, and they allow users to search for specific transactions, view the details of individual blocks, and track the movement of cryptocurrency between addresses. Block explorers are commonly used by cryptocurrency enthusiasts, developers, and businesses to monitor network activity and verify transactions. + +
+ +
+ +Cryptocurrency + +A digital or virtual currency that uses cryptography for security and operates independently of a central bank. Cryptocurrencies use blockchain or other distributed ledger technologies to maintain their transaction history and prevent fraud. + +Ocean Protocol uses a cryptocurrency called Ocean (OCEAN) as its native token. OCEAN is used as a means of payment for data transactions on the ecosystem, and it is also used to incentivize network participants, such as data providers, validators, and curators. + +Like other cryptocurrencies, OCEAN operates on a blockchain, which ensures that transactions are secure, transparent, and immutable. The use of a cryptocurrency like OCEAN provides a number of benefits for the Ocean Protocol network, including faster transaction times, lower transaction fees, and greater transparency and trust. + +
+ +
+ +Decentralized applications (dApps) + +dApps (short for decentralized applications) are software applications that run on decentralized peer-to-peer networks, such as blockchain. Unlike traditional software applications that rely on a centralized server or infrastructure, dApps are designed to be decentralized, open-source, and community-driven. + +dApps in the Ocean ecosystem are designed to enable secure and transparent data transactions between data providers and consumers, without the need for intermediaries or centralized authorities. These applications can take many forms, including data marketplaces, data analysis tools, data-sharing platforms, and many more. A good example of a dApp is the [Ocean Market](https://market.oceanprotocol.com/). + +
+ +
+ +Interoperability + +The ability of different blockchain networks to communicate and interact with each other. Interoperability is important for creating a seamless user experience and enabling the transfer of value across different blockchain ecosystems. + +In the context of Ocean Protocol, interoperability enables the integration of the protocol with other blockchain networks and decentralized applications (dApps). This enables data providers and users to access and share data across different networks and applications, creating a more open and connected ecosystem for data exchange. + +
+ +
+ +Smart contract + +Smart contracts are self-executing digital contracts that allow for the automation and verification of transactions without the need for a third party. They are programmed using code and operate on a decentralized blockchain network. Smart contracts are designed to enforce the rules and regulations of a contract, ensuring that all parties involved fulfill their obligations. Once the conditions of the contract are met, the smart contract automatically executes the transaction, ensuring that the terms of the contract are enforced in a transparent and secure manner. + +Ocean ecosystem smart contracts are deployed on multiple blockchains like Polygon, Energy Web Chain, Binance Smart Chain, and others. The code is open source and available on the organization's [GitHub](https://github.com/oceanprotocol/contracts). + +
+ +
+ +Ethereum Virtual Machine (EVM) + +The Ethereum Virtual Machine (EVM) is a runtime environment that executes smart contracts on the Ethereum blockchain. It is a virtual machine that runs on top of the Ethereum network, allowing developers to create and deploy decentralized applications (dApps) on the network. The EVM provides a platform for developers to create smart contracts in various programming languages, including Solidity, Vyper, and others. + +The Ocean Protocol ecosystem is a decentralized data marketplace built on the Ethereum blockchain. It is designed to provide a secure and transparent platform for sharing and selling data. + +
+ +
+ +ERC + +ERC stands for Ethereum Request for Comments and refers to a series of technical standards for Ethereum-based tokens and smart contracts. ERC standards are created and proposed by developers to the Ethereum community for discussion, review, and implementation. These standards ensure that smart contracts and tokens are compatible with other applications and platforms built on the Ethereum blockchain. + +In the context of Ocean Protocol, several ERC standards are used to create and manage tokens on the network. Standards like [ERC-20](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/), [ERC-721](https://eips.ethereum.org/EIPS/eip-721) and [ERC-1155](https://eips.ethereum.org/EIPS/eip-1155). + +
+ +
+ +ERC-20 + +[ERC-20](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/) is a technical standard used for smart contracts on the Ethereum blockchain that defines a set of rules and requirements for creating tokens that are compatible with the Ethereum ecosystem. ERC-20 tokens are fungible, meaning they are interchangeable with other ERC-20 tokens and have a variety of use cases such as creating digital assets, utility tokens, or fundraising tokens for initial coin offerings (ICOs). + +The ERC-20 standard is used for creating fungible tokens on the Ocean Protocol network. Fungible tokens are identical and interchangeable with each other, allowing them to be used interchangeably on the network. + +
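To make the standard concrete, here is a minimal web3.py sketch (v6 API assumed) that reads an ERC-20 balance, using the OCEAN token address on Ethereum mainnet as listed on the Networks page; the RPC URL and holder address are placeholders.

```python
# Minimal ERC-20 read sketch (web3.py v6 assumed; RPC URL and holder are placeholders).
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://eth.llamarpc.com"))  # any Ethereum mainnet RPC
OCEAN = "0x967da4048cD07aB37855c090aAF366e4ce1b9F48"      # OCEAN on Ethereum mainnet
ABI = [
    {"inputs": [{"name": "owner", "type": "address"}], "name": "balanceOf",
     "outputs": [{"name": "", "type": "uint256"}], "stateMutability": "view", "type": "function"},
    {"inputs": [], "name": "decimals", "outputs": [{"name": "", "type": "uint8"}],
     "stateMutability": "view", "type": "function"},
]
ocean = w3.eth.contract(address=OCEAN, abi=ABI)

holder = "0x0000000000000000000000000000000000000000"  # any address to inspect
raw = ocean.functions.balanceOf(holder).call()
print(raw / 10 ** ocean.functions.decimals().call())   # human-readable OCEAN balance
```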
+ +
+ +ERC-721 + +[ERC-721](https://eips.ethereum.org/EIPS/eip-721) is a technical standard used for smart contracts on the Ethereum blockchain that defines a set of rules and requirements for creating non-fungible tokens (NFTs). ERC-721 tokens are unique and cannot be exchanged for other tokens or assets on a one-to-one basis, making them ideal for creating digital assets such as collectibles, game items, and unique digital art. + +The ERC-721 standard is used for creating non-fungible tokens (NFTs) on the Ocean Protocol network. NFTs are unique and non-interchangeable tokens that can represent a wide range of assets, such as digital art, collectibles, and more. + +
+ +
+ +ERC-1155 + +[ERC-1155](https://eips.ethereum.org/EIPS/eip-1155) is a technical standard for creating smart contracts on the Ethereum blockchain that allows for the creation of both fungible and non-fungible tokens within the same contract. This makes it a "multi-token" standard that provides more flexibility than the earlier ERC-20 and ERC-721 standards, which only allow for the creation of either fungible or non-fungible tokens, respectively. + +The ERC-1155 standard is used for creating multi-token contracts on the Ocean Protocol network. Multi-token contracts allow for the creation of both fungible and non-fungible tokens within the same contract, providing greater flexibility for developers. + +
+ +
+ +Consensus Mechanism + +A consensus mechanism is a method used in blockchain networks to ensure that all participants in the network agree on the state of the ledger or the validity of transactions. Consensus mechanisms are designed to prevent fraud, double-spending, and other types of malicious activity on the network. + +In the context of Ocean Protocol, the consensus mechanism used is Proof of Stake (PoS). + +
+ +
+ +Proof of Stake (PoS) + +A consensus mechanism used in blockchain networks that require validators to hold a certain amount of cryptocurrency as a stake in order to participate in the consensus process. PoS is an alternative to proof of work (PoW) and is designed to be more energy efficient. + +
+ +
+ +Proof of Work (PoW) + +A consensus mechanism used in blockchain networks that require validators to solve complex mathematical puzzles in order to participate in the consensus process. PoW is the original consensus mechanism used in the Bitcoin blockchain and is known for its high energy consumption. + +
+ +
+ +BUIDL + +A term used in the cryptocurrency and blockchain space to encourage developers and entrepreneurs to build new products and services. The term is a deliberate misspelling of the word "build" and emphasizes the importance of taking action and creating value in the ecosystem. + +
+ +### Decentralized Finance (DeFi) Fundamentals + +
+ +DeFi + +A financial system that operates on a decentralized, blockchain-based platform, rather than relying on traditional financial intermediaries such as banks, brokerages, or exchanges. In a DeFi system, financial transactions are executed using smart contracts, which are self-executing computer programs that automatically enforce the terms of an agreement between parties. + +
+ +
+ +Decentralized exchange (DEX) + +A Decentralized exchange (DEX) is an exchange that operates on a decentralized platform, allowing users to trade cryptocurrencies directly with one another without the need for a central authority or intermediary. DEXs typically use smart contracts to facilitate trades and rely on a network of nodes to process transactions and maintain the integrity of the exchange. + +
+ +
+ +Staking + +The act of holding a cryptocurrency in a wallet or on a platform to support the network and earn rewards. Staking is typically used in proof-of-stake (PoS) blockchain networks as a way to secure the network and maintain consensus. + +
+ +
+ +Lending + +The act of providing cryptocurrency to a borrower in exchange for interest payments. Lending platforms match borrowers with lenders and use smart contracts to facilitate loan agreements. + +
+ +
+ +Borrowing + +The act of borrowing cryptocurrency from a lender and agreeing to repay the loan with interest. Borrowing platforms match borrowers with lenders and use smart contracts to facilitate loan agreements. + +
+ +
+ +Farming + +A strategy in which investors provide liquidity to a DeFi protocol in exchange for rewards in the form of additional cryptocurrency or governance tokens. Farming typically involves providing liquidity to a liquidity pool and earning a share of the trading fees generated by the pool. Yield farming is a type of farming strategy. + +
+ +
+ +Annual Percentage Yield (APY) + +Represents the total amount of interest earned on a deposit or investment account over one year, including the effect of compounding. + +
+ +
+ +Annual Percentage Rate (APR) + +Represents the annual cost of borrowing money, including the interest rate and any fees or charges associated with the loan, expressed as a percentage. + +
+ +
+ +Liquidity Pools (LPs) + +Liquidity Pools (LPs) are pools of tokens that are locked in a smart contract on a decentralized exchange (DEX) in order to facilitate the trading of those tokens. LPs provide liquidity to the DEX and allow traders to exchange tokens without needing a counterparty, while LP providers earn a share of the trading fees in exchange for providing liquidity. + +
+ +
+ +Yield Farming + +A strategy in which investors provide liquidity to a DeFi protocol in exchange for rewards in the form of additional cryptocurrency or governance tokens. Yield farming is designed to incentivize users to contribute to the growth and adoption of a DeFi protocol. + +
+ +### Data Science Terminology + +
+ +AI + +AI stands for Artificial Intelligence. It refers to the development of computer systems that can perform tasks that would typically require human intelligence to complete. AI technologies enable computers to learn, reason, and adapt in a way that resembles human cognition. + +
+ +
+ +Machine learning + +Machine learning is a subfield of artificial intelligence (AI) that involves teaching computers to learn from data, without being explicitly programmed. In other words, it is a way for machines to automatically learn and improve from experience, without being explicitly told what to do in every situation. + +
+ + + diff --git a/discover/networks/README.md b/discover/networks/README.md new file mode 100644 index 00000000..9af3bfb8 --- /dev/null +++ b/discover/networks/README.md @@ -0,0 +1,76 @@ +--- +title: Supported Networks +description: All the public networks the Ocean Protocol contracts are deployed to. +--- + +# Networks + +Ocean Protocol contracts are deployed on multiple public networks. You can always find the most up-to-date deployment addresses for all individual contracts in the [address.json](https://github.com/oceanprotocol/contracts/blob/v4main/addresses/address.json). + +In each network, whether it's the Ethereum mainnet, a testnet, or the Polygon/Matic network, you'll need ETH or Matic to pay for gas and OCEAN for certain actions on the Ocean Protocol network. The Ethereum mainnet and the Polygon network are both live networks and the tokens on these networks have real value. However, the tokens on the test networks are not of real value and are only used for testing purposes. You can obtain testnet ETH and OCEAN tokens from faucets, which are services that provide small amounts of tokens for free. + +### Ethereum Mainnet + +The Ethereum mainnet is a production network, which means that it is a live and operational network that handles real transactions and has actual economic value. To connect to the Ethereum mainnet using a wallet such as MetaMask, you can click on the network name dropdown and select Ethereum mainnet from the list of available networks. + +
| | |
| ----------- | ------------------------------------------------ |
| Gas Token | ETH (Native token) |
| OCEAN Token | `0x967da4048cD07aB37855c090aAF366e4ce1b9F48` |
| Explorer | [https://etherscan.io](https://etherscan.io) |
+ +### Polygon Mainnet + +Ocean Protocol is also deployed to Polygon Mainnet, which is another production network. The native token of Polygon Mainnet is MATIC. If you cannot find Polygon Mainnet as a predefined network in your wallet, you can manually connect to it by following Polygon's [guide](https://wiki.polygon.technology/docs/develop/metamask/config-polygon-on-metamask/#add-the-polygon-network-manually), which provides step-by-step instructions for connecting to Polygon Mainnet. + +
| | |
| --------- | ---------------------------------------------------- |
| Gas Token | Matic (Native token) |
| OCEAN | `0x282d8efCe846A88B159800bd4130ad77443Fa1A1` |
| Explorer | [https://polygonscan.com](https://polygonscan.com) |
+ +**Bridge** + +Check our Polygon Bridge [guide](bridges.md) to learn how you can deposit, withdraw and send tokens. + +### Binance Smart Chain + +Ocean Protocol is also deployed to Binance Smart Chain (BSC), which is another production network. The native token of the Binance Smart Chain is BNB, which is the token of the Binance exchange. If Binance Smart Chain is not listed as a predefined network in your wallet, you can manually connect to it by following Binance's [guide](https://academy.binance.com/en/articles/connecting-metamask-to-binance-smart-chain), which provides detailed instructions on how to connect to Binance Smart Chain. + +
| | |
| --------- | ---------------------------------------------------- |
| Gas Token | BSC BNB (Native token) |
| OCEAN | `0xdce07662ca8ebc241316a15b611c89711414dd1a` |
| Explorer | [https://bscscan.com/](https://bscscan.com/) |
+ +**Bridge** + +Check our BSC Bridge [guide](bridges.md#binance-smart-chain-bsc-bridge) to learn how you can deposit, withdraw and send tokens. + +### Energy Web Chain + +Ocean Protocol is also deployed to [Energy Web Chain](https://energy-web-foundation.gitbook.io/energy-web/technology/trust-layer-energy-web-chain), which is another production network. The native token of the Energy Web Chain is EWT. If you cannot find Energy Web Chain as a predefined network in your wallet, you can manually connect to it by following this [guide](https://energy-web-foundation.gitbook.io/energy-web/how-tos-and-tutorials/connect-to-energy-web-chain-main-network-with-metamash). + +
| | |
| --------- | -------------------------------------------------------------------- |
| Gas Token | Energy Web Chain EWT (Native token) |
| OCEAN | `0x593122aae80a6fc3183b2ac0c4ab3336debee528` |
| Explorer | [https://explorer.energyweb.org/](https://explorer.energyweb.org/) |
+ +**Bridge** + +To bridge assets between Energy Web Chain and Ethereum mainnet, you can use [this](https://bridge.carbonswap.exchange/) bridge. + +### Moonriver + +Ocean Protocol is also deployed to [Moonriver](https://docs.moonbeam.network/builders/get-started/networks/moonriver/), which is another production network. The native token of Moonriver is MOVR. If Moonriver is not listed as a predefined network in your wallet, you can manually connect to it by following this [guide](https://docs.moonbeam.network/builders/get-started/networks/moonriver/#connect-metamask). + +
| | |
| --------- | ------------------------------------------------------------------------------------------------------------ |
| Gas Token | Moonriver MOVR (Native token) |
| OCEAN | `0x99C409E5f62E4bd2AC142f17caFb6810B8F0BAAE` |
| Explorer | [https://blockscout.moonriver.moonbeam.network](https://blockscout.moonriver.moonbeam.network) |
+ +**Bridge** + +To bridge assets between Moonriver and Ethereum mainnet, you can use [this](https://anyswap.exchange/#/bridge) bridge. + +### Görli + +Ocean Protocol is deployed on the Görli test network, which is used for testing and experimentation. Tokens on Görli do not hold real economic value, as it is a non-production network. To connect to Görli using a wallet like MetaMask, simply click on the network name dropdown and select _Goerli_ from the list of available networks. + +
| | |
| ----------- | ---------------------------------------------------------- |
| Gas Token | Görli ETH (Native token) |
| Görli ETH | Faucet. You may find others by searching. |
| Görli OCEAN | Faucet |
| OCEAN | `0xCfDdA22C9837aE76E0faA845354f33C62E03653a` |
| Explorer | [https://goerli.etherscan.io](https://goerli.etherscan.io) |
+ +### Mumbai + +Ocean Protocol is deployed on Mumbai, the Matic/Polygon test network, which is designed for testing and experimentation purposes. Tokens in Mumbai do not hold any real economic value, as it is not a production network. To connect to Mumbai using a wallet like MetaMask, you can select "Mumbai" from the network dropdown list. + +If Mumbai is not listed as a predefined network in your wallet, you can connect to it manually by following [Matic's guide](https://wiki.polygon.technology/docs/develop/metamask/config-polygon-on-metamask/). + +
| | |
| ------------ | ------------------------------------------------------------ |
| Gas Token | Mumbai MATIC (Native token) |
| Mumbai MATIC | Faucet. You may find others by searching. |
| Mumbai OCEAN | Faucet |
| OCEAN | `0xd8992Ed72C445c35Cb4A2be468568Ed1079357c8` |
| Explorer | [https://mumbai.polygonscan.com](https://mumbai.polygonscan.com) |
+ +### Sepolia + +Ocean Protocol is deployed on the Sepolia test network, which is designed for testing and experimentation purposes. Tokens in Sepolia do not hold any real economic value, as it is not a production network. To connect to Sepolia using a wallet like MetaMask, you can select "Sepolia" from the network dropdown list(enable "Show test networks"). + +
| | |
| ------------- | ------------------------------------------------------------ |
| Gas Token | Sepolia ETH (Native token) |
| Sepolia ETH | Faucet |
| Sepolia OCEAN | Faucet |
| OCEAN | `0x1B083D8584dd3e6Ff37d04a6e7e82b5F622f3985` |
| Explorer | [https://sepolia.etherscan.io/](https://sepolia.etherscan.io/) |
diff --git a/core-concepts/networks/bridges.md b/discover/networks/bridges.md similarity index 83% rename from core-concepts/networks/bridges.md rename to discover/networks/bridges.md index f4c0b547..2b04aa01 100644 --- a/core-concepts/networks/bridges.md +++ b/discover/networks/bridges.md @@ -14,23 +14,26 @@ We suggest using the following solutions to transfer Ocean tokens between Ethere To transfer Ocean tokens to and from the Binance Smart Chain, we recommend using the [Binance Bridge](https://www.bnbchain.org/en/bridge). BSC offers various options such as withdrawing crypto from [Binance](https://www.binance.com/en) and utilizing the [Binance Bridge](https://www.bnbchain.org/en/bridge). You can refer to the Binance Academy article "[How to Get Started with BSC](https://academy.binance.com/en/articles/how-to-get-started-with-binance-smart-chain-bsc)" for more information. {% hint style="warning" %} -In case you opt for an alternative bridge option and intend to transfer tokens to Binance, it is crucial to ensure that the contract address you are sending the tokens to is correct. +In case you opt for an alternative bridge option and intend to transfer tokens to Binance, it is **crucial** to ensure that the contract address you are sending the tokens to is correct. -#### Binance deposit +Binance deposit {% endhint %} ## Polygon (ex Matic) Bridge -[![How to Get mOCEAN Youtube Video](https://img.youtube.com/vi/W5eIipUHl-w/0.jpg)](https://www.youtube.com/watch?v=W5eIipUHl-w) - - The Polygon Network (previously known as Matic) offers a [bridge](https://wallet.polygon.technology/bridge/), which lets you easily transfer digital assets between Ethereum and Polygon blockchains and a dedicated [wallet](https://wallet.polygon.technology/) designed for this purpose, which can be linked to your account through Metamask or other compatible wallets. + + +If you prefer a video tutorial, here is one available for you. Otherwise, you can follow the steps below. + +{% embed url="https://www.youtube.com/watch?v=W5eIipUHl-w" %} + All you need to do is click on the [wallet](https://wallet.polygon.technology/) link, select your preferred method of connection, and log in to get started. In this guide, we'll be using Metamask to connect the wallet. -
Polygon login options

Login options

+
Polygon login options

Login options

-You might come across the name "Matic" in some places instead of "Polygon" because the network is still using its old brand name in certain instances. Don't worry though, it's the same network whether you see Matic or Polygon. +You might come across the name "Matic" in some places instead of "Polygon" because the network is still using its old brand name in certain instances. Don't worry though, it's the same network whether you see Matic or Polygon. Check out our [blog post](https://blog.oceanprotocol.com/ocean-on-polygon-network-8abad19cbf47) for more details. @@ -38,15 +41,15 @@ Check out our [blog post](https://blog.oceanprotocol.com/ocean-on-polygon-networ When you access the wallet's main page, you'll be able to view all the tokens you possess on the Polygon Mainnet. If you want to deposit tokens (i.e., transfer them from the Ethereum Mainnet), there are two ways to do it: you can either click the "deposit" button for a specific token or use the "Move funds from Ethereum to Polygon" option. -![Main wallet page](../../.gitbook/assets/polygon-wallet-page.png) +![Main wallet page](../../.gitbook/assets/wallet/polygon-wallet-page.png) In case you are unable to find the Ocean token in the list while depositing, simply click on "Manage token list" and enable the Polygon Tokens option, which contains a greater number of listed tokens. This will add Ocean to the tokens list. -![Ocean on Polygon](../../.gitbook/assets/polygon-ocean.png) +![Ocean on Polygon](../../.gitbook/assets/wallet/polygon-ocean.png) Both of these options will redirect you to the bridge interface. If you select the second option, you'll need to use the dropdown menu to choose the token that you wish to transfer from the Ethereum Mainnet. -![Bridge interface](../../.gitbook/assets/polygon-bridge.png) +![Bridge interface](../../.gitbook/assets/wallet/polygon-bridge.png) Select the number of tokens you want to transfer and hit the "Transfer" button. The bridge interface provided by Polygon will guide you through all the necessary steps, including signing two transactions on the Ethereum Mainnet. The first transaction involves giving permission for the tokens to be traded on Polygon's bridge, while the second transaction is the actual deposit. @@ -62,7 +65,7 @@ Unlike the first two cases where transactions are signed on the Ethereum Mainnet The easiest one is to go to the [polygon network explorer](https://polygonscan.com/) and tap the "Add polygon network" button. -
+
Alternatively, you can manually configure the network on Metamask by using the following parameters. To learn how to set up a custom network in Metamask using these values, you can refer to our guide. @@ -74,4 +77,4 @@ Alternatively, you can manually configure the network on Metamask by using the f | Currency Symbol | `MATIC` | | Block Explorer URL | [`https://polygonscan.com`](https://polygonscan.com) | -Follow our guide to learn how to use those values to [set up a custom network in MetaMask](../metamask-setup.md#set-up-custom-network). +Follow our guide to learn how to use those values to [set up a custom network in MetaMask](../wallets/metamask-setup.md#set-up-custom-network). diff --git a/discover/ocean-101.md b/discover/ocean-101.md new file mode 100644 index 00000000..42f62fd5 --- /dev/null +++ b/discover/ocean-101.md @@ -0,0 +1,31 @@ +# Ocean 101 + +
_Figure: Let's see how it works_
+ +## How Does Ocean Work? + +Ocean Protocol utilizes a combination of blockchain technology, decentralized networks, and cryptographic techniques to facilitate secure and privacy-preserving data sharing. Here's an overview of how Ocean works: + +1. **Asset Registration**: Data providers register their data assets on the Ocean blockchain, providing metadata that describes the asset, its usage terms, and pricing information. This metadata is stored on-chain and can be accessed by potential data consumers. +2. **Discovery and Access Control**: Data consumers can discover available data assets through decentralized metadata services like Aquarius. Access control mechanisms, such as smart contracts, verify the consumer's permissions and handle the transfer of data access tokens. +3. **Secure Data Exchange**: When a data consumer purchases access to a data asset, the asset's metadata, and access instructions are encrypted by the data provider using the Provider service. The encrypted asset is then securely transferred to the consumer, who can decrypt and utilize it without revealing the asset's URL. +4. [**Compute-to-Data**](../developers/compute-to-data/README.md) **(C2D)**: Ocean Protocol supports C2D capabilities, allowing data consumers to perform computations on data assets without direct access to the underlying data. The compute operations are executed in a secure and controlled environment, ensuring data privacy and compliance. +5. **Incentives and Governance**: Ocean Protocol incorporates tokenomics and a governance framework to incentivize participants and ensure the sustainability and evolution of the ecosystem. Participants can earn and stake Ocean tokens (OCEAN) for veOCEANs, curate data, contribute to the network and participate in governance decisions. + +Ocean Protocol also combines advanced technologies and web components to create a robust and efficient data ecosystem. + +
_Figure: Ocean architectural overview_
+ +Powerful libraries such as [Ocean.js](../developers/ocean.js/README.md) (JavaScript) and [Ocean.py](../developers/ocean.py/README.md) (Python) facilitate seamless integration and interaction with the protocol, offering a wide range of functionalities. + +Ocean Protocol incorporates middleware components that enhance efficiency and streamline interactions. Components such as [Aquarius](../developers/aquarius/README.md) act as a metadata cache, improving search efficiency by caching on-chain data into Elasticsearch while [Provider](../developers/provider/README.md) plays a crucial role in various ecosystem operations, assisting in asset downloading, handling encryption of [Decentralized Data Objects](../developers/ddo-specification.md) (DDOs), and facilitating communication with the operator-service for Compute-to-Data jobs. And finally, the [Subgraph](../developers/subgraph/README.md), an off-chain service leveraging GraphQL, offers efficient access to information related to datatokens, users, and balances. + +These libraries and middleware components contribute to efficient data discovery and secure interactions within the Ocean Protocol ecosystem. + +By leveraging these tools and technologies, developers can harness the power of decentralized data while creating innovative applications and unlocking the true value of data assets. + +
_Figure: Build dApps with Ocean_
+ +Ocean Protocol gives people and organizations the power to unleash the true value of their data. With its decentralized marketplaces, rock-solid data-sharing technologies, and privacy protection measures, Ocean Protocol opens the door for collaboration, sparks innovation, and encourages responsible and ethical data usage. + +It's all about making data work for everyone in a fair and transparent data economy. diff --git a/discover/wallets-and-ocean-tokens.md b/discover/wallets-and-ocean-tokens.md new file mode 100644 index 00000000..bc349b17 --- /dev/null +++ b/discover/wallets-and-ocean-tokens.md @@ -0,0 +1,38 @@ +--- +description: >- + How to use a crypto wallet to check your OCEAN token balance and send OCEAN + Tokens to others +--- + +# Manage Your OCEAN Tokens + +If you don't see any Ocean Tokens in your crypto wallet software 🔎 (e.g. MetaMask or MyEtherWallet), don't worry! It might not know how to manage Ocean Tokens yet. + +### Token Information + +Almost all ERC-20 wallets require these values for adding a custom token: + +
| Network name | Contract Address | Symbol | Decimals |
| --- | --- | --- | --- |
| Mainnet | `0x967da4048cD07aB37855c090aAF366e4ce1b9F48` | OCEAN | 18 |
| Polygon (ex Matic) | `0x282d8efCe846A88B159800bd4130ad77443Fa1A1` | mOCEAN | 18 |
| BSC (Binance Smart Chain) | `0xdce07662ca8ebc241316a15b611c89711414dd1a` | OCEAN | 18 |
| Görli | `0xCfDdA22C9837aE76E0faA845354f33C62E03653a` | OCEAN | 18 |
| Mumbai | `0xd8992Ed72C445c35Cb4A2be468568Ed1079357c8` | OCEAN | 18 |
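If you want to double-check an address before importing it, you can call the token contract directly over JSON-RPC: the ERC-20 `decimals()` function has the well-known selector `0x313ce567`. A minimal sketch; `<YOUR_RPC_URL>` stands for any Ethereum Mainnet RPC endpoint you have access to:

```bash
# Call decimals() (selector 0x313ce567) on the Mainnet OCEAN contract.
curl -s -X POST <YOUR_RPC_URL> \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","id":1,"method":"eth_call","params":[{"to":"0x967da4048cD07aB37855c090aAF366e4ce1b9F48","data":"0x313ce567"},"latest"]}'
# A genuine OCEAN deployment returns 18, i.e. a "result" ending in ...0012.
```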
+ +The [OCEAN Token page](https://oceanprotocol.com/token) at oceanprotocol.com has further details. + +### MetaMask + +1. Make sure MetaMask is connected to the Ethereum Mainnet. +2. Select the account you want to manage. +3. Scroll down until the `Import Tokens` link is visible, then click on it. +4. Click on `Custom Tokens`. +5. Paste the Ocean Token contract address listed above into the _Token Contract Address_ field. The other two fields should auto-fill. If not, add `OCEAN` for the symbol and `18` for the precision. +6. Click `Add custom token`. +7. Click `Import Tokens`. + +If you prefer visual demonstrations, we have prepared a visual demo that illustrates the steps mentioned above. + +{% embed url="https://app.arcade.software/share/yHiKKN336QGdAkhTlsIh" fullWidth="false" %} +{% endembed %} + +MetaMask should now show your Ocean Token (OCEAN) balance, and when you're looking at that, there should be a `Send` button to send Ocean Tokens to others. For help with that, see [the MetaMask docs about how to send tokens](https://metamask.zendesk.com/hc/en-us/articles/360015488931-How-to-Send-Tokens). + +### Other Wallet Software + +Do a web search to find out how to add a custom ERC-20 token to the wallet software you're using. diff --git a/discover/wallets/README.md b/discover/wallets/README.md new file mode 100644 index 00000000..45ac4d2a --- /dev/null +++ b/discover/wallets/README.md @@ -0,0 +1,35 @@ +--- +description: Fundamental knowledge of using ERC-20 crypto wallets. +--- + +# Wallets + +Ocean Protocol users require an ERC-20 compatible wallet to manage their OCEAN and ETH tokens. In this guide, we will provide some recommendations for different wallet options. + +
+ +### What is a wallet? + +In the blockchain world, a wallet is a software program that stores cryptocurrencies secured by private keys to allow users to interact with the blockchain network. Private keys are used to sign transactions and provide proof of ownership for the digital assets stored on the blockchain. Wallets can be used to send and receive digital currencies, view account balances, and monitor transaction history. There are several types of wallets, including desktop wallets, mobile wallets, hardware wallets, and web-based wallets. Each type of wallet has its own unique features, advantages, and security considerations. + +### Recommendations + +* **Easiest:** Use the [MetaMask](https://metamask.io/) browser plug-in. +* **Still easy, but more secure:** Get a [Trezor](https://trezor.io/) or [Ledger](https://www.ledger.com/) hardware wallet, and use MetaMask to interact with it. +* The [OCEAN Token page](https://oceanprotocol.com/token) at oceanprotocol.com lists some other possible wallets. + +### Related Terminology + +When you set up a new wallet, it might generate a **seed phrase** for you. Store that seed phrase somewhere secure and non-digital (e.g. on paper in a safe). It's extremely secret and sensitive. Anyone with your wallet's seed phrase could spend all the Ether and Ocean Tokens in all the accounts in your wallet. + +Once your wallet is set up, it will have one or more **accounts**. + +Each account has several **balances**, e.g. an Ether balance, an Ocean Token balance, and maybe other balances. All balances start at zero. + +An account's Ether balance might be 7.1 ETH in the Ethereum Mainnet, 2.39 ETH in Görli testnet. You can move ETH from one network to another only with a special setup exchange or bridge. Also, you can't transfer tokens from networks holding value such as Ethereum mainnet to networks not holding value, i.e., testnets like Görli. The same is true of OCEAN token balances. + +Each account has one **private key** and one **address**. The address can be calculated from the private key. You must keep the private key secret because it's what's needed to spend/transfer ETH and OCEAN (or to sign transactions of any kind). You can share the address with others. In fact, if you want someone to send some ETH or OCEAN to an account, you give them the account's address. + +{% hint style="info" %} +Unlike traditional pocket wallets, crypto wallets don't actually store ETH or OCEAN. They store private keys. +{% endhint %} diff --git a/orientation/metamask-setup.md b/discover/wallets/metamask-setup.md similarity index 71% rename from orientation/metamask-setup.md rename to discover/wallets/metamask-setup.md index b7ddc7da..40be3325 100644 --- a/orientation/metamask-setup.md +++ b/discover/wallets/metamask-setup.md @@ -1,36 +1,38 @@ --- -description: Tutorial about how to set up MetaMask for Chrome. +description: How to set up a MetaMask wallet on Chrome --- # Set Up MetaMask Wallet -> MetaMask can also be used with a TREZOR or Ledger hardware wallet but we don't cover those options below; see [the MetaMask documentation](https://metamask.zendesk.com/hc/en-us/articles/360020394612-How-to-connect-a-Trezor-or-Ledger-Hardware-Wallet). +Before you can publish or purchase assets, you will need a crypto wallet. As Metamask is one of the most popular crypto wallets around, we made a tutorial to show you how to get started with Metamask to use Ocean's tech. 
-### MetaMask Set Up Steps +> MetaMask can be connected with a TREZOR or Ledger hardware wallet but we don't cover those options below; see [the MetaMask documentation](https://metamask.zendesk.com/hc/en-us/articles/360020394612-How-to-connect-a-Trezor-or-Ledger-Hardware-Wallet). + +### Set up 1. Go to the [Chrome Web Store for extensions](https://chrome.google.com/webstore/category/extensions) and search for MetaMask. -![metamask-chrome-store]() +![metamask-chrome-store](../../.gitbook/assets/wallet/metamask-chrome-extension.png) * Install MetaMask. The wallet provides a friendly user interface that will help you through each step. MetaMask gives you two options: importing an existing wallet or creating a new one. Choose to `Create a Wallet`: -![Create a wallet]() +![Create a wallet](../../.gitbook/assets/wallet/create-new-metamask-wallet.png) * In the next step create a new password for your wallet. Read through and accept the terms and conditions. After that, MetaMask will generate Secret Backup Phrase for you. Write it down and store it in a safe place. -![Secret Backup Phrase]() +![Secret Backup Phrase](../../.gitbook/assets/wallet/secret-backup-phrase.png) * Continue forward. On the next page, MetaMask will ask you to confirm the backup phrase. Select the words in the correct sequence: -![Confirm secret backup phrase]() +![Confirm secret backup phrase](../../.gitbook/assets/wallet/confirm-backup-phrase.png) * Voila! Your account is now created. You can access MetaMask via the browser extension in the top right corner of your browser. -![MetaMask browser extension]() +![MetaMask browser extension](../../.gitbook/assets/wallet/metamask-browser-extension.png) * You can now manage Ether and Ocean Tokens with your wallet. You can copy your account address to the clipboard from the options. When you want someone to send Ether or Ocean Tokens to you, you will have to give them that address. It's not a secret. -![Manage tokens]() +![Manage tokens](../../.gitbook/assets/wallet/manage-tokens.png) You can also watch our [tutorial video snippets](https://www.youtube.com/playlist?list=PL\_dn0wVs9kWolBCbtHaFxsi408cumOeth) if you want more help setting up MetaMask. @@ -40,9 +42,9 @@ Sometimes it is required to use custom or external networks in MetaMask. We can Open the Settings menu and find the `Networks` option. When you open it, you'll be able to see all available networks your MetaMask wallet currently use. Click the `Add Network` button. -![Add custom/external network]() +![Add custom/external network](../../.gitbook/assets/wallet/metamask-add-network.png) -There are a few empty inputs we need to fill: +There are a few empty inputs we need to fill in: * **Network Name:** this is the name that MetaMask is going to use to differentiate your network from the rest. * **New RPC URL:** to operate with a network we need an endpoint (RPC). This can be a public or private URL. diff --git a/infrastructure/README.md b/infrastructure/README.md new file mode 100644 index 00000000..823883a4 --- /dev/null +++ b/infrastructure/README.md @@ -0,0 +1,21 @@ +--- +description: Learn how to deploy Ocean components in your environment. +cover: ../.gitbook/assets/cover/infrastructure_banner.png +coverY: 0 +--- + +# 🔨 Infrastructure + +There are many ways in which the components can be deployed, from simple configurations used for development and testing to complex configurations, used for production systems. 
+
+All the Ocean Protocol components ([Provider](../developers/provider/README.md), [Aquarius](../developers/aquarius/README.md), [Subgraph](../developers/subgraph/README.md)) are designed to run in Docker containers, on a Linux operating system. For simple configurations, we rely on Docker Engine and Docker Compose to deploy and run our components, while for complex configurations we use Kubernetes. The guides included in this section will present both deployment options.
+
+Please note that deploying the Ocean components requires a good understanding of:
+
+* the Linux operating system
+* Docker Engine
+* Docker Compose or Kubernetes (depending on the configuration chosen for the component deployment)
+
+Please note that although Ocean Marketplace is not a core component of our stack but rather an example of what can be achieved with our technology, we have included a guide in this section on how to deploy it.
+
+All components need to be deployed on a server, so we also included a guide on how to install and configure a server with all the necessary tools.
diff --git a/building-with-ocean/compute-to-data/compute-to-data-docker-registry.md b/infrastructure/compute-to-data-docker-registry.md
similarity index 83%
rename from building-with-ocean/compute-to-data/compute-to-data-docker-registry.md
rename to infrastructure/compute-to-data-docker-registry.md
index d3589e01..c1e24cf7 100644
--- a/building-with-ocean/compute-to-data/compute-to-data-docker-registry.md
+++ b/infrastructure/compute-to-data-docker-registry.md
@@ -5,9 +5,9 @@ description: >-
 algorithms in a C2D environment.
 ---
-# Setting up private docker registry
+# C2D - Private Docker Registry
-The document is intended for a production setup. The tutorial provides the steps to setup a private docker registry on the server for the following scenarios:
+This document is intended for a production setup. The tutorial provides the steps to set up a private Docker registry on the server for the following scenarios:
* Allow registry access only to the C2D environment.
* Anyone can pull the image from the registry but, only authenticated users will push images to the registry.
@@ -22,10 +22,10 @@ _Note: Please change the domain names to your application-specific domain names.
#### 1.1 Prerequisites
-* Running docker environment on the linux server.
+* A Docker environment running on a Linux server.
* Docker compose is installed.
* C2D environment is running.
-* The domain names is mapped to the server hosting the registry.
+* The domain names are mapped to the server hosting the registry.
#### 1.2 Generate certificates
```bash
sudo certbot certonly --standalone --cert-name example.com -d example.com
```
-_Note: Do check the access right of the files/directories where certificates are stored. Usually, they are at `/etc/letsencrypt/`._
+_Note: Check the access rights of the files/directories where certificates are stored. Usually, they are at `/etc/letsencrypt/`._
-#### 1.3 Generate password file
+#### 1.3 Generate a password file
Replace content in `<>` with appropriate content.
@@ -48,7 +48,7 @@ docker run \
#### 1.4 Docker compose template file for registry
-Copy the below yml content to `docker-compose.yml` file and replace content in `<>`.
+Copy the below `yml` content to the `docker-compose.yml` file and replace content in `<>`.
```yml
version: '3'
@@ -114,9 +114,9 @@ http {
}
```
-#### 1.6 Create kubernetes secret in C2D server
+#### 1.6 Create Kubernetes secret in C2D server
-Login into Compute-to-data enviroment and run the following command with appropriate credentials:
+Log in to the compute-to-data environment and run the following command with the appropriate credentials:
```bash
kubectl create secret docker-registry regcred --docker-server=example.com --docker-username= --docker-password= --docker-email= -n ocean-compute
```
#### 1.7 Update operator-engine configuration
-Add `PULL_SECRET` property with value `regcred` in the [operator.yml](https://github.com/oceanprotocol/operator-engine/blob/main/kubernetes/operator.yml) file of operator-engine configuration. For more detials on operator-engine properties refer this [link](https://github.com/oceanprotocol/operator-engine/blob/177ca7185c34aa2a503afbe026abb19c62c69e6d/README.md?plain=1#L106)
+Add the `PULL_SECRET` property with the value `regcred` in the [operator.yml](https://github.com/oceanprotocol/operator-engine/blob/main/kubernetes/operator.yml) file of the operator-engine configuration. For more details on operator-engine properties, refer to the [operator-engine readme](https://github.com/oceanprotocol/operator-engine/blob/v4main/README.md).
Apply updated operator-engine configuration.
@@ -133,20 +133,20 @@ kubectl config set-context --current --namespace ocean-compute
 kubectl apply -f operator-engine/kubernetes/operator.yml
```
-### Steup 2: Allow anyonymous `pull` operations
+### Setup 2: Allow anonymous `pull` operations
To implement this use case, 2 domains will be required:
-* **example.com**: This domain will allow image push/pull operations only to the authenticated users.
+* **example.com**: This domain will only allow image push/pull operations from authenticated users.
* **readonly.example.com**: This domain will allow only image pull operations
_Note: Please change the domain names to your application-specific domain names._
#### 2.1 Prerequisites
-* Running docker environment on the linux server.
+* A Docker environment running on the Linux server.
* Docker compose is installed.
-* 2 domain names is mapped to the same server IP address.
+* 2 domain names are mapped to the same server IP address.
#### 2.2 Generate certificates
```bash
sudo certbot certonly --standalone --cert-name readonly.example.com -d readonly.example.com
```
_Note: Do check the access right of the files/directories where certificates are stored. Usually, they are at `/etc/letsencrypt/`._
-#### 2.3 Generate password file
+#### 2.3 Generate a password file
Replace content in `<>` with appropriate content.
@@ -170,7 +170,7 @@ docker run \
#### 2.4 Docker compose template file for registry
-Copy the below yml content to `docker-compose.yml` file and replace content in `<>`. Here, we will be creating two services of the docker registry so that anyone can `pull` the images from the registry but, only authenticated users can `push` the images.
+Copy the below `yml` content to the `docker-compose.yml` file and replace content in `<>`. Here, we will create two Docker registry services so that anyone can `pull` images from the registry, but only authenticated users can `push` images.
```yml
version: '3'
@@ -305,10 +305,10 @@ docker image pull readonly.example.com/my-algo:latest
```
#### Next step
-You can publish an algorithm asset with the metadata containing registry URL, image, and tag information to enable users to run C2D jobs.
+You can publish an algorithm asset with the metadata containing the registry URL, image, and tag information to enable users to run C2D jobs.
### Further references
* [Setup Compute-to-Data environment](compute-to-data-minikube.md)
-* [Writing algorithms](compute-to-data-algorithms.md)
+* [Writing algorithms](../developers/compute-to-data/compute-to-data-algorithms.md)
* [C2D example](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/c2d-flow.md)
diff --git a/building-with-ocean/compute-to-data/compute-to-data-minikube.md b/infrastructure/compute-to-data-minikube.md
similarity index 52%
rename from building-with-ocean/compute-to-data/compute-to-data-minikube.md
rename to infrastructure/compute-to-data-minikube.md
index 948401df..e4b7d7cd 100644
--- a/building-with-ocean/compute-to-data/compute-to-data-minikube.md
+++ b/infrastructure/compute-to-data-minikube.md
@@ -1,15 +1,41 @@
---
title: Minikube Compute-to-Data Environment
-description:
---
-## Requirements
+# Deploying C2D
-- functioning internet-accessable provider service
-- machine capable of running compute (e.g. we used a machine with 8 CPUs, 16 GB Ram, 100GB SSD and fast internet connection)
-- Ubuntu 20.04
+This chapter will present how to deploy the C2D component of the Ocean stack. As mentioned in the [C2D Architecture chapter](../developers/compute-to-data/#architecture-and-overview-guides), the Compute-to-Data component uses Kubernetes to orchestrate the creation and deletion of the pods in which the C2D jobs are run.
-## Install Docker and Git
+For those who do not have a Kubernetes environment available, this guide also includes instructions on how to install Minikube, a lightweight Kubernetes implementation that creates a VM on your local machine and deploys a simple cluster containing only one node. If you already have a Kubernetes environment in place, please skip directly to step 4 of this guide.
+
+### Requirements
+
+* Communications: a functioning internet-accessible provider service
+* Hardware: a server capable of running compute jobs (e.g. we used a machine with 8 CPUs, 16 GB RAM, a 100GB SSD, and a fast internet connection). See [this guide](setup-server.md) for how to create a server;
+* Operating system: Ubuntu 22.04 LTS
+
+### Steps
+
+1. [Install Docker and Git](compute-to-data-minikube.md#install-docker-and-git)
+2. [Install Minikube](compute-to-data-minikube.md#install-minikube)
+3. [Start Minikube](compute-to-data-minikube.md#start-minikube)
+4. [Install the Kubernetes command line tool (kubectl)](compute-to-data-minikube.md#install-the-kubernetes-command-line-tool-kubectl)
+5. [Run the IPFS host (optional)](compute-to-data-minikube.md#run-the-ipfs-host-optional)
+6. [Update the storage class](compute-to-data-minikube.md#update-the-storage-class)
+7. [Download and Configure Operator Service](compute-to-data-minikube.md#download-and-configure-operator-service)
+8. [Download and Configure Operator Engine](compute-to-data-minikube.md#download-and-configure-operator-engine)
+9. [Create namespaces](compute-to-data-minikube.md#create-namespaces)
+10. [Deploy Operator Service](compute-to-data-minikube.md#deploy-operator-service)
+11. [Deploy Operator Engine](compute-to-data-minikube.md#deploy-operator-engine)
+12. 
[Expose Operator Service](compute-to-data-minikube.md#expose-operator-service) +13. [Initialize the database](compute-to-data-minikube.md#initialize-database) +14. [Update Provider](compute-to-data-minikube.md#update-provider) + +#### Install Docker and Git ```bash sudo apt update @@ -17,16 +43,16 @@ sudo apt install git docker.io sudo usermod -aG docker $USER && newgrp docker ``` -## Install Minikube +#### Install Minikube ```bash wget -q --show-progress https://github.com/kubernetes/minikube/releases/download/v1.22.0/minikube_1.22.0-0_amd64.deb sudo dpkg -i minikube_1.22.0-0_amd64.deb ``` -## Start Minikube +#### Start Minikube -First command is imporant, and solves a [PersistentVolumeClaims problem](https://github.com/kubernetes/minikube/issues/7828). +The first command is important and solves a [PersistentVolumeClaims problem](https://github.com/kubernetes/minikube/issues/7828). ```bash minikube config set kubernetes-version v1.16.0 @@ -37,7 +63,7 @@ Depending on the number of available CPUs, RAM, and the required resources for r For other options to run minikube refer to this [link](https://minikube.sigs.k8s.io/docs/commands/start/) -## Install kubectl +#### Install the Kubernetes command line tool (kubectl) ```bash curl -LO "https://dl.k8s.io/release/$(curl -L -s https://dl.k8s.io/release/stable.txt)/bin/linux/amd64/kubectl" @@ -47,14 +73,17 @@ echo "$(> /etc/hosts' ``` -## Storage class (Optional) +#### Update the storage class -For minikube, you can use the default 'standard' class. +The storage class is used by Kubernetes to create the temporary volumes on which the data used by the algorithm will be stored. -For AWS, please make sure that your class allocates volumes in the same region and zone in which you are running your pods. +Please ensure that your class allocates volumes in the same region and zone where you are running your pods. -We created our own 'standard' class in AWS: +You need to consider the storage class available for your environment. + +For Minikube, you can use the default 'standard' class. + +In AWS, we created our own 'standard' class: ```bash kubectl get storageclass standard -o yaml @@ -96,29 +129,30 @@ volumeBindingMode: Immediate For more information, please visit https://kubernetes.io/docs/concepts/storage/storage-classes/ -## Download and Configure Operator Service +#### Download and Configure Operator Service -Open new terminal and run the command below. +Open a new terminal and run the command below. ```bash git clone https://github.com/oceanprotocol/operator-service.git ``` -Edit `operator-service/kubernetes/postgres-configmap.yaml`. Change `POSTGRES_PASSWORD` to nice long random password. +Edit `operator-service/kubernetes/postgres-configmap.yaml`. Change `POSTGRES_PASSWORD` to a nice long random password. Edit `operator-service/kubernetes/deployment.yaml`. Optionally change: -- `ALGO_POD_TIMEOUT` -- add `requests_cpu` -- add `requests_memory` -- add `limits_cpu` -- add `limits_memory` +* `ALGO_POD_TIMEOUT` +* add `requests_cpu` +* add `requests_memory` +* add `limits_cpu` +* add `limits_memory` ```yaml -... - spec: - containers: - - env: + +--- +spec: + containers: + - env: - name: requests_cpu value: "4" - name: requests_memory @@ -129,28 +163,26 @@ Edit `operator-service/kubernetes/deployment.yaml`. Optionally change: value: "15Gi" - name: ALGO_POD_TIMEOUT value: "3600" -... 
``` -## Download and Configure Operator Engine +#### Download and Configure Operator Engine ```bash git clone https://github.com/oceanprotocol/operator-engine.git ``` -Check the [README](https://github.com/oceanprotocol/operator-engine#customize-your-operator-engine-deployment) section of operator engine to customize your deployment. +Check the [README](https://github.com/oceanprotocol/operator-engine#customize-your-operator-engine-deployment) section of the operator engine to customize your deployment. -At a minimum you should add your IPFS URLs or AWS settings, and add (or remove) notification URLs. +At a minimum, you should add your IPFS URLs or AWS settings, and add (or remove) notification URLs. - -## Create namespaces +#### Create namespaces ```bash kubectl create ns ocean-operator kubectl create ns ocean-compute ``` -## Deploy Operator Service +#### Deploy Operator Service ```bash kubectl config set-context --current --namespace ocean-operator @@ -161,7 +193,7 @@ kubectl create -f operator-service/kubernetes/postgresql-service.yaml kubectl apply -f operator-service/kubernetes/deployment.yaml ``` -## Deploy Operator Engine +#### Deploy Operator Engine ```bash kubectl config set-context --current --namespace ocean-compute @@ -177,7 +209,7 @@ kubectl create -f operator-service/kubernetes/postgres-configmap.yaml kubectl -n ocean-compute apply -f /ocean/operator-engine/kubernetes/egress.yaml ``` -## Expose Operator Service +#### Expose Operator Service ```bash kubectl expose deployment operator-api --namespace=ocean-operator --port=8050 @@ -191,15 +223,15 @@ kubectl -n ocean-operator port-forward svc/operator-api 8050 Alternatively you could use another method to communicate between the C2D Environment and the provider, such as an SSH tunnel. -## Initialize database +#### Initialize database -If your minikube is running on compute.example.com: +If your Minikube is running on compute.example.com: ```bash curl -X POST "https://compute.example.com/api/v1/operator/pgsqlinit" -H "accept: application/json" ``` -## Update Provider +#### Update Provider Update your provider service by updating the `operator_service.url` value in `config.ini` @@ -209,4 +241,3 @@ operator_service.url = https://compute.example.com/ Restart your provider service. -[Watch the explanatory video for more details](https://vimeo.com/580934725) diff --git a/infrastructure/deploying-aquarius.md b/infrastructure/deploying-aquarius.md new file mode 100644 index 00000000..f0a0f821 --- /dev/null +++ b/infrastructure/deploying-aquarius.md @@ -0,0 +1,553 @@ +# Deploying Aquarius + +### About Aquarius + +Aquarius is an off-chain component that caches the asset's metadata published on-chain. By deploying their own instance of Aquarius, developers can control which assets are visible in their DApp. For example, having a custom Aquarius instance allows only the assets from specific addresses to be visible in the DApp. + +This tutorial will provide the steps to deploy Aquarius. Ocean Protocol provides Aquarius Docker images which can be viewed [here](https://hub.docker.com/r/oceanprotocol/aquarius/tags). Visit [this](https://github.com/oceanprotocol/aquarius) page to view the Aquarius source code. + +Aquarius consists of two parts: + +* **API:** The Aquarius API provides a user with a convenient way to access the metadata without scanning the chain itself. +* **Event monitor:** Aquarius continually monitors the chains for MetadataCreated and MetadataUpdated events, processes these events, and adds them to the database. 
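To make the API half of this concrete, here is a sketch of a metadata search against a running Aquarius. The host and the `POST /api/aquarius/assets/query` path are assumptions based on the public v5 deployment, so adjust both to your own instance:

```bash
# Search the cached asset metadata with an Elasticsearch-style query.
curl -s -X POST https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/query \
  -H 'Content-Type: application/json' \
  -d '{"query": {"match": {"metadata.name": "example"}}, "size": 1}'
```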
+ +As mentioned in the [Setup a Server](setup-server.md) document, all Ocean components can be deployed in two configurations: simple, based on Docker Engine and Docker Compose, and complex, based on Kubernetes with Docker Engine. This document will present how to deploy Aquarius in each of these configurations. + +## Deploying Aquarius using Docker Engine and Docker Compose + +This guide will deploy Aquarius, including Elasticsearch as a single systemd service. + +### Prerequisites + +* A server for hosting Aquarius. See [this guide](setup-server.md) for how to create a server; +* Docker Compose and Docker Engine are installed and configured on the server. See [this guide](setup-server.md#install-docker-engine-and-docker-compose) for how to install these products. +* The RPC URLs and API keys for each of the networks to which the Aquarius will be connected. See[ this guide](https://app.gitbook.com/o/mTcjMqA4ylf55anucjH8/s/BTXXhmDGzR0Xgj13fyfM/\~/changes/548/developers/obtaining-api-keys-for-blockchain-access) for how to obtain the URL and the API key. + +### Steps + +#### 1. Create the /etc/docker/compose/aquarius/docker-compose.yml file + +From a terminal console, create /etc/docker/compose/aquarius/docker-compose.yml file, then copy and paste the following content to it. Check the comments in the file and replace the fields with the specific values of your implementation. The following example is for deploying Aquarius for Goerli network. + +For each other network in which you want to deploy Aquarius, add to the file a section similar to "aquarius-events-goerli" included in this example and update the corresponding parameters (i.e. EVENTS\_RPC, OCEAN\_ADDRESS, SUBGRAPH\_URLS) specific to that network. + +```yaml +version: '3.9' +services: + elasticsearch: + image: elasticsearch:8.7.0 + container_name: elasticsearch + restart: on-failure + environment: + ES_JAVA_OPTS: "-Xms512m -Xmx512m" + MAX_MAP_COUNT: "64000" + discovery.type: "single-node" + ELASTIC_PASSWORD: "changeme" + xpack.security.enabled: "false" + xpack.security.http.ssl.enabled: "false" + volumes: + - data:/usr/share/elasticsearch/data + ports: + - 9200:9200 + networks: + - backend + aquarius: + image: oceanprotocol/aquarius:v5.1.2 + container_name: aquarius + restart: on-failure + ports: + - 5000:5000 + networks: + - backend + depends_on: + - elasticsearch + environment: + DB_MODULE: elasticsearch + DB_HOSTNAME: http://elasticsearch + DB_PORT: 9200 + DB_USERNAME: elastic + DB_PASSWORD: changeme + DB_NAME: aquarius + DB_SCHEME: http + DB_SSL : "false" + LOG_LEVEL: "INFO" + AQUARIUS_URL: "http://0.0.0.0:5000" + AQUARIUS_WORKERS : "4" + RUN_AQUARIUS_SERVER: "1" + AQUARIUS_CONFIG_FILE: "config.ini" + EVENTS_ALLOW: 0 + RUN_EVENTS_MONITOR: 0 + ALLOWED_PUBLISHERS: '[""]' + aquarius-events-goerli: + image: oceanprotocol/aquarius:v5.1.2 + container_name: aquarius-events-goerli + restart: on-failure + networks: + - backend + depends_on: + - elasticsearch + environment: + DB_MODULE: elasticsearch + DB_HOSTNAME: http://elasticsearch + DB_PORT: 9200 + DB_USERNAME: elastic + DB_PASSWORD: changeme + DB_NAME: aquarius + DB_SCHEME: http + DB_SSL : "false" + LOG_LEVEL: "INFO" + AQUARIUS_URL: "http://0.0.0.0:5000" + AQUARIUS_WORKERS : "1" + RUN_AQUARIUS_SERVER : "0" + AQUARIUS_CONFIG_FILE: "config.ini" + ALLOWED_PUBLISHERS: '[""]' + NETWORK_NAME: "goerli" + EVENTS_RPC: "https://goerli.infura.io/v3/" + METADATA_UPDATE_ALL : "0" + OCEAN_ADDRESS : 0xcfdda22c9837ae76e0faa845354f33c62e03653a + RUN_EVENTS_MONITOR: 1 + BLOCKS_CHUNK_SIZE: "5000" + 
SUBGRAPH_URLS: "5: https://v4.subgraph.goerli.oceanprotocol.com" +volumes: + data: + driver: local +networks: + backend: + driver: bridge +``` + +#### 2. Create the /etc/systemd/system/docker-compose@aquarius.service file + +Create the _/etc/systemd/system/docker-compose@aquarius.service_ file then copy and paste the following content to it. This example file could be customized if needed. + +```yaml +[Unit] +Description=%i service with docker compose +Requires=docker.service +After=docker.service + +[Service] +Type=oneshot +RemainAfterExit=true +Environment="PROJECT=ocean" +WorkingDirectory=/etc/docker/compose/%i +ExecStartPre=/usr/bin/env docker-compose -p $PROJECT pull +ExecStart=/usr/bin/env docker-compose -p $PROJECT up -d +ExecStop=/usr/bin/env docker-compose -p $PROJECT stop +ExecStopPost=/usr/bin/env docker-compose -p $PROJECT down + + +[Install] +WantedBy=multi-user.target +``` + +#### 3. Reload the systemd manager configuration + +Run the following command to reload the systemd manager configuration + +```bash +sudo systemctl daemon-reload +``` + +Optionally, you can enable the services to start at boot, using the following command: + +```bash +sudo systemctl enable docker-compose@aquarius.service +``` + +#### 4. Start Aquarius service + +To start the Aquarius service, run the following command: + +```bash +sudo systemctl start docker-compose@aquarius.service +``` + +#### 5. Check the service's status + +Check the status of the service by running the following commands: + +```bash +sudo systemctl status docker-compose@aquarius.service +``` + +#### 6. Confirm Aquarius is accessible + +Run the following commands to access Aquarius The output should be similar to the one displayed here. + +
+```bash
+$ curl localhost:9200
+{
+  "name" : "a93d989293ac",
+  "cluster_name" : "docker-cluster",
+  "cluster_uuid" : "Bs16cyCwRCOIbmaBUEj5fA",
+  "version" : {
+    "number" : "8.7.0",
+    "build_flavor" : "default",
+    "build_type" : "docker",
+    "build_hash" : "09520b59b6bc1057340b55750186466ea715e30e",
+    "build_date" : "2023-03-27T16:31:09.816451435Z",
+    "build_snapshot" : false,
+    "lucene_version" : "9.5.0",
+    "minimum_wire_compatibility_version" : "7.17.0",
+    "minimum_index_compatibility_version" : "7.0.0"
+  },
+  "tagline" : "You Know, for Search"
+}
+```
+ +```bash +$ curl localhost:5000 +{"plugin":"module","software":"Aquarius","version":"5.1.2"} +``` + +#### 7. Use Docker CLI to check the Aquarius service's logs + +If needed, use docker CLI to check Aquarius' service logs. + +First, identify the container id: + +```bash +$ docker ps +CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES +355baee34d50 oceanprotocol/aquarius:v5.1.2 "/aquarius/docker-en…" About a minute ago Up About a minute 5000/tcp aquarius-events-goerli +f1f97d6f146f oceanprotocol/aquarius:v5.1.2 "/aquarius/docker-en…" About a minute ago Up About a minute 0.0.0.0:5000->5000/tcp, :::5000->5000/tcp aquarius +a93d989293ac elasticsearch:8.7.0 "/bin/tini -- /usr/l…" About a minute ago Up About a minute 0.0.0.0:9200->9200/tcp, :::9200->9200/tcp, 9300/tcp elasticsearch + +``` + +Then, check the logs from the Aqauarius' Docker containers: + +```bash +$ docker logs aquarius [--follow] +$ docker logs aquarius-events-goerli [--follow] +``` + +## Deploying Aquarius using Kubernetes + +Aquarius depends on the backend database and in this example we will deploy the following resources: + +* Elasticsearch. +* Aquarius ([Deployment](https://kubernetes.io/docs/concepts/workloads/controllers/deployment/)) + +Templates (yaml files) are provided and could be customized based on the environment's specifics. + +### Prerequisites + +* A server for hosting Aquarius. See [this guide](setup-server.md) for how to create a server; +* Kubernetes with Docker Engine is installed and configured on the server. See [this chapter](setup-server.md#install-kubernetes-with-docker-engine) for information on installing Kubernetes. +* The RPC URLs and API keys for each of the networks to which the Aquarius will be connected. See[ this guide](https://app.gitbook.com/o/mTcjMqA4ylf55anucjH8/s/BTXXhmDGzR0Xgj13fyfM/\~/changes/548/developers/obtaining-api-keys-for-blockchain-access) for how to obtain the URL and the API key. + +### Steps + +1. [Deploy Elasticsearch service](deploying-aquarius.md#1.-deploy-elasticsearch) +2. [Deploy Aquarius service](deploying-aquarius.md#2.-deploy-aquarius) + +#### 1. Deploy Elasticsearch + +It is recommended to deploy Elasticsearch through Helm [chart](https://github.com/elastic/cloud-on-k8s). + +a. Once the Elasticsearch pods are running, the database service should be available: + +```bash +$ kubectl port-forward --namespace ocean svc/elasticsearch-master 9200:9200 +Forwarding from 127.0.0.1:9200 -> 9200 +Forwarding from [::1]:9200 -> 9200 +``` + +b. Check that the Elasticsearch service is accessible: + +``` +$ curl localhost:9200 +{ + "name" : "elasticsearch-master-2", + "cluster_name" : "elasticsearch", + "cluster_uuid" : "KMAfL5tVSJWFfmCOklT0qg", + "version" : { + "number" : "8.5.2", + "build_flavor" : "default", + "build_type" : "docker", + "build_hash" : "a846182fa16b4ebfcc89aa3c11a11fd5adf3de04", + "build_date" : "2022-11-17T18:56:17.538630285Z", + "build_snapshot" : false, + "lucene_version" : "9.4.1", + "minimum_wire_compatibility_version" : "7.17.0", + "minimum_index_compatibility_version" : "7.0.0" + }, + "tagline" : "You Know, for Search" +} +``` + +#### 2. Deploy Aquarius + +Aquarius supports indexing multiple chains using a single instance to serve API requests and one instance for each chain that must be indexed. + +
_Figure: Aquarius deployment - multiple chains indexing_
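One quick way to see this multi-chain setup from the outside is to ask a running instance which chains its events components are indexing. The `/api/aquarius/chains/list` path is an assumption based on the Aquarius REST API, so verify it against your version:

```bash
# List the chain IDs a running Aquarius instance is indexing.
curl -s https://v4.aquarius.oceanprotocol.com/api/aquarius/chains/list
# Example shape of the response: {"1": true, "5": true, "80001": true}
```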
+ +The following deployment templates could be used for guidance. Some parameters are [optional](https://github.com/oceanprotocol/aquarius) and the template could be adjusted based on these considerations. Common cases are the deployments for one/multiple Ethereum networks: + +* Mainnet +* Goerli +* Mumbai + +a. Create a YAML file for Aquarius configuration. + +The following templates (annotated) could be edited and used for deployment. + +* [_aquarius-deployment.yaml_](https://github.com/oceanprotocol/aquarius/blob/update-deploy-docs/deployment/aquarius-deployment.yaml) (annotated): this deployment is responsible for serving API requests + +```yaml +apiVersion: apps/v1 +kind: Deployment +metadata: + annotations: + labels: + app: aquarius + name: aquarius +spec: + progressDeadlineSeconds: 600 + replicas: 1 + revisionHistoryLimit: 5 + selector: + matchLabels: + app: aquarius + strategy: + rollingUpdate: + maxSurge: 25% + maxUnavailable: 25% + type: RollingUpdate + template: + metadata: + creationTimestamp: null + labels: + app: aquarius + spec: + containers: + - env: + - name: LOG_LEVEL + value: DEBUG + - name: AQUARIUS_URL + value: http://0.0.0.0:5000 + - name: AQUARIUS_WORKERS + value: "4" + - name: DB_HOSTNAME + value: < ES service hostname > + - name: DB_MODULE + value: elasticsearch + - name: DB_NAME + value: aquarius + - name: DB_PORT + value: "9200" + - name: DB_SCHEME + value: http + - name: DB_USERNAME + value: < ES username > + - name: DB_PASSWORD + value: < ES password > + - name: DB_SSL + value: "false" + - name: RUN_AQUARIUS_SERVER + value: "1" + - name: RUN_EVENTS_MONITOR + value: "0" + - name: EVENTS_ALLOW + value: "0" + - name: CONFIG_FILE + value: config.ini + - name: ALLOWED_PUBLISHERS + value: '[""]' + image: oceanprotocol/aquarius:v5.1.2 => check the available versions: https://hub.docker.com/repository/docker/oceanprotocol/aquarius/tags?page=1&ordering=last_updated + imagePullPolicy: Always + livenessProbe: + failureThreshold: 3 + httpGet: + path: / + port: 5000 + scheme: HTTP + initialDelaySeconds: 20 + periodSeconds: 10 + successThreshold: 1 + timeoutSeconds: 2 + name: aquarius + ports: + - containerPort: 5000 + protocol: TCP + readinessProbe: + failureThreshold: 3 + httpGet: + path: / + port: 5000 + scheme: HTTP + initialDelaySeconds: 20 + periodSeconds: 10 + successThreshold: 1 + timeoutSeconds: 1 + resources: + limits: + cpu: 800m + memory: 1Gi + requests: + cpu: 800m + memory: 1Gi + terminationMessagePath: /dev/termination-log + terminationMessagePolicy: File + dnsPolicy: ClusterFirst + restartPolicy: Always + schedulerName: default-scheduler + terminationGracePeriodSeconds: 30ya +``` + +Example deployment for _Mumbai_ (Polygon testnet): + +* [aquarius-events-mumbai-deployment.yaml](https://github.com/oceanprotocol/aquarius/blob/update-deploy-docs/deployment/aquarius-events-mumbai-deployment.yaml) (annotated) - this deployment will be responsible for indexing the block and storing the metadata published on-chain: + +```yaml +apiVersion: apps/v1 +kind: Deployment +metadata: + annotations: + labels: + app: aquarius-events-mumbai + name: aquarius-events-mumbai +spec: + progressDeadlineSeconds: 600 + replicas: 1 + revisionHistoryLimit: 5 + selector: + matchLabels: + app: aquarius-events-mumbai + strategy: + rollingUpdate: + maxSurge: 25% + maxUnavailable: 25% + type: RollingUpdate + template: + metadata: + creationTimestamp: null + labels: + app: aquarius-events-mumbai + spec: + containers: + - env: + - name: LOG_LEVEL + value: DEBUG + - name: AQUARIUS_URL + value: 
http://0.0.0.0:5000 + - name: AQUARIUS_WORKERS + value: "1" + - name: DB_HOSTNAME + value: < ES service hostname > + - name: DB_MODULE + value: elasticsearch + - name: DB_NAME + value: aquarius + - name: DB_PORT + value: "9200" + - name: DB_SCHEME + value: http + - name: DB_USERNAME + value: < ES username > + - name: DB_PASSWORD + value: < ES password > + - name: DB_SSL + value: "false" + - name: RUN_AQUARIUS_SERVER + value: "0" + - name: RUN_EVENTS_MONITOR + value: "1" + - name: CONFIG_FILE + value: config.ini + - name: ALLOWED_PUBLISHERS + value: '[""]' + - name: NETWORK_NAME + value: mumbai + - name: EVENTS_RPC + value: https://polygon-mumbai.infura.io/v3/< INFURA ID > => or another RPC service for this network + - name: METADATA_UPDATE_ALL + value: "0" + - name: ASSET_PURGATORY_URL + value: https://raw.githubusercontent.com/oceanprotocol/list-purgatory/main/list-assets.json + - name: ACCOUNT_PURGATORY_URL + value: https://raw.githubusercontent.com/oceanprotocol/list-purgatory/main/list-accounts.json + - name: PURGATORY_UPDATE_INTERVAL + value: "60" + - name: OCEAN_ADDRESS + value: 0xd8992Ed72C445c35Cb4A2be468568Ed1079357c8 + - name: SUBGRAPH_URLS + value: | + {"80001": "https://v4.subgraph.mumbai.oceanprotocol.com"} => or your own deployed Ocean Subgraph service for this network + - name: BLOCKS_CHUNK_SIZE + value: "3500" + - name: EVENTS_HTTP + value: "1" + image: oceanprotocol/aquarius:v5.1.2 => check the available versions: https://hub.docker.com/repository/docker/oceanprotocol/aquarius/tags?page=1&ordering=last_updated + imagePullPolicy: Always + livenessProbe: + failureThreshold: 3 + httpGet: + path: / + port: 5001 + scheme: HTTP + initialDelaySeconds: 20 + periodSeconds: 10 + successThreshold: 1 + timeoutSeconds: 1 + name: aquarius-events-mumbai + ports: + - containerPort: 5000 + protocol: TCP + readinessProbe: + failureThreshold: 3 + httpGet: + path: / + port: 5001 + scheme: HTTP + initialDelaySeconds: 20 + periodSeconds: 10 + successThreshold: 1 + timeoutSeconds: 1 + resources: + limits: + cpu: 500m + memory: 1Gi + requests: + cpu: 500m + memory: 1Gi + terminationMessagePath: /dev/termination-log + terminationMessagePolicy: File + dnsPolicy: ClusterFirst + restartPolicy: Always + schedulerName: default-scheduler + terminationGracePeriodSeconds: 30 +``` + +Tip: before deployment, you can [validate](https://github.com/instrumenta/kubeval) the yaml file. + +b. Deploy the configuration + +Deploy the configuration in Kubernetes using the following commands. + +```bash +$ kubectl apply -f aquarius-deployment.yaml +$ kubectl apply -f aquarius-events-rinkeby-deployment.yaml + + +kubectl get pods -l app=aquarius +NAME READY STATUS RESTARTS AGE +aquarius-6fd9cc975b-fxr4d 1/1 Running 0 1d + + kubectl get pods -l app=aquarius-events-mumbai +NAME READY STATUS RESTARTS AGE +aquarius-events-mumbai-8748976c4-mh24n 1/1 Running 0 1d +``` + +Check the logs for newly deployed Aquarius by running the following command: + +```bash +$ kubectl logs aquarius-6fd9cc975b-fxr4d [--follow] + +$ kubectl logs aquarius-events-mumbai-8748976c4-mh24n [--follow] +``` + +c. Create a Kubernetes service + +The next step is to create a Kubernetes service (eg. ClusterIP, NodePort, Loadbalancer, ExternalName) for this deployment, depending on the environment specifications. Follow [this link](https://kubernetes.io/docs/concepts/services-networking/service/) for details on how to create a Kubernetes service. 
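As a sketch of the simplest option, a ClusterIP service plus a temporary port-forward is enough to smoke-test the deployment; the `aquarius` deployment name and `ocean` namespace are assumptions matching the examples above:

```bash
# Expose the Aquarius API deployment inside the cluster (ClusterIP is the default type).
kubectl expose deployment aquarius --namespace=ocean --port=5000 --target-port=5000

# Tunnel the service to the local machine and hit the root endpoint.
kubectl -n ocean port-forward svc/aquarius 5000:5000 &
curl -s localhost:5000
```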
diff --git a/infrastructure/deploying-marketplace.md b/infrastructure/deploying-marketplace.md new file mode 100644 index 00000000..28a35c8f --- /dev/null +++ b/infrastructure/deploying-marketplace.md @@ -0,0 +1,71 @@ +# Deploying Marketplace + +### Prerequisites + +* A server for hosting Ocean Marketplace. See [this guide](setup-server.md) on creating a server. +* Obtain API key for wanted network. See [this guide](https://app.gitbook.com/o/mTcjMqA4ylf55anucjH8/s/BTXXhmDGzR0Xgj13fyfM/\~/changes/548/developers/obtaining-api-keys-for-blockchain-access) for this. + +### Push your customized Ocean Market code to your Git repository + +In case you customized the Ocean Market using the tutorial from this chapter (link), push your code to a Git repository. + +### Create a directory + +```bash +mkdir my-marketplace +cd my-marketplace +``` + +### Create a file with the name \`.env\` + +If you already created the .env file as instructed in ...(link to customize the market chapter), you can skip this step, otherwise copy the below content into the \`.env\` file. + +{% code title=".env" overflow="wrap" %} +```bash +# Update this value if your Market should use custom Aquarius +NEXT_PUBLIC_METADATACACHE_URI=https://v4.aquarius.oceanprotocol.com + +# Provide INFURA project ID from the obtained API key for NEXT_PUBLIC_INFURA_PROJECT_ID +#NEXT_PUBLIC_INFURA_PROJECT_ID="xxx" +#NEXT_PUBLIC_MARKET_FEE_ADDRESS="0xxx" +#NEXT_PUBLIC_PUBLISHER_MARKET_ORDER_FEE="1" +#NEXT_PUBLIC_CONSUME_MARKET_ORDER_FEE="1" +#NEXT_PUBLIC_CONSUME_MARKET_FIXED_SWAP_FEE="1" + +# +# ADVANCED SETTINGS +# + +# Toggle pricing options presented during price creation +#NEXT_PUBLIC_ALLOW_FIXED_PRICING="true" +#NEXT_PUBLIC_ALLOW_FREE_PRICING="true" + +# Privacy Preference Center +#NEXT_PUBLIC_PRIVACY_PREFERENCE_CENTER="true" +``` +{% endcode %} + +### Create a \`Dockerfile\` file and copy the below content into it. + +In the following Dockerfile, replace \ with the url of your Ocean Market fork repository or use "https://github.com/oceanprotocol/market.git" if you want to deploy our standard image of Ocean Market. + +
+```dockerfile
+FROM node:16
+RUN git clone <YOUR_GIT_REPO_URL> /usr/app/market
+WORKDIR /usr/app/market
+RUN npm ci --legacy-peer-deps
+RUN npm run build
+EXPOSE 3000
+CMD ["npx", "next", "start"]
+```
+
+### Build a docker image
+
+```bash
+docker build . -f Dockerfile -t market:latest
+```
+
+### Start the marketplace
+
+```bash
+docker run -d --name market -p 3000:3000 market:latest
+```
diff --git a/infrastructure/deploying-ocean-subgraph.md b/infrastructure/deploying-ocean-subgraph.md
new file mode 100644
index 00000000..48f390a8
--- /dev/null
+++ b/infrastructure/deploying-ocean-subgraph.md
@@ -0,0 +1,704 @@
+# Deploying Ocean Subgraph
+
+### About Ocean Subgraph
+
+The Ocean Subgraph allows querying the datatoken, data NFT, and all event information using GraphQL. Hosting the Ocean Subgraph saves the cost and time otherwise required to query the data directly from the blockchain. The steps in this tutorial will explain how to host the Ocean Subgraph for the EVM-compatible chains supported by Ocean Protocol.
+
+The Ocean Subgraph is deployed on top of [graph-node](https://github.com/graphprotocol/graph-node); therefore, in this document, we will first show how to deploy graph-node - either using Docker Engine or Kubernetes - and then how to install the Ocean Subgraph on the graph-node system.
+
+## Deploying Graph-node using Docker Engine and Docker Compose
+
+### Prerequisites
+
+* A server for hosting Graph-node. See [this guide](setup-server.md) for how to create a server;
+* Docker Compose and Docker Engine are installed and configured on the server. See [this guide](setup-server.md#install-docker-engine-and-docker-compose) for how to install these products.
+* The RPC URLs and API keys for each of the networks to which Ocean Subgraph will be connected. See [this guide](https://app.gitbook.com/o/mTcjMqA4ylf55anucjH8/s/BTXXhmDGzR0Xgj13fyfM/\~/changes/548/developers/obtaining-api-keys-for-blockchain-access) for how to obtain the URL and the API key.
+
+### Steps
+
+1. [Create the /etc/docker/compose/graph-node/docker-compose.yml file](deploying-ocean-subgraph.md#1-create-the-etcdockercomposegraph-nodedocker-composeyml-file)
+2. [Create the /etc/systemd/system/docker-compose@graph-node.service file](deploying-ocean-subgraph.md#2-create-the-etcsystemdsystemdocker-composegraph-nodeservice-file)
+3. [Reload the systemd manager configuration](deploying-ocean-subgraph.md#3.-reload-the-systemd-manager-configuration)
+4. [Start the graph-node service](deploying-ocean-subgraph.md#4-deploy-ocean-subgraph)
+5. [Check the service's status](deploying-ocean-subgraph.md#5.-check-the-services-status)
+6. [Check graph-node service logs](deploying-ocean-subgraph.md#6-check-graph-node-service-logs)
+
+#### 1. Create the /etc/docker/compose/graph-node/docker-compose.yml file
+
+From a terminal console, create the _/etc/docker/compose/graph-node/docker-compose.yml_ file, then copy and paste the following content to it. Check the comments in the file and replace the fields with the specific values of your implementation. 
+ +_/etc/docker/compose/graph-node/docker-compose.yml_ (annotated - example for `mumbai` network) + +```yaml +version: '3' +services: + graph-node: + image: graphprotocol/graph-node:v0.28.2 + container_name: graph-node + restart: on-failure + ports: + - '8000:8000' + - '8020:8020' + - '8030:8030' + - '8040:8040' + depends_on: + - ipfs + - postgres-graph + environment: + postgres_host: postgres-graph + postgres_user: graph-node + postgres_pass: < password > + postgres_db: mumbai + ipfs: 'ipfs:5001' + ethereum: 'mumbai:https://polygon-mumbai.infura.io/v3/< INFURA ID >' + GRAPH_LOG: info + ipfs: + image: ipfs/go-ipfs:v0.4.23 + container_name: ipfs + restart: on-failure + ports: + - '5001:5001' + volumes: + - ipfs-graph-node:/data/ipfs + postgres-graph: + image: postgres:15.3 + container_name: postgres + restart: on-failure + ports: + - '5432:5432' + command: ["postgres", "-cshared_preload_libraries=pg_stat_statements"] + environment: + POSTGRES_USER: graph-node + POSTGRES_PASSWORD: < password > + POSTGRES_DB: mumbai + volumes: + - pgdata-graph-node:/var/lib/postgresql/data +volumes: + pgdata-graph-node: + driver: local + ipfs-graph-node: + driver: local +``` + +#### 2. Create the /etc/systemd/system/docker-compose@graph-node.service file + +Create the _/etc/systemd/system/docker-compose@graph-node.service_ file then copy and paste the following content to it. This example file could be customized if needed. + +``` +[Unit] +Description=%i service with docker compose +Requires=docker.service +After=docker.service + +[Service] +Type=oneshot +RemainAfterExit=true +Environment="PROJECT=ocean" +WorkingDirectory=/etc/docker/compose/%i +ExecStartPre=/usr/bin/env docker-compose -p $PROJECT pull +ExecStart=/usr/bin/env docker-compose -p $PROJECT up -d +ExecStop=/usr/bin/env docker-compose -p $PROJECT stop +ExecStopPost=/usr/bin/env docker-compose -p $PROJECT down + + +[Install] +WantedBy=multi-user.target +``` + +#### 3. Reload the systemd manager configuration + +Run the following command to reload the systemd manager configuration + +```bash +sudo systemctl daemon-reload +``` + +Optionally, you can enable the services to start at boot, using the following command: + +```bash +sudo systemctl enable docker-compose@graph-node.service +``` + +#### 4. Start graph-node service + +To start the Ocean Subgraph service, run the following command: + +```bash +sudo systemctl start docker-compose@graph-node.service +``` + +#### 5. Check the service's status + +Check the status of the service by running the following command. The output of the command should be similar to the one presented here. 
+ +```bash +$ sudo systemctl status docker-compose@graph-node.service +● docker-compose@graph-node.service - graph-node service with docker compose + Loaded: loaded (/etc/systemd/system/docker-compose@graph-node.service; disabled; vendor preset: enabled) + Active: active (exited) since Sun 2023-06-25 17:05:25 UTC; 6s ago + Process: 4878 ExecStartPre=/usr/bin/env docker-compose -p $PROJECT pull (code=exited, status=0/SUCCESS) + Process: 4887 ExecStart=/usr/bin/env docker-compose -p $PROJECT up -d (code=exited, status=0/SUCCESS) + Main PID: 4887 (code=exited, status=0/SUCCESS) + CPU: 123ms + +Jun 25 17:05:24 testvm env[4887]: Container ipfs Created +Jun 25 17:05:24 testvm env[4887]: Container graph-node Creating +Jun 25 17:05:24 testvm env[4887]: Container graph-node Created +Jun 25 17:05:24 testvm env[4887]: Container ipfs Starting +Jun 25 17:05:24 testvm env[4887]: Container postgres Starting +Jun 25 17:05:24 testvm env[4887]: Container ipfs Started +Jun 25 17:05:25 testvm env[4887]: Container postgres Started +Jun 25 17:05:25 testvm env[4887]: Container graph-node Starting +Jun 25 17:05:25 testvm env[4887]: Container graph-node Started +Jun 25 17:05:25 testvm systemd[1]: Finished graph-node service with docker compose. + +``` + +#### 6. Check graph-node service logs + +If needed, use docker CLI to check Ocean Subgraph service logs. + +First, check the container status + +```bash +$ docker ps --format "table {{.Image}}\t{{.Ports}}\t{{.Names}}\t{{.Status}}" +IMAGE PORTS NAMES STATUS +graphprotocol/graph-node:v0.28.2 0.0.0.0:8000->8000/tcp, :::8000->8000/tcp, 0.0.0.0:8020->8020/tcp, :::8020->8020/tcp, 0.0.0.0:8030->8030/tcp, :::8030->8030/tcp, 0.0.0.0:8040->8040/tcp, :::8040->8040/tcp, 8001/tcp graph-node Up 55 minutes +ipfs/go-ipfs:v0.4.23 4001/tcp, 8080-8081/tcp, 0.0.0.0:5001->5001/tcp, :::5001->5001/tcp ipfs Up 55 minutes +postgres:15.3 0.0.0.0:5432->5432/tcp, :::5432->5432/tcp postgres Up 55 minutes +``` + +Then, check the logs of the Ocean Subgraph docker container: + +```bash +docker logs graph-node [--follow] +``` + +## Deploying graph-node using Kubernetes + +In this example, we will deploy graph-node as a Kubernetes deployment service. [graph-node](https://github.com/graphprotocol/graph-node) has the following dependencies: PostgreSQL and IPFS. + +### Prerequisites: + +* A server for hosting graph-node. See [this guide](setup-server.md) for how to create a server; +* Kubernetes with Docker Engine is installed and configured on the server. See [this chapter](setup-server.md#install-kubernetes-with-docker-engine) for information on installing Kubernetes. +* The RPC URLs and API keys for each of the networks to which the Provider will be connected. See[ this guide](https://app.gitbook.com/o/mTcjMqA4ylf55anucjH8/s/BTXXhmDGzR0Xgj13fyfM/\~/changes/548/developers/obtaining-api-keys-for-blockchain-access) for how to obtain the URL and the API key. + +### Steps + +1. [Deploy PostgreSQL](deploying-ocean-subgraph.md#1.-deploy-postgresql) +2. [Deploy IPFS](deploying-ocean-subgraph.md#2.-deploy-ipfs) +3. [Deploy Graph-node](deploying-ocean-subgraph.md#deploy-graph-node) + +#### 1. Deploy PostgreSQL + +It is recommended to deploy PostgreSQL as helm chart. + +References: [https://github.com/bitnami/charts/tree/main/bitnami/postgresql/#installing-the-chart](https://github.com/bitnami/charts/tree/main/bitnami/postgresql/#installing-the-chart) + +Once PostgreSQL pods are running, a database must be created: eg. `mumbai.` + +#### 2. 
+#### 2. Deploy IPFS
+
+The following template can be customized to deploy the IPFS statefulset and service.
+
+```yaml
+apiVersion: apps/v1
+kind: StatefulSet
+metadata:
+  labels:
+    app: ipfs
+  name: ipfs
+spec:
+  podManagementPolicy: OrderedReady
+  replicas: 1
+  revisionHistoryLimit: 10
+  selector:
+    matchLabels:
+      app: ipfs
+  serviceName: ipfs
+  template:
+    metadata:
+      labels:
+        app: ipfs
+    spec:
+      containers:
+        - image: ipfs/go-ipfs:v0.4.22
+          imagePullPolicy: IfNotPresent
+          livenessProbe:
+            failureThreshold: 3
+            httpGet:
+              path: /debug/metrics/prometheus
+              port: api
+              scheme: HTTP
+            initialDelaySeconds: 15
+            periodSeconds: 3
+            successThreshold: 1
+            timeoutSeconds: 1
+          name: s1-ipfs
+          ports:
+            - containerPort: 5001
+              name: api
+              protocol: TCP
+            - containerPort: 8080
+              name: gateway
+              protocol: TCP
+          readinessProbe:
+            failureThreshold: 3
+            httpGet:
+              path: /debug/metrics/prometheus
+              port: api
+              scheme: HTTP
+            initialDelaySeconds: 15
+            periodSeconds: 3
+            successThreshold: 1
+            timeoutSeconds: 1
+          terminationMessagePath: /dev/termination-log
+          terminationMessagePolicy: File
+          volumeMounts:
+            - mountPath: /data/ipfs
+              name: ipfs-storage
+      dnsPolicy: ClusterFirst
+      restartPolicy: Always
+      schedulerName: default-scheduler
+      securityContext:
+        fsGroup: 1000
+        runAsUser: 1000
+      terminationGracePeriodSeconds: 30
+  updateStrategy:
+    rollingUpdate:
+      partition: 0
+    type: RollingUpdate
+  volumeClaimTemplates:
+    - apiVersion: v1
+      kind: PersistentVolumeClaim
+      metadata:
+        name: ipfs-storage
+      spec:
+        accessModes:
+          - ReadWriteOnce
+        resources:
+          requests:
+            storage: 1G
+        volumeMode: Filesystem
+---
+apiVersion: v1
+kind: Service
+metadata:
+  labels:
+    app: ipfs
+  name: ipfs
+spec:
+  ipFamilies:
+    - IPv4
+  ipFamilyPolicy: SingleStack
+  ports:
+    - name: api
+      port: 5001
+    - name: gateway
+      port: 8080
+  selector:
+    app: ipfs
+```
+
+#### Deploy Graph-node
+
+The following annotated template can be customized to deploy the graph-node deployment and service:
+
+```yaml
+apiVersion: apps/v1
+kind: Deployment
+metadata:
+  labels:
+    app: mumbai-graph-node
+  name: mumbai-graph-node
+spec:
+  progressDeadlineSeconds: 600
+  replicas: 1
+  revisionHistoryLimit: 10
+  selector:
+    matchLabels:
+      app: mumbai-graph-node
+  strategy:
+    rollingUpdate:
+      maxSurge: 25%
+      maxUnavailable: 25%
+    type: RollingUpdate
+  template:
+    metadata:
+      labels:
+        app: mumbai-graph-node
+    spec:
+      containers:
+        - env:
+            - name: ipfs
+              value: ipfs.< namespace >.svc.cluster.local:5001
+            - name: postgres_host
+              value: postgresql.< namespace >.svc.cluster.local
+            - name: postgres_user
+              value: < postgresql user >
+            - name: postgres_pass
+              value: < postgresql database password >
+            - name: postgres_db
+              value: < postgresql database >
+            - name: ethereum
+              value: mumbai:https://polygon-mumbai.infura.io/v3/< INFURA ID >
+            - name: GRAPH_KILL_IF_UNRESPONSIVE
+              value: "true"
+          image: graphprotocol/graph-node:v0.28.2
+          imagePullPolicy: IfNotPresent
+          livenessProbe:
+            failureThreshold: 3
+            httpGet:
+              path: /
+              port: 8000
+              scheme: HTTP
+            initialDelaySeconds: 20
+            periodSeconds: 10
+            successThreshold: 1
+            timeoutSeconds: 1
+          name: mumbai-graph-node
+          ports:
+            - containerPort: 8000
+              name: graphql
+              protocol: TCP
+            - containerPort: 8020
+              name: jsonrpc
+              protocol: TCP
+            - containerPort: 8030
+              name: indexnode
+              protocol: TCP
+            - containerPort: 8040
+              name: metrics
+              protocol: TCP
+          readinessProbe:
+            failureThreshold: 3
+            httpGet:
+              path: /
+              port: 8000
+              scheme: HTTP
+            initialDelaySeconds: 20
+            periodSeconds: 10
+            successThreshold: 1
+            timeoutSeconds: 1
+          resources:
+            limits:
+              cpu: "2"
+              memory: 1536Mi
+            requests:
+              cpu: 1500m
+              memory: 1536Mi
+          terminationMessagePath: /dev/termination-log
+          terminationMessagePolicy: File
+      dnsPolicy: ClusterFirst
+      restartPolicy: Always
+      schedulerName: default-scheduler
+      terminationGracePeriodSeconds: 30
+---
+apiVersion: v1
+kind: Service
+metadata:
+  labels:
+    app: mumbai-graph-node
+  name: mumbai-graph-node
+spec:
+  internalTrafficPolicy: Cluster
+  ipFamilies:
+    - IPv4
+  ipFamilyPolicy: SingleStack
+  ports:
+    - name: graphql
+      port: 8000
+    - name: jsonrpc
+      port: 8020
+    - name: indexnode
+      port: 8030
+    - name: metrics
+      port: 8040
+  selector:
+    app: mumbai-graph-node
+```
+
+## Deploy Ocean Subgraph
+
+After you have deployed graph-node, either using Kubernetes or Docker Compose, you can proceed to deploy Ocean Subgraph on top of it.
+
+### Prerequisites
+
+* graph-node up and running
+
+### Steps
+
+1. [Install Node.js locally](deploying-ocean-subgraph.md#1.-install-node.js-locally)
+2. [Download and extract Ocean-subgraph](#2.-download-and-extract-ocean-subgraph)
+3. [Install dependencies](#3.-install-dependencies)
+4. [Deploy Ocean Subgraph](#4.-deploy-ocean-subgraph)
+
+#### 1. Install Node.js locally
+
+To install Node.js locally, please refer to this [link](https://nodejs.org/en/download) for instructions.
+
+#### 2. Download and extract Ocean-subgraph
+
+Download and extract [Ocean-subgraph](https://github.com/oceanprotocol/ocean-subgraph) (see the available releases [here](https://github.com/oceanprotocol/ocean-subgraph/releases)).
+
+#### 3. Install dependencies
+
+From the directory where Ocean-subgraph was extracted, run the following command:
+
+```bash
+npm i
+```
+
+#### 4. Deploy Ocean Subgraph
+
+In the following example, we deploy Ocean Subgraph on a graph-node instance running for the `mumbai` testnet.
+
+Note: for `ocean-subgraph` deployment in the Kubernetes environment, both the `graph-node` and `ipfs` services must be forwarded locally using the `kubectl port-forward` command.
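+
+A minimal sketch of the port forwarding, assuming the service names from the templates above and an `ocean` namespace (adapt both to your setup):
+
+```bash
+# Forward the graph-node query (8000) and admin JSON-RPC (8020) ports,
+# plus the IPFS API (5001), to the local machine
+kubectl port-forward svc/mumbai-graph-node -n ocean 8000:8000 8020:8020 &
+kubectl port-forward svc/ipfs -n ocean 5001:5001 &
+```
+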
+ +Run the following command: + +```bash +$ npm run quickstart:mumbai + +> ocean-subgraph@3.0.8 quickstart:mumbai +> node ./scripts/generatenetworkssubgraphs.js mumbai && npm run codegen && npm run create:local && npm run deploy:local + +Creating subgraph.yaml for mumbai + Adding veOCEAN +Skipping polygon +Skipping bsc +Skipping energyweb +Skipping moonriver +Skipping mainnet +Skipping goerli +Skipping polygonedge +Skipping gaiaxtestnet +Skipping alfajores +Skipping gen-x-testnet +Skipping filecointestnet + +> ocean-subgraph@3.0.8 codegen +> graph codegen --output-dir src/@types + + Skip migration: Bump mapping apiVersion from 0.0.1 to 0.0.2 + Skip migration: Bump mapping apiVersion from 0.0.2 to 0.0.3 + Skip migration: Bump mapping apiVersion from 0.0.3 to 0.0.4 + Skip migration: Bump mapping apiVersion from 0.0.4 to 0.0.5 + Skip migration: Bump mapping apiVersion from 0.0.5 to 0.0.6 + Skip migration: Bump manifest specVersion from 0.0.1 to 0.0.2 + Apply migration: Bump manifest specVersion from 0.0.2 to 0.0.4 +✔ Apply migrations +✔ Load subgraph from subgraph.yaml + Load contract ABI from node_modules/@oceanprotocol/contracts/artifacts/contracts/ERC721Factory.sol/ERC721Factory.json + Load contract ABI from abis/ERC20.json + Load contract ABI from node_modules/@oceanprotocol/contracts/artifacts/contracts/pools/FactoryRouter.sol/FactoryRouter.json + Load contract ABI from abis/ERC20.json + Load contract ABI from node_modules/@oceanprotocol/contracts/artifacts/contracts/ve/veAllocate.sol/veAllocate.json + Load contract ABI from node_modules/@oceanprotocol/contracts/artifacts/contracts/ve/veOCEAN.vy/veOCEAN.json + Load contract ABI from node_modules/@oceanprotocol/contracts/artifacts/contracts/ve/veDelegation.vy/veDelegation.json + Load contract ABI from node_modules/@oceanprotocol/contracts/artifacts/contracts/ve/veFeeDistributor.vy/veFeeDistributor.json + Load contract ABI from node_modules/@oceanprotocol/contracts/artifacts/contracts/df/DFRewards.sol/DFRewards.json +✔ Load contract ABIs + Generate types for contract ABI: ERC721Factory (node_modules/@oceanprotocol/contracts/artifacts/contracts/ERC721Factory.sol/ERC721Factory.json) + Write types to src/@types/ERC721Factory/ERC721Factory.ts + Generate types for contract ABI: ERC20 (abis/ERC20.json) + Write types to src/@types/ERC721Factory/ERC20.ts + Generate types for contract ABI: FactoryRouter (node_modules/@oceanprotocol/contracts/artifacts/contracts/pools/FactoryRouter.sol/FactoryRouter.json) + Write types to src/@types/FactoryRouter/FactoryRouter.ts + Generate types for contract ABI: ERC20 (abis/ERC20.json) + Write types to src/@types/FactoryRouter/ERC20.ts + Generate types for contract ABI: veAllocate (node_modules/@oceanprotocol/contracts/artifacts/contracts/ve/veAllocate.sol/veAllocate.json) + Write types to src/@types/veAllocate/veAllocate.ts + Generate types for contract ABI: veOCEAN (node_modules/@oceanprotocol/contracts/artifacts/contracts/ve/veOCEAN.vy/veOCEAN.json) + Write types to src/@types/veOCEAN/veOCEAN.ts + Generate types for contract ABI: veDelegation (node_modules/@oceanprotocol/contracts/artifacts/contracts/ve/veDelegation.vy/veDelegation.json) + Write types to src/@types/veDelegation/veDelegation.ts + Generate types for contract ABI: veFeeDistributor (node_modules/@oceanprotocol/contracts/artifacts/contracts/ve/veFeeDistributor.vy/veFeeDistributor.json) + Write types to src/@types/veFeeDistributor/veFeeDistributor.ts + Generate types for contract ABI: DFRewards 
(node_modules/@oceanprotocol/contracts/artifacts/contracts/df/DFRewards.sol/DFRewards.json) + Write types to src/@types/DFRewards/DFRewards.ts +✔ Generate types for contract ABIs + Generate types for data source template ERC20Template + Generate types for data source template ERC721Template + Generate types for data source template Dispenser + Generate types for data source template FixedRateExchange + Write types for templates to src/@types/templates.ts +✔ Generate types for data source templates + Load data source template ABI from node_modules/@oceanprotocol/contracts/artifacts/contracts/templates/ERC20Template.sol/ERC20Template.json + Load data source template ABI from node_modules/@oceanprotocol/contracts/artifacts/contracts/templates/ERC20TemplateEnterprise.sol/ERC20TemplateEnterprise.json + Load data source template ABI from abis/ERC20.json + Load data source template ABI from node_modules/@oceanprotocol/contracts/artifacts/contracts/utils/ERC20Roles.sol/ERC20Roles.json + Load data source template ABI from node_modules/@oceanprotocol/contracts/artifacts/contracts/templates/ERC721Template.sol/ERC721Template.json + Load data source template ABI from node_modules/@oceanprotocol/contracts/artifacts/contracts/utils/ERC721RolesAddress.sol/ERC721RolesAddress.json + Load data source template ABI from abis/ERC20.json + Load data source template ABI from node_modules/@oceanprotocol/contracts/artifacts/contracts/pools/dispenser/Dispenser.sol/Dispenser.json + Load data source template ABI from abis/ERC20.json + Load data source template ABI from node_modules/@oceanprotocol/contracts/artifacts/contracts/pools/fixedRate/FixedRateExchange.sol/FixedRateExchange.json + Load data source template ABI from abis/ERC20.json +✔ Load data source template ABIs + Generate types for data source template ABI: ERC20Template > ERC20Template (node_modules/@oceanprotocol/contracts/artifacts/contracts/templates/ERC20Template.sol/ERC20Template.json) + Write types to src/@types/templates/ERC20Template/ERC20Template.ts + Generate types for data source template ABI: ERC20Template > ERC20TemplateEnterprise (node_modules/@oceanprotocol/contracts/artifacts/contracts/templates/ERC20TemplateEnterprise.sol/ERC20TemplateEnterprise.json) + Write types to src/@types/templates/ERC20Template/ERC20TemplateEnterprise.ts + Generate types for data source template ABI: ERC20Template > ERC20 (abis/ERC20.json) + Write types to src/@types/templates/ERC20Template/ERC20.ts + Generate types for data source template ABI: ERC20Template > ERC20Roles (node_modules/@oceanprotocol/contracts/artifacts/contracts/utils/ERC20Roles.sol/ERC20Roles.json) + Write types to src/@types/templates/ERC20Template/ERC20Roles.ts + Generate types for data source template ABI: ERC721Template > ERC721Template (node_modules/@oceanprotocol/contracts/artifacts/contracts/templates/ERC721Template.sol/ERC721Template.json) + Write types to src/@types/templates/ERC721Template/ERC721Template.ts + Generate types for data source template ABI: ERC721Template > ERC721RolesAddress (node_modules/@oceanprotocol/contracts/artifacts/contracts/utils/ERC721RolesAddress.sol/ERC721RolesAddress.json) + Write types to src/@types/templates/ERC721Template/ERC721RolesAddress.ts + Generate types for data source template ABI: ERC721Template > ERC20 (abis/ERC20.json) + Write types to src/@types/templates/ERC721Template/ERC20.ts + Generate types for data source template ABI: Dispenser > Dispenser 
(node_modules/@oceanprotocol/contracts/artifacts/contracts/pools/dispenser/Dispenser.sol/Dispenser.json) + Write types to src/@types/templates/Dispenser/Dispenser.ts + Generate types for data source template ABI: Dispenser > ERC20 (abis/ERC20.json) + Write types to src/@types/templates/Dispenser/ERC20.ts + Generate types for data source template ABI: FixedRateExchange > FixedRateExchange (node_modules/@oceanprotocol/contracts/artifacts/contracts/pools/fixedRate/FixedRateExchange.sol/FixedRateExchange.json) + Write types to src/@types/templates/FixedRateExchange/FixedRateExchange.ts + Generate types for data source template ABI: FixedRateExchange > ERC20 (abis/ERC20.json) + Write types to src/@types/templates/FixedRateExchange/ERC20.ts +✔ Generate types for data source template ABIs +✔ Load GraphQL schema from schema.graphql + Write types to src/@types/schema.ts +✔ Generate types for GraphQL schema + +Types generated successfully + + +> ocean-subgraph@3.0.8 create:local +> graph create oceanprotocol/ocean-subgraph --node http://127.0.0.1:8020 + +Created subgraph: oceanprotocol/ocean-subgraph + +> ocean-subgraph@3.0.8 deploy:local +> graph deploy oceanprotocol/ocean-subgraph subgraph.yaml -l $npm_package_version --debug --ipfs http://127.0.0.1:5001 --node http://127.0.0.1:8020 + + Skip migration: Bump mapping apiVersion from 0.0.1 to 0.0.2 + Skip migration: Bump mapping apiVersion from 0.0.2 to 0.0.3 + Skip migration: Bump mapping apiVersion from 0.0.3 to 0.0.4 + Skip migration: Bump mapping apiVersion from 0.0.4 to 0.0.5 + Skip migration: Bump mapping apiVersion from 0.0.5 to 0.0.6 + Skip migration: Bump manifest specVersion from 0.0.1 to 0.0.2 + Skip migration: Bump manifest specVersion from 0.0.2 to 0.0.4 +✔ Apply migrations +✔ Load subgraph from subgraph.yaml + Compile data source: ERC721Factory => build/ERC721Factory/ERC721Factory.wasm + Compile data source: FactoryRouter => build/FactoryRouter/FactoryRouter.wasm + Compile data source: veAllocate => build/veAllocate/veAllocate.wasm + Compile data source: veOCEAN => build/veOCEAN/veOCEAN.wasm + Compile data source: veDelegation => build/veDelegation/veDelegation.wasm + Compile data source: veFeeDistributor => build/veFeeDistributor/veFeeDistributor.wasm + Compile data source: DFRewards => build/DFRewards/DFRewards.wasm + Compile data source template: ERC20Template => build/templates/ERC20Template/ERC20Template.wasm + Compile data source template: ERC721Template => build/templates/ERC721Template/ERC721Template.wasm + Compile data source template: Dispenser => build/templates/Dispenser/Dispenser.wasm + Compile data source template: FixedRateExchange => build/templates/FixedRateExchange/FixedRateExchange.wasm +✔ Compile subgraph + Copy schema file build/schema.graphql + Write subgraph file build/ERC721Factory/node_modules/@oceanprotocol/contracts/artifacts/contracts/ERC721Factory.sol/ERC721Factory.json + Write subgraph file build/ERC721Factory/abis/ERC20.json + Write subgraph file build/FactoryRouter/node_modules/@oceanprotocol/contracts/artifacts/contracts/pools/FactoryRouter.sol/FactoryRouter.json + Write subgraph file build/FactoryRouter/abis/ERC20.json + Write subgraph file build/veAllocate/node_modules/@oceanprotocol/contracts/artifacts/contracts/ve/veAllocate.sol/veAllocate.json + Write subgraph file build/veOCEAN/node_modules/@oceanprotocol/contracts/artifacts/contracts/ve/veOCEAN.vy/veOCEAN.json + Write subgraph file build/veDelegation/node_modules/@oceanprotocol/contracts/artifacts/contracts/ve/veDelegation.vy/veDelegation.json + 
Write subgraph file build/veFeeDistributor/node_modules/@oceanprotocol/contracts/artifacts/contracts/ve/veFeeDistributor.vy/veFeeDistributor.json + Write subgraph file build/DFRewards/node_modules/@oceanprotocol/contracts/artifacts/contracts/df/DFRewards.sol/DFRewards.json + Write subgraph file build/ERC20Template/node_modules/@oceanprotocol/contracts/artifacts/contracts/templates/ERC20Template.sol/ERC20Template.json + Write subgraph file build/ERC20Template/node_modules/@oceanprotocol/contracts/artifacts/contracts/templates/ERC20TemplateEnterprise.sol/ERC20TemplateEnterprise.json + Write subgraph file build/ERC20Template/abis/ERC20.json + Write subgraph file build/ERC20Template/node_modules/@oceanprotocol/contracts/artifacts/contracts/utils/ERC20Roles.sol/ERC20Roles.json + Write subgraph file build/ERC721Template/node_modules/@oceanprotocol/contracts/artifacts/contracts/templates/ERC721Template.sol/ERC721Template.json + Write subgraph file build/ERC721Template/node_modules/@oceanprotocol/contracts/artifacts/contracts/utils/ERC721RolesAddress.sol/ERC721RolesAddress.json + Write subgraph file build/ERC721Template/abis/ERC20.json + Write subgraph file build/Dispenser/node_modules/@oceanprotocol/contracts/artifacts/contracts/pools/dispenser/Dispenser.sol/Dispenser.json + Write subgraph file build/Dispenser/abis/ERC20.json + Write subgraph file build/FixedRateExchange/node_modules/@oceanprotocol/contracts/artifacts/contracts/pools/fixedRate/FixedRateExchange.sol/FixedRateExchange.json + Write subgraph file build/FixedRateExchange/abis/ERC20.json + Write subgraph manifest build/subgraph.yaml +✔ Write compiled subgraph to build/ + Add file to IPFS build/schema.graphql + .. QmQa3a9ypCLC84prHGQdhbcGG4DHJceqADGxmZMmAAXuTz + Add file to IPFS build/ERC721Factory/node_modules/@oceanprotocol/contracts/artifacts/contracts/ERC721Factory.sol/ERC721Factory.json + .. QmSoG3r5vyWXqjEfKAQYjwtQcQkZCsZEcJXVFWVq1tT1dD + Add file to IPFS build/ERC721Factory/abis/ERC20.json + .. QmXuTbDkNrN27VydxbS2huvKRk62PMgUTdPDWkxcr2w7j2 + Add file to IPFS build/FactoryRouter/node_modules/@oceanprotocol/contracts/artifacts/contracts/pools/FactoryRouter.sol/FactoryRouter.json + .. QmcBVA1R3yi2167UZMvV4LvG4cMHjL8ZZXmPMriCjn8DEe + Add file to IPFS build/FactoryRouter/abis/ERC20.json + .. QmXuTbDkNrN27VydxbS2huvKRk62PMgUTdPDWkxcr2w7j2 (already uploaded) + Add file to IPFS build/veAllocate/node_modules/@oceanprotocol/contracts/artifacts/contracts/ve/veAllocate.sol/veAllocate.json + .. Qmc3iwQkQAhqe1PjzTt6KZLh9rsWQvyxkFt7doj2iXv8C3 + Add file to IPFS build/veOCEAN/node_modules/@oceanprotocol/contracts/artifacts/contracts/ve/veOCEAN.vy/veOCEAN.json + .. QmahFjirJqiwKpytFZ9CdE92LdPGBUDZs6AWpsrH2wn1VP + Add file to IPFS build/veDelegation/node_modules/@oceanprotocol/contracts/artifacts/contracts/ve/veDelegation.vy/veDelegation.json + .. QmfU6kZ5sksLdj3q88n7SUP63C1cnhQjU8vuMmRYwf2v5r + Add file to IPFS build/veFeeDistributor/node_modules/@oceanprotocol/contracts/artifacts/contracts/ve/veFeeDistributor.vy/veFeeDistributor.json + .. QmVU51oBr62D4UFXTwnMcbzuBBAAeQssqmqM9jic7A6L3v + Add file to IPFS build/DFRewards/node_modules/@oceanprotocol/contracts/artifacts/contracts/df/DFRewards.sol/DFRewards.json + .. QmcckRMahzpL7foEFGpWfkDBsyoWbNRfLC32uFq8ceUV3a + Add file to IPFS build/ERC721Factory/ERC721Factory.wasm + .. QmVfDAgZdKWxMuNfT7kso1LbFre2xhYbEeHBGm3gH3R9oE + Add file to IPFS build/FactoryRouter/FactoryRouter.wasm + .. QmYCC9AcaYw3nGSqNXNFHVsuB67FQEyZ8twRjRXrprcgyp + Add file to IPFS build/veAllocate/veAllocate.wasm + .. 
QmUFaYDxChi5nKEJLvHQZP1cRoqqP5k3fYSwk2JjuSceiJ + Add file to IPFS build/veOCEAN/veOCEAN.wasm + .. QmRYCyYKwHdSeM55vuvL1mdCooDkFQm6d2TQ7iK2N1qgur + Add file to IPFS build/veDelegation/veDelegation.wasm + .. QmaTjRLirzfidtQTYgzxqVVD9AX9e69TN1Y8fEsNQ9AEZq + Add file to IPFS build/veFeeDistributor/veFeeDistributor.wasm + .. QmZCEp4yxiDyuksEjSaceogJwLMto2UGfV1KxVuJTJLTqg + Add file to IPFS build/DFRewards/DFRewards.wasm + .. QmRSxe52B836bdfoJbuDY4tUCawzqgkHRNxe9ucU1JdYm5 + Add file to IPFS build/ERC20Template/node_modules/@oceanprotocol/contracts/artifacts/contracts/templates/ERC20Template.sol/ERC20Template.json + .. QmPkhFvnBbqA3You7NsK5Zsyh8kkizXUHF9pcC5V6qDJQu + Add file to IPFS build/ERC20Template/node_modules/@oceanprotocol/contracts/artifacts/contracts/templates/ERC20TemplateEnterprise.sol/ERC20TemplateEnterprise.json + .. QmZnogwnfr4TeBPykvmCL2oaX63AKQP1F1uBAbbfnyPAzB + Add file to IPFS build/ERC20Template/abis/ERC20.json + .. QmXuTbDkNrN27VydxbS2huvKRk62PMgUTdPDWkxcr2w7j2 (already uploaded) + Add file to IPFS build/ERC20Template/node_modules/@oceanprotocol/contracts/artifacts/contracts/utils/ERC20Roles.sol/ERC20Roles.json + .. QmTWTzg4jTx4GxGApVyxirNRTxB7QovS4bHGuWnnW8Ciz2 + Add file to IPFS build/templates/ERC20Template/ERC20Template.wasm + .. QmUcxes5La7n9481Vf9AoQ2Mjt1CrbS7T6tDhpnfF77Uh5 + Add file to IPFS build/ERC721Template/node_modules/@oceanprotocol/contracts/artifacts/contracts/templates/ERC721Template.sol/ERC721Template.json + .. QmPE82CiACicgu1WxEjeFrLmskiJADroQRnxH7owpK6jaP + Add file to IPFS build/ERC721Template/node_modules/@oceanprotocol/contracts/artifacts/contracts/utils/ERC721RolesAddress.sol/ERC721RolesAddress.json + .. Qmdhi7UK6Ww8vXH9YC3JxVUEFjTyx3XycF53rRZapVK5c3 + Add file to IPFS build/ERC721Template/abis/ERC20.json + .. QmXuTbDkNrN27VydxbS2huvKRk62PMgUTdPDWkxcr2w7j2 (already uploaded) + Add file to IPFS build/templates/ERC721Template/ERC721Template.wasm + .. QmNhLws24szwpz8LM2sL6HHKc6KK4vtJwzfeZWkghuqn7Q + Add file to IPFS build/Dispenser/node_modules/@oceanprotocol/contracts/artifacts/contracts/pools/dispenser/Dispenser.sol/Dispenser.json + .. QmdiN7Fhw9sjoVVJgHtTtzxv5fwtFMHLNH1x1yqbswsThW + Add file to IPFS build/Dispenser/abis/ERC20.json + .. QmXuTbDkNrN27VydxbS2huvKRk62PMgUTdPDWkxcr2w7j2 (already uploaded) + Add file to IPFS build/templates/Dispenser/Dispenser.wasm + .. QmTpn9wagpmH6byjjdCBZdgypFgcw2mva3bC52nC4z3eLW + Add file to IPFS build/FixedRateExchange/node_modules/@oceanprotocol/contracts/artifacts/contracts/pools/fixedRate/FixedRateExchange.sol/FixedRateExchange.json + .. Qmd2ToAptK74j8pGxe8mZXfAvY3AxstgmYH8JDMAfLtAGd + Add file to IPFS build/FixedRateExchange/abis/ERC20.json + .. QmXuTbDkNrN27VydxbS2huvKRk62PMgUTdPDWkxcr2w7j2 (already uploaded) + Add file to IPFS build/templates/FixedRateExchange/FixedRateExchange.wasm + .. QmRrwwoFF33LvPhnGCGgLBLyuLizrFgD44kW9io81tPZzX +✔ Upload subgraph to IPFS + +Build completed: QmVUKpgwuyDh9KgUxTzZvVNFJbdevc56YrZpZjQvu8Yp7q + +Deployed to http://127.0.0.1:8000/subgraphs/name/oceanprotocol/ocean-subgraph/graphql + +Subgraph endpoints: +Queries (HTTP): http://127.0.0.1:8000/subgraphs/name/oceanprotocol/ocean-subgraph +``` + +Ocean Subgraph is deployed under /subgraphs/name/oceanprotocol/ocean-subgraph/. To access it from the server on which it was deployed, open a browser and go to [http://127.0.0.1:8000/subgraphs/name/oceanprotocol/ocean-subgraph/graphql](http://127.0.0.1:8000/subgraphs/name/oceanprotocol/ocean-subgraph/graphql). 
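
+
+Optionally, you can also verify the endpoint from the command line. A quick smoke test (the `nfts` entity and its fields are assumptions based on the Ocean Subgraph schema):
+
+```bash
+curl -s -X POST http://127.0.0.1:8000/subgraphs/name/oceanprotocol/ocean-subgraph \
+  -H 'Content-Type: application/json' \
+  -d '{"query": "{ nfts(first: 3) { id name } }"}'
+```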
diff --git a/infrastructure/deploying-provider.md b/infrastructure/deploying-provider.md
new file mode 100644
index 00000000..6bfe18b0
--- /dev/null
+++ b/infrastructure/deploying-provider.md
@@ -0,0 +1,312 @@
+# Deploying Provider
+
+### About Provider
+
+Provider encrypts the URL and metadata during publishing and decrypts the URL when the dataset is downloaded or a compute job is started. It enables access to the data assets by streaming data (and never exposing the URL). It performs on-chain checks for buyer permissions and payments. It also provides compute services (it connects to a C2D environment).
+
+Provider is a multichain component, meaning that, with the proper configuration, it can handle these tasks on multiple chains. The source code of Provider can be accessed from [here](https://github.com/oceanprotocol/provider).
+
+As mentioned in the Setup a Server document, all Ocean components can be deployed in two types of configurations: simple, based on Docker Engine and Docker Compose, and complex, based on Kubernetes with Docker Engine. In this document, we will present how to deploy Provider in each of these configurations.
+
+## Deploying Provider using Docker Engine and Docker Compose
+
+In this guide, we will deploy Provider for two chains: Goerli (Ethereum test network) and Mumbai (Polygon test network). Please note that in the following configuration files, "5" and "80001" are the chain IDs for Goerli and Mumbai respectively.
+
+### Prerequisites
+
+* A server for hosting Provider. See [this guide](setup-server.md) for how to create a server.
+* Docker Compose and Docker Engine are installed and configured on the server. See [this guide](setup-server.md#install-docker-engine-and-docker-compose) for how to install these products.
+* The RPC URLs and API keys for each of the networks to which the Provider will be connected. See [this guide](https://app.gitbook.com/o/mTcjMqA4ylf55anucjH8/s/BTXXhmDGzR0Xgj13fyfM/\~/changes/548/developers/obtaining-api-keys-for-blockchain-access) for how to obtain the URL and the API key.
+* The private key that will be used by Provider to encrypt/decrypt URLs.
+
+### Steps
+
+The steps to deploy the Provider using Docker Engine and Docker Compose are:
+
+1. [Create the /etc/docker/compose/provider/docker-compose.yml file](deploying-provider.md#1-create-the-etcdockercomposeproviderdocker-composeyml-file)
+2. [Create the /etc/systemd/system/docker-compose@provider.service file](deploying-provider.md#2-create-the-etcsystemdsystemdocker-composeproviderservice-file)
+3. [Reload the systemd manager configuration](deploying-provider.md#3.-reload-the-systemd-manager-configuration)
+4. [Start the Provider service](deploying-provider.md#4.-start-the-provider-service)
+5. [Check the service's status](deploying-provider.md#5.-check-the-services-status)
+6. [Confirm the Provider is accessible](deploying-provider.md#6.-confirm-the-provider-is-accessible)
+7. [Check Provider service logs](deploying-provider.md#7.-check-provider-service-logs)
+
+#### 1. Create the /etc/docker/compose/provider/docker-compose.yml file
+
+From a terminal console, create the /etc/docker/compose/provider/docker-compose.yml file, then copy and paste the following content into it. Check the comments in the file and replace the fields with the specific values of your implementation.
+
+```yaml
+version: '3'
+services:
+  provider:
+    image: oceanprotocol/provider-py:latest => (check https://hub.docker.com/r/oceanprotocol/provider-py for a specific tag)
+    container_name: provider
+    restart: on-failure
+    ports:
+      - 8030:8030
+    networks:
+      backend:
+    environment:
+      ARTIFACTS_PATH: "/ocean-contracts/artifacts"
+      NETWORK_URL: '{"5":"https://goerli.infura.io/v3/","80001":"https://polygon-mumbai.infura.io/v3/"}' => (append your API key to each RPC URL)
+      PROVIDER_PRIVATE_KEY: '{"5":"","80001":""}' => (add the Provider private key for each chain)
+      OCEAN_PROVIDER_TIMEOUT: "9000"
+      OPERATOR_SERVICE_URL: "https://stagev4.c2d.oceanprotocol.com" => (use a custom value for the Operator Service URL)
+      AQUARIUS_URL: "http://localhost:5000" => (use a custom value for the Aquarius URL)
+      REQUEST_TIMEOUT: "10"
+networks:
+  backend:
+    driver: bridge
+```
+
+#### 2. Create the _/etc/systemd/system/docker-compose@provider.service_ file
+
+Create the _/etc/systemd/system/docker-compose@provider.service_ file, then copy and paste the following content into it. This example file can be customized if needed.
+
+```
+[Unit]
+Description=%i service with docker compose
+Requires=docker.service
+After=docker.service
+
+[Service]
+Type=oneshot
+RemainAfterExit=true
+Environment="PROJECT=ocean"
+WorkingDirectory=/etc/docker/compose/%i
+ExecStartPre=/usr/bin/env docker-compose -p $PROJECT pull
+ExecStart=/usr/bin/env docker-compose -p $PROJECT up -d
+ExecStop=/usr/bin/env docker-compose -p $PROJECT stop
+
+[Install]
+WantedBy=multi-user.target
+```
+
+#### 3. Reload the systemd manager configuration
+
+Run the following command to reload the systemd manager configuration:
+
+```bash
+sudo systemctl daemon-reload
+```
+
+Optionally, you can enable the service to start at boot, using the following command:
+
+```bash
+sudo systemctl enable docker-compose@provider.service
+```
+
+#### 4. Start the Provider service
+
+To start the Provider service, run the following command:
+
+```bash
+sudo systemctl start docker-compose@provider.service
+```
+
+#### 5. Check the service's status
+
+Check the status of the service by running the following command. The output should be similar to the one shown below.
+
+```bash
+$ sudo systemctl status docker-compose@provider.service
+● docker-compose@provider.service - provider service with docker compose
+     Loaded: loaded (/etc/systemd/system/docker-compose@provider.service; disabled; vendor preset: enabled)
+     Active: active (exited) since Wed 2023-06-14 09:41:53 UTC; 20s ago
+    Process: 4118 ExecStartPre=/usr/bin/env docker-compose -p $PROJECT pull (code=exited, status=0/SUCCESS)
+    Process: 4126 ExecStart=/usr/bin/env docker-compose -p $PROJECT up -d (code=exited, status=0/SUCCESS)
+   Main PID: 4126 (code=exited, status=0/SUCCESS)
+        CPU: 93ms
+
+Jun 14 09:41:52 testvm systemd[1]: Starting provider service with docker compose...
+Jun 14 09:41:52 testvm env[4118]: provider Pulling
+Jun 14 09:41:53 testvm env[4118]: provider Pulled
+Jun 14 09:41:53 testvm env[4126]: Container provider Created
+Jun 14 09:41:53 testvm env[4126]: Container provider Starting
+Jun 14 09:41:53 testvm env[4126]: Container provider Started
+Jun 14 09:41:53 testvm systemd[1]: Finished provider service with docker compose.
+```
+
+#### 6. Confirm the Provider is accessible
+
+Once started, the Provider service is accessible on `localhost`, port 8030/tcp. Run the following command to access the Provider. The output should be similar to the one shown below.
+
+```bash
+$ curl localhost:8030
+{"chainIds":[5,80001],"providerAddresses":{"5":"0x00c6A0BC5cD0078d6Cd0b659E8061B404cfa5704","80001":"0x4256Df50c94D9a7e04610976cde01aED91eB531E"},"serviceEndpoints":{"computeDelete":["DELETE","/api/services/compute"],"computeEnvironments":["GET","/api/services/computeEnvironments"],"computeResult":["GET","/api/services/computeResult"],"computeStart":["POST","/api/services/compute"],"computeStatus":["GET","/api/services/compute"],"computeStop":["PUT","/api/services/compute"],"create_auth_token":["GET","/api/services/createAuthToken"],"decrypt":["POST","/api/services/decrypt"],"delete_auth_token":["DELETE","/api/services/deleteAuthToken"],"download":["GET","/api/services/download"],"encrypt":["POST","/api/services/encrypt"],"fileinfo":["POST","/api/services/fileinfo"],"initialize":["GET","/api/services/initialize"],"initializeCompute":["POST","/api/services/initializeCompute"],"nonce":["GET","/api/services/nonce"],"validateContainer":["POST","/api/services/validateContainer"]},"software":"Provider","version":"2.0.2"}
+```
+
+#### 7. Check Provider service logs
+
+If needed, use the docker CLI to check the Provider service logs.
+
+First, identify the container ID:
+
+```bash
+$ docker ps
+CONTAINER ID   IMAGE                              COMMAND                  CREATED          STATUS              PORTS                                       NAMES
+594415b13f8c   oceanprotocol/provider-py:v2.0.2   "/ocean-provider/doc…"   12 minutes ago   Up About a minute   0.0.0.0:8030->8030/tcp, :::8030->8030/tcp   provider
+```
+
+Then, check the logs from the Provider's docker container:
+
+```bash
+$ docker logs --follow provider
+[2023-06-14 09:31:02 +0000] [8] [INFO] Starting gunicorn 20.0.4
+[2023-06-14 09:31:02 +0000] [8] [INFO] Listening at: http://0.0.0.0:8030 (8)
+[2023-06-14 09:31:02 +0000] [8] [INFO] Using worker: sync
+[2023-06-14 09:31:02 +0000] [10] [INFO] Booting worker with pid: 10
+2023-06-14 09:31:02 594415b13f8c rlp.codec[10] DEBUG Consider installing rusty-rlp to improve pyrlp performance with a rust based backend
+2023-06-14 09:31:12 594415b13f8c ocean_provider.run[10] INFO incoming request = http, GET, 172.18.0.1, /?
+2023-06-14 09:31:12 594415b13f8c ocean_provider.run[10] INFO root endpoint called
+2023-06-14 09:31:12 594415b13f8c ocean_provider.run[10] INFO root endpoint response =
+[2023-06-14 09:41:53 +0000] [8] [INFO] Starting gunicorn 20.0.4
+[2023-06-14 09:41:53 +0000] [8] [INFO] Listening at: http://0.0.0.0:8030 (8)
+[2023-06-14 09:41:53 +0000] [8] [INFO] Using worker: sync
+[2023-06-14 09:41:53 +0000] [10] [INFO] Booting worker with pid: 10
+2023-06-14 09:41:54 594415b13f8c rlp.codec[10] DEBUG Consider installing rusty-rlp to improve pyrlp performance with a rust based backend
+2023-06-14 09:42:40 594415b13f8c ocean_provider.run[10] INFO incoming request = http, GET, 172.18.0.1, /?
+2023-06-14 09:42:40 594415b13f8c ocean_provider.run[10] INFO root endpoint called
+2023-06-14 09:42:40 594415b13f8c ocean_provider.run[10] INFO root endpoint response =
+```
+
+## Deploying Provider using Kubernetes with Docker Engine
+
+In this example, we will run Provider as a Kubernetes deployment resource. We will deploy Provider for two chains: Goerli (Ethereum test network) and Mumbai (Polygon test network). Please note that in the following configuration files, "5" and "80001" are the chain IDs for Goerli and Mumbai respectively.
+
+### Prerequisites
+
+* A server for hosting Provider. See [this guide](setup-server.md) for how to create a server.
+* Kubernetes with Docker Engine is installed and configured on the server. See [this chapter](setup-server.md#install-kubernetes-with-docker-engine) for information on installing Kubernetes.
+* The RPC URLs and API keys for each of the networks to which the Provider will be connected. See [this guide](https://app.gitbook.com/o/mTcjMqA4ylf55anucjH8/s/BTXXhmDGzR0Xgj13fyfM/\~/changes/548/developers/obtaining-api-keys-for-blockchain-access) for how to obtain the URL and the API key.
+* The private key that will be used by Provider to encrypt/decrypt URLs.
+* Aquarius is up and running.
+
+### Steps
+
+The steps to deploy the Provider in Kubernetes are:
+
+1. [Create a YAML file for Provider configuration](deploying-provider.md#1-create-a-yaml-file-for-provider-configuration)
+2. [Deploy the configuration](deploying-provider.md#2.-deploy-the-configuration)
+3. [Create a Kubernetes service](deploying-provider.md#3.-create-a-kubernetes-service)
+
+#### 1. Create a YAML file for Provider configuration
+
+From a terminal window, create a YAML file (in our example, the file is named provider-deploy.yaml), then copy and paste the following content into it. Check the comments in the file and replace the fields with the specific values of your implementation (RPC URLs, the private key, etc.).
+
+```yaml
+apiVersion: apps/v1
+kind: Deployment
+metadata:
+  labels:
+    app: provider
+  name: provider
+spec:
+  progressDeadlineSeconds: 2147483647
+  replicas: 1
+  revisionHistoryLimit: 2147483647
+  selector:
+    matchLabels:
+      app: provider
+  strategy:
+    rollingUpdate:
+      maxSurge: 25%
+      maxUnavailable: 25%
+    type: RollingUpdate
+  template:
+    metadata:
+      labels:
+        app: provider
+    spec:
+      containers:
+        - env:
+            - name: ARTIFACTS_PATH
+              value: /ocean-provider/artifacts
+            - name: NETWORK_URL
+              value: |
+                {"5":"https://goerli.infura.io/v3/","80001":"https://polygon-mumbai.infura.io/v3/"}
+            - name: PROVIDER_PRIVATE_KEY
+              value: |
+                {"5":"","80001":""}
+            - name: LOG_LEVEL
+              value: DEBUG
+            - name: OCEAN_PROVIDER_URL
+              value: http://0.0.0.0:8030
+            - name: OCEAN_PROVIDER_WORKERS
+              value: "4"
+            - name: IPFS_GATEWAY
+              value: < your IPFS gateway >
+            - name: OCEAN_PROVIDER_TIMEOUT
+              value: "9000"
+            - name: OPERATOR_SERVICE_URL
+              value: < Operator Service URL >
+            - name: AQUARIUS_URL
+              value: < Aquarius URL >
+            - name: UNIVERSAL_PRIVATE_KEY
+              value: < private key >
+            - name: REQUEST_TIMEOUT
+              value: "10"
+          image: oceanprotocol/provider-py:latest => (check https://hub.docker.com/r/oceanprotocol/provider-py for a specific tag)
+          imagePullPolicy: Always
+          name: provider
+          ports:
+            - containerPort: 8030
+              protocol: TCP
+          resources:
+            limits:
+              cpu: 500m
+              memory: 700Mi
+            requests:
+              cpu: 500m
+              memory: 700Mi
+          terminationMessagePath: /dev/termination-log
+          terminationMessagePolicy: File
+      dnsPolicy: ClusterFirst
+      restartPolicy: Always
+      schedulerName: default-scheduler
+      terminationGracePeriodSeconds: 30
+```
+
+Tip: before deployment, you can [validate](https://github.com/instrumenta/kubeval) the YAML file.
+
+#### 2. Deploy the configuration
+
+Deploy the configuration in Kubernetes using the following commands. The output should be similar to the one shown below.
+
+```bash
+$ kubectl config set-context --current --namespace ocean
+$ kubectl apply -f provider-deploy.yaml
+deployment.apps/provider created
+
+$ kubectl get pod -l app=provider
+NAME                        READY   STATUS    RESTARTS   AGE
+provider-865cb8cf9d-r9xm4   1/1     Running   0          67s
+```
+
+#### 3. Create a Kubernetes service
+
+The next step is to create a Kubernetes service (e.g. ClusterIP, NodePort, LoadBalancer, ExternalName) for this deployment, depending on the environment specifications.
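+
+As an illustration only, a minimal ClusterIP service for the deployment above might look like the sketch below (the `ocean` namespace matches the `kubectl config set-context` call from the previous step):
+
+```yaml
+apiVersion: v1
+kind: Service
+metadata:
+  name: provider
+  namespace: ocean
+spec:
+  type: ClusterIP
+  selector:
+    app: provider    # matches the pod label set by the deployment
+  ports:
+    - name: http
+      port: 8030
+      targetPort: 8030
+```
+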
+Follow [this link](https://kubernetes.io/docs/concepts/services-networking/service/) for details on how to create a Kubernetes service.
+
diff --git a/infrastructure/setup-server.md b/infrastructure/setup-server.md
new file mode 100644
index 00000000..c5240239
--- /dev/null
+++ b/infrastructure/setup-server.md
@@ -0,0 +1,122 @@
+---
+description: >-
+  The following tutorial shows how to create a server ready for hosting Ocean
+  Protocol's components.
+---
+
+# Setup a Server
+
+Each deployment of the Ocean components starts with setting up a server on which these will be installed, either on-premise or hosted on a cloud platform.
+
+## Prerequisites
+
+For simple configurations:
+
+* Operating System: a Linux distribution supported by the Docker Engine and Docker Compose products. Please refer to these links for choosing a compatible operating system: [Docker Compose supported platforms](https://docs.docker.com/desktop/install/linux-install/); [Docker Engine supported platforms](https://docs.docker.com/engine/install/).
+
+For complex configurations:
+
+* Operating System: a Linux distribution supported by Kubernetes and Docker Engine. Please refer to this link for details: [Kubernetes with Docker Engine](https://kubernetes.io/docs/setup/production-environment/container-runtimes/#docker).
+
+## Server Size
+
+The required CPU and memory for the server depend on the number of requests the component is expected to serve; however, the minimum server configuration is:
+
+* 1 core
+* 1 GB RAM
+
+## Steps
+
+The steps for setting up a server on which to deploy the Ocean components are the following:
+
+For simple configurations:
+
+1. [Install the operating system](setup-server.md#install-the-operating-system)
+2. [Install Docker Engine and Docker Compose](setup-server.md#install-docker-engine-and-docker-compose)
+
+For complex configurations:
+
+1. [Install the operating system](setup-server.md#install-the-operating-system)
+2. [Install Kubernetes with Docker Engine](setup-server.md#install-kubernetes-with-docker-engine)
+
+### Install the operating system
+
+As mentioned earlier, you can use either an on-premise server or one hosted in the cloud (AWS, Azure, Digitalocean, etc.). To install the operating system on an on-premise server, please refer to the installation documentation of the operating system.
+
+If you choose to use a server hosted in the cloud, you need to create the server using the user interface provided by the cloud platform. Following is an example of how to create a server in Digitalocean.
+
+#### Example: Create an Ubuntu Linux server in the Digitalocean cloud
+
+1. Create an account and set up billing
+
+Go to [https://www.digitalocean.com/](https://www.digitalocean.com/) and create an account. Provide the appropriate information for billing and accounting.
+
+2. Create a server
+
+Click on the **`Create`** button and choose the **`Droplets`** option from the dropdown.
+
+_Figure: Select Droplet_
+
+3. Select a server configuration
+
+Select Ubuntu as the operating system, and choose a plan and a configuration.
+
+_Figure: Configure the server_
+
+4. Select the region and set the root password
+
+Select the region where you want the component to be hosted and set a root password.
+
+_Figure: Select the region and set the root password_
+
+5. Finish the configuration and create the server
+
+Specify a hostname for the server and the project to which you assign the server, then click on `Create Droplet`.
+
+_Figure: Finalize and create the server_
+
+6. Access the server's console
+
+After the server is ready, select the `Access console` option from the dropdown list to open a terminal window.
+
+_Figure: Access the server's console_
+
+### Install Docker Engine and Docker Compose
+
+From the terminal window, run the following commands to install Docker Engine, including the Docker Compose plugin.
+
+```bash
+sudo apt-get update
+sudo apt-get install ca-certificates curl gnupg lsb-release
+sudo mkdir -p /etc/apt/keyrings
+curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
+echo \
+  "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu \
+  $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
+sudo apt-get update
+# docker-compose-plugin provides Docker Compose v2 ("docker compose")
+sudo apt-get install docker-ce docker-ce-cli containerd.io docker-compose-plugin
+```
+
+### Install Kubernetes with Docker Engine
+
+Kubernetes is an orchestration engine for containerized applications. Its initial setup depends on the platform on which it is deployed, so presenting how it must be installed and configured is outside the scope of this document.
+
+For a cloud deployment, most cloud providers have dedicated turnkey solutions for Kubernetes. A comprehensive list of such cloud providers is presented [here](https://kubernetes.io/docs/setup/production-environment/turnkey-solutions/).
+
+For an on-premise deployment of Kubernetes, please refer to this [link](https://kubernetes.io/docs/setup/).
+
+Now that the execution environment is prepared and the prerequisites installed, we can proceed to deploy the Ocean components.
diff --git a/orientation/images/confirm-backup-phrase (1).png b/orientation/images/confirm-backup-phrase (1).png
deleted file mode 100644
index 5fa40a1c..00000000
Binary files a/orientation/images/confirm-backup-phrase (1).png and /dev/null differ
diff --git a/orientation/images/confirm-backup-phrase (2).png b/orientation/images/confirm-backup-phrase (2).png
deleted file mode 100644
index 5fa40a1c..00000000
Binary files a/orientation/images/confirm-backup-phrase (2).png and /dev/null differ
diff --git a/orientation/images/confirm-backup-phrase.png b/orientation/images/confirm-backup-phrase.png
deleted file mode 100644
index 5fa40a1c..00000000
Binary files a/orientation/images/confirm-backup-phrase.png and /dev/null differ
diff --git a/orientation/images/create-new-metamask-wallet (1).png b/orientation/images/create-new-metamask-wallet (1).png
deleted file mode 100644
index f53a81cd..00000000
Binary files a/orientation/images/create-new-metamask-wallet (1).png and /dev/null differ
diff --git a/orientation/images/create-new-metamask-wallet (2).png b/orientation/images/create-new-metamask-wallet (2).png
deleted file mode 100644
index f53a81cd..00000000
Binary files a/orientation/images/create-new-metamask-wallet (2).png and /dev/null differ
diff --git a/orientation/images/create-new-metamask-wallet.png b/orientation/images/create-new-metamask-wallet.png
deleted file mode 100644
index f53a81cd..00000000
Binary files a/orientation/images/create-new-metamask-wallet.png and /dev/null differ
diff --git a/orientation/images/manage-tokens (1).png b/orientation/images/manage-tokens (1).png
deleted file mode 100644
index 09a6f4c4..00000000
Binary files a/orientation/images/manage-tokens (1).png and /dev/null differ
diff --git a/orientation/images/manage-tokens (2).png b/orientation/images/manage-tokens (2).png
deleted file mode 100644
index 09a6f4c4..00000000
Binary files a/orientation/images/manage-tokens (2).png and /dev/null differ
diff --git 
a/orientation/images/manage-tokens.png b/orientation/images/manage-tokens.png deleted file mode 100644 index 09a6f4c4..00000000 Binary files a/orientation/images/manage-tokens.png and /dev/null differ diff --git a/orientation/images/metamask-add-network (1).png b/orientation/images/metamask-add-network (1).png deleted file mode 100644 index 7b756c36..00000000 Binary files a/orientation/images/metamask-add-network (1).png and /dev/null differ diff --git a/orientation/images/metamask-add-network (2).png b/orientation/images/metamask-add-network (2).png deleted file mode 100644 index 7b756c36..00000000 Binary files a/orientation/images/metamask-add-network (2).png and /dev/null differ diff --git a/orientation/images/metamask-add-network.png b/orientation/images/metamask-add-network.png deleted file mode 100644 index 7b756c36..00000000 Binary files a/orientation/images/metamask-add-network.png and /dev/null differ diff --git a/orientation/images/metamask-browser-extension (1).png b/orientation/images/metamask-browser-extension (1).png deleted file mode 100644 index 7f590505..00000000 Binary files a/orientation/images/metamask-browser-extension (1).png and /dev/null differ diff --git a/orientation/images/metamask-browser-extension (2).png b/orientation/images/metamask-browser-extension (2).png deleted file mode 100644 index 7f590505..00000000 Binary files a/orientation/images/metamask-browser-extension (2).png and /dev/null differ diff --git a/orientation/images/metamask-browser-extension.png b/orientation/images/metamask-browser-extension.png deleted file mode 100644 index 7f590505..00000000 Binary files a/orientation/images/metamask-browser-extension.png and /dev/null differ diff --git a/orientation/images/metamask-chrome-extension (1).png b/orientation/images/metamask-chrome-extension (1).png deleted file mode 100644 index af811b08..00000000 Binary files a/orientation/images/metamask-chrome-extension (1).png and /dev/null differ diff --git a/orientation/images/metamask-chrome-extension (2).png b/orientation/images/metamask-chrome-extension (2).png deleted file mode 100644 index af811b08..00000000 Binary files a/orientation/images/metamask-chrome-extension (2).png and /dev/null differ diff --git a/orientation/images/metamask-chrome-extension.png b/orientation/images/metamask-chrome-extension.png deleted file mode 100644 index af811b08..00000000 Binary files a/orientation/images/metamask-chrome-extension.png and /dev/null differ diff --git a/orientation/images/secret-backup-phrase (1).png b/orientation/images/secret-backup-phrase (1).png deleted file mode 100644 index 04a2a278..00000000 Binary files a/orientation/images/secret-backup-phrase (1).png and /dev/null differ diff --git a/orientation/images/secret-backup-phrase (2).png b/orientation/images/secret-backup-phrase (2).png deleted file mode 100644 index 04a2a278..00000000 Binary files a/orientation/images/secret-backup-phrase (2).png and /dev/null differ diff --git a/orientation/images/secret-backup-phrase.png b/orientation/images/secret-backup-phrase.png deleted file mode 100644 index 04a2a278..00000000 Binary files a/orientation/images/secret-backup-phrase.png and /dev/null differ diff --git a/rewards/README.md b/rewards/README.md new file mode 100644 index 00000000..e326b29c --- /dev/null +++ b/rewards/README.md @@ -0,0 +1,49 @@ +--- +description: Learn how to generate OCEAN rewards by using our Data Farming dApp +cover: ../.gitbook/assets/cover/rewards_banner.png +coverY: 0 +--- + +# 💰 Rewards + +### Why did we create the Data Farming 
dApp?
+
+The purpose of Ocean Protocol's Data Farming dApp reward system is to incentivize the curation and publishing of high-quality data NFTs in the Ocean ecosystem. Data Farming participants earn OCEAN rewards for these activities. At a minimum, Data Farmers earn "passive rewards" for locking their OCEAN tokens to get veOCEAN tokens in return. Then, Data Farmers can maximize their yield by earning "active rewards" with their veOCEAN tokens. Active rewards are generated by participants voting on their favorite Ocean ecosystem NFTs by allocating their veOCEAN to these assets, thus gaining a portion of these assets' sales.
+
+### The belt system
+
+Earn your white, blue, purple, brown, and black belts in Data Farming knowledge by reading our docs on this topic in increasing difficulty!
+
+## veOCEAN
+
+Learning about [veOCEAN](veocean.md) will help you answer the question "What is the purpose of holding veOCEAN?" & give insights on how veOCEAN (vote-escrowed OCEAN) works. It will teach you everything you need to know about why it exists and how it works.
+
+You will learn that just by holding veOCEAN passively, you are able to earn rewards.
+
+veOCEAN is a fork of veCRV. This enables participants to become governance delegates, eligible to receive rewards and to engage with different protocol mechanisms.
+
+## Data Farming
+
+![DF Rewards Page](../.gitbook/assets/rewards/df_rewards_page.png)
+
+[Data Farming 101](df-intro.md) introduces the different reward systems, how they work, and how to access them. By the end of the page, you should be more familiar with how Data Farming works and able to take the next steps to curate assets.
+
+[Data Farming Background](df-max-out-yield.md) will provide you with more intuition about Data Farming, briefly explain the Reward Function, and describe how the program has evolved over time.
+
+## Delegation
+
+[Delegation](../user-guides/how-to-data-farm.md#how-to-delegate-your-active-rewards) will teach you how to share your veOCEAN allocation power with other users who can manage Data Farming for you.
+
+Once delegated, rewards will be sent to the wallet address you delegated to. The delegation receiver is in charge of your rewards and of the process of returning them back to you.
+
+## Further Reading
+
+Finally, if you want to continue expanding your knowledge of OCEAN token emissions and APY estimates, and get useful answers to some of the most common questions, you can read the following:
+
+[Emissions & APYs](df-emissions-apys.md) will provide you with information about how OCEAN will be released over time through the Data Farming program, along with APY studies.
+
+Our [FAQ](../discover/faq.md) answers many different questions about staking, chains, deployments, and other details that may be valuable to you.
+
+## Reference
+
+All content within has been assembled by reference to the [Ocean Data Farming Series](https://blog.oceanprotocol.com/ocean-data-farming-series-c7922f1d0e45), the official [Ocean Protocol GitHub repositories](https://github.com/oceanprotocol/), and the [v4 Whitepaper](https://oceanprotocol.com/tech-whitepaper.pdf).
diff --git a/rewards/df-basic.md b/rewards/df-basic.md
new file mode 100644
index 00000000..8882562c
--- /dev/null
+++ b/rewards/df-basic.md
@@ -0,0 +1,27 @@
+---
+description: Learn the basic moves to start kicking a** at Data Farming
+---
+
+# DF Basic Actions (Blue Belt)
+
+_Figure: Like Neo, you have great potential._
+
+### Get Started
+
+Our [User Guides](../user-guides/README.md) get you started with Data Farming quickly so you can do the basic operations. Follow these guides to earn your blue belt in Data Farming understanding.
+
+{% content-ref url="../user-guides/get-started-df.md" %}
+[get-started-df.md](../user-guides/get-started-df.md)
+{% endcontent-ref %}
+
+{% content-ref url="../user-guides/how-to-data-farm.md" %}
+[how-to-data-farm.md](../user-guides/how-to-data-farm.md)
+{% endcontent-ref %}
+
+{% content-ref url="../user-guides/claim-ocean-rewards.md" %}
+[claim-ocean-rewards.md](../user-guides/claim-ocean-rewards.md)
+{% endcontent-ref %}
+
+### Not much of a reader? Watch and learn, friend
+
+{% embed url="https://youtu.be/zAQlPHkK3og" %}
diff --git a/rewards/df-emissions-apys.md b/rewards/df-emissions-apys.md
new file mode 100644
index 00000000..08e7deae
--- /dev/null
+++ b/rewards/df-emissions-apys.md
@@ -0,0 +1,60 @@
+---
+description: >-
+  Hey there, Bruce Lee! If you can understand the emission curves and estimated
+  APYs, then you've earned yourself a solid black belt in Data Farming
+  understanding 🥋
+---
+
+# DF Emissions & APYs (Black Belt)
+
+_Figure: Like a true master of The Way of Data Farming._
+
+### Why veOCEAN is important to OceanDAO
+
+veOCEAN enables OceanDAO to be more user-focused and community-driven, with future-oriented revenue sharing like CurveDAO:
+
+* ve (vote escrowed) is at the heart, with v = voting (in asset curation) and e = escrowed (locked) OCEAN. The longer Data Farmers lock up their OCEAN, the more voting power and rewards they get, which reconciles near-term and long-term DAO incentives.
+* Via veOCEAN, OceanDAO has an increased bias toward automation and toward minimizing the governance attack (hack) surface.
+
+The baseline emissions schedule determines the weekly OCEAN budget for this phase. The schedule mimics Bitcoin's, including a half-life of 4 years. Unlike Bitcoin, there is a burn-in period to ratchet up value-at-risk versus time:
+
+* The curve initially gets a multiplier of 10% for 12 months (DF Main 1)
+* Then, it transitions to a multiplier of 25% for 6 months (DF Main 2)
+* Further, a multiplier of 50% for 6 months (DF Main 3)
+* Finally, a multiplier of 100% (DF Main 4)
+
+We implement the first three phases as constants, because they are relatively short in duration. We implement the fourth phase as a Bitcoin-style exponential: constant, with the constant dividing by two ("halvening") every four years.
+
+Let's visualize!
+
+## Emissions — first 5 years.
+
+The image below shows the first 5 years. The y-axis is OCEAN released each week. It's log-scaled to easily see the differences. The x-axis is time, measured in weeks. We can see the distinct phases for DF Alpha (DF1 // week 0), DF/VE Alpha (DF5 // week 4), DF Beta (DF9 // week 8), DF Main 1 (DF29 // week 28), DF Main 2 (DF80 // week 79), DF Main 3 (DF106 // week 105), and DF Main 4 (DF132 // week 131).
+
+_Figure: OCEAN released to DF per week — first 5 years_
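+
+As a worked summary of the DF Main 4 schedule (our notation, assuming 52-week years): with $$B_0$$ the weekly OCEAN budget at the start of DF Main 4 and $$w$$ the number of weeks elapsed since then, the weekly budget is
+
+$$
+B(w) = B_0 \cdot 2^{-\lfloor w/208 \rfloor}
+$$
+
+so the budget stays constant for 208 weeks (4 years) at a time, then halves.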
+
+## Emissions — first 20 years.
+
+The image below is like the previous one: OCEAN released per week, but now for the first 20 years. Week 131 onwards is DF Main 4. We can see that the y-value divides by two ("halvens") every four years.
+
+_Figure: OCEAN released to DF per week — first 20 years_
+
+## Total OCEAN released.
+
+The image below shows the total OCEAN released by DF for the first 20 years. The y-axis is log-scaled to capture both the small initial rewards and the exponentially larger values later on. The x-axis is also log-scaled so that we can more readily see how the curve converges over time.
+
+_Figure: Total OCEAN released to DF — first 20 years_
+
+## Example APYs
+
+The plot below shows estimated APY over time. Green includes both passive and active rewards; black is just passive rewards. As of DF29, wash consume is no longer profitable, so we should expect a large drop in DCV (data consume volume) and therefore in active rewards. So passive rewards (black) provide a great baseline, with upside in active rewards (green).
+
+APYs are an estimate because APY depends on OCEAN locked. OCEAN locked for future weeks is not known precisely; it must be estimated. The yellow line is the model for OCEAN locked. We modeled OCEAN locked by observing linear growth from week 5 (when OCEAN locking was introduced) to week 28 (now): OCEAN locked grew from 7.89M OCEAN to 34.98M OCEAN, or 1.177M more OCEAN locked per week.
+
+_Figure: Green: estimated APYs (passive + active). Black: estimated APYs (just passive). Yellow: estimated staking_
+
+All the plots are calculated from [this Google Sheet](https://docs.google.com/spreadsheets/d/1F4o7PbV45yW1aPWOJ2rwZEKkgJXbIk5Yq7tj8749drc/edit#gid=1051477754).
+
+OCEAN lock time affects APY. The numbers above assume that all locked OCEAN is locked for 4 years, so that 1 OCEAN → 1 veOCEAN. But APY could be much lower or higher if you lock for shorter durations, because veOCEAN (and therefore your share of rewards) scales with lock duration. Here are approximate bounds.
+
+If you lock for 4 years, and everyone else locks for 2, then multiply your expected APY by 2. If you lock for 4 years and others for 1, then multiply by 4. Conversely, if you lock for 2 years and everyone else for 4, then divide your expected APY by 2. If you lock for 1 year and others for 4, then divide by 4. These numbers assume that you're actively allocating your veOCEAN towards high-DCV data assets. For passive locking or low-DCV data assets, divide the APY by 2 (approximate).
diff --git a/rewards/df-intro.md b/rewards/df-intro.md
new file mode 100644
index 00000000..bacddba4
--- /dev/null
+++ b/rewards/df-intro.md
@@ -0,0 +1,59 @@
+---
+description: Learn the fundamentals of The Way of Data Farming 🧑‍🏫
+---
+
+# Data Farming 101 (White Belt)
+
+_Figure: Meet your sensei._
+ +### What is Data Farming? + +Data Farming (DF) is Ocean Protocol's **incentive system for curating and publishing valuable assets in the Ocean ecosystem.** Participants vote on the Ocean ecosystem assets that they believe are high quality and likely to sell. If they are right, then these Data Farmers **get a portion of the sales of these Ocean ecosystem assets** they voted on! + +(If you are familiar with 'liquidity mining', then you will find that Data Farming is similar but tuned instead for the curation of high quality assets in the Ocean ecosystem.) + +### What's the difference between Data Farming and Yield Farming? + +Unlike yield farming in DeFi, data farming has real intrinsic utility for Ocean Protocol stakeholders: as Data Farmers determine which are the highest quality assets in the Ocean ecosystem to purchase, then the Data Farmers earn active OCEAN rewards when these assets sell. It's this **curation of the "best" assets in the Ocean ecosystem** that shortens the search times for those looking to shop for assets in the Ocean ecosystem. We also put in place an incentive system for Publishers of assets to gain **2x the rewards** in Data Farming, thus driving forward the addition of great assets in the Ocean ecosystem. + +### Passive and Active Rewards + +Every week OCEAN rewards are paid out to Data Farmers in two different ways: **passive** rewards and **active** rewards. The two reward functions produce different variable APYs. + +#### What are Passive Rewards? + +Passive rewards are the OCEAN rewards paid to Data Farmers just for locking their OCEAN tokens. + +
+ +[To start getting passive rewards, go here.](../user-guides/get-started-df.md) + +#### What are Active Rewards? + +Active rewards are OCEAN rewards paid to Data Farmers who allocate their veOCEAN tokens to Ocean ecosystem assets. They're called active rewards because the amount of rewards depends on the Data Farmer actively selecting assets and allocating veOCEAN to them. **Active rewards yield depends on the sales of allocated assets.** No sales = no rewards, so choose your favorites wisely & then allocate. Always DYOR. + +Active rewards are governed and defined by the [Reward Function](df-max-out-yield.md#reward-schedule). + +[To start getting active rewards, go here.](../user-guides/how-to-data-farm.md) + +#### Splitting the Pie + +Each Data Farming weekly round has a pool of OCEAN rewards: 50% of the pool is paid out as passive rewards & 50% as active rewards. + +| Passive Rewards | Active Rewards | +| --------------- | -------------- | +| 50% | 50% | + +### What are Publisher Rewards? + +

Publishing makes you *more* OCEAN rewards
+ +Data Farming strongly incentivizes publishing assets in the Ocean ecosystem by giving double the active rewards to Data Farmers who allocate to their own published assets. + +How is it calculated? _All the veOCEAN a Data Farmer has allocated to an asset they’ve published is **doubled for the rewards calculation.**_ + +You can read more about the implementation [in this blog post](https://blog.oceanprotocol.com/data-farming-publisher-rewards-f2639525e508). + +## [GET STARTED DATA FARMING HERE.](https://df.oceandao.org) + +### Unsure how? Our [guides](../user-guides/README.md) will show you how to do the basics. diff --git a/rewards/df-max-out-yield.md b/rewards/df-max-out-yield.md new file mode 100644 index 00000000..d90a9a15 --- /dev/null +++ b/rewards/df-max-out-yield.md @@ -0,0 +1,97 @@ +--- +description: >- + If you've gotten this far, then you're halfway to getting a black belt in + Ocean Protocol's Data Farming dApp! 🥋 +--- + +# DF Max Out Yield (Purple Belt) + +

*You know enough to be dangerous.*
+ +### How to Maximize Your Yield + +If you only lock your OCEAN tokens to get passive yield, then you're leaving money on the table. Data Farming rewards farmers who allocate their veOCEAN tokens to assets that **generate revenue** in the Ocean ecosystem. (No revenue, no rewards.) In addition, Data Farming incentivizes **publishing** assets in the Ocean ecosystem too - you get **2x the allocation power** when you allocate to an asset that you publish! + +Thus, if you really want to max out your APY: + +1. Lock your OCEAN for veOCEAN to claim weekly Passive Rewards +2. Create & publish assets (and make $ in selling them) — or work with people who can +3. Lock OCEAN and stake veOCEAN on your published assets for weekly Active Rewards +4. Claim the rewards and compound + +#### Don't have time to publish your own datasets? + +Another way to improve your yield is by [delegating](../user-guides/how-to-data-farm.md#how-to-delegate-your-active-rewards) your veOCEAN to someone to generate Active Rewards for you! The idea is that they may do a better job of publishing assets or picking winners than you can. However, there is some risk: the rewards generated will be sent to the person you delegated to, and it's their responsibility to return those rewards to you if that's the agreement you both made. To read more, see our [info on Delegation](../user-guides/how-to-data-farm.md#how-to-delegate-your-active-rewards). + +### Those assets don't sell themselves! + +Marketing your assets to buyers is your challenge. Just because you publish them in the Ocean ecosystem doesn't mean that they will sell. It will take real work. Your reward is great APY. It's incentives all the way down 🙂 + +
+ +### Measuring Data Farming's Success + +**Data Consume Volume (DCV)** is our term for **the total $ amount spent on purchases of Ocean ecosystem assets**, transaction fees, and more. The higher the DCV, the more OCEAN rewards are distributed to Data Farmers. It's that simple! + +### How Rewards are Calculated + +The Reward Function (RF) governs how active rewards are allocated to Data Farmers. + +**Rewards are calculated as follows:** + +1. Distribute OCEAN across each asset **based on rank**: the highest-DCV asset gets the most OCEAN, etc. +2. For each asset and each veOCEAN holder: + – If the holder is a publisher, double the effective allocation + – Baseline rewards = (% allocation in asset) \* (OCEAN for an asset) + – Bound rewards to the asset by 125% APY + – Bound rewards by the asset's DCV \* 0.1% + +For mathematicians and coders, you can find this code inside [calc_rewards.py](https://github.com/oceanprotocol/df-py/blob/main/df_py/volume/calc_rewards.py) in the Ocean Protocol [df-py repo](https://github.com/oceanprotocol/df-py/)! + +### What are Ranked Rewards? + +In Data Farming Round 23, Ranked Rewards were introduced to smooth out the reward distribution by using a logarithmic function. + +**Since rewards are distributed across the Top 100 assets, all data farmers (Publishers & Curators) are now incentivized to support a broader range of assets rather than optimizing on a single asset.** + +At the top end, this helps increase the quality and diversification of inventory. + +At the bottom end, this eliminates some potential free-rider issues and smooths out the reward distribution. + +![Ranked Rewards](../.gitbook/assets/rewards/ranked_rewards_study.png) + +You can read more about the why, what, and how of Ranked Rewards [in this blog post](https://blog.oceanprotocol.com/data-farming-df22-completed-df23-started-reward-function-tuned-ffd4359657ee) and find the full study [in these slides](https://docs.google.com/presentation/d/1HIA2zV8NUPpCELmi2WFwnAbHmFFrcXjNQiCpEqJ2Jdg/). + +### Assets that Qualify for Data Farming + +Data assets that have veOCEAN allocated towards them get Data Farming active rewards. + +The asset may be of any type: a dataset, an algorithm for Compute-to-Data, or any other datatoken-gated system. The asset may be fixed-price or free. If fixed-price, any token of exchange is fine (OCEAN, H2O, USDC, etc.). + +To qualify for DF, an asset must also: + +* Be created by Ocean smart contracts [deployed](https://github.com/oceanprotocol/contracts/blob/v4main/addresses/address.json) by OPF to [production networks](../discover/networks/README.md) +* Be listed on Ocean Market +* Not be in [purgatory](https://github.com/oceanprotocol/list-purgatory/blob/main/policies/README.md) + +### A Brief History of Data Farming + +Data Farming has evolved over time and will continue to do so as the Emission Curve progresses. Below are the phases and parameters of the Data Farming program's evolution. We are now in the DF Main phase. + +**DF Alpha - Rounds 1-4 (4 wks)**\ +10K OCEAN rewards were budgeted per week. Counting started Thu June 16, 2022 and ended July 13, 2022. Rewards were distributed at the end of every week, for the activity of the previous week. It ran for 4 weeks. The aim was to test technology, learn, and onboard data publishers. + +**DF/VE Alpha - Rounds 5-8 (4 wks)**\ +10K OCEAN rewards were budgeted per week. Counting started Thu Sep 29, 2022 and ended Oct 27, 2022. 
Rewards were distributed at the end of every week, for the activity of the previous week. It ran for 4 weeks. The aim was to resume Data Farming along with veOCEAN, test the technology, onboard data publishers, and keep learning. + +**DF Beta - Rounds 9-28 (20 wks)**\ +Up to 100K OCEAN rewards were budgeted per week. Counting started Thu Oct 27, 2022, and ended on March 15, 2023. It ran for 20 weeks. The aim was to test the effect of larger incentives and support ecosystem participation, while continually refining the underlying technology. + +**DF Main - Rounds 29-1000+**\ +We are now in DF Main, which immediately followed the completion of DF Beta on Thu Mar 16, 2023. Rewards begin at 150K OCEAN per week and ramp up to 1.1M OCEAN per week. DF Main emits 503.4M OCEAN worth of rewards and lasts for decades. + +The amount of OCEAN released is determined by the emission schedule as defined by the [Emission Curve](df-emissions-apys.md#emissions--first-5-years), and perhaps more easily understood in the Reward Schedule below. + +### Reward Schedule + +The table below shows the total amount of OCEAN rewards that will be distributed among Passive and Active rewards each week. The table cross-references DF Round Number, Start Date, Phase & Week, Sub-Phase & Week, and OCEAN Rewards/Week. + +

*Ocean Reward Schedule for the next 20+ years*
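To make the steps in "How Rewards are Calculated" above more concrete, here is a rough sketch in Python. It is an illustration only: the function and variable names are ours, the per-week treatment of the 125% APY bound is an assumption, and the authoritative logic lives in [calc_rewards.py](https://github.com/oceanprotocol/df-py/blob/main/df_py/volume/calc_rewards.py) in the df-py repo.

```python
# Illustrative sketch only; see calc_rewards.py in df-py for the real thing.

APY_BOUND = 1.25            # "bound rewards to the asset by 125% APY"
WEEKS_PER_YEAR = 52         # assumption: APY bound applied pro-rata per week
DCV_BOUND_FRACTION = 0.001  # "bound rewards by asset's DCV * 0.1%"

def weekly_reward(holder_alloc, total_alloc, ocean_for_asset,
                  holder_stake, asset_dcv, is_publisher=False):
    """One holder's active reward for one asset, after the bounds above."""
    if is_publisher:
        holder_alloc *= 2  # publisher stake counts double
    baseline = (holder_alloc / total_alloc) * ocean_for_asset
    apy_bound = holder_stake * APY_BOUND / WEEKS_PER_YEAR
    dcv_bound = asset_dcv * DCV_BOUND_FRACTION
    return min(baseline, apy_bound, dcv_bound)

# e.g. weekly_reward(100, 1000, 5000, 100, 20000)
# -> min(500, ~2.40, 20.0) = ~2.40 OCEAN: the APY bound binds here
```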
diff --git a/rewards/images/DF-Grid.png b/rewards/images/DF-Grid.png deleted file mode 100644 index 5820084c..00000000 Binary files a/rewards/images/DF-Grid.png and /dev/null differ diff --git a/rewards/images/Rewards-Page.png b/rewards/images/Rewards-Page.png deleted file mode 100644 index 4ca998a8..00000000 Binary files a/rewards/images/Rewards-Page.png and /dev/null differ diff --git a/rewards/images/veOCEAN-After-Lock.png b/rewards/images/veOCEAN-After-Lock.png deleted file mode 100644 index 6f03a021..00000000 Binary files a/rewards/images/veOCEAN-After-Lock.png and /dev/null differ diff --git a/rewards/images/veOCEAN-Before-Lock.png b/rewards/images/veOCEAN-Before-Lock.png deleted file mode 100644 index 9949800e..00000000 Binary files a/rewards/images/veOCEAN-Before-Lock.png and /dev/null differ diff --git a/rewards/images/veOCEAN-DF-Homepage.png b/rewards/images/veOCEAN-DF-Homepage.png deleted file mode 100644 index 3e5bcae8..00000000 Binary files a/rewards/images/veOCEAN-DF-Homepage.png and /dev/null differ diff --git a/rewards/images/vedf_youtube_thumbnail.png b/rewards/images/vedf_youtube_thumbnail.png deleted file mode 100644 index 455ddddd..00000000 Binary files a/rewards/images/vedf_youtube_thumbnail.png and /dev/null differ diff --git a/rewards/veOcean-Data-Farming-Tutorial.md b/rewards/veOcean-Data-Farming-Tutorial.md deleted file mode 100644 index c18c8566..00000000 --- a/rewards/veOcean-Data-Farming-Tutorial.md +++ /dev/null @@ -1,83 +0,0 @@ ---- -description: Follow this step-by-step guide to get OCEAN rewards. ---- -# Tutorial: How To Earn OCEAN Rewards - -There are two types of OCEAN rewards: passive and active rewards. OCEAN token holders may generate passive OCEAN rewards by locking up OCEAN in exchange for veOCEAN tokens. veOCEAN tokens can then be allocated to Ocean Market datasets and algorithms to generate active OCEAN rewards. - -[![veOCEAN and Data Farming Rewards](./images/vedf_youtube_thumbnail.png)](https://www.youtube.com/watch?v=zAQlPHkK3og) - -## How do I get rewards? - -To generate rewards, start by navigating to [df.oceandao.org](https://df.oceandao.org). At the top of this page is the weekly round of OCEAN rewards and the quantity of OCEAN rewards to be distributed. The countdown timer shows the time until **each Thursday** when rewards are distributed. OCEAN rewards can be claimed every Thursday on the [Rewards page](https://df.oceandao.org/rewards). - -- If you don't have veOCEAN tokens, then click the [Get veOCEAN](https://df.oceandao.org/veocean) button on the Passive Rewards panel on the left side of the page to navigate to the veOCEAN tab. - -- If you already have veOCEAN tokens, then on the Active Rewards panel on the right side of the page, click the [Set Allocations](https://df.oceandao.org/data) button to navigate to the Data Farming tab. - -![](./images/veOCEAN-DF-Homepage.png) - -## veOCEAN - -### Passive Rewards - -When a user locks their OCEAN tokens for a finite period of time, they get veOCEAN tokens in return. Based on the quantity of veOCEAN, the user accumulates weekly OCEAN rewards. Because rewards are generated without human intervention, these are called "passive" OCEAN rewards. OCEAN rewards are claimable every Thursday on the [Rewards page](https://df.oceandao.org/rewards). - -#### How do I get veOCEAN? - -After navigating to the [veOCEAN page](https://df.oceandao.org/veocean), you can generate passive OCEAN rewards by locking OCEAN tokens on the "Lock OCEAN, get veOCEAN" panel on the right side of the page. 
Connect a wallet to see the balance of OCEAN tokens update above the OCEAN Amount form field. Select a lock end date to see the Lock Multiplier and Receive veOcean fields update. - -**The more OCEAN tokens you lock or if you lock them for a longer period of time, then the more rewards you get!** - -Click the checkbox below the inactive pink ALLOW button, then click the activated pink ALLOW button. Sign the transaction with your wallet. Then, click the LOCK OCEAN button. Sign the transaction with your wallet. Note that all veOCEAN contracts are deployed on the Ethereum mainnet. - - -![](./images/veOCEAN-Before-Lock.png) - - -Now the OCEAN tokens are locked in exchange for veOCEAN. The left side panel called "My veOCEAN" shows the corresponding balances of OCEAN and veOCEAN. You can withdraw your OCEAN tokens on this panel when the lock time ends. - -**Note that your OCEAN tokens cannot be withdrawn until the Lock End Date!** - -![](./images/veOCEAN-After-Lock.png) - -Notice the right side panel is now titled "Update veOCEAN Lock". You can add OCEAN tokens to your lock or you can increase the Lock End Date, but you cannot shorten your Lock End Date. - -## Data Farming - -### Active Rewards - -When a user allocates veOCEAN tokens to Ocean Market projects, then weekly OCEAN rewards are given to a user based on the sales of those projects. Since these rewards depend on human intervention to decide the allocations, they are categorized as "active" rewards instead of passive rewards. OCEAN rewards are claimable every Thursday on the [Rewards page](https://df.oceandao.org/rewards). - -#### How do I allocate veOCEAN? - -You can generate active OCEAN rewards by allocating veOCEAN to various OCEAN Market projects to gain a portion of project sales. - -Click on the Data Farming tab at the top of the page to navigate to the [Data Farming page](https://df.oceandao.org/data). - -![](./images/DF-Grid.png) - -The OCEAN Market datasets and algorithms are listed in the grid. Each column is sortable, and there is a Search field on top of the grid to search for specific projects. - -It is recommended to allocate all of your veOCEAN tokens to OCEAN Market projects to generate maximum rewards. When you allocate your veOCEAN to these datasets or algorithms, then you get a portion of those projects's sales. **If you allocate your veOCEAN to a project with many other allocators, then your portion of rewards will become diluted because there are more allocators to reward. You may be interested then in allocating veOCEAN to a project with fewer allocators to generate a greater percentage of rewards. However, if a project does not sell, then no rewards are generated.** Thus, it is important to allocate veOCEAN to projects you believe in, and do your research. - -Once you allocate your percentage of veOCEAN to projects using the MyAllocation column, then click on the UPDATE ALLOCATIONS button and sign the transaction with your wallet. Note that as the OCEAN Market projects exist on different networks i.e. Ethereum, Polygon, etc. you can allocate your veOCEAN towards assets that are published on these different networks. - -## Claim Rewards - -Click on the Rewards tab at the top of the page to come to the [same page](https://test-df.oceandao.org/rewards) as at the beginning of this tutorial. Notice the balance of veOCEAN appears under the Passive Rewards panel on the left and the percentage allocated appears on the Active Rewards panel on the right. - -All rewards are paid out in OCEAN tokens. 
On every Thursday the pink "Claim" buttons on this page become activated, and you can claim your weekly OCEAN rewards directly into your wallet by clicking on these active buttons. - -![](./images/Rewards-Page.png) - -#### Linear Decay - -**Your balance of veOCEAN may be less than the amount when you first locked your tokens because your veOCEAN balance decreases linearly over time until the Lock End Date when you can withdraw your OCEAN tokens.** This is because rewards are designed to be paid out weekly in a decreasing amount until you unlock your OCEAN tokens entirely. The veOCEAN code is a fork of Curve's battle tested [veCRV](https://curve.readthedocs.io/dao-vecrv.html) token code. - -### Withdrawl - -After the Lock End Date, then you can withdraw your principal OCEAN tokens on the [veOCEAN page](https://df.oceandao.org/veocean) on the left side panel. - -## Learn More -If you would like to find out more details about veOCEAN, Data Farming, and rewards calculations, then please visit the About tab to read a great [blog post](https://blog.oceanprotocol.com/ocean-data-farming-series-c7922f1d0e45) on this topic. diff --git a/rewards/veocean.md b/rewards/veocean.md new file mode 100644 index 00000000..e26c76e5 --- /dev/null +++ b/rewards/veocean.md @@ -0,0 +1,132 @@ +--- +description: >- + Let's discuss the "ve" in veOCEAN for our last jutsu before earning a black + belt in Data Farming knowledge! +--- + +# DF "ve" in veOCEAN (Brown Belt) + +

*Data Farming is getting effortless.*
+ +### What does the "ve" in veOCEAN stand for? + +"ve" stands for **vote escrowed**. The "vote" part is what you really need to pay attention to in order to understand the function of this token. + +You see, when you acquire veOCEAN by locking your OCEAN tokens in our Data Farming dApp, the intended use is to **vote on your favorite assets** in the Ocean ecosystem! + +When you allocate to assets that sell, then **you get a portion of the sales**! + +You can do this all from the Data Farming dApp [Farms page](https://df.oceandao.org/farms). + +### The Superpowers of veOCEAN + +veOCEAN allows you to engage with different Ocean Protocol mechanisms and benefit from the reward programs available. + +veOCEAN has 4 key utilities: + +1. **Holding it** veOCEAN pays **Passive OCEAN Rewards** every week. +2. **Allocating it** veOCEAN pays **Active OCEAN Rewards** every week to the top-selling assets in the Ocean ecosystem. +3. **Delegating it** You can delegate veOCEAN to other Data Farmers who can curate Datasets for you. In return for their services, these farmers may charge you a fee for helping you receive APY on **Active Rewards**. The Delegate feature was recently released and enables veOCEAN holders to more easily access Active Rewards. +4. **2x Publisher Stake** If you are a publisher in the Ocean ecosystem, then allocating your veOCEAN to your own asset gives your veOCEAN **a 2x Bonus**. This is an incentive for publishers to engage with their assets and further benefit the Ocean ecosystem. + +### The Nitty Gritty of **Passive & Active Rewards** + +#### Passive Rewards from Data Farming + +veOCEAN holders get weekly Data Farming rewards, with a small carveout for any Ocean Protocol Data Challenges that run through Data Farming operations. + +#### Active Rewards from Community Fees + +veOCEAN holders can generate yield completely passively if they wish, though they are incentivized with larger real yield if they **actively participate** in farming yield from assets. + +Active rewards follow the usual Data Farming formula: $ of sales of the asset \* allocation to that asset. + +There is no liquidity locked inside a datatoken pool, and this allocation is safe: you can’t lose your OCEAN as it is merely locked. + +### veOCEAN Time Locking + +Users can lock their OCEAN for different lengths of time to gain more veOCEAN **voting power**. The Data Farming dApp is designed to lock OCEAN for **a minimum of 2 weeks and a maximum of 4 years** (for max rewards). The longer you lock your OCEAN, the more veOCEAN + OCEAN rewards you get! + +On the dApp's [veOCEAN page](https://df.oceandao.org/veocean), the "Lock Multiplier" represents the percentage amount of veOCEAN tokens received per OCEAN token locked. + +When users commit to locking their OCEAN tokens for an extended duration, they are rewarded with an increased amount of veOCEAN tokens. This incentivizes users to act with strong support for the network and confidence in the ecosystem. + +| Year | Lock Multiplier | veOCEAN | +| ---- | --------------- | ------- | +| 1 | 0.25x | 0.25 | +| 2 | 0.50x | 0.50 | +| 3 | 0.75x | 0.75 | +| 4 | 1.00x | 1.00 | + +After choosing your lock period and locking up your OCEAN into the vault, you will be credited with veOCEAN. + +veOCEAN is non-transferable. You can’t sell it or send it to other addresses. + +### Linear Decay + +Your veOCEAN balance will slowly start declining as soon as you receive it. 
+ +veOCEAN balance decreases linearly over time until the Lock End Date. When your lock time has lapsed by 50%, you will have 50% of your original veOCEAN balance. + +When your lock time ends, your veOCEAN balance will hit 0, and your OCEAN tokens can be withdrawn. + +If you lock 1.0 OCEAN for 4 years, you get 1.0 veOCEAN at the start. + +| Years Passed | veOCEAN Left | +| ------------ | ------------ | +| 1 year | 0.75 | +| 2 years | 0.50 | +| 3 years | 0.25 | +| 4 years | 0.00 | + +At the end of your 4 years, your OCEAN is unlocked. + +Why the decay? Rewards are designed to be paid out weekly in a decreasing amount until you unlock your OCEAN tokens entirely. The veOCEAN code is a fork of Curve's battle-tested [veCRV](https://curve.readthedocs.io/dao-vecrv.html) token code. + +### Replenishing your veOCEAN + +You can choose to update your lock and replenish your veOCEAN balance at any time. + +To maximize rewards, participants need to re-up their 4-year lock every week in order to keep their veOCEAN balance as high as possible. + +### veOCEAN Earnings + +All earnings for veOCEAN holders are claimable on Ethereum mainnet. (Data assets for DFing may be published on any network where Ocean’s deployed in production: ETH Mainnet, Polygon, etc.) + +Data Farming rounds occur weekly; in line with this, there’s a new [`ve`](https://github.com/oceanprotocol/df-py/tree/main/contracts/ve) distribution “epoch” every week. This affects when you can first claim rewards. Specifically, if you lock OCEAN on day x, you’ll be able to claim rewards on the first ve epoch that begins after day x+7. Put another way, from the time you lock OCEAN, you must wait at least a week, and up to two weeks, to be able to claim rewards. (This behavior is inherited from veCRV; here’s the [code](https://github.com/oceanprotocol/df-py/tree/main/contracts/ve).) + +### DYOR! + +veOCEAN is architected to be locked (i.e. 'staked') for a certain period of time and cannot be transferred or sold during the lock period that each user chooses. + +So please **NOTE**: you will not be able to retrieve your locked OCEAN tokens until the Lock End Date you selected on the dApp! + +### Withdrawal + +After the Lock End Date, you can withdraw your principal OCEAN tokens on the [veOCEAN page](https://df.oceandao.org/veocean) on the left side panel. + +### Flow of Value + +The image below illustrates the flow of value. On the left, at time 0, the staker locks their OCEAN into the veOCEAN contract and receives veOCEAN. In the middle, the staker receives OCEAN rewards every time there’s revenue to the Ocean Protocol Community (top), and also as part of Data Farming rewards (bottom). On the right, when the lock expires (e.g. after 4 years), the staker can move their OCEAN around again. + +

*Flow of Value*
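As a back-of-the-envelope illustration of the lock multiplier and linear decay described above (a simplified model with names of our own choosing; the production contracts inherit veCRV's epoch-based accounting):

```python
from datetime import date

MAX_LOCK_YEARS = 4  # a 4-year lock -> 1.00x multiplier, per the table above

def veocean_balance(ocean_locked: float, lock_start: date,
                    lock_end: date, now: date) -> float:
    """Initial veOCEAN from the lock multiplier, decayed linearly to zero."""
    total_days = (lock_end - lock_start).days
    initial = ocean_locked * (total_days / 365) / MAX_LOCK_YEARS
    days_left = max((lock_end - now).days, 0)
    return initial * days_left / total_days

# 1.0 OCEAN locked for 4 years: ~1.0 veOCEAN at the start, ~0.5 halfway
print(veocean_balance(1.0, date(2023, 1, 1), date(2027, 1, 1), date(2025, 1, 1)))
```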
+ +The veOCEAN design is in accordance with the Web3 Sustainability Loop, which Ocean uses as its system-level design. + +The veOCEAN code was forked from the veCRV code. veCRV parameters are the starting point; to minimize risk, any tweaks will be circumspect. + +### Where the heck did we get this idea from? + +The "veTokenomics" model of veOCEAN (vote-escrowed token economics) is inspired by Curve Finance's [veCRV](https://curve.readthedocs.io/dao-fees.html) token code. We took this inspiration to enable our users to participate in on-chain governance and earn rewards within the Ocean Protocol ecosystem. + +[Here is Ocean Protocol's open-source code](https://github.com/oceanprotocol/contracts/blob/main/contracts/ve/veFeeDistributor.vy#L240-L256) for veOCEAN, and if you're a developer, then you'll notice the strong similarities to [veCRV's](https://curve.readthedocs.io/dao-fees.html) code. + +### veOCEAN's Smart Contracts Security + +[veOCEAN core contracts](https://github.com/oceanprotocol/contracts/tree/main/contracts/ve) use [veCRV contracts](https://curve.readthedocs.io/dao-vecrv.html) with zero changes, on purpose: the veCRV contracts have been battle-tested since inception and have not had security issues. Nearly 500 million USD is locked across all forks of veCRV, with the leading DeFi protocols adopting this standard. veCRV contracts [have been audited by Trail of Bits and Quantstamp](https://github.com/curvefi/curve-dao-contracts#audits-and-security). + +We have built [a new contract](https://github.com/oceanprotocol/contracts/blob/main/contracts/ve/veAllocate.sol) for users to point their veOCEAN towards given data assets (“allocate veOCEAN”). These new contracts do not control the veOCEAN core contracts at all. In the event of a breach, the only funds at risk would be the rewards distributed for a single week, and we would be able to redirect future funds to a different contract. + +We have an [ongoing bug bounty via Immunefi](https://immunefi.com/bounty/oceanprotocol/) for Ocean software, including veOCEAN and DF components. If you identify an issue, please report it there and get rewarded. diff --git a/user-guides/README.md b/user-guides/README.md new file mode 100644 index 00000000..12feaff4 --- /dev/null +++ b/user-guides/README.md @@ -0,0 +1,59 @@ +--- +description: >- + The definitive guides on how to do almost anything you want with Ocean + Protocol tech! +cover: ../.gitbook/assets/cover/user_guides_banner.png +coverY: 0 +--- + +# 📚 User Guides + +
+ +### Tokenize your cool stuff and make money 🤑 + +Buy, mint, and sell NFTs using the Ocean Market by following the guides below. + +{% content-ref url="publish-data-nfts.md" %} +[publish-data-nfts.md](publish-data-nfts.md) +{% endcontent-ref %} + +{% content-ref url="buy-data-nfts.md" %} +[buy-data-nfts.md](buy-data-nfts.md) +{% endcontent-ref %} + +{% content-ref url="compute-to-data/" %} +[compute-to-data](compute-to-data/README.md) +{% endcontent-ref %} + +{% content-ref url="asset-hosting/" %} +[asset-hosting](asset-hosting/README.md) +{% endcontent-ref %} + +{% content-ref url="using-ocean-market.md" %} +[using-ocean-market.md](using-ocean-market.md) +{% endcontent-ref %} + +### Make yield from dataset and algorithm NFTs on-chain ⛓️ + +Farm data like a pro. 😎🥕 + +{% content-ref url="get-started-df.md" %} +[get-started-df.md](get-started-df.md) +{% endcontent-ref %} + +{% content-ref url="how-to-data-farm.md" %} +[how-to-data-farm.md](how-to-data-farm.md) +{% endcontent-ref %} + +{% content-ref url="claim-ocean-rewards.md" %} +[claim-ocean-rewards.md](claim-ocean-rewards.md) +{% endcontent-ref %} + +### Antique Stuff 🏺 + +Out with the old, in with the new! + +{% content-ref url="remove-liquidity-pools.md" %} +[remove-liquidity-pools.md](remove-liquidity-pools.md) +{% endcontent-ref %} diff --git a/user-guides/asset-hosting/README.md b/user-guides/asset-hosting/README.md new file mode 100644 index 00000000..1d907446 --- /dev/null +++ b/user-guides/asset-hosting/README.md @@ -0,0 +1,31 @@ +--- +description: How to host your data and algorithm NFT assets like a champ 🏆 😎 +--- + +# Host Assets + +The most important thing to remember is that wherever you host your asset... it needs to be **reachable & downloadable**. It cannot live behind a firewall or in a private GitHub repo. You need to **use a proper hosting service!** + +**The URL to your asset is encrypted in the publishing process!** + +### Publish. Cool. Things. + +**If you want to publish cool things on the Ocean Marketplace, then you'll first need a place to host your assets!** You have SO many options for where to host your asset, including centralized and decentralized storage systems. Places to host may include: Github, IPFS, Arweave, AWS, Azure, Google Cloud, and your own personal home server (if that's you, then you probably don't need a tutorial on hosting assets). Really, anywhere with a downloadable link to your asset is fine. + +In this section, we'll walk you through several options to store your assets: Arweave (decentralized storage), plus AWS, Azure, GitHub, and Google Cloud (centralized storage). Let's goooooo! + +Read on, if you are interested in the security details! + +### Security Considerations + +{% embed url="https://media.giphy.com/media/81xwEHX23zhvy/giphy.gif" %} These guys know what's up {% endembed %} + +When you publish your asset as an NFT, then the URL/TX ID/CID required to access the asset is encrypted and stored as a part of the NFT's [DDO](../../developers/identifiers.md) on the blockchain. Buyers don't have direct access to this information, but they interact with the [Provider](https://github.com/oceanprotocol/provider#provider), which decrypts the DDO and acts as a proxy to serve the asset. + +We recommend implementing a security policy that allows **only the Provider's IP address to access the file** and blocks requests from other, unauthorized actors. 
Since not all hosting services provide this feature, **you must carefully consider the security features while choosing a hosting service.** + +{% hint style="warning" %} **Please use a proper hosting solution to keep your files.** Systems like `Google Drive` are not specifically designed for this use case. They include various virus checks and rate limiters that prevent the [`Provider`](../../developers/provider/README.md) from downloading the asset once it has been purchased. {% endhint %} diff --git a/user-guides/asset-hosting/arweave.md b/user-guides/asset-hosting/arweave.md new file mode 100644 index 00000000..cc2b3e04 --- /dev/null +++ b/user-guides/asset-hosting/arweave.md @@ -0,0 +1,43 @@ +--- +description: How to use decentralized hosting for your NFT assets +--- + +# Arweave + +### Arweave + +[Arweave](https://www.arweave.org/) is a global, permanent, and decentralized data storage layer that allows you to store documents and applications forever. Arweave is different from other decentralized storage solutions in that there is only one up-front cost to upload each file. + +**Step 1 - Get a new wallet and AR tokens** + +Download & save a new wallet (JSON key file) and receive a small amount of AR tokens for free using the [Arweave faucet](https://faucet.arweave.net/). If you already have an Arweave browser wallet, you can skip to Step 3. + +At the time of writing, the faucet provides 0.02 AR, which is more than enough to upload a file. + +If at any point you need more AR tokens, you can fund your wallet from one of Arweave's [supported exchanges](https://arwiki.wiki/#/en/Exchanges). + +**Step 2 - Load the key file into the arweave.app web wallet** + +Open [arweave.app](https://arweave.app/) in a browser. Select the '+' icon in the bottom left corner of the screen. Import the JSON key file from step 1. + +![Arweave.app import key file](../../.gitbook/assets/hosting/arweave-1.png) + +**Step 3 - Upload file** + +Select the newly imported wallet by clicking the "blockies" style icon in the top left corner of the screen. Select **Send.** Click the **Data** field and select the file you wish to upload. + +![Arweave.app upload file](../../.gitbook/assets/hosting/arweave-2.png) + +The fee in AR tokens will be calculated based on the size of the file and displayed near the bottom middle part of the screen. Select **Submit** to submit the transaction. + +After submitting the transaction, select **Transactions** and wait until the transaction appears and eventually finalizes. This can take over 5 minutes, so please be patient. + +**Step 4 - Copy the transaction ID** + +Once the transaction finalizes, select it and copy the transaction ID. + +![Arweave.app transaction ID](../../.gitbook/assets/hosting/arweave-3.png) + +**Step 5 - Publish the asset with the transaction ID** + +![Ocean Market - Publish with arweave transaction ID](../../.gitbook/assets/hosting/arweave-4.png) diff --git a/user-guides/asset-hosting/aws.md b/user-guides/asset-hosting/aws.md new file mode 100644 index 00000000..3125a4e2 --- /dev/null +++ b/user-guides/asset-hosting/aws.md @@ -0,0 +1,114 @@ +--- +description: How to use AWS centralized hosting for your NFT assets +--- + +# AWS + +### Amazon Web Services + +AWS provides various options to host data and multiple configuration possibilities. Publishers should do their own research and decide what's right for them. The steps below show one possible way to host data using an AWS S3 bucket and publish it on Ocean Marketplace. 
+ +**Prerequisite** + +Create an account on [AWS](https://aws.amazon.com/s3/). Users might also be asked to provide payment details and billing addresses, which are outside this tutorial's scope. + +**Step 1 - Create a storage account** + +**Go to AWS portal** + +Go to the AWS portal for S3 (https://aws.amazon.com/s3/) and select `Create an AWS account` in the upper right corner, as shown below. + +![Click the orange create an account button](../../.gitbook/assets/hosting/aws-1.png) + +**Fill in the details** + +![Create an account - 2](../../.gitbook/assets/hosting/aws-2.png) + +**Create a bucket** + +After logging into the new account, search the available services and select the `S3` storage type. + +![Select S3 storage](../../.gitbook/assets/hosting/aws-3.png) + +To create an S3 bucket, choose `Create bucket`. + +![Create a bucket](../../.gitbook/assets/hosting/aws-4.png) + +Fill in the form with the necessary information. Then, the bucket is up & running. + +![Check that the bucket is up and running](../../.gitbook/assets/hosting/aws-5.png) + +**Step 2 - Upload asset on S3 bucket** + +Now, the asset can be uploaded by selecting the bucket name and choosing `Upload` in the `Objects` tab. + +![Upload asset on S3 bucket](../../.gitbook/assets/hosting/aws-6.png) + +**Add files to the bucket** + +Get the files and add them to the bucket. + +The file is an example used in multiple Ocean repositories, and it can be found [here](https://raw.githubusercontent.com/oceanprotocol/c2d-examples/main/branin_and_gpr/branin.arff). + +![Upload asset on S3 bucket](../../.gitbook/assets/hosting/aws-7.png) + +The permissions and properties can be set afterward; for the moment, keep the defaults. + +After selecting `Upload`, make sure that the status is `Succeeded`. + +![Upload asset on S3 bucket](../../.gitbook/assets/hosting/aws-8.png) + +**Step 3 - Access the Object URL on S3 Bucket** + +By default, access to files in the S3 bucket is private. To publish an asset on the market, the S3 URL needs to be public. This step shows how to set up access control policies to grant permissions to others. + +**Editing permissions** + +Go to the `Permissions` tab, select `Edit`, uncheck the `Block all public access` boxes to give everyone read access to the object, and click `Save`. + +If editing the permissions is unavailable, modify the `Object Ownership` by enabling the ACLs as shown below. + +![Access the Object URL on S3 Bucket](../../.gitbook/assets/hosting/aws-9.png) + +**Modifying bucket policy** + +To grant the bucket public access, its policy needs to be modified as well. + +Note that `<bucket-name>` must be replaced with your bucket's name from your buckets dashboard. + +```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Public S3 Bucket",
      "Principal": "*",
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::<bucket-name>/*"
    }
  ]
}
``` + +After saving the changes, the bucket should appear as `Public` access. + +![Access the Object URL on S3 Bucket](../../.gitbook/assets/hosting/aws-10.png) + +**Verify the object URL on public access** + +Select the file that needs verification and click `Open`. Now download the file to your system. + +![Access the Object URL on S3 Bucket](../../.gitbook/assets/hosting/aws-11.png) + +**Step 4 - Get the S3 Bucket Link & Publish Asset on Market** + +Now that the S3 endpoint has public access, the asset will be hosted successfully. 
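Optional hardening: if you'd rather not leave the object fully public, one option is to allow reads only from your Provider's IP address, per the security considerations earlier in this section. Below is a sketch of such a bucket policy; `<bucket-name>` and the IP `203.0.113.10` are placeholders to substitute with your own values.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowProviderIPOnly",
      "Principal": "*",
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::<bucket-name>/*",
      "Condition": {
        "IpAddress": { "aws:SourceIp": "203.0.113.10/32" }
      }
    }
  ]
}
```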
+ +Go to [Ocean Market](https://market.oceanprotocol.com/publish/1) to complete the form for asset creation. + +Copy the `Object URL` found under `Object Overview` in the AWS S3 bucket and paste it into the `File` field of the form at [step 2](https://market.oceanprotocol.com/publish/2), as illustrated below. + +![Get the S3 Bucket Link & Publish Asset on Market](../../.gitbook/assets/hosting/aws-12.png) diff --git a/user-guides/asset-hosting/azure-cloud.md b/user-guides/asset-hosting/azure-cloud.md new file mode 100644 index 00000000..6f19bc15 --- /dev/null +++ b/user-guides/asset-hosting/azure-cloud.md @@ -0,0 +1,61 @@ +--- +description: How to use centralized hosting with Azure Cloud for your NFT assets +--- + +# Azure Cloud + +### Microsoft Azure + +Azure provides various options to host data and multiple configuration possibilities. Publishers should do their own research and decide what's right for them. The steps below show one possible way to host data using Azure storage and publish it on Ocean Marketplace. + +**Prerequisite** + +Create an account on [Azure](https://azure.microsoft.com/en-us/). Users might also be asked to provide payment details and billing addresses, which are outside this tutorial's scope. + +**Step 1 - Create a storage account** + +**Go to Azure portal** + +Go to the Azure portal (https://portal.azure.com/#home) and select `Storage accounts`, as shown below. + +![Select storage accounts](../../.gitbook/assets/hosting/azure1.png) + +**Create a new storage account** + +![Create a storage account](../../.gitbook/assets/hosting/azure2.png) + +**Fill in the details** + +![Add details](../../.gitbook/assets/hosting/azure3.png) + +**Storage account created** + +![Storage account created](../../.gitbook/assets/hosting/azure4.png) + +**Step 2 - Create a blob container** + +![Create a blob container](../../.gitbook/assets/hosting/azure5.png) + +**Step 3 - Upload a file** + +![Upload a file](../../.gitbook/assets/hosting/azure6.png) + +**Step 4 - Share the file** + +**Select the file to be published and click Generate SAS** (a SAS, or shared access signature, is a time-limited access link) + +![Click generate SAS](../../.gitbook/assets/hosting/azure7.png) + +**Configure the SAS details and click `Generate SAS token and URL`** + +![Generate link to file](../../.gitbook/assets/hosting/azure8.png) + +**Copy the generated link** + +![Copy the link](../../.gitbook/assets/hosting/azure9.png) + +**Step 5 - Publish the asset using the generated link** + +Now, copy and paste the link into the Publish page in the Ocean Marketplace. + +![Publish the file as an asset](../../.gitbook/assets/hosting/azure10.png) diff --git a/user-guides/asset-hosting/github.md b/user-guides/asset-hosting/github.md new file mode 100644 index 00000000..704fb507 --- /dev/null +++ b/user-guides/asset-hosting/github.md @@ -0,0 +1,66 @@ +--- +description: How to use Github for your NFT assets +--- + +# Github + +### **Github** + +GitHub can be used to host and share files. This allows you to easily share and collaborate on files, track changes using commits, and keep a history of updates. GitHub's hosting capabilities enable you to make your content accessible on the web. + +### **Prerequisites** + +Create an account on [Github](https://github.com/). Users might also be asked to provide details and billing addresses, which are outside this tutorial's scope. + +**Step 1 - Create a new repository on GitHub or navigate to an existing repository where you want to host your files.** + +

*Create new repository*
+ +Fill in the repository details. **Make sure your Repo is public.** + +

*Make the repository public*
+ +### Host Your File + +**Step 2 - Upload a file** + +Go to your repo on GitHub and, above the list of files, select the Add file dropdown menu, then click Upload files. Alternatively, you can use version control to push your file to the repo. + +

*Upload file on Github*
+ +To select the files you want to upload, drag and drop the file or folder, or click 'choose your files'. + +

*Drag and drop new files on your GitHub repo*
+ +In the "Commit message" field, type a short, meaningful commit message that describes the change you made. + +

*Commit changes*
+ +Below the commit message field, decide whether to add your commit to the current branch or to a new branch. If your current branch is the default branch, then you should choose to create a new branch for your commit and then create a pull request. + +After you make your commit (and merge your pull request, if applicable), then click on the file. + +

*Upload successful*
+ +**Step 3 - Get the RAW version of your file** + +To use your file on the Market, **you need the raw URL of the asset**. Also, make sure your repo is publicly accessible so the Market can use that file. + +Open the file and click the "Raw" button on the right side of the page. + +

*Click the Raw button*
+ +Copy the link from your browser's URL bar; it should begin with "https://raw.githubusercontent.com/....", like in the image below. + +

*Grab the RAW github URL from your browser's URL bar*

*Copy paste the raw url*
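For example (a hypothetical repo, purely to show the pattern): a file viewed at `https://github.com/your-name/your-repo/blob/main/data.csv` is served raw at `https://raw.githubusercontent.com/your-name/your-repo/main/data.csv`. Note that the domain changes and the `blob/` segment is dropped.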
+ +**Step 4 - Publish the asset using the Raw link** + +Now, copy and paste the Raw Github URL into the File field of the Access page in the Ocean Market. + +

*Upload on the Ocean Market*
+ +Et voilà! You have now successfully hosted your asset on Github and properly linked it on the Ocean Market. + diff --git a/user-guides/asset-hosting/google-storage.md b/user-guides/asset-hosting/google-storage.md new file mode 100644 index 00000000..51ba9d58 --- /dev/null +++ b/user-guides/asset-hosting/google-storage.md @@ -0,0 +1,60 @@ +--- +description: How to use Google Storage for your NFT assets +--- + +# Google Storage + +Google Cloud Storage is a scalable and reliable object storage service provided by Google Cloud. It allows you to store and retrieve large amounts of unstructured data, such as files, with high availability and durability. You can organize your data in buckets and benefit from features like access control, encryption, and lifecycle management. With various storage classes available, you can optimize cost and performance based on your data needs. Google Cloud Storage integrates seamlessly with other Google Cloud services and provides APIs for easy integration and management. + +**Prerequisite** + +Create an account on [Google Cloud](https://console.cloud.google.com/). Users might also be asked to provide payment details and billing addresses, which are outside this tutorial's scope. + +**Step 1 - Create a storage account** + +**Go to** [**Google Cloud console**](https://console.cloud.google.com/storage/browser) + +In the Google Cloud console, go to the Cloud Storage Buckets page. + +
+ +**Create a new bucket** + +
+ +**Fill in the details** + +
+ +**Allow access to your recently created Bucket** + +
+ +**Step 2 - Upload a file** + +
+ +**Step 3 - Change your file's access (optional)** + +**If your bucket's access policy is restricted, click Edit access in the menu on the right (skip this step if your bucket is publicly accessible).** + +
+ +
+ +**Step 4 - Share the file** + +**Open the file and copy the generated link** + +
+ +
+ +**Step 5 - Publish the asset using the generated link** + +Now, copy and paste the link into the Publish page in the Ocean Marketplace. + +
+ diff --git a/user-guides/buy-data-nfts.md b/user-guides/buy-data-nfts.md new file mode 100644 index 00000000..3caa02b3 --- /dev/null +++ b/user-guides/buy-data-nfts.md @@ -0,0 +1,47 @@ +--- +description: How to Buy and Download Data on the Ocean Market +--- + +# Buy NFT Data + +

*That moment when you buy your first datatoken*
+ +### Let's Go Shopping! 💁‍♀️🛍️ + +1. Go to the [Ocean Market](https://v4.market.oceanprotocol.com/). +2. Search for NFTs using the search bar in the top right corner of the page. The Ocean Marketplace provides features to search for Data/Algorithm NFTs by text, and users can also sort the results by published date. +3. Connect your wallet. If you know the network the asset is hosted on, then you can also select the network from the dropdown to the left of the Connect Wallet button to filter! + +![Connect your wallet](../.gitbook/assets/market/consume-connect-wallet.png) + +### Steps to Download + +#### Step 1 - Buy the asset 🫰 + +The Buy button is enabled only if the connected wallet address has enough OCEAN tokens to purchase the asset. + +![Click the large pink Buy button](../.gitbook/assets/market/consume-1.png) + +Are you buying an asset on the Polygon network? Then you'll need mOCEAN ("matic OCEAN") to buy assets! Watch our tutorial on how to get mOCEAN so that you can go shopping on the Ocean Market 🤑🛒 + +{% embed url="https://www.youtube.com/watch?v=W5eIipUHl-w" %} Learn how to get mOCEAN {% endembed %} + +#### Step 2 - Confirm the 1st of 3 transactions - Allow access to OCEAN tokens + +![Transaction 1: Give the smart contract permission to access OCEAN tokens](../.gitbook/assets/market/consume-2.png) + +#### Step 3 - Confirm the 2nd of 3 transactions - Buy the asset in exchange for OCEAN tokens 💸 + +![Transaction 2: Buy the datatoken giving you access to the asset](../.gitbook/assets/market/consume-3.png) + +#### Step 4 - Download the asset ⬇️ + +![Download asset](../.gitbook/assets/market/consume-4.png) + +#### Step 5 - Sign the 3rd of 3 transactions ✍️ + +After signing the message, the file download will start. + +![Sign the message using your wallet](../.gitbook/assets/market/consume-5.png) diff --git a/user-guides/claim-ocean-rewards.md b/user-guides/claim-ocean-rewards.md new file mode 100644 index 00000000..ea22617b --- /dev/null +++ b/user-guides/claim-ocean-rewards.md @@ -0,0 +1,33 @@ +--- +description: How to claim OCEAN token rewards from data farming 🧑‍🌾🥕 +--- + +# Claim Rewards Data Farming + +
+ +Ocean Protocol's Data Farming dApp dispenses rewards **every Thursday** to its participants. 💰 To claim your OCEAN token rewards for data farming, simply navigate to the Data Farming [Rewards page](https://df.oceandao.org/activerewards) and click the Claim OCEAN rewards buttons, which appear pink and clickable each Thursday. Yeehaw! + +Want to begin [data farming](https://df.oceandao.org)? Start [here](get-started-df.md). 🤠 + +### Step 1 - Navigate to the Data Farming Rewards page + +Go to [https://df.oceandao.org/](https://df.oceandao.org) and click the Rewards [link](https://df.oceandao.org/activerewards) at the top of the page. + +

*Click the Rewards link at the top of the page*
+ +### Step 2 - Click the pink 'Claim # OCEAN' buttons 🛎️ + +At the bottom of the 'Passive Rewards' and 'Active Rewards' panels are the 'Claim # OCEAN' buttons that appear pink and clickable on Thursdays. + +This is where you click to claim your rewards! Easy peasy. You will need to approve the transactions with your wallet. + +

*Click the pink Claim # Ocean buttons*
+ +### FIRST TIME CLAIMING? + +You will need to wait at least one week, but not more than two, to claim your rewards for the first time. Check back on Thursday! + +### FORGOT TO CLAIM? + +If you forget to claim your OCEAN rewards, then do not worry because they continue to accumulate! You can claim them anytime after Thursday. 😃 diff --git a/user-guides/compute-to-data/README.md b/user-guides/compute-to-data/README.md new file mode 100644 index 00000000..787dcd33 --- /dev/null +++ b/user-guides/compute-to-data/README.md @@ -0,0 +1,24 @@ +--- +description: >- + How to sell compute jobs on your datasets while keeping your data and + algorithms private +--- + +# Sell NFT Computations (Compute-to-Data) + +### Introducing, The Problem + +{% embed url="https://media0.giphy.com/media/v1.Y2lkPTc5MGI3NjExNjNmMTc3MjFjNTg2MjQwZTQyY2VkNzFiNjk1YzM5ZmJkM2NjMzA4ZiZlcD12MV9pbnRlcm5hbF9naWZzX2dpZklkJmN0PWc/17FxSFyYNOgThnonDK/giphy.gif" fullWidth="false" %} + +**Anyone could buy your data on the Ocean Market and then publicly share it all over the internet.** Pretty scary, right? But what if there was a way that buyers could access valuable insights from your data and algorithms without actually \*seeing\* the data or algorithms themselves? We have a solution for that! + +Enter, **Compute-to-Data** (also lovingly called C2D 🥰). Ocean Protocol's C2D feature enables you to monetize the OUTPUT of compute jobs on your datasets without revealing the contents of the data/algorithms themselves. Let's dive into how! + +## How to Compute-to-Data 💃 + +You will need to accomplish **3 main steps** in establishing a compute-to-data flow: [create an algorithm](make-a-boss-c2d-algorithm.md) that's compatible with C2D, [publish your C2D-specific algorithm NFT](publish-a-c2d-algorithm-nft.md), then [publish your data NFT with C2D configurations](publish-a-c2d-data-nft.md) allowing the algorithm to compute on it. That's it! Then you'll be able to sell compute jobs 🤩 Read the steps in this section's three subpages to create & sell a compute job from start to finish. 💪😃 + +Or you could watch our video tutorial about it below: + +{% embed url="https://youtu.be/2AF9mkqlf5Y" %} + diff --git a/user-guides/compute-to-data/make-a-boss-c2d-algorithm.md b/user-guides/compute-to-data/make-a-boss-c2d-algorithm.md new file mode 100644 index 00000000..e83d80ca --- /dev/null +++ b/user-guides/compute-to-data/make-a-boss-c2d-algorithm.md @@ -0,0 +1,98 @@ +--- +description: >- + How to construct the beginnings of an awesome algorithm for C2D compute jobs + on datasets +--- + +# Make a Boss C2D Algorithm + +
+ +Any algorithm for Compute-to-Data starts by loading the dataset correctly. Read on, anon 👨🏻‍💻 + +### Open the local dataset file + +This code goes at the top of your algorithm file for your algorithm NFT asset to use with Compute-to-Data. It references your data NFT asset file on the Docker container you selected. + +{% tabs %}
{% tab title="Python" %}
```python
import csv
import json
import os

def get_input(local=False):
    # The DIDS environment variable holds a JSON list of the dataset DIDs
    # mounted into the compute container.
    dids = os.getenv("DIDS", None)

    if not dids:
        print("No DIDs found in the environment. Aborting.")
        return

    dids = json.loads(dids)

    for did in dids:
        filename = f"data/inputs/{did}/0"  # 0 for metadata service
        print(f"Reading asset file {filename}.")
        return filename

# Get the input filename using the get_input function
input_filename = get_input()

if not input_filename:
    # No input filename returned
    exit()

# Open the file & run your code
with open(input_filename, 'r') as file:
    # Read the CSV file
    csv_reader = csv.DictReader(file)
```
{% endtab %}

{% tab title="Javascript" %}
```javascript
const fs = require("fs");

var input_folder = "/data/inputs";
var output_folder = "/data/outputs";

async function processfolder(Path) {
  var files = fs.readdirSync(Path);
  for (var i = 0; i < files.length; i++) {
    var file = files[i];
    var fullpath = Path + "/" + file;
    if (fs.statSync(fullpath).isDirectory()) {
      await processfolder(fullpath);
    } else {
      // Process the input file at `fullpath` here,
      // writing any results to output_folder.
    }
  }
}

// Open the file & run your code
processfolder(input_folder);
```
{% endtab %}
{% endtabs %} + +**Note:** The following Python libraries are available to use in your code: + +```python
# Python modules
numpy==1.16.3
pandas==0.24.2
python-dateutil==2.8.0
pytz==2019.1
six==1.12.0
sklearn
xlrd==1.2.0
openpyxl>=3.0.3
wheel
matplotlib
``` diff --git a/user-guides/compute-to-data/publish-a-c2d-algorithm-nft.md b/user-guides/compute-to-data/publish-a-c2d-algorithm-nft.md new file mode 100644 index 00000000..910e1ba6 --- /dev/null +++ b/user-guides/compute-to-data/publish-a-c2d-algorithm-nft.md @@ -0,0 +1,63 @@ +--- +description: How to publish a C2D algorithm NFT on the Ocean Market +--- + +# Publish a C2D Algorithm NFT + +

*You're an algorithm guru after all!*
+ +### Publish Your Algorithm NFT + +#### Step 1 - Navigate to the Ocean Market + +* Go to [https://market.oceanprotocol.com](https://market.oceanprotocol.com) + +#### Step 2 - Connect your wallet + +* Click the top right Connect Wallet button to connect your self-custody wallet to the Ocean Market + +

*Connect your self-custody wallet*
+ +#### Step 3 - Click the Publish link in the top left corner of the page + +* Click the Publish link + +

*Navigate to the Publish page*
+ +#### Step 4 - Enter the metadata + +* Enter the metadata for your algorithm NFT, paying special attention to select the Algorithm asset type: + +

*Be sure to select the Algorithm asset type*
+ +* Select the appropriate Docker image to run your code - most algorithms are written in Javascript or Python, so you can use either of these Docker images or your own custom image! + +

*Select the appropriate Docker image for your algorithm type*
+ +#### Step 5 - Enter the Access information + +* Make sure to keep this option checked! ✅ + +

*Keep this option checked for Compute-to-Data*
+ +#### Step 6 - Set a price, Fixed or Free, for your algorithm + +* It is recommended that you set a fixed price, since the price you choose will be charged any time someone runs your algorithm on a dataset, including datasets that aren't yours! + +

*Set a price for your C2D algorithm NFT*
+ +#### Step 7 - Approve the preview + +* Your preview should look like the following: + +
+ +#### Step 8 - Submit the transactions + +* Click the pink Submit button at the bottom of the page and sign the two transactions with your wallet. You'll have to pay for gas when you sign to publish your algorithm asset. + +

*Sign and pay gas for 2 transactions on the final publishing step*
+ +#### Congratulations on publishing your algorithm! + +* On to the next step -> [Publish a C2D Data NFT](publish-a-c2d-data-nft.md) diff --git a/user-guides/compute-to-data/publish-a-c2d-data-nft.md b/user-guides/compute-to-data/publish-a-c2d-data-nft.md new file mode 100644 index 00000000..fc0c0843 --- /dev/null +++ b/user-guides/compute-to-data/publish-a-c2d-data-nft.md @@ -0,0 +1,91 @@ +--- +description: How to publish a data NFT with C2D configurations +--- + +# Publish a C2D Data NFT + +
+ +#### Step 1 - Navigate to the Ocean Market + +* Go to [https://market.oceanprotocol.com](https://market.oceanprotocol.com) + +#### Step 2 - Connect your wallet + +* Click the top right Connect Wallet button to connect your self-custody wallet to the Ocean Market + +

*Connect your self-custody wallet*
+ +#### Step 3 - Click the Publish link in the top left corner of the page + +* Click the Publish link + +

*Navigate to the Publish page*
+ +#### Step 4 - Enter the metadata + +* Enter the metadata for your data NFT, keeping the default Dataset asset type selected: + +

*Keep the default Dataset Asset Type selected*
+ +#### Step 5 - Enter the Access information + +* You must select the Compute access type in this step! + +

*Make sure that you select Compute access type*
+ +#### Step 6 - Set a price, Fixed or Free, for your dataset + +* Toggle the tab to decide whether you want to set a fixed or free price for your dataset. + +

*Set a price for your C2D data NFT*
+ +#### Step 7 - Approve the preview + +* Your preview should look like the following (this will change soon!): + +
+ +#### Step 8 - Submit the transactions + +* Click the pink Submit button at the bottom of the page and sign the two transactions with your wallet. You'll have to pay for gas when you sign to publish your data asset. + +

*Sign and pay gas for 2 transactions on the final publishing step*
+ +**Ok, you've published a Data NFT that is \*almost\* ready for Compute-to-Data. Just a few tiny steps left and you're done!** + +#### Step 9 - Edit the asset (yes, again!) + +* On the webpage for your Data NFT, you need to click the Edit Asset link to change the C2D settings. + +

*Click the Edit Asset link*
+ +#### Step 10 - Edit the Compute settings + +* You must select the Edit Compute Settings button to add your algorithm to the data NFT for computation. + +

*Select the Edit Compute Settings tab button*
+ +* Then, search for your algorithm in the Selected algorithms search bar and check the box next to it to add it as an algorithm able to compute on the dataset. + +

Search for and select your algorithm to compute on your dataset
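For the curious: checking that box updates the compute options stored in your dataset's DDO. A rough sketch of the resulting structure is below, written as a Python dict - the field names follow Ocean's DDO compute options, while the DID and checksum values are placeholders:

```python
# Sketch of the DDO compute options after approving one algorithm.
# Field names assumed from Ocean's DDO spec; values are placeholders.
compute_options = {
    "allowRawAlgorithm": False,  # only published, approved algorithms may run
    "allowNetworkAccess": True,
    "publisherTrustedAlgorithmPublishers": [],
    "publisherTrustedAlgorithms": [
        {
            "did": "did:op:<your-algorithm-did>",               # placeholder
            "filesChecksum": "<files-checksum>",                # placeholder
            "containerSectionChecksum": "<container-checksum>", # placeholder
        }
    ],
}
```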

+ +#### Step 11 - Submit the transactions + +* Click the pink Submit button at the bottom of the page and sign all the transactions. + +

Click the Submit button to finalize the transaction

+ +#### Congratulations! You have fully finished the C2D flow. Check your work by verifying that your algorithm appears on the data NFT's page, like in the following example: + +

Your algorithm should appear now on the data NFT's page!

+ +If you would like to run the compute job, then simply click the radio button to the left of the algorithm's name and click Buy Compute Job. + +

Buy the compute job to check your work!

+ +The output of your algorithm's computation on the dataset will appear in the Algorithm.log output. Voilà! + +#### Now do a little dance, because you're done! + +{% embed url="https://media4.giphy.com/media/CxhWJeIicfOEynsEn6/giphy.gif?cid=ecf05e478eb1zzixmsjwbwx37a0d4e1096812j513crzr18j&ct=g&ep=v1_gifs_search&rid=giphy.gif" %} diff --git a/user-guides/get-started-df.md b/user-guides/get-started-df.md new file mode 100644 index 00000000..7ec020f4 --- /dev/null +++ b/user-guides/get-started-df.md @@ -0,0 +1,56 @@ +--- +description: Get veOCEAN tokens to use the Data Farming dApp and make yield! 🧑‍🌾🥕 +--- + +# Get Started Data Farming + +### What is Data Farming? + +[Data Farming](https://df.oceandao.org) is our dApp that generates yield for participants who curate and publish valuable assets in the Ocean ecosystem. + +

Get veOCEAN tokens

+ +### What is veOCEAN and why do you want it? 🌊 + +In order to **gain yield in Data Farming**, you will need to lock your OCEAN tokens first! When you lock your OCEAN tokens, you get **veOCEAN tokens** and **weekly passive OCEAN rewards** in return. veOCEAN is a token used in Ocean Protocol's Data Farming dApp to **generate even more yield**, called **weekly active OCEAN rewards**, by allocating your veOCEAN tokens to your favorite assets to earn a portion of their sales! + +### Don't have OCEAN tokens yet? Get those first! + +#### **Step 1 - Get OCEAN tokens** + +* Acquire $OCEAN via a decentralized exchange (DEX) such as Uniswap or a centralized exchange (CEX) such as Binance, Coinbase, etc. + +#### **Step 2 - Send to your self-custody wallet** + +* Send your OCEAN tokens to a self-custody wallet that supports ERC-20 tokens, like MetaMask. + +### Where the rubber meets the road 🚗💨 + +Not much of a reader? Watch and learn, friend + +{% embed url="https://www.youtube.com/watch?v=zAQlPHkK3og" fullWidth="false" %} +Watch and learn, friend +{% endembed %} + +#### **Step 3 - Go to Ocean Protocol's Data Farming dApp** + +* Go to [https://df.oceandao.org/](https://df.oceandao.org/) + +#### Step 4 - Connect Your Wallet + Lock your OCEAN for veOCEAN + +* Click on the purple circles in our interactive demo to walk through the steps for locking your OCEAN tokens for veOCEAN tokens. + +{% embed url="https://app.arcade.software/share/FUSkygksSRsJHwle1zFs" fullWidth="false" %} +{% endembed %} + +In this step you will: + +* Enter the amount of OCEAN tokens that you are going to lock up +* Select a Lock End Date indicating how many weeks you’re going to lock up your OCEAN tokens. (As the Lock End Date goes farther into the future, your Lock Multiplier increases.) +* Click on the checkbox to agree to the disclaimer. +* Click the pink “Approve # OCEAN” button. +* Accept the transaction in your wallet. +* Click the “Create Lock” button. +* Accept the transaction in your wallet. + +Congratulations! You have now locked your OCEAN tokens for veOCEAN tokens and are generating passive yield automatically. You can [claim your passive OCEAN rewards](claim-ocean-rewards.md) every Thursday - note that your first rewards claim will require a wait of at least one week, but not more than two weeks! diff --git a/user-guides/how-to-data-farm.md b/user-guides/how-to-data-farm.md new file mode 100644 index 00000000..5d98e035 --- /dev/null +++ b/user-guides/how-to-data-farm.md @@ -0,0 +1,61 @@ +--- +description: Make extra dosh with active rewards yield in Data Farming +--- + +# Harvest More Yield Data Farming + +
+ +### Get More Yield from Active Rewards + +The bread and butter of the Data Farming dApp is OCEAN rewards for curating valuable assets in the Ocean ecosystem. Users curate assets by **allocating veOCEAN** to them using the Data Farming dApp. We'll show you how! + +#### Step 1 - Navigate to the Data Farming dApp + +* Go to [https://df.oceandao.org](https://df.oceandao.org) + +#### Step 2 - Connect your wallet + +* Connect your wallet to the Data Farming dApp using the Ethereum network (mainnet) + +#### Step 3 - Click on the Farms tab in the top menu + +

Click the Farms page link in the menu

+ +#### Step 4 - Select the assets to which you would like to allocate by toggling the percentage allocation at the end of the row + +* In the rightmost column, toggle the percentage of your total veOCEAN that you wish to allocate to each asset of your choice. You will **get a portion of the sales** of each asset that you allocate to! +* Note that if you allocate to an asset that YOU published, then you will get an **allocation boost** and your allocation will be counted as **2x**. The rows for assets that you publish will appear cream-colored. + +

Toggle the percentage of your veOCEAN that you would like to allocate to each asset

+ +#### Step 5 - Click the Update Allocations button + +* Click the pink Update Allocations button +* Sign the transactions with your wallet & pay the gas fees + +

Click the Update Allocations button

+ +That's it! You've successfully allocated (aka "voted on") your favorite assets in the Ocean ecosystem using your veOCEAN tokens and are generating active rewards yield. Now, just wait until next Thursday to see if you can [claim any OCEAN rewards](claim-ocean-rewards.md) on the Active Rewards section of the [Rewards page](https://df.oceandao.org/rewards) for your portion of the assets' sales. Remember that your first rewards claim will require a wait of at least one week, but not more than two weeks! + +### How to Delegate Your Active Rewards + +Do you have multiple wallets? Say you want to send rewards to someone you 💖 We have a solution for that! Data Farming's [Delegate](https://df.oceandao.org/delegate) feature allows you to transfer your veOCEAN allocation power to another wallet address. Another reason for delegating - you might be able to earn a higher annual percentage yield (APY) by delegating your active rewards to an address that more efficiently manages your allocation. For all these reasons and more, you might want to delegate - go to [https://df.oceandao.org/delegate](https://df.oceandao.org/delegate) to make the magic happen ✨ + +When you delegate, you transfer 100% of your veOCEAN allocation power for a limited period. You can delegate your active rewards \*without\* the need for reallocation and transaction fees! Note that after you delegate, you cannot manage your allocations until the delegation expires. The delegation expiration date is the same as your veOCEAN Lock End Date at the time of delegation. If necessary, you can extend your Lock End Date before delegating. You can also cancel your delegation at any time 💪 Once delegated, rewards will be sent to the wallet address you delegated to. The delegation receiver is then in charge of your active rewards and is responsible for returning those rewards to you. + +Follow these steps to delegate your veOCEAN: + +1. Go to the [Data Farming dApp](https://df.oceandao.org). +2. Navigate to the [Delegate page](https://df.oceandao.org/delegate). +3. Enter the wallet address you wish to delegate your active rewards to into the 'Receiver wallet address' field. +4. Click the Delegate button and sign the transaction with your wallet. You can view information about your delegation in the My Delegations component. +5. If you desire, you can cancel the delegation to regain your allocation power before the delegation expires. + +#### What if someone delegates active rewards to you? + +If you receive veOCEAN allocation power from other wallets, then you will receive their active rewards. You cannot delegate the veOCEAN you received from delegates, only the veOCEAN you received from your lock. + +
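Under the hood, Update Allocations is a call to Ocean's veAllocate contract. As a hedged sketch only - assuming a `setAllocation(amount, nft, chainId)` entry point where `amount` is in basis points of your total veOCEAN (10000 = 100%), and with placeholder addresses throughout - allocating 25% to one data NFT might be encoded like this with web3.py:

```python
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example.com"))  # placeholder RPC URL

# Placeholders - substitute the real veAllocate contract and data NFT addresses.
VE_ALLOCATE = Web3.to_checksum_address("0x" + "00" * 20)
DATA_NFT = Web3.to_checksum_address("0x" + "11" * 20)

# Minimal assumed ABI covering only the call the Update Allocations button makes.
ABI = [{
    "name": "setAllocation", "type": "function", "stateMutability": "nonpayable",
    "inputs": [
        {"name": "amount", "type": "uint256"},   # basis points: 2500 = 25%
        {"name": "nft", "type": "address"},      # the data NFT to allocate to
        {"name": "chainId", "type": "uint256"},  # chain where that data NFT lives
    ],
    "outputs": [],
}]

ve_allocate = w3.eth.contract(address=VE_ALLOCATE, abi=ABI)
# Encode the call offline; the dApp builds, signs, and sends it for you.
calldata = ve_allocate.encodeABI(fn_name="setAllocation", args=[2500, DATA_NFT, 1])
print(calldata)
```

This is illustration only - use the dApp for real allocations, since it knows the correct contract addresses per network.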
diff --git a/user-guides/join-a-data-challenge.md b/user-guides/join-a-data-challenge.md new file mode 100644 index 00000000..5d62f807 --- /dev/null +++ b/user-guides/join-a-data-challenge.md @@ -0,0 +1,27 @@ +--- +description: >- + Roll with the brightest data scientists and machine learning experts for + prizes +--- + +# Join a Data Challenge + +

Bring on the data challenges.

+ +Hone your skills, work on real business problems, and earn sweet dosh along the way. + +### What is an Ocean Protocol data challenge? + +Ocean Protocol's data challenges are open competitions where participants must solve a real business problem using data science or machine learning skills. Some challenges are designed for data exploration, analysis, and reporting, while others require developing machine learning models. Data challenges thus vary in format, topic, and sponsor. One of the main advantages of these data challenges is that users retain ownership of their IP and the ability to further monetize their work outside of the competition. + +### Where can I find the data challenges? + +[Discover our open challenges here.](https://oceanprotocol.com/challenges) + +### What is the typical flow for a data challenge? + +1. Participants download the necessary dataset(s) from the Ocean Market according to the data challenge instructions. +2. Participants may be tasked with building a report that combines data visualization and written explanations for a dataset, or perhaps tasked with building a machine learning model to predict a specific target value. +3. Participants publish their results either publicly or privately using Ocean Protocol smart contracts by the deadline as directed by the challenge instructions. +4. Winners are generally selected and announced within 2 weeks. +5. Winners will be sent instructions to claim their crypto prizes. diff --git a/user-guides/publish-data-nfts.md b/user-guides/publish-data-nfts.md new file mode 100644 index 00000000..d7155e85 --- /dev/null +++ b/user-guides/publish-data-nfts.md @@ -0,0 +1,145 @@ +--- +description: How to Mint and Publish Data NFTs Using the Ocean Market +--- + +# Publish Data NFTs + +
+ +### What to Publish? 🤷‍♀️ + +Ocean Protocol has a convenient marketplace, called the Ocean Market, for publishers and consumers of data. What data, you ask? This data spans anything from `.csv` and `.xlsx` files to images, audio, videos, algorithms in any language, or combinations of all these things! There is no exhaustive list of what type of data can be published on the Ocean Market. + +### What does it mean to publish an NFT using Ocean Protocol? + +The publishing process on the Ocean Market both mints (i.e. creates) a [data NFT](../developers/contracts/data-nfts.md) and a corresponding [datatoken](../developers/contracts/datatokens.md) for your IP. The data NFT stores your IP, and the datatoken controls access to it. If you publish your music IP on the Ocean Market, for example, then a data NFT containing your music and its datatoken are minted during the publishing process. When consumers purchase the datatoken, they gain access to download/use your data NFT's music. + +### How to Publish an NFT on the Ocean Market 🧑‍🏫 + +### No code flow + +{% embed url="https://www.youtube.com/watch?v=3NGSmfXkHAQ" %} +Don't enjoy reading? Watch our video tutorial! +{% endembed %} + +#### Getting Started 🏃💨 + +1. Go to the [Ocean Market](https://v4.market.oceanprotocol.com). +2. Connect your wallet. +3. Select the network where you would like to publish your NFT (ex. Ethereum, Polygon, etc). + +

Connect your wallet

+ +In this tutorial, we will be using the Polygon Mumbai test network. + +4\. Click on the Publish link in the top left corner of the page. + +

Click the publish link

+ +#### Step 1 - Metadata 🤓 + +Fill in the [metadata](../developers/metadata.md). + +_Mandatory fields are marked with \*_ + +* **Asset type**\* + + An asset can be a _dataset_ or an _algorithm_. The asset type cannot be changed after publication. +* **Title**\* + + The descriptive name of the asset. This field is editable after the asset publication. +* **Description**\* + + Description of the asset. Ocean Marketplace **supports Markdown** and plain text formats for the description field. Feel free to enrich your NFT's description with GIFs, videos, images, etc! This field is editable after the asset publication. +* **Author**\* + + The author of the asset. The author can be an individual or an organization. This field is editable after the asset publication. +* **Tags** + + Tags help the asset to be searchable. If not provided, the list of tags is empty by default. + +

Enter the metadata

+ +Click Continue. + +#### Step 2 - Access details 🔑 + +_Mandatory fields are marked with \*_ + +* **Access Type**\* + + An asset can be either a downloadable file or a compute service on a dataset on which buyers can run their algorithms. Through **download**, buyers will be able to download the data in the NFT. Through **compute**, buyers will be able to purchase a compute job on a dataset and receive its output (see Compute-to-Data). +* **Provider URL**\* + + The Provider server encrypts the URL to the data and allows for the asset to be downloaded by buyers or for compute jobs. +* **File**\* + + The direct URL to the data. **Provider** encrypts this field before publishing the asset on-chain to hide the source of the data. The direct URL to the file needs to be **publicly accessible** so that the file can be downloaded by buyers (data hosted behind firewalls will not work!). A quick way to check this is sketched after the image below. If the file is hosted on services like Google Drive, then the URL needs to point directly to the data asset file. Also, the file needs to have the proper permissions to be downloaded by anybody. +* **Sample file** + + An optional field where publishers can provide a URL to a sample file of the data. Including a sample file helps to persuade buyers that the data is in a suitable format for their needs. The buyers can access it before buying the dataset. This field is editable after the asset publication. + + **Provider** encrypts this URL before publishing the asset on-chain. +* **Timeout**\* + + This field specifies how long the buyer can access the dataset after the dataset is purchased. This field is editable after the asset publication. + +

Enter the access information
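A common publish failure is a File URL that is not truly public. Before clicking Continue, you can sanity-check the URL the way the Provider will fetch it - a plain, unauthenticated request. A small Python sketch (the URL is a hypothetical placeholder):

```python
import requests

url = "https://example.com/my-dataset.csv"  # placeholder - your direct file URL

# The Provider downloads with a plain request: no cookies, no login session.
# If this fails from a fresh machine, it will fail for your buyers too.
with requests.get(url, stream=True, timeout=30, allow_redirects=True) as resp:
    print(resp.status_code)                      # expect 200
    print(resp.headers.get("Content-Type", ""))  # expect your file type, not an HTML login page
```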

+ +#### Step 3 - Pricing 🫰 + +The publisher needs to choose a pricing option - fixed or free - for the asset before publishing the data asset. The pricing schema is not editable after the asset publication. + +There are 2 pricing option tabs for asset publication on Ocean Marketplace. + +1. [Fixed pricing](../developers/contracts/pricing-schemas.md#fixed-pricing) +2. [Free pricing](../developers/contracts/pricing-schemas.md#free-pricing) + +With the _fixed pricing_ schema, the publisher sets the price that buyers will pay to download the data asset. + +With the _free pricing_ schema, the publisher provides an asset that is free to be downloaded by anyone. + +For a deep dive into the fee structure, please refer to this [document](../developers/contracts/fees.md). + +

Set the price

+ +#### Step 4 - Preview 🔍 + +

Preview your work

+ +If you are happy with the Preview of your NFT, then click Continue! + +#### Step 5 - Submit Your Blockchain Transactions 💃🕺 + +To publish your NFT on-chain, you must complete a few steps, including signing two transactions with your wallet. Note: this will cost some gas fees! + +![Transaction Signature 1 - Deploy data NFT and datatoken](../.gitbook/assets/market/publish-5.png) + +![Transaction Signature 2 - Deploy data NFT and datatoken](../.gitbook/assets/market/publish-6.png) + +#### Confirmation 🥳 + +Now, your data NFT is successfully published and available in the Ocean Market! + +![Successful publish](../.gitbook/assets/market/publish-7.png) + +On the [profile page](https://v4.market.oceanprotocol.com/profile), a publisher has access to all their published assets. + +### Code flow + +* **Python:** Are you looking to publish a data NFT using Python? Follow our ocean.py [Publish Flow](../developers/ocean.py/publish-flow.md) to mint a data NFT and datatoken using Python. +* **Javascript**: Are you looking to publish a data NFT using Javascript? Follow our ocean.js [Publish Flow](../developers/ocean.js/publish.md) to mint a data NFT and datatoken using Javascript. + +#### More Info 🧐 + +Your data or algorithm NFT is \*published\* on-chain once you complete the flow. However, you are not selling the actual NFT on-chain - **you are selling datatokens** that give buyers **access** to the NFT's data. More on this distinction in the Ocean Basics video tutorial. + +{% embed url="https://www.youtube.com/watch?v=I06AUNt7ee8" %} +Learn more about the publishing flow! +{% endembed %} + +**Note:** Ocean Protocol maintains a purgatory list [here](https://github.com/oceanprotocol/list-purgatory) to block addresses and remove assets for any violations. + +### Related Articles 📖 + +[https://blog.oceanprotocol.com/on-selling-data-in-ocean-market-9afcfa1e6e43](https://blog.oceanprotocol.com/on-selling-data-in-ocean-market-9afcfa1e6e43) diff --git a/using-ocean-market/remove-liquidity-using-etherscan.md b/user-guides/remove-liquidity-pools.md similarity index 53% rename from using-ocean-market/remove-liquidity-using-etherscan.md rename to user-guides/remove-liquidity-pools.md index 41814117..83c35402 100644 --- a/using-ocean-market/remove-liquidity-using-etherscan.md +++ b/user-guides/remove-liquidity-pools.md @@ -1,6 +1,6 @@ # Liquidity Pools \[deprecated] -In previous versions of Ocean liquidity pools and dynamic pricing were supported. These features have been deprecated and we now advise everyone to remove their liquidity from the remaining pools. It is no longer possible to do this via Ocean Market, so please follow this guide to remove your liquidity via etherscan. +Liquidity pools and dynamic pricing used to be supported in previous versions of the Ocean Market. However, these features have been deprecated and we now advise everyone to remove their liquidity from the remaining pools. It is no longer possible to do this via Ocean Market, so please follow this guide to remove your liquidity via Etherscan. ## Remove liquidity using Etherscan 2. Click _View All_ and look for Ocean Pool Token (OPT) transfers. Those transactions always come from the pool contract, which you can click on. 3. On the pool contract page, go to _Contract_ -> _Read Contract_. -![](<../.gitbook/assets/remove-liquidity-1 (1).png>) +

Read Contract

4\. Go to field `20. balanceOf` and insert your ETH address. This will retrieve your pool share token balance in wei. -![](../.gitbook/assets/remove-liquidity-2.png) +

Balance Of

5\. Copy this number; you will use it later as the `poolAmountIn` parameter. 6\. Go to field `55. totalSupply` to get the total amount of pool shares, in wei. -![](../.gitbook/assets/remove-liquidity-3.png) +

Total Supply

7\. Divide the total supply by 2 to get the maximum number of pool shares you can send in one pool exit transaction. If the balance you retrieved in step 4 is bigger than this, you will have to send multiple transactions. 8\. Go to _Contract_ -> _Write Contract_ and connect your wallet. Be sure to have your wallet connected to the network of the pool. -![](../.gitbook/assets/remove-liquidity-4.png) +

Write Contract

9\. Go to the field `5. exitswapPoolAmountIn` * For `poolAmountIn` add your pool shares in wei * For `minAmountOut` use anything, like `1` * Hit _Write_ -![](../.gitbook/assets/remove-liquidity-5.png) +

Remove Liquidity

10\. Confirm transaction in Metamask -![](../.gitbook/assets/remove-liquidity-6.png) +

Confirm transaction
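If you prefer scripting to clicking through Etherscan, the same three fields can be driven with web3.py. This is a sketch under the assumption that the pool exposes exactly the functions used above - `balanceOf`, `totalSupply`, and `exitswapPoolAmountIn(poolAmountIn, minAmountOut)` - with placeholder addresses and RPC URL:

```python
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example.com"))  # RPC for the pool's network

POOL = Web3.to_checksum_address("0x" + "00" * 20)  # placeholder: the pool contract
ME = Web3.to_checksum_address("0x" + "11" * 20)    # placeholder: your address

# Minimal assumed ABI covering only the three fields this guide uses.
ABI = [
    {"name": "balanceOf", "type": "function", "stateMutability": "view",
     "inputs": [{"name": "owner", "type": "address"}],
     "outputs": [{"name": "", "type": "uint256"}]},
    {"name": "totalSupply", "type": "function", "stateMutability": "view",
     "inputs": [], "outputs": [{"name": "", "type": "uint256"}]},
    {"name": "exitswapPoolAmountIn", "type": "function", "stateMutability": "nonpayable",
     "inputs": [{"name": "poolAmountIn", "type": "uint256"},
                {"name": "minAmountOut", "type": "uint256"}],
     "outputs": [{"name": "", "type": "uint256"}]},
]

pool = w3.eth.contract(address=POOL, abi=ABI)
shares = pool.functions.balanceOf(ME).call()         # step 4: your pool shares, in wei
max_exit = pool.functions.totalSupply().call() // 2  # step 7: at most half the supply per tx

# Steps 7-9: exit in chunks if your balance exceeds the per-transaction maximum.
while shares > 0:
    amount = min(shares, max_exit)
    tx = pool.functions.exitswapPoolAmountIn(amount, 1).build_transaction(
        {"from": ME, "nonce": w3.eth.get_transaction_count(ME)}
    )
    # ...sign with your private key and broadcast via w3.eth.send_raw_transaction().
    shares -= amount
```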

diff --git a/user-guides/sponsor-a-data-challenge.md b/user-guides/sponsor-a-data-challenge.md new file mode 100644 index 00000000..677b2aa8 --- /dev/null +++ b/user-guides/sponsor-a-data-challenge.md @@ -0,0 +1,16 @@ +--- +description: Sponsor a data challenge to crowdsource solutions for your business problems +--- + +# Sponsor a Data Challenge + +

Make the game, set the rules.

+ +Hosting a data challenge is a fun way to engage data scientists and machine learning experts around the world to **solve your real business problems**. Incentivize participants to **build products using your data**, **explain insights in your data**, or **provide useful data predictions** for your business. Plus, it's a whole lot cheaper than hiring an in-house data science team! + +### How to sponsor an Ocean Protocol data challenge? + +1. Establish the business problem you want to solve. The first step in building a data solution is understanding what you want to solve. For example, you may want to predict the drought risk in an area to help price parametric insurance, or predict the price of ETH to optimize Uniswap LPing. +2. Curate the dataset(s) that participants will use for the challenge. The key to hosting a good data challenge is to provide an exciting and thorough dataset that participants can use to build their solutions. Do your research to understand what data is available, whether it is freely available from an API, available for download, or requires any transformations, etc. For the first challenge, it is alright if the created dataset is a static file. However, it is best to ensure there is a path to making the data available from a dynamic endpoint so that entries can eventually be applied to current, real-world use cases. +3. Decide how the judging process will occur. This includes how long to make the review period, how to score submissions, and how any prizes will be divided among participants. +4. Work with Ocean Protocol to gather participants for your data challenge. Creating blog posts and hosting Twitter Spaces is a good way to spread the word about your data challenge.

Retail therapy mood!

+ +### What is the [Ocean Market](https://market.oceanprotocol.com/)? 🛒 + +The Ocean Market is a place for buyers + sellers of top-notch data and algorithms to exchange goods. Our market also gives you easy-to-use publishing and purchasing tools to monetize your intellectual property. 🤑 What's not to love? 💕 + +#### **You can:** + +* Buy access to unique data, algorithms, and compute jobs. 🛍️ +* Tokenize & monetize your intellectual property through blockchain technology. 💪 + +#### **Learn to:** + +* [Publish an NFT](publish-data-nfts.md) +* [Download NFT Assets](buy-data-nfts.md) +* [Host Your Assets](asset-hosting/README.md) + +#### Getting Basic 💁‍♀️ + +**If you are new to web3** and blockchain technologies then we suggest you first get familiar with some Web3 basics: + +* [Wallet Basics](../discover/wallets/README.md) 👛 +* [Set Up MetaMask](../discover/wallets/metamask-setup.md) 🦊 +* [Manage Your OCEAN Tokens](../discover/wallets-and-ocean-tokens.md) 🪙 diff --git a/using-ocean-market/README.md b/using-ocean-market/README.md deleted file mode 100644 index b0dbc8b0..00000000 --- a/using-ocean-market/README.md +++ /dev/null @@ -1,24 +0,0 @@ -# Using Ocean Market - -### About Ocean Market - -* [Ocean Market](https://market.oceanprotocol.com/) enables publishers to monetize their data and/or algorithms through blockchain technology. -* Consumers can purchase access to data, algorithms, compute services. - -![Ocean Market Landing Page](images/marketplace/marketplace-landing-page.png) - -The following guides will help you get started with buying and selling data: - -* [Publish a data asset](marketplace-publish-data-asset.md) -* [Download a data asset](marketplace-download-data-asset.md) -* [Publishing with hosting services](asset-hosting.md) - -If you are new to web3 and blockchain technologies then we suggest you first read these introductory guides: - -* [Wallet Basics](../building-with-ocean/wallets.md) -* [Set Up MetaMask Wallet](../orientation/metamask-setup.md) -* [Manage Your OCEAN Tokens](../building-with-ocean/wallets-and-ocean-tokens.md) - -### Removing Liquidity - -The AMM pools, and dynamic pricing schema are no longer available on the Ocean Market. Refer [this page](remove-liquidity-using-etherscan.md) on removing liquidity from the pool using Etherscan. diff --git a/using-ocean-market/asset-hosting.md b/using-ocean-market/asset-hosting.md deleted file mode 100644 index 2ccb3037..00000000 --- a/using-ocean-market/asset-hosting.md +++ /dev/null @@ -1,223 +0,0 @@ ---- -title: Publish assets using hosting services -description: Tutorial to publish assets using hosting services like Arweave, AWS, and Azure. ---- - -## Overview - -To publish on the Ocean Marketplace, publishers must first host their assets. It is up to the asset publisher to decide where to host the asset. For example, a publisher can store the content on decentralized storage like Arweave or choose a centralized solution like their AWS server, private cloud server, or other third-party hosting services. Through publishing, the information required to access the asset is encrypted and stored as a part of DDO on the blockchain. Buyers don't have access directly to this information, but they interact with the [Provider](https://github.com/oceanprotocol/provider#provider), which decrypts it and acts as a proxy to serve the asset. The DDO only stores the location of the file, which is accessed on-demand by the Provider. 
Implementing a security policy that allows only the Provider to access the file and blocks requests from other unauthorized actors is recommended. One of the possible ways to achieve this is to allow only the Provider's IP address to access the data. But, not all hosting services provide this feature. So, the publishers must consider the security features while choosing a hosting service. - -On Ocean Marketplace, a publisher must provide the asset information during the publish step in the field shown in the below image. The information is a `link` for a classic URL, a `transaction ID` for a file stored on Arweave or a `CID` for an IPFS file. - -![Publish - File URL field](../.gitbook/assets/marketplace-publish-file-field.png) - -Publishers can choose any hosting service of their choice. The below section explains how to use commonly used hosting services with Ocean Marketplace. - -⚠️ Note -**Please use a proper hosting solution to keep your files.** -Systems like `Google Drive` are not specifically designed for this use case. They include various virus checks and rate limiters that prevent the `Provider` to download the asset once it was purchased. - -## Decentralized hosting - -### Arweave - -[Arweave](https://www.arweave.org/) is a global, permanent, and decentralized data storage layer that allows you to store documents and applications forever. Arweave is different from other decentralized storage solutions in that there is only one up-front cost to upload each file. - -**Step 1 - Get a new wallet and AR tokens** - -Download & save a new wallet (JSON key file) and receive a small amount of AR tokens for free using the [Arweave faucet](https://faucet.arweave.net/). If you already have an Arweave browser wallet, you can skip to Step 3. - -At the time of writing, the faucet provides 0.02 AR which is more than enough to upload a file. - -If at any point you need more AR tokens, you can fund your wallet from one of Arweave's [supported exchanges](https://arwiki.wiki/#/en/Exchanges). - -**Step 2 - Load the key file into the arweave.app web wallet** - -Open [arweave.app](https://arweave.app/) in a browser. Select the '+' icon in the bottom left corner of the screen. Import the JSON key file from step 1. - -![Arweave.app import key file](../.gitbook/assets/arweave-1.png) - -**Step 3 - Upload file** - -Select the newly imported wallet by clicking the "blockies" style icon in the top left corner of the screen. Select **Send.** Click the **Data** field and select the file you wish to upload. - -![Arweave.app upload file](../.gitbook/assets/arweave-2.png) - -The fee in AR tokens will be calculated based on the size of the file and displayed near the bottom middle part of the screen. Select **Submit** to submit the transaction. - -After submitting the transaction, select **Transactions** and wait until the transaction appears and eventually finalizes. This can take over 5 minutes so please be patient. - -**Step 4 - Copy the transaction ID** - -Once the transaction finalizes, select it, and copy the transaction ID. - -![Arweave.app transaction ID](../.gitbook/assets/arweave-3.png) - -**Step 5 - Publish the asset with the transaction ID** - -![Ocean Market - Publish with arweave transaction ID](../.gitbook/assets/arweave-4.png) - -## Centralized hosting - -### AWS - -AWS provides various options to host data and multiple configuration possibilities. Publishers are required to do their research and decide what would be the right choice. 
The below steps provide one of the possible ways to host data using an AWS S3 bucket and publish it on Ocean Marketplace. - -**Prerequisite** - -Create an account on [AWS](https://aws.amazon.com/s3/). Users might also be asked to provide payment details and billing addresses that are out of this tutorial's scope. - -**Step 1 - Create a storage account** - -**Go to AWS portal** - -Go to the AWS portal for S3: https://aws.amazon.com/s3/ and select from the upper right corner `Create an AWS account` as shown below. - -![Create an account - 1](images/hosting-services/aws-1.png) - -**Fill in the details** - -![Create an account - 2](images/hosting-services/aws-2.png)) - -**Create a bucket** - -After logging into the new account, search for the available services and select `S3` type of storage. - -![Create an account - 3](images/hosting-services/aws-3.png) - -To create an S3 bucket, choose `Create bucket`. - -![Create an account - 4](images/hosting-services/aws-4.png) - -Fill in the form with the necessary information. Then, the bucket is up & running. - -![Create an account - 5](images/hosting-services/aws-5.png) - -**Step 2 - Upload asset on S3 bucket** - -Now, the asset can be uploaded by selecting the bucket name and choosing `Upload` in the `Objects` tab. - -![Upload asset on S3 bucket - 1](images/hosting-services/aws-6.png) - -**Add files to the bucket** - -Get the files and add them to the bucket. - -The file is an example used in multiple Ocean repositories, and it can be found [here](https://raw.githubusercontent.com/oceanprotocol/c2d-examples/main/branin_and_gpr/branin.arff). - -![Upload asset on S3 bucket - 3](images/hosting-services/aws-7.png) - -The permissions and properties can be set afterward, for the moment keep them as default. - -After selecting `Upload`, make sure that the status is `Succeeded`. - -![Upload asset on S3 bucket - 4](images/hosting-services/aws-8.png) - -**Step 3 - Access the Object URL on S3 Bucket** - -By default, the permissions of accessing the file from the S3 bucket are set to private. To publish an asset on the market, the S3 URL needs to be public. This step shows how to set up access control policies to grant permissions to others. - -**Editing permissions** - -Go to the `Permissions` tab and select `Edit` and then uncheck `Block all public access` boxes to give everyone read access to the object and click `Save`. - -If editing the permissions is unavailable, modify the `Object Ownership` by enabling the ACLs as shown below. - -![Access the Object URL on S3 Bucket - 1](images/hosting-services/aws-9.png) - -**Modifying bucket policy** - -To have the bucket granted public access, its policy needs to be modified likewise. - -Note that the `` must be chosen from the personal buckets dashboard. - -```JSON -{ - "Version": "2012-10-17", - "Statement": [ - { - "Sid": "Public S3 Bucket", - "Principal": "*", - "Effect": "Allow", - "Action": "s3:GetObject", - "Resource": "arn:aws:s3:::/*" - } - ] -} -``` - -After saving the changes, the bucket should appear as `Public` access. - -![Access the Object URL on S3 Bucket - 2](images/hosting-services/aws-10.png) - -**Verify the object URL on public access** - -Select the file from the bucket that needs verification and select `Open`. Now download the file on your system. - -![Access the Object URL on S3 Bucket - 3](images/hosting-services/aws-11.png) - -**Step 4 - Get the S3 Bucket Link & Publish Asset on Market** - -Now that the S3 endpoint has public access, the asset will be hosted successfully. 
- -Go to [Ocean Market](https://market.oceanprotocol.com/publish/1) to complete the form for asset creation. - -Copy the `Object URL` that can be found at `Object Overview` from the AWS S3 bucket and paste it into the `File` field from the form found at [step 2](https://market.oceanprotocol.com/publish/2) as it is illustrated below. - -![Get the S3 Bucket Link & Publish Asset on Market - 1](images/hosting-services/aws-12.png) - -### Azure storage - -Azure provides various options to host data and multiple configuration possibilities. Publishers are required to do their research and decide what would be the right choice. The below steps provide one of the possible ways to host data using Azure storage and publish it on Ocean Marketplace. - -**Prerequisite** - -Create an account on [Azure](https://azure.microsoft.com/en-us/). Users might also be asked to provide payment details and billing addresses that are out of this tutorial's scope. - -**Step 1 - Create a storage account** - -**Go to Azure portal** - -Go to the Azure portal: https://portal.azure.com/#home and select `Storage accounts` as shown below. - -![Create a storage account - 1](images/hosting-services/azure-1.png) - -**Create a new storage account** - -![Create a storage account - 2](images/hosting-services/azure-2.png) - -**Fill in the details** - -![Add details](images/hosting-services/azure-3.png) - -**Storage account created** - -![Storage account created](images/hosting-services/azure-4.png) - -**Step 2 - Create a blob container** - -![Create a blob container](images/hosting-services/azure-5.png) - -**Step 3 - Upload a file** - -![Upload a file](images/hosting-services/azure-6.png) - -**Step 4 - Share the file** - -**Select the file to be published and click Generate SAS** - -![Click generate SAS](images/hosting-services/azure-7.png) - -**Configure the SAS details and click `Generate SAS token and URL`** - -![Generate link to file](images/hosting-services/azure-8.png) - -**Copy the generated link** - -![Copy the link](images/hosting-services/azure-9.png) - -**Step 5 - Publish the asset using the generated link** - -Now, copy and paste the link into the Publish page in the Ocean Marketplace. 
- -![Publish the file as an asset](images/hosting-services/azure-10.png) diff --git a/using-ocean-market/images/hosting-services/azure-1.png b/using-ocean-market/images/hosting-services/azure-1.png deleted file mode 100644 index fa00009a..00000000 Binary files a/using-ocean-market/images/hosting-services/azure-1.png and /dev/null differ diff --git a/using-ocean-market/images/hosting-services/azure-2.png b/using-ocean-market/images/hosting-services/azure-2.png deleted file mode 100644 index 511ff199..00000000 Binary files a/using-ocean-market/images/hosting-services/azure-2.png and /dev/null differ diff --git a/using-ocean-market/images/hosting-services/azure-3.png b/using-ocean-market/images/hosting-services/azure-3.png deleted file mode 100644 index 4cf63916..00000000 Binary files a/using-ocean-market/images/hosting-services/azure-3.png and /dev/null differ diff --git a/using-ocean-market/images/hosting-services/azure-4.png b/using-ocean-market/images/hosting-services/azure-4.png deleted file mode 100644 index 2b074641..00000000 Binary files a/using-ocean-market/images/hosting-services/azure-4.png and /dev/null differ diff --git a/using-ocean-market/images/hosting-services/azure-5.png b/using-ocean-market/images/hosting-services/azure-5.png deleted file mode 100644 index c07525e5..00000000 Binary files a/using-ocean-market/images/hosting-services/azure-5.png and /dev/null differ diff --git a/using-ocean-market/images/hosting-services/azure-6.png b/using-ocean-market/images/hosting-services/azure-6.png deleted file mode 100644 index 4c576189..00000000 Binary files a/using-ocean-market/images/hosting-services/azure-6.png and /dev/null differ diff --git a/using-ocean-market/images/hosting-services/azure-7.png b/using-ocean-market/images/hosting-services/azure-7.png deleted file mode 100644 index 000dc780..00000000 Binary files a/using-ocean-market/images/hosting-services/azure-7.png and /dev/null differ diff --git a/using-ocean-market/images/hosting-services/azure-8.png b/using-ocean-market/images/hosting-services/azure-8.png deleted file mode 100644 index d424ccae..00000000 Binary files a/using-ocean-market/images/hosting-services/azure-8.png and /dev/null differ diff --git a/using-ocean-market/images/hosting-services/azure-9.png b/using-ocean-market/images/hosting-services/azure-9.png deleted file mode 100644 index f46b3c23..00000000 Binary files a/using-ocean-market/images/hosting-services/azure-9.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/Swap-1 (1).png b/using-ocean-market/images/marketplace/Swap-1 (1).png deleted file mode 100644 index e6f24f08..00000000 Binary files a/using-ocean-market/images/marketplace/Swap-1 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/Swap-1.png b/using-ocean-market/images/marketplace/Swap-1.png deleted file mode 100644 index e6f24f08..00000000 Binary files a/using-ocean-market/images/marketplace/Swap-1.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/Swap-2 (1).png b/using-ocean-market/images/marketplace/Swap-2 (1).png deleted file mode 100644 index 79608dfd..00000000 Binary files a/using-ocean-market/images/marketplace/Swap-2 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/Swap-2.png b/using-ocean-market/images/marketplace/Swap-2.png deleted file mode 100644 index 79608dfd..00000000 Binary files a/using-ocean-market/images/marketplace/Swap-2.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/Swap-3 (1).png 
b/using-ocean-market/images/marketplace/Swap-3 (1).png deleted file mode 100644 index 9888497a..00000000 Binary files a/using-ocean-market/images/marketplace/Swap-3 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/Swap-3.png b/using-ocean-market/images/marketplace/Swap-3.png deleted file mode 100644 index 9888497a..00000000 Binary files a/using-ocean-market/images/marketplace/Swap-3.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/Swap-4 (1).png b/using-ocean-market/images/marketplace/Swap-4 (1).png deleted file mode 100644 index f7445b06..00000000 Binary files a/using-ocean-market/images/marketplace/Swap-4 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/Swap-4.png b/using-ocean-market/images/marketplace/Swap-4.png deleted file mode 100644 index f7445b06..00000000 Binary files a/using-ocean-market/images/marketplace/Swap-4.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/Swap-5 (1).png b/using-ocean-market/images/marketplace/Swap-5 (1).png deleted file mode 100644 index e3bcef4f..00000000 Binary files a/using-ocean-market/images/marketplace/Swap-5 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/Swap-5.png b/using-ocean-market/images/marketplace/Swap-5.png deleted file mode 100644 index e3bcef4f..00000000 Binary files a/using-ocean-market/images/marketplace/Swap-5.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/add-liquidity-1 (1).png b/using-ocean-market/images/marketplace/add-liquidity-1 (1).png deleted file mode 100644 index 101e015a..00000000 Binary files a/using-ocean-market/images/marketplace/add-liquidity-1 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/add-liquidity-1.png b/using-ocean-market/images/marketplace/add-liquidity-1.png deleted file mode 100644 index 101e015a..00000000 Binary files a/using-ocean-market/images/marketplace/add-liquidity-1.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/add-liquidity-2 (1).png b/using-ocean-market/images/marketplace/add-liquidity-2 (1).png deleted file mode 100644 index 41f2dc60..00000000 Binary files a/using-ocean-market/images/marketplace/add-liquidity-2 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/add-liquidity-2.png b/using-ocean-market/images/marketplace/add-liquidity-2.png deleted file mode 100644 index 41f2dc60..00000000 Binary files a/using-ocean-market/images/marketplace/add-liquidity-2.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/add-liquidity-3 (1).png b/using-ocean-market/images/marketplace/add-liquidity-3 (1).png deleted file mode 100644 index 451af75c..00000000 Binary files a/using-ocean-market/images/marketplace/add-liquidity-3 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/add-liquidity-3.png b/using-ocean-market/images/marketplace/add-liquidity-3.png deleted file mode 100644 index 451af75c..00000000 Binary files a/using-ocean-market/images/marketplace/add-liquidity-3.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/add-liquidity-4 (1).png b/using-ocean-market/images/marketplace/add-liquidity-4 (1).png deleted file mode 100644 index ab83f845..00000000 Binary files a/using-ocean-market/images/marketplace/add-liquidity-4 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/add-liquidity-4.png 
b/using-ocean-market/images/marketplace/add-liquidity-4.png deleted file mode 100644 index ab83f845..00000000 Binary files a/using-ocean-market/images/marketplace/add-liquidity-4.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/add-liquidity-5 (1).png b/using-ocean-market/images/marketplace/add-liquidity-5 (1).png deleted file mode 100644 index d9ed0824..00000000 Binary files a/using-ocean-market/images/marketplace/add-liquidity-5 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/add-liquidity-5.png b/using-ocean-market/images/marketplace/add-liquidity-5.png deleted file mode 100644 index d9ed0824..00000000 Binary files a/using-ocean-market/images/marketplace/add-liquidity-5.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/add-liquidity-6 (1).png b/using-ocean-market/images/marketplace/add-liquidity-6 (1).png deleted file mode 100644 index 2546323c..00000000 Binary files a/using-ocean-market/images/marketplace/add-liquidity-6 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/add-liquidity-6.png b/using-ocean-market/images/marketplace/add-liquidity-6.png deleted file mode 100644 index 2546323c..00000000 Binary files a/using-ocean-market/images/marketplace/add-liquidity-6.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/add-liquidity-connect-wallet (1).png b/using-ocean-market/images/marketplace/add-liquidity-connect-wallet (1).png deleted file mode 100644 index 01b4fa51..00000000 Binary files a/using-ocean-market/images/marketplace/add-liquidity-connect-wallet (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/add-liquidity-connect-wallet.png b/using-ocean-market/images/marketplace/add-liquidity-connect-wallet.png deleted file mode 100644 index 01b4fa51..00000000 Binary files a/using-ocean-market/images/marketplace/add-liquidity-connect-wallet.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/connect-wallet (1).png b/using-ocean-market/images/marketplace/connect-wallet (1).png deleted file mode 100644 index feee3631..00000000 Binary files a/using-ocean-market/images/marketplace/connect-wallet (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/connect-wallet.png b/using-ocean-market/images/marketplace/connect-wallet.png deleted file mode 100644 index feee3631..00000000 Binary files a/using-ocean-market/images/marketplace/connect-wallet.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/consume-1 (1).png b/using-ocean-market/images/marketplace/consume-1 (1).png deleted file mode 100644 index 562608f8..00000000 Binary files a/using-ocean-market/images/marketplace/consume-1 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/consume-1.png b/using-ocean-market/images/marketplace/consume-1.png deleted file mode 100644 index 562608f8..00000000 Binary files a/using-ocean-market/images/marketplace/consume-1.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/consume-2 (1).png b/using-ocean-market/images/marketplace/consume-2 (1).png deleted file mode 100644 index 5735e447..00000000 Binary files a/using-ocean-market/images/marketplace/consume-2 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/consume-2.png b/using-ocean-market/images/marketplace/consume-2.png deleted file mode 100644 index 5735e447..00000000 Binary files a/using-ocean-market/images/marketplace/consume-2.png 
and /dev/null differ diff --git a/using-ocean-market/images/marketplace/consume-3 (1).png b/using-ocean-market/images/marketplace/consume-3 (1).png deleted file mode 100644 index 51376a61..00000000 Binary files a/using-ocean-market/images/marketplace/consume-3 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/consume-3.png b/using-ocean-market/images/marketplace/consume-3.png deleted file mode 100644 index 51376a61..00000000 Binary files a/using-ocean-market/images/marketplace/consume-3.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/consume-4 (1).png b/using-ocean-market/images/marketplace/consume-4 (1).png deleted file mode 100644 index 41581d67..00000000 Binary files a/using-ocean-market/images/marketplace/consume-4 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/consume-4.png b/using-ocean-market/images/marketplace/consume-4.png deleted file mode 100644 index 41581d67..00000000 Binary files a/using-ocean-market/images/marketplace/consume-4.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/consume-5 (1).png b/using-ocean-market/images/marketplace/consume-5 (1).png deleted file mode 100644 index aa6ffb46..00000000 Binary files a/using-ocean-market/images/marketplace/consume-5 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/consume-5.png b/using-ocean-market/images/marketplace/consume-5.png deleted file mode 100644 index aa6ffb46..00000000 Binary files a/using-ocean-market/images/marketplace/consume-5.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/consume-6.png b/using-ocean-market/images/marketplace/consume-6.png deleted file mode 100644 index 756e8e73..00000000 Binary files a/using-ocean-market/images/marketplace/consume-6.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/consume-connect-wallet (1).png b/using-ocean-market/images/marketplace/consume-connect-wallet (1).png deleted file mode 100644 index 34ba0432..00000000 Binary files a/using-ocean-market/images/marketplace/consume-connect-wallet (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/consume-connect-wallet.png b/using-ocean-market/images/marketplace/consume-connect-wallet.png deleted file mode 100644 index 34ba0432..00000000 Binary files a/using-ocean-market/images/marketplace/consume-connect-wallet.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/marketplace-landing-page.png b/using-ocean-market/images/marketplace/marketplace-landing-page.png deleted file mode 100644 index e57548bd..00000000 Binary files a/using-ocean-market/images/marketplace/marketplace-landing-page.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/pricing-1.png b/using-ocean-market/images/marketplace/pricing-1.png deleted file mode 100644 index be239cf5..00000000 Binary files a/using-ocean-market/images/marketplace/pricing-1.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/pricing-fixed-2.png b/using-ocean-market/images/marketplace/pricing-fixed-2.png deleted file mode 100644 index 60f3063d..00000000 Binary files a/using-ocean-market/images/marketplace/pricing-fixed-2.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/pricing-fixed-3.png b/using-ocean-market/images/marketplace/pricing-fixed-3.png deleted file mode 100644 index cda668a2..00000000 Binary files a/using-ocean-market/images/marketplace/pricing-fixed-3.png and 
/dev/null differ diff --git a/using-ocean-market/images/marketplace/pricing-fixed-4.png b/using-ocean-market/images/marketplace/pricing-fixed-4.png deleted file mode 100644 index 6966d027..00000000 Binary files a/using-ocean-market/images/marketplace/pricing-fixed-4.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/pricing-type.png b/using-ocean-market/images/marketplace/pricing-type.png deleted file mode 100644 index 252c87b4..00000000 Binary files a/using-ocean-market/images/marketplace/pricing-type.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/publish (1).png b/using-ocean-market/images/marketplace/publish (1).png deleted file mode 100644 index aac3a82d..00000000 Binary files a/using-ocean-market/images/marketplace/publish (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/publish-1 (1).png b/using-ocean-market/images/marketplace/publish-1 (1).png deleted file mode 100644 index 155612cb..00000000 Binary files a/using-ocean-market/images/marketplace/publish-1 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/publish-1.png b/using-ocean-market/images/marketplace/publish-1.png deleted file mode 100644 index 155612cb..00000000 Binary files a/using-ocean-market/images/marketplace/publish-1.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/publish-2 (1).png b/using-ocean-market/images/marketplace/publish-2 (1).png deleted file mode 100644 index b918af46..00000000 Binary files a/using-ocean-market/images/marketplace/publish-2 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/publish-2.png b/using-ocean-market/images/marketplace/publish-2.png deleted file mode 100644 index b918af46..00000000 Binary files a/using-ocean-market/images/marketplace/publish-2.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/publish-3 (1).png b/using-ocean-market/images/marketplace/publish-3 (1).png deleted file mode 100644 index 1bf12e6c..00000000 Binary files a/using-ocean-market/images/marketplace/publish-3 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/publish-3.png b/using-ocean-market/images/marketplace/publish-3.png deleted file mode 100644 index 1bf12e6c..00000000 Binary files a/using-ocean-market/images/marketplace/publish-3.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/publish-4 (1).png b/using-ocean-market/images/marketplace/publish-4 (1).png deleted file mode 100644 index 65b47328..00000000 Binary files a/using-ocean-market/images/marketplace/publish-4 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/publish-4.png b/using-ocean-market/images/marketplace/publish-4.png deleted file mode 100644 index 65b47328..00000000 Binary files a/using-ocean-market/images/marketplace/publish-4.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/publish-5 (1).png b/using-ocean-market/images/marketplace/publish-5 (1).png deleted file mode 100644 index 94621b3e..00000000 Binary files a/using-ocean-market/images/marketplace/publish-5 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/publish-5.png b/using-ocean-market/images/marketplace/publish-5.png deleted file mode 100644 index 94621b3e..00000000 Binary files a/using-ocean-market/images/marketplace/publish-5.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/publish-6 (1).png 
b/using-ocean-market/images/marketplace/publish-6 (1).png deleted file mode 100644 index 85292a34..00000000 Binary files a/using-ocean-market/images/marketplace/publish-6 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/publish-6.png b/using-ocean-market/images/marketplace/publish-6.png deleted file mode 100644 index 85292a34..00000000 Binary files a/using-ocean-market/images/marketplace/publish-6.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/publish-7 (1).png b/using-ocean-market/images/marketplace/publish-7 (1).png deleted file mode 100644 index b74663e1..00000000 Binary files a/using-ocean-market/images/marketplace/publish-7 (1).png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/publish-7.png b/using-ocean-market/images/marketplace/publish-7.png deleted file mode 100644 index b74663e1..00000000 Binary files a/using-ocean-market/images/marketplace/publish-7.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/publish-8.png b/using-ocean-market/images/marketplace/publish-8.png deleted file mode 100644 index 4bee7cee..00000000 Binary files a/using-ocean-market/images/marketplace/publish-8.png and /dev/null differ diff --git a/using-ocean-market/images/marketplace/publish.png b/using-ocean-market/images/marketplace/publish.png deleted file mode 100644 index aac3a82d..00000000 Binary files a/using-ocean-market/images/marketplace/publish.png and /dev/null differ diff --git a/using-ocean-market/marketplace-download-data-asset.md b/using-ocean-market/marketplace-download-data-asset.md deleted file mode 100644 index 6f80ac33..00000000 --- a/using-ocean-market/marketplace-download-data-asset.md +++ /dev/null @@ -1,43 +0,0 @@ ---- -description: Tutorial to download assets using Ocean Market ---- - -# Download a Data Asset - -### Access marketplace - -1. Go to Ocean [Marketplace](https://v4.market.oceanprotocol.com/). -2. Search for the data asset. The Ocean Marketplace provides features to search the Data/Algorithms by text, and users can also sort the result by published date. -3. Connect wallet. - -![Connect wallet](../.gitbook/assets/consume-connect-wallet.png) - -``` -In this tutorial, we will be using the Polygon Mumbai test network. -``` - -### Tutorial - -#### Step 1 - Click buy - -The buy button is enabled only if the connected wallet address has enough OCEAN tokens to exchange them with 1 datatoken. - -![Buy](../.gitbook/assets/consume-1.png) - -#### Step 2 - Allow access to OCEAN token(s) - -![Transaction 1: Permissions to access OCEAN tokens](../.gitbook/assets/consume-2.png) - -#### Step 3 - Buy a datatoken by exchanging it with OCEAN token(s) - -![Transaction 2: Buy datatoken](../.gitbook/assets/consume-3.png) - -#### Step 4 - Click download - -![Download asset](../.gitbook/assets/consume-4.png) - -#### Step 5 - Sign message - -After signing the message, the file download will start. - -![Sign](../.gitbook/assets/consume-5.png) diff --git a/using-ocean-market/marketplace-publish-data-asset.md b/using-ocean-market/marketplace-publish-data-asset.md deleted file mode 100644 index f3559cba..00000000 --- a/using-ocean-market/marketplace-publish-data-asset.md +++ /dev/null @@ -1,116 +0,0 @@ ---- -description: Tutorial to publish assets using the Ocean Market ---- - -# Publish a Data Asset - -### What can be published? - -Ocean Market provides a convenient interface for individuals and organizations to publish their data. 
Datasets can be images, location information, audio, video, sales data, or combinations of all! There is no exhaustive list of what type of data can be published on the Market. Please note the Ocean Protocol team maintains a purgatory list [here](https://github.com/oceanprotocol/list-purgatory) to block addresses and remove assets for any violations. - -### Tutorial - -#### Connect wallet and navigate to the publish page - -1. Go to [Ocean Market](https://v4.market.oceanprotocol.com) -2. Connect wallet. - -![Connect wallet](../.gitbook/assets/connect-wallet.png) - -In this tutorial, we will be using the Polygon Mumbai test network. - -3\. Go to the publish page. - -![Publish page](../.gitbook/assets/publish.png) - -#### Step 1 - Metadata - -Fill in the metadata. - -_Mandatory fields are marked with \*_ - -* **Asset type**\* - - An asset can be a _dataset_ or an _algorithm_. The asset type cannot be changed after publication. -* **Title**\* - - The descriptive name of the asset. This field is editable after the asset publication. -* **Description**\* - - Description of the asset. Ocean Marketplace supports plain text and Markdown format for the description field. This field is editable after the asset publication. -* **Author**\* - - The author of the asset. The author can be an individual or an organization. This field is editable after the asset publication. -* **Tags** - - Tags help the asset to be discoverable. If not provided, the list of tags is empty by default. - -![Asset metadata](../.gitbook/assets/publish-1.png) - -#### Step 2 - Access details - -_Mandatory fields are marked with \*_ - -* **Access Type**\* - - An asset can be a downloadable file or a compute service on which buyers can run their algorithm. Through **download**, buyers will be able to download the dataset. Through **compute**, buyers will be able to use the dataset in a compute-to-data environment. -* **Provider URL**\* - - Provider facilitates the asset download to buyers or for computing jobs and much more. -* **File**\* - - The direct URL of the dataset to be published. The file needs to be publicly accessible to be downloadable by buyers. If the file is hosted on services like Google Drive, the URL provided needs to point directly to the data asset file. Also, the file needs to have the proper permissions to be downloaded by anybody. - - **Provider** encrypts this field before publishing the asset on-chain. -* **Sample file** - - An optional field through which publishers provide a sample file of the dataset they want to publish. The buyers can access it before buying the dataset. This field is editable after the asset publication. - - **Provider** encrypts this field before publishing the asset on-chain. -* **Timeout**\* - - This field specifies how long the buyer can access the dataset after the dataset is purchased. This field is editable after the asset publication. - -![Access details](../.gitbook/assets/publish-2.png) - -#### Step 3 - Pricing - -The publisher needs to choose a pricing option for the asset before publishing the data asset. The pricing schema is not editable after the asset publication. - -There are 2 pricing options for asset publication on Ocean Marketplace. - -1. Fixed pricing -2. Free pricing - -With the _fixed pricing_ schema, the publisher sets the price that buyers will pay to download the data asset. - -With the _free pricing_ schema, the publisher provides an asset that is free to be downloaded by anyone. 
- -For more information on the pricing models, please refer to this [document](../core-concepts/asset-pricing.md). - -For a deep dive into the fee structure, please refer to this [document](../core-concepts/fees.md). - -![Asset pricing](../.gitbook/assets/publish-3.png) - -#### Step 4 - Preview - -![Preview](../.gitbook/assets/publish-4.png) - -#### Step 5 - Blockchain transactions - -![Transaction 1 - Deploy data NFT and datatoken](../.gitbook/assets/publish-5.png) - - -![Transaction 2 - Deploy data NFT and datatoken](../.gitbook/assets/publish-6.png) - -#### Confirmation - -Now, the asset is successfully published and available in the Ocean Market. - -![Successful publish](../.gitbook/assets/publish-7.png) - -On the [profile page](https://v4.market.oceanprotocol.com/profile), the publisher has access to all their published assets. - -### Other Articles - -https://blog.oceanprotocol.com/on-selling-data-in-ocean-market-9afcfa1e6e43 diff --git a/veocean-data-farming/README.md b/veocean-data-farming/README.md deleted file mode 100644 index 6bde6a78..00000000 --- a/veocean-data-farming/README.md +++ /dev/null @@ -1,40 +0,0 @@ ---- -description: An overview of Ocean Protocol's governance and incentive mechanisms ---- -# veOCEAN & Data Farming - -veOCEAN is a fork of veCRV that enables you to become a governance participant, receive rewards, and engage with different protocol mechanisms. - -The following docs should provide you with sufficient intuition to access, utilize, and build upon the protocol's core incentive and reward system: Data Farming. - -![](./images/df_rewards_page.png) - -## veOCEAN - -Learning about [veOCEAN](veocean.md) will help you answer the question "What can I do with my veOCEAN?" and give you insight into why veOCEAN exists and how it works. - -You will learn that just by holding veOCEAN passively, you are able to earn rewards. - -## Data Farming - -[Data Farming 101](df-intro.md) will teach you about the different reward systems, how they work, and how to access them. By the end of it, you should be more familiar with how Data Farming works and be able to take the next steps to curate assets. - -[Data Farming Background](df-background.md) will provide you with more intuition about Data Farming, briefly explain the Reward Function, and describe how the program has evolved over time. - -## Delegation - -[Delegation](delegation.md) will teach you how to share your veOCEAN allocation power with other users who can manage Data Farming for you. - -Once delegated, rewards will be sent to the wallet address you delegated to. The delegation receiver is in charge of your rewards and of the process of returning them back to you. - -## Further Reading - -Finally, if you want to continue expanding your knowledge of OCEAN token emissions and APY estimates, and get useful answers to some of the most common questions, you can read the following: - -[Emissions & APYs](emissions-apys.md) will provide you with information about how OCEAN will be released over time through the Data Farming program, along with APY studies. - -Our [FAQ](faq.md) answers many different questions about staking, chains, deployments, and other details that may be valuable to you.
- -## Reference - -All content here has been assembled with reference to the [Ocean Data Farming Series](https://blog.oceanprotocol.com/ocean-data-farming-series-c7922f1d0e45), official [Ocean Protocol github repositories](https://github.com/oceanprotocol/), and the [v4 Whitepapers](https://oceanprotocol.com/tech-whitepaper.pdf). \ No newline at end of file diff --git a/veocean-data-farming/delegation.md b/veocean-data-farming/delegation.md deleted file mode 100644 index e5ed4c59..00000000 --- a/veocean-data-farming/delegation.md +++ /dev/null @@ -1,19 +0,0 @@ -## Delegation - -Delegation allows you to temporarily transfer your veOCEAN allocation power to another wallet address. This feature enables you to earn a higher annual percentage yield (APY) by delegating to an address that efficiently manages your allocation, without the need for reallocation and transaction fees. - -If you have multiple wallets, delegation enables you to manage your allocations with just one wallet address. - -When you delegate, you transfer 100% of your veOCEAN allocation power for a limited period. After delegating, you cannot manage your allocations until the delegation expires. The delegation expiration date is the same as the veOCEAN lock end date at the time of delegation. If necessary, you can extend your lock before delegating. You can cancel your delegation at any time. - -Once delegated, rewards will be sent to the wallet address you delegated to. The delegation receiver is in charge of your rewards and of the process of returning them back to you. - -Follow these steps to delegate your veOCEAN: - -1. Go to the DF Portal and enter or copy the wallet address you wish to delegate to into the 'Receiver wallet address' field. -2. Click the Delegate button and sign the transaction. You can view information about your delegation in the My Delegations component. -3. If needed, you can cancel the delegation to regain your allocation power before the delegation expires. - -![](./images/veOCEAN-Delegation.png) - -If you receive veOCEAN allocation power from other wallets, you will receive their rewards and be responsible for distributing those rewards back to the delegators. You cannot re-delegate veOCEAN that was delegated to you; you can only delegate the veOCEAN from your own lock. diff --git a/veocean-data-farming/df-background.md b/veocean-data-farming/df-background.md deleted file mode 100644 index 4bcdfdcd..00000000 --- a/veocean-data-farming/df-background.md +++ /dev/null @@ -1,61 +0,0 @@ ---- -description: Data Farming (DF) incentivizes growth of Data Consume Volume (DCV) in the Ocean ecosystem. ---- -# Data Farming Background - -Data Farming rewards OCEAN to stakers as a function of consume volume and liquidity. It’s like DeFi liquidity mining, but tuned for data consumption. DF’s aim is to achieve a minimum supply of data for network effects to kick in, and once the network flywheel is spinning, to increase the growth rate. - -## Active Work to Drive APY - -Data Farming is not a wholly passive activity. The name of the game is to drive Data Consume Volume (DCV). High APYs happen only when there is sufficiently high DCV. High DCV means publishing and consuming truly useful datasets (or algorithms). - -Thus, if you really want to max out your APY: -1. Create & publish datasets (and make $ by selling them) — or work with people who can. -1. Lock OCEAN and stake veOCEAN on them. -1. Buy the datasets (and use them to make $) — or work with people who can. -1. Claim the rewards. - -Driving DCV for publishing & consuming is your challenge.
It will take real work. And then the reward is APY. It’s incentives all the way down :) - -## Reward Function - -The Reward Function (RF) governs how active rewards are allocated to stakers. - -Rewards are calculated as follows: -1. Distribute OCEAN across each asset based on rank: the highest-DCV asset gets the most OCEAN, and so on. -1. For each asset and each veOCEAN holder: -   - If the holder is the asset's publisher, double the effective stake (2x). -   - Baseline rewards = (% stake in asset) * (OCEAN for the asset). -   - Cap rewards to the asset at 125% APY. -   - Cap rewards at 0.1% of the asset’s DCV. This deters wash consuming. - -You can find this code inside [calcrewards.py](https://github.com/oceanprotocol/df-py/blob/main/util/calcrewards.py) in the Ocean Protocol [df-py repo](https://github.com/oceanprotocol/df-py/). A simplified sketch of this calculation appears below.
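The following Python sketch is a simplified, hypothetical rendering of the steps above. It is not the df-py implementation (see calcrewards.py for the real logic); the proportional DCV split, the data structures, and all names are assumptions for illustration only.

```python
# Hypothetical sketch of the active-rewards calculation described above.
# Assumptions: asset_dcv, stakes, and publishers share the same asset keys;
# the real df-py code uses a ranked (not purely proportional) budget split.

WEEKS_PER_YEAR = 52.14
APY_BOUND = 1.25    # cap rewards to an asset at 125% APY
DCV_BOUND = 0.001   # cap rewards at 0.1% of the asset's DCV

def calc_active_rewards(ocean_budget, asset_dcv, stakes, publishers):
    """ocean_budget: OCEAN available this week.
    asset_dcv: {asset: data consume volume}.
    stakes: {asset: {holder: veOCEAN allocated}}.
    publishers: {asset: publisher address}."""
    # 1. Distribute the weekly budget across assets by DCV.
    total_dcv = sum(asset_dcv.values()) or 1.0
    ocean_per_asset = {a: ocean_budget * dcv / total_dcv
                       for a, dcv in asset_dcv.items()}

    rewards = {}
    for asset, holders in stakes.items():
        # 2. The publisher's stake counts double (2x).
        eff = {h: s * (2.0 if h == publishers.get(asset) else 1.0)
               for h, s in holders.items()}
        total_eff = sum(eff.values()) or 1.0
        for holder, stake in eff.items():
            # Baseline: (% stake in asset) * (OCEAN for the asset).
            r = (stake / total_eff) * ocean_per_asset[asset]
            # Cap at 125% APY on the holder's (undoubled) stake, pro-rated weekly.
            r = min(r, holders[holder] * APY_BOUND / WEEKS_PER_YEAR)
            # Cap at 0.1% of the asset's DCV, to deter wash consuming.
            r = min(r, asset_dcv[asset] * DCV_BOUND)
            rewards[holder] = rewards.get(holder, 0.0) + r
    return rewards
```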
- -## Data Assets that Qualify for DF - -Data assets that have veOCEAN allocated towards them get DF rewards. - -The data asset may be of any type: a dataset (for static URIs), an algorithm for Compute-to-Data, or any other datatoken-gated system. The data asset may be fixed price or free. If fixed price, any base token is acceptable (OCEAN, H2O, USDC, ..). - -To qualify for DF, a data asset must also: -- Have been created by Ocean smart contracts [deployed](https://github.com/oceanprotocol/contracts/blob/v4main/addresses/address.json) by OPF to [production networks](https://docs.oceanprotocol.com/core-concepts/networks) -- Be visible on [Ocean Market](https://market.oceanprotocol.com/) -- Not be in [purgatory](https://github.com/oceanprotocol/list-purgatory/blob/main/policies/README.md) - -## 4 Phases of Data Farming - -Data Farming has evolved over time and will continue to do so as the emission curve progresses. We are now in DF Main; below are the previous phases and the parameters used during the evolution of the Data Farming program. - -**DF Alpha - Rounds 1-4 (4 wks)** -10K OCEAN rewards were budgeted per week. Counting started Thu June 16, 2022 and ended July 13, 2022. Rewards were distributed at the end of every week, for the activity of the previous week. It ran for 4 weeks. The aim was to test the technology, learn, and onboard data publishers. - -**DF/VE Alpha - Rounds 5-8 (4 wks)** -10K OCEAN rewards were budgeted per week. Counting started Thu Sep 29, 2022 and ended Oct 27, 2022. Rewards were distributed at the end of every week, for the activity of the previous week. It ran for 4 weeks. The aim was to resume Data Farming along with veOCEAN, test the technology, onboard data publishers, and keep learning. - -**DF Beta - Rounds 9-28 (20 wks)** -Up to 100K OCEAN rewards were budgeted per week. Counting started Thu Oct 27, 2022 and ended March 15, 2023. It ran for 20 weeks. The aim was to test the effect of larger incentives and support ecosystem participation, while continuing to refine the underlying technology. - -**DF Main - Rounds 29-1000+** -DF Main immediately followed DF Beta on Thu Mar 16, 2023. Rewards begin at 150K OCEAN per week and ramp up to 1.1M OCEAN per week. DF Main emits 503.4M OCEAN worth of rewards and lasts for decades. Expected APY is 125% over many months (once fully ramped), staying generous over the long term. - -The amount of OCEAN released is determined by the emission schedule as defined by the [Emission Curve](emissions-apys.md#emissions--apys), and is perhaps more easily understood in the [Reward Schedule](df-intro.md#reward-schedule). \ No newline at end of file diff --git a/veocean-data-farming/df-intro.md b/veocean-data-farming/df-intro.md deleted file mode 100644 index a903d6e8..00000000 --- a/veocean-data-farming/df-intro.md +++ /dev/null @@ -1,99 +0,0 @@ ---- -description: An introduction to Data Farming and Ocean Protocol's key incentives. ---- - -# Data Farming 101 - -Data Farming (DF) incentivizes growth of Data Consume Volume (DCV) in the Ocean ecosystem. - -It rewards OCEAN to liquidity providers (stakers) as a function of consume volume and liquidity. It’s like DeFi liquidity mining, but tuned for data consumption. DF’s aim is to achieve a minimum supply of data for network effects to kick in, and once the network flywheel is spinning, to increase the growth rate. - -## Reward Categories - -Rewards are paid in OCEAN and distributed every week on Thursday as follows: - -| Passive Rewards | Active Rewards | -| --------------- | -------------- | -| 50% | 50% | - -Active Rewards are governed and defined by the [Reward Function](df-background.md#reward-function). - -**Final Caveat:** We reserve the right to make reasonable changes to these plans if unforeseen circumstances emerge. - -## How to access DF and claim rewards - -Please [follow this tutorial](../rewards/veOcean-Data-Farming-Tutorial.md) to learn how the Ocean Protocol reward programs work, and how to access them. - -Otherwise, go to the DF webapp at [df.oceandao.org](https://df.oceandao.org/) and explore Data Farming for yourself. - -### Where to claim? - -All earnings for veOCEAN holders are claimable on the “Rewards” page inside the Data Farming webapp on Ethereum mainnet. - -Data assets for DF may be published in any [network where Ocean’s deployed in production](../core-concepts/networks.md): Eth mainnet, Polygon, etc. - -### When to claim? - -There are fresh rewards available every Thursday at midnight GMT. If you wish, you can wait for many weeks to accumulate before claiming. (It’s all on-chain.) - -Rewards are calculated based on different factors, like your current veOCEAN balance and your Data Farming allocation. - -**Passive rewards** are distributed based on the amount of veOCEAN tokens held at the end of the previous round. If you lock your tokens during DF Round 1, you will only receive your passive rewards at the end of DF Round 2. - -**Active rewards** are distributed based on the amount of veOCEAN tokens allocated during the current round. If you allocate veOCEAN tokens before the end of DF Round 1, you will receive active rewards for that round. - -You can learn more about these details in the [Reward Function section](df-background.md#reward-function) and in the next section. - -### When to do a first claim? - -From the time you lock OCEAN, you must wait at least a week, and up to two weeks, to be able to claim rewards. - -The nerdy version: if you lock OCEAN on day x, you’ll be able to claim rewards at the first weekly ve “epoch” that begins after day x+7. - -This behavior is inherited from [veCRV](https://curve.readthedocs.io/dao-fees.html); [here’s the code](https://github.com/oceanprotocol/contracts/blob/main/contracts/ve/veFeeDistributor.vy#L240-L256). A small sketch of this timing rule follows.
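Below is a minimal Python sketch of the first-claim rule just described. The assumption that ve epochs begin on Thursdays at 00:00 GMT (matching the weekly reward distribution) is for illustration; the authoritative logic lives in the veFeeDistributor.vy code linked above.

```python
# Sketch: lock on day x, claim at the first weekly ve epoch that begins
# strictly after day x + 7. Assumption: epochs start Thursday 00:00 GMT.
from datetime import datetime, timedelta, timezone

THURSDAY = 3  # Monday == 0 in Python's weekday() convention

def first_claim_date(lock_time: datetime) -> datetime:
    earliest = lock_time + timedelta(days=7)
    # Round up to the start of the next epoch after `earliest`.
    days_ahead = (THURSDAY - earliest.weekday()) % 7 or 7
    return (earliest + timedelta(days=days_ahead)).replace(
        hour=0, minute=0, second=0, microsecond=0)

# Example: locking on Friday 2023-03-17 means waiting ~13 days.
print(first_claim_date(datetime(2023, 3, 17, tzinfo=timezone.utc)))
# -> 2023-03-30 00:00:00+00:00 (a Thursday)
```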
## DF Main - -DF Main started Mar 16, 2023 in DF Round 29. DF29 has 150K OCEAN rewards available (a 2x increase from DF28). As DF Main progresses, rewards will increase to 300K (another 2x), then 600K (another 2x), then beyond 1.1M OCEAN/week (nearly another 2x), before decaying over time. - -As of DF29 (Mar 16, 2023), wash consuming is not profitable. So, organically generated Data Consume Volume is the main driver of active DF rewards. - -[Example APYs are 5–20%](emissions-apys.md#example-apys) between Passive & Active rewards. - -Full implementation of DF Main will roll out over many months, after which DF will be decentralized. - -DF Main lasts for decades. - -## Reward Schedule - -The table below cross-references DF Round Number, Start Date, Phase & Week, Sub-Phase & Week, and OCEAN Rewards/Week. - -![](./images/reward_schedule.png) -_Ocean Reward Schedule for the next 20+ years_ - -## Ranked Rewards - -In DF23, Ranked Rewards were introduced; they smooth the reward distribution by using a logarithmic function. - -**Since rewards are distributed across the Top 100 assets, all participants (Publishers & Curators) are now incentivized to support a broader range of assets rather than optimizing on a single asset.** - -At the top end, this helps increase the quality and diversification of inventory. - -At the bottom end, this eliminates some potential free-rider issues and smooths out the reward distribution. - -![](images/ranked_rewards_study.png) - -You can read more about the implementation [in this blog post](https://blog.oceanprotocol.com/data-farming-df22-completed-df23-started-reward-function-tuned-ffd4359657ee) and find the full study [in these slides](https://docs.google.com/presentation/d/1HIA2zV8NUPpCELmi2WFwnAbHmFFrcXjNQiCpEqJ2Jdg/). - -## Publisher Rewards - 2x Stake - -DF gives stronger incentives to publish data services, as follows. - -_All the veOCEAN a publisher has allocated to an asset they’ve published (“staked”) is treated as 2x the stake for the rewards calculation._ In effect, the publisher counts twice: -1. As a staker, for the veOCEAN staked on their own assets (1x). -1. As a publisher, for having veOCEAN staked on their own asset (1x). - -The final reward is then calculated and bundled together to be distributed. - -You can read more about the implementation [in this blog post](https://blog.oceanprotocol.com/data-farming-publisher-rewards-f2639525e508). diff --git a/veocean-data-farming/emissions-apys.md b/veocean-data-farming/emissions-apys.md deleted file mode 100644 index e7b32030..00000000 --- a/veocean-data-farming/emissions-apys.md +++ /dev/null @@ -1,56 +0,0 @@ ---- -description: Details on the emission curves and a study on estimated APYs ---- -# Emissions & APYs - -With veOCEAN, OceanDAO evolves to be more like CurveDAO: - -- ve is at the heart, with v = voting (in data asset curation) and e = escrowed (locked) OCEAN. The longer the lockup, the more voting power and rewards, which reconciles near- and long-term DAO incentives. -- The DAO has an increased bias to automation, and to minimizing the governance attack surface. - -The baseline emissions schedule determines the weekly OCEAN budget for this phase. The schedule is like Bitcoin’s, including a halving period of 4 years. Unlike Bitcoin, there is a burn-in period to ratchet up value-at-risk versus time: -- The curve initially gets a multiplier of 10% for 12 months (DF Main 1) -- Then, it transitions to a multiplier of 25% for 6 months (DF Main 2) -- Then, a multiplier of 50% for 6 months (DF Main 3) -- Finally, a multiplier of 100% (DF Main 4) - -We implement the first three phases as constants, because they are relatively short in duration. We implement the fourth phase as a Bitcoin-style exponential: a constant, with the constant dividing by two (“halvening”) every four years. A minimal sketch of this schedule follows.
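Below is a hypothetical Python sketch of this schedule. The `base_weekly` parameter (a notional full-rate weekly emission) and the week-based phase boundaries are illustrative assumptions; the authoritative parameters live in Ocean's df-py tooling.

```python
# Sketch of the DF Main emissions schedule described above: three constant
# burn-in multipliers (10%, 25%, 50%), then 100% with a halving every 4 years.

def weekly_emission(week: int, base_weekly: float) -> float:
    """OCEAN emitted in `week` of DF Main (week 0 = start of DF Main 1)."""
    y = week / 52.0  # years since DF Main started
    if y < 1.0:      # DF Main 1: 12 months at a 10% multiplier
        return 0.10 * base_weekly
    if y < 1.5:      # DF Main 2: 6 months at 25%
        return 0.25 * base_weekly
    if y < 2.0:      # DF Main 3: 6 months at 50%
        return 0.50 * base_weekly
    # DF Main 4: 100%, then the constant divides by two every four years.
    halvings = int((y - 2.0) // 4)
    return base_weekly / (2 ** halvings)
```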
- -Let’s visualize! - -## Emissions — first 5 years. - -The image below shows the first 5 years. The y-axis is OCEAN released each week. It’s log-scaled to make the differences easy to see. The x-axis is time, measured in weeks. We can see the distinct phases for DF Alpha (DF1 // week 0), DF/VE Alpha (DF5 // week 4), DF Beta (DF9 // week 8), DF Main 1 (DF29 // week 28), DF Main 2 (DF80 // week 79), DF Main 3 (DF106 // week 105), and DF Main 4 (DF132 // week 131). - -![](./images/emissions_first_5years.png) -_OCEAN released to DF per week — first 5 years_ - -## Emissions — first 20 years. - -The image below is like the previous one: OCEAN released per week, but now for the first 20 years. Week 131 onwards is DF Main 4. We can see that the y-value divides by two (“halvens”) every four years. - -![](./images/emissions_first_20years.png) -_OCEAN released to DF per week — first 20 years_ - -## Total OCEAN released. - -The image below shows the total OCEAN released by DF for the first 20 years. The y-axis is log-scaled to capture both the small initial rewards and the exponentially larger values later on. The x-axis is also log-scaled so that we can more readily see how the curve converges over time. - -![](./images/emissions_lifetime.png) -_Total OCEAN released to DF — first 20 years_ - -## Example APYs - -The plot below shows estimated APY over time. Green includes both passive and active rewards; black is just passive rewards. As of DF29, wash consume is no longer profitable, so we should expect a large drop in DCV and therefore in active rewards. So passive rewards (black) provide a solid baseline, with upside in active rewards (green). - -APYs are an estimate because APY depends on OCEAN locked. OCEAN locked for future weeks is not known precisely; it must be estimated. The yellow line is the model for OCEAN locked. We modeled OCEAN locked by observing linear growth from week 5 (when OCEAN locking was introduced) to week 28 (now): OCEAN locked grew from 7.89M OCEAN to 34.98M OCEAN, or 1.177M more OCEAN locked per week. - -![](./images/example_apys.png) -_Green: estimated APYs (passive + active). Black: estimated APYs (just passive). Yellow: estimated OCEAN locked_ -The plots are calculated from [this Google Sheet](https://docs.google.com/spreadsheets/d/1F4o7PbV45yW1aPWOJ2rwZEKkgJXbIk5Yq7tj8749drc/edit#gid=1051477754). - -OCEAN lock time affects APY. The numbers above assume that all locked OCEAN is locked for 4 years, so that 1 OCEAN → 1 veOCEAN. But APY could be much lower or higher if you lock for a shorter duration. Here are approximate bounds; a small sketch below illustrates the arithmetic. - -If you lock for 4 years and everyone else locks for 2, then multiply your expected APY by 2. If you lock for 4 years and others for 1, then multiply by 4. -Conversely, if you lock for 2 years and everyone else for 4, then divide your expected APY by 2. If you lock for 1 year and others for 4, then divide by 4. -The numbers assume that you’re actively allocating veOCEAN towards high-DCV data assets. For passive locking or low-DCV data assets, divide APY by 2 (approximately). \ No newline at end of file
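To illustrate the arithmetic behind these bounds: with a fixed weekly reward pool split pro rata by veOCEAN, and veOCEAN proportional to lock duration (4 years giving the full 1:1 rate), your APY relative to the all-4-year baseline scales with your lock duration versus everyone else's. A minimal, illustrative sketch (not protocol code):

```python
# Sketch of the lock-duration bounds above. Assumption: reward share is
# pro rata by veOCEAN, and veOCEAN = OCEAN * lock_years / 4.

def relative_apy(my_lock_years: float, others_lock_years: float) -> float:
    """Multiplier on the baseline APY (baseline: everyone locks 4 years)."""
    return (my_lock_years / 4.0) / (others_lock_years / 4.0)

print(relative_apy(4, 2))  # 2.0  -> multiply expected APY by 2
print(relative_apy(4, 1))  # 4.0  -> multiply by 4
print(relative_apy(2, 4))  # 0.5  -> divide expected APY by 2
print(relative_apy(1, 4))  # 0.25 -> divide by 4
```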
diff --git a/veocean-data-farming/veocean.md b/veocean-data-farming/veocean.md deleted file mode 100644 index 6c48eb17..00000000 --- a/veocean-data-farming/veocean.md +++ /dev/null @@ -1,109 +0,0 @@ ---- -description: An overview of the governance token, veOCEAN (vote-escrowed). ---- -# veOCEAN - -veOCEAN is a mechanism to align near-term incentives (maximize APY) with long-term incentives (long-term locking). It's a fork of the veCRV contracts, which have been battle-tested over the years. - -The amount of reward depends on how long the tokens are locked. You must lock your OCEAN into the vault to obtain veOCEAN. Going forward, veOCEAN will be the main mechanism for staking OCEAN, and for the curation of datasets. - -After creating your lock, you will be credited veOCEAN. We sometimes refer to veOCEAN as your “voting power”. - -**WARNING:** You will not be able to retrieve your original OCEAN deposit until the lock ends. - -## What can I do with veOCEAN? - -veOCEAN allows you to engage with different protocol mechanisms and benefit from the reward programs available. - -There are 4 things you can do with veOCEAN: -1. **Hold it.** veOCEAN pays **Passive Rewards** every week. -2. **Allocate it.** veOCEAN pays **Active Rewards** every week to the top-performing Datasets, Algorithms, dApps, and more. -3. **Delegate it.** You can delegate veOCEAN to other Data Farmers who can curate Datasets for you. In return for their services, these farmers may charge you a fee for helping you receive APY on **Active Rewards**. The Delegate feature was recently released and enables veOCEAN holders to more easily access Active Rewards. -4. **2x Stake.** If you are a publisher, allocating veOCEAN to your own Dataset gives your veOCEAN a 2x bonus. This is an incentive for publishers to engage with their assets and benefit further from the protocol. - -## What is time locking? - -Users can lock their OCEAN for different lengths of time to gain voting power. Our app is configured to lock OCEAN for a minimum of 2 weeks and a maximum of four years (for the maximum benefit). - -Users that lock their OCEAN for a longer period of time receive more veOCEAN, reflecting their conviction in the system. - -| Year | Lock Multiplier | veOCEAN | -| ---- | ----------| ------- | -| 1 | 0.25x | 0.25 | -| 2 | 0.50x | 0.50 | -| 3 | 0.75x | 0.75 | -| 4 | 1.00x | 1.00 | - -_The Lock Multiplier: the amount of veOCEAN received per OCEAN locked._ - -If you’ve locked OCEAN for 4 years, you will be unable to retrieve your deposit until this time expires. - -After choosing your lock period and locking your OCEAN into the vault, you will be credited with veOCEAN. - -veOCEAN is non-transferable. You can’t sell it or send it to other addresses. - -## Linear Decay - -Your veOCEAN balance starts declining slowly as soon as you receive it. - -The veOCEAN balance decreases linearly over time until the Lock End Date. When 50% of your lock time has elapsed, you will have 50% of your original veOCEAN balance. - -When your lock time ends, your veOCEAN balance hits 0, and your OCEAN tokens can be withdrawn. - -If you lock 1.0 OCEAN for 4 years, you get 1.0 veOCEAN at the start. - -| Years Passed | veOCEAN Left | -| ---- | ---- | -| 1 year | 0.75 | -| 2 years | 0.50 | -| 3 years | 0.25 | -| 4 years | 0.00 | - -At the end of your 4 years, your OCEAN is unlocked. - -## Replenishing your veOCEAN - -You can choose to update your lock and replenish your veOCEAN balance at any time. - -To maximize rewards, participants need to update their 4-year lock every week in order to keep their veOCEAN balance as high as possible. A short sketch below illustrates the lock multiplier and the linear decay.
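The lock multiplier and linear decay above reduce to a couple of lines of arithmetic. A minimal sketch (illustrative only, not the veOCEAN contract):

```python
# Sketch of the two tables above: veOCEAN credited = OCEAN * lock_years / 4,
# then linear decay to zero at the Lock End Date.

MAX_LOCK_YEARS = 4.0

def veocean_at(ocean_locked: float, lock_years: float,
               years_elapsed: float) -> float:
    """veOCEAN balance `years_elapsed` years after locking."""
    initial = ocean_locked * (lock_years / MAX_LOCK_YEARS)   # lock multiplier
    remaining = max(0.0, 1.0 - years_elapsed / lock_years)   # linear decay
    return initial * remaining

# 1.0 OCEAN locked for 4 years: 1.0 veOCEAN at the start, then decaying.
for t in range(5):
    print(t, veocean_at(1.0, 4.0, t))  # 1.0, 0.75, 0.5, 0.25, 0.0
```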
- -## veOCEAN Earnings - -All earnings for veOCEAN holders are claimable on Ethereum mainnet. (Data assets for DF may be published in any network where Ocean’s deployed in production: Eth mainnet, Polygon, etc.) - -There’s a new DF round every week; in line with this, there’s a new ve distribution “epoch” every week. This affects when you can first claim rewards. Specifically, if you lock OCEAN on day x, you’ll be able to claim rewards at the first ve epoch that begins after day x+7. Put another way, from the time you lock OCEAN, you must wait at least a week, and up to two weeks, to be able to claim rewards. (This behavior is inherited from veCRV; see [the code](https://github.com/oceanprotocol/contracts/blob/main/contracts/ve/veFeeDistributor.vy#L240-L256).) - -veOCEAN holders have earnings from two sources: - -### Earnings from Community Fees - -Every transaction in Ocean Market and the Ocean backend generates transaction fees, some of which go to the community. 50% of the community fees will go to veOCEAN holders; the rest will go to Ocean community-oriented traction programs. - -All earnings here are passive. - -### Earnings from Data Farming - -veOCEAN holders will each get a weekly DF rewards allocation, except for a small carve-out for any Data Challenge initiatives that may run through DF ops. - -**veOCEAN holders can be passive, though they will earn more if active.** - -“Being active” means allocating veOCEAN to promising data assets (data NFTs). Rewards then follow the usual DF formula: DCV * stake. Stake is the amount of veOCEAN allocated to the data asset. There is no liquidity locked inside a datatoken pool. (And this stake is safe: you can’t lose your OCEAN, as it is merely locked.) - -## Flow of Value - -The image below illustrates the flow of value. On the left, at time 0, the staker locks their OCEAN into the veOCEAN contract and receives veOCEAN. In the middle, the staker receives OCEAN rewards every time there’s revenue to the Ocean Protocol Community (top), and also as part of Data Farming rewards (bottom). On the right, when the lock expires (e.g. after 4 years), the staker is able to move their OCEAN around again. - -![](./images/flow_of_value.png) -_Flow of Value_ - -The veOCEAN design is in accordance with the Web3 Sustainability Loop, which Ocean uses as its system-level design. - -The veOCEAN code was forked from the veCRV code. veCRV parameters are the starting point; to minimize risk, tweaks will be circumspect. - -## Security - -[veOCEAN core contracts](https://github.com/oceanprotocol/contracts/tree/main/contracts/ve) use [veCRV contracts](https://curve.readthedocs.io/dao-vecrv.html) with zero changes, on purpose: the veCRV contracts have been battle-tested for two years and have not had security issues. Nearly 500 million USD is locked across all forks of veCRV, with leading DeFi protocols adopting this standard. veCRV contracts [have been audited by Trail of Bits and Quantstamp](https://github.com/curvefi/curve-dao-contracts#audits-and-security). - -We have built [a new contract](https://github.com/oceanprotocol/contracts/blob/main/contracts/ve/veAllocate.sol) for users to point their veOCEAN towards given data assets (“allocate veOCEAN”). These new contracts do not control the veOCEAN core contracts at all. In the event of a breach, the only funds at risk would be the rewards distributed for a single week, and we would be able to redirect future funds to a different contract. - -We have an [ongoing bug bounty via Immunefi](https://immunefi.com/bounty/oceanprotocol/) for Ocean software, including the veOCEAN and DF components. If you identify an issue, please report it there and get rewarded. \ No newline at end of file