diff --git a/.gitbook.yaml b/.gitbook.yaml
index e77fcd41..f2baa714 100644
--- a/.gitbook.yaml
+++ b/.gitbook.yaml
@@ -1,37 +1,65 @@
 root: ./
 redirects:
-  building-with-ocean/wallets: discover/wallets/README
-  orientation/metamask-setup: discover/wallets/metamask-setup
-  building-with-ocean/wallets-and-ocean-tokens: discover/wallets-and-ocean-tokens
-  core-concepts/architecture: developers/architecture
-  core-concepts/datanft-and-datatoken: developers/contracts/datanft-and-datatoken
-  core-concepts/roles: developers/contracts/roles
-  core-concepts/networks: discover/networks/README
-  core-concepts/networks/bridges: discover/networks/bridges
-  core-concepts/fees: developers/contracts/fees
-  core-concepts/asset-pricing: developers/contracts/pricing-schemas
-  core-concepts/did-ddo: developers/identifiers
-  using-ocean-market/marketplace-publish-data-asset: user-guides/publish-data-nfts
-  using-ocean-market/marketplace-download-data-asset: user-guides/buy-data-nfts
-  using-ocean-market/asset-hosting: user-guides/asset-hosting/README
-  using-ocean-market/remove-liquidity-using-etherscan: user-guides/remove-liquidity-pools
-  building-with-ocean/build-a-marketplace/forking-ocean-market: developers/build-a-marketplace/forking-ocean-market
-  building-with-ocean/build-a-marketplace/customising-your-market: developers/build-a-marketplace/customising-your-market
-  building-with-ocean/build-a-marketplace/deploying-market: developers/build-a-marketplace/deploying-market
-  building-with-ocean/using-ocean-libraries/configuration: developers/ocean.js/configuration
-  building-with-ocean/using-ocean-libraries/creating_dataNFT: developers/ocean.js/creating-datanft
-  building-with-ocean/using-ocean-libraries/mint-datatoken: developers/ocean.js/mint-datatoken
-  building-with-ocean/using-ocean-libraries/update-metadata: developers/ocean.js/update-metadata
-  building-with-ocean/compute-to-data/compute-to-data-architecture: developers/compute-to-data/compute-to-data-architecture
-  building-with-ocean/compute-to-data/compute-to-data-datasets-algorithms: developers/compute-to-data/compute-to-data-datasets-algorithms
-  building-with-ocean/compute-to-data/compute-to-data-algorithms: developers/compute-to-data/compute-to-data-algorithms
-  building-with-ocean/deploying-components/deploying-marketplace: developers/build-a-marketplace/deploying-market
-  building-with-ocean/using-ocean-subgraph/list-data-nfts: developers/using-ocean-subgraph/list-data-nfts
-  building-with-ocean/using-ocean-subgraph/list-datatokens: developers/using-ocean-subgraph/list-datatokens
-  building-with-ocean/using-ocean-subgraph/get-swap-tx: developers/using-ocean-subgraph/get-swap-tx
-  building-with-ocean/deploying-components/deploying-subgraph: developers/using-ocean-subgraph/deploying-subgraph
-  building-with-ocean/deploying-components/deploying-provider: infrastructure/deploying-components/deploying-provider
-  building-with-ocean/using-ocean-subgraph/deploying-ocean-subgraph: infrastructure/deploying-components/deploying-ocean-subgraph
-  using-ocean-market/ocean-kubernetes: infrastructure/ocean-kubernetes
-  resources: resources/README
+  readme/metamask-setup: discover/wallets/metamask-setup.md
+  readme/wallets: discover/wallets/README.md
+  readme/wallets-and-ocean-tokens: discover/wallets-and-ocean-tokens.md
+  core-concepts: developers/README.md
+  core-concepts/architecture: developers/architecture.md
+  core-concepts/datanft-and-datatoken: developers/contracts/datanft-and-datatoken.md
+  core-concepts/roles: developers/contracts/roles.md
+  core-concepts/networks: discover/networks.md
+  core-concepts/networks/bridges: discover/networks/bridges.md
+  core-concepts/fees: developers/contracts/fees.md
+  core-concepts/asset-pricing: developers/contracts/pricing-schemas.md
+  core-concepts/did-ddo: developers/identifiers.md
+  using-ocean-market: user-guides/using-ocean-market.md
+  using-ocean-market/marketplace-publish-data-asset: user-guides/publish-data-nfts.md
+  using-ocean-market/marketplace-download-data-asset: user-guides/buy-data-nfts.md
+  using-ocean-market/asset-hosting: user-guides/asset-hosting.md
+  using-ocean-market/remove-liquidity-using-etherscan: user-guides/remove-liquidity-pools.md
+  building-with-ocean: developers/README.md
+  building-with-ocean/build-a-marketplace: developers/build-a-marketplace/README.md
+  building-with-ocean/build-a-marketplace/forking-ocean-market: developers/build-a-marketplace/forking-ocean-market.md
+  building-with-ocean/build-a-marketplace/customising-your-market: developers/build-a-marketplace/customising-your-market.md
+  building-with-ocean/build-a-marketplace/deploying-market: developers/build-a-marketplace/deploying-market.md
+  building-with-ocean/using-ocean-libraries: developers/ocean.js/README.md
+  building-with-ocean/using-ocean-libraries/configuration: developers/ocean.js/configuration.md
+  building-with-ocean/using-ocean-libraries/creating_dataNFT: developers/ocean.js/creating-datanft.md
+  building-with-ocean/using-ocean-libraries/create-datatoken-with-fixed-pricing: developers/ocean.js/publish.md
+  building-with-ocean/using-ocean-libraries/mint-datatoken: developers/ocean.js/mint-datatoken.md
+  building-with-ocean/using-ocean-libraries/update-metadata: developers/ocean.js/update-metadata.md
+  building-with-ocean/compute-to-data: developers/compute-to-data/README.md
+  building-with-ocean/compute-to-data/compute-to-data-architecture: developers/compute-to-data/compute-to-data-architecture.md
+  building-with-ocean/compute-to-data/compute-to-data-datasets-algorithms: developers/compute-to-data/compute-to-data-datasets-algorithms.md
+  building-with-ocean/compute-to-data/compute-to-data-algorithms: developers/compute-to-data/compute-to-data-algorithms.md
+  building-with-ocean/compute-to-data/compute-to-data-minikube: infrastructure/compute-to-data-minikube.md
+  building-with-ocean/compute-to-data/compute-to-data-docker-registry: infrastructure/compute-to-data-docker-registry.md
+  building-with-ocean/compute-to-data/user-defined-parameters: developers/compute-to-data/compute-options.md
+  building-with-ocean/deploying-components: infrastructure/README.md
+  building-with-ocean/deploying-components/setup-server: infrastructure/setup-server.md
+  building-with-ocean/deploying-components/deploying-ocean-subgraph: infrastructure/deploying-ocean-subgraph.md
+  building-with-ocean/deploying-components/deploying-marketplace: infrastructure/deploying-marketplace.md
+  building-with-ocean/deploying-components/deploying-aquarius: infrastructure/deploying-aquarius.md
+  building-with-ocean/deploying-components/deploying-provider: infrastructure/deploying-provider.md
+  building-with-ocean/using-ocean-subgraph: developers/subgraph/README.md
+  building-with-ocean/using-ocean-subgraph/list-data-nfts: developers/subgraph/list-data-nfts.md
+  building-with-ocean/using-ocean-subgraph/list-datatokens: developers/subgraph/list-datatokens.md
+  building-with-ocean/using-ocean-subgraph/get-data-nft-information: developers/subgraph/get-data-nft-information.md
+  building-with-ocean/using-ocean-subgraph/get-datatoken-information: developers/subgraph/get-datatoken-information.md
+  building-with-ocean/using-ocean-subgraph/list-fixed-rate-exchanges: developers/subgraph/list-fixed-rate-exchanges.md
+  building-with-ocean/using-ocean-subgraph/deploying-ocean-subgraph: infrastructure/deploying-components/deploying-ocean-subgraph.md
+  building-with-ocean/contributing: contribute/README.md
+  building-with-ocean/contributing/code-of-conduct: contribute/code-of-conduct.md
+  building-with-ocean/contributing/legal-reqs: contribute/legal-reqs.md
+  building-with-ocean/projects-using-ocean: contribute/projects-using-ocean.md
+  veocean-data-farming: rewards/README.md
+  veocean-data-farming/veocean: rewards/veocean.md
+  veocean-data-farming/df-intro: rewards/df-intro.md
+  veocean-data-farming/df-background: rewards/df-max-out-yield.md
+  veocean-data-farming/emissions-apys: rewards/df-emissions-apys.md
+  veocean-data-farming/delegation: rewards/README.md
+  rewards/veOcean-Data-Farming-Tutorial: user-guides/get-started-df.md
+  api-references: developers/README.md
+  api-references/aquarius-rest-api: developers/aquarius/asset-requests.md
+  api-references/provider-rest-api: developers/provider/general-endpoints.md
diff --git a/.gitbook/assets/1_EgTB42Dy1zd2m0cCCuqbRA.png b/.gitbook/assets/1_EgTB42Dy1zd2m0cCCuqbRA.png deleted file mode 100644 index 440b2985..00000000 Binary files a/.gitbook/assets/1_EgTB42Dy1zd2m0cCCuqbRA.png and /dev/null differ diff --git a/.gitbook/assets/1_EgTB42Dy1zd2m0cCCuqbRA.webp b/.gitbook/assets/1_EgTB42Dy1zd2m0cCCuqbRA.webp deleted file mode 100644 index 939dd246..00000000 Binary files a/.gitbook/assets/1_EgTB42Dy1zd2m0cCCuqbRA.webp and /dev/null differ diff --git a/.gitbook/assets/1_KgkLBZ7zmQATAHrwdO3A7Q.webp b/.gitbook/assets/1_KgkLBZ7zmQATAHrwdO3A7Q.webp deleted file mode 100644 index 7c202091..00000000 Binary files a/.gitbook/assets/1_KgkLBZ7zmQATAHrwdO3A7Q.webp and /dev/null differ diff --git a/.gitbook/assets/200.webp b/.gitbook/assets/200.webp deleted file mode 100644 index b978dd0a..00000000 Binary files a/.gitbook/assets/200.webp and /dev/null differ diff --git a/.gitbook/assets/200w.webp b/.gitbook/assets/200w.webp deleted file mode 100644 index 9b2552d7..00000000 Binary files a/.gitbook/assets/200w.webp and /dev/null differ diff --git a/.gitbook/assets/3d-data.gif b/.gitbook/assets/3d-data.gif deleted file mode 100644 index fe27a94d..00000000 Binary files a/.gitbook/assets/3d-data.gif and /dev/null differ diff --git a/.gitbook/assets/C2D High Level Architecture.jpg b/.gitbook/assets/C2D High Level Architecture.jpg deleted file mode 100644 index 4f273561..00000000 Binary files a/.gitbook/assets/C2D High Level Architecture.jpg and /dev/null differ diff --git a/.gitbook/assets/Connect-Wallet.png b/.gitbook/assets/Connect-Wallet.png deleted file mode 100644 index d781f43e..00000000 Binary files a/.gitbook/assets/Connect-Wallet.png and /dev/null differ diff --git a/.gitbook/assets/DDO Flow.jpg b/.gitbook/assets/DDO Flow.jpg deleted file mode 100644 index 8bb9bab5..00000000 Binary files a/.gitbook/assets/DDO Flow.jpg and /dev/null differ diff --git a/.gitbook/assets/DataNFT and Datatoken Flow.jpg b/.gitbook/assets/DataNFT and Datatoken Flow.jpg deleted file mode 100644 index 7c4b9351..00000000 Binary files a/.gitbook/assets/DataNFT and Datatoken Flow.jpg and /dev/null differ diff --git a/.gitbook/assets/Enter-Metadata (1).png b/.gitbook/assets/Enter-Metadata (1).png deleted file mode 100644 index 8dd1d3fa..00000000 Binary files
a/.gitbook/assets/Enter-Metadata (1).png and /dev/null differ diff --git a/.gitbook/assets/OP High Level Architecture.jpg b/.gitbook/assets/OP High Level Architecture.jpg deleted file mode 100644 index 56007b69..00000000 Binary files a/.gitbook/assets/OP High Level Architecture.jpg and /dev/null differ diff --git a/.gitbook/assets/Screenshot 2023-06-06 at 18.27.03.png b/.gitbook/assets/Screenshot 2023-06-06 at 18.27.03.png deleted file mode 100644 index 1c564a79..00000000 Binary files a/.gitbook/assets/Screenshot 2023-06-06 at 18.27.03.png and /dev/null differ diff --git a/.gitbook/assets/Screenshot 2023-06-15 at 15.40.44 (1).png b/.gitbook/assets/Screenshot 2023-06-15 at 15.40.44 (1).png deleted file mode 100644 index cf0a0fea..00000000 Binary files a/.gitbook/assets/Screenshot 2023-06-15 at 15.40.44 (1).png and /dev/null differ diff --git a/.gitbook/assets/Screenshot 2023-06-15 at 15.40.44.png b/.gitbook/assets/Screenshot 2023-06-15 at 15.40.44.png deleted file mode 100644 index cf0a0fea..00000000 Binary files a/.gitbook/assets/Screenshot 2023-06-15 at 15.40.44.png and /dev/null differ diff --git a/.gitbook/assets/Screenshot 2023-06-15 at 15.56.51.png b/.gitbook/assets/Screenshot 2023-06-15 at 15.56.51.png deleted file mode 100644 index 19dad7f4..00000000 Binary files a/.gitbook/assets/Screenshot 2023-06-15 at 15.56.51.png and /dev/null differ diff --git a/.gitbook/assets/Screenshot 2023-06-15 at 16.14.26.png b/.gitbook/assets/Screenshot 2023-06-15 at 16.14.26.png deleted file mode 100644 index 1cb26e27..00000000 Binary files a/.gitbook/assets/Screenshot 2023-06-15 at 16.14.26.png and /dev/null differ diff --git a/.gitbook/assets/Screenshot 2023-06-16 at 07.52.04.png b/.gitbook/assets/Screenshot 2023-06-16 at 07.52.04.png deleted file mode 100644 index 62931ff0..00000000 Binary files a/.gitbook/assets/Screenshot 2023-06-16 at 07.52.04.png and /dev/null differ diff --git a/.gitbook/assets/anchorman-teamwork.gif b/.gitbook/assets/anchorman-teamwork.gif deleted file mode 100644 index 19ec6d2c..00000000 Binary files a/.gitbook/assets/anchorman-teamwork.gif and /dev/null differ diff --git a/.gitbook/assets/aquarius-did-metadata.jpg b/.gitbook/assets/aquarius-did-metadata.jpg deleted file mode 100644 index 7849f89f..00000000 Binary files a/.gitbook/assets/aquarius-did-metadata.jpg and /dev/null differ diff --git a/.gitbook/assets/architecture (2).png b/.gitbook/assets/architecture (2).png deleted file mode 100644 index 97084f85..00000000 Binary files a/.gitbook/assets/architecture (2).png and /dev/null differ diff --git a/.gitbook/assets/DataNFT&Datatokens.png b/.gitbook/assets/architecture/DataNFT&Datatokens.png similarity index 100% rename from .gitbook/assets/DataNFT&Datatokens.png rename to .gitbook/assets/architecture/DataNFT&Datatokens.png diff --git a/.gitbook/assets/architecture/Ocean101.png b/.gitbook/assets/architecture/Ocean101.png new file mode 100644 index 00000000..5a430327 Binary files /dev/null and b/.gitbook/assets/architecture/Ocean101.png differ diff --git a/.gitbook/assets/architecture/architecture.png b/.gitbook/assets/architecture/architecture.png deleted file mode 100644 index 97084f85..00000000 Binary files a/.gitbook/assets/architecture/architecture.png and /dev/null differ diff --git a/.gitbook/assets/architecture/architecture_overview.png b/.gitbook/assets/architecture/architecture_overview.png index c244bd45..fa75be2b 100644 Binary files a/.gitbook/assets/architecture/architecture_overview.png and b/.gitbook/assets/architecture/architecture_overview.png differ diff 
--git a/.gitbook/assets/architecture/datanft-and-datatoken.png b/.gitbook/assets/architecture/datanft-and-datatoken.png deleted file mode 100644 index 1457d0d2..00000000 Binary files a/.gitbook/assets/architecture/datanft-and-datatoken.png and /dev/null differ diff --git a/.gitbook/assets/architecture/datanfts_and_datatokens_flow.png b/.gitbook/assets/architecture/datanfts_and_datatokens_flow.png index 9bd7bb12..2b2d85a3 100644 Binary files a/.gitbook/assets/architecture/datanfts_and_datatokens_flow.png and b/.gitbook/assets/architecture/datanfts_and_datatokens_flow.png differ diff --git a/.gitbook/assets/architecture/ddo-flow.png b/.gitbook/assets/architecture/ddo-flow.png deleted file mode 100644 index fab9a027..00000000 Binary files a/.gitbook/assets/architecture/ddo-flow.png and /dev/null differ diff --git a/.gitbook/assets/architecture/decentralized_exchanges_marketplaces.png b/.gitbook/assets/architecture/decentralized_exchanges_marketplaces.png index c715ee87..d8947336 100644 Binary files a/.gitbook/assets/architecture/decentralized_exchanges_marketplaces.png and b/.gitbook/assets/architecture/decentralized_exchanges_marketplaces.png differ diff --git a/.gitbook/assets/architecture/feature-compute@2x.webp b/.gitbook/assets/architecture/feature-compute@2x.webp deleted file mode 100644 index 09d216a8..00000000 Binary files a/.gitbook/assets/architecture/feature-compute@2x.webp and /dev/null differ diff --git a/.gitbook/assets/architecture/feature-datascience@2x.webp b/.gitbook/assets/architecture/feature-datascience@2x.webp deleted file mode 100644 index f8b8923b..00000000 Binary files a/.gitbook/assets/architecture/feature-datascience@2x.webp and /dev/null differ diff --git a/.gitbook/assets/architecture/feature-marketplaces@2x.webp b/.gitbook/assets/architecture/feature-marketplaces@2x.webp deleted file mode 100644 index 9548f4cb..00000000 Binary files a/.gitbook/assets/architecture/feature-marketplaces@2x.webp and /dev/null differ diff --git a/.gitbook/assets/architecture/high-level-flow (1).png b/.gitbook/assets/architecture/high-level-flow (1).png deleted file mode 100644 index a581963f..00000000 Binary files a/.gitbook/assets/architecture/high-level-flow (1).png and /dev/null differ diff --git a/.gitbook/assets/architecture/high-level-flow.png b/.gitbook/assets/architecture/high-level-flow.png deleted file mode 100644 index a581963f..00000000 Binary files a/.gitbook/assets/architecture/high-level-flow.png and /dev/null differ diff --git a/.gitbook/assets/architecture/new-ramp-on-crypto-ramp-off.webp b/.gitbook/assets/architecture/new-ramp-on-crypto-ramp-off.webp deleted file mode 100644 index 256e4431..00000000 Binary files a/.gitbook/assets/architecture/new-ramp-on-crypto-ramp-off.webp and /dev/null differ diff --git a/.gitbook/assets/architecture/publish_and_retrieve_ddos.png b/.gitbook/assets/architecture/publish_and_retrieve_ddos.png index b16ee891..4328e865 100644 Binary files a/.gitbook/assets/architecture/publish_and_retrieve_ddos.png and b/.gitbook/assets/architecture/publish_and_retrieve_ddos.png differ diff --git a/.gitbook/assets/architecture/publish_dataNFT_detailed_flow.png b/.gitbook/assets/architecture/publish_dataNFT_detailed_flow.png deleted file mode 100644 index af55ed4e..00000000 Binary files a/.gitbook/assets/architecture/publish_dataNFT_detailed_flow.png and /dev/null differ diff --git a/.gitbook/assets/big-money.gif b/.gitbook/assets/big-money.gif deleted file mode 100644 index 9276d391..00000000 Binary files a/.gitbook/assets/big-money.gif and /dev/null 
differ diff --git a/.gitbook/assets/blockchain.gif b/.gitbook/assets/blockchain.gif deleted file mode 100644 index 520c739a..00000000 Binary files a/.gitbook/assets/blockchain.gif and /dev/null differ diff --git a/.gitbook/assets/c2d/StartComputeJob.png b/.gitbook/assets/c2d/StartComputeJob.png deleted file mode 100644 index 6f0ad176..00000000 Binary files a/.gitbook/assets/c2d/StartComputeJob.png and /dev/null differ diff --git a/.gitbook/assets/c2d/bringmoredata.jpeg b/.gitbook/assets/c2d/bringmoredata.jpeg deleted file mode 100644 index 526962e2..00000000 Binary files a/.gitbook/assets/c2d/bringmoredata.jpeg and /dev/null differ diff --git a/.gitbook/assets/c2d/c2d_compute_job.png b/.gitbook/assets/c2d/c2d_compute_job.png index 2578c254..6c3c4e3d 100644 Binary files a/.gitbook/assets/c2d/c2d_compute_job.png and b/.gitbook/assets/c2d/c2d_compute_job.png differ diff --git a/.gitbook/assets/c2d/double-check-work (1).png b/.gitbook/assets/c2d/double-check-work (1).png deleted file mode 100644 index cbd39efc..00000000 Binary files a/.gitbook/assets/c2d/double-check-work (1).png and /dev/null differ diff --git a/.gitbook/assets/c2d/preview-publish (1).png b/.gitbook/assets/c2d/preview-publish (1).png deleted file mode 100644 index c0c36272..00000000 Binary files a/.gitbook/assets/c2d/preview-publish (1).png and /dev/null differ diff --git a/.gitbook/assets/cash-flow.gif b/.gitbook/assets/cash-flow.gif deleted file mode 100644 index 89d495c4..00000000 Binary files a/.gitbook/assets/cash-flow.gif and /dev/null differ diff --git a/.gitbook/assets/clueless-shopping.gif b/.gitbook/assets/clueless-shopping.gif deleted file mode 100644 index 67146b0a..00000000 Binary files a/.gitbook/assets/clueless-shopping.gif and /dev/null differ diff --git a/.gitbook/assets/components/aquarius.png b/.gitbook/assets/components/aquarius.png new file mode 100644 index 00000000..79d41d38 Binary files /dev/null and b/.gitbook/assets/components/aquarius.png differ diff --git a/.gitbook/assets/components/aquarius_deployment.jpg b/.gitbook/assets/components/aquarius_deployment.jpg new file mode 100644 index 00000000..5ba73df5 Binary files /dev/null and b/.gitbook/assets/components/aquarius_deployment.jpg differ diff --git a/.gitbook/assets/barge.png b/.gitbook/assets/components/barge.png similarity index 100% rename from .gitbook/assets/barge.png rename to .gitbook/assets/components/barge.png diff --git a/.gitbook/assets/ocean_py.png b/.gitbook/assets/components/ocean_py.png similarity index 100% rename from .gitbook/assets/ocean_py.png rename to .gitbook/assets/components/ocean_py.png diff --git a/.gitbook/assets/components/provider.png b/.gitbook/assets/components/provider.png new file mode 100644 index 00000000..70bf5777 Binary files /dev/null and b/.gitbook/assets/components/provider.png differ diff --git a/.gitbook/assets/components/subgraph.png b/.gitbook/assets/components/subgraph.png new file mode 100644 index 00000000..c8a3dff2 Binary files /dev/null and b/.gitbook/assets/components/subgraph.png differ diff --git a/.gitbook/assets/contracts/publish_detailed_flow.png b/.gitbook/assets/contracts/publish_detailed_flow.png index af55ed4e..a876df90 100644 Binary files a/.gitbook/assets/contracts/publish_detailed_flow.png and b/.gitbook/assets/contracts/publish_detailed_flow.png differ diff --git a/.gitbook/assets/contracts/roles_datatokens_level.png b/.gitbook/assets/contracts/roles_datatokens_level.png index 4930251b..0c5ef5ae 100644 Binary files a/.gitbook/assets/contracts/roles_datatokens_level.png and 
b/.gitbook/assets/contracts/roles_datatokens_level.png differ diff --git a/.gitbook/assets/contracts/roles_nft_level.png b/.gitbook/assets/contracts/roles_nft_level.png index cf069012..4593a88d 100644 Binary files a/.gitbook/assets/contracts/roles_nft_level.png and b/.gitbook/assets/contracts/roles_nft_level.png differ diff --git a/.gitbook/assets/smart-contracts.png b/.gitbook/assets/contracts/smart-contracts.png similarity index 100% rename from .gitbook/assets/smart-contracts.png rename to .gitbook/assets/contracts/smart-contracts.png diff --git a/.gitbook/assets/contracts/v4_contracts_overview.png b/.gitbook/assets/contracts/v4_contracts_overview.png index bfcc9b73..a4aa4e02 100644 Binary files a/.gitbook/assets/contracts/v4_contracts_overview.png and b/.gitbook/assets/contracts/v4_contracts_overview.png differ diff --git a/.gitbook/assets/cover/contribute_card (1).png b/.gitbook/assets/cover/contribute_card (1).png deleted file mode 100644 index 4ee76eef..00000000 Binary files a/.gitbook/assets/cover/contribute_card (1).png and /dev/null differ diff --git a/.gitbook/assets/cover/data_science_card (1).png b/.gitbook/assets/cover/data_science_card (1).png deleted file mode 100644 index 862b97e8..00000000 Binary files a/.gitbook/assets/cover/data_science_card (1).png and /dev/null differ diff --git a/.gitbook/assets/cover/developer_card (1).png b/.gitbook/assets/cover/developer_card (1).png deleted file mode 100644 index f0f5ebe9..00000000 Binary files a/.gitbook/assets/cover/developer_card (1).png and /dev/null differ diff --git a/.gitbook/assets/cover/discover_card (1).png b/.gitbook/assets/cover/discover_card (1).png deleted file mode 100644 index b41976df..00000000 Binary files a/.gitbook/assets/cover/discover_card (1).png and /dev/null differ diff --git a/.gitbook/assets/cover/infrastructure_card (1).png b/.gitbook/assets/cover/infrastructure_card (1).png deleted file mode 100644 index 395fb229..00000000 Binary files a/.gitbook/assets/cover/infrastructure_card (1).png and /dev/null differ diff --git a/.gitbook/assets/cover/rewards_card (1).png b/.gitbook/assets/cover/rewards_card (1).png deleted file mode 100644 index ce298ae4..00000000 Binary files a/.gitbook/assets/cover/rewards_card (1).png and /dev/null differ diff --git a/.gitbook/assets/cover/user_guides_card (1).png b/.gitbook/assets/cover/user_guides_card (1).png deleted file mode 100644 index ec1321c6..00000000 Binary files a/.gitbook/assets/cover/user_guides_card (1).png and /dev/null differ diff --git a/.gitbook/assets/data_everywhere.gif b/.gitbook/assets/data_everywhere.gif deleted file mode 100644 index e24abcfb..00000000 Binary files a/.gitbook/assets/data_everywhere.gif and /dev/null differ diff --git a/.gitbook/assets/data_union.jpeg b/.gitbook/assets/data_union.jpeg deleted file mode 100644 index 8965ac3c..00000000 Binary files a/.gitbook/assets/data_union.jpeg and /dev/null differ diff --git a/.gitbook/assets/datanft-and-datatoken.png b/.gitbook/assets/datanft-and-datatoken.png deleted file mode 100644 index 1457d0d2..00000000 Binary files a/.gitbook/assets/datanft-and-datatoken.png and /dev/null differ diff --git a/.gitbook/assets/ddo-flow.png b/.gitbook/assets/ddo-flow.png deleted file mode 100644 index fab9a027..00000000 Binary files a/.gitbook/assets/ddo-flow.png and /dev/null differ diff --git a/.gitbook/assets/image (1).png b/.gitbook/assets/deployment/image (1).png similarity index 100% rename from .gitbook/assets/image (1).png rename to .gitbook/assets/deployment/image (1).png diff --git 
a/.gitbook/assets/image (2).png b/.gitbook/assets/deployment/image (2).png similarity index 100% rename from .gitbook/assets/image (2).png rename to .gitbook/assets/deployment/image (2).png diff --git a/.gitbook/assets/image (3).png b/.gitbook/assets/deployment/image (3).png similarity index 100% rename from .gitbook/assets/image (3).png rename to .gitbook/assets/deployment/image (3).png diff --git a/.gitbook/assets/image (4).png b/.gitbook/assets/deployment/image (4).png similarity index 100% rename from .gitbook/assets/image (4).png rename to .gitbook/assets/deployment/image (4).png diff --git a/.gitbook/assets/image (5).png b/.gitbook/assets/deployment/image (5).png similarity index 100% rename from .gitbook/assets/image (5).png rename to .gitbook/assets/deployment/image (5).png diff --git a/.gitbook/assets/image (6).png b/.gitbook/assets/deployment/image (6).png similarity index 100% rename from .gitbook/assets/image (6).png rename to .gitbook/assets/deployment/image (6).png diff --git a/.gitbook/assets/image.png b/.gitbook/assets/deployment/image.png similarity index 100% rename from .gitbook/assets/image.png rename to .gitbook/assets/deployment/image.png diff --git a/.gitbook/assets/dolphin.gif b/.gitbook/assets/dolphin.gif deleted file mode 100644 index c36789cf..00000000 Binary files a/.gitbook/assets/dolphin.gif and /dev/null differ diff --git a/.gitbook/assets/drew-barrymore-notes.gif b/.gitbook/assets/drew-barrymore-notes.gif deleted file mode 100644 index 3e8e3b40..00000000 Binary files a/.gitbook/assets/drew-barrymore-notes.gif and /dev/null differ diff --git a/.gitbook/assets/feature-compute@2x.webp b/.gitbook/assets/feature-compute@2x.webp deleted file mode 100644 index 09d216a8..00000000 Binary files a/.gitbook/assets/feature-compute@2x.webp and /dev/null differ diff --git a/.gitbook/assets/feature-datascience@2x.webp b/.gitbook/assets/feature-datascience@2x.webp deleted file mode 100644 index f8b8923b..00000000 Binary files a/.gitbook/assets/feature-datascience@2x.webp and /dev/null differ diff --git a/.gitbook/assets/feature-marketplaces@2x.webp b/.gitbook/assets/feature-marketplaces@2x.webp deleted file mode 100644 index 9548f4cb..00000000 Binary files a/.gitbook/assets/feature-marketplaces@2x.webp and /dev/null differ diff --git a/.gitbook/assets/file.excalidraw.svg b/.gitbook/assets/file.excalidraw.svg deleted file mode 100644 index e061aa71..00000000 --- a/.gitbook/assets/file.excalidraw.svg +++ /dev/null @@ -1,16 +0,0 @@ - - - 
eyJ2ZXJzaW9uIjoiMSIsImVuY29kaW5nIjoiYnN0cmluZyIsImNvbXByZXNzZWQiOnRydWUsImVuY29kZWQiOiJ4nHVTy26jMFx1MDAxNN3zXHUwMDE1iG47Kc80dDd9SO1oNKqURVx1MDAxN1VcdTAwMTdcdTAwMGV2wMLYlm1CM1X+fWxDbcqkXlx1MDAwMPfcc1x1MDAxZtxz/Vx1MDAxMYRhpI5cdTAwMWNFN2GE3itAMFx1MDAxNGCILlxyfkBCYka1K7W2ZL2oLLNRisubqyvA+arGasdYu6pYN4YhgjpEldTEV22H4Yd9zlxuXHUwMDExTJHlWtSXScpkif5h1JZMsjwr13mWb1x1MDAxY1x1MDAwM8t7XUohqN17QCTyXHUwMDFlXHUwMDAzRcPvX2DbJYdcdTAwMWa7uipcdTAwMWXjh/v25bn0ZfeYkK06kvGHQNX0YtaUVIK16Fx1MDAwNUPVmOpcdTAwMGLcxUmmXHUwMDA35qNcdTAwMDTr64ZcIim/xDBcdTAwMGUqrI5cdTAwMDaLY4dcdTAwMDJa21x1MDAxY1x1MDAxZXnXVp5mzrZcdTAwMTHZZlx1MDAxNc9Ptmjljlx1MDAxMSZMK1x1MDAxN6PfN7NcdTAwMDNVW+uOKHRcdTAwMWMlXHUwMDAwlVx1MDAxY1xiLY/nXHLTT1x1MDAxNtm1w1x1MDAxYYTrRn0qP5ZDdtJ5ml8nRZHmzmFq8Cdo9X6bj4LCaVx1MDAxNJ9cdTAwMGLgVyCdkJPv1vBcdTAwMWZmq+NT9Vx1MDAxY4JR5mS9WcfFOs7KtCydX69Tq520J8RjrGrPbIZUQKhbTCGm9TJcdTAwMDRR+I2HXHUwMDAwqe5Y12Gl23hmmKolw+b9KVx1MDAwNFx1MDAxYlx1MDAxYVx1MDAwNOCZzN/6uEnn74o5/iv022Fccvf9dnmWPZPQnNTzg/n7XHUwMDE0TFx1MDAxOVwifYG3Slx1MDAwZteJpNXEcIv/oi99Rlx1MDAwN4yG2//36WJvT1x1MDAxNExKmkuFrOKn4PRcdTAwMGZK/Vx1MDAxN2wifQ== - - - - \ No newline at end of file diff --git a/.gitbook/assets/follow-instructions.gif b/.gitbook/assets/follow-instructions.gif deleted file mode 100644 index 306837a3..00000000 Binary files a/.gitbook/assets/follow-instructions.gif and /dev/null differ diff --git a/.gitbook/assets/dao.jpeg b/.gitbook/assets/general/dao.jpeg similarity index 100% rename from .gitbook/assets/dao.jpeg rename to .gitbook/assets/general/dao.jpeg diff --git a/.gitbook/assets/developers.png b/.gitbook/assets/general/developers.png similarity index 100% rename from .gitbook/assets/developers.png rename to .gitbook/assets/general/developers.png diff --git a/.gitbook/assets/gif/dolphin.gif b/.gitbook/assets/gif/dolphin.gif deleted file mode 100644 index c36789cf..00000000 Binary files a/.gitbook/assets/gif/dolphin.gif and /dev/null differ diff --git a/.gitbook/assets/giphy.gif b/.gitbook/assets/gif/giphy.gif similarity index 100% rename from .gitbook/assets/giphy.gif rename to .gitbook/assets/gif/giphy.gif diff --git a/.gitbook/assets/giphy.webp b/.gitbook/assets/gif/giphy.webp similarity index 100% rename from .gitbook/assets/giphy.webp rename to .gitbook/assets/gif/giphy.webp diff --git a/.gitbook/assets/gif/mafs.gif b/.gitbook/assets/gif/mafs.gif new file mode 100644 index 00000000..3d19c0e1 Binary files /dev/null and b/.gitbook/assets/gif/mafs.gif differ diff --git a/.gitbook/assets/gif/matrix-code (1).gif b/.gitbook/assets/gif/matrix-code (1).gif deleted file mode 100644 index 9f3356ae..00000000 Binary files a/.gitbook/assets/gif/matrix-code (1).gif and /dev/null differ diff --git a/.gitbook/assets/gif/my-data (1).gif b/.gitbook/assets/gif/my-data (1).gif deleted file mode 100644 index f2dd42fb..00000000 Binary files a/.gitbook/assets/gif/my-data (1).gif and /dev/null differ diff --git a/.gitbook/assets/gif/whats-a-wallet (1).gif b/.gitbook/assets/gif/whats-a-wallet (1).gif deleted file mode 100644 index 11d7ce42..00000000 Binary files a/.gitbook/assets/gif/whats-a-wallet (1).gif and /dev/null differ diff --git a/.gitbook/assets/Raw-URL.png b/.gitbook/assets/hosting/Raw-URL.png similarity index 100% rename from .gitbook/assets/Raw-URL.png rename to .gitbook/assets/hosting/Raw-URL.png diff --git a/.gitbook/assets/Screenshot 2023-06-15 at 15.52.29.png b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.52.29.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-15 at 15.52.29.png rename to 
.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.52.29.png diff --git a/.gitbook/assets/Screenshot 2023-06-15 at 15.54.21.png b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.54.21.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-15 at 15.54.21.png rename to .gitbook/assets/hosting/Screenshot 2023-06-15 at 15.54.21.png diff --git a/.gitbook/assets/Screenshot 2023-06-15 at 15.55.16.png b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.55.16.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-15 at 15.55.16.png rename to .gitbook/assets/hosting/Screenshot 2023-06-15 at 15.55.16.png diff --git a/.gitbook/assets/Screenshot 2023-06-15 at 15.56.34.png b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.56.34.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-15 at 15.56.34.png rename to .gitbook/assets/hosting/Screenshot 2023-06-15 at 15.56.34.png diff --git a/.gitbook/assets/Screenshot 2023-06-15 at 15.58.29.png b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 15.58.29.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-15 at 15.58.29.png rename to .gitbook/assets/hosting/Screenshot 2023-06-15 at 15.58.29.png diff --git a/.gitbook/assets/Screenshot 2023-06-15 at 16.08.42.png b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 16.08.42.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-15 at 16.08.42.png rename to .gitbook/assets/hosting/Screenshot 2023-06-15 at 16.08.42.png diff --git a/.gitbook/assets/Screenshot 2023-06-15 at 16.12.10.png b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 16.12.10.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-15 at 16.12.10.png rename to .gitbook/assets/hosting/Screenshot 2023-06-15 at 16.12.10.png diff --git a/.gitbook/assets/Screenshot 2023-06-15 at 16.26.56.png b/.gitbook/assets/hosting/Screenshot 2023-06-15 at 16.26.56.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-15 at 16.26.56.png rename to .gitbook/assets/hosting/Screenshot 2023-06-15 at 16.26.56.png diff --git a/.gitbook/assets/Screenshot 2023-06-16 at 07.50.27.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.50.27.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-16 at 07.50.27.png rename to .gitbook/assets/hosting/Screenshot 2023-06-16 at 07.50.27.png diff --git a/.gitbook/assets/Screenshot 2023-06-16 at 07.51.14.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.51.14.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-16 at 07.51.14.png rename to .gitbook/assets/hosting/Screenshot 2023-06-16 at 07.51.14.png diff --git a/.gitbook/assets/Screenshot 2023-06-16 at 07.54.29.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.54.29.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-16 at 07.54.29.png rename to .gitbook/assets/hosting/Screenshot 2023-06-16 at 07.54.29.png diff --git a/.gitbook/assets/Screenshot 2023-06-16 at 07.56.01.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.56.01.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-16 at 07.56.01.png rename to .gitbook/assets/hosting/Screenshot 2023-06-16 at 07.56.01.png diff --git a/.gitbook/assets/Screenshot 2023-06-16 at 07.58.20.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.58.20.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-16 at 07.58.20.png rename to .gitbook/assets/hosting/Screenshot 2023-06-16 at 07.58.20.png 
diff --git a/.gitbook/assets/Screenshot 2023-06-16 at 07.59.38.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 07.59.38.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-16 at 07.59.38.png rename to .gitbook/assets/hosting/Screenshot 2023-06-16 at 07.59.38.png diff --git a/.gitbook/assets/Screenshot 2023-06-16 at 08.02.25.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.02.25.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-16 at 08.02.25.png rename to .gitbook/assets/hosting/Screenshot 2023-06-16 at 08.02.25.png diff --git a/.gitbook/assets/Screenshot 2023-06-16 at 08.05.41.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.05.41.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-16 at 08.05.41.png rename to .gitbook/assets/hosting/Screenshot 2023-06-16 at 08.05.41.png diff --git a/.gitbook/assets/Screenshot 2023-06-16 at 08.08.12.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.08.12.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-16 at 08.08.12.png rename to .gitbook/assets/hosting/Screenshot 2023-06-16 at 08.08.12.png diff --git a/.gitbook/assets/Screenshot 2023-06-16 at 08.15.45.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.15.45.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-16 at 08.15.45.png rename to .gitbook/assets/hosting/Screenshot 2023-06-16 at 08.15.45.png diff --git a/.gitbook/assets/Screenshot 2023-06-16 at 08.16.46.png b/.gitbook/assets/hosting/Screenshot 2023-06-16 at 08.16.46.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-16 at 08.16.46.png rename to .gitbook/assets/hosting/Screenshot 2023-06-16 at 08.16.46.png diff --git a/.gitbook/assets/hosting/azure1 (1).png b/.gitbook/assets/hosting/azure1 (1).png deleted file mode 100644 index fa00009a..00000000 Binary files a/.gitbook/assets/hosting/azure1 (1).png and /dev/null differ diff --git a/.gitbook/assets/hosting/azure2 (1).png b/.gitbook/assets/hosting/azure2 (1).png deleted file mode 100644 index 511ff199..00000000 Binary files a/.gitbook/assets/hosting/azure2 (1).png and /dev/null differ diff --git a/.gitbook/assets/hosting/azure3 (1).png b/.gitbook/assets/hosting/azure3 (1).png deleted file mode 100644 index 4cf63916..00000000 Binary files a/.gitbook/assets/hosting/azure3 (1).png and /dev/null differ diff --git a/.gitbook/assets/hosting/azure4 (1).png b/.gitbook/assets/hosting/azure4 (1).png deleted file mode 100644 index 2b074641..00000000 Binary files a/.gitbook/assets/hosting/azure4 (1).png and /dev/null differ diff --git a/.gitbook/assets/hosting/azure5 (1).png b/.gitbook/assets/hosting/azure5 (1).png deleted file mode 100644 index c07525e5..00000000 Binary files a/.gitbook/assets/hosting/azure5 (1).png and /dev/null differ diff --git a/.gitbook/assets/hosting/azure6 (1).png b/.gitbook/assets/hosting/azure6 (1).png deleted file mode 100644 index 4c576189..00000000 Binary files a/.gitbook/assets/hosting/azure6 (1).png and /dev/null differ diff --git a/.gitbook/assets/hosting/azure7 (1).png b/.gitbook/assets/hosting/azure7 (1).png deleted file mode 100644 index 000dc780..00000000 Binary files a/.gitbook/assets/hosting/azure7 (1).png and /dev/null differ diff --git a/.gitbook/assets/hosting/azure8 (1).png b/.gitbook/assets/hosting/azure8 (1).png deleted file mode 100644 index d424ccae..00000000 Binary files a/.gitbook/assets/hosting/azure8 (1).png and /dev/null differ diff --git a/.gitbook/assets/hosting/azure9 (1).png 
b/.gitbook/assets/hosting/azure9 (1).png deleted file mode 100644 index f46b3c23..00000000 Binary files a/.gitbook/assets/hosting/azure9 (1).png and /dev/null differ diff --git a/.gitbook/assets/hustlin.gif b/.gitbook/assets/hustlin.gif deleted file mode 100644 index 6a2e918e..00000000 Binary files a/.gitbook/assets/hustlin.gif and /dev/null differ diff --git a/.gitbook/assets/i-know-kung-fu.gif b/.gitbook/assets/i-know-kung-fu.gif deleted file mode 100644 index 0d2c674f..00000000 Binary files a/.gitbook/assets/i-know-kung-fu.gif and /dev/null differ diff --git a/.gitbook/assets/just-publish.gif b/.gitbook/assets/just-publish.gif deleted file mode 100644 index dfa0b1e3..00000000 Binary files a/.gitbook/assets/just-publish.gif and /dev/null differ diff --git a/.gitbook/assets/kermit-typing.gif b/.gitbook/assets/kermit-typing.gif deleted file mode 100644 index 4e099165..00000000 Binary files a/.gitbook/assets/kermit-typing.gif and /dev/null differ diff --git a/.gitbook/assets/kramer-mind-blown.gif b/.gitbook/assets/kramer-mind-blown.gif deleted file mode 100644 index addb6028..00000000 Binary files a/.gitbook/assets/kramer-mind-blown.gif and /dev/null differ diff --git a/.gitbook/assets/landing/contribute_card.png b/.gitbook/assets/landing/contribute_card.png deleted file mode 100644 index 6154b6cf..00000000 Binary files a/.gitbook/assets/landing/contribute_card.png and /dev/null differ diff --git a/.gitbook/assets/landing/data_science_card.png b/.gitbook/assets/landing/data_science_card.png deleted file mode 100644 index b82a92c7..00000000 Binary files a/.gitbook/assets/landing/data_science_card.png and /dev/null differ diff --git a/.gitbook/assets/landing/developers_card.png b/.gitbook/assets/landing/developers_card.png deleted file mode 100644 index 2b531cd5..00000000 Binary files a/.gitbook/assets/landing/developers_card.png and /dev/null differ diff --git a/.gitbook/assets/landing/discover_card.png b/.gitbook/assets/landing/discover_card.png deleted file mode 100644 index 4dda6733..00000000 Binary files a/.gitbook/assets/landing/discover_card.png and /dev/null differ diff --git a/.gitbook/assets/landing/infrastructure_card.png b/.gitbook/assets/landing/infrastructure_card.png deleted file mode 100644 index 0a15bc0b..00000000 Binary files a/.gitbook/assets/landing/infrastructure_card.png and /dev/null differ diff --git a/.gitbook/assets/landing/rewards_card.png b/.gitbook/assets/landing/rewards_card.png deleted file mode 100644 index d188e329..00000000 Binary files a/.gitbook/assets/landing/rewards_card.png and /dev/null differ diff --git a/.gitbook/assets/like-a-boss.gif b/.gitbook/assets/like-a-boss.gif deleted file mode 100644 index 3cf63a3a..00000000 Binary files a/.gitbook/assets/like-a-boss.gif and /dev/null differ diff --git a/.gitbook/assets/liquidity/remove-liquidity-1.png b/.gitbook/assets/liquidity/remove-liquidity-1.png deleted file mode 100644 index 5288d028..00000000 Binary files a/.gitbook/assets/liquidity/remove-liquidity-1.png and /dev/null differ diff --git a/.gitbook/assets/liquidity/remove-liquidity-2 (1).png b/.gitbook/assets/liquidity/remove-liquidity-2 (1).png deleted file mode 100644 index c99a54ac..00000000 Binary files a/.gitbook/assets/liquidity/remove-liquidity-2 (1).png and /dev/null differ diff --git a/.gitbook/assets/liquidity/remove-liquidity-2 (2).png b/.gitbook/assets/liquidity/remove-liquidity-2 (2).png deleted file mode 100644 index c99a54ac..00000000 Binary files a/.gitbook/assets/liquidity/remove-liquidity-2 (2).png and /dev/null differ diff --git 
a/.gitbook/assets/liquidity/remove-liquidity-2 (3).png b/.gitbook/assets/liquidity/remove-liquidity-2 (3).png deleted file mode 100644 index c99a54ac..00000000 Binary files a/.gitbook/assets/liquidity/remove-liquidity-2 (3).png and /dev/null differ diff --git a/.gitbook/assets/liquidity/remove-liquidity-3.png b/.gitbook/assets/liquidity/remove-liquidity-3.png deleted file mode 100644 index 50bda9bb..00000000 Binary files a/.gitbook/assets/liquidity/remove-liquidity-3.png and /dev/null differ diff --git a/.gitbook/assets/liquidity/remove-liquidity-4.png b/.gitbook/assets/liquidity/remove-liquidity-4.png deleted file mode 100644 index fa7e4263..00000000 Binary files a/.gitbook/assets/liquidity/remove-liquidity-4.png and /dev/null differ diff --git a/.gitbook/assets/liquidity/remove-liquidity-5.png b/.gitbook/assets/liquidity/remove-liquidity-5.png deleted file mode 100644 index 4780d741..00000000 Binary files a/.gitbook/assets/liquidity/remove-liquidity-5.png and /dev/null differ diff --git a/.gitbook/assets/liquidity/remove-liquidity-6 (1).png b/.gitbook/assets/liquidity/remove-liquidity-6 (1).png deleted file mode 100644 index a52c717e..00000000 Binary files a/.gitbook/assets/liquidity/remove-liquidity-6 (1).png and /dev/null differ diff --git a/.gitbook/assets/Access.png b/.gitbook/assets/market/Access.png similarity index 100% rename from .gitbook/assets/Access.png rename to .gitbook/assets/market/Access.png diff --git a/.gitbook/assets/Check-Debug-Mode.png b/.gitbook/assets/market/Check-Debug-Mode.png similarity index 100% rename from .gitbook/assets/Check-Debug-Mode.png rename to .gitbook/assets/market/Check-Debug-Mode.png diff --git a/.gitbook/assets/Click-Settings.png b/.gitbook/assets/market/Click-Settings.png similarity index 100% rename from .gitbook/assets/Click-Settings.png rename to .gitbook/assets/market/Click-Settings.png diff --git a/.gitbook/assets/Enter-Metadata.png b/.gitbook/assets/market/Enter-Metadata.png similarity index 100% rename from .gitbook/assets/Enter-Metadata.png rename to .gitbook/assets/market/Enter-Metadata.png diff --git a/.gitbook/assets/Preview.png b/.gitbook/assets/market/Preview.png similarity index 100% rename from .gitbook/assets/Preview.png rename to .gitbook/assets/market/Preview.png diff --git a/.gitbook/assets/Price.png b/.gitbook/assets/market/Price.png similarity index 100% rename from .gitbook/assets/Price.png rename to .gitbook/assets/market/Price.png diff --git a/.gitbook/assets/Publish-Link.png b/.gitbook/assets/market/Publish-Link.png similarity index 100% rename from .gitbook/assets/Publish-Link.png rename to .gitbook/assets/market/Publish-Link.png diff --git a/.gitbook/assets/Screenshot 2023-06-13 at 14.39.17.png b/.gitbook/assets/market/Screenshot 2023-06-13 at 14.39.17.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-13 at 14.39.17.png rename to .gitbook/assets/market/Screenshot 2023-06-13 at 14.39.17.png diff --git a/.gitbook/assets/Screenshot 2023-06-13 at 14.43.25.png b/.gitbook/assets/market/Screenshot 2023-06-13 at 14.43.25.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-13 at 14.43.25.png rename to .gitbook/assets/market/Screenshot 2023-06-13 at 14.43.25.png diff --git a/.gitbook/assets/Screenshot 2023-06-14 at 14.30.59.png b/.gitbook/assets/market/Screenshot 2023-06-14 at 14.30.59.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-14 at 14.30.59.png rename to .gitbook/assets/market/Screenshot 2023-06-14 at 14.30.59.png diff --git 
a/.gitbook/assets/Scroll-DDO-Info.png b/.gitbook/assets/market/Scroll-DDO-Info.png similarity index 100% rename from .gitbook/assets/Scroll-DDO-Info.png rename to .gitbook/assets/market/Scroll-DDO-Info.png diff --git a/.gitbook/assets/market/fixed-asset-pricing.png b/.gitbook/assets/market/fixed-asset-pricing.png deleted file mode 100644 index cc55112a..00000000 Binary files a/.gitbook/assets/market/fixed-asset-pricing.png and /dev/null differ diff --git a/.gitbook/assets/market/free-asset-pricing.png b/.gitbook/assets/market/free-asset-pricing.png deleted file mode 100644 index 8876bc31..00000000 Binary files a/.gitbook/assets/market/free-asset-pricing.png and /dev/null differ diff --git a/.gitbook/assets/market/market-forking-1.png b/.gitbook/assets/market/market-forking-1.png deleted file mode 100644 index c5b59e35..00000000 Binary files a/.gitbook/assets/market/market-forking-1.png and /dev/null differ diff --git a/.gitbook/assets/market/market-forking-2.png b/.gitbook/assets/market/market-forking-2.png deleted file mode 100644 index 68f024d4..00000000 Binary files a/.gitbook/assets/market/market-forking-2.png and /dev/null differ diff --git a/.gitbook/assets/market/marketplace-connect-wallet.png b/.gitbook/assets/market/marketplace-connect-wallet.png deleted file mode 100644 index cf71156a..00000000 Binary files a/.gitbook/assets/market/marketplace-connect-wallet.png and /dev/null differ diff --git a/.gitbook/assets/market/marketplace-landing-page.png b/.gitbook/assets/market/marketplace-landing-page.png deleted file mode 100644 index e57548bd..00000000 Binary files a/.gitbook/assets/market/marketplace-landing-page.png and /dev/null differ diff --git a/.gitbook/assets/market/marketplace-publish-file-field.png b/.gitbook/assets/market/marketplace-publish-file-field.png deleted file mode 100644 index f16eee95..00000000 Binary files a/.gitbook/assets/market/marketplace-publish-file-field.png and /dev/null differ diff --git a/.gitbook/assets/marketplace_data.jpg b/.gitbook/assets/market/marketplace_data.jpg similarity index 100% rename from .gitbook/assets/marketplace_data.jpg rename to .gitbook/assets/market/marketplace_data.jpg diff --git a/.gitbook/assets/network-and-datatoken-address.png b/.gitbook/assets/market/network-and-datatoken-address.png similarity index 100% rename from .gitbook/assets/network-and-datatoken-address.png rename to .gitbook/assets/market/network-and-datatoken-address.png diff --git a/.gitbook/assets/market/ocean-market-homepage.png b/.gitbook/assets/market/ocean-market-homepage.png deleted file mode 100644 index 4aa49e30..00000000 Binary files a/.gitbook/assets/market/ocean-market-homepage.png and /dev/null differ diff --git a/.gitbook/assets/market/publish-1.png b/.gitbook/assets/market/publish-1.png deleted file mode 100644 index b9d2604e..00000000 Binary files a/.gitbook/assets/market/publish-1.png and /dev/null differ diff --git a/.gitbook/assets/market/publish-2.png b/.gitbook/assets/market/publish-2.png deleted file mode 100644 index e47421ab..00000000 Binary files a/.gitbook/assets/market/publish-2.png and /dev/null differ diff --git a/.gitbook/assets/market/publish-3.png b/.gitbook/assets/market/publish-3.png deleted file mode 100644 index 371f19e3..00000000 Binary files a/.gitbook/assets/market/publish-3.png and /dev/null differ diff --git a/.gitbook/assets/market/publish-4.png b/.gitbook/assets/market/publish-4.png deleted file mode 100644 index d77cece1..00000000 Binary files a/.gitbook/assets/market/publish-4.png and /dev/null differ diff --git 
a/.gitbook/assets/market/publish-8.png b/.gitbook/assets/market/publish-8.png deleted file mode 100644 index 4bee7cee..00000000 Binary files a/.gitbook/assets/market/publish-8.png and /dev/null differ diff --git a/.gitbook/assets/publish-page-2.png b/.gitbook/assets/market/publish-page-2.png similarity index 100% rename from .gitbook/assets/publish-page-2.png rename to .gitbook/assets/market/publish-page-2.png diff --git a/.gitbook/assets/publish-page-before-edit.png b/.gitbook/assets/market/publish-page-before-edit.png similarity index 100% rename from .gitbook/assets/publish-page-before-edit.png rename to .gitbook/assets/market/publish-page-before-edit.png diff --git a/.gitbook/assets/market/publish.png b/.gitbook/assets/market/publish.png deleted file mode 100644 index cd41d29a..00000000 Binary files a/.gitbook/assets/market/publish.png and /dev/null differ diff --git a/.gitbook/assets/matrix-code (1).gif b/.gitbook/assets/matrix-code (1).gif deleted file mode 100644 index 9f3356ae..00000000 Binary files a/.gitbook/assets/matrix-code (1).gif and /dev/null differ diff --git a/.gitbook/assets/matrix-code.gif b/.gitbook/assets/matrix-code.gif deleted file mode 100644 index 9f3356ae..00000000 Binary files a/.gitbook/assets/matrix-code.gif and /dev/null differ diff --git a/.gitbook/assets/mind-blown.gif b/.gitbook/assets/mind-blown.gif deleted file mode 100644 index 5b403a11..00000000 Binary files a/.gitbook/assets/mind-blown.gif and /dev/null differ diff --git a/.gitbook/assets/moredefi.jpeg b/.gitbook/assets/moredefi.jpeg deleted file mode 100644 index 912451bc..00000000 Binary files a/.gitbook/assets/moredefi.jpeg and /dev/null differ diff --git a/.gitbook/assets/morpheus-challenge.gif b/.gitbook/assets/morpheus-challenge.gif deleted file mode 100644 index 10e17cc2..00000000 Binary files a/.gitbook/assets/morpheus-challenge.gif and /dev/null differ diff --git a/.gitbook/assets/morpheus-taunting.gif b/.gitbook/assets/morpheus-taunting.gif deleted file mode 100644 index 2211fb35..00000000 Binary files a/.gitbook/assets/morpheus-taunting.gif and /dev/null differ diff --git a/.gitbook/assets/morpheus.gif b/.gitbook/assets/morpheus.gif deleted file mode 100644 index 343bb67a..00000000 Binary files a/.gitbook/assets/morpheus.gif and /dev/null differ diff --git a/.gitbook/assets/my-data (1).gif b/.gitbook/assets/my-data (1).gif deleted file mode 100644 index f2dd42fb..00000000 Binary files a/.gitbook/assets/my-data (1).gif and /dev/null differ diff --git a/.gitbook/assets/my-data.gif b/.gitbook/assets/my-data.gif deleted file mode 100644 index f2dd42fb..00000000 Binary files a/.gitbook/assets/my-data.gif and /dev/null differ diff --git a/.gitbook/assets/neo-bb.gif b/.gitbook/assets/neo-bb.gif deleted file mode 100644 index 067f696e..00000000 Binary files a/.gitbook/assets/neo-bb.gif and /dev/null differ diff --git a/.gitbook/assets/neo-blocking.gif b/.gitbook/assets/neo-blocking.gif deleted file mode 100644 index 4260200f..00000000 Binary files a/.gitbook/assets/neo-blocking.gif and /dev/null differ diff --git a/.gitbook/assets/neo-kinda-martial-arts.gif b/.gitbook/assets/neo-kinda-martial-arts.gif deleted file mode 100644 index 31c92293..00000000 Binary files a/.gitbook/assets/neo-kinda-martial-arts.gif and /dev/null differ diff --git a/.gitbook/assets/new-ramp-on-crypto-ramp-off.webp b/.gitbook/assets/new-ramp-on-crypto-ramp-off.webp deleted file mode 100644 index 256e4431..00000000 Binary files a/.gitbook/assets/new-ramp-on-crypto-ramp-off.webp and /dev/null differ diff --git 
a/.gitbook/assets/ocean-jelly-hyperrealistic.jpeg b/.gitbook/assets/ocean-jelly-hyperrealistic.jpeg deleted file mode 100644 index f02db939..00000000 Binary files a/.gitbook/assets/ocean-jelly-hyperrealistic.jpeg and /dev/null differ diff --git a/.gitbook/assets/passive-income.gif b/.gitbook/assets/passive-income.gif deleted file mode 100644 index b776a466..00000000 Binary files a/.gitbook/assets/passive-income.gif and /dev/null differ diff --git a/.gitbook/assets/penguin-diving.gif b/.gitbook/assets/penguin-diving.gif deleted file mode 100644 index eeacff6a..00000000 Binary files a/.gitbook/assets/penguin-diving.gif and /dev/null differ diff --git a/.gitbook/assets/please-dont-leave.gif b/.gitbook/assets/please-dont-leave.gif deleted file mode 100644 index c26a7564..00000000 Binary files a/.gitbook/assets/please-dont-leave.gif and /dev/null differ diff --git a/.gitbook/assets/publish-page-2 (1).png b/.gitbook/assets/publish-page-2 (1).png deleted file mode 100644 index 0cb13b87..00000000 Binary files a/.gitbook/assets/publish-page-2 (1).png and /dev/null differ diff --git a/.gitbook/assets/rewards/DF-Grid.png b/.gitbook/assets/rewards/DF-Grid.png deleted file mode 100644 index 5820084c..00000000 Binary files a/.gitbook/assets/rewards/DF-Grid.png and /dev/null differ diff --git a/.gitbook/assets/rewards/Rewards-Page.png b/.gitbook/assets/rewards/Rewards-Page.png deleted file mode 100644 index 4ca998a8..00000000 Binary files a/.gitbook/assets/rewards/Rewards-Page.png and /dev/null differ diff --git a/.gitbook/assets/rewards/curate-datasets.png b/.gitbook/assets/rewards/curate-datasets.png new file mode 100644 index 00000000..8798483b Binary files /dev/null and b/.gitbook/assets/rewards/curate-datasets.png differ diff --git a/.gitbook/assets/rewards/veOCEAN-After-Lock.png b/.gitbook/assets/rewards/veOCEAN-After-Lock.png deleted file mode 100644 index 6f03a021..00000000 Binary files a/.gitbook/assets/rewards/veOCEAN-After-Lock.png and /dev/null differ diff --git a/.gitbook/assets/rewards/veOCEAN-Before-Lock.png b/.gitbook/assets/rewards/veOCEAN-Before-Lock.png deleted file mode 100644 index 9949800e..00000000 Binary files a/.gitbook/assets/rewards/veOCEAN-Before-Lock.png and /dev/null differ diff --git a/.gitbook/assets/rewards/veOCEAN-DF-Homepage.png b/.gitbook/assets/rewards/veOCEAN-DF-Homepage.png deleted file mode 100644 index 3e5bcae8..00000000 Binary files a/.gitbook/assets/rewards/veOCEAN-DF-Homepage.png and /dev/null differ diff --git a/.gitbook/assets/rewards/vedf_youtube_thumbnail.png b/.gitbook/assets/rewards/vedf_youtube_thumbnail.png deleted file mode 100644 index 455ddddd..00000000 Binary files a/.gitbook/assets/rewards/vedf_youtube_thumbnail.png and /dev/null differ diff --git a/.gitbook/assets/server-setup/server-setup1.png b/.gitbook/assets/server-setup/server-setup1.png deleted file mode 100644 index 0059a3bc..00000000 Binary files a/.gitbook/assets/server-setup/server-setup1.png and /dev/null differ diff --git a/.gitbook/assets/server-setup/server-setup2.png b/.gitbook/assets/server-setup/server-setup2.png deleted file mode 100644 index cab24603..00000000 Binary files a/.gitbook/assets/server-setup/server-setup2.png and /dev/null differ diff --git a/.gitbook/assets/server-setup/server-setup3 (1).png b/.gitbook/assets/server-setup/server-setup3 (1).png deleted file mode 100644 index a633d8f8..00000000 Binary files a/.gitbook/assets/server-setup/server-setup3 (1).png and /dev/null differ diff --git a/.gitbook/assets/server-setup/server-setup3.png 
b/.gitbook/assets/server-setup/server-setup3.png deleted file mode 100644 index a633d8f8..00000000 Binary files a/.gitbook/assets/server-setup/server-setup3.png and /dev/null differ diff --git a/.gitbook/assets/server-setup/server-setup4.png b/.gitbook/assets/server-setup/server-setup4.png deleted file mode 100644 index 99eb413e..00000000 Binary files a/.gitbook/assets/server-setup/server-setup4.png and /dev/null differ diff --git a/.gitbook/assets/server-setup/server-setup5.png b/.gitbook/assets/server-setup/server-setup5.png deleted file mode 100644 index bd62e17e..00000000 Binary files a/.gitbook/assets/server-setup/server-setup5.png and /dev/null differ diff --git a/.gitbook/assets/server-setup/server-setup6.png b/.gitbook/assets/server-setup/server-setup6.png deleted file mode 100644 index 9967ad31..00000000 Binary files a/.gitbook/assets/server-setup/server-setup6.png and /dev/null differ diff --git a/.gitbook/assets/shopping-minions.gif b/.gitbook/assets/shopping-minions.gif deleted file mode 100644 index d49c7876..00000000 Binary files a/.gitbook/assets/shopping-minions.gif and /dev/null differ diff --git a/.gitbook/assets/show-me-the-money (1).gif b/.gitbook/assets/show-me-the-money (1).gif deleted file mode 100644 index c8041925..00000000 Binary files a/.gitbook/assets/show-me-the-money (1).gif and /dev/null differ diff --git a/.gitbook/assets/show-me-the-money (2).gif b/.gitbook/assets/show-me-the-money (2).gif deleted file mode 100644 index c8041925..00000000 Binary files a/.gitbook/assets/show-me-the-money (2).gif and /dev/null differ diff --git a/.gitbook/assets/show-me-the-money-tom-cruise.gif b/.gitbook/assets/show-me-the-money-tom-cruise.gif deleted file mode 100644 index a5a6e330..00000000 Binary files a/.gitbook/assets/show-me-the-money-tom-cruise.gif and /dev/null differ diff --git a/.gitbook/assets/show-me-the-money.gif b/.gitbook/assets/show-me-the-money.gif deleted file mode 100644 index c8041925..00000000 Binary files a/.gitbook/assets/show-me-the-money.gif and /dev/null differ diff --git a/.gitbook/assets/smart-contracts (1).png b/.gitbook/assets/smart-contracts (1).png deleted file mode 100644 index 9af02ac2..00000000 Binary files a/.gitbook/assets/smart-contracts (1).png and /dev/null differ diff --git a/.gitbook/assets/sponge-money.gif b/.gitbook/assets/sponge-money.gif deleted file mode 100644 index c2507e64..00000000 Binary files a/.gitbook/assets/sponge-money.gif and /dev/null differ diff --git a/.gitbook/assets/super-mario-coins.gif b/.gitbook/assets/super-mario-coins.gif deleted file mode 100644 index e53e987a..00000000 Binary files a/.gitbook/assets/super-mario-coins.gif and /dev/null differ diff --git a/.gitbook/assets/talk-data-to-me.gif b/.gitbook/assets/talk-data-to-me.gif deleted file mode 100644 index c147240a..00000000 Binary files a/.gitbook/assets/talk-data-to-me.gif and /dev/null differ diff --git a/.gitbook/assets/tell-me-more.gif b/.gitbook/assets/tell-me-more.gif deleted file mode 100644 index ff542c91..00000000 Binary files a/.gitbook/assets/tell-me-more.gif and /dev/null differ diff --git a/.gitbook/assets/the-algorithm.gif b/.gitbook/assets/the-algorithm.gif deleted file mode 100644 index 4df0b339..00000000 Binary files a/.gitbook/assets/the-algorithm.gif and /dev/null differ diff --git a/.gitbook/assets/the-rock-simple.gif b/.gitbook/assets/the-rock-simple.gif deleted file mode 100644 index 258f2e7c..00000000 Binary files a/.gitbook/assets/the-rock-simple.gif and /dev/null differ diff --git a/.gitbook/assets/to-the-computer.gif 
b/.gitbook/assets/to-the-computer.gif deleted file mode 100644 index 76a6210b..00000000 Binary files a/.gitbook/assets/to-the-computer.gif and /dev/null differ diff --git a/.gitbook/assets/underwater-treasure.gif b/.gitbook/assets/underwater-treasure.gif deleted file mode 100644 index 6cc01470..00000000 Binary files a/.gitbook/assets/underwater-treasure.gif and /dev/null differ diff --git a/.gitbook/assets/upload-files-button.webp b/.gitbook/assets/upload-files-button.webp deleted file mode 100644 index a3030c41..00000000 Binary files a/.gitbook/assets/upload-files-button.webp and /dev/null differ diff --git a/.gitbook/assets/use-case (1) (1).png b/.gitbook/assets/use-case (1) (1).png deleted file mode 100644 index a581963f..00000000 Binary files a/.gitbook/assets/use-case (1) (1).png and /dev/null differ diff --git a/.gitbook/assets/use-case (1).png b/.gitbook/assets/use-case (1).png deleted file mode 100644 index a581963f..00000000 Binary files a/.gitbook/assets/use-case (1).png and /dev/null differ diff --git a/.gitbook/assets/v4-contracts.png b/.gitbook/assets/v4-contracts.png deleted file mode 100644 index 310e171d..00000000 Binary files a/.gitbook/assets/v4-contracts.png and /dev/null differ diff --git a/.gitbook/assets/vertical-jellies.jpeg b/.gitbook/assets/vertical-jellies.jpeg deleted file mode 100644 index 36707941..00000000 Binary files a/.gitbook/assets/vertical-jellies.jpeg and /dev/null differ diff --git a/.gitbook/assets/Screenshot 2023-06-15 at 14.49.36.png b/.gitbook/assets/wallet/data_nft_open_sea.png similarity index 100% rename from .gitbook/assets/Screenshot 2023-06-15 at 14.49.36.png rename to .gitbook/assets/wallet/data_nft_open_sea.png diff --git a/.gitbook/assets/we-are-a-team.gif b/.gitbook/assets/we-are-a-team.gif deleted file mode 100644 index d18b2ecf..00000000 Binary files a/.gitbook/assets/we-are-a-team.gif and /dev/null differ diff --git a/.gitbook/assets/welcome-to-my-dojo.gif b/.gitbook/assets/welcome-to-my-dojo.gif deleted file mode 100644 index 309ebcb0..00000000 Binary files a/.gitbook/assets/welcome-to-my-dojo.gif and /dev/null differ diff --git a/.gitbook/assets/whats-a-wallet (1).gif b/.gitbook/assets/whats-a-wallet (1).gif deleted file mode 100644 index 11d7ce42..00000000 Binary files a/.gitbook/assets/whats-a-wallet (1).gif and /dev/null differ diff --git a/.gitbook/assets/whats-a-wallet.gif b/.gitbook/assets/whats-a-wallet.gif deleted file mode 100644 index 11d7ce42..00000000 Binary files a/.gitbook/assets/whats-a-wallet.gif and /dev/null differ diff --git a/.gitbook/assets/zelda-head-bop.gif b/.gitbook/assets/zelda-head-bop.gif deleted file mode 100644 index bfcb2b10..00000000 Binary files a/.gitbook/assets/zelda-head-bop.gif and /dev/null differ diff --git a/SUMMARY.md b/SUMMARY.md index 2a10ea1d..421cc943 100644 --- a/SUMMARY.md +++ b/SUMMARY.md @@ -29,12 +29,13 @@ * [Join a Data Challenge](user-guides/join-a-data-challenge.md) * [Sponsor a Data Challenge](user-guides/sponsor-a-data-challenge.md) * [Get Started Data Farming](user-guides/get-started-df.md) + * [Estimate your APYs](user-guides/how-to-df-estimate-apy.md) * [Harvest More Yield Data Farming](user-guides/how-to-data-farm.md) * [Harvest More Yield with Challenge DF](user-guides/how-to-data-farm-challengeDF.md) * [Harvest More Yield with Volume DF](user-guides/how-to-data-volumeDF.md) * [Claim Rewards Data Farming](user-guides/claim-ocean-rewards.md) * [Liquidity Pools \[deprecated\]](user-guides/remove-liquidity-pools.md) -* [👨💻 Developers](developers/README.md) +* [💻 
Developers](developers/README.md) * [Architecture Overview](developers/architecture.md) * [Contracts](developers/contracts/README.md) * [Data NFTs](developers/contracts/data-nfts.md) @@ -85,7 +86,7 @@ * [Publish](developers/ocean.js/publish.md) * [Mint Datatokens](developers/ocean.js/mint-datatoken.md) * [Update Metadata](developers/ocean.js/update-metadata.md) - * [Asset Visibility](developers/ocean.js/remove-asset.md) + * [Asset Visibility](developers/ocean.js/asset-visibility.md) * [Consume Asset](developers/ocean.js/consume-asset.md) * [Run C2D Jobs](developers/ocean.js/cod-asset.md) * [Compute to data](developers/compute-to-data/README.md) @@ -111,7 +112,7 @@ * [Deploying Aquarius](infrastructure/deploying-aquarius.md) * [Deploying Provider](infrastructure/deploying-provider.md) * [Deploying Ocean Subgraph](infrastructure/deploying-ocean-subgraph.md) - * [C2D - Minikube Environment](infrastructure/compute-to-data-minikube.md) + * [Deploying C2D](infrastructure/compute-to-data-minikube.md) * [C2D - Private Docker Registry](infrastructure/compute-to-data-docker-registry.md) * [🤑 DeFi](defi/README.md) * [💰 Rewards](rewards/README.md) diff --git a/contribute/README.md b/contribute/README.md index c63f08a0..91906867 100644 --- a/contribute/README.md +++ b/contribute/README.md @@ -19,7 +19,8 @@ First, make sure that you search existing open + closed issues + PRs to see if y Follow our steps below to properly document your bug! Paste the screenshots into your GitHub issue. -{% @arcade/embed flowId="fUNrK6z2eurJ2C1ty2OG" url="https://app.arcade.software/share/fUNrK6z2eurJ2C1ty2OG" %} +{% embed url="https://app.arcade.software/share/fUNrK6z2eurJ2C1ty2OG" fullWidth="false" %} +{% endembed %} ### Report vulnerabilities diff --git a/contribute/assets_info.json b/contribute/assets_info.json deleted file mode 100644 index 9e26dfee..00000000 --- a/contribute/assets_info.json +++ /dev/null @@ -1 +0,0 @@ -{} \ No newline at end of file diff --git a/contribute/code-of-conduct.md b/contribute/code-of-conduct.md index 2c766357..cbf44393 100644 --- a/contribute/code-of-conduct.md +++ b/contribute/code-of-conduct.md @@ -3,48 +3,30 @@ title: Contributor Code of Conduct description: Be excellent to each other. --- -As contributors and maintainers of this project, and in the interest of -fostering an open and welcoming community, we pledge to respect all people who -contribute to the project. +# Contributor Code of Conduct -We are committed to making participation in this project a harassment-free -experience for everyone, regardless of level of experience, gender, gender -identity and expression, sexual orientation, disability, personal appearance, -body size, race, ethnicity, age, religion, nationality, or species. +As contributors and maintainers of this project, and in the interest of fostering an open and welcoming community, we pledge to respect all people who contribute to the project. + +We are committed to making participation in this project a harassment-free experience for everyone, regardless of level of experience, gender, gender identity and expression, sexual orientation, disability, personal appearance, body size, race, ethnicity, age, religion, nationality, or species. 
Examples of unacceptable behavior by participants include: -- The use of sexualized language or imagery -- Personal attacks -- Trolling or insulting/derogatory comments -- Public or private harassment -- Publishing other's private information, such as physical or electronic - addresses, without explicit permission -- Deliberate intimidation -- Other unethical or unprofessional conduct +* The use of sexualized language or imagery +* Personal attacks +* Trolling or insulting/derogatory comments +* Public or private harassment +* Publishing other's private information, such as physical or electronic addresses, without explicit permission +* Deliberate intimidation +* Other unethical or unprofessional conduct -Project maintainers have the right and responsibility to remove, edit, or -reject comments, commits, code, wiki edits, issues, and other contributions -that are not aligned to this Code of Conduct, or to ban temporarily or -permanently any contributor for other behaviors that they deem inappropriate, -threatening, offensive, or harmful. +Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful. -By adopting this Code of Conduct, project maintainers commit themselves to -fairly and consistently applying these principles to every aspect of managing -this project. Project maintainers who do not follow or enforce the Code of -Conduct may be permanently removed from the project team. +By adopting this Code of Conduct, project maintainers commit themselves to fairly and consistently applying these principles to every aspect of managing this project. Project maintainers who do not follow or enforce the Code of Conduct may be permanently removed from the project team. -This Code of Conduct applies both within project spaces and in public spaces -when an individual is representing the project or its community. +This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. -Instances of abusive, harassing, or otherwise unacceptable behavior directed at yourself or another community member may be reported by contacting a project maintainer at [conduct@oceanprotocol.com](mailto:conduct@oceanprotocol.com). All -complaints will be reviewed and investigated and will result in a response that -is appropriate to the circumstances. Maintainers are obligated to maintain confidentiality with regard to the reporter of an incident. +Instances of abusive, harassing, or otherwise unacceptable behavior directed at yourself or another community member may be reported by contacting a project maintainer at [conduct@oceanprotocol.com](mailto:conduct@oceanprotocol.com). All complaints will be reviewed and investigated and will result in a response that is appropriate to the circumstances. Maintainers are obligated to maintain confidentiality with regard to the reporter of an incident. 
---- -This Code of Conduct is adapted from the [Contributor Covenant][homepage], -version 1.3.0, available at [contributor-covenant.org/version/1/3/0/][version] -[homepage]: http://contributor-covenant.org -[version]: http://contributor-covenant.org/version/1/3/0/ +This Code of Conduct is adapted from the [Contributor Covenant](http://contributor-covenant.org), version 1.3.0, available at [contributor-covenant.org/version/1/3/0/](http://contributor-covenant.org/version/1/3/0/) diff --git a/contribute/projects-using-ocean.md b/contribute/projects-using-ocean.md index c668caae..45adf7e6 100644 --- a/contribute/projects-using-ocean.md +++ b/contribute/projects-using-ocean.md @@ -7,11 +7,7 @@ description: We are so proud of the companies that use Ocean Protocol tools!
-From startups to full enterprises, we have so many partners and collaborators using Ocean tech. Curious who's working with Ocean tools? Check out our up-to-date list of our partners and collaborators on the [Collaborators page ](https://oceanprotocol.com/collaborators)of our website. - -### Get a "degree" in Ocean Protocol knowledge - -Our [Ocean Academy](https://oceanacademy.io/) program is a great way to learn more about Ocean Protocol beyond our website, [oceanprotocol.com](https://www.oceanprotocol.com), and our docs, [docs.oceanprotocol.com](https://docs.oceanprotocol.com). The Academy helps you to get up to speed on all things that Ocean can do, setting you up for success using Ocean Protocol tech to build the "next big thing" in Web3 data sharing. +From startups to full enterprises, we have so many partners and collaborators using Ocean tech. Curious who's working with Ocean tools? Check out our up-to-date list of our partners and collaborators on the [Ecosystem page](https://oceanprotocol.com/ecosystem) of our website. ### Show your support by trading OCEAN tokens diff --git a/data-science/README.md b/data-science/README.md index f9f22eee..f37c8497 100644 --- a/data-science/README.md +++ b/data-science/README.md @@ -6,7 +6,7 @@ coverY: 0 # 📊 Data Science -
[Figure: Ocean Protocol - Built to protect your precious.]
### Why should data scientists use Ocean Protocol? @@ -22,9 +22,9 @@ Ocean Protocol is built for data scientists to **monetize data effectively and** ### How to design a ML system using Ocean Protocol? -The first step is to tokenize data into data NFTs and datatokens on the blockchain. We offer a no-code way to tokenize data via the [Ocean Market](https://market.oceanprotocol.com). But we also offer code options for data scientists to use the [Ocean.py](../developers/ocean.py/) and [Ocean.js](../developers/ocean.js/) libraries. Data scientists can then build sophisticated ML systems on top of the tokenized data by using composable Ocean Protocol tools. ML models can use a variety of Ocean smart contracts, including Ocean's [Compute-to-Data](../developers/compute-to-data/), to build model outputs all the way to the last-mile delivery for businesses. +The first step is to tokenize data into data NFTs and datatokens on the blockchain. We offer a no-code way to tokenize data via the [Ocean Market](https://market.oceanprotocol.com). But we also offer code options for data scientists to use the [Ocean.py](../developers/ocean.py/README.md) and [Ocean.js](../developers/ocean.js/README.md) libraries. Data scientists can then build sophisticated ML systems on top of the tokenized data by using composable Ocean Protocol tools. ML models can use a variety of Ocean smart contracts, including Ocean's [Compute-to-Data](../developers/compute-to-data/README.md), to build model outputs all the way to the last-mile delivery for businesses. ### **Key Links for Data Scientists:** * Learn the difference between Ocean Protocol [data NFTs and datatokens](../developers/contracts/datanft-and-datatoken.md), the two types of tokenized data assets you need to start building your ML systems. -* Discover Ocean's [Compute-to-Data](../developers/compute-to-data/) engine that can help you to solve the difficult problem of selling algorithmic compute jobs on your datasets without actually revealing the contents of the algorithm & dataset to the consumer. +* Discover Ocean's [Compute-to-Data](../developers/compute-to-data/README.md) engine that can help you to solve the difficult problem of selling algorithmic compute jobs on your datasets without actually revealing the contents of the algorithm & dataset to the consumer. diff --git a/data-science/data-engineers.md b/data-science/data-engineers.md index 3a70f803..15a5ecae 100644 --- a/data-science/data-engineers.md +++ b/data-science/data-engineers.md @@ -12,7 +12,7 @@ A lot of people miss the mark on tokenizing data that actually _sells_. If your To figure out which market segments are paying for data, then it may help you to **go to the Ocean Market and sort by Sales.** -But even then, it's not enough to just publish useful data on Ocean. **You need to market your data** **assets** to close sales. +But even then, it's not enough to just publish useful data on Ocean. **You need to market your data** **assets** to close sales. Have you tried all these things and are still having trouble making money? Never fear! You can enter one of our [data challenges](https://oceanprotocol.com/challenges) to make sweet OCEAN rewards and build your data science skills. 
diff --git a/data-science/the-data-value-creation-loop.md b/data-science/the-data-value-creation-loop.md index c9091f50..33b9eb88 100644 --- a/data-science/the-data-value-creation-loop.md +++ b/data-science/the-data-value-creation-loop.md @@ -25,12 +25,3 @@ Here's a condensed breakdown of the loop: ### What is an example of a Data Value Creation Loop? Let's explore an example to showcase the process of the data value creation loop. Imagine a healthcare organization seeking to develop a predictive model for early detection of diseases. They collaborate with data engineers to collect and preprocess various medical datasets, including patient demographics, lab results, and medical imaging. These datasets are tokenized and made available on the Ocean Protocol platform for secure computation. Data scientists utilize the tokenized data to train machine learning models that can accurately identify early warning signs of diseases. These models are then published as compute assets on Ocean Market. Application developers work with the healthcare organization to integrate the models into their existing patient management system, allowing doctors to receive automated risk assessments and personalized recommendations for preventive care. As a result, patients benefit from early detection, doctors can make more informed decisions, and the healthcare organization generates insights to improve patient outcomes while fostering data and model asset collaboration. Et voilà! - - - - - - - - - diff --git a/developers/README.md b/developers/README.md index b4af2ef5..868319a0 100644 --- a/developers/README.md +++ b/developers/README.md @@ -11,14 +11,14 @@ coverY: 0 With Ocean, crypto wallets transform into magical data wallets, where your data can roam freely and securely. Crypto exchanges? Well, they've taken on a new role as data marketplaces, where you can showcase and trade your valuable data treasures. And hold on tight because DAOs are here to create epic data co-ops, where collaboration and innovation reign supreme! 🤝 -But hold on tight, because we have even more in store for you! With Ocean Protocol, you gain access to a treasure trove of tools that will unlock your data scientist superpowers and allow you to unleash your creativity. Whether you're a Python aficionado or a JavaScript maestro, we have you covered with [ocean.py](ocean.py/) and [ocean.js](ocean.js) libraries. So, get ready to dive into the depths of data innovation and create the next groundbreaking dAapp (that's a decentralized App, by the way) using [ocean.js's](ocean.js) powerful capabilities or unleash your skills with [ocean.py](ocean.py/). It's time to shake up the data world like never before! 🌐🚀 +But hold on tight, because we have even more in store for you! With Ocean Protocol, you gain access to a treasure trove of tools that will unlock your data scientist superpowers and allow you to unleash your creativity. Whether you're a Python aficionado or a JavaScript maestro, we have you covered with [ocean.py](ocean.py/README.md) and [ocean.js](ocean.js/README.md) libraries. So, get ready to dive into the depths of data innovation and create the next groundbreaking dAapp (that's a decentralized App, by the way) using [ocean.js's](ocean.js/README.md) powerful capabilities or unleash your skills with [ocean.py](ocean.py/README.md). It's time to shake up the data world like never before! 🌐🚀 -
[Figure: Ocean Protocol Explorer]
-At the core of the Ocean Protocol, you'll find a constellation of [smart contracts](contracts/) that bring extraordinary capabilities to every data asset. Here's where the magic happens! Every asset gets its own cool and unique [**ERC721 data NFT**](contracts/data-nfts.md#what-is-a-data-nft), along with one (or more) [**ERC20 datatokens**](contracts/datanft-and-datatoken.md). It's like giving your data its very own superhero cape! 🦸‍♂️ +At the core of the Ocean Protocol, you'll find a constellation of [smart contracts](contracts/README.md) that bring extraordinary capabilities to every data asset. Here's where the magic happens! Every asset gets its own cool and unique [**ERC721 data NFT**](contracts/data-nfts.md#what-is-a-data-nft), along with one (or more) [**ERC20 datatokens**](contracts/datanft-and-datatoken.md). It's like giving your data its very own superhero cape! 🦸‍♂️ -These [smart contracts](contracts/) form the backbone of Ocean Protocol, empowering data assets with unparalleled value and enabling seamless integration with the wider blockchain ecosystem. Through the [contracts](contracts/), data becomes not only valuable but also tradable, allowing you to unleash the true potential of your data treasures. +These [smart contracts](contracts/README.md) form the backbone of Ocean Protocol, empowering data assets with unparalleled value and enabling seamless integration with the wider blockchain ecosystem. Through the [contracts](contracts/README.md), data becomes not only valuable but also tradable, allowing you to unleash the true potential of your data treasures. -
[Figure: Smart Contracts]
-Now, if you're new to the world of web3 and blockchain technologies, fear not! We've got you covered. Before diving into the depths of Ocean Protocol, we recommend starting with some introductory guides. These [guides](../user-guides/) will gently introduce you to the magical world of [web3](../discover/wallets/) and help you understand the [basics](../discover/wallets-and-ocean-tokens.md) before you embark on your epic data-driven adventure. +Now, if you're new to the world of web3 and blockchain technologies, fear not! We've got you covered. Before diving into the depths of Ocean Protocol, we recommend starting with some introductory guides. These [guides](../user-guides/README.md) will gently introduce you to the magical world of [web3](../discover/wallets/README.md) and help you understand the [basics](../discover/wallets-and-ocean-tokens.md) before you embark on your epic data-driven adventure. diff --git a/developers/aquarius/README.md b/developers/aquarius/README.md index f93cc3a6..9691b788 100644 --- a/developers/aquarius/README.md +++ b/developers/aquarius/README.md @@ -1,12 +1,16 @@ # Aquarius -### What is Aquarius? +### What is Aquarius? Aquarius is a tool that tracks and caches the metadata from each chain where the Ocean Protocol smart contracts are deployed. It operates off-chain, running an Elasticsearch database. This makes it easy to query the metadata generated on-chain. The core job of Aquarius is to continually look out for new metadata being created or updated on the blockchain. Whenever such events occur, Aquarius takes note of them, processes this information, and adds it to its database. This allows it to keep an up-to-date record of the metadata activity on the chains. -Aquarius has its own interface (API) that allows you to easily query this metadata. With Aquarius, you don't need to do the time-consuming task of scanning the data chains yourself. It offers you a convenient shortcut to the information you need. It's ideal for when you need a search feature within your Dapp. +Aquarius has its own interface (API) that allows you to easily query this metadata. With Aquarius, you don't need to do the time-consuming task of scanning the data chains yourself. It offers you a convenient shortcut to the information you need. It's ideal for when you need a search feature within your dApp. + + + +
[Figure: Aquarius high level overview]
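To make the search flow concrete, here is a minimal sketch of querying Aquarius from JavaScript. It assumes the public multi-network instance at `https://v4.aquarius.oceanprotocol.com` and uses the `POST /api/aquarius/assets/query` endpoint described further down; the Elasticsearch-style `hits.hits` response shape and the `metadata.name` field are assumptions that may differ on a customized deployment.

```javascript
// Minimal sketch: search Aquarius for assets whose name mentions "weather".
// Requires Node 18+ (global fetch) or a browser environment.
const AQUARIUS_URL = 'https://v4.aquarius.oceanprotocol.com' // assumed public instance

async function searchAssets(term) {
  const response = await fetch(`${AQUARIUS_URL}/api/aquarius/assets/query`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    // The request body is a native Elasticsearch query, so any ES syntax works here.
    body: JSON.stringify({
      query: { match: { 'metadata.name': term } },
      size: 5
    })
  })
  const result = await response.json()
  // Assumes the endpoint relays Elasticsearch's usual hits.hits structure.
  return result.hits.hits.map((hit) => hit._source)
}

searchAssets('weather').then((assets) =>
  assets.forEach((asset) => console.log(asset.id, '-', asset.metadata?.name))
)
```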
### What does Aquarius do? @@ -19,7 +23,7 @@ Aquarius has its own interface (API) that allows you to easily query this metada ### How to run Aquarius? -We recommend checking the README in the [Aquarius GitHub repository](https://github.com/oceanprotocol/aquarius) for the steps to run the Aquarius. If you see any errors in the instructions, please open an issue within the GitHub repository. +We recommend checking the README in the [Aquarius GitHub repository](https://github.com/oceanprotocol/aquarius) for the steps to run the Aquarius. If you see any errors in the instructions, please open an issue within the GitHub repository. ### What technology does Aquarius use? diff --git a/developers/aquarius/asset-requests.md b/developers/aquarius/asset-requests.md index 7042d7a8..14708ed6 100644 --- a/developers/aquarius/asset-requests.md +++ b/developers/aquarius/asset-requests.md @@ -75,7 +75,7 @@ console.log(response.data.description) ``` -### **Asset Names** +### **Asset Names** Used to retrieve the names of a group of assets using a list of unique identifiers known as Decentralized Identifiers (DIDs). @@ -121,7 +121,7 @@ for (let key in response.data) { ### Query Assets -Used to run a custom search query on the assets using Elasticsearch's native query syntax. We recommend reading the [Elasticsearch documentation](https://www.elastic.co/guide/index.html) to understand their syntax. +Used to run a custom search query on the assets using Elasticsearch's native query syntax. We recommend reading the [Elasticsearch documentation](https://www.elastic.co/guide/index.html) to understand their syntax. * **Endpoint**: `POST /api/aquarius/assets/query` * **Purpose**: This endpoint is used to execute a native Elasticsearch (ES) query against the stored assets. This allows for highly customizable searches and can be used to filter and sort assets based on complex criteria. The body of the request should contain a valid JSON object that defines the ES query. diff --git a/developers/aquarius/chain-requests.md b/developers/aquarius/chain-requests.md index e832ff5b..2668536e 100644 --- a/developers/aquarius/chain-requests.md +++ b/developers/aquarius/chain-requests.md @@ -12,7 +12,7 @@ Retrieves a list of chains that are currently supported or recognized by the Aqu Here are some typical responses you might receive from the API: -* **200**: This is a successful HTTP response code. It means the server has successfully processed the request and returns a JSON object containing chain IDs as keys and their active status as values. +* **200**: This is a successful HTTP response code. It means the server has successfully processed the request and returns a JSON object containing chain IDs as keys and their active status as values. Example response: diff --git a/developers/architecture.md b/developers/architecture.md index 991bace4..e43d5250 100644 --- a/developers/architecture.md +++ b/developers/architecture.md @@ -10,9 +10,9 @@ Embark on an exploration of the innovative realm of Ocean Protocol, where data f ### Layer 1: The Foundational Blockchain Layer -At the core of Ocean Protocol lies the robust [Blockchain Layer](contracts/). Powered by blockchain technology, this layer ensures secure and transparent transactions. It forms the bedrock of decentralized trust, where data providers and consumers come together to trade valuable assets. +At the core of Ocean Protocol lies the robust [Blockchain Layer](contracts/README.md). Powered by blockchain technology, this layer ensures secure and transparent transactions. 
It forms the bedrock of decentralized trust, where data providers and consumers come together to trade valuable assets. -The [smart contracts](contracts/) are deployed on the Ethereum mainnet and other compatible [networks](../discover/networks/). The libraries encapsulate the calls to these smart contracts and provide features like publishing new assets, facilitating consumption, managing pricing, and much more. To explore the contracts in more depth, go ahead to the [contracts](contracts/) section. +The [smart contracts](contracts/README.md) are deployed on the Ethereum mainnet and other compatible [networks](../discover/networks/README.md). The libraries encapsulate the calls to these smart contracts and provide features like publishing new assets, facilitating consumption, managing pricing, and much more. To explore the contracts in more depth, go ahead to the [contracts](contracts/README.md) section. ### Layer 2: The Empowering Middle Layer @@ -20,22 +20,22 @@ Above the smart contracts, you'll find essential [libraries](architecture.md#lib #### Libraries -These libraries include [Ocean.js](ocean.js/), a JavaScript library, and [Ocean.py](ocean.py/), a Python library. They serve as powerful tools for developers, enabling integration and interaction with the protocol. +These libraries include [Ocean.js](ocean.js/README.md), a JavaScript library, and [Ocean.py](ocean.py/README.md), a Python library. They serve as powerful tools for developers, enabling integration and interaction with the protocol. -1. [Ocean.js](ocean.js/): Ocean.js is a JavaScript library that serves as a powerful tool for developers looking to integrate their applications with the Ocean Protocol ecosystem. Designed to facilitate interaction with the protocol, Ocean.js provides a comprehensive set of functionalities, including data tokenization, asset management, and smart contract interaction. Ocean.js simplifies the process of implementing data access controls, building dApps, and exploring data sets within a decentralized environment. -2. [Ocean.py](ocean.py/): Ocean.py is a Python library that empowers developers to integrate their applications with the Ocean Protocol ecosystem. With its rich set of functionalities, Ocean.py provides a comprehensive toolkit for interacting with the protocol. Developers and [data scientists](../data-science/) can leverage Ocean.py to perform a wide range of tasks, including data tokenization, asset management, and smart contract interactions. This library serves as a bridge between Python and the decentralized world of Ocean Protocol, enabling you to harness the power of decentralized data. +1. [Ocean.js](ocean.js/README.md): Ocean.js is a JavaScript library that serves as a powerful tool for developers looking to integrate their applications with the Ocean Protocol ecosystem. Designed to facilitate interaction with the protocol, Ocean.js provides a comprehensive set of functionalities, including data tokenization, asset management, and smart contract interaction. Ocean.js simplifies the process of implementing data access controls, building dApps, and exploring data sets within a decentralized environment. +2. [Ocean.py](ocean.py/README.md): Ocean.py is a Python library that empowers developers to integrate their applications with the Ocean Protocol ecosystem. With its rich set of functionalities, Ocean.py provides a comprehensive toolkit for interacting with the protocol. 
Developers and [data scientists](../data-science/README.md) can leverage Ocean.py to perform a wide range of tasks, including data tokenization, asset management, and smart contract interactions. This library serves as a bridge between Python and the decentralized world of Ocean Protocol, enabling you to harness the power of decentralized data. #### Middleware components Additionally, in supporting the discovery process, middleware components come into play: -1. [Aquarius](aquarius/): Aquarius acts as a metadata cache, enhancing search efficiency by caching on-chain data into Elasticsearch. By accelerating metadata retrieval, Aquarius enables faster and more efficient data discovery. -2. [Provider](provider/): The Provider component plays a crucial role in facilitating various operations within the ecosystem. It assists in asset downloading, handles [DDO](ddo-specification.md) (Decentralized Data Object) encryption, and establishes communication with the operator-service for Compute-to-Data jobs. This ensures secure and streamlined interactions between different participants. -3. [Subgraph](subgraph/): The Subgraph is an off-chain service that utilizes GraphQL to offer efficient access to information related to datatokens, users, and balances. By leveraging the subgraph, data retrieval becomes faster compared to an on-chain query. This enhances the overall performance and responsiveness of applications that rely on accessing this information. +1. [Aquarius](aquarius/README.md): Aquarius acts as a metadata cache, enhancing search efficiency by caching on-chain data into Elasticsearch. By accelerating metadata retrieval, Aquarius enables faster and more efficient data discovery. +2. [Provider](provider/README.md): The Provider component plays a crucial role in facilitating various operations within the ecosystem. It assists in asset downloading, handles [DDO](ddo-specification.md) (Decentralized Data Object) encryption, and establishes communication with the operator-service for Compute-to-Data jobs. This ensures secure and streamlined interactions between different participants. +3. [Subgraph](subgraph/README.md): The Subgraph is an off-chain service that utilizes GraphQL to offer efficient access to information related to datatokens, users, and balances. By leveraging the subgraph, data retrieval becomes faster compared to an on-chain query. This enhances the overall performance and responsiveness of applications that rely on accessing this information. #### Compute-to-Data -[Compute-to-Data](compute-to-data/) (C2D) represents a groundbreaking paradigm within the Ocean Protocol ecosystem, revolutionizing the way data is processed and analyzed. With C2D, the traditional approach of moving data to the computation is inverted, ensuring privacy and security. Instead, algorithms are securely transported to the data sources, enabling computation to be performed locally, without the need to expose sensitive data. This innovative framework facilitates collaborative data analysis while preserving data privacy, making it ideal for scenarios where data owners want to retain control over their valuable assets. C2D provides a powerful tool for enabling secure and privacy-preserving data analysis and encourages collaboration among data providers, ensuring the utilization of valuable data resources while maintaining strict privacy protocols. 
+[Compute-to-Data](compute-to-data/README.md) (C2D) represents a groundbreaking paradigm within the Ocean Protocol ecosystem, revolutionizing the way data is processed and analyzed. With C2D, the traditional approach of moving data to the computation is inverted, ensuring privacy and security. Instead, algorithms are securely transported to the data sources, enabling computation to be performed locally, without the need to expose sensitive data. This innovative framework facilitates collaborative data analysis while preserving data privacy, making it ideal for scenarios where data owners want to retain control over their valuable assets. C2D provides a powerful tool for enabling secure and privacy-preserving data analysis and encourages collaboration among data providers, ensuring the utilization of valuable data resources while maintaining strict privacy protocols. ### Layer 3: The Accessible Application Layer @@ -45,6 +45,6 @@ Prominently featured within this layer is [Ocean Market](../user-guides/using-oc ### Layer 4: The Friendly Wallets -At the top of the Ocean Protocol ecosystem, we find the esteemed [Web 3 Wallets](../discover/wallets/), the gateway for users to immerse themselves in the world of decentralized data transactions. These wallets serve as trusted companions, enabling users to seamlessly transact within the ecosystem, purchase and sell data NFTs, and acquire valuable datatokens. For a more detailed exploration of Web 3 Wallets and their capabilities, you can refer to the [wallet intro page](../discover/wallets/). +At the top of the Ocean Protocol ecosystem, we find the esteemed [Web 3 Wallets](../discover/wallets/README.md), the gateway for users to immerse themselves in the world of decentralized data transactions. These wallets serve as trusted companions, enabling users to seamlessly transact within the ecosystem, purchase and sell data NFTs, and acquire valuable datatokens. For a more detailed exploration of Web 3 Wallets and their capabilities, you can refer to the [wallet intro page](../discover/wallets/README.md). With the layers of the architecture clearly delineated, the stage is set for a comprehensive exploration of their underlying logic and intricate design. By examining each individually, we can gain a deeper understanding of their unique characteristics and functionalities. diff --git a/developers/barge/README.md b/developers/barge/README.md index 138bf3a3..22381b70 100644 --- a/developers/barge/README.md +++ b/developers/barge/README.md @@ -8,15 +8,15 @@ The Barge component of Ocean Protocol is a powerful tool designed to simplify th By using the Barge component, developers can spin up an environment that includes default versions of [Aquarius](../aquarius/README.md), [Provider](../provider/README.md), [Subgraph](../subgraph/README.md), and [Compute-to-Data](../compute-to-data/README.md). Additionally, it deploys all the [smart contracts](../contracts/README.md) from the ocean-contracts repository, ensuring a complete and functional local setup. Barge component also starts additional services like [Ganache](https://trufflesuite.com/ganache/), which is a local blockchain simulator used for smart contract development, and [Elasticsearch](https://www.elastic.co/elasticsearch/), a powerful search and analytics engine required by Aquarius for efficient indexing and querying of data sets. A full list of components and exposed ports is available in the GitHub [repository](https://github.com/oceanprotocol/barge#component-versions-and-exposed-ports). -
[Figure: Load Ocean components locally by using Barge]
To explore all the available options and gain a deeper understanding of how to utilize the Barge component, you can visit the official GitHub [repository](https://github.com/oceanprotocol/barge#all-options) of Ocean Protocol. -By utilizing the Barge component, developers gain the freedom to conduct experiments, customize, and fine-tune their local development environment, and offers the flexibility to override the Docker image tag associated with specific components. By setting the appropriate environment variable before executing the start\_ocean.sh command, developers can customize the versions of various components according to their requirements. For instance, developers can modify the: `AQUARIUS_VERSION`, `PROVIDER_VERSION`, `CONTRACTS_VERSION`, `RBAC_VERSION`, and `ELASTICSEARCH_VERSION` environment variables to specify the desired Docker image tags for each respective component. +By utilizing the Barge component, developers gain the freedom to conduct experiments, customize, and fine-tune their local development environment, and offers the flexibility to override the Docker image tag associated with specific components. By setting the appropriate environment variable before executing the start\_ocean.sh command, developers can customize the versions of various components according to their requirements. For instance, developers can modify the: `AQUARIUS_VERSION`, `PROVIDER_VERSION`, `CONTRACTS_VERSION`, `RBAC_VERSION`, and `ELASTICSEARCH_VERSION` environment variables to specify the desired Docker image tags for each respective component. {% hint style="warning" %} ⚠️ We've got an important heads-up about Barge that we want to share with you. Brace yourself, because **Barge is not for the faint-hearted**! Here's the deal: the barge works great on Linux, but we need to be honest about its limitations on macOS. And, well, it doesn't work at all on Windows. Sorry, Windows users! -To make things easier for everyone, we **strongly** recommend giving a try first on a **testnet**. Everything is configured already so it should be sufficient for your needs as well. Visit our [networks](../../discover/networks/README.md) page to have clarity on the available test networks. ⚠️\ +To make things easier for everyone, we **strongly** recommend giving a try first on a **testnet**. Everything is configured already so it should be sufficient for your needs as well. Visit our [networks](../../discover/networks/README.md) page to have clarity on the available test networks. ⚠️ {% endhint %} diff --git a/developers/barge/local-setup-ganache.md b/developers/barge/local-setup-ganache.md index feb138ad..75a13945 100644 --- a/developers/barge/local-setup-ganache.md +++ b/developers/barge/local-setup-ganache.md @@ -8,7 +8,7 @@ description: 🧑🏽‍💻 Your Local Development Environment for Ocean Protoc Barge offers several functionalities that enable developers to create and test the Ocean Protocol infrastructure efficiently. Here are its key components: -
| Functionality | Description |
| --- | --- |
| Aquarius | A metadata storage and retrieval service for Ocean Protocol. Allows indexing and querying of metadata. |
| Provider | A service that facilitates interaction between users and the Ocean Protocol network. |
| Ganache | A local Ethereum blockchain network for testing and development purposes. |
| TheGraph | A decentralized indexing and querying protocol used for building subgraphs in Ocean Protocol. |
| ocean-contracts | Smart contracts repository for Ocean Protocol. Deploys and manages the necessary contracts for local development. |
| Customization and Options | Barge provides various options to customize component versions, log levels, and enable/disable specific blocks. |
Barge helps developers to get started with Ocean Protocol by providing a local development environment. With its modular and user-friendly design, developers can focus on building and testing their applications without worrying about the intricacies of the underlying infrastructure. diff --git a/developers/build-a-marketplace/customising-your-market.md b/developers/build-a-marketplace/customising-your-market.md index 275372e9..ee587733 100644 --- a/developers/build-a-marketplace/customising-your-market.md +++ b/developers/build-a-marketplace/customising-your-market.md @@ -107,7 +107,7 @@ As with the color changes, it’s a good idea to save the file with each change Let’s head to the publish page to see what it looks like with our new styling - so far, so good. But there is one major issue, the publish form is still telling people to publish datasets. On our new marketplace, we want people to publish and sell their photos, so we’re going to have to make some changes here. -![Market Customisation](../../.gitbook/assets/publish-page-before-edit.png) +![Market Customisation](../../.gitbook/assets/market/publish-page-before-edit.png) Open up the `index.json` file from `content/publish/index.json` - here we change the text to explain that this form is for publishing photos. @@ -121,7 +121,7 @@ Open up `src/components/Publish/Metadata/index.tsx` and change line 33 so that i Great, now our publish page explains that users should be publishing photos and the photo is provided as an asset type option. We’ll also leave the algorithm as an option in case some data scientists want to do some analysis or image transformation on the photos. -![Market Customisation](../../.gitbook/assets/publish-page-2.png) +![Market Customisation](../../.gitbook/assets/market/publish-page-2.png) There is one more thing that is fun to change before we move away from the publish form. You’ll notice that Ocean Market V4 now has a cool SVG generation feature that creates the images for the Data NFT. It creates a series of pink waves. Let’s change this so that it uses our brand colors in the waves! @@ -139,7 +139,7 @@ And now your customized publish page is ready for your customers: ## Advanced customization -This important step is the last thing that we will change in this guide. To set the marketplace fees and address, you’ll need to save them as environmental variables. You'll also need to set the environmental variables if you customized services like Aquarius, Provider, or Subgraph. +This important step is the last thing that we will change in this guide. To set the marketplace fees and address, you’ll need to save them as environmental variables. You'll also need to set the environmental variables if you customized services like Aquarius, Provider, or Subgraph. First, we are going to create a new file called `.env` in the root of your repository. @@ -176,7 +176,7 @@ NEXT_PUBLIC_CONSUME_MARKET_FIXED_SWAP_FEE="0.01" At this point, we have made a lot of changes and hopefully, you’re happy with the way that your marketplace is looking. Given that you now have your own awesome photo marketplace, it’s about time we talked about monetizing it. Yup, that’s right - you will earn a [commission](../contracts/fees.md) when people buy and sell photos in your marketplace. In Ocean V4, there are a whole host of new [fees](../contracts/fees.md) and customization options that you can use. In order to receive the fees you’ll need to set the address where you want to receive these fees in. 
-When someone sets the pricing for their photos in your marketplace, they are informed that a commission will be sent to the owner of the marketplace. You see that at the moment this fee is set to zero, so you’ll want to increase that. +When someone sets the pricing for their photos in your marketplace, they are informed that a commission will be sent to the owner of the marketplace. You see that at the moment this fee is set to zero, so you’ll want to increase that. You need to replace “0x123abc” with your Ethereum address (this is where the fees will be sent). @@ -209,5 +209,3 @@ If you intend to utilize the ocean market with a custom [Aquarius](../aquarius/R Using a custom subgraph with the ocean market requires additional steps due to the differences in deployment. Unlike the multi-network deployment of the provider and Aquarius services, each network supported by the ocean market has a separate subgraph deployment. This means that while the provider and Aquarius services can be handled by a single deployment across all networks, the subgraph requires specific handling for each network. To utilize a custom subgraph, you will need to implement additional logic within the `getOceanConfig` function located in the `src/utils/ocean.ts` file. By modifying this function, you can ensure that the market uses the desired custom subgraph for the selected network. This is particularly relevant if your market aims to support multiple networks and you do not want to enforce the use of the same subgraph across all networks. By incorporating the necessary logic within `getOceanConfig`, you can ensure the market utilizes the appropriate custom subgraph for each network, enabling the desired level of customization. If the mentioned scenario doesn't apply to your situation, there is another approach you can take. Similar to the previously mentioned examples, you can modify the `.env` file by updating the key labeled `NEXT_PUBLIC_SUBGRAPH_URI`. By making changes to this key, you can configure the ocean market to utilize your preferred subgraph deployment. This alternative method allows you to customize the market's behavior and ensure it utilizes the desired subgraph, even if you don't require different subgraph deployments for each network. - -\ diff --git a/developers/build-a-marketplace/deploying-market.md b/developers/build-a-marketplace/deploying-market.md index e8888f69..b002249b 100644 --- a/developers/build-a-marketplace/deploying-market.md +++ b/developers/build-a-marketplace/deploying-market.md @@ -35,8 +35,8 @@ surge If this is your first time using surge, you will be prompted to enter an email address and password to create a free account. It will ask you to confirm the directory that it is about to publish, check that you are in the market/public/ directory and press enter to proceed. Now it gives you the option to choose the domain that you want your project to be available on. -
[Figure: surge interaction]
- We have chosen https://crypto-photos.surge.sh which is a free option. You can also set a CNAME value in your DNS to make use of your own custom domain. +We have chosen https://crypto-photos.surge.sh which is a free option. You can also set a CNAME value in your DNS to make use of your own custom domain. After a few minutes, your upload will be complete, and you’re ready to share your data marketplace. You can view the version we created in this guide [here](https://crypto-photos.surge.sh/). diff --git a/developers/build-a-marketplace/forking-ocean-market.md b/developers/build-a-marketplace/forking-ocean-market.md index b6dcbb73..6a4864af 100644 --- a/developers/build-a-marketplace/forking-ocean-market.md +++ b/developers/build-a-marketplace/forking-ocean-market.md @@ -44,12 +44,12 @@ npm start The above command will build the development bundle and run it locally. -
[Figure: Forking Ocean Market]
Great news - your marketplace has successfully been built and is now running locally. Let’s check it out! Open your browser and navigate to http://localhost:8000/. You’ll see that you have a full-on clone of Ocean Market running locally. Give it a go and test out publishing and consuming assets - everything works! That’s all that’s required to get a clone of Ocean market working. The whole process is made simple because your clone can happily use all the smart contracts and backend components that are maintained by Ocean Protocol Foundation. -
[Figure: Forking Ocean Market]
So you’ve got a fully functioning marketplace at this point, which is pretty cool. But it doesn’t really look like your marketplace. Right now, it’s still just a clone of Ocean Market - the same branding, name, logo, etc. The next few steps focus on personalizing your marketplace. diff --git a/developers/community-monetization.md b/developers/community-monetization.md index 649f52cb..a3ceea85 100644 --- a/developers/community-monetization.md +++ b/developers/community-monetization.md @@ -41,9 +41,7 @@ The download and compute fees can both be set to any absolute amount and you can Additionally, provider fees are not limited to data consumption — they can also be used to charge for compute resources. So, for example, this means a provider can charge a fixed fee of 15 DAI to reserve compute resources for 1 hour. This has a huge upside for both the user and the provider host. From the user’s perspective, this means that they can now reserve a suitable amount of compute resources according to what they require. For the host of the provider, this presents another great opportunity to create an income. - -**Benefits to the Ocean Community**\ +**Benefits to the Ocean Community** We’re always looking to give back to the Ocean community and collecting fees is an important part of that. As mentioned above, the Ocean Protocol Foundation retains the ability to implement community fees on data consumption. The tokens that we receive will either be burned or invested in the community via projects that they are building. These investments will take place either through [Data Farming](../rewards/df-intro.md), [Ocean Shipyard](https://oceanprotocol.com/shipyard), or Ocean Ventures. -Additionally, we will also be placing an additional 0.1% fee on projects that aren’t using either the Ocean token or H2O. We want to support marketplaces that use other tokens but we also recognize that they don’t bring the same wider benefit to the Ocean community, so we feel this small additional fee is proportionate. - +Projects that utilize the Ocean token or H2O are subject to a 0.1% fee. In the case of projects that opt to use different tokens, an additional 0.1% fee will be applied. We want to support marketplaces that use other tokens but we also recognize that they don’t bring the same wider benefit to the Ocean community, so we feel this small additional fee is proportionate. diff --git a/developers/compute-to-data/README.md b/developers/compute-to-data/README.md index 33195fd2..e2d1e526 100644 --- a/developers/compute-to-data/README.md +++ b/developers/compute-to-data/README.md @@ -18,18 +18,7 @@ Private data has the potential to drive groundbreaking discoveries in science an The Ocean Protocol provides a compute environment that you can access at the following address: [https://stagev4.c2d.oceanprotocol.com/](https://stagev4.c2d.oceanprotocol.com/). Feel free to explore and utilize this platform for your needs. 
{% endhint %} -We suggest reading these guides to get an understanding on how compute-to-data works: - -### User Guides - -* [How to write compute to data algorithms](../../user-guides/compute-to-data/make-a-boss-c2d-algorithm.md) -* [How to publish a compute to data algorithm](../../user-guides/compute-to-data/publish-a-c2d-algorithm-nft.md) -* [How to publish a dataset for compute to data](../../user-guides/compute-to-data/publish-a-c2d-data-nft.md) - -### Developer Guides - -* [How to use compute to data with ocean.js](../ocean.js/cod-asset.md) -* [How to use compute to data with ocean.py](../ocean.py/compute-flow.md) +We suggest reading these guides to get an understanding of how compute-to-data works: ### Architecture & Overview Guides @@ -38,6 +27,17 @@ We suggest reading these guides to get an understanding on how compute-to-data w * [Writing Algorithms](compute-to-data-algorithms.md) * [Compute options](compute-options.md) +### User Guides + +* [How to write compute to data algorithms](../../user-guides/compute-to-data/make-a-boss-c2d-algorithm.md) +* [How to publish a compute-to-data algorithm](../../user-guides/compute-to-data/publish-a-c2d-algorithm-nft.md) +* [How to publish a dataset for compute to data](../../user-guides/compute-to-data/publish-a-c2d-data-nft.md) + +### Developer Guides + +* [How to use compute to data with ocean.js](../ocean.js/cod-asset.md) +* [How to use compute to data with ocean.py](../ocean.py/compute-flow.md) + ### Infrastructure Deployment Guides * [Minikube Environment](../../infrastructure/compute-to-data-minikube.md) diff --git a/developers/compute-to-data/compute-options.md b/developers/compute-to-data/compute-options.md index 7ac2fe8e..ec097945 100644 --- a/developers/compute-to-data/compute-options.md +++ b/developers/compute-to-data/compute-options.md @@ -8,7 +8,7 @@ description: Specification of compute options for assets in Ocean Protocol. ### Compute Options -An asset categorized as a `compute type` incorporates additional attributes under the `compute object`. +An asset categorized as a `compute type` incorporates additional attributes under the `compute object`. These attributes are specifically relevant to assets that fall within the compute category and are not required for assets classified under the `access type`. However, if an asset is designated as `compute`, it is essential to include these attributes to provide comprehensive information about the compute service associated with the asset. @@ -18,7 +18,7 @@ These attributes are specifically relevant to assets that fall within the comput ### Trusted Algorithms -The `publisherTrustedAlgorithms` is an array of objects that specifies algorithm permissions. It controls which algorithms can be used for computation. If not defined, any published algorithm is allowed. If the array is empty, no algorithms are allowed. However, if the array is not empty, only algorithms published by the defined publishers are permitted. +The `publisherTrustedAlgorithms` is an array of objects that specifies algorithm permissions. It controls which algorithms can be used for computation. If not defined, any published algorithm is allowed. If the array is empty, no algorithms are allowed. However, if the array is not empty, only algorithms published by the defined publishers are permitted. 
The structure of each object within the `publisherTrustedAlgorithms` array is as follows: diff --git a/developers/compute-to-data/compute-to-data-algorithms.md b/developers/compute-to-data/compute-to-data-algorithms.md index c08b5c86..f9d14c64 100644 --- a/developers/compute-to-data/compute-to-data-algorithms.md +++ b/developers/compute-to-data/compute-to-data-algorithms.md @@ -5,11 +5,6 @@ description: >- feature. --- -# Writing Algorithms - -### Overview - -\ In the Ocean Protocol stack, algorithms are recognized as distinct asset types, alongside datasets. When it comes to Compute-to-Data, an algorithm comprises the following key components: * **Algorithm Code**: The algorithm code refers to the specific instructions and logic that define the computational steps to be executed on a dataset. It encapsulates the algorithms' functionalities, calculations, and transformations. @@ -34,7 +29,7 @@ When creating an algorithm asset in Ocean Protocol, it is essential to include t -
| Variable | Usage |
| --- | --- |
| `image` | The Docker image name the algorithm will run with. |
| `tag` | The Docker image tag that you are going to use. |
| `entrypoint` | The Docker entrypoint. `$ALGO` is a macro that gets replaced inside the compute job, depending where your algorithm code is downloaded. |
Define your entry point according to your dependencies. E.g. if you have multiple versions of Python installed, use the appropriate command `python3.6 $ALGO`. @@ -97,7 +92,7 @@ Please note that when using local Providers or Metatata Caches, the ddos might n For every algorithm pod, the Compute to Data environment provides the following environment variables: -
| Variable             | Usage                                                   |
| -------------------- | ------------------------------------------------------- |
| `DIDS`               | An array of DID strings containing the input datasets.  |
| `TRANSFORMATION_DID` | The DID of the algorithm.                               |
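To make this concrete, the sketch below shows how an algorithm written for Node.js might read these variables inside the pod. The `/data/inputs` and `/data/outputs` paths and the per-DID file layout are assumptions based on the usual C2D pod conventions, so adjust them to your environment.

```js
// Minimal sketch of an algorithm reading the C2D environment variables inside the pod.
// The /data/inputs and /data/outputs paths follow the usual C2D layout and are assumptions.
const fs = require("fs");

const dids = JSON.parse(process.env.DIDS || "[]"); // input dataset DIDs
const algoDid = process.env.TRANSFORMATION_DID;    // DID of this algorithm

console.log(`Algorithm ${algoDid} received ${dids.length} input dataset(s)`);

for (const did of dids) {
  // Assumed layout: the first file of each input dataset is mounted at /data/inputs/<did>/0
  const raw = fs.readFileSync(`/data/inputs/${did}/0`, "utf8");
  fs.appendFileSync("/data/outputs/result.txt", `Read ${raw.length} characters from ${did}\n`);
}
```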
@@ -248,13 +243,13 @@ To run this algorithm, use the following `container` object: An asset of type `algorithm` has additional attributes under `metadata.algorithm`, describing the algorithm and the Docker environment it is supposed to be run under. -
| Attribute            | Type                | Description                                                                   |
| -------------------- | ------------------- | ----------------------------------------------------------------------------- |
| `language`           | `string`            | Language used to implement the software.                                       |
| `version`            | `string`            | Version of the software preferably in SemVer notation. E.g. `1.0.0`.           |
| `consumerParameters` | Consumer Parameters | An object that defines required consumer input before running the algorithm.   |
| `container`\*        | `container`         | Object describing the Docker container image. See below.                       |
\* Required The `container` object has the following attributes defining the Docker image for running the algorithm: -
| Attribute      | Type     | Description                                                        |
| -------------- | -------- | ------------------------------------------------------------------ |
| `entrypoint`\* | `string` | The command to execute, or script to run inside the Docker image.  |
| `image`\*      | `string` | Name of the Docker image.                                           |
| `tag`\*        | `string` | Tag of the Docker image.                                            |
| `checksum`\*   | `string` | Digest of the Docker image. (ie: sha256:xxxxx)                      |
\* Required diff --git a/developers/compute-to-data/compute-to-data-architecture.md b/developers/compute-to-data/compute-to-data-architecture.md index 6584ed91..e95085d3 100644 --- a/developers/compute-to-data/compute-to-data-architecture.md +++ b/developers/compute-to-data/compute-to-data-architecture.md @@ -5,7 +5,17 @@ description: Architecture overview # Architecture -### Architecture Overview +Compute-to-Data (C2D) is a cutting-edge data processing paradigm that enables secure and privacy-preserving computation on sensitive datasets. + +In the C2D workflow, the following steps are performed: + +1. The consumer initiates a compute-to-data job by selecting the desired data asset and algorithm, and then, the orders are validated via the dApp used. +2. A dedicated and isolated execution pod is created for the C2D job. +3. The execution pod loads the specified algorithm into its environment. +4. The execution pod securely loads the selected dataset for processing. +5. The algorithm is executed on the loaded dataset within the isolated execution pod. +6. The results and logs generated by the algorithm are securely returned to the user. +7. The execution pod deletes the dataset, algorithm, and itself to ensure data privacy and security.
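To connect these steps to code, the sketch below shows roughly how a consumer could kick off a C2D job with ocean.js. The `ProviderInstance.computeStart` call and the argument shapes are assumptions based on ocean.js' Provider helpers, so verify them against the ocean.js repository before relying on them.

```js
// Rough sketch only: the method name and argument shapes are assumptions based on the
// ocean.js Provider helpers and must be checked against the ocean.js repository.
const { ProviderInstance } = require("@oceanprotocol/lib");

async function startComputeJob(signer, providerUri, computeEnvId, datasetDid, datasetServiceId, algoDid, algoServiceId) {
  const dataset = { documentId: datasetDid, serviceId: datasetServiceId }; // already-ordered dataset (step 1)
  const algorithm = { documentId: algoDid, serviceId: algoServiceId };     // already-ordered algorithm (step 1)

  // The Provider forwards the request to the Operator stack, which runs steps 2-7 above.
  const job = await ProviderInstance.computeStart(providerUri, signer, computeEnvId, dataset, algorithm);
  console.log("Compute job submitted:", job);
  return job;
}
```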

*Figure: Compute architecture overview*

@@ -14,7 +24,7 @@ The interaction between the Consumer and the Provider follows a specific workflo Throughout the computation process, the Consumer has the ability to check the status of the job by making a query to the Provider using the `getJobDetails(XXXX)` function, providing the job identifier (`XXXX`) as a reference. {% hint style="info" %} -You have the option to initiate a compute job using one or more data assets. You can explore this functionality by utilizing the [ocean.py](../ocean.py/) and [ocean.js](../ocean.js/) libraries. +You have the option to initiate a compute job using one or more data assets. You can explore this functionality by utilizing the [ocean.py](../ocean.py/README.md) and [ocean.js](../ocean.js/README.md) libraries. {% endhint %} Now, let's delve into the inner workings of the Provider. Initially, it verifies whether the Consumer has sent the appropriate datatokens to gain access to the desired data. Once validated, the Provider interacts with the Operator-Service, a microservice responsible for coordinating the job execution. The Provider submits a request to the Operator-Service, which subsequently forwards the request to the Operator-Engine, the actual compute system in operation. @@ -36,9 +46,9 @@ Before the flow can begin, these pre-conditions must be met: ### Access Control using Ocean Provider -Similar to the `access service`, the `compute service` within Ocean Protocol relies on the [Ocean Provider](../provider/), which is a crucial component managed by Publishers. The role of the Ocean Provider is to facilitate interactions with users and handle the fundamental aspects of a Publisher's infrastructure, enabling seamless integration into the Ocean Protocol ecosystem. It serves as the primary interface for direct interaction with the infrastructure where the data is located. +Similar to the `access service`, the `compute service` within Ocean Protocol relies on the [Ocean Provider](../provider/README.md), which is a crucial component managed by the asset Publishers. The role of the Ocean Provider is to facilitate interactions with users and handle the fundamental aspects of a Publisher's infrastructure, enabling seamless integration into the Ocean Protocol ecosystem. It serves as the primary interface for direct interaction with the infrastructure where the data is located. -The [Ocean Provider](../provider/) encompasses the necessary credentials to establish secure and authorized interactions with the underlying infrastructure. Initially, this infrastructure may be hosted in cloud providers, although it also has the flexibility to extend to on-premise environments if required. By encompassing the necessary credentials, the Ocean Provider ensures the smooth and controlled access to the infrastructure, allowing Publishers to effectively leverage the compute service within Ocean Protocol. +The [Ocean Provider](../provider/README.md) encompasses the necessary credentials to establish secure and authorized interactions with the underlying infrastructure. Initially, this infrastructure may be hosted in cloud providers, although it also has the flexibility to extend to on-premise environments if required. By encompassing the necessary credentials, the Ocean Provider ensures the smooth and controlled access to the infrastructure, allowing Publishers to effectively leverage the compute service within Ocean Protocol. 
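Returning to the status query mentioned above (`getJobDetails`), a consumer-side check could look roughly like this with ocean.js. This is a sketch: the `ProviderInstance.computeStatus` name and parameters are assumptions to be verified against the ocean.js source.

```js
// Sketch of querying a compute job's status through the Provider
// (method name and parameters are assumptions, not a verified API reference).
const { ProviderInstance } = require("@oceanprotocol/lib");

async function checkJobStatus(providerUri, consumerAddress, jobId) {
  // Conceptually the same query as getJobDetails(XXXX) described above.
  const status = await ProviderInstance.computeStatus(providerUri, consumerAddress, jobId);
  console.log("Job status:", status);
  return status;
}
```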
### Operator Service @@ -89,7 +99,7 @@ Upon the successful completion of its tasks, the Pod-Configuration gracefully co ### Pod Publishing -Pod Publishing is a command-line utility that seamlessly integrates with the Operator Service and Operator Engine within a Kubernetes-based compute infrastructure. It serves as a versatile tool for efficiently processing, logging, and uploading workflow outputs. By working in tandem with the Operator Service and Operator Engine, Pod Publishing streamlines the workflow management process, enabling easy and reliable handling of output data generated during computation tasks. Whether it's processing complex datasets or logging crucial information, Pod Publishing simplifies these tasks and enhances the overall efficiency of the compute infrastructure. +Pod Publishing is a command-line utility that seamlessly integrates with the Operator Service and Operator Engine within a Kubernetes-based compute infrastructure. It serves as a versatile tool for efficient processing, logging, and uploading workflow outputs. By working in tandem with the Operator Service and Operator Engine, Pod Publishing streamlines the workflow management process, enabling easy and reliable handling of output data generated during computation tasks. Whether it's processing complex datasets or logging crucial information, Pod Publishing simplifies these tasks and enhances the overall efficiency of the compute infrastructure. The primary functionality of Pod Publishing can be divided into three key areas: @@ -97,7 +107,7 @@ The primary functionality of Pod Publishing can be divided into three key areas: 2. **Role in Publishing Pod**: Within the compute infrastructure orchestrated by the Operator Engine on Kubernetes (K8s), Pod Publishing is integral to the Publishing Pod. The Publishing Pod handles the creation of new assets in the Ocean Protocol network after a workflow execution. 3. **Workflow Outputs Management**: Pod Publishing manages the storage of workflow outputs. Depending on configuration, it interacts with IPFS or AWS S3, and logs the processing steps. -Please note: - +{% hint style="info" %} * Pod Publishing does not provide storage capabilities; all state information is stored directly in the K8s cluster or the respective data storage solution (AWS S3 or IPFS). * The utility works in close coordination with the Operator Service and Operator Engine, but does not have standalone functionality. +{% endhint %} diff --git a/developers/compute-to-data/compute-to-data-datasets-algorithms.md b/developers/compute-to-data/compute-to-data-datasets-algorithms.md index 1df430bb..3d1a2f17 100644 --- a/developers/compute-to-data/compute-to-data-datasets-algorithms.md +++ b/developers/compute-to-data/compute-to-data-datasets-algorithms.md @@ -9,14 +9,14 @@ description: Datasets and Algorithms Compute-to-Data introduces a paradigm where datasets remain securely within the premises of the data holder, ensuring strict data privacy and control. Only authorized algorithms are granted access to operate on these datasets, subject to specific conditions, within a secure and isolated environment. In this context, algorithms are treated as valuable assets, comparable to datasets, and can be priced accordingly. This approach enables data holders to maintain control over their sensitive data while allowing for valuable computations to be performed on them, fostering a balanced and secure data ecosystem. 
-To define the accessibility of algorithms, their classification as either public or private can be specified by setting the `attributes.main.type` value in the Decentralized Data Object (DDO): +To define the accessibility of algorithms, their classification as either public or private can be specified by setting the `attributes.main.type` value in the Decentralized Data Object (DDO): * `"access"` - public. The algorithm can be downloaded, given appropriate datatoken. * `"compute"` - private. The algorithm is only available to use as part of a compute job without any way to download it. The Algorithm must be published on the same Ocean Provider as the dataset it's targeted to run on. This flexibility allows for fine-grained control over algorithm usage, ensuring data privacy and enabling fair pricing mechanisms within the Compute-to-Data framework. -For each dataset, Publishers have the flexibility to define permission levels for algorithms to execute on their datasets, offering granular control over data access. +For each dataset, Publishers have the flexibility to define permission levels for algorithms to execute on their datasets, offering granular control over data access. There are several options available for publishers to configure these permissions: diff --git a/developers/contracts/README.md b/developers/contracts/README.md index 6b26724b..3c911f83 100644 --- a/developers/contracts/README.md +++ b/developers/contracts/README.md @@ -4,7 +4,7 @@ description: Empowering the Decentralised Data Economy # Contracts -The [V4 release](https://blog.oceanprotocol.com/ocean-v4-overview-1ccd4a7ce150) of Ocean Protocol introduces a comprehensive and enhanced suite of s[mart contracts](https://github.com/oceanprotocol/contracts/tree/main/contracts) that serve as the backbone of the decentralized data economy. These contracts facilitate secure, transparent, and efficient interactions among data providers, consumers, and ecosystem participants. With the introduction of V4 contracts, Ocean Protocol propels itself forward, delivering substantial functionality, scalability, and flexibility advancements. +The [V4 release](https://blog.oceanprotocol.com/ocean-v4-overview-1ccd4a7ce150) of Ocean Protocol introduces a comprehensive and enhanced suite of [smart contracts](https://github.com/oceanprotocol/contracts/tree/main/contracts) that serve as the backbone of the decentralized data economy. These contracts facilitate secure, transparent, and efficient interactions among data providers, consumers, and ecosystem participants. With the introduction of V4 contracts, Ocean Protocol propels itself forward, delivering substantial functionality, scalability, and flexibility advancements. The V4 smart contracts have been deployed across multiple [networks](../../discover/networks/README.md) and are readily accessible through the GitHub [repository](https://github.com/oceanprotocol/contracts/tree/main/contracts). The V4 introduces significant enhancements that encompass the following key **features**: @@ -20,7 +20,8 @@ By utilizing ERC721 tokens, Ocean V4 **grants data creators greater flexibility

*Figure: Ocean Protocol V4 Smart Contracts*

-#### [**Community monetization**](../community-monetization.md), to help the community create sustainable businesses. + +### [**Community monetization**](../community-monetization.md), to help the community create sustainable businesses. Ocean V4 brings forth enhanced opportunities for marketplace operators, creating a conducive environment for the emergence of a thriving market of **third-party Providers**. diff --git a/developers/contracts/architecture.md b/developers/contracts/architecture.md index 12377249..1eb6c012 100644 --- a/developers/contracts/architecture.md +++ b/developers/contracts/architecture.md @@ -7,11 +7,11 @@ description: Ocean Protocol Architecture Adventure! Embark on an exploration of the innovative realm of Ocean Protocol, where data flows seamlessly and AI achieves new heights. Dive into the intricately layered architecture that converges data and services, fostering a harmonious collaboration. Let us delve deep and uncover the profound design of Ocean Protocol.🐬 -

*Figure: Overview of the Ocean Protocol Architecture*

+

*Figure: Overview of the Ocean Protocol Architecture*

### Layer 1: The Foundational Blockchain Layer -At the core of Ocean Protocol lies the robust [Blockchain Layer](../contracts/README.md). Powered by blockchain technology, this layer ensures secure and transparent transactions. It forms the bedrock of decentralized trust, where data providers and consumers come together to trade valuable assets. +At the core of Ocean Protocol lies the robust [Blockchain Layer](../contracts/README.md). Powered by blockchain technology, this layer ensures secure and transparent transactions. It forms the bedrock of decentralized trust, where data providers and consumers come together to trade valuable assets. The [smart contracts](../contracts/README.md) are deployed on the Ethereum mainnet and other compatible [networks](../../discover/networks/README.md). The libraries encapsulate the calls to these smart contracts and provide features like publishing new assets, facilitating consumption, managing pricing, and much more. To explore the contracts in more depth, go ahead to the [contracts](../contracts/README.md) section. @@ -23,7 +23,7 @@ Above the smart contracts, you'll find essential [libraries](architecture.md#lib These libraries include [Ocean.js](../ocean.js/README.md), a JavaScript library, and [Ocean.py](../ocean.py/README.md), a Python library. They serve as powerful tools for developers, enabling integration and interaction with the protocol. -1. [Ocean.js](../ocean.js/README.md): Ocean.js is a JavaScript library that serves as a powerful tool for developers looking to integrate their applications with the Ocean Protocol ecosystem. Designed to facilitate interaction with the protocol, Ocean.js provides a comprehensive set of functionalities, including data tokenization, asset management, and smart contract interaction. Ocean.js simplifies the process of implementing data access controls, building dApps, and exploring data sets within a decentralized environment. +1. [Ocean.js](../ocean.js/README.md): Ocean.js is a JavaScript library that serves as a powerful tool for developers looking to integrate their applications with the Ocean Protocol ecosystem. Designed to facilitate interaction with the protocol, Ocean.js provides a comprehensive set of functionalities, including data tokenization, asset management, and smart contract interaction. Ocean.js simplifies the process of implementing data access controls, building dApps, and exploring data sets within a decentralized environment. 2. [Ocean.py](../ocean.py/README.md): Ocean.py is a Python library that empowers developers to integrate their applications with the Ocean Protocol ecosystem. With its rich set of functionalities, Ocean.py provides a comprehensive toolkit for interacting with the protocol. Developers and [data scientists](../../data-science/README.md) can leverage Ocean.py to perform a wide range of tasks, including data tokenization, asset management, and smart contract interactions. This library serves as a bridge between Python and the decentralized world of Ocean Protocol, enabling you to harness the power of decentralized data. #### Middleware components @@ -40,9 +40,9 @@ Additionally, in supporting the discovery process, middleware components come in ### Layer 3: The Accessible Application Layer -Here, the ocean comes alive with a vibrant ecosystem of dApps, marketplaces, and more. This layer hosts a variety of user-friendly interfaces, applications, and tools, inviting data scientists and curious explorers alike to access, explore, and contribute to the ocean's treasures. 
+Here, the ocean comes alive with a vibrant ecosystem of dApps, marketplaces, and more. This layer hosts a variety of user-friendly interfaces, applications, and tools, inviting data scientists and curious explorers alike to access, explore, and contribute to the ocean's treasures. -Prominently featured within this layer is [Ocean Market](../../user-guides/using-ocean-market.md), a hub where data enthusiasts and industry stakeholders converge to discover, trade, and unlock the inherent value of data assets. Beyond Ocean Market, the Application Layer hosts a diverse ecosystem of specialized applications and marketplaces, each catering to unique use cases and industries. Empowered by the capabilities of Ocean Protocol, these applications facilitate advanced data exploration, analytics, and collaborative ventures, revolutionizing the way data is accessed, shared, and monetized. +Prominently featured within this layer is [Ocean Market](../../user-guides/using-ocean-market.md), a hub where data enthusiasts and industry stakeholders converge to discover, trade, and unlock the inherent value of data assets. Beyond Ocean Market, the Application Layer hosts a diverse ecosystem of specialized applications and marketplaces, each catering to unique use cases and industries. Empowered by the capabilities of Ocean Protocol, these applications facilitate advanced data exploration, analytics, and collaborative ventures, revolutionizing the way data is accessed, shared, and monetized. ### Layer 4: The Friendly Wallets diff --git a/developers/contracts/data-nfts.md b/developers/contracts/data-nfts.md index 1557d821..42f2a04b 100644 --- a/developers/contracts/data-nfts.md +++ b/developers/contracts/data-nfts.md @@ -35,9 +35,9 @@ We have implemented data NFTs using the [ERC721 standard](https://erc721.org/). ERC721 tokens are non-fungible, and thus cannot be used for automatic price discovery like ERC20 tokens. ERC721 and ERC20 combined together can be used for sub-licensing. Ocean Protocol's [ERC721Template](https://github.com/oceanprotocol/contracts/blob/v4main/contracts/templates/ERC721Template.sol) solves this problem by using ERC721 for tokenizing the **Base IP** and tokenizing sub-licenses by using ERC20. To save gas fees, it uses [ERC1167](https://eips.ethereum.org/EIPS/eip-1167) proxy approach on the **ERC721 template**. -Our implementation has been built on top of the battle-tested [OpenZeppelin contract library](https://docs.openzeppelin.com/contracts/4.x/erc721). However, there are a bunch of interesting parts of our implementation that go a bit beyond an out-of-the-box NFT. The data NFTs can be easily managed from any NFT marketplace like [OpenSea](https://opensea.io/). +Our implementation has been built on top of the battle-tested [OpenZeppelin contract library](https://docs.openzeppelin.com/contracts/4.x/erc721). However, there are a bunch of interesting parts of our implementation that go a bit beyond an out-of-the-box NFT. The data NFTs can be easily managed from any NFT marketplace like [OpenSea](https://opensea.io/). -

*Figure: Data NFT on OpenSea*

+

*Figure: Data NFT on OpenSea*

Ocean V4’s data NFT factory can deploy different types of data NFTs based on a variety of templates. Some templates could be tuned for data unions, others for DeFi, and others yet for enterprise use cases. diff --git a/developers/contracts/datanft-and-datatoken.md b/developers/contracts/datanft-and-datatoken.md index 9e3f26cf..ed0eb505 100644 --- a/developers/contracts/datanft-and-datatoken.md +++ b/developers/contracts/datanft-and-datatoken.md @@ -7,15 +7,15 @@ description: >- # Data NFTs and Datatokens -

*Figure: Data NFTs and Datatokens*

+

*Figure: Data NFTs and Datatokens*

-In summary: A [**data NFT**](data-nfts.md) serves as a **representation of the copyright** or exclusive license for a data asset on the blockchain, known as the "\[**base IP**]\(../../discover/glossary.md#base ip)". **Datatokens**, on the other hand, function as a crucial mechanism for **decentralized access** to data assets. +In summary: A [**data NFT**](data-nfts.md) serves as a **representation of the copyright** or exclusive license for a data asset on the blockchain, known as the [**base IP**](../../discover/glossary.md). **Datatokens**, on the other hand, function as a crucial mechanism for **decentralized access** to data assets. -For a specific data NFT, multiple ERC20 datatoken contracts can exist. Here's the main concept: Owning 1.0 datatokens grants you the ability to **consume** the corresponding dataset. Essentially, it acts as a **sub-license** from the \[base IP]\(../../discover/glossary.md#base ip), allowing you to utilize the dataset according to the specified license terms (when provided by the publisher). License terms can be established with a "good default" or by the Data NFT owner. +For a specific data NFT, multiple ERC20 datatoken contracts can exist. Here's the main concept: Owning 1.0 datatokens grants you the ability to **consume** the corresponding dataset. Essentially, it acts as a **sub-license** from the [base IP](../../discover/glossary.md), allowing you to utilize the dataset according to the specified license terms (when provided by the publisher). License terms can be established with a "good default" or by the Data NFT owner. The choice to employ the ERC20 fungible token standard for datatokens is logical, as licenses themselves are fungible. This standard ensures compatibility and interoperability of datatokens with ERC20-based wallets, decentralized exchanges (DEXes), decentralized autonomous organizations (DAOs), and other relevant platforms. Datatokens can be transferred, acquired through marketplaces or exchanges, distributed via airdrops, and more. -You can \[publish]\(../../discover/glossary.md#to publish) a data NFT initially with no ERC20 datatoken contracts. This means you simply aren’t ready to grant access to your data asset yet (sub-license it). Then, you can publish one or more ERC20 datatoken contracts against the data NFT. One datatoken contract might grant consume rights for **1 day**, another for **1 week**, etc. Each different datatoken contract is for **different** license terms. +You can [publish](../../discover/glossary.md) a data NFT initially with no ERC20 datatoken contracts. This means you simply aren’t ready to grant access to your data asset yet (sub-license it). Then, you can publish one or more ERC20 datatoken contracts against the data NFT. One datatoken contract might grant consume rights for **1 day**, another for **1 week**, etc. Each different datatoken contract is for **different** license terms. For a more comprehensive exploration of intellectual property and its practical connections with ERC721 and ERC20, you can read the blog post written by [Trent McConaghy](http://www.trent.st/), co-founder of Ocean Protocol. It delves into the subject matter in detail and provides valuable insights. @@ -30,7 +30,27 @@ For a more comprehensive exploration of intellectual property and its practical What happends under the hood? 🤔 -

*Figure: Data NFT & Datatokens flow*

+Publishing with V4 smart contracts in Ocean Protocol involves a well-defined process that streamlines the publishing of data assets. It provides a systematic approach to ensure efficient management and exchange of data within the Ocean Protocol ecosystem. By leveraging smart contracts, publishers can securely create and deploy data NFTs, allowing them to tokenize and represent their data assets. Additionally, the flexibility of V4 smart contracts enables publishers to define pricing schemas for datatokens, facilitating fair and transparent transactions. This publishing framework empowers data publishers by providing them with greater control and access to a global marketplace, while ensuring trust, immutability, and traceability of their published data assets. + +The V4 smart contracts publishing includes the following steps: + +1. The data publisher initiates the creation of a new data NFT. +2. The data NFT factory deploys the template for the new data NFT. +3. The data NFT template creates the data NFT contract. +4. The address of the newly created data NFT is available to the data publisher. +5. The publisher is now able to create datatokens with pricing schema for the data NFT. To accomplish this, the publisher initiates a call to the data NFT contract, specifically requesting the creation of a new datatoken with a fixed rate schema. +6. The data NFT contract deploys a new datatoken and a fixed rate schema by interacting with the datatoken template contract. +7. The datatoken contract is created (Datatoken-1 contract). +8. The datatoken template generates a new fixed rate schema for Datatoken-1. +9. The address of Datatoken-1 is now available to the data publisher. +10. Optionally, the publisher can create a new datatoken (Datatoken-2) with a free price schema. +11. The data NFT contract interacts with the Datatoken Template contract to create a new datatoken and a dispenser schema. +12. The datatoken templated deploys the Datatoken-2 contract. +13. The datatoken templated creates a dispenser for the Datatoken-2 contract. + +Below is a visual representation that illustrates the flow: + +

*Figure: Data NFT & Datatokens flow*

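As a rough illustration of steps 1 to 4, deploying the data NFT through the factory with ocean.js could look like the sketch below. The class and parameter names are assumptions drawn from ocean.js' NFT factory helpers; the hands-on guides referenced next cover the full, authoritative flow, including the datatoken and pricing steps 5 to 13.

```js
// Sketch only: parameter names are assumptions; see the ocean.js guides for the exact API.
const { NftFactory } = require("@oceanprotocol/lib");

async function createDataNft(signer, nftFactoryAddress) {
  const factory = new NftFactory(nftFactoryAddress, signer);

  // Steps 1-4: ask the factory to clone the ERC721 template and deploy a new data NFT.
  const nftParams = {
    name: "My Data NFT",
    symbol: "MYNFT",
    templateIndex: 1,  // which data NFT template to use
    tokenURI: "",
    transferable: true,
    owner: await signer.getAddress()
  };

  const nftAddress = await factory.createNFT(nftParams);
  console.log("Data NFT deployed at:", nftAddress);
  return nftAddress;
}
```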
We have some awesome hands-on experience when it comes to publishing a data NFT and minting datatokens. diff --git a/developers/contracts/datatoken-templates.md b/developers/contracts/datatoken-templates.md index 3c9afca6..2403835f 100644 --- a/developers/contracts/datatoken-templates.md +++ b/developers/contracts/datatoken-templates.md @@ -21,7 +21,7 @@ The details regarding currently supported **datatoken templates** are as follows ### **Regular template** -The regular template allows users to buy/sell/hold datatokens. The datatokens can be minted by the address having a [`MINTER`](roles.md#minter) role, making the supply of datatoken variable. This template is assigned _**`templateId`**_` ``= 1` and the source code is available [here](https://github.com/oceanprotocol/contracts/blob/v4main/contracts/templates/ERC20Template.sol). +The regular template allows users to buy/sell/hold datatokens. The datatokens can be minted by the address having a [`MINTER`](roles.md#minter) role, making the supply of datatoken variable. This template is assigned _**`templateId =`**_`1` and the source code is available [here](https://github.com/oceanprotocol/contracts/blob/v4main/contracts/templates/ERC20Template.sol). ### **Enterprise template** @@ -29,11 +29,11 @@ The enterprise template has additional functions apart from methods in the ERC20 #### Set the template -When you're creating an ERC20 datatoken, you can specify the desired template by passing on the template index. +When you're creating an ERC20 datatoken, you can specify the desired template by passing on the template index. {% tabs %} {% tab title="Ocean.js" %} -To specify the datatoken template via ocean.js, you need to customize the [DatatokenCreateParams](https://github.com/oceanprotocol/ocean.js/blob/ae2ff1ccde53ace9841844c316a855de271f9a3f/src/%40types/Datatoken.ts#L3) with your desired `templateIndex`. +To specify the datatoken template via ocean.js, you need to customize the [DatatokenCreateParams](https://github.com/oceanprotocol/ocean.js/blob/ae2ff1ccde53ace9841844c316a855de271f9a3f/src/%40types/Datatoken.ts#L3) with your desired `templateIndex`. The default template used is 1. @@ -53,7 +53,7 @@ export interface DatatokenCreateParams { {% endtab %} {% tab title="Ocean.py" %} -To specify the datatoken template via ocean.py, you need to customize the [DatatokenArguments](https://github.com/oceanprotocol/ocean.py/blob/bad11fb3a4cb00be8bab8febf3173682e1c091fd/ocean_lib/models/datatoken_base.py#L64) with your desired template\_index. +To specify the datatoken template via ocean.py, you need to customize the [DatatokenArguments](https://github.com/oceanprotocol/ocean.py/blob/bad11fb3a4cb00be8bab8febf3173682e1c091fd/ocean_lib/models/datatoken_base.py#L64) with your desired template\_index. The default template used is 1. @@ -78,7 +78,7 @@ class DatatokenArguments: {% endtabs %} {% hint style="info" %} -By default, all assets published through the Ocean Market use the Enterprise Template. +By default, all assets published through the Ocean Market use the Enterprise Template. {% endhint %} #### Retrieve the template @@ -90,11 +90,12 @@ To identify the template used for a specific asset, you can easily retrieve this 3. Once you have located the datatoken address, click on the contract tab to access more details. 4. Within the contract details, we can identify and determine the template used for the asset. 
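Another option, shown here as an assumption-level sketch rather than official guidance, is to read the template straight from the datatoken contract, which exposes a `getId()` helper in Ocean's ERC20 templates (1 for the regular template, 2 for the enterprise template).

```js
// Sketch: reading the template ID directly from a datatoken contract with ethers.js (v5).
// Assumes the datatoken implements getId(), as Ocean's ERC20 templates do.
const { ethers } = require("ethers");

async function getTemplateId(datatokenAddress, rpcUrl) {
  const provider = new ethers.providers.JsonRpcProvider(rpcUrl);
  const abi = ["function getId() view returns (uint8)"];
  const datatoken = new ethers.Contract(datatokenAddress, abi, provider);

  const templateId = await datatoken.getId(); // 1 = regular template, 2 = enterprise template
  console.log(`Datatoken ${datatokenAddress} uses templateId ${templateId}`);
  return templateId;
}
```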
- + -We like making things easy :sunglasses: so here is an even easier way to retrieve the info for [this](https://market.oceanprotocol.com/asset/did:op:cd086344c275bc7c560e91d472be069a24921e73a2c3798fb2b8caadf8d245d6) asset published in the Ocean Market: +We like making things easy :sunglasses: so here is an even easier way to retrieve the info for [this](https://market.oceanprotocol.com/asset/did:op:cd086344c275bc7c560e91d472be069a24921e73a2c3798fb2b8caadf8d245d6) asset published in the Ocean Market: -{% @arcade/embed flowId="wxBPSc42eSYUiawSY8rC" url="https://app.arcade.software/share/wxBPSc42eSYUiawSY8rC" %} +{% embed url="https://app.arcade.software/share/wxBPSc42eSYUiawSY8rC" fullWidth="false" %} +{% endembed %} {% hint style="info" %} _It's important to note that Ocean Protocol may introduce new templates to support additional variations of data NFTs and datatokens in the future._ diff --git a/developers/contracts/datatokens.md b/developers/contracts/datatokens.md index d816d442..749bb0b6 100644 --- a/developers/contracts/datatokens.md +++ b/developers/contracts/datatokens.md @@ -6,7 +6,7 @@ description: ERC20 datatokens represent licenses to access the assets. Fungible tokens are a type of digital asset that are identical and interchangeable with each other. Each unit of a fungible token holds the same value and can be exchanged on a one-to-one basis. This means that one unit of a fungible token is indistinguishable from another unit of the same token. Examples of fungible tokens include cryptocurrencies like Bitcoin (BTC) and Ethereum (ETH), where each unit of the token is equivalent to any other unit of the same token. Fungible tokens are widely used for transactions, trading, and as a means of representing value within blockchain-based ecosystems. -## What is a Datatoken? +## What is a Datatoken? Datatokens are fundamental within Ocean Protocol, representing a key mechanism to **access** data assets in a decentralized manner. In simple terms, a datatoken is an **ERC20-compliant token** that serves as **access control** for a data/service represented by a [data NFT](data-nfts.md). @@ -16,7 +16,7 @@ By using datatokens, data owners can retain ownership and control over their dat ### Understanding Datatokens and Licenses -Each datatoken represents a [**sub-license**](../../discover/glossary.md#sub-licensee) from the base intellectual property (IP) owner, enabling users to access and consume the associated dataset. The license terms can be set by the data NFT owner or default to a predefined "good default" license. The fungible nature of ERC20 tokens aligns perfectly with the fungibility of licenses, facilitating seamless exchangeability and interoperability between different datatokens. +Each datatoken represents a [**sub-license**](../../discover/glossary.md) from the base intellectual property (IP) owner, enabling users to access and consume the associated dataset. The license terms can be set by the data NFT owner or default to a predefined "good default" license. The fungible nature of ERC20 tokens aligns perfectly with the fungibility of licenses, facilitating seamless exchangeability and interoperability between different datatokens. 
By adopting the ERC20 standard for datatokens, Ocean Protocol ensures compatibility and interoperability with a wide array of ERC20-based wallets, [decentralized exchanges (DEXes)](https://blog.oceanprotocol.com/ocean-datatokens-will-be-tradeable-on-decentrs-dex-41715a166a1f), decentralized autonomous organizations (DAOs), and other blockchain-based platforms. This standardized approach enables users to effortlessly transfer, purchase, exchange, or receive datatokens through various means such as marketplaces, exchanges, or airdrops. diff --git a/developers/contracts/fees.md b/developers/contracts/fees.md index c076826e..16a18065 100644 --- a/developers/contracts/fees.md +++ b/developers/contracts/fees.md @@ -25,17 +25,17 @@ However, if you're building a custom marketplace, you have the flexibility to in When a user exchanges a [datatoken](datatokens.md) for the privilege of downloading an asset or initiating a compute job that utilizes the asset, consume fees come into play. These fees are associated with accessing an asset and include: -1. **Publisher Market** Consumption Fee +1. **Publisher Market** Consumption Fee * Defined during the ERC20 [creation](https://github.com/oceanprotocol/contracts/blob/b937a12b50dc4bdb7a6901c33e5c8fa136697df7/contracts/templates/ERC721Template.sol#L334). * Defined as Address, Token, Amount. The amount is an absolute value(not a percentage). - * A marketplace can charge a specified amount per order. + * A marketplace can charge a specified amount per order. * Eg: A market can set a fixed fee of 10 USDT per order, no matter what pricing schemas are used (fixedrate with ETH, BTC, dispenser, etc). -2. **Consume Market** Consumption Fee - * A market can specify what fee it wants on the order function. -3. **Provider Consumption** Fees +2. **Consume Market** Consumption Fee + * A market can specify what fee it wants on the order function. +3. **Provider Consumption** Fees * Defined by the [Provider](../provider/README.md) for any consumption. * Expressed in: Address, Token, Amount (absolute), Timeout. - * You can retrieve them when calling the initialize endpoint. + * You can retrieve them when calling the initialize endpoint. * Eg: A provider can charge a fixed fee of 10 USDT per consume, irrespective of the pricing schema used (e.g., fixed rate with ETH, BTC, dispenser). 4. **Ocean Community** Fee * Ocean's smart contracts collect **Ocean Community fees** during order operations. These fees are reinvested in community projects and distributed to the veOcean holders through Data Farming. @@ -70,7 +70,7 @@ function updateOPCFee(uint256 _newSwapOceanFee, uint256 _newSwapNonOceanFee,
-Each of these fees plays a role in ensuring fair compensation and supporting the Ocean community. +Each of these fees plays a role in ensuring fair compensation and supporting the Ocean community. | Fee | Value in Ocean Market | Value in Other Markets | | ---------------- | :-------------------: | -------------------------------------------------------- | @@ -81,14 +81,14 @@ Each of these fees plays a role in ensuring fair compensation and supporting the ### Provider fee -[Providers](../provider/README.md) facilitate data consumption, initiate compute jobs, encrypt and decrypt DDOs, and verify user access to specific data assets or services. +[Providers](../provider/README.md) facilitate data consumption, initiate compute jobs, encrypt and decrypt DDOs, and verify user access to specific data assets or services. -Provider fees serve as [compensation](../community-monetization.md#3.-running-your-own-provider) to the individuals or organizations operating their own provider instances when users request assets. +Provider fees serve as [compensation](../community-monetization.md#3.-running-your-own-provider) to the individuals or organizations operating their own provider instances when users request assets. * Defined by the [Provider](../provider/README.md) for any consumption. * Expressed in: Address, Token, Amount (absolute), Timeout. -* You can retrieve them when calling the initialize endpoint. -* These fees can be set as a **fixed amount** rather than a percentage. +* You can retrieve them when calling the initialize endpoint. +* These fees can be set as a **fixed amount** rather than a percentage. * Providers have the flexibility to specify the token in which the fees must be paid, which can differ from the token used in the consuming market. * Provider fees can be utilized to charge for [computing](../compute-to-data/README.md) resources. Consumers can select the desired payment amount based on the compute resources required to execute an algorithm within the [Compute-to-Data](../compute-to-data/README.md) environment, aligning with their specific needs. * Eg: A provider can charge a fixed fee of 10 USDT per consume, irrespective of the pricing schema used (e.g., fixed rate with ETH, BTC, dispenser). diff --git a/developers/contracts/pricing-schemas.md b/developers/contracts/pricing-schemas.md index c33d5827..5ad3f76d 100644 --- a/developers/contracts/pricing-schemas.md +++ b/developers/contracts/pricing-schemas.md @@ -182,8 +182,8 @@ function createNftWithErc20WithDispenser( -To make the most of these pricing models, you can rely on user-friendly libraries such as [Ocean.js ](../ocean.js/)and [Ocean.py](../ocean.py/), specifically developed for interacting with Ocean Protocol. +To make the most of these pricing models, you can rely on user-friendly libraries such as [Ocean.js ](../ocean.js/README.md)and [Ocean.py](../ocean.py/README.md), specifically developed for interacting with Ocean Protocol. With Ocean.js, you can use the [createFRE() ](../ocean.js/publish.md)function to effortlessly deploy a data NFT (non-fungible token) and datatoken with a fixed-rate exchange pricing model. Similarly, in Ocean.py, the [create\_url\_asset()](../ocean.py/publish-flow.md#create-an-asset--pricing-schema-simultaneously) function allows you to create an asset with fixed pricing. These libraries simplify the process of interacting with Ocean Protocol, managing pricing, and handling asset creation. 
-By taking advantage of Ocean Protocol's pricing options and leveraging the capabilities of [Ocean.js](../ocean.js/) and [Ocean.py](../ocean.py/) (or by using the [Market](../../user-guides/using-ocean-market.md)), you can effectively monetize your data assets while ensuring transparent and seamless access for data consumers. +By taking advantage of Ocean Protocol's pricing options and leveraging the capabilities of [Ocean.js](../ocean.js/README.md) and [Ocean.py](../ocean.py/README.md) (or by using the [Market](../../user-guides/using-ocean-market.md)), you can effectively monetize your data assets while ensuring transparent and seamless access for data consumers. diff --git a/developers/contracts/revenue.md b/developers/contracts/revenue.md index dd61a138..84c84572 100644 --- a/developers/contracts/revenue.md +++ b/developers/contracts/revenue.md @@ -8,7 +8,7 @@ Having a [data NFT](data-nfts.md) that generates revenue continuously, even when

*Figure: Make it rain*

-By default, the revenue generated from a [data NFT](data-nfts.md) is directed to the [owner](roles.md#nft-owner) of the NFT. This arrangement automatically updates whenever the data NFT is transferred to a new owner. C +By default, the revenue generated from a [data NFT](data-nfts.md) is directed to the [owner](roles.md#nft-owner) of the NFT. This arrangement automatically updates whenever the data NFT is transferred to a new owner. However, there are scenarios where you may prefer the revenue to be sent to a different account instead of the owner. This can be accomplished by designating a new payment collector. This feature becomes particularly beneficial when the data NFT is owned by an organization or enterprise rather than an individual. @@ -20,7 +20,7 @@ In the case of [ERC20TemplateEnterprise](datatoken-templates.md#enterprise-templ On the other hand, with [ERC20Template](datatoken-templates.md#regular-template), for a fixed rate exchange, the revenue is available at the fixed rate exchange level. The owner or the payment collector has the authority to manually retrieve the revenue. {% endhint %} -There are several methods available for establishing a new **payment collector**. You have the option to utilize the ERC20Template/ERC20TemplateEnterprise contract directly. Another approach is to leverage the [ocean.py](../ocean.py/) and [ocean.js](../ocean.js/) libraries. Alternatively, you can employ the network explorer associated with your asset. Lastly, you can directly set it up within the Ocean Market. +There are several methods available for establishing a new **payment collector**. You have the option to utilize the ERC20Template/ERC20TemplateEnterprise contract directly. Another approach is to leverage the [ocean.py](../ocean.py/README.md) and [ocean.js](../ocean.js/README.md) libraries. Alternatively, you can employ the network explorer associated with your asset. Lastly, you can directly set it up within the Ocean Market. Here are some examples of how to set up a new payment collector using the mentioned methods: @@ -33,7 +33,7 @@ paymentCollectorAddress = 'New payment collector address' await datatoken.setPaymentCollector(datatokenAddress, callerAddress, paymentCollectorAddress) ``` -2. Using [Ocean.py](https://github.com/oceanprotocol/ocean.py/blob/bad11fb3a4cb00be8bab8febf3173682e1c091fd/ocean\_lib/models/test/test\_datatoken.py#L39). +2. Using [Ocean.py](https://github.com/oceanprotocol/ocean.py/blob/bad11fb3a4cb00be8bab8febf3173682e1c091fd/ocean_lib/models/test/test_datatoken.py#L39). ```python datatokenAddress = 'Your datatoken address' diff --git a/developers/contracts/roles.md b/developers/contracts/roles.md index bb9368bc..2380c57b 100644 --- a/developers/contracts/roles.md +++ b/developers/contracts/roles.md @@ -23,7 +23,7 @@ The NFT owner is the owner of the base-IP and is therefore at the highest level. With the exception of the NFT owner role, all other roles can be assigned to multiple users. {% endhint %} -There are several methods available to assign roles and permissions. One option is to utilize the [ocean.py](../ocean.py/) and [ocean.js](../ocean.js/) libraries that we provide. These libraries offer a streamlined approach for assigning roles and permissions programmatically. +There are several methods available to assign roles and permissions. One option is to utilize the [ocean.py](../ocean.py/README.md) and [ocean.js](../ocean.js/README.md) libraries that we provide. 
These libraries offer a streamlined approach for assigning roles and permissions programmatically. Alternatively, for a more straightforward solution that doesn't require coding, you can utilize the network explorer of your asset's network. By accessing the network explorer, you can directly interact with the contracts associated with your asset. Below, we provide a few examples to help guide you through the process. @@ -62,7 +62,8 @@ function removeManager(address _managerAddress) external onlyNFTOwner { The **manager** can assign or revoke three main roles (**deployer, metadata updater, and store updater**). The manager is also able to call any other contract (ERC725X implementation). -{% @arcade/embed flowId="qC8QpkLsFIQk3NxPzB8p" url="https://app.arcade.software/share/qC8QpkLsFIQk3NxPzB8p" %} +{% embed url="https://app.arcade.software/share/qC8QpkLsFIQk3NxPzB8p" fullWidth="false" %} +{% endembed %} ### Metadata Updater @@ -292,7 +293,8 @@ function removeMinter(address _minter) external onlyERC20Deployer { -{% @arcade/embed flowId="OHlwsPbf29S1PLh03FM7" url="https://app.arcade.software/share/OHlwsPbf29S1PLh03FM7" %} +{% embed url="https://app.arcade.software/share/OHlwsPbf29S1PLh03FM7" fullWidth="false" %} +{% endembed %} ### Fee Manager diff --git a/developers/ddo-specification.md b/developers/ddo-specification.md index 0d16bf78..ee758514 100644 --- a/developers/ddo-specification.md +++ b/developers/ddo-specification.md @@ -9,9 +9,9 @@ description: >- # DDO Specification -### DDO Schema - High Level +### DDO Schema - High Level -The below diagram shows the high-level DDO schema depicting the content of each data structure and the relations between them. +The below diagram shows the high-level DDO schema depicting the content of each data structure and the relations between them. Please note that some data structures apply only on certain types of services or assets. @@ -342,7 +342,16 @@ _Aquarius_ should always verify the checksum after data is decrypted via a _Prov Each asset has a state, which is held by the NFT contract. The possible states are: -
-| State | Description                      | Discoverable in Ocean Market | Ordering allowed | Listed under profile |
-| ----- | -------------------------------- | ---------------------------- | ---------------- | -------------------- |
-| 0     | Active                           | Yes                          | Yes              | Yes                  |
-| 1     | End-of-life                      | No                           | No               | No                   |
-| 2     | Deprecated (by another asset)    | No                           | No               | No                   |
-| 3     | Revoked by publisher             | No                           | No               | No                   |
-| 4     | Ordering is temporary disabled   | Yes                          | No               | Yes                  |
-| 5     | Asset unlisted.                  | No                           | Yes              | Yes                  |
+| State | Description                      | Discoverable in Ocean Market | Ordering allowed | Listed under profile |
+| ----- | -------------------------------- | ---------------------------- | ---------------- | -------------------- |
+| 0     | Active                           | Yes                          | Yes              | Yes                  |
+| 1     | End-of-life                      | Yes                          | No               | No                   |
+| 2     | Deprecated (by another asset)    | No                           | No               | No                   |
+| 3     | Revoked by publisher             | No                           | No               | No                   |
+| 4     | Ordering is temporarily disabled | Yes                          | No               | Yes                  |
+| 5     | Asset unlisted                   | No                           | Yes              | Yes                  |
+ +States details: + +1. **Active**: Assets in the "Active" state are fully functional and available for discovery in Ocean Market, and other components. Users can search for, view, and interact with these assets. Ordering is allowed, which means users can place orders to purchase or access the asset's services. +2. **End-of-life**: Assets in the "End-of-life" state remain discoverable but cannot be ordered. This state indicates that the assets are usually deprecated or outdated, and they are no longer actively promoted or maintained. +3. **Deprecated (by another asset)**: This state indicates that another asset has deprecated the current asset. Deprecated assets are not discoverable, and ordering is not allowed. Similar to the "End-of-life" state, deprecated assets are not listed under the owner's profile. +4. **Revoked by publisher**: When an asset is revoked by its publisher, it means that the publisher has explicitly revoked access or ownership rights to the asset. Revoked assets are not discoverable, and ordering is not allowed. +5. **Ordering is temporarily disabled**: Assets in this state are still discoverable, but ordering functionality is temporarily disabled. Users can view the asset and gather information, but they cannot place orders at that moment. However, these assets are still listed under the owner's profile. +6. **Asset unlisted**: Assets in the "Asset unlisted" state are not discoverable. However, users can still place orders for these assets, making them accessible. Unlisted assets are listed under the owner's profile, allowing users to view and access them. ### Aquarius Enhanced DDO Response @@ -492,7 +501,7 @@ Details for each of these are explained on the [Compute Options page](compute-to ### DDO Schema - Detailed -The below diagram shows the detailed DDO schema depicting the content of each data structure and the relations between them. +The below diagram shows the detailed DDO schema depicting the content of each data structure and the relations between them. Please note that some data structures apply only on certain types of services or assets. diff --git a/developers/fractional-ownership.md b/developers/fractional-ownership.md index e75f57b1..9112c8bc 100644 --- a/developers/fractional-ownership.md +++ b/developers/fractional-ownership.md @@ -13,15 +13,15 @@ Ocean offers two approaches to facilitate fractional ownership: 1. Sharded Holding of ERC20 Datatokens: Under this approach, each holder of ERC20 tokens possesses the typical datatoken rights outlined earlier. For instance, owning 1.0 datatoken allows consumption of a particular asset. Ocean conveniently provides this feature out of the box. 2. Sharding ERC721 Data NFT: This method involves dividing the ownership of an ERC721 data NFT among multiple individuals, granting each co-owner the right to a portion of the earnings generated from the underlying IP. Moreover, these co-owners collectively control the data NFT. For instance, a dedicated DAO may be established to hold the data NFT, featuring its own ERC20 token. DAO members utilize their tokens to vote on updates to data NFT roles or the deployment of ERC20 datatokens associated with the ERC721. -It's worth noting that for the second approach, one might consider utilizing platforms like Niftex for sharding. However, important questions arise in this context: +It's worth noting that for the second approach, one might consider utilizing platforms like Niftex for sharding. 
However, important questions arise in this context: -* What specific rights do shard-holders possess? +* What specific rights do shard-holders possess? * It's possible that they have limited rights, just as Amazon shareholders don't have the authority to roam the hallways of Amazon's offices simply because they own shares -* Additionally, how do shard-holders exercise control over the data NFT? +* Additionally, how do shard-holders exercise control over the data NFT? These concerns are effectively addressed by employing a tokenized DAO, as previously described. -

*Figure: DAO*

+

*Figure: DAO*

Data DAOs present a fascinating use case whenever a group of individuals desires to collectively manage data or consolidate data for increased bargaining power. Such DAOs can take the form of unions, cooperatives, or trusts. diff --git a/developers/identifiers.md b/developers/identifiers.md index 7df031ce..10224f9c 100644 --- a/developers/identifiers.md +++ b/developers/identifiers.md @@ -36,7 +36,7 @@ console.log(did) ``` -Before creating a DID you should first publish a data NFT, we suggest reading the following sections so you are familiar with the process: +Before creating a DID you should first publish a data NFT, we suggest reading the following sections so you are familiar with the process: * [Creating a data NFT with ocean.js](ocean.js/creating-datanft.md) * [Publish flow with ocean.py](ocean.py/publish-flow.md) diff --git a/developers/obtaining-api-keys-for-blockchain-access.md b/developers/obtaining-api-keys-for-blockchain-access.md index ec5be9db..a2f9be36 100644 --- a/developers/obtaining-api-keys-for-blockchain-access.md +++ b/developers/obtaining-api-keys-for-blockchain-access.md @@ -16,6 +16,6 @@ Choose any API provider of your choice. Some of the commonly used are: * [Alchemy](https://www.alchemy.com/) * [Moralis](https://moralis.io/) -The supported networks are listed [here](../discover/networks/). +The supported networks are listed [here](../discover/networks/README.md). Let's configure the remote setup for the mentioned components in the following sections. diff --git a/developers/ocean.js/README.md b/developers/ocean.js/README.md index dc442d25..1e9770e7 100644 --- a/developers/ocean.js/README.md +++ b/developers/ocean.js/README.md @@ -29,5 +29,3 @@ Our module structure follows this format: * Utils When working with a particular module, you will need to provide different parameters. To instantiate classes from the contracts module, you must pass objects such as Signer, which represents the wallet instance, or the contract address you wish to utilize, depending on the scenario. As for the services modules, you will need to provide the provider URI or metadata cache URI. - - diff --git a/developers/ocean.js/remove-asset.md b/developers/ocean.js/asset-visibility.md similarity index 57% rename from developers/ocean.js/remove-asset.md rename to developers/ocean.js/asset-visibility.md index 73e052ae..9ab0a1bd 100644 --- a/developers/ocean.js/remove-asset.md +++ b/developers/ocean.js/asset-visibility.md @@ -1,17 +1,6 @@ # Asset Visibility -In the Ocean Protocol ecosystem, each asset is associated with a state that is maintained by the NFT (Non-Fungible Token) contract. The [state of an asset](../ddo-specification.md#state) determines its visibility and availability for different actions on platforms like Ocean Market, as well as its appearance in user profiles. The following table outlines the possible states and their characteristics: - -
-| State | Description                      | Discoverable in Ocean Market | Ordering Allowed | Listed Under Profile |
-| ----- | -------------------------------- | ---------------------------- | ---------------- | -------------------- |
-| 0     | Active                           | Yes                          | Yes              | Yes                  |
-| 1     | End-of-life                      | No                           | No               | No                   |
-| 2     | Deprecated (by another asset)    | No                           | No               | No                   |
-| 3     | Revoked by publisher             | No                           | No               | No                   |
-| 4     | Ordering is temporarily disabled | Yes                          | No               | Yes                  |
-| 5     | Asset unlisted                   | No                           | Yes              | Yes                  |
- -Now let's explain each state in more detail: - -1. **Active**: Assets in the "Active" state are fully functional and available for discovery in Ocean Market, and other components. Users can search for, view, and interact with these assets. Ordering is allowed, which means users can place orders to purchase or access the asset's services. -2. **End-of-life**: Assets in the "End-of-life" state are no longer discoverable. They are typically deprecated or outdated and are no longer actively promoted or maintained. Users cannot place orders or interact with these assets, and they are not listed under the owner's profile. -3. **Deprecated (by another asset)**: This state indicates that another asset has deprecated the current asset. Deprecated assets are not discoverable, and ordering is not allowed. Similar to the "End-of-life" state, deprecated assets are not listed under the owner's profile. -4. **Revoked by publisher**: When an asset is revoked by its publisher, it means that the publisher has explicitly revoked access or ownership rights to the asset. Revoked assets are not discoverable, and ordering is not allowed. -5. **Ordering is temporarily disabled**: Assets in this state are still discoverable, but ordering functionality is temporarily disabled. Users can view the asset and gather information, but they cannot place orders at that moment. However, these assets are still listed under the owner's profile. -6. **Asset unlisted**: Assets in the "Asset unlisted" state are not discoverable. However, users can still place orders for these assets, making them accessible. Unlisted assets are listed under the owner's profile, allowing users to view and access them. +In the Ocean Protocol ecosystem, each asset is associated with a state that is maintained by the NFT (Non-Fungible Token) contract. The [state of an asset](../ddo-specification.md#state) determines its visibility and availability for different actions on platforms like Ocean Market, as well as its appearance in user profiles. To explore the various asset's state in detail, please check out the [DDO Specification](../ddo-specification.md#state) page. It provides comprehensive information about the different states that assets can be in. By assigning specific states to assets, Ocean Protocol enables a structured approach to asset management and visibility. These states help regulate asset discoverability, ordering permissions, and the representation of assets in user profiles, ensuring a controlled and reliable asset ecosystem. diff --git a/developers/ocean.js/cod-asset.md b/developers/ocean.js/cod-asset.md index c6c3abb1..a1853c8e 100644 --- a/developers/ocean.js/cod-asset.md +++ b/developers/ocean.js/cod-asset.md @@ -2,7 +2,7 @@ **Overview** -Compute-to-Data is a powerful feature of Ocean Protocol that enables privacy-preserving data analysis and computation. With Compute-to-Data, data owners can maintain control over their data while allowing external parties to perform computations on that data. +Compute-to-Data is a powerful feature of Ocean Protocol that enables privacy-preserving data analysis and computation. With Compute-to-Data, data owners can maintain control over their data while allowing external parties to perform computations on that data. This documentation provides an overview of Compute-to-Data in Ocean Protocol and explains how to use it with Ocean.js. For detailed code examples and implementation details, please refer to the official [Ocean.js](https://github.com/oceanprotocol/ocean.js) GitHub repository. 
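Tying this back to the asset states above: the state lives on the data NFT contract itself, so an authorized account can update it roughly as in the sketch below. The `setMetaDataState` function name is assumed from Ocean's ERC721 template and should be verified against the deployed contract before use.

```js
// Sketch: updating an asset's state on the data NFT contract with ethers.js (v5).
// The setMetaDataState name is assumed from Ocean's ERC721 template; verify before use.
const { ethers } = require("ethers");

async function setAssetState(nftAddress, newState, signer) {
  // Example states from the DDO specification: 0 = Active, 4 = Ordering temporarily disabled, 5 = Unlisted.
  const abi = ["function setMetaDataState(uint8 _metaDataState)"];
  const dataNft = new ethers.Contract(nftAddress, abi, signer);

  const tx = await dataNft.setMetaDataState(newState);
  await tx.wait();
  console.log(`Asset state for ${nftAddress} set to ${newState}`);
}
```

The call is permissioned, so it has to come from an address that holds the appropriate role on the data NFT.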
diff --git a/developers/ocean.js/configuration.md b/developers/ocean.js/configuration.md index 95b6f92a..ea98009e 100644 --- a/developers/ocean.js/configuration.md +++ b/developers/ocean.js/configuration.md @@ -1,6 +1,6 @@ # Configuration -For obtaining the API keys for blockchain access and setting the correct environment variables, please consult [this section](http://localhost:5000/o/mTcjMqA4ylf55anucjH8/s/zQlpIJEeu8x5yl0OLuXn/) first and proceed with the next steps. +For obtaining the API keys for blockchain access and setting the correct environment variables, please consult [this section](http://127.0.0.1:5000/o/mTcjMqA4ylf55anucjH8/s/zQlpIJEeu8x5yl0OLuXn/) first and proceed with the next steps. ### Create a directory @@ -74,7 +74,7 @@ PRIVATE_KEY=0xc594c6e5def4bab63ac29eed19a134c130388f74f019bc74b8f4389df2837a58 {% endtab %} {% endtabs %} -Replace `` with the appropriate values. \*\*You can see all the networks configuration on Oceanjs' [config helper](https://github.com/oceanprotocol/ocean.js/blob/main/src/config/ConfigHelper.ts#L42). +Replace `` with the appropriate values. You can see all the networks configuration on Oceanjs' [config helper](https://github.com/oceanprotocol/ocean.js/blob/main/src/config/ConfigHelper.ts#L42). ### Setup dependencies diff --git a/developers/ocean.js/creating-datanft.md b/developers/ocean.js/creating-datanft.md index 19c94050..f2675963 100644 --- a/developers/ocean.js/creating-datanft.md +++ b/developers/ocean.js/creating-datanft.md @@ -11,7 +11,7 @@ This tutorial guides you through the process of creating your own data NFT using #### Create a script to deploy dataNFT -The provided script demonstrates how to create a data NFT using Oceanjs. +The provided script demonstrates how to create a data NFT using Oceanjs. First, create a new file in the working directory, alongside the `config.js` and `.env` files. Name it `create_dataNFT.js` (or any appropriate name). Then, copy the following code into the new created file: diff --git a/developers/ocean.js/publish.md b/developers/ocean.js/publish.md index 16d313d9..e0efeb6e 100644 --- a/developers/ocean.js/publish.md +++ b/developers/ocean.js/publish.md @@ -17,7 +17,7 @@ Create a new file in the same working directory where configuration file (`confi **Fees**: The code snippets below define fees related parameters. Please refer [fees page ](../contracts/fees.md)for more details {% endhint %} -The code utilizes methods such as `NftFactory` and `Datatoken` from the Ocean libraries to enable you to interact with the Ocean Protocol and perform various operations related to data NFTs and datatokens. +The code utilizes methods such as `NftFactory` and `Datatoken` from the Ocean libraries to enable you to interact with the Ocean Protocol and perform various operations related to data NFTs and datatokens. The `createFRE()` performs the following: diff --git a/developers/ocean.py/README.md b/developers/ocean.py/README.md index 2992b294..46ab875a 100644 --- a/developers/ocean.py/README.md +++ b/developers/ocean.py/README.md @@ -6,7 +6,7 @@ Attention all data enthusiasts! Are you an inquisitive data scientist intrigued Well, brace yourselves for some exhilarating news! Introducing ocean.py, a Python library that possesses a touch of magic. 🎩🐍 It empowers you to discreetly and securely publish, exchange, and effortlessly consume data. 🐙💦 Collaborating with the Ocean Protocol 🌊, it unlocks a plethora of advantages mentioned earlier. 
So get ready to take the plunge into the vast ocean of data with a resounding splash of excitement! 💦🌊

[Figure: ocean.py library]
### Overview diff --git a/developers/ocean.py/consume-flow.md b/developers/ocean.py/consume-flow.md index 024abbd7..007c535c 100644 --- a/developers/ocean.py/consume-flow.md +++ b/developers/ocean.py/consume-flow.md @@ -18,7 +18,7 @@ Below, we show four possible approaches: * C is when Alice wants to share access for free, to anyone * D is when Alice wants to sell access -
+
In the same Python console: @@ -102,7 +102,7 @@ The _beginning_ of the file should contain the following contents: ... ``` -Here’s a video version this post 👇. +Here’s a video version for this post 👇 {% embed url="https://www.youtube.com/watch?v=JQF-5oRvq9w" %} Main Flow Video diff --git a/developers/ocean.py/install.md b/developers/ocean.py/install.md index 6dec3eec..e775203c 100644 --- a/developers/ocean.py/install.md +++ b/developers/ocean.py/install.md @@ -4,8 +4,7 @@ Let’s start interacting with the python library by firstly installing it & its From the adventurous `Python 3.8.5` all the way up to `Python 3.10.4`, ocean.py has got your back! 🚀 -While `ocean.py` can join you on your `Python 3.11` journey, a few manual tweaks may be required. But worry not, brave explorers, we've got all the juicy details for you below! 📚✨\ -\ +While `ocean.py` can join you on your `Python 3.11` journey, a few manual tweaks may be required. But worry not, brave explorers, we've got all the juicy details for you below! 📚✨ ⚠️ Make sure that you have `autoconf`, `pkg-config` and `build-essential` or their equivalents installed on your host. ### Installing ocean.py @@ -54,7 +53,6 @@ Let's dive deeper into the Ocean world! 💙 Did you know that Ocean and Brownie Oh, buoy! 🌊🐙 When it comes to installation, ocean.py has you covered with a special README called ["install.md"](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/install.md). It's like a trusty guide that helps you navigate all the nitty-gritty details. So, let's dive in and ride the waves of installation together! 🏄‍♂️🌊 -\ Or if you prefer a video format, you can check this tutorial on Youtube {% embed url="https://www.youtube.com/watch?v=mbniGPNHE_M" %} diff --git a/developers/ocean.py/ocean-assets-tech-details.md b/developers/ocean.py/ocean-assets-tech-details.md index 6569c5c3..831b14b9 100644 --- a/developers/ocean.py/ocean-assets-tech-details.md +++ b/developers/ocean.py/ocean-assets-tech-details.md @@ -31,7 +31,7 @@ A tuple which contains the data NFT, datatoken and the data asset. **Defined in** -[ocean/ocean\_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean\_lib/ocean/ocean\_assets.py#LL178C1-L185C82) +[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL178C1-L185C82)
@@ -78,7 +78,7 @@ A tuple which contains the algorithm NFT, algorithm datatoken and the algorithm **Defined in** -[ocean/ocean\_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean\_lib/ocean/ocean\_assets.py#LL146C4-L176C82) +[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL146C4-L176C82)
@@ -145,7 +145,7 @@ A tuple which contains the data NFT, datatoken and the data asset. **Defined in** -[ocean/ocean\_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean\_lib/ocean/ocean\_assets.py#LL187C5-L198C82) +[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL187C5-L198C82)
@@ -194,7 +194,7 @@ A tuple which contains the data NFT, datatoken and the data asset. **Defined in** -[ocean/ocean\_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean\_lib/ocean/ocean\_assets.py#LL200C5-L212C82) +[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL200C5-L212C82)
@@ -244,7 +244,7 @@ A tuple which contains the data NFT, datatoken and the data asset. **Defined in** -[ocean/ocean\_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean\_lib/ocean/ocean\_assets.py#LL214C5-L229C1) +[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL214C5-L229C1)
@@ -302,7 +302,7 @@ A tuple which contains the data NFT, datatoken and the data asset. **Defined in** -[ocean/ocean\_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean\_lib/ocean/ocean\_assets.py#LL259C5-L390C43) +[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL259C5-L390C43)
@@ -539,7 +539,7 @@ The updated DDO, or `None` if updated DDO not found in Aquarius. **Defined in** -[ocean/ocean\_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean\_lib/ocean/ocean\_assets.py#LL392C5-L454C19) +[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL392C5-L454C19)
@@ -625,7 +625,7 @@ Returns DDO instance. **Defined in** -[ocean/ocean\_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean\_lib/ocean/ocean\_assets.py#LL456C5-L458C43) +[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL456C5-L458C43)
@@ -657,7 +657,7 @@ A list of DDOs which have matches with the text provided as parameter. **Defined in** -[ocean/ocean\_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean\_lib/ocean/ocean\_assets.py#LL460C4-L475C10) +[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL460C4-L475C10)
@@ -702,7 +702,7 @@ A list of DDOs which have matches with the query provided as parameter. **Defined in** -[ocean/ocean\_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean\_lib/ocean/ocean\_assets.py#LL477C4-L490C10) +[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL477C4-L490C10)
@@ -756,7 +756,7 @@ The full path to the downloaded file as `string`. **Defined in** -[ocean/ocean\_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean\_lib/ocean/ocean\_assets.py#LL492C5-L516C20) +[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL492C5-L516C20)
@@ -820,7 +820,7 @@ Return value is a hex string for transaction hash which denotes the proof of sta **Defined in** -[ocean/ocean\_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean\_lib/ocean/ocean\_assets.py#LL518C5-L571C28) +[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL518C5-L571C28)
@@ -914,7 +914,7 @@ Return value is a tuple composed of list of datasets and algorithm data (if exis **Defined in** -[ocean/ocean\_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean\_lib/ocean/ocean\_assets.py#LL573C5-L627C30) +[ocean/ocean_assets.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/ocean/ocean_assets.py#LL573C5-L627C30)
diff --git a/developers/ocean.py/publish-flow.md b/developers/ocean.py/publish-flow.md index b1476f00..cd6509af 100644 --- a/developers/ocean.py/publish-flow.md +++ b/developers/ocean.py/publish-flow.md @@ -31,7 +31,7 @@ You've now published an Ocean asset! * [`data_nft`](../contracts/data-nfts.md) is the base (base IP) * [`datatoken`](../contracts/datatokens.md) for access by others (licensing) -* `ddo` holding metadata +* [`ddo`](../ddo-specification.md) holding metadata
@@ -113,7 +113,7 @@ If you call `create()` after this, you can pass in an argument `deployed_datatok Ocean Assets allows you to bundle several common scenarios as a single transaction, thus lowering gas fees. -Any of the `ocean.assets.create__asset()` functions can also take an optional parameter that describes a bundled [pricing schema](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#LL199C5-L219C10) (Dispenser or Fixed Rate Exchange). +Any of the `ocean.assets.create__asset()` functions can also take an optional parameter that describes a bundled [pricing schema](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#LL199C5-L219C10) (Dispenser or Fixed Rate Exchange). Here is an example involving an exchange: diff --git a/developers/ocean.py/remote-setup.md b/developers/ocean.py/remote-setup.md index caa2ace4..fd740f4a 100644 --- a/developers/ocean.py/remote-setup.md +++ b/developers/ocean.py/remote-setup.md @@ -83,7 +83,7 @@ For all users: to use EWC, add the following to network-config.yaml: #### 1.5 RPCs and Infura -In order to obtain API keys for blockchain access, follow up [this document](http://localhost:5000/o/mTcjMqA4ylf55anucjH8/s/zQlpIJEeu8x5yl0OLuXn/) for tips & tricks. +In order to obtain API keys for blockchain access, follow up [this document](http://127.0.0.1:5000/o/mTcjMqA4ylf55anucjH8/s/zQlpIJEeu8x5yl0OLuXn/) for tips & tricks. The config file's default RPCs point to Infura, which require you to have an Infura account with corresponding token `WEB3_INFURA_PROJECT_ID`. @@ -100,7 +100,7 @@ You can bypass manually: just edit your brownie network config file. Or you can bypass via the command line. The following command replaces Infura RPCs with public ones in `network-config.yaml`: -* Linux users: in the console: +* Linux users: in the console: {% code overflow="wrap" %} ```bash diff --git a/developers/provider/README.md b/developers/provider/README.md index 211f2a9d..7b228237 100644 --- a/developers/provider/README.md +++ b/developers/provider/README.md @@ -26,6 +26,12 @@ Additionally, the Provider service offers compute services by establishing a con * Provides compute services (connects to C2D environment) * Typically run by the Data owner +

[Figure: Ocean Provider - publish & consume]
+ +In the publishing process, the provider plays a crucial role by encrypting the DDO using its private key. Then, the encrypted DDO is stored on the blockchain. + +During the consumption flow, after a consumer obtains access to the asset by purchasing a datatoken, the provider takes responsibility for decrypting the DDO and fetching data from the source used by the data publisher. + ### What technology is used? * Python: This is the main programming language used in Provider. @@ -34,7 +40,7 @@ Additionally, the Provider service offers compute services by establishing a con ### How to run the provider? -We recommend checking the README in the Provider [GitHub repository](https://github.com/oceanprotocol/provider) for the steps to run the Provider. If you see any errors in the instructions, please open an issue within the GitHub repository. +We recommend checking the README in the Provider [GitHub repository](https://github.com/oceanprotocol/provider) for the steps to run the Provider. If you see any errors in the instructions, please open an issue within the GitHub repository. ### Ocean Provider Endpoints Specification diff --git a/developers/provider/general-endpoints.md b/developers/provider/general-endpoints.md index 4eac9b60..2d1fa7be 100644 --- a/developers/provider/general-endpoints.md +++ b/developers/provider/general-endpoints.md @@ -10,7 +10,7 @@ Retrieves the last-used nonce value for a specific user's Ethereum address. Here are some typical responses you might receive from the API: -* **200**: This is a successful HTTP response code. It means the server has successfully processed the request and returns a JSON object containing the nonce value. +* **200**: This is a successful HTTP response code. It means the server has successfully processed the request and returns a JSON object containing the nonce value. Example response: @@ -42,7 +42,7 @@ Retrieves Content-Type and Content-Length from the given URL or asset. * `serviceId`: This is a string representing the ID of the service. * **Purpose**: This endpoint is used to retrieve the `Content-Type` and `Content-Length` from a given URL or asset. For published assets, `did` and `serviceId` should be provided. It also accepts file objects (as described in the Ocean Protocol documentation) and can compute a checksum if the file size is less than `MAX_CHECKSUM_LENGTH`. For larger files, the checksum will not be computed. * **Responses**: - * **200**: This is a successful HTTP response code. It returns a JSON object containing the file info. + * **200**: This is a successful HTTP response code. It returns a JSON object containing the file info. Example response: @@ -89,11 +89,11 @@ console.log(response) #### Javascript Example -Before calling the `/download` endpoint, you need to follow these steps: +Before calling the `/download` endpoint, you need to follow these steps: 1. You need to set up and connect a wallet for the consumer. The consumer needs to have purchased the datatoken for the asset that you are trying to download. Libraries such as ocean.js or ocean.py can be used for this. 2. Get the nonce. This can be done by calling the `/getnonce` endpoint above. -3. Sign a message from the account that has purchased the datatoken. +3. Sign a message from the account that has purchased the datatoken. 4. Add the nonce and signature to the payload. ```javascript @@ -139,7 +139,7 @@ downloadAsset(payload); ### Initialize -In order to consume a data service the user is required to send one datatoken to the provider. 
+In order to consume a data service the user is required to send one datatoken to the provider. The datatoken is transferred on the blockchain by requesting the user to sign an ERC20 approval transaction where the approval is given to the provider's account for the number of tokens required by the service. diff --git a/developers/retrieve-datatoken-address.md b/developers/retrieve-datatoken-address.md index c86cbc34..855ed174 100644 --- a/developers/retrieve-datatoken-address.md +++ b/developers/retrieve-datatoken-address.md @@ -10,7 +10,7 @@ description: >- If you are given an Ocean Market link, then the network and datatoken address for the asset is visible on the Ocean Market webpage. For example, given this asset's Ocean Market link: [https://odc.oceanprotocol.com/asset/did:op:1b26eda361c6b6d307c8a139c4aaf36aa74411215c31b751cad42e59881f92c1](https://odc.oceanprotocol.com/asset/did:op:1b26eda361c6b6d307c8a139c4aaf36aa74411215c31b751cad42e59881f92c1) the webpage shows that this asset is hosted on the Mumbai network, and one simply clicks the datatoken's hyperlink to reveal the datatoken's address as shown in the screenshot below: -

[Figure: See the Network and Datatoken Address for an Ocean Market asset by visiting the asset's Ocean Market page.]
#### More Detailed Info: @@ -18,15 +18,15 @@ You can access all the information for the Ocean Market asset also by **enabling **Step 1** - Click the Settings button in the top right corner of the Ocean Market -

[Figure: Click the Settings button]
**Step 2** - Check the Activate Debug Mode box in the dropdown menu

[Figure: Check 'Activate Debug Mode']
**Step 3** - Go to the page for the asset you would like to examine, and scroll through the DDO information to find the NFT address, datatoken address, chain ID, and other information.
### How to use Aquarius to find the chainID and datatoken address from a DID? @@ -34,7 +34,7 @@ If you know the DID:op but you don't know the source link, then you can use Ocea For example, for the following DID:op: "did:op:1b26eda361c6b6d307c8a139c4aaf36aa74411215c31b751cad42e59881f92c1" the Ocean Aquarius URL can be modified to add the DID:op and resolve its metadata. Simply add "[https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/](https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/did:op:1b26eda361c6b6d307c8a139c4aaf36aa74411215c31b751cad42e59881f92c1)" to the beginning of the DID:op and enter the link in your browser like this: [https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/did:op:1b26eda361c6b6d307c8a139c4aaf36aa74411215c31b751cad42e59881f92c1](https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/did:op:1b26eda361c6b6d307c8a139c4aaf36aa74411215c31b751cad42e59881f92c1) -

[Figure: The metadata printout for this DID:op with the network's Chain ID and datatoken address circled in red]
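If you'd rather skip the browser, the same lookup works programmatically. Below is a minimal Node.js sketch (assuming `axios` is installed) that fetches the DDO from the Aquarius URL shown above and prints the chain ID and datatoken address. The `chainId`, `nftAddress`, and `services[0].datatokenAddress` field names follow the DDO specification, so double-check them against the metadata printout if your Aquarius version differs.

```javascript
// Resolve a DID through Aquarius and print the chain ID and datatoken address.
// Run with Node.js after `npm install axios`.
const axios = require('axios');

const did =
  'did:op:1b26eda361c6b6d307c8a139c4aaf36aa74411215c31b751cad42e59881f92c1';
const aquariusUrl =
  'https://v4.aquarius.oceanprotocol.com/api/aquarius/assets/ddo/' + did;

async function resolveDid() {
  const { data: ddo } = await axios.get(aquariusUrl);

  // Field names assumed from the DDO specification.
  console.log('Chain ID:', ddo.chainId);
  console.log('Data NFT address:', ddo.nftAddress);
  console.log('Datatoken address:', ddo.services[0].datatokenAddress);
}

resolveDid().catch(console.error);
```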
Here are the networks and their corresponding chain IDs: diff --git a/developers/subgraph/README.md b/developers/subgraph/README.md index 848f5c13..f96b3f32 100644 --- a/developers/subgraph/README.md +++ b/developers/subgraph/README.md @@ -8,15 +8,13 @@ description: >- ### What is the Subgraph? -The [Ocean Subgraph](https://github.com/oceanprotocol/ocean-subgraph) is built on top of [The Graph](https://thegraph.com/)(the popular :sunglasses: indexing and querying protocol for blockchain data). It is an essential component of the Ocean Protocol ecosystem. It provides an off-chain service that utilizes GraphQL to offer efficient access to information related to datatokens, users, and balances. By leveraging the subgraph, data retrieval becomes faster compared to an on-chain query. The data sourced from the Ocean subgraph can be accessed through [GraphQL](https://graphql.org/learn/) queries. +The [Ocean Subgraph](https://github.com/oceanprotocol/ocean-subgraph) is built on top of [The Graph](https://thegraph.com/) (the popular :sunglasses: indexing and querying protocol for blockchain data). It is an essential component of the Ocean Protocol ecosystem. It provides an off-chain service that utilizes GraphQL to offer efficient access to information related to datatokens, users, and balances. By leveraging the subgraph, data retrieval becomes faster compared to an on-chain query. The data sourced from the Ocean subgraph can be accessed through [GraphQL](https://graphql.org/learn/) queries. -Imagine this 💭: if you were to always fetch data from the on-chain, you'd start to feel a little...old :older\_woman: Like your queries are stuck in a time warp. But fear not! When you embrace the power of the subgraph, data becomes your elixir of youth. It's snappy, it's swift, and it's refreshingly retrievable. With the subgraph, you can sail through data like a sprightly dolphin 🐬 +Imagine this 💭: if you were to always fetch data from the on-chain, you'd start to feel a little...old :older\_woman: Like your queries are stuck in a time warp. But fear not! When you embrace the power of the subgraph, data becomes your elixir of youth. -
+

[Figure: Ocean Subgraph]
+The subgraph reads data from the blockchain, extracting relevant information. Additionally, it indexes events emitted from the Ocean smart contracts. This collected data is then made accessible to any decentralized applications (dApps) that require it, through GraphQL queries. The subgraph organizes and presents the data in a JSON format, facilitating efficient and structured access for dApps. ### How to use the Subgraph? @@ -25,7 +23,7 @@ You can utilize the Subgraph instances provided by Ocean Protocol or deploy your If you're eager to use the Ocean Subgraph, here's some important information for you: We've deployed an Ocean Subgraph for each of the supported networks. Take a look at the table below, where you'll find handy links to both the subgraph instance and GraphiQL for each network. With the user-friendly GraphiQL interface, you can execute GraphQL queries directly, without any additional setup. It's a breeze! :ocean: {% hint style="info" %} -When it comes to fetching valuable information about [Data NFTs](../contracts/data-nfts.md) and [datatokens](../contracts/datatokens.md), the subgraph queries play a crucial role. They retrieve numerous details and information, but, the Subgraph cannot decrypt the DDO. But worry not, we have a dedicated component for that—[Aquarius](../aquarius/README.md)! 🐬 Aquarius communicates with the provider and decrypts the encrypted information, making it readily available for queries. +When it comes to fetching valuable information about [Data NFTs](../contracts/data-nfts.md) and [datatokens](../contracts/datatokens.md), the subgraph queries play a crucial role. They retrieve numerous details and information, but, the Subgraph cannot decrypt the DDO. But worry not, we have a dedicated component for that—[Aquarius](../aquarius/)! 🐬 Aquarius communicates with the provider and decrypts the encrypted information, making it readily available for queries. {% endhint %} ### Ocean Subgraph deployments @@ -44,8 +42,7 @@ When it comes to fetching valuable information about [Data NFTs](../contracts/da When making subgraph queries, please remember that the parameters you send, such as a datatoken address or a data NFT address, should be in **lowercase**. This is an essential requirement to ensure accurate processing of the queries. We kindly request your attention to this detail to facilitate a seamless query experience. {% endhint %} -In the following pages, we've prepared a few examples just for you. From running queries to exploring data, you'll have the chance to dive right into the Ocean Subgraph data. There, you'll find a wide range of additional code snippets and examples that showcase the power and versatility of the Ocean Subgraph. So, grab a virtual snorkel, and let's explore together! 🤿\ - +In the following pages, we've prepared a few examples just for you. From running queries to exploring data, you'll have the chance to dive right into the Ocean Subgraph data. There, you'll find a wide range of additional code snippets and examples that showcase the power and versatility of the Ocean Subgraph. So, grab a virtual snorkel, and let's explore together! 🤿 {% hint style="info" %} For more examples, visit the subgraph GitHub [repository](https://github.com/oceanprotocol/ocean-subgraph), where you'll discover an extensive collection of code snippets and examples that highlight the Subgraph's capabilities and adaptability. 
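To give a flavour of what those example pages walk through, here is a minimal sketch that POSTs one such GraphQL query with `axios`. The endpoint is the mainnet deployment assumed from the table above (swap it for your network), the `nfts` fields are taken from the data NFT example that follows, and remember that any address parameters you add must be lowercase.

```javascript
// Query the Ocean Subgraph for the first 10 data NFTs (mainnet endpoint assumed).
// Run with Node.js after `npm install axios`.
const axios = require('axios');

const subgraphUrl =
  'https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph';

const query = `
  {
    nfts(first: 10) {
      id
      name
      symbol
      assetState
    }
  }
`;

axios
  .post(subgraphUrl, { query })
  .then((response) => {
    // The Graph returns the result under response.data.data.
    console.log(JSON.stringify(response.data.data.nfts, null, 2));
  })
  .catch(console.error);
```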
diff --git a/developers/subgraph/get-datatoken-buyers.md b/developers/subgraph/get-datatoken-buyers.md index d70ccc2d..6eca5845 100644 --- a/developers/subgraph/get-datatoken-buyers.md +++ b/developers/subgraph/get-datatoken-buyers.md @@ -127,7 +127,7 @@ python datatoken_buyers.py {% endtab %} {% tab title="Query" %} -Copy the query to fetch the list of buyers for a datatoken in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mumbai.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph). +Copy the query to fetch the list of buyers for a datatoken in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mumbai.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph). ```graphql diff --git a/developers/subgraph/get-datatoken-information.md b/developers/subgraph/get-datatoken-information.md index 688ddc77..9b3846c9 100644 --- a/developers/subgraph/get-datatoken-information.md +++ b/developers/subgraph/get-datatoken-information.md @@ -150,7 +150,7 @@ print(json.dumps(result, indent=4, sort_keys=True)) {% endtab %} {% tab title="Query" %} -Copy the query to fetch the information of a datatoken in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql). +Copy the query to fetch the information of a datatoken in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql). ``` { diff --git a/developers/subgraph/get-veocean-stats.md b/developers/subgraph/get-veocean-stats.md index 592a822c..cdc6650b 100644 --- a/developers/subgraph/get-veocean-stats.md +++ b/developers/subgraph/get-veocean-stats.md @@ -154,7 +154,7 @@ axios(config) {% endtab %} {% tab title="Python" %} -You can employ the following Python script to execute the query and fetch the list of veOCEAN holders from the subgraph. +You can employ the following Python script to execute the query and fetch the list of veOCEAN holders from the subgraph. {% code title="get_veOcean_holders.py" %} ```python diff --git a/developers/subgraph/list-data-nfts.md b/developers/subgraph/list-data-nfts.md index 2f6dc292..c947f590 100644 --- a/developers/subgraph/list-data-nfts.md +++ b/developers/subgraph/list-data-nfts.md @@ -4,11 +4,11 @@ description: 'Discover the World of NFTs: Retrieving a List of Data NFTs' # Get data NFTs -If you are already familiarized with the concept of NFTs, you're off to a great start. However, if you require a refresher, we recommend visiting the [data NFTs and datatokens page](../contracts/datanft-and-datatoken.md) for a quick overview. +If you are already familiarized with the concept of NFTs, you're off to a great start. However, if you require a refresher, we recommend visiting the [data NFTs and datatokens page](../contracts/datanft-and-datatoken.md) for a quick overview. Now, let us delve into the realm of utilizing the subgraph to extract a list of data NFTs that have been published using the Ocean contracts. By employing GraphQL queries, we can seamlessly retrieve the desired information from the subgraph. You'll see how simple it is :sunglasses: -You'll find below an example of a GraphQL query that retrieves the first 10 data NFTs from the subgraph. The GraphQL query is structured to access the "nfts" route, extracting the first 10 elements. For each item retrieved, it retrieves the "id," "name," "symbol," "owner," "address," "assetState," "tx," "block," and "transferable" parameters. 
+You'll find below an example of a GraphQL query that retrieves the first 10 data NFTs from the subgraph. The GraphQL query is structured to access the "nfts" route, extracting the first 10 elements. For each item retrieved, it retrieves the `id`, `name`, `symbol`, `owner`, `address`, `assetState`, `tx`, `block` and `transferable` parameters. There are several options available to see this query in action. Below, you will find three: diff --git a/developers/subgraph/list-datatokens.md b/developers/subgraph/list-datatokens.md index adcbf586..a93fefeb 100644 --- a/developers/subgraph/list-datatokens.md +++ b/developers/subgraph/list-datatokens.md @@ -12,7 +12,7 @@ _PS: In this example, the query is executed on the Ocean subgraph deployed on th {% tabs %} {% tab title="Javascript" %} -The javascript below can be used to run the query. If you wish to change the network, replace the variable's value `network` as needed. +The javascript below can be used to run the query. If you wish to change the network, replace the variable's value `network` as needed. ```runkit nodeVersion="18.x.x" var axios = require('axios'); @@ -134,7 +134,7 @@ python list_all_tokens.py {% endtab %} {% tab title="Query" %} -Copy the query to fetch a list of datatokens in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql). +Copy the query to fetch a list of datatokens in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql). ```graphql { diff --git a/developers/subgraph/list-fixed-rate-exchanges.md b/developers/subgraph/list-fixed-rate-exchanges.md index 452ea5d5..93416c94 100644 --- a/developers/subgraph/list-fixed-rate-exchanges.md +++ b/developers/subgraph/list-fixed-rate-exchanges.md @@ -12,7 +12,7 @@ _PS: In this example, the query is executed on the Ocean subgraph deployed on th {% tabs %} {% tab title="Javascript" %} -The javascript below can be used to run the query and fetch a list of fixed-rate exchanges. If you wish to change the network, replace the variable's value `network` as needed. +The javascript below can be used to run the query and fetch a list of fixed-rate exchanges. If you wish to change the network, replace the variable's value `network` as needed. ```runkit nodeVersion="18.x.x" var axios = require('axios'); @@ -140,7 +140,7 @@ python list_fixed_rate_exchanges.py {% endtab %} {% tab title="Query" %} -Copy the query to fetch a list of fixed-rate exchanges in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql). +Copy the query to fetch a list of fixed-rate exchanges in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql). ``` { diff --git a/discover/README.md b/discover/README.md index f2d57101..1dfa1d39 100644 --- a/discover/README.md +++ b/discover/README.md @@ -8,7 +8,7 @@ coverY: 7.413145539906106 {% embed url="https://youtu.be/4P72ZelkEpQ" %} -Society is increasingly reliant on data as AI becomes more popular. However, a small handful of organizations possess and controls massive amounts of our personal data, posing a threat to a free and open society ☢️ +Society is increasingly reliant on data as AI becomes more popular. 
However, a small handful of organizations possess and control massive amounts of our personal data, posing a threat to a free and open society ☢️ The concentration of vast datasets in the hands of a few organizations can lead to significant negative consequences for society. These include: @@ -34,7 +34,7 @@ We believe that data is a valuable resource that should be accessible to **every To find out more about the amazing team behind Ocean, you can visit our [website](https://oceanprotocol.com/about). -Are you curious about our mission and how we're making a difference in the world? Then you won't want to miss this video featuring our co-founder, [Trent McConaghy](http://www.trent.st/). He'll share some fascinating insights into what we're doing and why it matters. +Now that we've made you curious about our mission and how we're making a difference in the world, you won't want to miss this video featuring our co-founder, [Trent McConaghy](http://www.trent.st/). He'll share some fascinating insights into what we're doing and why it matters. {% embed url="https://youtu.be/XN_PHg1K61w" fullWidth="false" %} A new data economy with power to the people - Trent McConaghy @@ -42,5 +42,4 @@ A new data economy with power to the people - Trent McConaghy ### Ocean Protocol Whitepaper -\ -If you'd like to explore the nitty-gritty of our technology, feel free to dive into our [whitepaper](https://oceanprotocol.com/tech-whitepaper.pdf)! It's a comprehensive resource that explains all the technical details and the core concepts that drive Ocean Protocol. It's a great way to get a deeper understanding of what we're all about. +If you'd like to explore the details of our technology, feel free to dive into our [whitepaper](https://oceanprotocol.com/tech-whitepaper.pdf)! It's a comprehensive resource that explains all the technical details and the core concepts that drive Ocean Protocol. It's a great way to get a deeper understanding of what we're all about. diff --git a/discover/explore.md b/discover/explore.md index 25814993..1856cc34 100644 --- a/discover/explore.md +++ b/discover/explore.md @@ -12,12 +12,13 @@ Ocean Protocol is used for a variety of purposes, including: 2. **Data Monetization**: Data owners can monetize their data by offering it for sale or by providing data services through compute-to-data (C2D) capabilities. Data consumers can access and utilize data assets. 3. **Decentralized Data Marketplaces**: Ocean Protocol facilitates the creation of decentralized data marketplaces where data providers can list their data assets and data consumers can discover and access them. These marketplaces operate on a peer-to-peer basis, eliminating the need for intermediaries and providing more efficient and transparent data transactions. 4. **AI Development**: Ocean Protocol supports the development of AI models by providing access to diverse and high-quality datasets. Data scientists and AI developers can leverage these datasets to train and improve their models, leading to more accurate and robust AI systems. +5. **Access control:** Ocean Protocol incorporates token-gating mechanisms that grant or restrict access to specific data assets based on predefined criteria, ensuring controlled and regulated data sharing within the ecosystem. By leveraging **blockchain technology** and **smart contracts**, Ocean Protocol offers **open-source tools** to securely publish [NFTs](../developers/contracts/data-nfts.md) of your data and algorithms to seamlessly collaborate, trade, and innovate with others.

[Figure: A vast ocean of data awaits you...]
-Get a glimpse into some of the things you can do with Ocean Protocol. The opportunities with our protocol to leverage an "Ocean of data" are great and ever-evolving. Together, we'll dive deeper and uncover even more ways to leverage the power of decentralized data. +Get a glimpse into some of the things you can do with Ocean Protocol. The opportunities with our protocol to leverage an "Ocean of data" are great and ever-evolving. Together, we'll dive deeper and uncover even more ways to harness the power of decentralized data.
@@ -73,7 +74,7 @@ The following guides will help you get started with buying and selling data: * [Publish an NFT](../user-guides/publish-data-nfts.md) * [Download an NFT](../user-guides/buy-data-nfts.md) -* [Host Assets](../user-guides/asset-hosting/) +* [Host Assets](../user-guides/asset-hosting/README.md)
@@ -81,7 +82,7 @@ The following guides will help you get started with buying and selling data: Manage datatokens and data NFTs for use in DeFi

Ocean makes it easy to publish data services (deploy ERC721 data NFTs and ERC20 datatokens), and to consume data services (spend datatokens). Crypto wallets, exchanges, and DAOs become data wallets, exchanges, and DAOs.
-Use Ocean [JavaScript](../developers/ocean.js/) or [Python](../developers/ocean.py/) drivers to manage data NFTs and datatokens: +Use Ocean [JavaScript](../developers/ocean.js/README.md) or [Python](../developers/ocean.py/README.md) drivers to manage data NFTs and datatokens: Ocean-based apps make data assets on-ramps and off-ramps easy for end users. Ocean smart contracts and libraries make this easy for developers. The data itself does not need to be on-chain, just the access control. @@ -101,7 +102,7 @@ Since datatokens are ERC20, and live on Ethereum mainnet, there's a whole ecosys Run Your Own Provider

You have the option to generate revenue by running your own provider. It has come to our attention that many of you have not pursued this opportunity, primarily due to the lack of compelling incentives.
-If you're not familiar with it, the Ocean [Provider](../developers/provider/) serves as the proxy service responsible for encrypting/decrypting data and streaming it to the consumer. It also verifies user access privileges for specific data assets or services. It plays a vital role in the Ocean architecture. +If you're not familiar with it, the Ocean [Provider](../developers/provider/README.md) serves as the proxy service responsible for encrypting/decrypting data and streaming it to the consumer. It also verifies user access privileges for specific data assets or services. It plays a vital role in the Ocean architecture. Fees are now paid to the individual or organization running the provider when a user downloads a data asset. The download fees are set based on the cost per MB, and there is also a provider fee for compute jobs, which is priced per minute. @@ -115,7 +116,7 @@ Furthermore, provider fees are not restricted to data consumption; they can also Earn Rewards

The Data Farming initiative is a key feature of Ocean Protocol that empowers participants to earn rewards while contributing to a decentralized data economy. By staking Ocean tokens and actively participating in data markets, users play a vital role in enhancing the availability and accessibility of valuable data assets
-Through the Data Farming initiative, you are incentivized to lock Ocean tokens for [veOcean](../rewards/veocean.md). By staking your tokens, you not only support the growth and sustainability of the ecosystem but also earn a share of the generated incentives💰. The Data Farming initiative offers participants a unique opportunity to earn [rewards](../rewards/) while making a meaningful impact in the data marketplace. +Through the Data Farming initiative, you are incentivized to lock Ocean tokens for [veOcean](../rewards/veocean.md). By staking your tokens, you not only support the growth and sustainability of the ecosystem but also earn a share of the generated incentives💰. The Data Farming initiative offers participants a unique opportunity to earn [rewards](../rewards/README.md) while making a meaningful impact in the data marketplace. Participating in the Data Farming initiative demonstrates a commitment to the principles of **fairness**, **transparency**, and **collaboration** that underpin Ocean Protocol. It allows you to actively engage with the ecosystem, promoting innovation, and driving the evolution of the decentralized data economy. @@ -133,7 +134,7 @@ By participating in Ocean Data Challenges, you can tackle real-world problems, l Become an Ambassador

Becoming an Ocean Ambassador presents a unique opportunity to actively contribute to the growth and adoption of Ocean Protocol while being at the forefront of the decentralized data revolution.
-As an Ocean Ambassador, you become an advocate for the protocol, promoting its vision of democratizing data and empowering individuals. By sharing your knowledge and enthusiasm, you can educate others about the benefits and potential of Ocean Protocol, inspiring them to join the ecosystem. As part of a global community of like-minded individuals, you gain access to exclusive resources, networking opportunities, and collaborations that further enhance your expertise in the data economy. As an Ambassador, you play a vital role in shaping the future of data by driving awareness, fostering innovation, and helping to build a more open and equitable data ecosystem. Join the Ocean Ambassador program by completing the [Ocean Academy](https://www.oceanacademy.io/) and become a catalyst for positive change in the world of data. +As an Ocean Ambassador, you become an advocate for the protocol, promoting its vision of democratizing data and empowering individuals. By sharing your knowledge and enthusiasm, you can educate others about the benefits and potential of Ocean Protocol, inspiring them to join the ecosystem. As part of a global community of like-minded individuals, you gain access to exclusive resources, networking opportunities, and collaborations that further enhance your expertise in the data economy. As an Ambassador, you play a vital role in shaping the future of data by driving awareness, fostering innovation, and helping to build a more open and equitable data ecosystem.
@@ -141,8 +142,7 @@ As an Ocean Ambassador, you become an advocate for the protocol, promoting its v Contribute to Ocean Code Development

Make a positive impact in the Web3 data economy by contributing to Ocean's open source code on Github! From feature requests to pull requests, contributions of all kinds are appreciated.
-To begin, [visit our Github page](https://github.com/oceanprotocol) where you can see the repos and contributors. If you're going to contribute code to a repo, then we ask that you fork the code first, make your changes, and then create a pull request for us to review. If you are reporting an issue, then please first search the existing issues to see if it is documented yet. If not, then please open a new issue by describe your problem as best as possible and include screenshots.\ -\ +To begin, [visit our Github page](https://github.com/oceanprotocol) where you can see the repos and contributors. If you're going to contribute code to a repo, then we ask that you fork the code first, make your changes, and then create a pull request for us to review. If you are reporting an issue, then please first search the existing issues to see if it is documented yet. If not, then please open a new issue by describe your problem as best as possible and include screenshots. We also welcome you to join our [Discord developer community](https://discord.gg/TnXjkR5) where you can get rapid, practical advice on using Ocean tech but also get to know Ocean core team more personally!
diff --git a/discover/faq.md b/discover/faq.md index 75783c8a..1e40b3ee 100644 --- a/discover/faq.md +++ b/discover/faq.md @@ -51,7 +51,7 @@ Yes. Ocean Protocol understands that some data is too sensitive to be shared — Where is my data stored? -Ocean does not provide data storage. Users have the choice to [store](../user-guides/asset-hosting/) their data on their own servers, cloud, or decentralized storage. Users need only to provide a URL, an IPFS hash, an Arweave CID, or the on-chain information to the dataset. This is then encrypted as a means to protect access to the dataset. +Ocean does not provide data storage. Users have the choice to [store](../user-guides/asset-hosting/README.md) their data on their own servers, cloud, or decentralized storage. Users need only to provide a URL, an IPFS hash, an Arweave CID, or the on-chain information to the dataset. This is then encrypted as a means to protect access to the dataset.
@@ -72,14 +72,6 @@ PS: [Fine-grained permissions](../developers/fg-permissions.md) are not integrat
-
- -What is the reach of Ocean Market - how many data buyers can I sell to? - -Hundreds of unique datasets are available that are sourced from private individuals, research institutions, commercial enterprises and government. Publishing data on Ocean offers data providers and algorithm owners an exciting new channel to connect with a rapidly growing community of Web3 enthusiasts and data science professionals around the world. - -
- ### Technical Questions
@@ -110,7 +102,7 @@ The blockchain can do more than just store information - it can also run code. A What is a datatoken? -A datatoken is an access token to datasets and services published in the Ocean ecosystem. Datatokens can be purchased via the Ocean Market or on a decentralized crypto exchange. . If a consumer wishes to access a dataset, they must acquire the datatoken and then exchange the datatoken for access to the dataset. +A datatoken is an access token to datasets and services published in the Ocean ecosystem. Datatokens can be purchased via the Ocean Market or on a decentralized crypto exchange. If a consumer wishes to access a dataset, they must acquire the datatoken and then exchange the datatoken for access to the dataset.
@@ -228,14 +220,6 @@ The counting starts at 12.01am on Thursday, and ends at 11.59pm on the following
-I staked for just one day. What rewards might I expect? - -At least 50 snapshots are randomly taken throughout the week. If you’ve staked just one day, and all else being equal, you should expect 1/7 the rewards compared to the full 7 days. - -
- -
- The datatoken price may change throughout the week. What price is taken in the DCV calculation? The price is taken at the same time as each consume. E.g. if a data asset has three consumes, where price was 1 OCEAN when the first consume happened, and the price was 10 OCEAN when the other consumes happened, then the total DCV for the asset is 1 + 10 + 10 = 21. @@ -276,7 +260,7 @@ To learn more about systems driving veOCEAN and Data Farming, please [visit our What about passive stakers — people who just want to stake in one place and be done? -Earnings are passive by default +Earnings are passive by default.
@@ -284,7 +268,7 @@ Earnings are passive by default What about active stakers — people who want to do extra work and get rewarded? -Ot works. Half the DF revenue goes to veOCEAN stake that users can allocate. Allocate well → more \$$ +Half the DF revenue goes to veOCEAN stake that users can allocate. Allocate well → more \$$.
@@ -346,15 +330,7 @@ They are deployed on Ethereum mainnet, alongside other Ocean contract deployment What is the official veOCEAN epoch start_time? -veFeeDistributor has a start\_time of 1663804800 (Thu Sep 22 2022 00:00:00) - -
- -
- -Will the Market still need to be multi-chain? - -Yes, Ocean Market still needs to be multi-chain: all the reasons that we went multi-chain for are as valid as ever. +veFeeDistributor has a start\_time of 1663804800 (Thu Sep 22 2022 00:00:00).
diff --git a/discover/glossary.md b/discover/glossary.md index cbddb803..bd9cfc0b 100644 --- a/discover/glossary.md +++ b/discover/glossary.md @@ -20,7 +20,7 @@ Ocean Protocol is a decentralized data exchange protocol that enables individual $OCEAN -The Ocean Protocol token (OCEAN) is a utility token used in the Ocean Protocol ecosystem. It serves as a medium of exchange and a unit of value for data services in the network. Participants in the Ocean ecosystem can use OCEAN tokens to buy and sell data, stake on data assets, and participate in the governance of the protocol. +The Ocean Protocol token (OCEAN) is a utility token used in the Ocean Protocol ecosystem. It serves as a medium of exchange and a unit of value for data services in the network. Participants in the Ocean ecosystem can use OCEAN tokens to buy and sell data, stake on data assets, and participate in the governance of the protocol.
@@ -36,7 +36,7 @@ The data consume value (DCV) is a key metric that refers to the amount of $ spen Transaction Volume (TV) -The transaction value is a key metric that refers to the value of transactions within the ecosystem. +The transaction value is a key metric that refers to the value of transactions within the ecosystem. Transaction volume(TV) is often used interchangeably with data consume volume (DCV). DCV is a more refined metric that excludes activities like wash trading. DCV measures the actual consumption or processing of data within the protocol, which is a more accurate measure of the value generated by the ecosystem. @@ -262,9 +262,9 @@ In the context of Ocean Protocol, interoperability enables the integration of th Smart contract -Smart contracts are self-executing digital contracts that allow for the automation and verification of transactions without the need for a third party. They are programmed using code and operate on a decentralized blockchain network. Smart contracts are designed to enforce the rules and regulations of a contract, ensuring that all parties involved fulfill their obligations. Once the conditions of the contract are met, the smart contract automatically executes the transaction, ensuring that the terms of the contract are enforced in a transparent and secure manner. +Smart contracts are self-executing digital contracts that allow for the automation and verification of transactions without the need for a third party. They are programmed using code and operate on a decentralized blockchain network. Smart contracts are designed to enforce the rules and regulations of a contract, ensuring that all parties involved fulfill their obligations. Once the conditions of the contract are met, the smart contract automatically executes the transaction, ensuring that the terms of the contract are enforced in a transparent and secure manner. -Ocean ecosystem smart contracts are deployed on multiple blockchains like Polygon, Energy Web Chain, Binance Smart Chain, and others. The code is open source and available on the organization's [GitHub](https://github.com/oceanprotocol/contracts). +Ocean ecosystem smart contracts are deployed on multiple blockchains like Polygon, Energy Web Chain, Binance Smart Chain, and others. The code is open source and available on the organization's [GitHub](https://github.com/oceanprotocol/contracts).
@@ -354,7 +354,7 @@ A term used in the cryptocurrency and blockchain space to encourage developers a ### -### Decentralized Finance (DeFI) fundamentals +### Decentralized Finance (DeFi) fundamentals
@@ -442,7 +442,7 @@ A strategy in which investors provide liquidity to a DeFi protocol in exchange f AI -AI stands for Artificial Intelligence. It refers to the development of computer systems that can perform tasks that would typically require human intelligence to complete. AI technologies enable computers to learn, reason, and adapt in a way that resembles human cognition. +AI stands for Artificial Intelligence. It refers to the development of computer systems that can perform tasks that would typically require human intelligence to complete. AI technologies enable computers to learn, reason, and adapt in a way that resembles human cognition.
@@ -452,9 +452,6 @@ AI stands for Artificial Intelligence. It refers to the development of computer Machine learning is a subfield of artificial intelligence (AI) that involves teaching computers to learn from data, without being explicitly programmed. In other words, it is a way for machines to automatically learn and improve from experience, without being explicitly told what to do in every situation. -\ - -
diff --git a/discover/networks/README.md b/discover/networks/README.md index 7940d466..9af3bfb8 100644 --- a/discover/networks/README.md +++ b/discover/networks/README.md @@ -13,13 +13,13 @@ In each network, whether it's the Ethereum mainnet, a testnet, or the Polygon/Ma The Ethereum mainnet is a production network, which means that it is a live and operational network that handles real transactions and has actual economic value. To connect to the Ethereum mainnet using a wallet such as MetaMask, you can click on the network name dropdown and select Ethereum mainnet from the list of available networks. -
* Gas Token: Mainnet ETH (Native token)
* OCEAN Token: 0x967da4048cD07aB37855c090aAF366e4ce1b9F48
* Explorer: https://etherscan.io
### Polygon Mainnet Ocean Protocol is also deployed to Polygon Mainnet, which is another production network. The native token of Polygon Mainnet is MATIC. If you cannot find Polygon Mainnet as a predefined network in your wallet, you can manually connect to it by following Polygon's [guide](https://wiki.polygon.technology/docs/develop/metamask/config-polygon-on-metamask/#add-the-polygon-network-manually), which provides step-by-step instructions for connecting to Polygon Mainnet. -
* Gas Token: Matic (Native token)
* OCEAN: 0x282d8efCe846A88B159800bd4130ad77443Fa1A1
* Explorer: https://polygonscan.com
**Bridge** @@ -29,7 +29,7 @@ Check our Polygon Bridge [guide](bridges.md) to learn how you can deposit, withd Ocean Protocol is also deployed to Binance Smart Chain (BSC), which is another production network. The native token of the Binance Smart Chain is BNB, which is the token of the Binance exchange. If Binance Smart Chain is not listed as a predefined network in your wallet, you can manually connect to it by following Binance's [guide](https://academy.binance.com/en/articles/connecting-metamask-to-binance-smart-chain), which provides detailed instructions on how to connect to Binance Smart Chain. -
* Gas Token: BSC BNB (Native token)
* OCEAN: 0xdce07662ca8ebc241316a15b611c89711414dd1a
* Explorer: https://bscscan.com/
**Bridge** @@ -39,7 +39,7 @@ Check our BSC Bridge [guide](bridges.md#binance-smart-chain-bsc-bridge) to learn Ocean Protocol is also deployed to [Energy Web Chain](https://energy-web-foundation.gitbook.io/energy-web/technology/trust-layer-energy-web-chain), which is another production network. The native token of the Energy Web Chain is EWT. If you cannot find Energy Web Chain as a predefined network in your wallet, you can manually connect to it by following this [guide](https://energy-web-foundation.gitbook.io/energy-web/how-tos-and-tutorials/connect-to-energy-web-chain-main-network-with-metamash). -
* Gas Token: Energy Web Chain EWT (Native token)
* OCEAN: 0x593122aae80a6fc3183b2ac0c4ab3336debee528
* Explorer: https://explorer.energyweb.org/
**Bridge** @@ -49,7 +49,7 @@ To bridge assets between Energy Web Chain and Ethereum mainnet, you can use [thi Ocean Protocol is also deployed to [Moonriver](https://docs.moonbeam.network/builders/get-started/networks/moonriver/), which is another production network. The native token of Moonriver is MOVR. If Moonriver is not listed as a predefined network in your wallet, you can manually connect to it by following this [guide](https://docs.moonbeam.network/builders/get-started/networks/moonriver/#connect-metamask). -
* Gas Token: MOVR (Native token)
* OCEAN: `0x99C409E5f62E4bd2AC142f17caFb6810B8F0BAAE`
* Explorer: https://blockscout.moonriver.moonbeam.network
**Bridge** @@ -59,7 +59,7 @@ To bridge assets between Moonriver and Ethereum mainnet, you can use [this](http Ocean Protocol is deployed on the Görli test network, which is used for testing and experimentation. Tokens on Görli do not hold real economic value, as it is a non-production network. To connect to Görli using a wallet like MetaMask, simply click on the network name dropdown and select _Goerli_ from the list of available networks. -
* Gas Token: Görli ETH (Native token)
* Görli ETH: Faucet (you may find others by searching)
* Görli OCEAN: Faucet
* OCEAN: `0xCfDdA22C9837aE76E0faA845354f33C62E03653a`
* Explorer: https://goerli.etherscan.io
### Mumbai @@ -67,5 +67,10 @@ Ocean Protocol is deployed on the Mumbai test network Matic / Polygon, which is If Mumbai is not listed as a predefined network in your wallet, you can connect to it manually by following [Matic's guide](https://wiki.polygon.technology/docs/develop/metamask/config-polygon-on-metamask/). -
* Gas Token: Mumbai MATIC (Native token)
* Mumbai MATIC: Faucet (you may find others by searching)
* Mumbai OCEAN: Faucet
* OCEAN: `0xd8992Ed72C445c35Cb4A2be468568Ed1079357c8`
* Explorer: https://mumbai.polygonscan.com
+### Sepolia + +Ocean Protocol is deployed on the Sepolia test network, which is designed for testing and experimentation purposes. Tokens in Sepolia do not hold any real economic value, as it is not a production network. To connect to Sepolia using a wallet like MetaMask, you can select "Sepolia" from the network dropdown list(enable "Show test networks"). + +
* Gas Token: SepoliaETH (Native token)
* SepoliaETH: Faucet
* Sepolia OCEAN: Faucet
* OCEAN: `0x1B083D8584dd3e6Ff37d04a6e7e82b5F622f3985`
* Explorer: https://sepolia.etherscan.io/
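If you script against these networks, it helps to keep the chain IDs and OCEAN token addresses from the tables above in one place. The snippet below is only an illustrative lookup table: the addresses are copied from this page and the chain IDs are the standard public values, but always double-check against the officially published contract addresses before moving funds.

```python
# Illustrative lookup of chain IDs and OCEAN token addresses (copied from the tables above).
OCEAN_ADDRESSES = {
    "ethereum":  {"chain_id": 1,        "ocean": "0x967da4048cD07aB37855c090aAF366e4ce1b9F48"},
    "polygon":   {"chain_id": 137,      "ocean": "0x282d8efCe846A88B159800bd4130ad77443Fa1A1"},
    "bsc":       {"chain_id": 56,       "ocean": "0xdce07662ca8ebc241316a15b611c89711414dd1a"},
    "energyweb": {"chain_id": 246,      "ocean": "0x593122aae80a6fc3183b2ac0c4ab3336debee528"},
    "moonriver": {"chain_id": 1285,     "ocean": "0x99C409E5f62E4bd2AC142f17caFb6810B8F0BAAE"},
    "goerli":    {"chain_id": 5,        "ocean": "0xCfDdA22C9837aE76E0faA845354f33C62E03653a"},
    "mumbai":    {"chain_id": 80001,    "ocean": "0xd8992Ed72C445c35Cb4A2be468568Ed1079357c8"},
    "sepolia":   {"chain_id": 11155111, "ocean": "0x1B083D8584dd3e6Ff37d04a6e7e82b5F622f3985"},
}

for name, info in OCEAN_ADDRESSES.items():
    print(f"{name}: chainId={info['chain_id']}, OCEAN={info['ocean']}")
```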
diff --git a/discover/ocean-101.md b/discover/ocean-101.md index 8515b75a..42f62fd5 100644 --- a/discover/ocean-101.md +++ b/discover/ocean-101.md @@ -1,6 +1,6 @@ # Ocean 101 -

[Figure: Let's see how it works]
## How Does Ocean Work? @@ -9,14 +9,16 @@ Ocean Protocol utilizes a combination of blockchain technology, decentralized ne 1. **Asset Registration**: Data providers register their data assets on the Ocean blockchain, providing metadata that describes the asset, its usage terms, and pricing information. This metadata is stored on-chain and can be accessed by potential data consumers. 2. **Discovery and Access Control**: Data consumers can discover available data assets through decentralized metadata services like Aquarius. Access control mechanisms, such as smart contracts, verify the consumer's permissions and handle the transfer of data access tokens. 3. **Secure Data Exchange**: When a data consumer purchases access to a data asset, the asset's metadata, and access instructions are encrypted by the data provider using the Provider service. The encrypted asset is then securely transferred to the consumer, who can decrypt and utilize it without revealing the asset's URL. -4. [**Compute-to-Data**](../developers/compute-to-data/) **(C2D)**: Ocean Protocol supports C2D capabilities, allowing data consumers to perform computations on data assets without direct access to the underlying data. The compute operations are executed in a secure and controlled environment, ensuring data privacy and compliance. -5. **Incentives and Governance**: Ocean Protocol incorporates tokeconomics and a governance framework to incentivize participants and ensure the sustainability and evolution of the ecosystem. Participants can earn and stake Ocean tokens (OCEAN) for veOCEANs, curate data, contribute to the network, and participate in governance decisions. +4. [**Compute-to-Data**](../developers/compute-to-data/README.md) **(C2D)**: Ocean Protocol supports C2D capabilities, allowing data consumers to perform computations on data assets without direct access to the underlying data. The compute operations are executed in a secure and controlled environment, ensuring data privacy and compliance. +5. **Incentives and Governance**: Ocean Protocol incorporates tokenomics and a governance framework to incentivize participants and ensure the sustainability and evolution of the ecosystem. Participants can earn and stake Ocean tokens (OCEAN) for veOCEANs, curate data, contribute to the network and participate in governance decisions. Ocean Protocol also combines advanced technologies and web components to create a robust and efficient data ecosystem. -Powerful libraries such as [Ocean.js](../developers/ocean.js/) (JavaScript) and [Ocean.py](../developers/ocean.py/) (Python) facilitate seamless integration and interaction with the protocol, offering a wide range of functionalities. +

[Figure: Ocean architectural overview]
-Ocean Protocol incorporates middleware components that enhance efficiency and streamline interactions. Components such as [Aquarius](../developers/aquarius/) act as a metadata cache, improving search efficiency by caching on-chain data into Elasticsearch while [Provider](../developers/provider/) plays a crucial role in various ecosystem operations, assisting in asset downloading, handling encryption of [Decentralized Data Objects](../developers/ddo-specification.md) (DDOs), and facilitating communication with the operator-service for Compute-to-Data jobs. And finally, the [Subgraph](../developers/subgraph/), an off-chain service leveraging GraphQL, offers efficient access to information related to datatokens, users, and balances. +Powerful libraries such as [Ocean.js](../developers/ocean.js/README.md) (JavaScript) and [Ocean.py](../developers/ocean.py/README.md) (Python) facilitate seamless integration and interaction with the protocol, offering a wide range of functionalities. + +Ocean Protocol incorporates middleware components that enhance efficiency and streamline interactions. Components such as [Aquarius](../developers/aquarius/README.md) act as a metadata cache, improving search efficiency by caching on-chain data into Elasticsearch while [Provider](../developers/provider/README.md) plays a crucial role in various ecosystem operations, assisting in asset downloading, handling encryption of [Decentralized Data Objects](../developers/ddo-specification.md) (DDOs), and facilitating communication with the operator-service for Compute-to-Data jobs. And finally, the [Subgraph](../developers/subgraph/README.md), an off-chain service leveraging GraphQL, offers efficient access to information related to datatokens, users, and balances. These libraries and middleware components contribute to efficient data discovery and secure interactions within the Ocean Protocol ecosystem. diff --git a/discover/wallets-and-ocean-tokens.md b/discover/wallets-and-ocean-tokens.md index 7c5ec613..bc349b17 100644 --- a/discover/wallets-and-ocean-tokens.md +++ b/discover/wallets-and-ocean-tokens.md @@ -6,7 +6,7 @@ description: >- # Manage Your OCEAN Tokens -If you don't see any Ocean Tokens in your crypto wallet software 🔎 (e.g. MetaMask or MyEtherWallet), don't worry! It might not know how to manage Ocean Tokens yet. +If you don't see any Ocean Tokens in your crypto wallet software 🔎 (e.g. MetaMask or MyEtherWallet), don't worry! It might not know how to manage Ocean Tokens yet. ### Token Information @@ -28,7 +28,8 @@ The [OCEAN Token page](https://oceanprotocol.com/token) at oceanprotocol.com has If you prefer visual demonstrations, we have prepared a visual demo that illustrates the steps mentioned above. -{% @arcade/embed flowId="yHiKKN336QGdAkhTlsIh" url="https://app.arcade.software/share/yHiKKN336QGdAkhTlsIh" %} +{% embed url="https://app.arcade.software/share/yHiKKN336QGdAkhTlsIh" fullWidth="false" %} +{% endembed %} MetaMask should now show your Ocean Token (OCEAN) balance, and when you're looking at that, there should be a `Send` button to send Ocean Tokens to others. For help with that, see [the MetaMask docs about how to send tokens](https://metamask.zendesk.com/hc/en-us/articles/360015488931-How-to-Send-Tokens). 
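If MetaMask still shows no balance after you add the custom token, you can double-check what the token contract itself reports. Below is a minimal, illustrative sketch using the third-party `web3` Python package (an assumption — it is not part of the Ocean stack described here); swap in your own RPC endpoint and wallet address.

```python
from web3 import Web3  # pip install web3 (this sketch assumes the v6 API)

# Minimal ERC-20 ABI: only the calls we need.
ERC20_ABI = [
    {"name": "symbol", "inputs": [], "outputs": [{"type": "string"}], "stateMutability": "view", "type": "function"},
    {"name": "decimals", "inputs": [], "outputs": [{"type": "uint8"}], "stateMutability": "view", "type": "function"},
    {"name": "balanceOf", "inputs": [{"name": "owner", "type": "address"}], "outputs": [{"type": "uint256"}], "stateMutability": "view", "type": "function"},
]

OCEAN_MAINNET = "0x967da4048cD07aB37855c090aAF366e4ce1b9F48"  # OCEAN on Ethereum mainnet (see the Networks page)
RPC_URL = "https://cloudflare-eth.com"  # public gateway, used here only as an example; prefer your own RPC
WALLET = "0x0000000000000000000000000000000000000000"  # replace with your own wallet address

w3 = Web3(Web3.HTTPProvider(RPC_URL))
ocean = w3.eth.contract(address=Web3.to_checksum_address(OCEAN_MAINNET), abi=ERC20_ABI)

symbol = ocean.functions.symbol().call()
decimals = ocean.functions.decimals().call()
raw_balance = ocean.functions.balanceOf(Web3.to_checksum_address(WALLET)).call()
print(f"{raw_balance / 10**decimals} {symbol}")
```

The same pattern works on the other networks listed in the Networks page — switch the RPC endpoint and use that chain's OCEAN address.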
diff --git a/infrastructure/README.md b/infrastructure/README.md index 553594df..823883a4 100644 --- a/infrastructure/README.md +++ b/infrastructure/README.md @@ -6,9 +6,9 @@ coverY: 0 # 🔨 Infrastructure -There are many ways in which the components can be deployed, from simple configurations used for development and testing to complex configurations, used for production systems. +There are many ways in which the components can be deployed, from simple configurations used for development and testing to complex configurations, used for production systems. -All the Ocean Protocol components ([Provider](../developers/provider/README.md), [Aquarius](../developers/aquarius/README.md), [Subgraph](../developers/subgraph/README.md)) are designed to run in Docker containers, on a Linux operating system. For simple configurations, we rely on Docker Engine and Docker Compose products to deploy and run our components, while for complex configurations we use Kubernetes. The guides included in this section will present both deployment options. +All the Ocean Protocol components ([Provider](../developers/provider/README.md), [Aquarius](../developers/aquarius/README.md), [Subgraph](../developers/subgraph/README.md)) are designed to run in Docker containers, on a Linux operating system. For simple configurations, we rely on Docker Engine and Docker Compose products to deploy and run our components, while for complex configurations we use Kubernetes. The guides included in this section will present both deployment options. Please note that deploying the Ocean components requires a good understanding of: @@ -16,6 +16,6 @@ Please note that deploying the Ocean components requires a good understanding of * Docker Engine * Docker Compose or Kubernetes (depending on the configuration chosen for the component deployment) -Please note that although Ocean Marketplace is not a core component of our stack but rather an example of what can be achieved with our technology, in this section we included a guide on how to deploy it. +Please note that although Ocean Marketplace is not a core component of our stack but rather an example of what can be achieved with our technology, in this section we included a guide on how to deploy it. All components need to be deployed on a server, so we included a guide about how to install and configure a server will all the necessary tools. diff --git a/infrastructure/compute-to-data-minikube.md b/infrastructure/compute-to-data-minikube.md index ba61f48a..e4b7d7cd 100644 --- a/infrastructure/compute-to-data-minikube.md +++ b/infrastructure/compute-to-data-minikube.md @@ -2,15 +2,40 @@ title: Minikube Compute-to-Data Environment --- -# C2D - Minikube Environment +# Deploying C2D + +This chapter will present how to deploy the C2D component of the Ocean stack. As mentioned in the [C2D Architecture chapter](../developers/compute-to-data/#architecture-and-overview-guides), the Compute-to-Data component uses Kubernetes to orchestrate the creation and deletion of the pods in which the C2D jobs are run. + +For the ones that do not have a Kubernetes environment available, we added to this guide instructions on how to install Minikube, which is a lightweight Kubernetes implementation that creates a VM on your local machine and deploys a simple cluster containing only one node. In case you have a Kubernetes environment in place, please skip directly to step 4 of this guide. + + ### Requirements -* functioning internet-accessable provider service -* machine capable of running compute (e.g. 
we used a machine with 8 CPUs, 16 GB Ram, 100GB SSD and fast internet connection) -* Ubuntu 22.04 LTS +* Communications: a functioning internet-accessible provider service +* Hardware: a server capable of running compute jobs (e.g. we used a machine with 8 CPUs, 16 GB Ram, 100GB SSD, and a fast internet connection). See [this guide](setup-server.md) for how to create a server; +* Operating system: Ubuntu 22.04 LTS -### Install Docker and Git + + +### Steps + +1. [Install Docker and Git](compute-to-data-minikube.md#install-docker-and-git) +2. [Install Minikube](compute-to-data-minikube.md#install-minikube) +3. [Start Minikube](compute-to-data-minikube.md#start-minikube) +4. [Install the Kubernetes command line tool (kubectl)](compute-to-data-minikube.md#install-the-kubernetes-command-line-tool-kubectl) +5. [Run the IPFS host (optional)](compute-to-data-minikube.md#run-the-ipfs-host-optional) +6. [Update the storage class](compute-to-data-minikube.md#update-the-storage-class) +7. [Download and Configure Operator Service](compute-to-data-minikube.md#download-and-configure-operator-service) +8. [Download and Configure Operator Engine](compute-to-data-minikube.md#download-and-configure-operator-engine) +9. [Create namespaces](compute-to-data-minikube.md#create-namespaces) +10. [Deploy Operator Service](compute-to-data-minikube.md#deploy-operator-service) +11. [Deploy Operator Engine](compute-to-data-minikube.md#deploy-operator-engine) +12. [Expose Operator Service](compute-to-data-minikube.md#expose-operator-service) +13. [Initialize the database](compute-to-data-minikube.md#initialize-database) +14. [Update Provider](compute-to-data-minikube.md#update-provider) + +#### Install Docker and Git ```bash sudo apt update @@ -18,16 +43,16 @@ sudo apt install git docker.io sudo usermod -aG docker $USER && newgrp docker ``` -### Install Minikube +#### Install Minikube ```bash wget -q --show-progress https://github.com/kubernetes/minikube/releases/download/v1.22.0/minikube_1.22.0-0_amd64.deb sudo dpkg -i minikube_1.22.0-0_amd64.deb ``` -### Start Minikube +#### Start Minikube -First command is imporant, and solves a [PersistentVolumeClaims problem](https://github.com/kubernetes/minikube/issues/7828). +The first command is important and solves a [PersistentVolumeClaims problem](https://github.com/kubernetes/minikube/issues/7828). ```bash minikube config set kubernetes-version v1.16.0 @@ -38,7 +63,7 @@ Depending on the number of available CPUs, RAM, and the required resources for r For other options to run minikube refer to this [link](https://minikube.sigs.k8s.io/docs/commands/start/) -### Install kubectl +#### Install the Kubernetes command line tool (kubectl) ```bash curl -LO "https://dl.k8s.io/release/$(curl -L -s https://dl.k8s.io/release/stable.txt)/bin/linux/amd64/kubectl" @@ -48,13 +73,17 @@ echo "$(> /etc/hosts' ``` -### Storage class (Optional) +#### Update the storage class -For minikube, you can use the default 'standard' class. +The storage class is used by Kubernetes to create the temporary volumes on which the data used by the algorithm will be stored. -For AWS, please make sure that your class allocates volumes in the same region and zone in which you are running your pods. +Please ensure that your class allocates volumes in the same region and zone where you are running your pods. -We created our own 'standard' class in AWS: +You need to consider the storage class available for your environment. + +For Minikube, you can use the default 'standard' class. 
+ +In AWS, we created our own 'standard' class: ```bash kubectl get storageclass standard -o yaml @@ -96,15 +129,15 @@ volumeBindingMode: Immediate For more information, please visit https://kubernetes.io/docs/concepts/storage/storage-classes/ -### Download and Configure Operator Service +#### Download and Configure Operator Service -Open new terminal and run the command below. +Open a new terminal and run the command below. ```bash git clone https://github.com/oceanprotocol/operator-service.git ``` -Edit `operator-service/kubernetes/postgres-configmap.yaml`. Change `POSTGRES_PASSWORD` to nice long random password. +Edit `operator-service/kubernetes/postgres-configmap.yaml`. Change `POSTGRES_PASSWORD` to a nice long random password. Edit `operator-service/kubernetes/deployment.yaml`. Optionally change: @@ -132,24 +165,24 @@ spec: value: "3600" ``` -### Download and Configure Operator Engine +#### Download and Configure Operator Engine ```bash git clone https://github.com/oceanprotocol/operator-engine.git ``` -Check the [README](https://github.com/oceanprotocol/operator-engine#customize-your-operator-engine-deployment) section of operator engine to customize your deployment. +Check the [README](https://github.com/oceanprotocol/operator-engine#customize-your-operator-engine-deployment) section of the operator engine to customize your deployment. -At a minimum you should add your IPFS URLs or AWS settings, and add (or remove) notification URLs. +At a minimum, you should add your IPFS URLs or AWS settings, and add (or remove) notification URLs. -### Create namespaces +#### Create namespaces ```bash kubectl create ns ocean-operator kubectl create ns ocean-compute ``` -### Deploy Operator Service +#### Deploy Operator Service ```bash kubectl config set-context --current --namespace ocean-operator @@ -160,7 +193,7 @@ kubectl create -f operator-service/kubernetes/postgresql-service.yaml kubectl apply -f operator-service/kubernetes/deployment.yaml ``` -### Deploy Operator Engine +#### Deploy Operator Engine ```bash kubectl config set-context --current --namespace ocean-compute @@ -176,7 +209,7 @@ kubectl create -f operator-service/kubernetes/postgres-configmap.yaml kubectl -n ocean-compute apply -f /ocean/operator-engine/kubernetes/egress.yaml ``` -### Expose Operator Service +#### Expose Operator Service ```bash kubectl expose deployment operator-api --namespace=ocean-operator --port=8050 @@ -190,15 +223,15 @@ kubectl -n ocean-operator port-forward svc/operator-api 8050 Alternatively you could use another method to communicate between the C2D Environment and the provider, such as an SSH tunnel. -### Initialize database +#### Initialize database -If your minikube is running on compute.example.com: +If your Minikube is running on compute.example.com: ```bash curl -X POST "https://compute.example.com/api/v1/operator/pgsqlinit" -H "accept: application/json" ``` -### Update Provider +#### Update Provider Update your provider service by updating the `operator_service.url` value in `config.ini` @@ -208,4 +241,3 @@ operator_service.url = https://compute.example.com/ Restart your provider service. 
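If you want to keep these checks in a script, the database-initialization call from the "Initialize database" step above can be reproduced with Python's `requests` package. This is only a convenience sketch mirroring the `curl` command, and it assumes your operator-service is exposed at the same example hostname used throughout this guide.

```python
import requests  # pip install requests

# Example hostname used in this guide; replace with your own deployment.
OPERATOR_SERVICE_URL = "https://compute.example.com"

# Same call as the `curl -X POST .../api/v1/operator/pgsqlinit` command above.
resp = requests.post(
    f"{OPERATOR_SERVICE_URL}/api/v1/operator/pgsqlinit",
    headers={"accept": "application/json"},
    timeout=30,
)
print(resp.status_code, resp.text)
```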
-[Watch the explanatory video for more details](https://vimeo.com/580934725) diff --git a/infrastructure/deploying-aquarius.md b/infrastructure/deploying-aquarius.md index 0045983e..f0a0f821 100644 --- a/infrastructure/deploying-aquarius.md +++ b/infrastructure/deploying-aquarius.md @@ -29,7 +29,7 @@ This guide will deploy Aquarius, including Elasticsearch as a single systemd ser From a terminal console, create /etc/docker/compose/aquarius/docker-compose.yml file, then copy and paste the following content to it. Check the comments in the file and replace the fields with the specific values of your implementation. The following example is for deploying Aquarius for Goerli network. -For each other network in which you want to deploy Aquarius, add to the file a section similar to "aquarius-events-goerli" included in this example and update the corresponding parameters (i.e. EVENTS\_RPC, OCEAN\_ADDRESS, SUBGRAPH\_URLS) specific to that network. \\ +For each other network in which you want to deploy Aquarius, add to the file a section similar to "aquarius-events-goerli" included in this example and update the corresponding parameters (i.e. EVENTS\_RPC, OCEAN\_ADDRESS, SUBGRAPH\_URLS) specific to that network. ```yaml version: '3.9' @@ -281,9 +281,9 @@ $ curl localhost:9200 Aquarius supports indexing multiple chains using a single instance to serve API requests and one instance for each chain that must be indexed. -![image](https://github.com/oceanprotocol/docs/assets/54084524/8099e7d7-171d-4d5a-8475-61706c99f4e5) +

[Figure: Aquarius deployment - multiple chains indexing]
-The following deployment templates could be used for guidance. Some parameters are [optional](https://github.com/oceanprotocol/aquarius) and the template could be adjusted based on these considerations. Common cases are the deployments for one/multiple multiple Ethereum networks: +The following deployment templates could be used for guidance. Some parameters are [optional](https://github.com/oceanprotocol/aquarius) and the template could be adjusted based on these considerations. Common cases are the deployments for one/multiple Ethereum networks: * Mainnet * Goerli diff --git a/infrastructure/deploying-ocean-subgraph.md b/infrastructure/deploying-ocean-subgraph.md index bd383de3..48f390a8 100644 --- a/infrastructure/deploying-ocean-subgraph.md +++ b/infrastructure/deploying-ocean-subgraph.md @@ -4,7 +4,7 @@ Ocean subgraph allows querying the datatoken, data NFT, and all event information using GraphQL. Hosting the Ocean subgraph saves the cost and time required in querying the data directly from the blockchain. The steps in this tutorial will explain how to host Ocean subgraph for the EVM-compatible chains supported by Ocean Protocol. -Ocean Subgraph is deployed on top of [graph-node](https://github.com/graphprotocol/graph-node), therefore, in this document, we will show first how to deploy graph-node - either using Docker Engine or Kubernetes - and then how to install Ocean Subgraph on the graph-node system. +Ocean Subgraph is deployed on top of [graph-node](https://github.com/graphprotocol/graph-node), therefore, in this document, we will show first how to deploy graph-node - either using Docker Engine or Kubernetes - and then how to install Ocean Subgraph on the graph-node system. ## Deploying Graph-node using Docker Engine and Docker Compose @@ -25,7 +25,7 @@ Ocean Subgraph is deployed on top of [graph-node](https://github.com/graphprotoc #### 1. Create the /etc/docker/compose/graph-node/docker-compose.yml file -From a terminal console, create the _/etc/docker/compose/graph-node/docker-compose.yml_ file, then copy and paste the following content to it (. Check the comments in the file and replace the fields with the specific values of your implementation. +From a terminal console, create the _/etc/docker/compose/graph-node/docker-compose.yml_ file, then copy and paste the following content to it (. Check the comments in the file and replace the fields with the specific values of your implementation. _/etc/docker/compose/graph-node/docker-compose.yml_ (annotated - example for `mumbai` network) @@ -174,7 +174,7 @@ Then, check the logs of the Ocean Subgraph docker container: docker logs graph-node [--follow] ``` -## Deploying graph-node using Kubernetes +## Deploying graph-node using Kubernetes In this example, we will deploy graph-node as a Kubernetes deployment service. [graph-node](https://github.com/graphprotocol/graph-node) has the following dependencies: PostgreSQL and IPFS. @@ -431,7 +431,7 @@ spec: ## Deploy Ocean Subgraph -After you deployed graph-node, either using Kubernetes or Docker Compose, you can proceed to deploy Ocean Subgraph on top of it. +After you deployed graph-node, either using Kubernetes or Docker Compose, you can proceed to deploy Ocean Subgraph on top of it. 
### Prerequisites diff --git a/infrastructure/deploying-provider.md b/infrastructure/deploying-provider.md index cf2c75e3..6bfe18b0 100644 --- a/infrastructure/deploying-provider.md +++ b/infrastructure/deploying-provider.md @@ -2,20 +2,18 @@ ### About Provider -Provider encrypts the URL and metadata during publishing and decrypts the URL when the dataset is downloaded or a compute job is started. It enables access to the data assets by streaming data (and never the URL). It performs checks on-chain for buyer permissions and payments. It also provides compute services (connects to a C2D environment). +Provider encrypts the URL and metadata during publishing and decrypts the URL when the dataset is downloaded or a compute job is started. It enables access to the data assets by streaming data (and never the URL). It performs checks on-chain for buyer permissions and payments. It also provides compute services (connects to a C2D environment). Provider is a multichain component, meaning that it can handle these tasks on multiple chains with the proper configurations. The source code of Provider can be accessed from [here](https://github.com/oceanprotocol/provider). As mentioned in the Setup a Server document, all Ocean components can be deployed in two types of configurations: simple, based on Docker Engine and Docker Compose, and complex, based on Kubernetes with Docker Engine. In this document, we will present how to deploy Provider in each of these configurations. - ## Deploying Provider using Docker Engine and Docker Compose In this guide, we will deploy Provider for two chains: Goerli (Ethereum test network) and Mumbai (Polygon test network). Therefore, please note that in the following configuration files, "5" and "80001" are the chain IDs for Goerli and Mumbai respectively. - ### Prerequisites * A server for hosting Provider. See [this guide](setup-server.md) for how to create a server; @@ -27,8 +25,8 @@ In this guide, we will deploy Provider for two chains: Goerli (Ethereum test net The steps to deploy the Provider using Docker Engine and Docker Compose are: -1. [Create the /etc/docker/compose/provider/docker-compose.yml file](deploying-provider.md#1.-create-the-etc-docker-compose-provider-docker-compose.yml-file) -2. [Create the /etc/systemd/system/docker-compose@provider.service file](deploying-provider.md#2.-create-the-etc-systemd-system-docker-compose-provider.service-file) +1. [Create the /etc/docker/compose/provider/docker-compose.yml file](deploying-provider.md#1-create-the-etcdockercomposeproviderdocker-composeyml-file) +2. [Create the /etc/systemd/system/docker-compose@provider.service file](deploying-provider.md#2-create-the-etcsystemdsystemdocker-composeproviderservice-file) 3. [Reload the systemd manager configuration](deploying-provider.md#3.-reload-the-systemd-manager-configuration) 4. [Start the Provider service](deploying-provider.md#4.-start-the-provider-service) 5. [Check the service's status](deploying-provider.md#5.-check-the-services-status) @@ -36,10 +34,9 @@ The steps to deploy the Provider using Docker Engine and Docker Compose are: 7. [Check Provider service logs](deploying-provider.md#7.-check-provider-service-logs) - #### 1. Create the /etc/docker/compose/provider/docker-compose.yml file -From a terminal console, create /etc/docker/compose/provider/docker-compose.yml file, then copy and paste the following content to it. Check the comments in the file and replace the fields with the specific values of your implementation. 
+From a terminal console, create /etc/docker/compose/provider/docker-compose.yml file, then copy and paste the following content to it. Check the comments in the file and replace the fields with the specific values of your implementation. ```yaml version: '3' @@ -70,7 +67,6 @@ networks: ``` - #### 2. Create the _/etc/systemd/system/docker-compose@provider.service_ file Create the _/etc/systemd/system/docker-compose@provider.service_ file then copy and paste the following content to it. This example file could be customized if needed. @@ -96,7 +92,6 @@ WantedBy=multi-user.target ``` - #### 3. Reload the systemd manager configuration Run the following command to reload the systemd manager configuration @@ -112,7 +107,6 @@ sudo systemctl enable docker-compose@provider.service ``` - #### 4. Start the Provider service To start the Provider service, run the following command: @@ -122,7 +116,6 @@ sudo systemctl start docker-compose@provider.service ``` - #### 5. Check the service's status Check the status of the service by running the following command. The output of the command should be similar to the one presented here. @@ -147,10 +140,9 @@ Jun 14 09:41:53 testvm systemd[1]: Finished provider service with docker compose ``` - #### 6. Confirm the Provider is accessible -Once started, the Provider service is accessible on `localhost` port 8030/tcp. Run the following command to access the Provider. The output should be similar to the one displayed here. +Once started, the Provider service is accessible on `localhost` port 8030/tcp. Run the following command to access the Provider. The output should be similar to the one displayed here. ```bash $ curl localhost:8030 @@ -158,7 +150,6 @@ $ curl localhost:8030 ``` - #### 7. Check Provider service logs If needed, use docker CLI to check provider service logs. @@ -196,11 +187,9 @@ $ docker logs --follow provider ``` - ## Deploying Provider using Kubernetes with Docker Engine - In this example, we will run Provider as a Kubernetes deployment resource. We will deploy Provider for two chains: Goerli (Ethereum test network) and Mumbai (Polygon test network). Therefore, please note that in the following configuration files, "5" and "80001" are the chain IDs for Goerli and Mumbai respectively. ### Prerequisites @@ -215,17 +204,16 @@ In this example, we will run Provider as a Kubernetes deployment resource. We wi The steps to deploy the Provider in Kubernetes are: -[1. Create a YAML file for Provider configuration.](deploying-provider.md#1.-create-an-yaml-file-for-provider-configuration.) +[1. Create a YAML file for Provider configuration.](deploying-provider.md#1-create-a-yaml-file-for-provider-configuration) [2. Deploy the configuration.](deploying-provider.md#2.-deploy-the-configuration) [3. Create a Kubernetes service.](deploying-provider.md#3.-create-a-kubernetes-service) - #### 1. Create a YAML file for Provider configuration. -From a terminal window, create a YAML file (in our example the file is named provider-deploy.yaml) then copy and paste the following content. Check the comments in the file and replace the fields with the specific values of your implementation (RPC URLs, the private key etc.). +From a terminal window, create a YAML file (in our example the file is named provider-deploy.yaml) then copy and paste the following content. Check the comments in the file and replace the fields with the specific values of your implementation (RPC URLs, the private key etc.). 
```yaml apiVersion: apps/v1 @@ -303,7 +291,6 @@ spec: Tip: before deployment, you can [validate](https://github.com/instrumenta/kubeval) the yaml file. - #### 2. Deploy the configuration Deploy the configuration in Kubernetes using the following commands. @@ -319,7 +306,6 @@ provider-865cb8cf9d-r9xm4 1/1 Running 0 67s ``` - #### 3. Create a Kubernetes service The next step is to create a Kubernetes service (eg. ClusterIP, NodePort, Loadbalancer, ExternalName) for this deployment, depending on the environment specifications. Follow [this link](https://kubernetes.io/docs/concepts/services-networking/service/) for details on how to create a Kubernetes service. diff --git a/infrastructure/setup-server.md b/infrastructure/setup-server.md index d15e04b4..c5240239 100644 --- a/infrastructure/setup-server.md +++ b/infrastructure/setup-server.md @@ -16,9 +16,9 @@ For simple configurations: For complex configurations: -* Operating System: Linux distribution supported by Kubernetes and Docker Engine. Please refer to this link for details: [Kubernetes with Docker Engine](https://kubernetes.io/docs/setup/production-environment/container-runtimes/#docker). +* Operating System: Linux distribution supported by Kubernetes and Docker Engine. Please refer to this link for details: [Kubernetes with Docker Engine](https://kubernetes.io/docs/setup/production-environment/container-runtimes/#docker). + - ## Server Size @@ -47,7 +47,7 @@ For complex configurations: As mentioned earlier, you can use either an on-premise server or one hosted in the cloud (AWS, Azure, Digitalocean, etc.). To install the operating system on an on-premise server, please refer to the installation documentation of the operating system. -If you choose to use a server hosted in the cloud, you need to create the server using the user interface provided by the cloud platform. Following is an example of how to create a server in Digitalocean. +If you choose to use a server hosted in the cloud, you need to create the server using the user interface provided by the cloud platform. Following is an example of how to create a server in Digitalocean. #### Example: Create an Ubuntu Linux server in the Digitalocean cloud @@ -59,7 +59,7 @@ Go to [https://www.digitalocean.com/](https://www.digitalocean.com/) and create Click on **`Create`** button and choose **`Droplets`** options from dropdown. -

[Figure: Select Droplet]
@@ -67,7 +67,7 @@ Click on **`Create`** button and choose **`Droplets`** options from dropdown. Select Ubuntu OS, and choose a plan and a configuration. -

[Figure: Configure the server]
### @@ -75,21 +75,21 @@ Select Ubuntu OS, and choose a plan and a configuration. Select the region where you want the component to be hosted and a root password. -

[Figure: Select the region and set the root password]
5. Finish the configuration and create the server -Specify a hostname for the server, specify the project to which you assign the server, and then click on `Create Droplet.` +Specify a hostname for the server, specify the project to which you assign the server, and then click on `Create Droplet.` -

[Figure: Finalize and create the server]
6. Access the server's console After the server is ready, select the `Access console` option from the dropdown list to open a terminal window. -

[Figure: Access the server's console]
### Install Docker Engine and Docker Compose @@ -113,9 +113,9 @@ sudo apt-get install docker-compose-plugin ### Install Kubernetes with Docker Engine -Kubernetes is an orchestration engine for containerized applications and the initial setup is dependent on the platform on which it is deployed - presenting how this product must be installed and configured is outside the scope of this document. +Kubernetes is an orchestration engine for containerized applications and the initial setup is dependent on the platform on which it is deployed - presenting how this product must be installed and configured is outside the scope of this document. -For cloud deployment, most of the cloud providers have dedicated turnkey solutions for Kubernetes. A comprehensive list of such cloud providers is presented [here](https://kubernetes.io/docs/setup/production-environment/turnkey-solutions/). +For cloud deployment, most of the cloud providers have dedicated turnkey solutions for Kubernetes. A comprehensive list of such cloud providers is presented [here](https://kubernetes.io/docs/setup/production-environment/turnkey-solutions/). For an on-premise deployment of Kubernetes, please refer to this [link](https://kubernetes.io/docs/setup/). diff --git a/rewards/README.md b/rewards/README.md index b2fad061..132e0068 100644 --- a/rewards/README.md +++ b/rewards/README.md @@ -11,10 +11,6 @@ coverY: 0 The purpose of Ocean Protocol's Data Farming dApp reward system is to incentivize the curation and publishing of high-quality data NFTs in the Ocean Ecosystem. Data Farming participants earn OCEAN rewards for these activities. Data Farming rewards are structured in 2 main streams **Passive Rewards** and **Active Rewards**. At a minimum, Data Farmers earn "passive rewards" for locking their OCEAN tokens to get veOCEAN tokens in return. Then, Data Farmers can maximize their yield by earning "active rewards". Active Rewards stream it's broken down into multiple substreams, to offer you a variaty of ways to contribute to Ocean Protocol values and increase your Data Farming rewards: Volume DF, Challenge DF. -### The belt system - -Earn your white, blue, purple, brown, and black belts in Data Farming knowledge by reading our docs on this topic in increasing difficulty! - ## veOCEAN Learning about [veOCEAN](veocean.md) will help you answer the question "What is the purpose of holding veOCEAN?" & give insights on how veOCEAN (vote-escrowed OCEAN) works. It will teach you everything you need to know about why it exists and how it works. diff --git a/rewards/df-basic.md b/rewards/df-basic.md index 8882562c..7b387d0f 100644 --- a/rewards/df-basic.md +++ b/rewards/df-basic.md @@ -1,14 +1,14 @@ --- -description: Learn the basic moves to start kicking a** Data Farming +description: Show you can complete all User-Guides as the last steps to master Data Farming! --- -# DF Basic Actions (Blue Belt) +# Complete User Guides

[Figure: Like Neo, you have great potential.]
### Get Started -Our [User Guides](../user-guides/README.md) get you started Data Farming quickly to do the basic operations. Follow these guides to earn your blue belt in Data Farming understanding. +Our [User Guides](../user-guides/README.md) get you started on Data Farming quickly so you can complete all operations. Follow these guides to get a complete handle on all Data Farming activities. {% content-ref url="../user-guides/get-started-df.md" %} [get-started-df.md](../user-guides/get-started-df.md) diff --git a/rewards/df-emissions-apys.md b/rewards/df-emissions-apys.md index b9502851..e6616a22 100644 --- a/rewards/df-emissions-apys.md +++ b/rewards/df-emissions-apys.md @@ -1,11 +1,10 @@ --- description: >- - Hey there, Bruce Lee! If you can understand the emission curves and estimated - APYs, then you've earned yourself a solid black belt in Data Farming - understanding 🥋 + Hey there champ! If you can explain the emission curves and + teach how to calculate APYs, then you've mastered Data Farming --- -# DF Emissions & APYs (Black Belt) +# Emissions & APYs

[Figure: Like a true master of The Way of Data Farming.]
@@ -51,9 +50,6 @@ The plot below shows estimated APY over time. Green includes both passive and ac APYs are an estimate because APY depends on OCEAN locked. OCEAN locked for future weeks is not known precisely; it must be estimated. The yellow line is the model for OCEAN locked. We modeled OCEAN locked by observing linear growth from week 5 (when OCEAN locking was introduced) to week 28 (now): OCEAN locked grew from 7.89M OCEAN to 34.98M OCEAN respectively, or 1.177M more OCEAN locked per week. -\ - -

[Figure: Green: estimated APYs (passive + active). Black: estimated APYs (just passive). Yellow: estimated staking]
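The linear model above is easy to reproduce. Here is a short sketch that uses only the two data points quoted in the text (7.89M OCEAN locked in week 5, 34.98M in week 28); the projections it prints are estimates, just like the APY figures themselves.

```python
# Data points quoted above: total OCEAN locked at week 5 and week 28.
week_a, locked_a = 5, 7.89e6
week_b, locked_b = 28, 34.98e6

growth_per_week = (locked_b - locked_a) / (week_b - week_a)
print(f"Growth: ~{growth_per_week / 1e6:.3f}M OCEAN per week")  # ~1.178M, in line with the ~1.177M above

def projected_locked(week: int) -> float:
    """Linear projection of total OCEAN locked at a given Data Farming week."""
    return locked_b + growth_per_week * (week - week_b)

for week in (30, 40, 52):
    print(f"Week {week}: ~{projected_locked(week) / 1e6:.1f}M OCEAN locked (estimate)")
```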
All the plots are calculated from [this Google Sheet](https://docs.google.com/spreadsheets/d/1F4o7PbV45yW1aPWOJ2rwZEKkgJXbIk5Yq7tj8749drc/edit#gid=1051477754). diff --git a/rewards/df-intro.md b/rewards/df-intro.md index 924a0c3f..cc370d48 100644 --- a/rewards/df-intro.md +++ b/rewards/df-intro.md @@ -2,7 +2,7 @@ description: Learn the fundamentals of The Way of Data Farming 🧑‍🏫 --- -# Data Farming 101 (White Belt) +# Data Farming 101

[Figure: Meet your sensei.]
@@ -18,6 +18,14 @@ Every week OCEAN rewards are paid out to Data Farmers through two different rewa These two reward streams produce different variable APYs. +#### Splitting the Pie + +Each Data Farming weekly round has a pool of OCEAN rewards, where 50% of the pool is paid out in the form of passive rewards & 50% in the form of active rewards. + +| Passive Rewards | Active Rewards | +| --------------- | -------------- | +| 50% | 50% | + #### What are Passive Rewards? Passive rewards are the OCEAN rewards paid to Data Farmers just for locking their OCEAN tokens. @@ -42,7 +50,7 @@ Active Rewards are governed and defined by the [Reward Function](df-max-out-yiel [To start getting Active Rewards, go here.](../user-guides/how-to-data-farm.md) -#### Splitting the Pie +#### Estimating APY Each Data Farming weekly round has a pool of OCEAN rewards and [can be viewed here](https://df.oceandao.org/rewards) diff --git a/rewards/df-max-out-yield.md b/rewards/df-max-out-yield.md index 0c42eb76..ed7ad0fe 100644 --- a/rewards/df-max-out-yield.md +++ b/rewards/df-max-out-yield.md @@ -1,7 +1,7 @@ --- description: >- - If you've gotten this far, then you're half way to getting a black belt in - Ocean Protocol's Data Farming dApp! 🥋 + If you've gotten this far, then you're half way to becoming a pro in + Ocean Protocol's Data Farming dApp! --- # Max Out Volume DF (Purple Belt) @@ -47,6 +47,16 @@ The Reward Function (RF) governs how active rewards are allocated to Data Farmer For mathematicians and coders, you can find this code inside [calcrewards.py](https://github.com/oceanprotocol/df-py/blob/main/df_py/volume/calc_rewards.py) in the Ocean Protocol [df-py repo](https://github.com/oceanprotocol/df-py/)! +### What are Publisher Rewards? + +

[Figure: Publishing makes you *more* OCEAN rewards]
+ +Data Farming strongly incentivizes publishing assets in the Ocean ecosystem by giving double the active rewards to Data Farmers that allocate to their own published assets. + +How is it calculated? _All the veOCEAN a Data Farmer has allocated to an asset they’ve published is **doubled for the rewards calculation.**_ + +You can read more about the implementation [in this blog post](https://blog.oceanprotocol.com/data-farming-publisher-rewards-f2639525e508). + ### What are Ranked Rewards? In Data Farming Round 23 Ranked Rewards were introduced to smooth out the reward distribution by using a logarithmic function. @@ -70,6 +80,7 @@ The asset may be of any type — dataset, an algorithm for Compute-to-Data, or a To qualify for DF, an asset must also: * Have been created by Ocean Smart contracts [deployed](https://github.com/oceanprotocol/contracts/blob/v4main/addresses/address.json) by OPF to [production networks](../discover/networks/README.md) +* The asset must be listed on Ocean Market * Can’t be in [purgatory](https://github.com/oceanprotocol/list-purgatory/blob/main/policies/README.md) ### A Brief History of Data Farming @@ -86,7 +97,7 @@ Data Farming has evolved over time and will continue to do so as the Emission Cu Up to 100K OCEAN rewards were budgeted per week. Counting started Thu Oct 27, 2022, and ended on March 15, 2023. It ran for 20 weeks. The aim was to test the effect of larger incentives, and support ecosystem participation, while continually refining the underlying technology. **DF Main - Rounds 29-1000+**\ -We are now in DF Main which immediately followed the release of DF Beta on Thu Mar 16, 2023. Rewards begin at 150k per week and go to 1.1M OCEAN per week. DF Main emits 503.4M OCEAN worth of rewards and lasts for decades. +We are now in DF Main which immediately followed the release of DF Beta on Thu Mar 16, 2023. Rewards begin at 150k per week and goes up to 1.1M OCEAN per week. DF Main emits 503.4M OCEAN worth of rewards and lasts for decades. The amount of OCEAN released is determined by the emission schedule as defined by the [Emission Curve](df-emissions-apys.md#emissions--first-5-years), and perhaps more easily understood in the Reward Schedule below. diff --git a/rewards/veocean.md b/rewards/veocean.md index 3a6b7ab1..56bdeedc 100644 --- a/rewards/veocean.md +++ b/rewards/veocean.md @@ -1,10 +1,9 @@ --- description: >- - Let's discuss the "ve" in veOCEAN for our last jutsu before earning a black - belt in Data Farming knowledge! + Learn the basic moves to start kicking a** at Data Farming --- -# DF "ve" in veOCEAN (Brown Belt) +# Passive Farming veOCEAN

[Figure: Data Farming is getting effortless.]
@@ -14,7 +13,7 @@ description: >- You see, when you acquire veOCEAN via locking your OCEAN tokens in our Data Farming dApp, the intended use is to **vote on your favorite assets** in the Ocean ecosystem! -When you vote on assets that sell, then **you get a portion of the sales**! +When you allocate to assets that sell, then **you get a portion of the sales**! You can do this all from the Data Farming dApp [Farms page](https://df.oceandao.org/farms). @@ -33,19 +32,19 @@ veOCEAN allows you to engage with different Ocean Protocol mechanisms and benefi #### Passive Rewards from Data Farming -veOCEAN holders get weekly Data Farming rewards with a small carveout for any Ocean Protcol Data Challenges that run through Data Farming operations. +veOCEAN holders get weekly Data Farming rewards with a small carveout for any Ocean Protocol Data Challenges that run through Data Farming operations. #### Active Rewards from Community Fees veOCEAN holders can generate yield completely passively if they wish, though they are incentivized with larger real yield if they **actively participate** in farming yield from assets. -Active rewards follow the usual Data Farming formula: $ of sales of the asset \* allocation to that asset.\*\* But also every transaction in the Ocean ecosystem and Ocean Protocol backend infrastructure generates **"community swap" fees that go into Active Rewards**. 50% of the community fees goes to veOCEAN holders, and 50% goes to the Ocean Protocol Foundation's community-oriented traction programs. +Active rewards follow the usual Data Farming formula: $ of sales of the asset \* allocation to that asset. -\*\*There is no liquidity locked inside a datatoken pool, and this allocation is safe: you can’t lose your OCEAN as it is merely locked. +There is no liquidity locked inside a datatoken pool, and this allocation is safe: you can’t lose your OCEAN as it is merely locked. ### veOCEAN Time Locking -Users can lock their OCEAN for different lengths of time to gain more veOCEAN **voting power**. The Data Farming dApp is designed to lock OCEAN for **a minimum of 2 weeks and a maximum of four years** (for max rewards). The longer you lock your OCEAN, the more veOCEAN + OCEAN rewards you get! +Users can lock their OCEAN for different lengths of time to gain more veOCEAN **voting power**. The Data Farming dApp is designed to lock OCEAN for **a minimum of 2 weeks and a maximum of four years** (for max rewards). The longer you lock your OCEAN, the more veOCEAN + OCEAN rewards you get! On the dApp's [veOCEAN page](https://df.oceandao.org/veocean), the "Lock Multiplier" represents the percentage amount of veOCEAN tokens received per OCEAN token locked. @@ -62,6 +61,8 @@ After choosing your lock period and locking up your OCEAN into the vault, you wi veOCEAN is non-transferable. You can’t sell it or send it to other addresses. +_To help you more easily understand this, we have created [a couple of examples](../user-guides/how-to-df-estimate-apy.md) so you can more easily visualize the impact of your decisions on your overall yields._ + ### Linear Decay Your veOCEAN balance will slowly start declining as soon as you receive it. @@ -81,10 +82,6 @@ If you lock 1.0 OCEAN for 4 years, you get 1.0 veOCEAN at the start. At the end of your 4 years, your OCEAN is unlocked. 
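To make the decay concrete, here is a small sketch of that rule. It assumes the veCRV-style model described in these docs — the starting balance scales with how long you lock (the Lock Multiplier) and then declines linearly to zero at the Lock End Date — so treat the exact numbers shown in the dApp as the source of truth.

```python
MAX_LOCK_WEEKS = 4 * 52  # the 4-year maximum lock

def veocean_balance(ocean_locked: float, lock_weeks: int, weeks_elapsed: int) -> float:
    """Estimated veOCEAN balance under veCRV-style linear decay (illustrative only)."""
    initial = ocean_locked * lock_weeks / MAX_LOCK_WEEKS  # longer locks start with more veOCEAN
    weeks_remaining = max(lock_weeks - weeks_elapsed, 0)
    return initial * weeks_remaining / lock_weeks         # linear decay to zero at the Lock End Date

# The 1.0 OCEAN / 4-year example from above: 1.0 veOCEAN at the start,
# 0.5 veOCEAN after two years, 0 at unlock.
for weeks in (0, 2 * 52, 4 * 52):
    print(weeks, round(veocean_balance(1.0, 4 * 52, weeks), 3))
```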
-#### Linear Decay - -**Your balance of veOCEAN may be less than the amount when you first locked your tokens because your veOCEAN balance decreases linearly over time until the Lock End Date when you can withdraw your OCEAN tokens.** This is because rewards are designed to be paid out weekly in a decreasing amount until you unlock your OCEAN tokens entirely. The veOCEAN code is a fork of Curve's battle tested [veCRV](https://curve.readthedocs.io/dao-vecrv.html) token code. - ### Replenishing your veOCEAN You can choose to update your lock and replenish your veOCEAN balance at any time. @@ -93,9 +90,9 @@ To maximize rewards, participants would need to update their 4-year lock every w ### veOCEAN Earnings -All earnings for veOCEAN holders are claimable in Ethereum mainnet. (Data assets for DFing may published in any network where Ocean’s deployed in production: ETH Mainnet, Polygon, etc.) +All earnings for veOCEAN holders are claimable in the Ethereum mainnet. (Data assets for DFing may be published in any network where Ocean’s deployed in production: ETH Mainnet, Polygon, etc.) -Data Farming rounds occur weekly; in line with this, there’s a new ve distribution “epoch” every week. This affects when you can first claim rewards. Specifically, if you lock OCEAN on day x, you’ll be able to claim rewards on the first ve epoch that begins after day x+7. Put another way, from the time you lock OCEAN, you must wait at least a week, and up to two weeks, to be able to claim rewards. (This behavior is inherited from veCRV. Here’s the code. ) +Data Farming rounds occur weekly; in line with this, there’s a new [`ve`](https://github.com/oceanprotocol/df-py/tree/main/contracts/ve) distribution “epoch” every week. This affects when you can first claim rewards. Specifically, if you lock OCEAN on day x, you’ll be able to claim rewards on the first ve epoch that begins after day x+7. Put another way, from the time you lock OCEAN, you must wait at least a week, and up to two weeks, to be able to claim rewards. (This behavior is inherited from veCRV. Here’s the [code](https://github.com/oceanprotocol/df-py/tree/main/contracts/ve) ) ### DYOR! @@ -111,8 +108,6 @@ After the Lock End Date, then you can withdraw your principal OCEAN tokens on th The image below illustrates the flow of value. On the left, at time 0, the staker locks their OCEAN into the veOCEAN contract, and receives veOCEAN. In the middle, the staker receives OCEAN rewards every time there’s revenue to the Ocean Protocol Community (top), and also as part of Data Farming rewards (bottom). On the right, when the lock expires (e.g. 4 years) then the staker is able to move their OCEAN around again. - -

[Figure: Flow of Value]
The veOCEAN design is in accordance with the Web3 Sustainability Loop, which Ocean uses as its system-level design. @@ -123,7 +118,7 @@ The veOCEAN code was forked from the veCRV code. veCRV parameters will be the st The "veTokenomics" model of veOCEAN (vote-escrowed token economics) is inspired by Curve Finance's [veCRV](https://curve.readthedocs.io/dao-fees.html) token code. We took this inspiration to enable our users to participate in on-chain governance and earn rewards within the Ocean Protocol ecosystem. -[Here is Ocean Protocol's open-source code](https://github.com/oceanprotocol/contracts/blob/main/contracts/ve/veFeeDistributor.vy#L240-L256) for veOCEAN, and if you're a developer, then you'll notice the strong similarities to [veCRV's](https://curve.readthedocs.io/dao-fees.html) code. +[Here is Ocean Protocol's open-source code](https://github.com/oceanprotocol/contracts/blob/main/contracts/ve/veFeeDistributor.vy#L240-L256) for veOCEAN, and if you're a developer, then you'll notice the strong similarities to [veCRV's](https://curve.readthedocs.io/dao-fees.html) code. ### veOCEAN's Smart Contracts Security @@ -132,4 +127,3 @@ The "veTokenomics" model of veOCEAN (vote-escrowed token economics) is inspired We have built [a new contract](https://github.com/oceanprotocol/contracts/blob/main/contracts/ve/veAllocate.sol) for users to point their veOCEAN towards given data assets (“allocate veOCEAN”). These new contracts do not control the veOCEAN core contracts at all. In the event of a breach, the only funds at risk would be the rewards distributed for a single week; and we would be able to redirect future funds to a different contract. We have an [ongoing bug bounty via Immunefi](https://immunefi.com/bounty/oceanprotocol/) for Ocean software, including veOCEAN and DF components. If you identify an issue, please report it there and get rewarded. - diff --git a/user-guides/README.md b/user-guides/README.md index ba16c189..faae13cc 100644 --- a/user-guides/README.md +++ b/user-guides/README.md @@ -34,18 +34,27 @@ Buy, mint, and sell NFTs using the Ocean Market following the guides below. [using-ocean-market.md](using-ocean-market.md) {% endcontent-ref %} -### Make yield from dataset and algorithm NFTs on-chain ⛓️ +### Farm data like a pro 😎🥕 -Farm data like a pro. 😎🥕 +Earn rewards by obtaining veOCEAN, farm yield by curating datasets, and optimize your APY by publishing Data & Algorithm NFTs on-chain. ⛓️ {% content-ref url="get-started-df.md" %} [get-started-df.md](get-started-df.md) {% endcontent-ref %} +{% content-ref url="get-started-df.md" %} +[get-started-df.md](get-started-df.md) +{% endcontent-ref %} + +{% content-ref url="claim-ocean-rewards.md" %} +[claim-ocean-rewards.md](claim-ocean-rewards.md) +{% endcontent-ref %} + {% content-ref url="how-to-data-farm.md" %} [how-to-data-farm.md](how-to-data-farm.md) {% endcontent-ref %} +<<<<<<< HEAD {% content-ref url="how-to-data-farm-challengedf.md" %} [how-to-data-farm-challengedf.md](how-to-data-farm-challengedf.md) {% endcontent-ref %} @@ -60,6 +69,10 @@ Farm data like a pro. 
😎🥕 {% content-ref url="claim-ocean-rewards.md" %} [claim-ocean-rewards.md](claim-ocean-rewards.md) +======= +{% content-ref url="how-to-df-estimate-apy.md" %} +[how-to-df-estimate-apy.md](how-to-df-estimate-apy.md) +>>>>>>> main {% endcontent-ref %} ### Antique Stuff 🏺 diff --git a/user-guides/asset-hosting/README.md b/user-guides/asset-hosting/README.md index 67501972..1d907446 100644 --- a/user-guides/asset-hosting/README.md +++ b/user-guides/asset-hosting/README.md @@ -4,7 +4,7 @@ description: How to host your data and algorithm NFT assets like a champ 🏆 # Host Assets -The most important thing to remember is that wherever you host your asset... it needs to be **reachable & downloadable**. It cannot live behind a private firewall such as a private Github repo. You need to **use a proper hosting service!** +The most important thing to remember is that wherever you host your asset... it needs to be **reachable & downloadable**. It cannot live behind a private firewall such as a private Github repo. You need to **use a proper hosting service!** **The URL to your asset is encrypted in the publishing process!** @@ -14,7 +14,7 @@ The most important thing to remember is that wherever you host your asset... it In this section, we'll walk you through three options to store your assets: Arweave (decentralized storage), AWS (centralized storage), and Azure (centralized storage). Let's goooooo! -Read on, anon, if you are interested in the security details! +Read on, if you are interested in the security details! ### Security Considerations diff --git a/user-guides/asset-hosting/azure-cloud.md b/user-guides/asset-hosting/azure-cloud.md index ec8ff407..6f19bc15 100644 --- a/user-guides/asset-hosting/azure-cloud.md +++ b/user-guides/asset-hosting/azure-cloud.md @@ -18,15 +18,15 @@ Create an account on [Azure](https://azure.microsoft.com/en-us/). Users might al Go to the Azure portal: https://portal.azure.com/#home and select `Storage accounts` as shown below. -![Select storage accounts](<../../.gitbook/assets/hosting/azure1 (1).png>) +![Select storage accounts](../../.gitbook/assets/hosting/azure1.png) **Create a new storage account** -![Create a storage account](<../../.gitbook/assets/hosting/azure2 (1).png>) +![Create a storage account](../../.gitbook/assets/hosting/azure2.png) **Fill in the details** -![Add details](<../../.gitbook/assets/hosting/azure3 (1).png>) +![Add details](../../.gitbook/assets/hosting/azure3.png) **Storage account created** @@ -38,7 +38,7 @@ Go to the Azure portal: https://portal.azure.com/#home and select `Storage accou **Step 3 - Upload a file** -![Upload a file](<../../.gitbook/assets/hosting/azure6 (1).png>) +![Upload a file](../../.gitbook/assets/hosting/azure6.png) **Step 4 - Share the file** diff --git a/user-guides/asset-hosting/github.md b/user-guides/asset-hosting/github.md index 7ddaf964..704fb507 100644 --- a/user-guides/asset-hosting/github.md +++ b/user-guides/asset-hosting/github.md @@ -14,11 +14,11 @@ Create an account on [Github](https://github.com/). Users might also be asked to **Step 1 - Create a new repository on GitHub or navigate to an existing repository where you want to host your files.** -

[Figure: Create new repository]
Fill in the repository details. **Make sure your Repo is public.** -

[Figure: Make the repository public]
### Host Your File @@ -26,41 +26,41 @@ Fill in the repository details. **Make sure your Repo is public.** Go to your repo in Github and above the list of files, select the Add file dropdown menu and click Upload files. Alternatively, you can use version control to push your file to the repo. -

[Figure: Upload file on Github]
To select the files you want to upload, drag and drop the file or folder, or click 'choose your files'. -

[Figure: Drag and drop new files on your GitHub repo]
In the "Commit message" field, type a short, meaningful commit message that describes the change you made. -

[Figure: Commit changes]
Below the commit message field, decide whether to add your commit to the current branch or to a new branch. If your current branch is the default branch, then you should choose to create a new branch for your commit and then create a pull request. After you make your commit (and merge your pull request, if applicable), then click on the file. -

[figure: Upload successful]
**Step 3 - Get the RAW version of your file** -To use your file on the Market **you need to use the raw url of the asset**. Also, make sure your Repo is publicly accessible to allow the market to use that file. +To use your file on the Market **you need to use the raw url of the asset**. Also, make sure your Repo is publicly accessible to allow the market to use that file. Open the File and click on the "Raw" button on the right side of the page. -

[figure: Click the Raw button]
Copy the link in your browser's URL - it should begin with "https://raw.githubusercontent.com/...." like in the image below. -

[figure: Grab the RAW github URL from your browser's URL bar]
[figure: Copy paste the raw url]
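If you ever need to derive the raw link programmatically (for many files, say), the mapping from a regular github.com file URL to its raw.githubusercontent.com form is mechanical. A small Python sketch follows; the repository and file names in it are made-up examples, not ones from this guide:

```python
# Turn a regular GitHub file URL into its raw.githubusercontent.com equivalent.
# The example repo/file is hypothetical -- use your own public repository.

def to_raw_github_url(file_url: str) -> str:
    """https://github.com/<user>/<repo>/blob/<branch>/<path>
    -> https://raw.githubusercontent.com/<user>/<repo>/<branch>/<path>"""
    return file_url.replace(
        "https://github.com/", "https://raw.githubusercontent.com/", 1
    ).replace("/blob/", "/", 1)

url = "https://github.com/example-user/example-repo/blob/main/data/dataset.csv"
print(to_raw_github_url(url))
# https://raw.githubusercontent.com/example-user/example-repo/main/data/dataset.csv
```

Whichever way you produce it, the link you paste into the Market must start with `https://raw.githubusercontent.com/` and the repository must stay public.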
**Step 4 - Publish the asset using the Raw link** Now, copy and paste the Raw Github URL into the File field of the Access page in the Ocean Market. -

[figure: Upload on the Ocean Market]
Et voilà! You have now successfully hosted your asset on Github and properly linked it on the Ocean Market. diff --git a/user-guides/asset-hosting/google-storage.md b/user-guides/asset-hosting/google-storage.md index e6fe5d5e..51ba9d58 100644 --- a/user-guides/asset-hosting/google-storage.md +++ b/user-guides/asset-hosting/google-storage.md @@ -18,43 +18,43 @@ Create an account on [Google Cloud](https://console.cloud.google.com/). Users mi In the Google Cloud console, go to the Cloud Storage Buckets page -
**Create a new bucket** -
**Fill in the details** -
**Allow access to your recently created Bucket** -
**Step 2 - Upload a file** -
**Step 3 - Change your file's access (optional)** **If your bucket's access policy is restricted, on the menu on the right click on Edit access (skip this step if your bucket is publicly accessible)** -
**Step 4 - Share the file** **Open the file and copy the generated link** -
**Step 5 - Publish the asset using the generated link** Now, copy and paste the link into the Publish page in the Ocean Marketplace. -
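If you prefer to script the upload and sharing steps instead of clicking through the console, a rough sketch with the official `google-cloud-storage` Python client is shown below. The bucket and file names are placeholders, and `make_public()` assumes your bucket uses per-object ACLs rather than uniform bucket-level access:

```python
# Upload a file to a Cloud Storage bucket and grab a public link for publishing.
# Bucket/object names are placeholders; requires `pip install google-cloud-storage`
# and Google Cloud credentials configured locally.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-ocean-assets")        # placeholder bucket name
blob = bucket.blob("datasets/my-dataset.csv")    # placeholder object path

blob.upload_from_filename("my-dataset.csv")      # Step 2: upload the file
blob.make_public()                               # Step 3: allow anyone to read it
print(blob.public_url)                           # Step 4: the generated link to paste in Step 5
# e.g. https://storage.googleapis.com/my-ocean-assets/datasets/my-dataset.csv
```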
diff --git a/user-guides/compute-to-data/make-a-boss-c2d-algorithm.md b/user-guides/compute-to-data/make-a-boss-c2d-algorithm.md index 1b0068e4..e83d80ca 100644 --- a/user-guides/compute-to-data/make-a-boss-c2d-algorithm.md +++ b/user-guides/compute-to-data/make-a-boss-c2d-algorithm.md @@ -8,7 +8,7 @@ description: >-
-The beginning of any great algorithm for Compute-to-Data starts by referencing the dataset asset correctly on the Docker container. Read on, anon. +The beginning of any algorithm for Compute-to-Data starts by loading the dataset correctly. Read on, anon 👨🏻‍💻 ### Open the local dataset file diff --git a/user-guides/compute-to-data/publish-a-c2d-data-nft.md b/user-guides/compute-to-data/publish-a-c2d-data-nft.md index f8334008..fc0c0843 100644 --- a/user-guides/compute-to-data/publish-a-c2d-data-nft.md +++ b/user-guides/compute-to-data/publish-a-c2d-data-nft.md @@ -78,7 +78,7 @@ description: How to publish a data NFT with C2D configurations #### Congratulations! You have fully finished the C2D flow. Check your work by verifying that your algorithm appears on the data NFT's page, like in the following example: -

[figure: Your algorithm should appear now on the data NFT's page!]
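For the algorithm guide above (make-a-boss-c2d-algorithm.md), "opening the local dataset file" means reading the dataset that Compute-to-Data mounts inside the algorithm's container. A minimal Python sketch, assuming the `/data/inputs` and `/data/outputs` layout commonly used by Ocean's C2D environment (double-check the paths against the Compute-to-Data developer docs for your deployment):

```python
# Minimal C2D-style algorithm skeleton: read the mounted dataset and write the
# result to the outputs folder. The /data/inputs/<did>/0 and /data/outputs paths
# follow the common Ocean C2D convention -- verify them for your environment.
import csv
import glob
import json
import os

input_files = glob.glob("/data/inputs/*/0")  # each dataset is mounted as .../<did>/0
assert input_files, "No dataset mounted - is the compute job configured correctly?"

with open(input_files[0], newline="") as f:
    rows = list(csv.reader(f))

os.makedirs("/data/outputs", exist_ok=True)  # files here are returned to the job's buyer
with open("/data/outputs/result.json", "w") as f:
    json.dump({"row_count": len(rows)}, f)
```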
If you would like to run the compute job, then simply click the radio button to the left of the algorithm's name and click Buy Compute Job. diff --git a/user-guides/get-started-df.md b/user-guides/get-started-df.md index c2e2a1fb..8b992a60 100644 --- a/user-guides/get-started-df.md +++ b/user-guides/get-started-df.md @@ -40,7 +40,8 @@ Watch and learn, friend - Click on the purple circles in our interactive demo to walk through the steps for locking your OCEAN tokens for veOCEAN tokens. -{% @arcade/embed flowId="FUSkygksSRsJHwle1zFs" url="https://app.arcade.software/share/FUSkygksSRsJHwle1zFs" %} +{% embed url="https://app.arcade.software/share/FUSkygksSRsJHwle1zFs" fullWidth="false" %} +{% endembed %} In this step you will: diff --git a/user-guides/how-to-df-estimate-apy.md b/user-guides/how-to-df-estimate-apy.md new file mode 100644 index 00000000..9c6fddeb --- /dev/null +++ b/user-guides/how-to-df-estimate-apy.md @@ -0,0 +1,42 @@ +--- +description: >- + Learn how to use basic math and a simple spreadsheet to estimate your + Passive Rewards APY and get deeper into Active Rewards APY. +--- + +

[figure: K.I.S.S.]
+ +# veOCEAN and your APY + +Before we start, let's do a quick recap. + +Rewards are earned by users who hold and use their veOCEAN to help the protocol improve and grow. Here are a few lessons to help you improve your APY. +1. To improve your yield, decide carefully how long you will lock for. The best way to do this is to learn how [Time Locking](/rewards/veocean.md#veocean-time-locking) and [Linear Decay](/rewards/veocean.md#linear-decay) work. +2. APYs are initially calculated by dividing the amount of OCEAN you have received in rewards by the amount of OCEAN you have locked up. +3. As a rule: _Wherever APYs are shown in the app (df.oceandao.org), they are calculated assuming an initial 4-year lock-up period with a weekly schedule of compounding rewards into an updated 4-year lock. This estimate holds provided the current number of users, reward emissions, and other reward parameters stay constant, and it excludes all tx fees._ + +Now that we have that out of the way, let's work through the examples and keep things as simple as possible. + +### Estimating Passive APY + +[We have created a simple spreadsheet](https://docs.google.com/spreadsheets/d/1zzuW5pBbX6j6hZL_XtJDtSR2-rDHa_LGOEwgoQ4D8lk/edit?usp=sharing) that lets you estimate your Passive APY. + +We have provided two sheets as examples of locking up 10,000 OCEAN for: +1. A 4-year investment period +2. A 6-month investment period + +These are simplified, naive investment strategies meant to give you, the reader, a starting point to build upon. Feel free to make a copy of the spreadsheet and customize it to your own plan and needs. + +### Estimating Active APY + +Active Rewards are more complicated and depend on many factors that are currently hard to predict accurately, so a practical estimate is difficult; for that reason we don't offer a tool to estimate Active APY right now. + +You can easily expand the spreadsheet above with basic, naive calculations for Active Rewards, such as assuming a constant percentage yield per week. + +That said, we do provide a thorough dashboard with historical and ongoing summaries of APY, Data Consume Volume, and veOCEAN allocations per round. + +

[figure: Curate like a Pro.]
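To make the "weekly compounding into a 4-year lock" assumption above concrete, here is a small sketch of the arithmetic behind a naive Passive APY estimate. The weekly reward rate used below is a made-up placeholder, not an official figure; take a real value from the spreadsheet or the Data Farming dashboard:

```python
# Naive passive-rewards estimate under the stated assumptions: rewards arrive
# weekly, are immediately re-locked (compounded), and the weekly rate stays
# constant. weekly_rate is a hypothetical placeholder, NOT an official number.

def estimate_passive_apy(locked_ocean: float, weekly_rate: float, weeks: int = 52) -> float:
    """Effective APY (%) if each week's rewards are compounded back into the lock."""
    balance = locked_ocean
    for _ in range(weeks):
        balance += balance * weekly_rate  # claim this week's rewards and re-lock them
    return (balance / locked_ocean - 1) * 100

# Example: 10,000 OCEAN locked, assuming a 0.1% weekly reward rate (placeholder)
print(f"{estimate_passive_apy(10_000, 0.001):.2f}% APY")  # ~5.33% APY
```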
+ +You can also [learn how rewards are calculated here](../rewards/df-max-out-yield.md#how-rewards-are-calculated) to understand more about what's happening behind the scenes of each Data Farming round. + +Finally, you can [review the implementation inside df-web](https://github.com/oceanprotocol/df-web/blob/main/src/utils/rewards.js) to understand how APYs are calculated at the frontend/UI level. \ No newline at end of file diff --git a/user-guides/join-a-data-challenge.md b/user-guides/join-a-data-challenge.md index 9fd9b1d3..5d62f807 100644 --- a/user-guides/join-a-data-challenge.md +++ b/user-guides/join-a-data-challenge.md @@ -8,7 +8,7 @@ description: >-

[figure: Bring on the data challenges.]
-Hone your skills, work on real business problems, and earn sweet dosh along the way. +Hone your skills, work on real business problems, and earn sweet dosh along the way. ### What is an Ocean Protocol data challenge? diff --git a/user-guides/publish-data-nfts.md b/user-guides/publish-data-nfts.md index 232db700..d7155e85 100644 --- a/user-guides/publish-data-nfts.md +++ b/user-guides/publish-data-nfts.md @@ -28,13 +28,13 @@ Don't enjoy reading? Watch our video tutorial! 2. Connect your wallet. 3. Select the network where you would like to publish your NFT (ex. Ethereum, Polygon, etc). -

[figure: Connect your wallet]
In this tutorial, we will be using the Polygon Mumbai test network. 4\. Click on the Publish link on the top left corner of the page. -

[figure: Click the publish link]
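This walkthrough publishes on the Polygon Mumbai test network, as noted above. If you want to double-check which network an RPC endpoint points at before publishing, a quick sketch with `web3.py` follows; the RPC URL is a placeholder, so substitute your own provider:

```python
# Sanity-check the network before publishing. Well-known chain IDs:
# Ethereum mainnet = 1, Polygon = 137, Polygon Mumbai testnet = 80001.
# The RPC URL is a placeholder -- use your own endpoint.
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://your-mumbai-rpc-endpoint.example"))
chain_id = w3.eth.chain_id
print("Connected to chain id:", chain_id)
assert chain_id == 80001, "This endpoint is not Polygon Mumbai"
```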
#### Step 1 - Metadata 🤓 @@ -58,7 +58,7 @@ _Mandatory fields are marked with \*_ Tags help the asset to be searchable. If not provided, the list of tags is empty by default. -

[figure: Enter the metadata]
Click Continue. @@ -84,7 +84,7 @@ _Mandatory fields are marked with \*_ This field specifies how long the buyer can access the dataset after the dataset is purchased. This field is editable after the asset publication. -

[figure: Enter the access information]
#### Step 3 - Pricing 🫰 @@ -101,11 +101,11 @@ With the _free pricing_ schema, the publisher provides an asset that is free to For a deep dive into the fee structure, please refer to this [document](../developers/contracts/fees.md). -

[figure: Set the price]
#### Step 4 - Preview 🔍 -

[figure: Preview your work]
If you are happy with the Preview of your NFT, then click Continue! diff --git a/user-guides/sponsor-a-data-challenge.md b/user-guides/sponsor-a-data-challenge.md index b09ea989..677b2aa8 100644 --- a/user-guides/sponsor-a-data-challenge.md +++ b/user-guides/sponsor-a-data-challenge.md @@ -10,7 +10,7 @@ Hosting a data challenge is a fun way to engage data scientists and machine lear ### How to sponsor an Ocean Protocol data challenge? -1. Establish the business problem you want to solve. The first step in building a data solution is understanding what you want to solve. For example, you may want to be able to predict the drought risk in an area to help price parametric insurance, or predict the price of ETH to optimize Uniswap LPing. +1. Establish the business problem you want to solve. The first step in building a data solution is understanding what you want to solve. For example, you may want to be able to predict the drought risk in an area to help price parametric insurance, or predict the price of ETH to optimize Uniswap LPing. 2. Curate the dataset(s) that participants will use for the challenge. The key to hosting a good data challenge is to provide an exciting and through dataset that participants can use to build their solutions. Do your research to understand what data is available, whether it be free from an API, available for download, require any transformations, etc. For the first challenge, it is alright if the created dataset is a static file. However, it is best to ensure there is a path to making the data available from a dynamic endpoint so that entires can eventually be applied to current, real-world use cases. 3. Decide how the judging process will occur. This includes how long to make review period, how to score submissions, and how to decide any prizes will be divided among participants 4. Work with Ocean Protocol to gather participants for your data challenge. Creating blog posts and hosting Twitter Spaces is a good way to spread the word about your data challenge. diff --git a/user-guides/using-ocean-market.md b/user-guides/using-ocean-market.md index 4cc227de..57899247 100644 --- a/user-guides/using-ocean-market.md +++ b/user-guides/using-ocean-market.md @@ -26,5 +26,5 @@ The Ocean Market is a place for buyers + sellers of top-notch data and algorithm **If you are new to web3** and blockchain technologies then we suggest you first get familiar with some Web3 basics: * [Wallet Basics](../discover/wallets/README.md) 👛 -* [Set Up MetaMask](../discover/wallets/metamask-setup.md) [Wallet ](../discover/wallets/metamask-setup.md)🦊 +* [Set Up MetaMask](../discover/wallets/metamask-setup.md) 🦊 * [Manage Your OCEAN Tokens](../discover/wallets-and-ocean-tokens.md) 🪙