Final cleanup
A lot of people miss the mark on tokenizing data that actually _sells_.

To figure out which market segments are paying for data, it may help you to **go to the Ocean Market and sort by Sales.**

But even then, it's not enough to just publish useful data on Ocean. **You need to market your data assets** to close sales.

Have you tried all these things and are still having trouble making money? Never fear! You can enter one of our [data challenges](https://oceanprotocol.com/challenges) to make sweet OCEAN rewards and build your data science skills.
# Aquarius

### What is Aquarius?

Aquarius is a tool that tracks and caches the metadata from each chain where the Ocean Protocol smart contracts are deployed. It operates off-chain, running an Elasticsearch database. This makes it easy to query the metadata generated on-chain.

Aquarius has its own interface (API) that allows you to easily query this metadata.

### How to run Aquarius?

We recommend checking the README in the [Aquarius GitHub repository](https://github.com/oceanprotocol/aquarius) for the steps to run Aquarius. If you see any errors in the instructions, please open an issue within the GitHub repository.

### What technology does Aquarius use?
### **Asset Names**

Used to retrieve the names of a group of assets using a list of unique identifiers known as Decentralized Identifiers (DIDs).

### Query Assets

Used to run a custom search query on the assets using Elasticsearch's native query syntax. We recommend reading the [Elasticsearch documentation](https://www.elastic.co/guide/index.html) to understand their syntax.
* **Endpoint**: `POST /api/aquarius/assets/query`
* **Purpose**: This endpoint is used to execute a native Elasticsearch (ES) query against the stored assets. This allows for highly customizable searches and can be used to filter and sort assets based on complex criteria. The body of the request should contain a valid JSON object that defines the ES query.
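A request against this endpoint can be sketched as follows. The base URL, the `chainId` filter, and the `metadata.name` field are illustrative assumptions, not part of the endpoint specification above:

```javascript
// Build a native Elasticsearch query and POST it to the Aquarius query endpoint.
// Field names and values below are hypothetical examples.
const query = {
  query: {
    bool: {
      filter: [{ term: { chainId: 1 } }],
      must: [{ match: { 'metadata.name': 'weather' } }]
    }
  },
  size: 10
}

// POST the query body to /api/aquarius/assets/query and return the parsed JSON.
async function queryAssets(baseUrl) {
  const res = await fetch(`${baseUrl}/api/aquarius/assets/query`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(query)
  })
  return res.json()
}
```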
Retrieves a list of chains that are currently supported or recognized by Aquarius.

Here are some typical responses you might receive from the API:

* **200**: This is a successful HTTP response code. It means the server has successfully processed the request and returns a JSON object containing chain IDs as keys and their active status as values.
Example response:
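The chain IDs below are hypothetical placeholders rather than an actual deployment:

```json
{
  "1": true,
  "137": true,
  "80001": false
}
```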
To explore all the available options and gain a deeper understanding of how to utilize the Barge component, you can visit the official GitHub [repository](https://github.com/oceanprotocol/barge#all-options) of Ocean Protocol.
By utilizing the Barge component, developers gain the freedom to conduct experiments, customize, and fine-tune their local development environment. Barge also offers the flexibility to override the Docker image tag associated with specific components: by setting the appropriate environment variable before executing the `start_ocean.sh` command, developers can customize the versions of various components according to their requirements. For instance, developers can modify the `AQUARIUS_VERSION`, `PROVIDER_VERSION`, `CONTRACTS_VERSION`, `RBAC_VERSION`, and `ELASTICSEARCH_VERSION` environment variables to specify the desired Docker image tags for each respective component.
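For example, you could pin specific component versions before launching Barge. The tags below are placeholders, not real releases; check each component's Docker Hub page for tags that actually exist:

```shell
# Hypothetical version pins; substitute tags that actually exist
export AQUARIUS_VERSION=v5.1.2
export PROVIDER_VERSION=v2.0.0
export CONTRACTS_VERSION=v1.1.8
# then launch Barge with the pinned versions:
# ./start_ocean.sh
```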
{% hint style="warning" %}
⚠️ We've got an important heads-up about Barge that we want to share with you. Brace yourself, because **Barge is not for the faint-hearted**! Here's the deal: Barge works great on Linux, but we need to be honest about its limitations on macOS. And, well, it doesn't work at all on Windows. Sorry, Windows users!
{% endhint %}
## Advanced customization
This important step is the last change we will make in this guide. To set the marketplace fees and address, you’ll need to save them as environment variables. You'll also need to set environment variables if you customized services like Aquarius, Provider, or Subgraph.
First, we are going to create a new file called `.env` in the root of your repository.
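As a sketch, the file could look like this. The exact variable names come from the market repository's `.env.example`; treat the names below as assumptions to verify against your checkout:

```
NEXT_PUBLIC_MARKET_FEE_ADDRESS="0x123abc"
NEXT_PUBLIC_CONSUME_MARKET_FIXED_SWAP_FEE="0.01"
```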
At this point, we have made a lot of changes and hopefully, you’re happy with the way that your marketplace is looking. Given that you now have your own awesome photo marketplace, it’s about time we talked about monetizing it. Yup, that’s right - you will earn a [commission](../contracts/fees.md) when people buy and sell photos in your marketplace. In Ocean V4, there are a whole host of new [fees](../contracts/fees.md) and customization options that you can use. In order to receive the fees, you’ll need to set the address where you want to receive them.
When someone sets the pricing for their photos in your marketplace, they are informed that a commission will be sent to the owner of the marketplace. At the moment this fee is set to zero, so you’ll want to increase that.
You need to replace “0x123abc” with your Ethereum address (this is where the fees will be sent).
Using a custom subgraph with the ocean market requires additional steps due to the differences in deployment. Unlike the multi-network deployment of the provider and Aquarius services, each network supported by the ocean market has a separate subgraph deployment. This means that while the provider and Aquarius services can be handled by a single deployment across all networks, the subgraph requires specific handling for each network.
To utilize a custom subgraph, you will need to implement additional logic within the `getOceanConfig` function located in the `src/utils/ocean.ts` file. By modifying this function, you can ensure that the market uses the desired custom subgraph for each selected network. This is particularly relevant if your market supports multiple networks and you do not want to enforce the same subgraph across all of them. If that scenario doesn't apply to you, there is a simpler approach: as in the previous examples, update the `NEXT_PUBLIC_SUBGRAPH_URI` key in the `.env` file to point the market at your preferred subgraph deployment. This alternative works when you don't require different subgraph deployments for each network.
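A minimal sketch of such an override follows. The helper name and the subgraph URLs are hypothetical; only `getOceanConfig` and `src/utils/ocean.ts` come from the repository:

```javascript
// Hypothetical per-network subgraph overrides, keyed by chain ID.
const CUSTOM_SUBGRAPHS = {
  1: 'https://subgraph.example.com/mainnet',
  137: 'https://subgraph.example.com/polygon'
}

// Apply the override inside getOceanConfig before returning the config;
// networks without an entry keep their default subgraph URI.
function withCustomSubgraph(config, chainId) {
  const override = CUSTOM_SUBGRAPHS[chainId]
  return override ? { ...config, subgraphUri: override } : config
}
```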
If this is your first time using surge, you will be prompted to enter an email and password.
<figure><img src="../../.gitbook/assets/market/Screenshot 2023-06-14 at 14.30.59.png" alt=""><figcaption><p>surge interaction</p></figcaption></figure>
We have chosen https://crypto-photos.surge.sh, which is a free option. You can also set a CNAME value in your DNS to make use of your own custom domain.
After a few minutes, your upload will be complete, and you’re ready to share your data marketplace. You can view the version we created in this guide [here](https://crypto-photos.surge.sh/).
**Benefits to the Ocean Community**\
We’re always looking to give back to the Ocean community and collecting fees is an important part of that. As mentioned above, the Ocean Protocol Foundation retains the ability to implement community fees on data consumption. The tokens that we receive will either be burned or invested in the community via projects that they are building. These investments will take place either through [Data Farming](../rewards/df-intro.md), [Ocean Shipyard](https://oceanprotocol.com/shipyard), or Ocean Ventures.
We will also be placing an additional 0.1% fee on projects that aren’t using either the Ocean token or H2O. We want to support marketplaces that use other tokens, but we also recognize that they don’t bring the same wider benefit to the Ocean community, so we feel this small additional fee is proportionate.
### Compute Options
An asset categorized as a `compute type` incorporates additional attributes under the `compute object`.
These attributes are specifically relevant to assets that fall within the compute category and are not required for assets classified under the `access type`. However, if an asset is designated as `compute`, it is essential to include these attributes to provide comprehensive information about the compute service associated with the asset.
### Trusted Algorithms
The `publisherTrustedAlgorithms` is an array of objects that specifies algorithm permissions. It controls which algorithms can be used for computation. If not defined, any published algorithm is allowed. If the array is empty, no algorithms are allowed. However, if the array is not empty, only algorithms published by the defined publishers are permitted.
The structure of each object within the `publisherTrustedAlgorithms` array is as follows:
Compute-to-Data introduces a paradigm where datasets remain securely within the premises of the data holder, ensuring strict data privacy and control. Only authorized algorithms are granted access to operate on these datasets, subject to specific conditions, within a secure and isolated environment. In this context, algorithms are treated as valuable assets, comparable to datasets, and can be priced accordingly. This approach enables data holders to maintain control over their sensitive data while allowing for valuable computations to be performed on them, fostering a balanced and secure data ecosystem.
To define the accessibility of algorithms, their classification as either public or private can be specified by setting the `attributes.main.type` value in the Decentralized Data Object (DDO):
* `"access"` - public. The algorithm can be downloaded, given appropriate datatoken.
* `"compute"` - private. The algorithm is only available to use as part of a compute job without any way to download it. The Algorithm must be published on the same Ocean Provider as the dataset it's targeted to run on.
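For instance, the DDO of a private algorithm would carry the following fragment (shown for illustration only; surrounding DDO fields are omitted):

```json
{
  "attributes": {
    "main": {
      "type": "compute"
    }
  }
}
```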
This flexibility allows for fine-grained control over algorithm usage, ensuring data privacy and enabling fair pricing mechanisms within the Compute-to-Data framework.
For each dataset, Publishers have the flexibility to define permission levels for algorithms to execute on their datasets, offering granular control over data access.
There are several options available for publishers to configure these permissions:
# Contracts
The [V4 release](https://blog.oceanprotocol.com/ocean-v4-overview-1ccd4a7ce150) of Ocean Protocol introduces a comprehensive and enhanced suite of [smart contracts](https://github.com/oceanprotocol/contracts/tree/main/contracts) that serve as the backbone of the decentralized data economy. These contracts facilitate secure, transparent, and efficient interactions among data providers, consumers, and ecosystem participants. With the introduction of V4 contracts, Ocean Protocol propels itself forward, delivering substantial functionality, scalability, and flexibility advancements.
The V4 smart contracts have been deployed across multiple [networks](../../discover/networks/README.md) and are readily accessible through the GitHub [repository](https://github.com/oceanprotocol/contracts/tree/main/contracts). The V4 introduces significant enhancements that encompass the following key **features**:
### Layer 1: The Foundational Blockchain Layer
At the core of Ocean Protocol lies the robust [Blockchain Layer](../contracts/README.md). Powered by blockchain technology, this layer ensures secure and transparent transactions. It forms the bedrock of decentralized trust, where data providers and consumers come together to trade valuable assets.
The [smart contracts](../contracts/README.md) are deployed on the Ethereum mainnet and other compatible [networks](../../discover/networks/README.md). The libraries encapsulate the calls to these smart contracts and provide features like publishing new assets, facilitating consumption, managing pricing, and much more. To explore the contracts in more depth, go ahead to the [contracts](../contracts/README.md) section.
These libraries include [Ocean.js](../ocean.js/README.md), a JavaScript library, and [Ocean.py](../ocean.py/README.md), a Python library. They serve as powerful tools for developers, enabling integration and interaction with the protocol.
1. [Ocean.js](../ocean.js/README.md): Ocean.js is a JavaScript library that serves as a powerful tool for developers looking to integrate their applications with the Ocean Protocol ecosystem. Designed to facilitate interaction with the protocol, Ocean.js provides a comprehensive set of functionalities, including data tokenization, asset management, and smart contract interaction. Ocean.js simplifies the process of implementing data access controls, building dApps, and exploring data sets within a decentralized environment.
2. [Ocean.py](../ocean.py/README.md): Ocean.py is a Python library that empowers developers to integrate their applications with the Ocean Protocol ecosystem. With its rich set of functionalities, Ocean.py provides a comprehensive toolkit for interacting with the protocol. Developers and [data scientists](../../data-science/README.md) can leverage Ocean.py to perform a wide range of tasks, including data tokenization, asset management, and smart contract interactions. This library serves as a bridge between Python and the decentralized world of Ocean Protocol, enabling you to harness the power of decentralized data.
#### Middleware components
### Layer 3: The Accessible Application Layer
Here, the ocean comes alive with a vibrant ecosystem of dApps, marketplaces, and more. This layer hosts a variety of user-friendly interfaces, applications, and tools, inviting data scientists and curious explorers alike to access, explore, and contribute to the ocean's treasures.
Prominently featured within this layer is [Ocean Market](../../user-guides/using-ocean-market.md), a hub where data enthusiasts and industry stakeholders converge to discover, trade, and unlock the inherent value of data assets. Beyond Ocean Market, the Application Layer hosts a diverse ecosystem of specialized applications and marketplaces, each catering to unique use cases and industries. Empowered by the capabilities of Ocean Protocol, these applications facilitate advanced data exploration, analytics, and collaborative ventures, revolutionizing the way data is accessed, shared, and monetized.
### Layer 4: The Friendly Wallets
We have implemented data NFTs using the [ERC721 standard](https://erc721.org/).
ERC721 tokens are non-fungible, and thus cannot be used for automatic price discovery like ERC20 tokens. ERC721 and ERC20 combined together can be used for sub-licensing. Ocean Protocol's [ERC721Template](https://github.com/oceanprotocol/contracts/blob/v4main/contracts/templates/ERC721Template.sol) solves this problem by using ERC721 for tokenizing the **Base IP** and tokenizing sub-licenses by using ERC20. To save gas fees, it uses [ERC1167](https://eips.ethereum.org/EIPS/eip-1167) proxy approach on the **ERC721 template**.
Our implementation has been built on top of the battle-tested [OpenZeppelin contract library](https://docs.openzeppelin.com/contracts/4.x/erc721). However, there are a bunch of interesting parts of our implementation that go a bit beyond an out-of-the-box NFT. The data NFTs can be easily managed from any NFT marketplace like [OpenSea](https://opensea.io/).
<figure><img src="../../.gitbook/assets/wallet/data_nft_open_sea.png" alt=""><figcaption><p>Data NFT on Open Sea</p></figcaption></figure>
The details regarding currently supported **datatoken templates** are as follows:
### **Regular template**
The regular template allows users to buy/sell/hold datatokens. The datatokens can be minted by the address having a [`MINTER`](roles.md#minter) role, making the supply of datatoken variable. This template is assigned `templateId = 1` and the source code is available [here](https://github.com/oceanprotocol/contracts/blob/v4main/contracts/templates/ERC20Template.sol).
### **Enterprise template**
#### Set the template
When you're creating an ERC20 datatoken, you can specify the desired template by passing on the template index.
{% tabs %}
{% tab title="Ocean.js" %}
To specify the datatoken template via ocean.js, you need to customize the [DatatokenCreateParams](https://github.com/oceanprotocol/ocean.js/blob/ae2ff1ccde53ace9841844c316a855de271f9a3f/src/%40types/Datatoken.ts#L3) with your desired `templateIndex`.
The default template used is 1.
{% endtab %}
{% tab title="Ocean.py" %}
To specify the datatoken template via ocean.py, you need to customize the [DatatokenArguments](https://github.com/oceanprotocol/ocean.py/blob/bad11fb3a4cb00be8bab8febf3173682e1c091fd/ocean_lib/models/datatoken_base.py#L64) with your desired `template_index`.
The default template used is 1.
{% endtabs %}
{% hint style="info" %}
By default, all assets published through the Ocean Market use the Enterprise Template.
{% endhint %}
#### Retrieve the template
To identify the template used for a specific asset, you can easily retrieve this information:
3. Once you have located the datatoken address, click on the contract tab to access more details.
4. Within the contract details, we can identify and determine the template used for the asset.
We like making things easy :sunglasses: so here is an even easier way to retrieve the info for [this](https://market.oceanprotocol.com/asset/did:op:cd086344c275bc7c560e91d472be069a24921e73a2c3798fb2b8caadf8d245d6) asset published in the Ocean Market:
{% embed url="https://app.arcade.software/share/wxBPSc42eSYUiawSY8rC" fullWidth="false" %}
{% endembed %}
Fungible tokens are a type of digital asset that are identical and interchangeable with each other. Each unit of a fungible token holds the same value and can be exchanged on a one-to-one basis. This means that one unit of a fungible token is indistinguishable from another unit of the same token. Examples of fungible tokens include cryptocurrencies like Bitcoin (BTC) and Ethereum (ETH), where each unit of the token is equivalent to any other unit of the same token. Fungible tokens are widely used for transactions, trading, and as a means of representing value within blockchain-based ecosystems.
## What is a Datatoken?
Datatokens are fundamental within Ocean Protocol, representing a key mechanism to **access** data assets in a decentralized manner. In simple terms, a datatoken is an **ERC20-compliant token** that serves as **access control** for a data/service represented by a [data NFT](data-nfts.md).
When a user exchanges a [datatoken](datatokens.md) for the privilege of downloading an asset or initiating a compute job that utilizes the asset, consume fees come into play. These fees are associated with accessing an asset and include:
|
||||
|
||||
1. **Publisher Market** Consumption Fee 
|
||||
1. **Publisher Market** Consumption Fee
|
||||
* Defined during the ERC20 [creation](https://github.com/oceanprotocol/contracts/blob/b937a12b50dc4bdb7a6901c33e5c8fa136697df7/contracts/templates/ERC721Template.sol#L334).
|
||||
* Defined as Address, Token, Amount. The amount is an absolute value(not a percentage).
|
||||
* A marketplace can charge a specified amount per order. 
|
||||
* A marketplace can charge a specified amount per order.
|
||||
* Eg: A market can set a fixed fee of 10 USDT per order, no matter what pricing schemas are used (fixedrate with ETH, BTC, dispenser, etc).
|
||||
2. **Consume Market** Consumption Fee 
|
||||
*  A market can specify what fee it wants on the order function.
|
||||
3. **Provider Consumption** Fees 
|
||||
2. **Consume Market** Consumption Fee
|
||||
* A market can specify what fee it wants on the order function.
|
||||
3. **Provider Consumption** Fees
|
||||
* Defined by the [Provider](../provider/README.md) for any consumption.
|
||||
* Expressed in: Address, Token, Amount (absolute), Timeout.
|
||||
* You can retrieve them when calling the initialize endpoint. 
|
||||
* You can retrieve them when calling the initialize endpoint.
|
||||
* Eg: A provider can charge a fixed fee of 10 USDT per consume, irrespective of the pricing schema used (e.g., fixed rate with ETH, BTC, dispenser).
|
||||
4. **Ocean Community** Fee
|
||||
* Ocean's smart contracts collect **Ocean Community fees** during order operations. These fees are reinvested in community projects and distributed to the veOcean holders through Data Farming.
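
To make the arithmetic of the list above concrete, here is a minimal sketch of how the absolute consume fees on a single order add up. The fee values are invented for illustration, and because each fee is an absolute amount in its own token, totals are tracked per token rather than mixed:

```python
# Illustrative bookkeeping only: each consume fee is an absolute amount in its
# own token, so totals can only be summed per token. Values below are invented.
from collections import defaultdict

def total_order_fees(fees):
    """Sum absolute consume fees per token. `fees` is a list of (token, amount)."""
    totals = defaultdict(float)
    for token, amount in fees:
        totals[token] += amount
    return dict(totals)

order_fees = [
    ("USDT", 10.0),   # publisher market consumption fee (fixed, per order)
    ("USDT", 10.0),   # provider consumption fee (fixed, per consume)
    ("OCEAN", 0.5),   # ocean community fee (example value)
]
print(total_order_fees(order_fees))  # {'USDT': 20.0, 'OCEAN': 0.5}
```

Note that this is bookkeeping only; the actual fee collection happens inside the order transaction on-chain.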

@ -70,7 +70,7 @@ function updateOPCFee(uint256 _newSwapOceanFee, uint256 _newSwapNonOceanFee,

</details>

Each of these fees plays a role in ensuring fair compensation and supporting the Ocean community.

| Fee | Value in Ocean Market | Value in Other Markets |
| ---------------- | :-------------------: | -------------------------------------------------------- |

@ -81,14 +81,14 @@ Each of these fees plays a role in ensuring fair compensation and supporting the

### Provider fee

[Providers](../provider/README.md) facilitate data consumption, initiate compute jobs, encrypt and decrypt DDOs, and verify user access to specific data assets or services.

Provider fees serve as [compensation](../community-monetization.md#3.-running-your-own-provider) to the individuals or organizations operating their own provider instances when users request assets.

* Defined by the [Provider](../provider/README.md) for any consumption.
* Expressed in: Address, Token, Amount (absolute), Timeout.
* You can retrieve them when calling the initialize endpoint.
* These fees can be set as a **fixed amount** rather than a percentage.
* Providers have the flexibility to specify the token in which the fees must be paid, which can differ from the token used in the consuming market.
* Provider fees can be utilized to charge for [computing](../compute-to-data/README.md) resources. Consumers can select the desired payment amount based on the compute resources required to execute an algorithm within the [Compute-to-Data](../compute-to-data/README.md) environment, aligning with their specific needs.
* Eg: A provider can charge a fixed fee of 10 USDT per consume, irrespective of the pricing schema used (e.g., fixed rate with ETH, BTC, dispenser).
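
The (Address, Token, Amount, Timeout) tuple described above can be modeled as a small structure. This is a hedged sketch with invented values; the real values are whatever the provider returns when you call its initialize endpoint:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProviderFee:
    """Sketch of the fee tuple a provider reports: who gets paid, in which
    token, how much (an absolute amount, not a percentage), and a validity
    window."""
    address: str   # account that collects the fee
    token: str     # token the fee must be paid in (may differ from the market's token)
    amount: int    # absolute amount, in the token's smallest unit
    timeout: int   # validity window in seconds

# Invented example: 10 USDT (6 decimals) per consume, valid for one hour.
fee = ProviderFee(
    address="0x0000000000000000000000000000000000000001",
    token="USDT",
    amount=10 * 10**6,
    timeout=3600,
)
assert fee.amount == 10_000_000
```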

@ -8,7 +8,7 @@ Having a [data NFT](data-nfts.md) that generates revenue continuously, even when

<figure><img src="../../.gitbook/assets/gif/sponge-money.gif" alt=""><figcaption><p>Make it rain</p></figcaption></figure>

By default, the revenue generated from a [data NFT](data-nfts.md) is directed to the [owner](roles.md#nft-owner) of the NFT. This arrangement automatically updates whenever the data NFT is transferred to a new owner.

However, there are scenarios where you may prefer the revenue to be sent to a different account instead of the owner. This can be accomplished by designating a new payment collector. This feature becomes particularly beneficial when the data NFT is owned by an organization or enterprise rather than an individual.

@ -9,9 +9,9 @@ description: >-

# DDO Specification

### DDO Schema - High Level

The below diagram shows the high-level DDO schema depicting the content of each data structure and the relations between them.

Please note that some data structures apply only to certain types of services or assets.
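
For orientation before the diagrams, the top-level shape of a DDO can be sketched as a plain dictionary. Field names follow the DDO structure this page describes, but treat this as a simplified illustration rather than the normative schema, and the addresses are placeholders:

```python
# Simplified sketch of a DDO's top-level structure (illustrative, not normative).
ddo = {
    "@context": ["https://w3id.org/did/v1"],
    "id": "did:op:" + "0" * 64,          # DID of the asset (placeholder hash)
    "version": "4.1.0",
    "metadata": {                         # describes the asset itself
        "type": "dataset",
        "name": "Example dataset",
        "description": "Illustrative metadata block",
        "author": "Example author",
        "license": "CC0",
    },
    "services": [                         # each service grants one kind of access
        {
            "id": "access-1",
            "type": "access",             # e.g. "access" or "compute"
            "datatokenAddress": "0x0000000000000000000000000000000000000002",  # placeholder
            "serviceEndpoint": "https://provider.example.com",  # placeholder provider URL
            "timeout": 0,
        }
    ],
}
assert {"id", "version", "metadata", "services"} <= ddo.keys()
```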

@ -492,7 +492,7 @@ Details for each of these are explained on the [Compute Options page](compute-to

### DDO Schema - Detailed

The below diagram shows the detailed DDO schema depicting the content of each data structure and the relations between them.

Please note that some data structures apply only to certain types of services or assets.

@ -13,11 +13,11 @@ Ocean offers two approaches to facilitate fractional ownership:

1. Sharded Holding of ERC20 Datatokens: Under this approach, each holder of ERC20 tokens possesses the typical datatoken rights outlined earlier. For instance, owning 1.0 datatoken allows consumption of a particular asset. Ocean conveniently provides this feature out of the box.
2. Sharding ERC721 Data NFT: This method involves dividing the ownership of an ERC721 data NFT among multiple individuals, granting each co-owner the right to a portion of the earnings generated from the underlying IP. Moreover, these co-owners collectively control the data NFT. For instance, a dedicated DAO may be established to hold the data NFT, featuring its own ERC20 token. DAO members utilize their tokens to vote on updates to data NFT roles or the deployment of ERC20 datatokens associated with the ERC721.

It's worth noting that for the second approach, one might consider utilizing platforms like Niftex for sharding. However, important questions arise in this context:

* What specific rights do shard-holders possess?
  * It's possible that they have limited rights, just as Amazon shareholders don't have the authority to roam the hallways of Amazon's offices simply because they own shares.
* Additionally, how do shard-holders exercise control over the data NFT?

These concerns are effectively addressed by employing a tokenized DAO, as previously described.

@ -36,7 +36,7 @@ console.log(did)

```

Before creating a DID you should first publish a data NFT. We suggest reading the following sections so you are familiar with the process:

* [Creating a data NFT with ocean.js](ocean.js/creating-datanft.md)
* [Publish flow with ocean.py](ocean.py/publish-flow.md)
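
As a sanity check on what the snippet above produces, the `did:op:` identifier used by the Ocean libraries is derived deterministically from the data NFT address and the chain ID. The sketch below mirrors that convention (sha256 over the address concatenated with the chain ID); verify it against your library version before relying on it:

```python
import hashlib

def compute_did(nft_address: str, chain_id: int) -> str:
    """Sketch of Ocean's DID derivation: 'did:op:' + sha256(address + chainId).
    Mirrors the convention used by the Ocean libraries; verify against your
    library version."""
    digest = hashlib.sha256((nft_address + str(chain_id)).encode("utf-8")).hexdigest()
    return "did:op:" + digest

# Example NFT address is a placeholder; any checksummed address works the same way.
did = compute_did("0x1c161449DF1fE13DCfC4Ac765E5B47b9c9029151", 1)
assert did.startswith("did:op:") and len(did) == len("did:op:") + 64
```

Because the hash is deterministic, the same NFT address on the same chain always yields the same DID.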

@ -29,5 +29,3 @@ Our module structure follows this format:

* Utils

When working with a particular module, you will need to provide different parameters. To instantiate classes from the contracts module, you must pass objects such as Signer, which represents the wallet instance, or the contract address you wish to utilize, depending on the scenario. As for the services modules, you will need to provide the provider URI or metadata cache URI.

@ -2,7 +2,7 @@

**Overview**

Compute-to-Data is a powerful feature of Ocean Protocol that enables privacy-preserving data analysis and computation. With Compute-to-Data, data owners can maintain control over their data while allowing external parties to perform computations on that data.

This documentation provides an overview of Compute-to-Data in Ocean Protocol and explains how to use it with Ocean.js. For detailed code examples and implementation details, please refer to the official [Ocean.js](https://github.com/oceanprotocol/ocean.js) GitHub repository.

@ -74,7 +74,7 @@ PRIVATE_KEY=0xc594c6e5def4bab63ac29eed19a134c130388f74f019bc74b8f4389df2837a58

{% endtab %}
{% endtabs %}

Replace `<replace this>` with the appropriate values. You can see all the networks' configurations in Oceanjs' [config helper](https://github.com/oceanprotocol/ocean.js/blob/main/src/config/ConfigHelper.ts#L42).

### Setup dependencies

@ -11,7 +11,7 @@ This tutorial guides you through the process of creating your own data NFT using

#### Create a script to deploy dataNFT

The provided script demonstrates how to create a data NFT using Oceanjs.

First, create a new file in the working directory, alongside the `config.js` and `.env` files. Name it `create_dataNFT.js` (or any appropriate name). Then, copy the following code into the newly created file:

@ -17,7 +17,7 @@ Create a new file in the same working directory where configuration file (`confi

**Fees**: The code snippets below define fees-related parameters. Please refer to the [fees page](../contracts/fees.md) for more details.
{% endhint %}

The code utilizes methods such as `NftFactory` and `Datatoken` from the Ocean libraries to enable you to interact with the Ocean Protocol and perform various operations related to data NFTs and datatokens.

The `createFRE()` function performs the following:

@ -102,7 +102,7 @@ The _beginning_ of the file should contain the following contents:

...
```

Here’s a video version for this post 👇

{% embed url="https://www.youtube.com/watch?v=JQF-5oRvq9w" %}
Main Flow Video

@ -4,8 +4,7 @@ Let’s start interacting with the python library by firstly installing it & its

From the adventurous `Python 3.8.5` all the way up to `Python 3.10.4`, ocean.py has got your back! 🚀

While `ocean.py` can join you on your `Python 3.11` journey, a few manual tweaks may be required. But worry not, brave explorers, we've got all the juicy details for you below! 📚✨

⚠️ Make sure that you have `autoconf`, `pkg-config` and `build-essential` or their equivalents installed on your host.

### Installing ocean.py

@ -54,7 +53,6 @@ Let's dive deeper into the Ocean world! 💙 Did you know that Ocean and Brownie

Oh, buoy! 🌊🐙 When it comes to installation, ocean.py has you covered with a special README called ["install.md"](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/install.md). It's like a trusty guide that helps you navigate all the nitty-gritty details. So, let's dive in and ride the waves of installation together! 🏄♂️🌊

Or if you prefer a video format, you can check this tutorial on Youtube:

{% embed url="https://www.youtube.com/watch?v=mbniGPNHE_M" %}

@ -31,7 +31,7 @@ You've now published an Ocean asset!

* [`data_nft`](../contracts/data-nfts.md) is the base (base IP)
* [`datatoken`](../contracts/datatokens.md) for access by others (licensing)
* [`ddo`](../ddo-specification.md) holding metadata

<figure><img src="../../.gitbook/assets/gif/200.webp" alt=""><figcaption></figcaption></figure>

@ -113,7 +113,7 @@ If you call `create()` after this, you can pass in an argument `deployed_datatok

Ocean Assets allows you to bundle several common scenarios as a single transaction, thus lowering gas fees.

Any of the `ocean.assets.create_<type>_asset()` functions can also take an optional parameter that describes a bundled [pricing schema](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#LL199C5-L219C10) (Dispenser or Fixed Rate Exchange).

Here is an example involving an exchange:

@ -100,7 +100,7 @@ You can bypass manually: just edit your brownie network config file.

Or you can bypass via the command line. The following command replaces Infura RPCs with public ones in `network-config.yaml`:

* Linux users: in the console:

{% code overflow="wrap" %}
```bash

@ -10,7 +10,7 @@ Retrieves the last-used nonce value for a specific user's Ethereum address.

Here are some typical responses you might receive from the API:

* **200**: This is a successful HTTP response code. It means the server has successfully processed the request and returns a JSON object containing the nonce value.

Example response:

@ -42,7 +42,7 @@ Retrieves Content-Type and Content-Length from the given URL or asset.

* `serviceId`: This is a string representing the ID of the service.
* **Purpose**: This endpoint is used to retrieve the `Content-Type` and `Content-Length` from a given URL or asset. For published assets, `did` and `serviceId` should be provided. It also accepts file objects (as described in the Ocean Protocol documentation) and can compute a checksum if the file size is less than `MAX_CHECKSUM_LENGTH`. For larger files, the checksum will not be computed.
* **Responses**:
* **200**: This is a successful HTTP response code. It returns a JSON object containing the file info.

Example response:

@ -89,11 +89,11 @@ console.log(response)

#### Javascript Example

Before calling the `/download` endpoint, you need to follow these steps:

1. You need to set up and connect a wallet for the consumer. The consumer needs to have purchased the datatoken for the asset that you are trying to download. Libraries such as ocean.js or ocean.py can be used for this.
2. Get the nonce. This can be done by calling the `/getnonce` endpoint above.
3. Sign a message from the account that has purchased the datatoken.
4. Add the nonce and signature to the payload.
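
The four steps can be sketched as plain payload assembly. The signature itself must come from the consumer's wallet (e.g. produced via a library such as ocean.js or ocean.py), so it is a placeholder below, and the parameter names are illustrative rather than authoritative:

```python
# Sketch of assembling the /download payload (step 4). The signature would be
# produced by the consumer's wallet over the DID + nonce message; it is a
# placeholder here, and the field names are illustrative.
def build_download_payload(did: str, consumer: str, service_id: str,
                           nonce: str, signature: str) -> dict:
    return {
        "documentId": did,
        "serviceId": service_id,
        "consumerAddress": consumer,
        "nonce": nonce,
        "signature": signature,  # placeholder: produced by the buyer's wallet
    }

payload = build_download_payload(
    did="did:op:" + "0" * 64,
    consumer="0x0000000000000000000000000000000000000001",
    service_id="access-1",
    nonce="42",
    signature="0xplaceholder",
)
assert payload["nonce"] == "42" and payload["documentId"].startswith("did:op:")
```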

```javascript

@ -139,7 +139,7 @@ downloadAsset(payload);

### Initialize

In order to consume a data service the user is required to send one datatoken to the provider.

The datatoken is transferred on the blockchain by requesting the user to sign an ERC20 approval transaction, where the approval is given to the provider's account for the number of tokens required by the service.
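
Since datatokens are standard ERC20 tokens with 18 decimals, "one datatoken" in the approval above corresponds to 10^18 base units. A small helper makes the conversion explicit (illustrative; the approval itself is an on-chain transaction submitted through a web3 library, which is not shown here):

```python
DATATOKEN_DECIMALS = 18  # datatokens are standard ERC20 tokens with 18 decimals

def to_wei(datatokens: float) -> int:
    """Convert a human-readable datatoken amount to base units for approve()."""
    return int(datatokens * 10**DATATOKEN_DECIMALS)

# Approving one datatoken for the provider means approving 10**18 base units.
assert to_wei(1) == 10**18
assert to_wei(0.5) == 5 * 10**17
```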

@ -10,7 +10,7 @@ description: >-

The [Ocean Subgraph](https://github.com/oceanprotocol/ocean-subgraph) is built on top of [The Graph](https://thegraph.com/) (the popular :sunglasses: indexing and querying protocol for blockchain data). It is an essential component of the Ocean Protocol ecosystem. It provides an off-chain service that utilizes GraphQL to offer efficient access to information related to datatokens, users, and balances. By leveraging the subgraph, data retrieval becomes faster compared to an on-chain query. The data sourced from the Ocean subgraph can be accessed through [GraphQL](https://graphql.org/learn/) queries.

Imagine this 💭: if you were to always fetch data directly from the chain, you'd start to feel a little...old :older\_woman: Like your queries are stuck in a time warp. But fear not! When you embrace the power of the subgraph, data becomes your elixir of youth.

<figure><img src="../../.gitbook/assets/components/subgraph.png" alt=""><figcaption><p>Ocean Subgraph </p></figcaption></figure>

@ -42,7 +42,7 @@ When it comes to fetching valuable information about [Data NFTs](../contracts/da

When making subgraph queries, please remember that the parameters you send, such as a datatoken address or a data NFT address, should be in **lowercase**. This is an essential requirement to ensure accurate processing of the queries. We kindly request your attention to this detail to facilitate a seamless query experience.
{% endhint %}

In the following pages, we've prepared a few examples just for you. From running queries to exploring data, you'll have the chance to dive right into the Ocean Subgraph data. There, you'll find a wide range of additional code snippets and examples that showcase the power and versatility of the Ocean Subgraph. So, grab a virtual snorkel, and let's explore together! 🤿

{% hint style="info" %}
For more examples, visit the subgraph GitHub [repository](https://github.com/oceanprotocol/ocean-subgraph), where you'll discover an extensive collection of code snippets and examples that highlight the Subgraph's capabilities and adaptability.

@ -127,7 +127,7 @@ python datatoken_buyers.py

{% endtab %}

{% tab title="Query" %}
Copy the query to fetch the list of buyers for a datatoken in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mumbai.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph).

```graphql

@ -150,7 +150,7 @@ print(json.dumps(result, indent=4, sort_keys=True))

{% endtab %}

{% tab title="Query" %}
Copy the query to fetch the information of a datatoken in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql).

```
{

@ -154,7 +154,7 @@ axios(config)

{% endtab %}

{% tab title="Python" %}
You can employ the following Python script to execute the query and fetch the list of veOCEAN holders from the subgraph.

{% code title="get_veOcean_holders.py" %}
```python

@ -4,11 +4,11 @@ description: 'Discover the World of NFTs: Retrieving a List of Data NFTs'

# Get data NFTs

If you are already familiar with the concept of NFTs, you're off to a great start. However, if you require a refresher, we recommend visiting the [data NFTs and datatokens page](../contracts/datanft-and-datatoken.md) for a quick overview.

Now, let us delve into the realm of utilizing the subgraph to extract a list of data NFTs that have been published using the Ocean contracts. By employing GraphQL queries, we can seamlessly retrieve the desired information from the subgraph. You'll see how simple it is :sunglasses:

You'll find below an example of a GraphQL query that retrieves the first 10 data NFTs from the subgraph. The query is structured to access the "nfts" route, extracting the first 10 elements. For each item, it retrieves the `id`, `name`, `symbol`, `owner`, `address`, `assetState`, `tx`, `block` and `transferable` parameters.
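
The query described above can be written out and wrapped in the JSON POST body a subgraph expects. The sketch below only builds the request payload; actually sending it is an HTTP POST to the subgraph endpoint, which is omitted here so the sketch stays self-contained and offline:

```python
import json

# The GraphQL query described above: first 10 data NFTs with selected fields.
query = """
{
  nfts(first: 10) {
    id
    name
    symbol
    owner
    address
    assetState
    tx
    block
    transferable
  }
}
"""

# A subgraph query is an HTTP POST whose JSON body has the form {"query": ...}.
payload = json.dumps({"query": query})
assert "nfts(first: 10)" in json.loads(payload)["query"]
```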

There are several options available to see this query in action. Below, you will find three:

@ -12,7 +12,7 @@ _PS: In this example, the query is executed on the Ocean subgraph deployed on th

{% tabs %}
{% tab title="Javascript" %}
The javascript below can be used to run the query. If you wish to change the network, replace the value of the `network` variable as needed.

```runkit nodeVersion="18.x.x"
var axios = require('axios');

@ -134,7 +134,7 @@ python list_all_tokens.py

{% endtab %}

{% tab title="Query" %}
Copy the query to fetch a list of datatokens in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql).

```graphql
{

@ -12,7 +12,7 @@ _PS: In this example, the query is executed on the Ocean subgraph deployed on th

{% tabs %}
{% tab title="Javascript" %}
The javascript below can be used to run the query and fetch a list of fixed-rate exchanges. If you wish to change the network, replace the value of the `network` variable as needed.

```runkit nodeVersion="18.x.x"
var axios = require('axios');

@ -140,7 +140,7 @@ python list_fixed_rate_exchanges.py

{% endtab %}

{% tab title="Query" %}
Copy the query to fetch a list of fixed-rate exchanges in the Ocean Subgraph [GraphiQL interface](https://v4.subgraph.mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph/graphql).

```
{

@ -42,5 +42,4 @@ A new data economy with power to the people - Trent McConaghy

### Ocean Protocol Whitepaper

If you'd like to explore the details of our technology, feel free to dive into our [whitepaper](https://oceanprotocol.com/tech-whitepaper.pdf)! It's a comprehensive resource that explains all the technical details and the core concepts that drive Ocean Protocol. It's a great way to get a deeper understanding of what we're all about.

@ -9,7 +9,7 @@ Ocean Protocol is a **decentralized data exchange protocol** that aims to unlock

Ocean Protocol is used for a variety of purposes, including:

1. **Data Sharing**: Ocean Protocol allows individuals and organizations to share data securely and privately, enabling data owners to monetize their data assets while maintaining control over their data.
2. **Data Monetization**: Data owners can monetize their data by offering it for sale or by providing data services through compute-to-data (C2D) capabilities. Data consumers can access and utilize data assets.
3. **Decentralized Data Marketplaces**: Ocean Protocol facilitates the creation of decentralized data marketplaces where data providers can list their data assets and data consumers can discover and access them. These marketplaces operate on a peer-to-peer basis, eliminating the need for intermediaries and providing more efficient and transparent data transactions.
4. **AI Development**: Ocean Protocol supports the development of AI models by providing access to diverse and high-quality datasets. Data scientists and AI developers can leverage these datasets to train and improve their models, leading to more accurate and robust AI systems.
5. **Access control:** Ocean Protocol incorporates token-gating mechanisms that grant or restrict access to specific data assets based on predefined criteria, ensuring controlled and regulated data sharing within the ecosystem.

@ -142,8 +142,7 @@ As an Ocean Ambassador, you become an advocate for the protocol, promoting its v

<summary><mark style="color:green;">Contribute to Ocean Code Development</mark><br><br>Make a positive impact in the Web3 data economy by contributing to <a href="https://github.com/oceanprotocol">Ocean's open source code</a> on Github! From feature requests to pull requests, contributions of all kinds are appreciated.</summary>

To begin, [visit our Github page](https://github.com/oceanprotocol) where you can see the repos and contributors. If you're going to contribute code to a repo, then we ask that you fork the code first, make your changes, and then create a pull request for us to review. If you are reporting an issue, then please first search the existing issues to see if it has been documented yet. If not, please open a new issue describing your problem as clearly as possible and include screenshots.
|
||||
We also welcome you to join our [Discord developer community](https://discord.gg/TnXjkR5) where you can get rapid, practical advice on using Ocean tech but also get to know Ocean core team more personally!
|
||||
|
||||
</details>
|
||||
|
@ -102,7 +102,7 @@ The blockchain can do more than just store information - it can also run code. A
|
||||
|
||||
<summary>What is a datatoken?</summary>
|
||||
|
||||
A datatoken is an access token to datasets and services published in the Ocean ecosystem. Datatokens can be purchased via the Ocean Market or on a decentralized crypto exchange. . If a consumer wishes to access a dataset, they must acquire the datatoken and then exchange the datatoken for access to the dataset.
|
||||
A datatoken is an access token to datasets and services published in the Ocean ecosystem. Datatokens can be purchased via the Ocean Market or on a decentralized crypto exchange. If a consumer wishes to access a dataset, they must acquire the datatoken and then exchange the datatoken for access to the dataset.
|
||||
|
||||
</details>
|
||||
|
||||
@ -268,7 +268,7 @@ To learn more about systems driving veOCEAN and Data Farming, please [visit our
|
||||
|
||||
<summary>What about passive stakers — people who just want to stake in one place and be done?</summary>
|
||||
|
||||
Earnings are passive by default
|
||||
Earnings are passive by default.
|
||||
|
||||
</details>
|
||||
|
||||
@ -276,7 +276,7 @@ Earnings are passive by default
|
||||
|
||||
<summary>What about active stakers — people who want to do extra work and get rewarded?</summary>
|
||||
|
||||
Ot works. Half the DF revenue goes to veOCEAN stake that users can allocate. Allocate well → more \$$
|
||||
Half the DF revenue goes to veOCEAN stake that users can allocate. Allocate well → more \$$.
|
||||
|
||||
</details>
|
||||
|
||||
@ -338,7 +338,7 @@ They are deployed on Ethereum mainnet, alongside other Ocean contract deployment
|
||||
|
||||
<summary>What is the official veOCEAN epoch start_time?</summary>
|
||||
|
||||
veFeeDistributor has a start\_time of 1663804800 (Thu Sep 22 2022 00:00:00)
|
||||
veFeeDistributor has a start\_time of 1663804800 (Thu Sep 22 2022 00:00:00).
|
||||
|
||||
</details>
|
||||
|
||||
|
@ -20,7 +20,7 @@ Ocean Protocol is a decentralized data exchange protocol that enables individual

<summary>$OCEAN</summary>

The Ocean Protocol token (OCEAN) is a utility token used in the Ocean Protocol ecosystem. It serves as a medium of exchange and a unit of value for data services in the network. Participants in the Ocean ecosystem can use OCEAN tokens to buy and sell data, stake on data assets, and participate in the governance of the protocol. 
The Ocean Protocol token (OCEAN) is a utility token used in the Ocean Protocol ecosystem. It serves as a medium of exchange and a unit of value for data services in the network. Participants in the Ocean ecosystem can use OCEAN tokens to buy and sell data, stake on data assets, and participate in the governance of the protocol.

</details>

@ -36,7 +36,7 @@ The data consume value (DCV) is a key metric that refers to the amount of $ spen

<summary>Transaction Volume (TV)</summary>

The transaction value is a key metric that refers to the value of transactions within the ecosystem. 
The transaction value is a key metric that refers to the value of transactions within the ecosystem.

Transaction volume (TV) is often used interchangeably with data consume volume (DCV). DCV is a more refined metric that excludes activities like wash trading. DCV measures the actual consumption or processing of data within the protocol, which is a more accurate measure of the value generated by the ecosystem.

@ -262,9 +262,9 @@ In the context of Ocean Protocol, interoperability enables the integration of th

<summary>Smart contract</summary>

Smart contracts are self-executing digital contracts that allow for the automation and verification of transactions without the need for a third party. They are programmed using code and operate on a decentralized blockchain network. Smart contracts are designed to enforce the rules and regulations of a contract, ensuring that all parties involved fulfill their obligations. Once the conditions of the contract are met, the smart contract automatically executes the transaction, ensuring that the terms of the contract are enforced in a transparent and secure manner. 
Smart contracts are self-executing digital contracts that allow for the automation and verification of transactions without the need for a third party. They are programmed using code and operate on a decentralized blockchain network. Smart contracts are designed to enforce the rules and regulations of a contract, ensuring that all parties involved fulfill their obligations. Once the conditions of the contract are met, the smart contract automatically executes the transaction, ensuring that the terms of the contract are enforced in a transparent and secure manner.

Ocean ecosystem smart contracts are deployed on multiple blockchains like Polygon, Energy Web Chain, Binance Smart Chain, and others. The code is open source and available on the organization's [GitHub](https://github.com/oceanprotocol/contracts). 
Ocean ecosystem smart contracts are deployed on multiple blockchains like Polygon, Energy Web Chain, Binance Smart Chain, and others. The code is open source and available on the organization's [GitHub](https://github.com/oceanprotocol/contracts).

</details>
@ -354,7 +354,7 @@ A term used in the cryptocurrency and blockchain space to encourage developers a

###

### Decentralized Finance (DeFI) fundamentals
### Decentralized Finance (DeFi) fundamentals

<details>

@ -442,7 +442,7 @@ A strategy in which investors provide liquidity to a DeFi protocol in exchange f

<summary>AI</summary>

AI stands for Artificial Intelligence. It refers to the development of computer systems that can perform tasks that would typically require human intelligence to complete. AI technologies enable computers to learn, reason, and adapt in a way that resembles human cognition. 
AI stands for Artificial Intelligence. It refers to the development of computer systems that can perform tasks that would typically require human intelligence to complete. AI technologies enable computers to learn, reason, and adapt in a way that resembles human cognition.

</details>

@ -452,9 +452,6 @@ AI stands for Artificial Intelligence. It refers to the development of computer

Machine learning is a subfield of artificial intelligence (AI) that involves teaching computers to learn from data, without being explicitly programmed. In other words, it is a way for machines to automatically learn and improve from experience, without being explicitly told what to do in every situation.

\

</details>
@ -6,7 +6,7 @@ description: >-

# Manage Your OCEAN Tokens

If you don't see any Ocean Tokens in your crypto wallet software 🔎 (e.g. MetaMask or MyEtherWallet), don't worry! It might not know how to manage Ocean Tokens yet. 
If you don't see any Ocean Tokens in your crypto wallet software 🔎 (e.g. MetaMask or MyEtherWallet), don't worry! It might not know how to manage Ocean Tokens yet.

### Token Information
@ -6,9 +6,9 @@ coverY: 0

# 🔨 Infrastructure

There are many ways in which the components can be deployed, from simple configurations used for development and testing to complex configurations, used for production systems. 
There are many ways in which the components can be deployed, from simple configurations used for development and testing to complex configurations, used for production systems.

All the Ocean Protocol components ([Provider](../developers/provider/README.md), [Aquarius](../developers/aquarius/README.md), [Subgraph](../developers/subgraph/README.md)) are designed to run in Docker containers, on a Linux operating system. For simple configurations, we rely on Docker Engine and Docker Compose products to deploy and run our components, while for complex configurations we use Kubernetes. The guides included in this section will present both deployment options. 
All the Ocean Protocol components ([Provider](../developers/provider/README.md), [Aquarius](../developers/aquarius/README.md), [Subgraph](../developers/subgraph/README.md)) are designed to run in Docker containers, on a Linux operating system. For simple configurations, we rely on Docker Engine and Docker Compose products to deploy and run our components, while for complex configurations we use Kubernetes. The guides included in this section will present both deployment options.

Please note that deploying the Ocean components requires a good understanding of:

@ -16,6 +16,6 @@ Please note that deploying the Ocean components requires a good understanding of

* Docker Engine
* Docker Compose or Kubernetes (depending on the configuration chosen for the component deployment)

Please note that although Ocean Marketplace is not a core component of our stack but rather an example of what can be achieved with our technology, in this section we included a guide on how to deploy it. 
Please note that although Ocean Marketplace is not a core component of our stack but rather an example of what can be achieved with our technology, in this section we included a guide on how to deploy it.

All components need to be deployed on a server, so we included a guide about how to install and configure a server with all the necessary tools.
@ -4,7 +4,7 @@ title: Minikube Compute-to-Data Environment

# Deploying C2D

This chapter will present how to deploy the C2D component of the Ocean stack. As mentioned in the [C2D Architecture chapter](../developers/compute-to-data/#architecture-and-overview-guides), the Compute-to-Data component uses Kubernetes to orchestrate the creation and deletion of the pods in which the C2D jobs are run. 
This chapter will present how to deploy the C2D component of the Ocean stack. As mentioned in the [C2D Architecture chapter](../developers/compute-to-data/#architecture-and-overview-guides), the Compute-to-Data component uses Kubernetes to orchestrate the creation and deletion of the pods in which the C2D jobs are run.

For the ones that do not have a Kubernetes environment available, we added to this guide instructions on how to install Minikube, which is a lightweight Kubernetes implementation that creates a VM on your local machine and deploys a simple cluster containing only one node. In case you have a Kubernetes environment in place, please skip directly to step 4 of this guide.

@ -81,7 +81,7 @@ watch kubectl get pods --all-namespaces

#### Run the IPFS host (optional)

To store the results and the logs of the C2D jobs, you can use either an AWS S3 bucket or IPFS. 
To store the results and the logs of the C2D jobs, you can use either an AWS S3 bucket or IPFS.

In case you want to use IPFS you need to run an IPFS host, as presented below.

@ -97,11 +97,11 @@ sudo /bin/sh -c 'echo "127.0.0.1 youripfsserver" >> /etc/hosts'

#### Update the storage class

The storage class is used by Kubernetes to create the temporary volumes on which the data used by the algorithm will be stored. 
The storage class is used by Kubernetes to create the temporary volumes on which the data used by the algorithm will be stored.

Please ensure that your class allocates volumes in the same region and zone where you are running your pods. 
Please ensure that your class allocates volumes in the same region and zone where you are running your pods.

You need to consider the storage class available for your environment. 
You need to consider the storage class available for your environment.

For Minikube, you can use the default 'standard' class.
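To make the zone-pinning requirement concrete, a minimal StorageClass sketch is shown below. The class name, provisioner, and zone are illustrative assumptions, not values from this guide; Minikube users can keep the default 'standard' class.

```yaml
# Illustrative StorageClass pinned to a single zone (adjust the provisioner
# and zone to your own cluster before applying with `kubectl apply -f`).
kind: StorageClass
apiVersion: storage.k8s.io/v1
metadata:
  name: ocean-compute-storage
provisioner: kubernetes.io/aws-ebs
parameters:
  type: gp2
allowedTopologies:
  - matchLabelExpressions:
      - key: topology.kubernetes.io/zone
        values:
          - us-east-1a
```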
@ -29,7 +29,7 @@ This guide will deploy Aquarius, including Elasticsearch as a single systemd ser

From a terminal console, create /etc/docker/compose/aquarius/docker-compose.yml file, then copy and paste the following content to it. Check the comments in the file and replace the fields with the specific values of your implementation. The following example is for deploying Aquarius for Goerli network.

For each other network in which you want to deploy Aquarius, add to the file a section similar to "aquarius-events-goerli" included in this example and update the corresponding parameters (i.e. EVENTS\_RPC, OCEAN\_ADDRESS, SUBGRAPH\_URLS) specific to that network. \\
For each other network in which you want to deploy Aquarius, add to the file a section similar to "aquarius-events-goerli" included in this example and update the corresponding parameters (i.e. EVENTS\_RPC, OCEAN\_ADDRESS, SUBGRAPH\_URLS) specific to that network.

```yaml
version: '3.9'
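The "single systemd service" wrapper around Docker Compose usually looks like the template unit below. The unit name and paths are assumptions for illustration, matching the /etc/docker/compose/&lt;service&gt; layout used in this guide:

```ini
# /etc/systemd/system/docker-compose@.service (illustrative template unit;
# enable for Aquarius with: systemctl enable --now docker-compose@aquarius)
[Unit]
Description=%i service with docker compose
Requires=docker.service
After=docker.service

[Service]
Type=oneshot
RemainAfterExit=true
WorkingDirectory=/etc/docker/compose/%i
ExecStart=/usr/bin/docker compose up -d --remove-orphans
ExecStop=/usr/bin/docker compose down

[Install]
WantedBy=multi-user.target
```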
@ -4,7 +4,7 @@

Ocean subgraph allows querying the datatoken, data NFT, and all event information using GraphQL. Hosting the Ocean subgraph saves the cost and time required in querying the data directly from the blockchain. The steps in this tutorial will explain how to host Ocean subgraph for the EVM-compatible chains supported by Ocean Protocol.

Ocean Subgraph is deployed on top of [graph-node](https://github.com/graphprotocol/graph-node), therefore, in this document, we will show first how to deploy graph-node - either using Docker Engine or Kubernetes - and then how to install Ocean Subgraph on the graph-node system. 
Ocean Subgraph is deployed on top of [graph-node](https://github.com/graphprotocol/graph-node), therefore, in this document, we will show first how to deploy graph-node - either using Docker Engine or Kubernetes - and then how to install Ocean Subgraph on the graph-node system.

## Deploying Graph-node using Docker Engine and Docker Compose

@ -25,7 +25,7 @@ Ocean Subgraph is deployed on top of [graph-node](https://github.com/graphprotoc

#### 1. Create the /etc/docker/compose/graph-node/docker-compose.yml file

From a terminal console, create the _/etc/docker/compose/graph-node/docker-compose.yml_ file, then copy and paste the following content to it (. Check the comments in the file and replace the fields with the specific values of your implementation. 
From a terminal console, create the _/etc/docker/compose/graph-node/docker-compose.yml_ file, then copy and paste the following content to it. Check the comments in the file and replace the fields with the specific values of your implementation.

_/etc/docker/compose/graph-node/docker-compose.yml_ (annotated - example for `mumbai` network)

@ -174,7 +174,7 @@ Then, check the logs of the Ocean Subgraph docker container:

docker logs graph-node [--follow]
```

## Deploying graph-node using Kubernetes 
## Deploying graph-node using Kubernetes

In this example, we will deploy graph-node as a Kubernetes deployment service. [graph-node](https://github.com/graphprotocol/graph-node) has the following dependencies: PostgreSQL and IPFS.

@ -431,7 +431,7 @@ spec:

## Deploy Ocean Subgraph

After you deployed graph-node, either using Kubernetes or Docker Compose, you can proceed to deploy Ocean Subgraph on top of it. 
After you deployed graph-node, either using Kubernetes or Docker Compose, you can proceed to deploy Ocean Subgraph on top of it.

### Prerequisites
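For orientation, graph-node and its two dependencies (PostgreSQL and IPFS) can be stood up together with a compose file along these lines — image tags, credentials, and the RPC endpoint are illustrative placeholders, not values from this guide:

```yaml
version: '3'
services:
  graph-node:
    image: graphprotocol/graph-node
    ports:
      - '8000:8000'   # GraphQL HTTP queries
      - '8020:8020'   # admin JSON-RPC (subgraph deployment)
    environment:
      postgres_host: postgres
      postgres_user: graph-node
      postgres_pass: let-me-in        # placeholder, change it
      postgres_db: graph-node
      ipfs: 'ipfs:5001'
      ethereum: 'mumbai:https://your-mumbai-rpc-endpoint'  # placeholder RPC
  ipfs:
    image: ipfs/kubo:latest
    ports:
      - '5001:5001'
  postgres:
    image: postgres:14
    command: ['postgres', '-cshared_preload_libraries=pg_stat_statements']
    environment:
      POSTGRES_USER: graph-node
      POSTGRES_PASSWORD: let-me-in    # placeholder, change it
      POSTGRES_DB: graph-node
```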
@ -2,7 +2,7 @@

### About Provider

Provider encrypts the URL and metadata during publishing and decrypts the URL when the dataset is downloaded or a compute job is started. It enables access to the data assets by streaming data (and never the URL). It performs checks on-chain for buyer permissions and payments. It also provides compute services (connects to a C2D environment). 
Provider encrypts the URL and metadata during publishing and decrypts the URL when the dataset is downloaded or a compute job is started. It enables access to the data assets by streaming data (and never the URL). It performs checks on-chain for buyer permissions and payments. It also provides compute services (connects to a C2D environment).

Provider is a multichain component, meaning that it can handle these tasks on multiple chains with the proper configurations. The source code of Provider can be accessed from [here](https://github.com/oceanprotocol/provider).

@ -36,7 +36,7 @@ The steps to deploy the Provider using Docker Engine and Docker Compose are:

#### 1. Create the /etc/docker/compose/provider/docker-compose.yml file

From a terminal console, create /etc/docker/compose/provider/docker-compose.yml file, then copy and paste the following content to it. Check the comments in the file and replace the fields with the specific values of your implementation. 
From a terminal console, create /etc/docker/compose/provider/docker-compose.yml file, then copy and paste the following content to it. Check the comments in the file and replace the fields with the specific values of your implementation.

```yaml
version: '3'

@ -142,7 +142,7 @@ Jun 14 09:41:53 testvm systemd[1]: Finished provider service with docker compose

#### 6. Confirm the Provider is accessible

Once started, the Provider service is accessible on `localhost` port 8030/tcp. Run the following command to access the Provider. The output should be similar to the one displayed here. 
Once started, the Provider service is accessible on `localhost` port 8030/tcp. Run the following command to access the Provider. The output should be similar to the one displayed here.

```bash
$ curl localhost:8030

@ -213,7 +213,7 @@ The steps to deploy the Provider in Kubernetes are:

#### 1. Create a YAML file for Provider configuration.

From a terminal window, create a YAML file (in our example the file is named provider-deploy.yaml) then copy and paste the following content. Check the comments in the file and replace the fields with the specific values of your implementation (RPC URLs, the private key etc.). 
From a terminal window, create a YAML file (in our example the file is named provider-deploy.yaml) then copy and paste the following content. Check the comments in the file and replace the fields with the specific values of your implementation (RPC URLs, the private key etc.).

```yaml
apiVersion: apps/v1
@ -16,9 +16,9 @@ For simple configurations:

For complex configurations:

* Operating System: Linux distribution supported by Kubernetes and Docker Engine. Please refer to this link for details: [Kubernetes with Docker Engine](https://kubernetes.io/docs/setup/production-environment/container-runtimes/#docker). 
* Operating System: Linux distribution supported by Kubernetes and Docker Engine. Please refer to this link for details: [Kubernetes with Docker Engine](https://kubernetes.io/docs/setup/production-environment/container-runtimes/#docker).

## Server Size
@ -47,7 +47,7 @@ For complex configurations:

As mentioned earlier, you can use either an on-premise server or one hosted in the cloud (AWS, Azure, DigitalOcean, etc.). To install the operating system on an on-premise server, please refer to the installation documentation of the operating system.

If you choose to use a server hosted in the cloud, you need to create the server using the user interface provided by the cloud platform. Following is an example of how to create a server in Digitalocean. 
If you choose to use a server hosted in the cloud, you need to create the server using the user interface provided by the cloud platform. Following is an example of how to create a server in DigitalOcean.

#### Example: Create an Ubuntu Linux server in the DigitalOcean cloud

@ -81,7 +81,7 @@ Select the region where you want the component to be hosted and a root password.

5. Finish the configuration and create the server

Specify a hostname for the server, specify the project to which you assign the server, and then click on `Create Droplet.` 
Specify a hostname for the server, specify the project to which you assign the server, and then click on `Create Droplet`.

<figure><img src="../.gitbook/assets/deployment/image (5).png" alt=""><figcaption><p>Finalize and create the server</p></figcaption></figure>

@ -113,9 +113,9 @@ sudo apt-get install docker-compose-plugin

### Install Kubernetes with Docker Engine

Kubernetes is an orchestration engine for containerized applications and the initial setup is dependent on the platform on which it is deployed - presenting how this product must be installed and configured is outside the scope of this document. 
Kubernetes is an orchestration engine for containerized applications and the initial setup is dependent on the platform on which it is deployed - presenting how this product must be installed and configured is outside the scope of this document.

For cloud deployment, most of the cloud providers have dedicated turnkey solutions for Kubernetes. A comprehensive list of such cloud providers is presented [here](https://kubernetes.io/docs/setup/production-environment/turnkey-solutions/). 
For cloud deployment, most of the cloud providers have dedicated turnkey solutions for Kubernetes. A comprehensive list of such cloud providers is presented [here](https://kubernetes.io/docs/setup/production-environment/turnkey-solutions/).

For an on-premise deployment of Kubernetes, please refer to this [link](https://kubernetes.io/docs/setup/).
@ -51,9 +51,6 @@ The plot below shows estimated APY over time. Green includes both passive and ac

APYs are an estimate because APY depends on OCEAN locked. OCEAN locked for future weeks is not known precisely; it must be estimated. The yellow line is the model for OCEAN locked. We modeled OCEAN locked by observing linear growth from week 5 (when OCEAN locking was introduced) to week 28 (now): OCEAN locked grew from 7.89M OCEAN to 34.98M OCEAN respectively, or 1.177M more OCEAN locked per week.

\

<figure><img src="../.gitbook/assets/rewards/example_apys.png" alt="" width="563"><figcaption><p><em>Green: estimated APYs (passive + active). Black: estimated APYs (just passive). Yellow: estimated staking</em> </p></figcaption></figure>

All the plots are calculated from [this Google Sheet](https://docs.google.com/spreadsheets/d/1F4o7PbV45yW1aPWOJ2rwZEKkgJXbIk5Yq7tj8749drc/edit#gid=1051477754).
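The linear lock-growth model described above can be sketched numerically — a quick check of the stated figures, not code from the docs:

```python
# Linear model of OCEAN locked, per the text: 7.89M at week 5, 34.98M at week 28.
w0, locked0 = 5, 7.89e6
w1, locked1 = 28, 34.98e6

slope = (locked1 - locked0) / (w1 - w0)  # OCEAN locked added per week

def ocean_locked(week: int) -> float:
    """Estimated OCEAN locked at a given week under the linear model."""
    return locked0 + slope * (week - w0)

print(round(slope / 1e6, 3))             # 1.178 — matching the ~1.177M/week figure
print(round(ocean_locked(28) / 1e6, 2))  # 34.98
```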
@ -39,9 +39,9 @@ veOCEAN holders get weekly Data Farming rewards with a small carveout for any Oc

veOCEAN holders can generate yield completely passively if they wish, though they are incentivized with larger real yield if they **actively participate** in farming yield from assets.

Active rewards follow the usual Data Farming formula: $ of sales of the asset \* allocation to that asset.\*\* 
Active rewards follow the usual Data Farming formula: $ of sales of the asset \* allocation to that asset.

\*\*There is no liquidity locked inside a datatoken pool, and this allocation is safe: you can’t lose your OCEAN as it is merely locked.
There is no liquidity locked inside a datatoken pool, and this allocation is safe: you can’t lose your OCEAN as it is merely locked.
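The formula above — an asset's sales times your allocation to it — can be sketched as follows. This is a toy illustration of the proportionality, not the production reward engine:

```python
def active_reward_share(asset_sales_usd: float, my_allocation: float,
                        total_allocation: float) -> float:
    """Toy sketch: your share of an asset's reward weight is its consume
    volume scaled by your fraction of the total veOCEAN allocated to it."""
    if total_allocation == 0:
        return 0.0
    return asset_sales_usd * (my_allocation / total_allocation)

# Allocating 100 of the 400 veOCEAN pointed at an asset with $1,000 of sales:
print(active_reward_share(1000.0, 100.0, 400.0))  # 250.0
```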
### veOCEAN Time Locking
@ -4,7 +4,7 @@ description: How to host your data and algorithm NFT assets like a champ 🏆

# Host Assets

The most important thing to remember is that wherever you host your asset... it needs to be **reachable & downloadable**. It cannot live behind a private firewall such as a private Github repo. You need to **use a proper hosting service!** 
The most important thing to remember is that wherever you host your asset... it needs to be **reachable & downloadable**. It cannot live behind a private firewall such as a private Github repo. You need to **use a proper hosting service!**

**The URL to your asset is encrypted in the publishing process!**

@ -14,7 +14,7 @@ The most important thing to remember is that wherever you host your asset... it

In this section, we'll walk you through three options to store your assets: Arweave (decentralized storage), AWS (centralized storage), and Azure (centralized storage). Let's goooooo!

Read on, anon, if you are interested in the security details!
Read on if you are interested in the security details!

### Security Considerations
@ -44,7 +44,7 @@ After you make your commit (and merge your pull request, if applicable), then cl

**Step 3 - Get the RAW version of your file**

To use your file on the Market **you need to use the raw url of the asset**. Also, make sure your Repo is publicly accessible to allow the market to use that file. 
To use your file on the Market **you need to use the raw URL of the asset**. Also, make sure your repo is publicly accessible to allow the market to use that file.

Open the file and click on the "Raw" button on the right side of the page.
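The blob-to-raw mapping GitHub applies behind that "Raw" button can be sketched programmatically — the helper below is an illustration, not part of the docs:

```python
def github_raw_url(blob_url: str) -> str:
    """Convert a github.com 'blob' URL to its raw.githubusercontent.com form."""
    prefix = "https://github.com/"
    if not blob_url.startswith(prefix):
        raise ValueError("not a github.com URL")
    owner, repo, marker, rest = blob_url[len(prefix):].split("/", 3)
    if marker != "blob":
        raise ValueError("expected a /blob/ URL")
    return f"https://raw.githubusercontent.com/{owner}/{repo}/{rest}"

print(github_raw_url("https://github.com/alice/data/blob/main/prices.csv"))
# https://raw.githubusercontent.com/alice/data/main/prices.csv
```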
@ -12,7 +12,7 @@ The bread and butter of the Data Farming dApp is incentivizing OCEAN rewards for

#### Step 1 - Navigate to the Data Farming dApp

* Go to https://df.oceandao.org
* Go to [https://df.oceandao.org](https://df.oceandao.org)

#### Step 2 - Connect your wallet
@ -44,7 +44,7 @@ Do you have multiple wallets? Say you want to send rewards to someone you 💖 W

When you delegate, you transfer 100% of your veOCEAN allocation power for a limited period. You can delegate your active rewards \*without\* the need for reallocation and transaction fees! Note that after you delegate, then you cannot manage your allocations until the delegation expires. The delegation expiration date is the same as your veOCEAN Lock End Date at the time of delegation. If necessary, you can extend your Lock End Date before delegating. You can also cancel your delegation at any time 💪 Once delegated, rewards will be sent to the wallet address you delegated to. Then, the delegation receiver is in charge of your active rewards and is responsible for returning those back to you should you choose to do so. 
When you delegate, you transfer 100% of your veOCEAN allocation power for a limited period. You can delegate your active rewards \*without\* the need for reallocation and transaction fees! Note that after you delegate, then you cannot manage your allocations until the delegation expires. The delegation expiration date is the same as your veOCEAN Lock End Date at the time of delegation. If necessary, you can extend your Lock End Date before delegating. You can also cancel your delegation at any time 💪 Once delegated, rewards will be sent to the wallet address you delegated to. Then, the delegation receiver is in charge of your active rewards and is responsible for returning those back to you should you choose to do so.

Follow these steps to delegate your veOCEAN:

@ -56,6 +56,6 @@ Follow these steps to delegate your veOCEAN:

#### What if someone delegates active rewards to you?

If you receive veOCEAN allocation power from other wallets, then you will receive their active rewards. You cannot delegate the veOCEAN you received from delegates, only the veOCEAN you received from your lock. 
If you receive veOCEAN allocation power from other wallets, then you will receive their active rewards. You cannot delegate the veOCEAN you received from delegates, only the veOCEAN you received from your lock.

<figure><img src="https://1520763098-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FzQlpIJEeu8x5yl0OLuXn%2Fuploads%2Fgit-blob-423f12f5b84205ab6cff8b79a1211fcd955d637c%2FveOCEAN-Delegation.png?alt=media" alt=""><figcaption></figcaption></figure>
@ -8,7 +8,7 @@ description: >-

<figure><img src="../.gitbook/assets/gif/morpheus-taunting.gif" alt=""><figcaption><p>Bring on the data challenges.</p></figcaption></figure>

Hone your skills, work on real business problems, and earn sweet dosh along the way. 
Hone your skills, work on real business problems, and earn sweet dosh along the way.

### What is an Ocean Protocol data challenge?
@ -28,7 +28,7 @@ Don't enjoy reading? Watch our video tutorial!

2. Connect your wallet.
3. Select the network where you would like to publish your NFT (e.g. Ethereum, Polygon).

<figure><img src="../.gitbook/assets/market/Connect-Wallet.png" alt=""><figcaption><p>Connect your wallet</p></figcaption></figure>
<figure><img src="../.gitbook/assets/market/connect-wallet.png" alt=""><figcaption><p>Connect your wallet</p></figcaption></figure>

In this tutorial, we will be using the Polygon Mumbai test network.
@ -10,7 +10,7 @@ Hosting a data challenge is a fun way to engage data scientists and machine lear

### How to sponsor an Ocean Protocol data challenge?

1. Establish the business problem you want to solve. The first step in building a data solution is understanding what you want to solve. For example, you may want to be able to predict the drought risk in an area to help price parametric insurance, or predict the price of ETH to optimize Uniswap LPing. 
1. Establish the business problem you want to solve. The first step in building a data solution is understanding what you want to solve. For example, you may want to be able to predict the drought risk in an area to help price parametric insurance, or predict the price of ETH to optimize Uniswap LPing.
2. Curate the dataset(s) that participants will use for the challenge. The key to hosting a good data challenge is to provide an exciting and thorough dataset that participants can use to build their solutions. Do your research to understand what data is available, whether it is free from an API, available for download, requires any transformations, etc. For the first challenge, it is alright if the created dataset is a static file. However, it is best to ensure there is a path to making the data available from a dynamic endpoint so that entries can eventually be applied to current, real-world use cases.
3. Decide how the judging process will occur. This includes how long to make the review period, how to score submissions, and how any prizes will be divided among participants.
4. Work with Ocean Protocol to gather participants for your data challenge. Creating blog posts and hosting Twitter Spaces is a good way to spread the word about your data challenge.
|
||||
**If you are new to web3** and blockchain technologies then we suggest you first get familiar with some Web3 basics:
|
||||
|
||||
* [Wallet Basics](../discover/wallets/README.md) 👛
|
||||
* [Set Up MetaMask](../discover/wallets/metamask-setup.md) [Wallet ](../discover/wallets/metamask-setup.md)🦊
|
||||
* [Set Up MetaMask](../discover/wallets/metamask-setup.md) 🦊
|
||||
* [Manage Your OCEAN Tokens](../discover/wallets-and-ocean-tokens.md) 🪙
|
||||
|