Mirror of https://github.com/oceanprotocol/docs.git (synced 2024-11-26 19:49:26 +01:00)

GITBOOK-631: Minor fixes

This commit is contained in: parent ca321b1e35, commit 68b56e97eb
@ -14,6 +14,10 @@ Private data holds immense value as it can significantly enhance research and bu
Private data has the potential to drive groundbreaking discoveries in science and technology, with increased data improving the predictive accuracy of modern AI models. Due to its scarcity and the challenges associated with accessing it, private data is often regarded as the most valuable. By utilizing private data through Compute-to-Data, significant rewards can be reaped, leading to transformative advancements and innovative breakthroughs.
{% hint style="info" %}
The Ocean Protocol provides a compute environment that you can access at the following address: [https://stagev4.c2d.oceanprotocol.com/](https://stagev4.c2d.oceanprotocol.com/). Feel free to explore and utilize this platform for your needs.
{% endhint %}
We suggest reading these guides to get an understanding of how Compute-to-Data works:
### User Guides
@ -12,7 +12,7 @@ An asset categorized as a `compute type` incorporates additional attributes unde
These attributes are specifically relevant to assets that fall within the compute category and are not required for assets classified under the `access type`. However, if an asset is designated as `compute`, it is essential to include these attributes to provide comprehensive information about the compute service associated with the asset.
<table><thead><tr><th width="224.33333333333331">Attribute</th><th width="154">Type</th><th>Description</th></tr></thead><tbody><tr><td><strong><code>allowRawAlgorithm</code></strong>*</td><td><code>boolean</code></td><td>If <code>true</code>, any passed raw text will be allowed to run. Useful for an algorithm drag & drop use case, but increases risk of data escape through malicious user input. Should be <code>false</code> by default in all implementations.</td></tr><tr><td><strong><code>allowNetworkAccess</code></strong>*</td><td><code>boolean</code></td><td>If <code>true</code>, the algorithm job will have network access.</td></tr><tr><td><strong><code>publisherTrustedAlgorithmPublishers</code></strong>*</td><td>Array of <code>string</code></td><td>If not defined, then any published algorithm is allowed. If an empty array, then no algorithm is allowed. If not empty, any algorithm published by the defined publishers is allowed.</td></tr><tr><td><strong><code>publisherTrustedAlgorithms</code></strong>*</td><td>Array of <code>publisherTrustedAlgorithms</code></td><td>If not defined, then any published algorithm is allowed. If an empty array, then no algorithm is allowed. Otherwise, only the algorithms defined in the array are allowed (see below).</td></tr></tbody></table>
\* Required
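Putting the attributes above together, the compute options for an asset might look like the following sketch (shown as a Python dict for illustration; the DID and checksum values are hypothetical placeholders, and the exact nesting inside a DDO service object should be checked against the full DDO documentation):

```python
# Sketch of the compute options described in the table above.
# Field names come from the table; all values are illustrative.
compute_options = {
    "allowRawAlgorithm": False,   # safest default, per the table
    "allowNetworkAccess": True,
    # Empty array = no algorithm publisher is trusted.
    "publisherTrustedAlgorithmPublishers": [],
    # Only this specific algorithm asset is allowed to run.
    "publisherTrustedAlgorithms": [
        {
            "did": "did:op:abc123",                # hypothetical DID
            "filesChecksum": "0xfilehash",          # placeholder
            "containerSectionChecksum": "0ximghash" # placeholder
        }
    ],
}
```

Note how the two trust lists interact: leaving both undefined allows any algorithm, while the empty `publisherTrustedAlgorithmPublishers` array above means only the explicitly listed algorithm may run.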
@ -22,9 +22,9 @@ The `publisherTrustedAlgorithms` is an array of objects that specifies algorithm
The structure of each object within the `publisherTrustedAlgorithms` array is as follows:
<table><thead><tr><th width="289.3333333333333">Attribute</th><th width="114">Type</th><th>Description</th></tr></thead><tbody><tr><td><strong><code>did</code></strong></td><td><code>string</code></td><td>The DID of the algorithm which is trusted by the publisher.</td></tr><tr><td><strong><code>filesChecksum</code></strong></td><td><code>string</code></td><td>Hash of algorithm's files (as <code>string</code>).</td></tr><tr><td><strong><code>containerSectionChecksum</code></strong></td><td><code>string</code></td><td>Hash of algorithm's image details (as <code>string</code>).</td></tr></tbody></table>
To produce `filesChecksum`, call the Provider FileInfoEndpoint with the parameter `withChecksum = True`. If the algorithm has multiple files, `filesChecksum` is a concatenated string of all the files' checksums (i.e. checksumFile1 + checksumFile2, etc.).
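As a rough illustration of the concatenation rule, assuming SHA-256 checksums for the sake of the example (in practice, use the checksums returned by the Provider rather than hashing locally):

```python
import hashlib

def concat_files_checksum(file_contents):
    """Concatenate per-file checksums into one string,
    mirroring checksumFile1 + checksumFile2 + ...

    Illustrative only: real checksums come from the Provider's
    FileInfoEndpoint with withChecksum = True.
    """
    return "".join(hashlib.sha256(data).hexdigest() for data in file_contents)
```

For two files, the result is simply the second file's checksum appended to the first file's checksum, with no separator.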
To produce `containerSectionChecksum`:
@ -95,19 +95,11 @@ Sometimes, the asset needs additional input data before downloading or running a
The `consumerParameters` is an array of objects. Each object defines a field and has the following structure:
<table><thead><tr><th width="176.33333333333331">Attribute</th><th width="201">Type</th><th>Description</th></tr></thead><tbody><tr><td><strong><code>name</code></strong>*</td><td><code>string</code></td><td>The parameter name (this is sent as HTTP param or key towards algo)</td></tr><tr><td><strong><code>type</code></strong>*</td><td><code>string</code></td><td>The field type (text, number, boolean, select)</td></tr><tr><td><strong><code>label</code></strong>*</td><td><code>string</code></td><td>The field label which is displayed</td></tr><tr><td><strong><code>required</code></strong>*</td><td><code>boolean</code></td><td>If customer input for this field is mandatory.</td></tr><tr><td><strong><code>description</code></strong>*</td><td><code>string</code></td><td>The field description.</td></tr><tr><td><strong><code>default</code></strong>*</td><td><code>string</code>, <code>number</code>, or <code>boolean</code></td><td>The field default value. For select types, <code>string</code> key of default option.</td></tr><tr><td><strong><code>options</code></strong></td><td>Array of <code>option</code></td><td>For select types, a list of options.</td></tr></tbody></table>
\* **Required**
Each `option` is an `object` containing a single key: value pair where the key is the option name, and the value is the option value.
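For instance, a hypothetical `select` field and its `options` array could be defined like this (a Python dict for illustration; the parameter name and options are invented):

```python
# Hypothetical "languagePreference" consumer parameter of type select.
# Each entry of "options" is an object with a single key:value pair,
# as described above; "default" is the string key of the default option.
language_param = {
    "name": "languagePreference",
    "type": "select",
    "label": "Language",
    "required": True,
    "description": "Preferred programming language",
    "default": "nodejs",
    "options": [
        {"nodejs": "Node.js"},
        {"python": "Python"},
    ],
}
```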
<details>
@ -160,12 +152,18 @@ Each `option` is an `object` containing a single key:value pair where the key is
</details>
Algorithms will have access to a JSON file located at `/data/inputs/algoCustomData.json`, which contains the required input data as key/value pairs. Example:
<details>
<summary>Key Value Example</summary>
<pre class="language-json"><code class="lang-json">{
  "hometown": "São Paulo",
  "age": 10,
  "developer": true,
  "languagePreference": "nodejs"
}
</code></pre>
</details>
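Inside the algorithm, reading that file is straightforward; a minimal sketch (the path is the one stated above, everything else is illustrative):

```python
import json

def load_custom_data(path="/data/inputs/algoCustomData.json"):
    """Load the consumer-supplied key/value input data for a compute job."""
    with open(path) as f:
        return json.load(f)

# Inside a compute job, the algorithm could then read, for example:
# custom = load_custom_data()
# language = custom.get("languagePreference", "python")
```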
@ -7,19 +7,21 @@ description: Architecture overview
### Architecture Overview
Here's the sequence diagram for starting a new compute job.
<figure><img src="../../.gitbook/assets/c2d/c2d_compute_job.png" alt=""><figcaption><p>Compute architecture overview</p></figcaption></figure>
The interaction between the Consumer and the Provider follows a specific workflow. To initiate the process, the Consumer contacts the Provider by invoking the `start(did, algorithm, additionalDIDs)` function with parameters such as the data identifier (DID), algorithm, and additional DIDs if required. Upon receiving this request, the Provider generates a unique job identifier (`XXXX`) and returns it to the Consumer. The Provider then assumes the responsibility of overseeing the remaining steps.
Throughout the computation process, the Consumer has the ability to check the status of the job by making a query to the Provider using the `getJobDetails(XXXX)` function, providing the job identifier (`XXXX`) as a reference.
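The Consumer/Provider exchange described above can be sketched with a toy mock (illustrative only; in practice these calls go through the Provider's HTTP API via ocean.py or ocean.js):

```python
import uuid

class MockProvider:
    """Toy stand-in for a Provider, mirroring the start/getJobDetails flow."""

    def __init__(self):
        self.jobs = {}

    def start(self, did, algorithm, additional_dids=()):
        # Generate a unique job identifier (plays the role of "XXXX").
        job_id = uuid.uuid4().hex[:8]
        self.jobs[job_id] = {"did": did, "algorithm": algorithm, "status": "running"}
        return job_id

    def get_job_details(self, job_id):
        # The Consumer polls the job status using the identifier.
        return self.jobs[job_id]

provider = MockProvider()
job_id = provider.start("did:op:data", algorithm="did:op:algo")
```

After `start` returns, the Consumer only needs the job identifier to query progress, which is exactly the hand-off the sequence diagram shows.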
{% hint style="info" %}
You have the option to initiate a compute job using one or more data assets. You can explore this functionality by utilizing the [ocean.py](../ocean.py/) and [ocean.js](../ocean.js/) libraries.
{% endhint %}
Now, let's delve into the inner workings of the Provider. Initially, it verifies whether the Consumer has sent the appropriate datatokens to gain access to the desired data. Once validated, the Provider interacts with the Operator-Service, a microservice responsible for coordinating the job execution. The Provider submits a request to the Operator-Service, which subsequently forwards the request to the Operator-Engine, the actual compute system in operation.
The Operator-Engine, equipped with functionalities like running Kubernetes compute jobs, carries out the necessary computations as per the requirements. Throughout the computation process, the Operator-Engine informs the Operator-Service of the job's progress. Finally, when the job reaches completion, the Operator-Engine signals the Operator-Service, ensuring that the Provider receives notification of the job's successful conclusion.
Here are the actors/components:
* Consumers - The end users who need to use compute services offered by the same Publisher as the data Publisher.
* Operator-Service - Micro-service that handles the compute requests.
@ -4,7 +4,7 @@ description: Learn the Web3 concepts backing up Ocean Protocol tech
# Basic Concepts
You'll need to know a thing or two about **Web3** to understand Ocean Protocol's tech... Let's get started with the basics 🧑🏫
<figure><img src="../.gitbook/assets/gif/drew-barrymore-notes.gif" alt=""><figcaption><p>Prepare yourself, my friend</p></figcaption></figure>
@ -13,13 +13,13 @@ You'll need to know a thing or two about **Web3** to fully understand Ocean Prot
Blockchain is a revolutionary technology that enables the decentralized nature of Ocean Protocol. At its core, blockchain is a **distributed ledger** that securely **records and verifies transactions across a network of computers**. It operates on the following key concepts that ensure trust and immutability:
* **Decentralization**: Blockchain eliminates the need for intermediaries by enabling a peer-to-peer network where transactions are validated collectively. This decentralized structure reduces reliance on centralized authorities, enhances transparency, and promotes a more inclusive data economy.
* **Immutability**: Once a transaction is recorded on the blockchain, it becomes virtually impossible to alter or tamper with. The data is stored in blocks, which are cryptographically linked together, forming an unchangeable chain of information. Immutability ensures the integrity and reliability of data, providing a foundation of trust in the Ocean Protocol ecosystem. Furthermore, it enables reliable traceability of historical transactions.
* **Consensus Mechanisms**: Blockchain networks employ consensus mechanisms to validate and agree upon the state of the ledger. These mechanisms ensure that all participants validate transactions without relying on a central authority, crucially maintaining a reliable view of the blockchain's history. The consensus mechanisms make it difficult for malicious actors to manipulate the blockchain's history or conduct fraudulent transactions. Popular consensus mechanisms include Proof of Work (PoW) and Proof of Stake (PoS).
Ocean Protocol harnesses the power of blockchain to facilitate secure and auditable data exchange. This ensures that data transactions are transparent, verifiable, and tamper-proof. Here's how blockchain is utilized in the Ocean Protocol ecosystem:
* **Data Asset Representation**: Data assets in Ocean Protocol are represented as non-fungible tokens (NFTs) on the blockchain. NFTs provide a unique identifier for each data asset, allowing for seamless tracking, ownership verification, and access control. Through NFTs and datatokens, data assets become easily tradable and interoperable within the Ocean ecosystem.
* **Smart Contracts**: Ocean Protocol utilizes smart contracts to automate and enforce the terms of data exchange. Smart contracts act as self-executing agreements that facilitate the transfer of data assets between parties based on predefined conditions - they are the exact mechanisms of decentralization. This enables cyber-secure data transactions and eliminates the need for intermediaries.
* **Tamper-Proof Audit Trail**: Every data transaction on Ocean Protocol is recorded on the blockchain, creating an immutable and tamper-proof audit trail. This ensures the transparency and traceability of data usage, providing data scientists with a verifiable record of the data transaction history. Data scientists can query addresses of data transfers on-chain to understand data usage.
By integrating blockchain technology, Ocean Protocol establishes a trusted infrastructure for data exchange. It empowers individuals and organizations to securely share, monetize, and leverage data assets while maintaining control and privacy.
@ -30,4 +30,4 @@ Ocean tokens (**OCEAN**) are the native cryptocurrency of the Ocean Protocol eco
1. **Data Ownership**: Ocean tokens empower data owners by providing them with control over their data assets. Through the use of smart contracts, data owners can define access permissions, usage rights, and pricing terms for their data. By holding and staking Ocean tokens, data owners can exercise even greater control over their data assets.
2. **Data Monetization and Consumption**: Ocean tokens facilitate seamless and secure transactions between data providers and consumers, fostering a thriving new data economy. Data owners can set a price in Ocean tokens for consumers to access and utilize their data. This creates opportunities for unlocking value from siloed or otherwise unused data.
3. **Stake for veOcean and Curate Datasets**: Through the Data Farming initiative, you are incentivized to lock Ocean tokens for [veOCEAN](../rewards/veocean.md). By staking your OCEAN, you not only support the growth and sustainability of the ecosystem but also earn a share of data asset sales 💰. The Data Farming initiative offers participants a unique opportunity to earn [rewards](../rewards/) while making a meaningful impact in the data marketplace.
@ -35,7 +35,7 @@ Once a user has Metamask installed and an Ethereum address, they can register, c
<summary>How do I price my data?</summary>
Ocean gives you two different options for pricing your data - [fixed price](../developers/contracts/pricing-schemas.md#fixed-pricing) or [free](../developers/contracts/pricing-schemas.md#free-pricing). You need to decide what your dataset is worth and how you want to price it. You can change the price but you can’t change the price format (e.g. from fixed to free).
</details>
@ -43,7 +43,7 @@ Ocean gives you two different options for pricing your data - fixed price or fre
<summary>Is my data secure?</summary>
Yes. Ocean Protocol understands that some data is too sensitive to be shared — potentially due to GDPR or other reasons. For these types of datasets, we offer a unique service called [compute-to-data](../developers/compute-to-data/). This enables you to monetize the dataset that sits behind a firewall without ever revealing the raw data to the consumer. For example, researchers and data scientists pay to run their algorithms on the data set, and the computation is performed behind a firewall; all the researchers or data scientists receive is the results generated by their algorithm.
</details>
@ -51,7 +51,7 @@ Yes. Ocean Protocol understands that some data is too sensitive to be shared —
<summary>Where is my data stored?</summary>
Ocean does not provide data storage. Users have the choice to [store](../user-guides/asset-hosting/) their data on their own servers, cloud, or decentralized storage. Users need only to provide a URL, an IPFS hash, an Arweave CID, or the on-chain information to the dataset. This is then encrypted as a means to protect access to the dataset.
</details>
@ -59,7 +59,7 @@ Ocean does not provide data storage. Users have the choice to store their data o
<summary>How do I control who accesses my data?</summary>
Ocean provides tools for access control, [fine-grained permissions](../developers/fg-permissions.md), passlisting, and blocklisting addresses. Data and AI services can be shared under the conditions set by the owner of the data. There is no central intermediary, which ensures no one can interfere with the transaction and both the publisher and user have transparency.
</details>
@ -67,7 +67,8 @@ Ocean provides tools for access control, fine grained permissions, passlisting a
<summary>Can I restrict who is able to access my dataset?</summary>
Yes - Ocean has implemented [fine-grained permissions](../developers/fg-permissions.md). This means that you can create allow and deny lists that restrict access from certain individuals or limit access to particular organizations. \
PS: [Fine-grained permissions](../developers/fg-permissions.md) are not integrated into the Ocean Marketplace.
</details>