mirror of https://github.com/oceanprotocol/docs.git synced 2024-11-26 19:49:26 +01:00

V2 docs developers (#1210)

* Creating new page structure in developers section

* Updating navigation

* Splitting up DDO page and putting examples within details sections

* Updated navigation

* Updating table

* GITBOOK-1: change request with no subject merged in GitBook

* GITBOOK-2: change request with no subject merged in GitBook

* Updating tables

* Fixing services table

* Updating tables

* Updating algorithm page

* Updating compute to data page

* Updating API section

* Adding the fine-grained permissions page

* Adding Market-Level Permissions page

* updating navigation

* Updating fine grained permissions

* adding information on DIDs

* Updating navigation

* Updating did and ddo page

* GITBOOK-5: Adding video
This commit is contained in:
Jamie Hewitt 2023-05-18 13:38:57 +03:00 committed by GitHub
parent ed0d856a8e
commit f24d95a5a7
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
24 changed files with 1293 additions and 894 deletions

View File

@ -53,20 +53,24 @@
* [Rewards Tutorial](rewards/veOcean-Data-Farming-Tutorial.md)
* [📊 Data Science](data-science.md)
* [👨💻 Developers](developers/README.md)
* [Core concepts](developers/core-concepts/README.md)
* [Architecture Overview](developers/core-concepts/architecture.md)
* [Data NFTs and Datatokens](developers/core-concepts/datanft-and-datatoken.md)
* [Roles](developers/core-concepts/roles.md)
* [Fees](developers/core-concepts/fees.md)
* [Asset Pricing](developers/core-concepts/asset-pricing.md)
* [DID & DDO](developers/core-concepts/did-ddo.md)
* [C2D - Architecture](developers/core-concepts/compute-to-data-architecture.md)
* [C2D - Datasets & Algorithms](developers/core-concepts/compute-to-data-datasets-algorithms.md)
* [C2D - Writing Algorithms](developers/core-concepts/compute-to-data-algorithms.md)
* [C2D - User defined parameters](developers/core-concepts/user-defined-parameters.md)
* [Aquarius REST API](developers/aquarius-rest-api.md)
* [Provider REST API](developers/provider-rest-api.md)
* [Subgraph](developers/subgraph.md)
* [Architecture Overview](developers/architecture.md)
* [Data NFTs and Datatokens](developers/datanft-and-datatoken.md)
* [Roles](developers/roles.md)
* [Fees](developers/fees.md)
* [Asset Pricing](developers/asset-pricing.md)
* [Identifiers & Metadata](developers/Identifiers-Metadata.md)
* [DDO Specification](developers/ddo-specification.md)
* [Storage Specifications](developers/storage-specifications.md)
* [Fine-Grained Permissions](developers/Fine-Grained-Permissions.md)
* [Compute to data](developers/compute-to-data/README.md)
* [Architecture](developers/compute-to-data/compute-to-data-architecture.md)
* [Datasets & Algorithms](developers/compute-to-data/compute-to-data-datasets-algorithms.md)
* [Writing Algorithms](developers/compute-to-data/compute-to-data-algorithms.md)
* [Compute Options](developers/compute-to-data/compute-options.md)
* [APIs](developers/APIs/README.md)
* [Aquarius](developers/apis/aquarius.md)
* [Provider](developers/apis/provider.md)
* [Subgraph](developers/apis/subgraph.md)
* [🔨 Infrastructure](infrastructure/README.md)
* [Setup a Server](infrastructure/setup-server.md)
* [Deploying Marketplace](infrastructure/deploying-marketplace.md)

View File

@ -0,0 +1,2 @@
# APIs

View File

@ -0,0 +1,146 @@
---
title: Fine-Grained Permissions
slug: /developers/fine-grained-permissions/
section: developers
description: >-
Fine-grained permissions using role-based access control. Control who can publish, buy, or browse data.
---
A large part of Ocean is about access control, which is primarily handled by datatokens. Users can access a resource (e.g. a file) by redeeming datatokens for that resource. We recognize that enterprises and other users often need more precise ways to specify and manage access, and we have introduced fine-grained permissions for these use cases.
Fine-grained permissions mean that access can be controlled precisely at two levels:
- [Marketplace-level permissions](./market-level-permissions) for browsing, downloading or publishing within a marketplace frontend.
- [Asset-level permissions](./asset-level-permissions) on downloading a specific asset.
The fine-grained permissions features are designed to work in forks of Ocean Market. We have not enabled them in Ocean Market itself, to keep Ocean Market open for everyone to use. On the front end, the permissions features are easily enabled by setting environment variables.
## Introduction
Some datasets need to be restricted to appropriately credentialed users. In this situation, there is a tension:
1. Datatokens on their own aren't enough - datatokens can be exchanged without any restrictions, which means anyone can acquire them and access the data.
2. We want to retain the datatoken approach, since it enables Ocean users to leverage existing crypto infrastructure, e.g. wallets, exchanges, etc.
We can resolve this tension by drawing on the following analogy:
> Imagine going to an 18+ rock concert. You can only get in if you show both (a) your concert ticket and (b) an ID showing that you're old enough.
We can port this model into Ocean, where (a) is a datatoken and (b) is a credential. The datatoken is the baseline access control. It's fungible, and something that you've paid for or had shared to you. It's independent of your identity. The credential is something that's a function of your identity.
The credential-based restrictions are implemented in two ways: at the market level and at the asset level. Access to the market is restricted on a role basis; a user's identity is attached to a role via the role-based access control (RBAC) server. Access to individual assets is restricted via allow and deny lists in the DDO, which list the Ethereum addresses of the users who can and cannot access the asset.
## Asset-Level Restrictions
For asset-level restrictions Ocean supports allow and deny lists. Allow and deny lists are advanced features that allow publishers to control access to individual data assets. Publishers can restrict assets so that they can only be accessed by approved users (allow lists) or they can restrict assets so that they can be accessed by anyone except certain users (deny lists).
When an allow-list is in place, a consumer can only access the resource if they have a datatoken and one of the credentials in the "allow" list of the DDO. Ocean also has complementary deny functionality: if a consumer is on the "deny" list, they will not be allowed to access the resource.
Initially, the only credential supported is Ethereum public addresses. To be fair, it's more a pointer to an individual than a credential, but it has a low-complexity implementation, so it makes a good starting point. For extensibility, the Ocean metadata schema enables the specification of other types of credentials, like W3C Verifiable Credentials and more. When this gets implemented, asset-level permissions will be properly RBAC too.
Since asset-level permissions are in the DDO, and the DDO is controlled by the publisher, asset-level restrictions are controlled by the publisher.
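For illustration, a sketch of how such lists can appear under a DDO's `credentials` object (the addresses are placeholders; see the DDO Specification page for the authoritative schema):

```json
{
  "credentials": {
    "allow": [
      {
        "type": "address",
        "values": ["0x123", "0x456"]
      }
    ],
    "deny": [
      {
        "type": "address",
        "values": ["0x2222", "0x3333"]
      }
    ]
  }
}
```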
## Market-Level Permissions
For market-level permissions, Ocean implements a role-based access control server (RBAC server). It implements restrictions at the user level, based on the user's role (credentials). The RBAC server is run and controlled by the marketplace owner, so permissions at this level are at the discretion of the marketplace owner.
The RBAC server is the primary mechanism for restricting your users' ability to publish, buy, or browse assets in the market.
### Roles
The RBAC server defines four different roles:
- Admin
- Publisher
- Consumer
- User
#### Admin/Publisher
Currently, users with either the admin or publisher role can use the Market without any restrictions. They can publish, buy, and browse datasets.
#### Consumer
A user with the consumer role is able to browse datasets, purchase them, trade datatokens, and contribute to data pools. However, they are not able to publish datasets.
#### Users
Users are able to browse and search datasets but they are not able to purchase datasets, trade datatokens, or contribute to data pools. They are also not able to publish datasets.
#### Address without a role
If a user attempts to view the data market without a role, or without a wallet connected, they will not be able to view or search any of the datasets.
#### No wallet connected
When the RBAC server is enabled on the market, users are required to have a wallet connected to browse the datasets.
### Mapping roles to addresses
Currently, there are two ways the RBAC server can be configured to map user roles to Ethereum addresses. The RBAC server is also built in such a way that it is easy for you to add your own authorization service. The two existing methods are:
1. Keycloak
If you already have a [Keycloak](https://www.keycloak.org/) identity and access management server running, you can configure the RBAC server to use it by adding the URL of your Keycloak server to the `KEYCLOAK_URL` environment variable in the RBAC `.env` file.
2. JSON
Alternatively, if you are not already using Keycloak, the easiest way to map user roles to Ethereum addresses is in a JSON object that is saved as the `JSON_DATA` environment variable in the RBAC `.env` file. There is an example of the format required for this JSON object in `.example.env`.
It is possible to configure both of these methods of mapping user roles to Ethereum addresses. In this case, requests to your RBAC server should specify which auth service they are using, e.g. `"authService": "json"` or `"authService": "keycloak"`.
#### Default Auth service
Additionally, you can set an environment variable within the RBAC server that specifies the default authorization method, e.g. `DEFAULT_AUTH_SERVICE = "json"`. When this variable is specified, requests sent to your RBAC server don't need to include an `authService`; they will automatically use the default authorization method.
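As a hypothetical illustration only (the exact request schema is defined in the RBAC server repository), a permission check might be sent as a JSON body naming the action, the auth service, and the user's credentials:

```json
{
  "eventType": "publish",
  "component": "market",
  "authService": "json",
  "credentials": {
    "token": "0x123abc..."
  }
}
```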
### Running the RBAC server locally
You can start running the RBAC server by following these steps:
1. Clone this repository:
```Bash
git clone https://github.com/oceanprotocol/RBAC-Server.git
cd RBAC-Server
```
2. Install the dependencies:
```Bash
npm install
```
3. Build the service
```Bash
npm run build
```
4. Start the server
```Bash
npm run start
```
### Running the RBAC server in Docker
When you are ready to deploy the RBAC server, follow these steps:
1. Replace the `KEYCLOAK_URL` in the Dockerfile with the correct URL for your hosted [Keycloak](https://www.keycloak.org/) instance.
2. Run the following command to build the RBAC service in a Docker container:
```Bash
npm run build:docker
```
3. Next, run the following command to start running the RBAC service in the Docker container:
```Bash
npm run start:docker
```
4. Now you are ready to send requests to the RBAC server via Postman. Make sure to use `http://localhost:49160` as the base URL in your requests.

View File

@ -0,0 +1,84 @@
---
title: Identifiers & Metadata
slug: /concepts/did-ddo/
section: concepts
description: >-
Specification of decentralized identifiers for assets in Ocean Protocol using
the DID & DDO standards.
---
# Identifiers & Metadata
### Identifiers
In Ocean, we use decentralized identifiers (DIDs) to identify your asset within the network. Decentralized identifiers (DIDs) are a type of identifier that enables verifiable, decentralized digital identity. In contrast to typical, centralized identifiers, DIDs have been designed so that they may be decoupled from centralized registries, identity providers, and certificate authorities. Specifically, while other parties might be used to help enable the discovery of information related to a DID, the design enables the controller of a DID to prove control over it without requiring permission from any other party. DIDs are URIs that associate a DID subject with a DID document allowing trustable interactions associated with that subject.
A DID in Ocean looks like this:
```
did:op:0ebed8226ada17fde24b6bf2b95d27f8f05fcce09139ff5cec31f6d81a7cd2ea
```
The part after `did:op:` is the SHA-256 hash of the ERC721 contract address (in checksum format) concatenated with the chainId (expressed as a decimal) the asset has been published to:
```js
// pseudocode: hash the concatenated contract address and chainId
const checksum = sha256(erc721ContractAddress + chainId)
console.log(checksum)
// 0ebed8226ada17fde24b6bf2b95d27f8f05fcce09139ff5cec31f6d81a7cd2ea
```
DIDs in Ocean follow [the generic DID scheme](https://w3c-ccg.github.io/did-spec/#the-generic-did-scheme).
{% embed url="https://www.youtube.com/watch?t=95s&v=I06AUNt7ee8" %}
What is a DID and DDO?
{% endembed %}
### Metadata
#### Overview
This document describes how Ocean assets follow the DID/DDO specification, such that Ocean assets can inherit DID/DDO benefits and enhance interoperability. DIDs and DDOs follow the [specification defined by the World Wide Web Consortium (W3C)](https://w3c-ccg.github.io/did-spec/).
Decentralized identifiers (DIDs) are a type of identifier that enable verifiable, decentralized digital identity. Each DID is associated with a unique entity, and DIDs may represent humans, objects, and more.
A DID Document (DDO) is a JSON blob that holds information about the DID. Given a DID, a _resolver_ will return the DDO of that DID.
#### Rules for DID & DDO
An _asset_ in Ocean represents a downloadable file, compute service, or similar. Each asset is a _resource_ under the control of a _publisher_. The Ocean network itself does _not_ store the actual resource (e.g. files).
An _asset_ has a DID and DDO. The DDO should include [metadata](did-ddo.md#metadata) about the asset, and define access in at least one [service](did-ddo.md#services). Only _owners_ or _delegated users_ can modify the DDO.
All DDOs are stored on-chain in encrypted form to be fully GDPR-compatible. A metadata cache like _Aquarius_ can help in reading, decrypting, and searching through encrypted DDO data from the chain. Because the file URLs are encrypted on top of the full DDO encryption, returning unencrypted DDOs, e.g. via an API, is safe: the file URLs still stay encrypted.
#### Publishing & Retrieving DDOs
The DDO is stored on-chain as part of the NFT contract, in encrypted form using the private key of the _Provider_. To resolve it, a metadata cache like _Aquarius_ must query the Provider to decrypt the DDO.
Here is the flow:
![DDO flow](../.gitbook/assets/architecture/ddo-flow.png)
<details>
<summary>UML source</summary>
```
title DDO flow
User(Ocean library) -> User(Ocean library): Prepare DDO
User(Ocean library) -> Provider: encrypt DDO
Provider -> User(Ocean library): encryptedDDO
User(Ocean library) -> ERC721 contract: publish encryptedDDO
Aquarius <-> ERC721 contract: monitors ERC721 contract and gets MetadataCreated event (contains encryptedDDO)
Aquarius -> ERC721 contract: calls getMetaData()
Aquarius -> Provider: decrypt encryptedDDO, signed request using Aquarius's private key
Provider -> ERC721 contract: checks state using getMetaData()
Provider -> Provider: depending on metadataState (expired, retired) and Aquarius address, validates the request
Provider -> Aquarius: DDO
Aquarius -> Aquarius : validate DDO
Aquarius -> Aquarius : cache DDO
Aquarius -> Aquarius : enhance cached DDO in response with additional info like events & stats
```
</details>

View File

@ -1,2 +1,126 @@
# API References
# Core concepts
## What is Ocean?
Ocean provides the next generation of tools to unlock data at a large scale. Ocean makes it easy to publish and consume data services.
Ocean uses Data NFTs (ERC721) and datatokens (ERC20) as the interface to connect data assets with blockchain and DeFi tools. Via DeFi composability, crypto wallets become data wallets, crypto exchanges become data marketplaces, DAOs become data co-ops, and more.
![Creating a New Data Economy](../../.gitbook/assets/architecture/feature-datascience@2x.webp)
The following guides are a great place to start if you are new to Ocean:
* [Architecture Overview](architecture.md)
* [Data NFTs and Datatokens](datanft-and-datatoken.md)
* [Publish a data asset](../../how-tos/marketplace-publish-data-asset.md)
* [Download a data asset](../../how-tos/marketplace-download-data-asset.md)
## What is our Mission?
**To unlock data, for more equitable outcomes for users of data, using a thoughtful application of both technology and governance.**
Society is becoming increasingly reliant on data, especially with the advent of AI. However, a small handful of organizations with both massive data assets and AI capabilities have attained worrying levels of control, which is a danger to a free and open society.
Our team and community are committed to kick-starting a New Data Economy that reaches every single person, company, and device, giving power back to data owners and enabling people to capture value from data to better our world.
Find out more about the people building Ocean on our [site](https://oceanprotocol.com/about).
## What can you do with Ocean?
### Buy or Sell Data
Use Ocean Market to publish and sell data, or browse and buy data. Data is published as interoperable ERC721 data NFTs & ERC20 datatokens. It's a decentralized exchange (DEX), tuned for data. The acts of publishing data, purchasing data, and consuming data are all recorded on the blockchain to make a tamper-proof audit trail.
As a data scientist or AI practitioner, you can benefit from access to more data (including private data), crypto-secured provenance in data & AI training, and income opportunities for selling data and curating data.
![Decentralized Exchange Marketplaces](../../.gitbook/assets/architecture/feature-marketplaces@2x.webp)
The following guides will help you get started with buying and selling data:
* [Publish a data asset](../../how-tos/marketplace-publish-data-asset.md)
* [Download a data asset](../../how-tos/marketplace-download-data-asset.md)
* [Publishing with hosting services](../../how-tos/asset-hosting/)
### Build Your Own Data Market
Use Ocean Protocol software tools to build your own data marketplace, by either forking the [Ocean Market](https://v4.market.oceanprotocol.com/) code or building it up from Ocean components.
![Ocean Market Homepage](../../.gitbook/assets/market/ocean-market-homepage.png)
If you're interested in starting your own marketplace, check out the following guides:
* [Forking Ocean Market](../../tutorials/build-a-marketplace/forking-ocean-market.md)
* [Customising your market](../../tutorials/build-a-marketplace/customising-your-market.md)
* [Deploying your market](../../tutorials/build-a-marketplace/deploying-market.md)
### Manage datatokens and data NFTs for use in DeFi
Ocean makes it easy to publish data services (deploy ERC721 data NFTs and ERC20 datatokens), and to consume data services (spend datatokens). Crypto wallets, exchanges, and DAOs become data wallets, exchanges, and DAOs.
Use Ocean [JavaScript](https://github.com/oceanprotocol/ocean.js) or [Python](https://github.com/oceanprotocol/ocean.py) drivers to manage data NFTs and datatokens:
Ocean-based apps make data asset on-ramps and off-ramps easy for end users. Ocean smart contracts and libraries make this easy for developers. The data itself does not need to be on-chain, just the access control.
![New Data on-ramp and off-ramp](../../.gitbook/assets/architecture/new-ramp-on-crypto-ramp-off.webp)
Data NFTs are ERC721 tokens representing a unique asset, and datatokens are ERC20 tokens for accessing data services. Each data service gets its own data NFT and one or more types of datatokens.
To access the dataset, you send 1.0 datatokens to the data provider (running Ocean Provider). To give access to someone else, send them 1.0 datatokens. That's it.
Since datatokens are ERC20, and live on Ethereum mainnet, there's a whole ecosystem to leverage.
* _Publish and access data services:_ downloadable files or compute-to-data. Use Ocean to deploy a new [ERC721](https://github.com/ethereum/EIPs/blob/master/EIPS/eip-721.md) and [ERC20](https://github.com/ethereum/EIPs/blob/7f4f0377730f5fc266824084188cc17cf246932e/EIPS/eip-20.md) datatoken contract for each data service, then mint datatokens.
* _Transfer datatokens_ to another owner (or approve & transferFrom).
* _And more._ Use ERC20 support in [web3.js](https://web3js.readthedocs.io/), [web3.py](https://web3py.readthedocs.io/en/stable/examples.html#working-with-an-erc20-token-contract) and Solidity to connect datatokens with crypto wallets and other DeFi services.
### Compute-to-Data
Ocean's "Compute-to-Data" feature enables private data to be bought & sold. You can sell compute access to privately-held data, which never leaves the data owner's premises. Ocean-based marketplaces enable the monetization of private data while preserving privacy.
Compute-to-data resolves the tradeoff between the benefits of using private data, and the risks of exposing it. It lets the data stay on-premise, yet allows 3rd parties to run specific compute jobs on it to get useful compute results like averaging or building an AI model.
The most valuable data is private data: using it can improve research and business outcomes. But concerns over privacy and control make it hard to access. With Compute-to-Data, private data isn't directly shared; rather, specific access to it is granted.
![Compute-to-data](../../.gitbook/assets/architecture/feature-compute@2x.webp)
It can be used for data sharing in science or technology contexts, or in marketplaces for selling private data while preserving privacy, giving companies an opportunity to monetize their data assets.
Private data can help research, leading to life-altering innovations in science and technology. For example, more data improves the predictive accuracy of modern Artificial Intelligence (AI) models. Private data is often considered the most valuable data because it's so hard to get at, and using it can lead to potentially big payoffs.
Check out these guides if you want a deeper understanding of how compute-to-data works:
* [Architecture](compute-to-data-architecture.md)
* [Datasets & Algorithms](compute-to-data-datasets-algorithms.md)
* [Minikube Environment](../../infrastructure/compute-to-data-minikube.md)
* [Writing Algorithms](compute-to-data-algorithms.md)
* [Private docker registry](../../infrastructure/compute-to-data-docker-registry.md)
## How does it work?
In Ocean Protocol, each asset gets its own ERC721 **data NFT** and one (or more) ERC20 **datatokens**. This enables data wallets, data exchanges, and data co-ops by directly leveraging crypto wallets, exchanges, and more.
Ocean Protocol provides tools for developers to _build data markets_, and to _manage data NFTs and datatokens_ for use in DeFi.
If you are new to web3 and blockchain technologies then we suggest you first read these introductory guides:
* [Wallet Basics](../../discover/wallets.md)
* [Set Up MetaMask Wallet](../../discover/metamask-setup.md)
* [Manage Your OCEAN Tokens](../../discover/wallets-and-ocean-tokens.md)
If you are looking to get to grips with the inner workings of Ocean, then you'll be interested in the following guides:
* [Architecture Overview](architecture.md)
* [Data NFTs and Datatokens](datanft-and-datatoken.md)
* [Networks](../../discover/networks/)
* [Fees](fees.md)
* [Asset pricing](asset-pricing.md)
* [DID & DDO](did-ddo.md)
* [Roles](roles.md)
* [Set Up a Marketplace](../../tutorials/build-a-marketplace/marketplace.md)
* [Compute-to-Data](compute-to-data/)
* [Deploying components](../../infrastructure/)
* [Contributing](../../contribute/contributing.md)
## Supporters
[GitBook](https://www.gitbook.com/) is a supporter of this open source project by providing hosting for this documentation.

View File

@ -0,0 +1,2 @@
# API References

View File

@ -0,0 +1 @@
# Compute to Data

View File

@ -0,0 +1,179 @@
---
title: Compute Options
section: developers
description: >-
Specification of compute options for assets in Ocean Protocol.
---
## Compute Options
An asset with a service of `type` `compute` has the following additional attributes under the `compute` object. This object is required if the asset is of `type` `compute`, but can be omitted for `type` of `access`.
| Attribute | Type | Description |
| -------- | -------- | -------- |
| **`allowRawAlgorithm`*** | `boolean` | If `true`, any passed raw text will be allowed to run. Useful for an algorithm drag & drop use case, but increases risk of data escape through malicious user input. Should be `false` by default in all implementations. |
| **`allowNetworkAccess`*** | `boolean` | If `true`, the algorithm job will have network access. |
| **`publisherTrustedAlgorithmPublishers`*** | Array of `string` | If not defined, any published algorithm is allowed. If an empty array, no algorithm is allowed. Otherwise, any algorithm published by one of the defined publishers is allowed. |
| **`publisherTrustedAlgorithms`*** | Array of `publisherTrustedAlgorithms` | If not defined, any published algorithm is allowed. If an empty array, no algorithm is allowed. Otherwise, only the algorithms defined in the array are allowed (see below). |
\* Required
## Trusted Algorithms
The `publisherTrustedAlgorithms` is an array of objects with the following structure:
| Attribute | Type | Description |
| ------------------------------ | -------- | ----------------------------------------------------------- |
| **`did`** | `string` | The DID of the algorithm which is trusted by the publisher. |
| **`filesChecksum`** | `string` | Hash of algorithm's files (as `string`). |
| **`containerSectionChecksum`** | `string` | Hash of algorithm's image details (as `string`). |
To produce `filesChecksum`, call the Provider's fileinfo endpoint with the parameter `withChecksum = True`. If the algorithm has multiple files, `filesChecksum` is a concatenated string of all the files' checksums (i.e. checksumFile1 + checksumFile2, etc.).
To produce `containerSectionChecksum`:
```js
sha256(algorithm_ddo.metadata.algorithm.container.entrypoint + algorithm_ddo.metadata.algorithm.container.checksum);
```
<details>
<summary>Compute Options Example</summary>
Example:
```json
{
"services": [
{
"id": "1",
"type": "access",
"files": "0x044736da6dae39889ff570c34540f24e5e084f...",
"name": "Download service",
"description": "Download service",
"datatokenAddress": "0x123",
"serviceEndpoint": "https://myprovider.com",
"timeout": 0
},
{
"id": "2",
"type": "compute",
"files": "0x6dd05e0edb460623c843a263291ebe757c1eb3...",
"name": "Compute service",
"description": "Compute service",
"datatokenAddress": "0x124",
"serviceEndpoint": "https://myprovider.com",
"timeout": 0,
"compute": {
"allowRawAlgorithm": false,
"allowNetworkAccess": true,
"publisherTrustedAlgorithmPublishers": ["0x234", "0x235"],
"publisherTrustedAlgorithms": [
{
"did": "did:op:123",
"filesChecksum": "100",
"containerSectionChecksum": "200"
},
{
"did": "did:op:124",
"filesChecksum": "110",
"containerSectionChecksum": "210"
}
]
}
}
]
}
```
</details>
## Consumer Parameters
Sometimes, the asset needs additional input data before downloading or running a Compute-to-Data job. Examples:
* The publisher needs to know the sampling interval before the buyer downloads it. Suppose the dataset URL is `https://example.com/mydata`. The publisher defines a field called `sampling` and asks the buyer to enter a value. This parameter is then added to the URL of the published dataset as a query parameter: `https://example.com/mydata?sampling=10`.
* An algorithm that needs to know the number of iterations it should perform. In this case, the algorithm publisher defines a field called `iterations`. The buyer needs to enter a value for the `iterations` parameter. Later, this value is stored in a specific location in the Compute-to-Data pod for the algorithm to read and use it.
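The first example above can be sketched with Node's global `URLSearchParams`; the URL and field name are taken from the example, not from a fixed Ocean API:

```javascript
// Build the dataset URL with the buyer-supplied consumer parameter
// appended as a query parameter.
const baseUrl = 'https://example.com/mydata';
const params = new URLSearchParams({ sampling: '10' });
const downloadUrl = `${baseUrl}?${params.toString()}`;

console.log(downloadUrl); // https://example.com/mydata?sampling=10
```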
The `consumerParameters` is an array of objects. Each object defines a field and has the following structure:
| Attribute | Type | Description |
| ----------------- | -------------------------------- | -------------------------------------------------------------------------- |
| **`name`*** | `string` | The parameter name (sent as an HTTP parameter or as a key to the algorithm) |
| **`type`*** | `string` | The field type (text, number, boolean, select) |
| **`label`*** | `string` | The field label which is displayed |
| **`required`*** | `boolean` | Whether consumer input for this field is mandatory. |
| **`description`*** | `string` | The field description. |
| **`default`*** | `string`, `number`, or `boolean` | The field default value. For select types, `string` key of default option. |
| **`options`** | Array of `option` | For select types, a list of options. |
\* Required
Each `option` is an `object` containing a single key:value pair where the key is the option name, and the value is the option value.
<details>
<summary>Consumer Parameters Example</summary>
```json
[
{
"name": "hometown",
"type": "text",
"label": "Hometown",
"required": true,
"description": "What is your hometown?",
"default": "Nowhere"
},
{
"name": "age",
"type": "number",
"label": "Age",
"required": false,
"description": "Please fill your age",
"default": 0
},
{
"name": "developer",
"type": "boolean",
"label": "Developer",
"required": false,
"description": "Are you a developer?",
"default": false
},
{
"name": "languagePreference",
"type": "select",
"label": "Language",
"required": false,
"description": "Do you like NodeJs or Python",
"default": "nodejs",
"options": [
{
"nodejs": "I love NodeJs"
},
{
"python": "I love Python"
}
]
}
]
```
</details>
Algorithms will have access to a JSON file located at `/data/inputs/algoCustomData.json`, which contains the keys and values of the required input data. Example:
<details>
<summary>Key Value Example</summary>
```json
{
"hometown": "São Paulo",
"age": 10,
"developer": true,
"languagePreference": "nodejs"
}
```
</details>

View File

@ -15,6 +15,9 @@ An algorithm in the Ocean Protocol stack is another asset type, in addition to d
When creating an algorithm asset in Ocean Protocol, the additional `algorithm` object needs to be included in its metadata service to define the Docker container environment:
<details>
<summary>Environment Object Example</summary>
```json
{
"algorithm": {
@ -26,6 +29,7 @@ When creating an algorithm asset in Ocean Protocol, the additional `algorithm` o
}
}
```
</details>
| Variable | Usage |
| ------------ | --------------------------------------------------------------------------------------------------------------------------------------- |
@ -44,7 +48,10 @@ We also collect some [example images](https://github.com/oceanprotocol/algo_dock
When publishing an algorithm through the [Ocean Market](https://market.oceanprotocol.com), these properties can be set via the publish UI.
### Environment Examples
<details>
<summary>Environment Examples</summary>
Run an algorithm written in JavaScript/Node.js, based on Node.js v14:
@ -73,12 +80,13 @@ Run an algorithm written in Python, based on Python v3.9:
}
}
```
</details>
### Data Storage
As part of a compute job, every algorithm runs in a K8s pod with these volumes mounted:
| Path | Permissions | Usage |
| Path | Permissions | Usage |
| --------------- | ----------- | --------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `/data/inputs` | read | Storage for input data sets, accessible only to the algorithm running in the pod. Contents will be the files themselves, inside indexed folders e.g. `/data/inputs/{did}/{service_id}`. |
| `/data/ddos` | read | Storage for all DDOs involved in the compute job (input data sets + algorithm). Contents will be JSON files containing the DDO structure. |
@ -97,7 +105,10 @@ For every algorithm pod, the Compute to Data environment provides the following
| `DIDS` | An array of DID strings containing the input datasets. |
| `TRANSFORMATION_DID` | The DID of the algorithm. |
## Example: JavaScript/Node.js
<details>
<summary>Example: JavaScript/Node.js</summary>
The following is a simple JavaScript/Node.js algorithm, doing a line count for ALL input datasets. The algorithm is not using any environment variables, but instead it's scanning the `/data/inputs` folder.
@ -152,8 +163,11 @@ To run this, use the following container object:
}
}
```
</details>
## Example: Python
<details>
<summary>Example: Python</summary>
A more advanced line counting in Python, which relies on environment variables and constructs a job object, containing all the input files & DDOs
@ -233,3 +247,57 @@ To run this algorithm, use the following `container` object:
}
}
```
</details>
### Algorithm Metadata
An asset of type `algorithm` has additional attributes under `metadata.algorithm`, describing the algorithm and the Docker environment it is supposed to be run under.
| Attribute | Type | Description |
| ------------------------ | ----------------------- | -------------------------------------- |
| **`language`** | `string` | Language used to implement the software. |
| **`version`** | `string` | Version of the software preferably in [SemVer](https://semver.org) notation. E.g. `1.0.0`. |
| **`consumerParameters`** | [Consumer Parameters](did-ddo.md#consumer-parameters) | An object that defines required consumer input before running the algorithm |
| **`container`*** | `container` | Object describing the Docker container image. See below |
\* Required
The `container` object has the following attributes defining the Docker image for running the algorithm:
| Attribute | Type | Description |
| ---------------- | -------- | ----------------------------------------------------------------- |
| **`entrypoint`*** | `string` | The command to execute, or script to run inside the Docker image. |
| **`image`*** | `string` | Name of the Docker image. |
| **`tag`*** | `string` | Tag of the Docker image. |
| **`checksum`*** | `string` | Digest of the Docker image (e.g. `sha256:xxxxx`). |
\* Required
<details>
<summary>Algorithm Metadata Example</summary>
```json
{
"metadata": {
"created": "2020-11-15T12:27:48Z",
"updated": "2021-05-17T21:58:02Z",
"description": "Sample description",
"name": "Sample algorithm asset",
"type": "algorithm",
"author": "OPF",
"license": "https://market.oceanprotocol.com/terms",
"algorithm": { "language": "Node.js", "version": "1.0.0",
"container": {
"entrypoint": "node $ALGO",
"image": "ubuntu",
"tag": "latest",
"checksum": "sha256:44e10daa6637893f4276bb8d7301eb35306ece50f61ca34dcab550"
},
"consumerParameters": {}
}
}
}
```
</details>
@ -1,875 +0,0 @@
---
title: DID & DDO
slug: /concepts/did-ddo/
section: concepts
description: >-
Specification of decentralized identifiers for assets in Ocean Protocol using
the DID & DDO standards.
---
# DID & DDO
**v4.1.0**
### Overview
This document describes how Ocean assets follow the DID/DDO specification, such that Ocean assets can inherit DID/DDO benefits and enhance interoperability. DIDs and DDOs follow the [specification defined by the World Wide Web Consortium (W3C)](https://w3c-ccg.github.io/did-spec/).
Decentralized identifiers (DIDs) are a type of identifier that enable verifiable, decentralized digital identity. Each DID is associated with a unique entity, and DIDs may represent humans, objects, and more.
A DID Document (DDO) is a JSON blob that holds information about the DID. Given a DID, a _resolver_ will return the DDO of that DID.
### Rules for DID & DDO
An _asset_ in Ocean represents a downloadable file, compute service, or similar. Each asset is a _resource_ under the control of a _publisher_. The Ocean network itself does _not_ store the actual resource (e.g. files).
An _asset_ has a DID and DDO. The DDO should include [metadata](did-ddo.md#metadata) about the asset, and define access in at least one [service](did-ddo.md#services). Only _owners_ or _delegated users_ can modify the DDO.
All DDOs are stored on-chain in encrypted form to be fully GDPR-compatible. A metadata cache like _Aquarius_ can help in reading, decrypting, and searching through encrypted DDO data from the chain. Because the file URLs are encrypted on top of the full DDO encryption, returning unencrypted DDOs e.g. via an API is safe to do as the file URLs will still stay encrypted.
### Publishing & Retrieving DDOs
The DDO is stored on-chain as part of the NFT contract, in encrypted form using the private key of the _Provider_. To resolve it, a metadata cache like _Aquarius_ must query the provider to decrypt the DDO.
Here is the flow:
![DDO flow](../../.gitbook/assets/architecture/ddo-flow.png)
<details>
<summary>UML source</summary>
```
title DDO flow
User(Ocean library) -> User(Ocean library): Prepare DDO
User(Ocean library) -> Provider: encrypt DDO
Provider -> User(Ocean library): encryptedDDO
User(Ocean library) -> ERC721 contract: publish encryptedDDO
Aquarius <-> ERC721 contract: monitors ERC721 contract and gets MetadataCreated Event (contains encryptedDDO)
Aquarius -> ERC721 contract: calls getMetaData()
Aquarius -> Provider: decrypt encryptedDDO, signed request using Aquarius's private key
Provider -> ERC721 contract: checks state using getMetaData()
Provider -> Provider: depending on metadataState (expired,retired) and aquarius address, validates the request
Provider -> Aquarius: DDO
Aquarius -> Aquarius : validate DDO
Aquarius -> Aquarius : cache DDO
Aquarius -> Aquarius : enhance cached DDO in response with additional infos like events & stats
```
</details>
### DID
In Ocean, a DID is a string that looks like this:
```
did:op:0ebed8226ada17fde24b6bf2b95d27f8f05fcce09139ff5cec31f6d81a7cd2ea
```
The part after `did:op:` is the SHA-256 hash of the ERC721 contract address (in checksum format) and the chainId (expressed as a decimal) the asset has been published to:
```js
// Pseudocode: hash the checksummed ERC721 contract address
// concatenated with the decimal chainId
const checksum = sha256(erc721ContractAddress + chainId)
console.log(checksum)
// 0ebed8226ada17fde24b6bf2b95d27f8f05fcce09139ff5cec31f6d81a7cd2ea
```
It follows [the generic DID scheme](https://w3c-ccg.github.io/did-spec/#the-generic-did-scheme).
### DDO
A DDO in Ocean has these required attributes:
| Attribute | Type | Description |
| ----------------- | ------------------------------------- | -------------------------------------------------------------------------------------------------------------- |
| **`@context`** | Array of `string` | Contexts used for validation. |
| **`id`** | `string` | Computed as `sha256(address of ERC721 contract + chainId)`. |
| **`version`** | `string` | Version information in [SemVer](https://semver.org) notation referring to this DDO spec version, like `4.1.0`. |
| **`chainId`** | `number` | Stores chainId of the network the DDO was published to. |
| **`nftAddress`** | `string` | NFT contract linked to this asset |
| **`metadata`** | [Metadata](did-ddo.md#metadata) | Stores an object describing the asset. |
| **`services`** | [Services](did-ddo.md#services) | Stores an array of services defining access to the asset. |
| **`credentials`** | [Credentials](did-ddo.md#credentials) | Describes the credentials needed to access a dataset in addition to the `services` definition. |
#### Metadata
This object holds information describing the actual asset.
| Attribute | Type | Required | Description |
| --------------------------- | --------------------------------------------------- | --------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| **`created`** | `ISO date/time string` | | Contains the date of the creation of the dataset content in ISO 8601 format preferably with timezone designators, e.g. `2000-10-31T01:30:00Z`. |
| **`updated`** | `ISO date/time string` | | Contains the date of last update of the dataset content in ISO 8601 format preferably with timezone designators, e.g. `2000-10-31T01:30:00Z`. |
| **`description`** | `string` | **✓** | Details of what the resource is. For a dataset, this attribute explains what the data represents and what it can be used for. |
| **`copyrightHolder`** | `string` | | The party holding the legal copyright. Empty by default. |
| **`name`** | `string` | **✓** | Descriptive name or title of the asset. |
| **`type`** | `string` | **✓** | Asset type. Includes `"dataset"` (e.g. csv file), `"algorithm"` (e.g. Python script). Each type needs a different subset of metadata attributes. |
| **`author`** | `string` | **✓** | Name of the entity generating this data (e.g. Tfl, Disney Corp, etc.). |
| **`license`** | `string` | **✓** | Short name referencing the license of the asset (e.g. Public Domain, CC-0, CC-BY, No License Specified, etc. ). If it's not specified, the following value will be added: "No License Specified". |
| **`links`** | Array of `string` | | Mapping of URL strings for data samples, or links to find out more information. Links may be to either a URL or another asset. |
| **`contentLanguage`** | `string` | | The language of the content. Use one of the language codes from the [IETF BCP 47 standard](https://tools.ietf.org/html/bcp47) |
| **`tags`** | Array of `string` | | Array of keywords or tags used to describe this content. Empty by default. |
| **`categories`** | Array of `string` | | Array of categories associated to the asset. Note: recommended to use `tags` instead of this. |
| **`additionalInformation`** | Object | | Stores additional information, this is customizable by publisher |
| **`algorithm`** | [Algorithm Metadata](did-ddo.md#algorithm-metadata) | **✓** (for algorithm assets only) | Information about asset of `type` `algorithm` |
Example:
```json
{
"metadata": {
"created": "2020-11-15T12:27:48Z",
"updated": "2021-05-17T21:58:02Z",
"description": "Sample description",
"name": "Sample asset",
"type": "dataset",
"author": "OPF",
"license": "https://market.oceanprotocol.com/terms"
}
}
```
#### Algorithm Metadata
An asset of type `algorithm` has additional attributes under `metadata.algorithm`, describing the algorithm and the Docker environment it is supposed to be run under.
| Attribute | Type | Required | Description |
| ------------------------ | ----------------------------------------------------- | -------- | ------------------------------------------------------------------------------------------ |
| **`language`** | `string` | | Language used to implement the software. |
| **`version`** | `string` | | Version of the software preferably in [SemVer](https://semver.org) notation. E.g. `1.0.0`. |
| **`consumerParameters`** | [Consumer Parameters](did-ddo.md#consumer-parameters) | | An object that defines required consumer input before running the algorithm |
| **`container`** | `container` | **✓** | Object describing the Docker container image. See below |
The `container` object has the following attributes defining the Docker image for running the algorithm:
| Attribute | Type | Required | Description |
| ---------------- | -------- | -------- | ----------------------------------------------------------------- |
| **`entrypoint`** | `string` | **✓** | The command to execute, or script to run inside the Docker image. |
| **`image`** | `string` | **✓** | Name of the Docker image. |
| **`tag`** | `string` | **✓** | Tag of the Docker image. |
| **`checksum`**   | `string` | **✓**    | Digest of the Docker image (e.g. `sha256:xxxxx`). |
```json
{
"metadata": {
"created": "2020-11-15T12:27:48Z",
"updated": "2021-05-17T21:58:02Z",
"description": "Sample description",
"name": "Sample algorithm asset",
"type": "algorithm",
"author": "OPF",
"license": "https://market.oceanprotocol.com/terms",
"algorithm": {
"language": "Node.js",
"version": "1.0.0",
"container": {
"entrypoint": "node $ALGO",
"image": "ubuntu",
"tag": "latest",
"checksum": "sha256:44e10daa6637893f4276bb8d7301eb35306ece50f61ca34dcab550"
},
"consumerParameters": {}
}
}
}
```
#### Services
Services define the access for an asset, and each service is represented by its respective datatoken.
An asset should have at least one service to be actually accessible, and can have as many services as make sense for a specific use case.
| Attribute | Type | Required | Description |
| --------------------------- | ----------------------------------------------------- | ------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------- |
| **`id`** | `string` | **✓** | Unique ID |
| **`type`** | `string` | **✓** | Type of service (`access`, `compute`, `wss`, etc.) |
| **`name`** | `string` | | Service friendly name |
| **`description`** | `string` | | Service description |
| **`datatokenAddress`** | `string` | **✓** | Datatoken address |
| **`serviceEndpoint`** | `string` | **✓** | Provider URL (schema + host) |
| **`files`** | [Files](did-ddo.md#files) | **✓** | Encrypted file URLs. |
| **`timeout`** | `number` | **✓** | Describing how long the service can be used after consumption is initiated. A timeout of `0` represents no time limit. Expressed in seconds. |
| **`compute`** | [Compute](did-ddo.md#compute-options) | **✓** (for compute assets only) | If service is of `type` `compute`, holds information about the compute-related privacy settings & resources. |
| **`consumerParameters`** | [Consumer Parameters](did-ddo.md#consumer-parameters) | | An object that defines required consumer input before consuming the asset |
| **`additionalInformation`** | Object | | Stores additional information, this is customizable by publisher |
#### Files
The `files` field is returned as a `string` which holds the encrypted file URLs.
Example:
```json
{
"files": "0x044736da6dae39889ff570c34540f24e5e084f4e5bd81eff3691b729c2dd1465ae8292fc721e9d4b1f10f56ce12036c9d149a4dab454b0795bd3ef8b7722c6001e0becdad5caeb2005859642284ef6a546c7ed76f8b350480691f0f6c6dfdda6c1e4d50ee90e83ce3cb3ca0a1a5a2544e10daa6637893f4276bb8d7301eb35306ece50f61ca34dcab550b48181ec81673953d4eaa4b5f19a45c0e9db4cd9729696f16dd05e0edb460623c843a263291ebe757c1eb3435bb529cc19023e0f49db66ef781ca692655992ea2ca7351ac2882bf340c9d9cb523b0cbcd483731dc03f6251597856afa9a68a1e0da698cfc8e81824a69d92b108023666ee35de4a229ad7e1cfa9be9946db2d909735"
}
```
During the publish process, file URLs must be encrypted with a respective _Provider_ API call before storing the DDO on-chain. For this, you need to send the following object to Provider:
```json
{
"datatokenAddress":"0x1",
"nftAddress": "0x2",
"files": [
...
]
}
```
where "files" contains one or more storage objects.
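As a sketch, that encryption call could look like this in Node.js; the provider URL and the `/api/services/encrypt` path are assumptions here, so check your Provider's API reference for the exact endpoint:

```js
// Sketch: asking a Provider to encrypt the files object before publishing.
// The provider URL and endpoint path below are assumptions, not a canonical API.
const filesObject = {
  datatokenAddress: "0x1",
  nftAddress: "0x2",
  files: [{ type: "url", url: "https://url.com/file1.csv", method: "GET" }],
};

async function encryptFiles(providerUrl, payload) {
  const response = await fetch(`${providerUrl}/api/services/encrypt`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  // The Provider returns a hex-encoded encrypted string for the DDO `files` field
  return response.text();
}
```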
**Type of objects supported:**
**`URL`**
Static URLs.
Parameters:
- `url` - File url, required
- `method` - The HTTP method, required
- `headers` - Additional HTTP headers, optional
```
{
"type": "url",
"url": "https://url.com/file1.csv",
"method": "GET",
"headers":
{
"Authorization": "Bearer 123",
"APIKEY": "124"
}
}
```
**`IPFS`**
The [Interplanetary File System](https://ipfs.tech/) (IPFS) is a distributed file storage protocol that allows computers all over the globe to store and serve files as part of a giant peer-to-peer network. Any computer, anywhere in the world, can download the IPFS software and start hosting and serving files.
Parameters:
- `hash` - The file hash
```
{
"type": "ipfs",
"hash": "XXX"
}
```
**`GraphQL`**
[GraphQL](https://graphql.org/) is a query language for APIs and a runtime for fulfilling those queries with your existing data.
Parameters:
- `url` - Server endpoint url, required
- `query` - The query to be executed, required
- `headers` - Additional HTTP headers, optional
```
{
"type": "graphql",
"url": "http://172.15.0.15:8000/subgraphs/name/oceanprotocol/ocean-subgraph",
"headers":{
"Authorization": "Bearer 123",
"APIKEY": "124"
},
"query": """query{
nfts(orderBy: createdTimestamp,orderDirection:desc){
id
symbol
createdTimestamp
}
}"""
}
```
**`On-Chain`**
Use a smart contract as data source.
Parameters:
- `chainId` - The chainId used to query the contract, required
- `address` - The smartcontract address, required
- `abi` - The function abi (NOT the entire contract abi), required
```
{
"type": "smartcontract",
"chainId": 1,
"address": "0x8149276f275EEFAc110D74AFE8AFECEaeC7d1593",
"abi": {
"inputs": [],
"name": "swapOceanFee",
"outputs": [{"internalType": "uint256", "name": "", "type": "uint256"}],
"stateMutability": "view",
"type": "function"
}
}
```
**`Arweave`**
[Arweave](https://www.arweave.org/) is decentralized data storage that allows files to be stored permanently across a distributed network of computers.
Parameters:
- `transactionId` - The transaction identifier
```
{
  "type": "arweave",
  "transactionId": "a4qJoQZa1poIv5guEzkfgZYSAD0uYm7Vw4zm_tCswVQ"
}
```
First-class integrations planned for the future:
**`Filecoin`**
**`Storj`**
**`SQL`**
A service can contain multiple files, using multiple storage types.
Example:
```json
{
"datatokenAddress": "0x1",
"nftAddress": "0x2",
"files": [
{
"type": "url",
"url": "https://url.com/file1.csv",
"method": "GET"
},
{
"type": "ipfs",
"hash": "XXXX"
}
]
}
```
To get information about the files after encryption, the `/fileinfo` endpoint of _Provider_ returns, for a given DID, an array of file metadata (depending on the file type):
```json
[
{
"type": "url",
"contentLength": 100,
"contentType": "application/json"
},
{
"type": "ipfs",
"contentLength": 130,
"contentType": "application/text"
}
]
```
This only concerns metadata about a file, but never the file URLs. The only way to decrypt them is to exchange at least 1 datatoken based on the respective service pricing scheme.
#### Compute Options
An asset with a service of `type` `compute` has the following additional attributes under the `compute` object. This object is required if the asset is of `type` `compute`, but can be omitted for `type` of `access`.
| Attribute | Type | Required | Description |
| ------------------------------------- | ------------------------------------- | -------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `allowRawAlgorithm` | `boolean` | **✓** | If `true`, any passed raw text will be allowed to run. Useful for an algorithm drag & drop use case, but increases risk of data escape through malicious user input. Should be `false` by default in all implementations. |
| `allowNetworkAccess` | `boolean` | **✓** | If `true`, the algorithm job will have network access. |
| `publisherTrustedAlgorithmPublishers` | Array of `string` | **✓** | If not defined, then any published algorithm is allowed. If empty array, then no algorithm is allowed. If not empty, any algorithm published by the defined publishers is allowed. |
| `publisherTrustedAlgorithms` | Array of `publisherTrustedAlgorithms` | **✓** | If not defined, then any published algorithm is allowed. If empty array, then no algorithm is allowed. Otherwise only the algorithms defined in the array are allowed. (see below). |
The `publisherTrustedAlgorithms` is an array of objects with the following structure:
| Attribute | Type | Required | Description |
| ------------------------------ | -------- | -------- | ----------------------------------------------------------- |
| **`did`** | `string` | **✓** | The DID of the algorithm which is trusted by the publisher. |
| **`filesChecksum`** | `string` | **✓** | Hash of algorithm's files (as `string`). |
| **`containerSectionChecksum`** | `string` | **✓** | Hash of algorithm's image details (as `string`). |
To produce `filesChecksum`, call the Provider's `fileinfo` endpoint with the parameter `withChecksum = true`. If the algorithm has multiple files, `filesChecksum` is the concatenation of all file checksums (i.e. `checksumFile1 + checksumFile2`, etc.).
To produce `containerSectionChecksum`:
```js
sha256(algorithm_ddo.metadata.algorithm.container.entrypoint + algorithm_ddo.metadata.algorithm.container.checksum);
```
Example:
```json
{
"services": [
{
"id": "1",
"type": "access",
"files": "0x044736da6dae39889ff570c34540f24e5e084f...",
"name": "Download service",
"description": "Download service",
"datatokenAddress": "0x123",
"serviceEndpoint": "https://myprovider.com",
"timeout": 0
},
{
"id": "2",
"type": "compute",
"files": "0x6dd05e0edb460623c843a263291ebe757c1eb3...",
"name": "Compute service",
"description": "Compute service",
"datatokenAddress": "0x124",
"serviceEndpoint": "https://myprovider.com",
"timeout": 0,
"compute": {
"allowRawAlgorithm": false,
"allowNetworkAccess": true,
"publisherTrustedAlgorithmPublishers": ["0x234", "0x235"],
"publisherTrustedAlgorithms": [
{
"did": "did:op:123",
"filesChecksum": "100",
"containerSectionChecksum": "200"
},
{
"did": "did:op:124",
"filesChecksum": "110",
"containerSectionChecksum": "210"
}
]
}
}
]
}
```
#### Consumer Parameters
Sometimes, the asset needs additional input data before downloading or running a Compute-to-Data job. Examples:
- The publisher needs to know the sampling interval before the buyer downloads it. Suppose the dataset URL is `https://example.com/mydata`. The publisher defines a field called `sampling` and asks the buyer to enter a value. This parameter is then added to the URL of the published dataset as a query parameter: `https://example.com/mydata?sampling=10`.
- An algorithm that needs to know the number of iterations it should perform. In this case, the algorithm publisher defines a field called `iterations`. The buyer needs to enter a value for the `iterations` parameter. Later, this value is stored in a specific location in the Compute-to-Data pod for the algorithm to read and use it.
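The first example above can be sketched as a small helper that appends consumer parameters to the dataset URL:

```js
// Sketch: appending consumer parameters to a dataset URL as query parameters.
function withConsumerParams(baseUrl, params) {
  const url = new URL(baseUrl);
  for (const [name, value] of Object.entries(params)) {
    url.searchParams.set(name, String(value));
  }
  return url.toString();
}

console.log(withConsumerParams("https://example.com/mydata", { sampling: 10 }));
// https://example.com/mydata?sampling=10
```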
The `consumerParameters` is an array of objects. Each object defines a field and has the following structure:
| Attribute | Type | Required | Description |
| ----------------- | -------------------------------- | -------- | -------------------------------------------------------------------------- |
| **`name`** | `string` | **✓** | The parameter name (this is sent as HTTP param or key towards algo) |
| **`type`** | `string` | **✓** | The field type (text, number, boolean, select) |
| **`label`** | `string` | **✓** | The field label which is displayed |
| **`required`** | `boolean` | **✓** | If customer input for this field is mandatory. |
| **`description`** | `string` | **✓** | The field description. |
| **`default`** | `string`, `number`, or `boolean` | **✓** | The field default value. For select types, `string` key of default option. |
| **`options`** | Array of `option` | | For select types, a list of options. |
Each `option` is an `object` containing a single key:value pair where the key is the option name, and the value is the option value.
Example:
```json
[
{
"name": "hometown",
"type": "text",
"label": "Hometown",
"required": true,
"description": "What is your hometown?",
"default": "Nowhere"
},
{
"name": "age",
"type": "number",
"label": "Age",
"required": false,
"description": "Please fill your age",
"default": 0
},
{
"name": "developer",
"type": "boolean",
"label": "Developer",
"required": false,
"description": "Are you a developer?",
"default": false
},
{
"name": "languagePreference",
"type": "select",
"label": "Language",
"required": false,
"description": "Do you like NodeJs or Python",
"default": "nodejs",
"options": [
{
"nodejs": "I love NodeJs"
},
{
"python": "I love Python"
}
]
}
]
```
Algorithms will have access to a JSON file located at `/data/inputs/algoCustomData.json`, which contains the required input keys and values. Example:
```json
{
"hometown": "São Paulo",
"age": 10,
"developer": true,
"languagePreference": "nodejs"
}
```
#### Credentials
By default, a consumer can access a resource if they have 1 datatoken. _Credentials_ allow the publisher to optionally specify more fine-grained permissions.
Consider a medical data use case, where only a credentialed EU researcher can legally access a given dataset. Ocean supports this as follows: a consumer can only access the resource if they have 1 datatoken _and_ one of the specified `"allow"` credentials.
This is like going to an R-rated movie, where you can only get in if you show both your movie ticket (datatoken) _and_ some identification showing you're old enough (credential).
Only credentials that can be proven are supported. This includes Ethereum public addresses, and in the future [W3C Verifiable Credentials](https://www.w3.org/TR/vc-data-model/) and more.
Ocean also supports `"deny"` credentials: if a consumer has any of these credentials, they cannot access the resource.
Here's an example object with both `"allow"` and `"deny"` entries:
```json
{
"credentials": {
"allow": [
{
"type": "address",
"values": ["0x123", "0x456"]
}
],
"deny": [
{
"type": "address",
"values": ["0x2222", "0x333"]
}
]
}
}
```
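A minimal sketch of this allow/deny logic (an assumption about how a gatekeeper might evaluate address credentials, not the reference implementation):

```js
// Sketch: evaluating allow/deny address credentials for a consumer address.
function isAddressAllowed(credentials, address) {
  const matches = (list) =>
    (list || []).some(
      (c) =>
        c.type === "address" &&
        c.values.map((v) => v.toLowerCase()).includes(address.toLowerCase())
    );
  if (matches(credentials.deny)) return false; // deny wins
  if (!credentials.allow || credentials.allow.length === 0) return true; // no allow list: open access
  return matches(credentials.allow); // otherwise must be on the allow list
}
```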
#### DDO Checksum
In order to ensure the integrity of the DDO, a checksum is computed for each DDO:
```js
const checksum = sha256(JSON.stringify(ddo));
```
The checksum hash is used when publishing/updating metadata using the `setMetaData` function in the ERC721 contract, and is stored in the event generated by the ERC721 contract:
```solidity
event MetadataCreated(
address indexed createdBy,
uint8 state,
string decryptorUrl,
bytes flags,
bytes data,
bytes metaDataHash,
uint256 timestamp,
uint256 blockNumber
);
event MetadataUpdated(
address indexed updatedBy,
uint8 state,
string decryptorUrl,
bytes flags,
bytes data,
bytes metaDataHash,
uint256 timestamp,
uint256 blockNumber
);
```
_Aquarius_ should always verify the checksum after data is decrypted via a _Provider_ API call.
#### State
Each asset has a state, which is held by the NFT contract. The possible states are:
| State | Description | Discoverable in Ocean Market | Ordering allowed | Listed under profile |
| ------- | ------------------------------ | ---------------------------- | ---------------- | -------------------- |
| **`0`** | Active | Yes | Yes | Yes |
| **`1`** | End-of-life | No | No | No |
| **`2`** | Deprecated (by another asset) | No | No | No |
| **`3`** | Revoked by publisher | No | No | No |
| **`4`** | Ordering is temporarily disabled | Yes | No | Yes |
| **`5`** | Asset unlisted | No | Yes | Yes |
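The table can be codified as a simple lookup, e.g. for marketplace UI logic (a sketch mirroring the rows above):

```js
// Sketch: what each on-chain asset state permits, per the state table.
const ASSET_STATES = {
  0: { label: "Active", discoverable: true, orderingAllowed: true, listedUnderProfile: true },
  1: { label: "End-of-life", discoverable: false, orderingAllowed: false, listedUnderProfile: false },
  2: { label: "Deprecated (by another asset)", discoverable: false, orderingAllowed: false, listedUnderProfile: false },
  3: { label: "Revoked by publisher", discoverable: false, orderingAllowed: false, listedUnderProfile: false },
  4: { label: "Ordering is temporarily disabled", discoverable: true, orderingAllowed: false, listedUnderProfile: true },
  5: { label: "Asset unlisted", discoverable: false, orderingAllowed: true, listedUnderProfile: true },
};
```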
### Aquarius Enhanced DDO Response
The following fields are added by _Aquarius_ in its DDO response for convenience reasons, where an asset returned by _Aquarius_ inherits the DDO fields stored on-chain.
These additional fields are never stored on-chain, and are never taken into consideration when [hashing the DDO](did-ddo.md#ddo-checksum).
#### NFT
The `nft` object contains information about the ERC721 NFT contract which represents the intellectual property of the publisher.
| Attribute | Type | Description |
| -------------- | ---------------------- | ----------------------------------------------------------------------------------- |
| **`address`** | `string` | Contract address of the deployed ERC721 NFT contract. |
| **`name`** | `string` | Name of NFT set in contract. |
| **`symbol`** | `string` | Symbol of NFT set in contract. |
| **`owner`** | `string` | ETH account address of the NFT owner. |
| **`state`** | `number` | State of the asset reflecting the NFT contract value. See [State](did-ddo.md#state) |
| **`created`** | `ISO date/time string` | Contains the date of NFT creation. |
| **`tokenURI`** | `string` | URI of the NFT metadata (token URI). |
Example:
```json
{
"nft": {
"address": "0x000000",
"name": "Ocean Protocol Asset v4",
"symbol": "OCEAN-A-v4",
"owner": "0x0000000",
"state": 0,
"created": "2000-10-31T01:30:00Z"
}
}
```
#### Datatokens
The `datatokens` array contains information about the ERC20 datatokens attached to [asset services](did-ddo.md#services).
| Attribute | Type | Description |
| --------------- | -------- | ------------------------------------------------ |
| **`address`** | `string` | Contract address of the deployed ERC20 contract. |
| **`name`**      | `string` | Name of the datatoken set in the contract.       |
| **`symbol`**    | `string` | Symbol of the datatoken set in the contract.     |
| **`serviceId`** | `string` | ID of the service the datatoken is attached to. |
Example:
```json
{
"datatokens": [
{
"address": "0x000000",
"name": "Datatoken 1",
"symbol": "DT-1",
"serviceId": "1"
},
{
"address": "0x000001",
"name": "Datatoken 2",
"symbol": "DT-2",
"serviceId": "2"
}
]
}
```
#### Event
The `event` section contains information about the last transaction that created or updated the DDO.
Example:
```json
{
"event": {
"tx": "0x8d127de58509be5dfac600792ad24cc9164921571d168bff2f123c7f1cb4b11c",
"block": 12831214,
"from": "0xAcca11dbeD4F863Bb3bC2336D3CE5BAC52aa1f83",
"contract": "0x1a4b70d8c9DcA47cD6D0Fb3c52BB8634CA1C0Fdf",
"datetime": "2000-10-31T01:30:00"
}
}
```
#### Purgatory
Contains information about an asset's purgatory status defined in [`list-purgatory`](https://github.com/oceanprotocol/list-purgatory). Marketplace interfaces are encouraged to prevent certain user actions like adding liquidity on assets in purgatory.
| Attribute | Type | Description |
| ------------ | --------- | --------------------------------------------------------------------------------------------- |
| **`state`** | `boolean` | If `true`, asset is in purgatory. |
| **`reason`** | `string` | If asset is in purgatory, contains the reason for being there as defined in `list-purgatory`. |
Example:
```json
{
"purgatory": {
"state": true,
"reason": "Copyright violation"
}
}
```
```json
{
"purgatory": {
"state": false
}
}
```
#### Statistics
The `stats` section contains different statistics fields.
| Attribute | Type | Description |
| ------------ | -------- | ------------------------------------------------------------------------------------------------------------ |
| **`orders`** | `number` | How often an asset was ordered, meaning how often it was either downloaded or used as part of a compute job. |
Example:
```json
{
"stats": {
"orders": 4
}
}
```
### Full Enhanced DDO Example
```json
{
"@context": ["https://w3id.org/did/v1"],
"id": "did:op:ACce67694eD2848dd683c651Dab7Af823b7dd123",
"version": "4.1.0",
"chainId": 1,
"nftAddress": "0x123",
"metadata": {
"created": "2020-11-15T12:27:48Z",
"updated": "2021-05-17T21:58:02Z",
"description": "Sample description",
"name": "Sample asset",
"type": "dataset",
"author": "OPF",
"license": "https://market.oceanprotocol.com/terms"
},
"services": [
{
"id": "1",
"type": "access",
"files": "0x044736da6dae39889ff570c34540f24e5e084f4e5bd81eff3691b729c2dd1465ae8292fc721e9d4b1f10f56ce12036c9d149a4dab454b0795bd3ef8b7722c6001e0becdad5caeb2005859642284ef6a546c7ed76f8b350480691f0f6c6dfdda6c1e4d50ee90e83ce3cb3ca0a1a5a2544e10daa6637893f4276bb8d7301eb35306ece50f61ca34dcab550b48181ec81673953d4eaa4b5f19a45c0e9db4cd9729696f16dd05e0edb460623c843a263291ebe757c1eb3435bb529cc19023e0f49db66ef781ca692655992ea2ca7351ac2882bf340c9d9cb523b0cbcd483731dc03f6251597856afa9a68a1e0da698cfc8e81824a69d92b108023666ee35de4a229ad7e1cfa9be9946db2d909735",
"name": "Download service",
"description": "Download service",
"datatokenAddress": "0x123",
"serviceEndpoint": "https://myprovider.com",
"timeout": 0,
"consumerParameters": [
{
"name": "surname",
"type": "text",
"label": "Name",
"required": true,
"default": "NoName",
"description": "Please fill your name"
},
{
"name": "age",
"type": "number",
"label": "Age",
"required": false,
"default": 0,
"description": "Please fill your age"
}
]
},
{
"id": "2",
"type": "compute",
"files": "0x044736da6dae39889ff570c34540f24e5e084f4e5bd81eff3691b729c2dd1465ae8292fc721e9d4b1f10f56ce12036c9d149a4dab454b0795bd3ef8b7722c6001e0becdad5caeb2005859642284ef6a546c7ed76f8b350480691f0f6c6dfdda6c1e4d50ee90e83ce3cb3ca0a1a5a2544e10daa6637893f4276bb8d7301eb35306ece50f61ca34dcab550b48181ec81673953d4eaa4b5f19a45c0e9db4cd9729696f16dd05e0edb460623c843a263291ebe757c1eb3435bb529cc19023e0f49db66ef781ca692655992ea2ca7351ac2882bf340c9d9cb523b0cbcd483731dc03f6251597856afa9a68a1e0da698cfc8e81824a69d92b108023666ee35de4a229ad7e1cfa9be9946db2d909735",
"name": "Compute service",
"description": "Compute service",
"datatokenAddress": "0x124",
"serviceEndpoint": "https://myprovider.com",
"timeout": 3600,
"compute": {
"allowRawAlgorithm": false,
"allowNetworkAccess": true,
"publisherTrustedAlgorithmPublishers": ["0x234", "0x235"],
"publisherTrustedAlgorithms": [
{
"did": "did:op:123",
"filesChecksum": "100",
"containerSectionChecksum": "200"
},
{
"did": "did:op:124",
"filesChecksum": "110",
"containerSectionChecksum": "210"
}
]
}
}
],
"credentials": {
"allow": [
{
"type": "address",
"values": ["0x123", "0x456"]
}
],
"deny": [
{
"type": "address",
"values": ["0x2222", "0x333"]
}
]
},
"nft": {
"address": "0x123",
"name": "Ocean Protocol Asset v4",
"symbol": "OCEAN-A-v4",
"owner": "0x0000000",
"state": 0,
"created": "2000-10-31T01:30:00",
"tokenURI": "xxx"
},
"datatokens": [
{
"address": "0x000000",
"name": "Datatoken 1",
"symbol": "DT-1",
"serviceId": "1"
},
{
"address": "0x000001",
"name": "Datatoken 2",
"symbol": "DT-2",
"serviceId": "2"
}
],
"event": {
"tx": "0x8d127de58509be5dfac600792ad24cc9164921571d168bff2f123c7f1cb4b11c",
"block": 12831214,
"from": "0xAcca11dbeD4F863Bb3bC2336D3CE5BAC52aa1f83",
"contract": "0x1a4b70d8c9DcA47cD6D0Fb3c52BB8634CA1C0Fdf",
"datetime": "2000-10-31T01:30:00"
},
"purgatory": {
"state": false
},
"stats": {
"orders": 4
}
}
```


@@ -0,0 +1,486 @@
---
title: DDO
slug: /developers/ddo/
section: developers
description: >-
Specification of decentralized identifiers for assets in Ocean Protocol using
the DDO standard.
---
# DDO Full Specification
**v4.1.0**
## Required Attributes
A DDO in Ocean has these required attributes:
| Attribute | Type | Description |
| ----------------- | ------------------------------------- | -------------------------------------------------------------------------------------------------------------- |
| **`@context`** | Array of `string` | Contexts used for validation. |
| **`id`** | `string` | Computed as `sha256(address of ERC721 contract + chainId)`. |
| **`version`** | `string` | Version information in [SemVer](https://semver.org) notation referring to this DDO spec version, like `4.1.0`. |
| **`chainId`** | `number` | Stores chainId of the network the DDO was published to. |
| **`nftAddress`** | `string` | NFT contract linked to this asset |
| **`metadata`** | [Metadata](did-ddo.md#metadata) | Stores an object describing the asset. |
| **`services`** | [Services](did-ddo.md#services) | Stores an array of services defining access to the asset. |
| **`credentials`** | [Credentials](did-ddo.md#credentials) | Describes the credentials needed to access a dataset in addition to the `services` definition. |
<details>
<summary>Full Enhanced DDO Example</summary>
```json
{
"@context": ["https://w3id.org/did/v1"],
"id": "did:op:ACce67694eD2848dd683c651Dab7Af823b7dd123",
"version": "4.1.0",
"chainId": 1,
"nftAddress": "0x123",
"metadata": {
"created": "2020-11-15T12:27:48Z",
"updated": "2021-05-17T21:58:02Z",
"description": "Sample description",
"name": "Sample asset",
"type": "dataset",
"author": "OPF",
"license": "https://market.oceanprotocol.com/terms"
},
"services": [
{
"id": "1",
"type": "access",
"files": "0x044736da6dae39889ff570c34540f24e5e084f4e5bd81eff3691b729c2dd1465ae8292fc721e9d4b1f10f56ce12036c9d149a4dab454b0795bd3ef8b7722c6001e0becdad5caeb2005859642284ef6a546c7ed76f8b350480691f0f6c6dfdda6c1e4d50ee90e83ce3cb3ca0a1a5a2544e10daa6637893f4276bb8d7301eb35306ece50f61ca34dcab550b48181ec81673953d4eaa4b5f19a45c0e9db4cd9729696f16dd05e0edb460623c843a263291ebe757c1eb3435bb529cc19023e0f49db66ef781ca692655992ea2ca7351ac2882bf340c9d9cb523b0cbcd483731dc03f6251597856afa9a68a1e0da698cfc8e81824a69d92b108023666ee35de4a229ad7e1cfa9be9946db2d909735",
"name": "Download service",
"description": "Download service",
"datatokenAddress": "0x123",
"serviceEndpoint": "https://myprovider.com",
"timeout": 0,
"consumerParameters": [
{
"name": "surname",
"type": "text",
"label": "Name",
"required": true,
"default": "NoName",
"description": "Please fill your name"
},
{
"name": "age",
"type": "number",
"label": "Age",
"required": false,
"default": 0,
"description": "Please fill your age"
}
]
},
{
"id": "2",
"type": "compute",
"files": "0x044736da6dae39889ff570c34540f24e5e084f4e5bd81eff3691b729c2dd1465ae8292fc721e9d4b1f10f56ce12036c9d149a4dab454b0795bd3ef8b7722c6001e0becdad5caeb2005859642284ef6a546c7ed76f8b350480691f0f6c6dfdda6c1e4d50ee90e83ce3cb3ca0a1a5a2544e10daa6637893f4276bb8d7301eb35306ece50f61ca34dcab550b48181ec81673953d4eaa4b5f19a45c0e9db4cd9729696f16dd05e0edb460623c843a263291ebe757c1eb3435bb529cc19023e0f49db66ef781ca692655992ea2ca7351ac2882bf340c9d9cb523b0cbcd483731dc03f6251597856afa9a68a1e0da698cfc8e81824a69d92b108023666ee35de4a229ad7e1cfa9be9946db2d909735",
"name": "Compute service",
"description": "Compute service",
"datatokenAddress": "0x124",
"serviceEndpoint": "https://myprovider.com",
"timeout": 3600,
"compute": {
"allowRawAlgorithm": false,
"allowNetworkAccess": true,
"publisherTrustedAlgorithmPublishers": ["0x234", "0x235"],
"publisherTrustedAlgorithms": [
{
"did": "did:op:123",
"filesChecksum": "100",
"containerSectionChecksum": "200"
},
{
"did": "did:op:124",
"filesChecksum": "110",
"containerSectionChecksum": "210"
}
]
}
}
],
"credentials": {
"allow": [
{
"type": "address",
"values": ["0x123", "0x456"]
}
],
"deny": [
{
"type": "address",
"values": ["0x2222", "0x333"]
}
]
},
"nft": {
"address": "0x123",
"name": "Ocean Protocol Asset v4",
"symbol": "OCEAN-A-v4",
"owner": "0x0000000",
"state": 0,
"created": "2000-10-31T01:30:00",
"tokenURI": "xxx"
},
"datatokens": [
{
"address": "0x000000",
"name": "Datatoken 1",
"symbol": "DT-1",
"serviceId": "1"
},
{
"address": "0x000001",
"name": "Datatoken 2",
"symbol": "DT-2",
"serviceId": "2"
}
],
"event": {
"tx": "0x8d127de58509be5dfac600792ad24cc9164921571d168bff2f123c7f1cb4b11c",
"block": 12831214,
"from": "0xAcca11dbeD4F863Bb3bC2336D3CE5BAC52aa1f83",
"contract": "0x1a4b70d8c9DcA47cD6D0Fb3c52BB8634CA1C0Fdf",
"datetime": "2000-10-31T01:30:00"
},
"purgatory": {
"state": false
},
"stats": {
"orders": 4
}
}
```
</details>
## Metadata
This object holds information describing the actual asset.
| Attribute | Type | Description |
| --------------------------- | --------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| **`created`** | `ISO date/time string` | Contains the date of the creation of the dataset content in ISO 8601 format preferably with timezone designators, e.g. `2000-10-31T01:30:00Z`. |
| **`updated`** | `ISO date/time string` | Contains the date of last update of the dataset content in ISO 8601 format preferably with timezone designators, e.g. `2000-10-31T01:30:00Z`. |
| **`description`**\* | `string` | Details of what the resource is. For a dataset, this attribute explains what the data represents and what it can be used for. |
| **`copyrightHolder`** | `string` | The party holding the legal copyright. Empty by default. |
| **`name`**\* | `string` | Descriptive name or title of the asset. |
| **`type`**\* | `string` | Asset type. Includes `"dataset"` (e.g. csv file), `"algorithm"` (e.g. Python script). Each type needs a different subset of metadata attributes. |
| **`author`**\* | `string` | Name of the entity generating this data (e.g. Tfl, Disney Corp, etc.). |
| **`license`**\* | `string` | Short name referencing the license of the asset (e.g. Public Domain, CC-0, CC-BY, No License Specified, etc. ). If it's not specified, the following value will be added: "No License Specified". |
| **`links`**                 | Array of `string`                                   | Array of URL strings for data samples, or links to find out more information. Links may be to either a URL or another asset.                                                                       |
| **`contentLanguage`**       | `string`                                            | The language of the content. Use one of the language codes from the [IETF BCP 47 standard](https://tools.ietf.org/html/bcp47). |
| **`tags`**                  | Array of `string`                                   | Array of keywords or tags used to describe this content. Empty by default.                                                                                                                         |
| **`categories`**            | Array of `string`                                   | Array of categories associated with the asset. Note: it is recommended to use `tags` instead of this.                                                                                              |
| **`additionalInformation`** | Object                                              | Stores additional information; this is customizable by the publisher.                                                                                                                              |
| **`algorithm`**\*\* | [Algorithm Metadata](did-ddo.md#algorithm-metadata) | Information about asset of `type` `algorithm` |
\* Required&#x20;
\*\* Required for algorithms only
<details>
<summary>Metadata Example</summary>
```json
{
"metadata": {
"created": "2020-11-15T12:27:48Z",
"updated": "2021-05-17T21:58:02Z",
"description": "Sample description",
"name": "Sample asset",
"type": "dataset",
"author": "OPF",
"license": "https://market.oceanprotocol.com/terms"
}
}
```
</details>
#### Services
Services define how an asset can be accessed, and each service is represented by its respective datatoken.
An asset needs at least one service to be accessible, and can have as many services as make sense for a specific use case.
| Attribute                   | Type                                                                                     | Description                                                                                                                                 |
| --------------------------- | ---------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------- |
| **`id`**\*                  | `string`                                                                                 | Unique ID of the service.                                                                                                                    |
| **`type`**\*                | `string`                                                                                 | Type of service: `access`, `compute`, `wss`, etc.                                                                                            |
| **`name`**                  | `string`                                                                                 | Service friendly name.                                                                                                                       |
| **`description`**           | `string`                                                                                 | Service description.                                                                                                                         |
| **`datatokenAddress`**\*    | `string`                                                                                 | Address of the datatoken attached to this service.                                                                                           |
| **`serviceEndpoint`**\*     | `string`                                                                                 | Provider URL (schema + host).                                                                                                                |
| **`files`**\*               | [Files](did-ddo.md#files)                                                                | Encrypted file URLs.                                                                                                                         |
| **`timeout`**\*             | `number`                                                                                 | Describes how long the service can be used after consumption is initiated. A timeout of `0` represents no time limit. Expressed in seconds.  |
| **`compute`**\*\*           | [Compute](developers/compute-to-data/compute-options.md)                                 | If the service is of `type` `compute`, holds information about the compute-related privacy settings & resources.                             |
| **`consumerParameters`**    | [Consumer Parameters](developers/compute-to-data/compute-options.md#consumer-parameters) | An object that defines required consumer input before consuming the asset.                                                                   |
| **`additionalInformation`** | Object                                                                                   | Stores additional information; this is customizable by the publisher.                                                                        |
\* Required&#x20;
\*\* Required for compute assets only
#### Files
The `files` field is returned as a `string` which holds the encrypted file URLs.
<details>
<summary>Files Example</summary>
```json
{
"files": "0x044736da6dae39889ff570c34540f24e5e084f4e5bd81eff3691b729c2dd1465ae8292fc721e9d4b1f10f56ce12036c9d149a4dab454b0795bd3ef8b7722c6001e0becdad5caeb2005859642284ef6a546c7ed76f8b350480691f0f6c6dfdda6c1e4d50ee90e83ce3cb3ca0a1a5a2544e10daa6637893f4276bb8d7301eb35306ece50f61ca34dcab550b48181ec81673953d4eaa4b5f19a45c0e9db4cd9729696f16dd05e0edb460623c843a263291ebe757c1eb3435bb529cc19023e0f49db66ef781ca692655992ea2ca7351ac2882bf340c9d9cb523b0cbcd483731dc03f6251597856afa9a68a1e0da698cfc8e81824a69d92b108023666ee35de4a229ad7e1cfa9be9946db2d909735"
}
```
</details>
#### Credentials
By default, a consumer can access a resource if they have 1 datatoken. _Credentials_ allow the publisher to optionally specify more fine-grained permissions.
Consider a medical data use case, where only a credentialed EU researcher can legally access a given dataset. Ocean supports this as follows: a consumer can only access the resource if they have 1 datatoken _and_ one of the specified `"allow"` credentials.
This is like going to an R-rated movie, where you can only get in if you show both your movie ticket (datatoken) _and_ some identification showing you're old enough (credential).
Only credentials that can be proven are supported. This includes Ethereum public addresses, and in the future [W3C Verifiable Credentials](https://www.w3.org/TR/vc-data-model/) and more.
Ocean also supports `"deny"` credentials: if a consumer has any of these credentials, they can not access the resource.
Here's an example object with both `"allow"` and `"deny"` entries:
<details>
<summary>Credentials Example</summary>
```json
{
"credentials": {
"allow": [
{
"type": "address",
"values": ["0x123", "0x456"]
}
],
"deny": [
{
"type": "address",
"values": ["0x2222", "0x333"]
}
]
}
}
```
</details>
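The allow/deny evaluation described above can be sketched as follows (a hypothetical helper for illustration, not an Ocean library API):

```javascript
// Hypothetical helper: access requires a match in "allow" (when the list
// is present) and no match in "deny".
function isAddressAllowed(credentials, address) {
  const matches = (list) =>
    (list || []).some(
      (c) =>
        c.type === "address" &&
        c.values.some((v) => v.toLowerCase() === address.toLowerCase())
    );
  if (matches(credentials.deny)) return false;
  if (credentials.allow && credentials.allow.length > 0) {
    return matches(credentials.allow);
  }
  return true; // no allow list: any address holding a datatoken may access
}

const credentials = {
  allow: [{ type: "address", values: ["0x123", "0x456"] }],
  deny: [{ type: "address", values: ["0x2222", "0x333"] }]
};

console.log(isAddressAllowed(credentials, "0x456")); // true
console.log(isAddressAllowed(credentials, "0x2222")); // false
```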
#### DDO Checksum
In order to ensure the integrity of the DDO, a checksum is computed for each DDO:
```js
const checksum = sha256(JSON.stringify(ddo));
```
The checksum hash is used when publishing/updating metadata using the `setMetaData` function in the ERC721 contract, and is stored in the event generated by the ERC721 contract.
<details>
<summary>MetadataCreated and MetadataUpdated smart contract events</summary>
```solidity
event MetadataCreated(
address indexed createdBy,
uint8 state,
string decryptorUrl,
bytes flags,
bytes data,
bytes metaDataHash,
uint256 timestamp,
uint256 blockNumber
);
event MetadataUpdated(
address indexed updatedBy,
uint8 state,
string decryptorUrl,
bytes flags,
bytes data,
bytes metaDataHash,
uint256 timestamp,
uint256 blockNumber
);
```
</details>
_Aquarius_ should always verify the checksum after data is decrypted via a _Provider_ API call.
#### State
Each asset has a state, which is held by the NFT contract. The possible states are:
| State | Description | Discoverable in Ocean Market | Ordering allowed | Listed under profile |
| ------- | ------------------------------ | ---------------------------- | ---------------- | -------------------- |
| **`0`** | Active | Yes | Yes | Yes |
| **`1`** | End-of-life | No | No | No |
| **`2`** | Deprecated (by another asset) | No | No | No |
| **`3`** | Revoked by publisher | No | No | No |
| **`4`** | Ordering is temporarily disabled | Yes                         | No               | Yes                  |
| **`5`** | Asset unlisted                 | No                           | Yes              | Yes                  |
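For reference, the state table can be captured as a simple lookup (a hypothetical helper for illustration, not part of any Ocean library):

```javascript
// Hypothetical lookup mirroring the state table above.
const ASSET_STATES = {
  0: { label: "Active", discoverable: true, orderable: true, listed: true },
  1: { label: "End-of-life", discoverable: false, orderable: false, listed: false },
  2: { label: "Deprecated (by another asset)", discoverable: false, orderable: false, listed: false },
  3: { label: "Revoked by publisher", discoverable: false, orderable: false, listed: false },
  4: { label: "Ordering temporarily disabled", discoverable: true, orderable: false, listed: true },
  5: { label: "Asset unlisted", discoverable: false, orderable: true, listed: true }
};

console.log(ASSET_STATES[4].orderable); // false
```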
### Aquarius Enhanced DDO Response
The following fields are added by _Aquarius_ to its DDO response for convenience; an asset returned by _Aquarius_ contains these fields in addition to the DDO fields stored on-chain.
These additional fields are never stored on-chain, and are never taken into consideration when [hashing the DDO](did-ddo.md#ddo-checksum).
#### NFT
The `nft` object contains information about the ERC721 NFT contract which represents the intellectual property of the publisher.
| Attribute | Type | Description |
| -------------- | ---------------------- | ----------------------------------------------------------------------------------- |
| **`address`** | `string` | Contract address of the deployed ERC721 NFT contract. |
| **`name`** | `string` | Name of NFT set in contract. |
| **`symbol`** | `string` | Symbol of NFT set in contract. |
| **`owner`** | `string` | ETH account address of the NFT owner. |
| **`state`** | `number` | State of the asset reflecting the NFT contract value. See [State](did-ddo.md#state) |
| **`created`** | `ISO date/time string` | Contains the date of NFT creation. |
| **`tokenURI`** | `string`               | URI of the NFT metadata.                                                            |
<details>
<summary>NFT Object Example</summary>
```json
{
"nft": {
"address": "0x000000",
"name": "Ocean Protocol Asset v4",
"symbol": "OCEAN-A-v4",
"owner": "0x0000000",
"state": 0,
"created": "2000-10-31T01:30:00Z"
}
}
```
</details>
#### Datatokens
The `datatokens` array contains information about the ERC20 datatokens attached to [asset services](did-ddo.md#services).
| Attribute | Type | Description |
| --------------- | -------- | ------------------------------------------------ |
| **`address`** | `string` | Contract address of the deployed ERC20 contract. |
| **`name`**      | `string` | Name of the datatoken set in contract.           |
| **`symbol`**    | `string` | Symbol of the datatoken set in contract.         |
| **`serviceId`** | `string` | ID of the service the datatoken is attached to. |
<details>
<summary>Datatokens Array Example</summary>
```json
{
"datatokens": [
{
"address": "0x000000",
"name": "Datatoken 1",
"symbol": "DT-1",
"serviceId": "1"
},
{
"address": "0x000001",
"name": "Datatoken 2",
"symbol": "DT-2",
"serviceId": "2"
}
]
}
```
</details>
#### Event
The `event` section contains information about the last transaction that created or updated the DDO.
<details>
<summary>Event Example</summary>
```json
{
"event": {
"tx": "0x8d127de58509be5dfac600792ad24cc9164921571d168bff2f123c7f1cb4b11c",
"block": 12831214,
"from": "0xAcca11dbeD4F863Bb3bC2336D3CE5BAC52aa1f83",
"contract": "0x1a4b70d8c9DcA47cD6D0Fb3c52BB8634CA1C0Fdf",
"datetime": "2000-10-31T01:30:00"
}
}
```
</details>
#### Purgatory
Contains information about an asset's purgatory status defined in [`list-purgatory`](https://github.com/oceanprotocol/list-purgatory). Marketplace interfaces are encouraged to prevent certain user actions like adding liquidity on assets in purgatory.
| Attribute | Type | Description |
| ------------ | --------- | --------------------------------------------------------------------------------------------- |
| **`state`** | `boolean` | If `true`, asset is in purgatory. |
| **`reason`** | `string` | If asset is in purgatory, contains the reason for being there as defined in `list-purgatory`. |
<details>
<summary>Purgatory Example</summary>
```json
{
  "purgatory": {
    "state": true,
    "reason": "Copyright violation"
  }
}
```
```json
{
"purgatory": {
"state": false
}
}
```
</details>
#### Statistics
The `stats` section contains different statistics fields.
| Attribute | Type | Description |
| ------------ | -------- | ------------------------------------------------------------------------------------------------------------ |
| **`orders`** | `number` | How often an asset was ordered, meaning how often it was either downloaded or used as part of a compute job. |
<details>
<summary>Statistics Example</summary>
```json
{
"stats": {
"orders": 4
}
}
```
</details>


@@ -0,0 +1,178 @@
---
title: Storage Specifications
section: developers
description: >-
Specification of storage options for assets in Ocean Protocol.
---
# Storage Specifications
Ocean does not handle the actual storage of files directly. The files are stored via other services which are then specified within the DDO.
During the publish process, file URLs must be encrypted via a _Provider_ API call before storing the DDO on-chain. For this, you need to send the following object to Provider (where `files` contains one or more storage objects):
```json
{
"datatokenAddress":"0x1",
"nftAddress": "0x2",
"files": [
...
]
}
```
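As a hedged sketch of this step (the Provider route `/api/services/encrypt` and the response handling shown here are assumptions; consult your Provider version's API reference):

```javascript
// Hedged sketch of the encryption step. The route "/api/services/encrypt"
// and response format are assumptions; check your Provider's API docs.
function buildEncryptionPayload(datatokenAddress, nftAddress, files) {
  // "files" is an array of one or more storage objects
  return JSON.stringify({ datatokenAddress, nftAddress, files });
}

async function encryptFiles(providerUrl, payload) {
  const response = await fetch(`${providerUrl}/api/services/encrypt`, {
    method: "POST",
    headers: { "Content-Type": "application/octet-stream" },
    body: payload
  });
  // The returned hex string becomes the "files" field of the service.
  return response.text();
}

const payload = buildEncryptionPayload("0x1", "0x2", [
  { type: "url", url: "https://url.com/file1.csv", method: "GET" }
]);
console.log(payload);
```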
The remainder of this document specifies the different types of storage objects that are supported:
## Static URLs
Parameters:
* `url` - File url, required
* `method` - The HTTP method, required
* `headers` - Additional HTTP headers, optional
```json
{
  "type": "url",
  "url": "https://url.com/file1.csv",
  "method": "GET",
  "headers": {
    "Authorization": "Bearer 123",
    "APIKEY": "124"
  }
}
```
## Interplanetary File System
**`IPFS`**
The [Interplanetary File System](https://ipfs.tech/) (IPFS) is a distributed file storage protocol that allows computers all over the globe to store and serve files as part of a giant peer-to-peer network. Any computer, anywhere in the world, can download the IPFS software and start hosting and serving files.
Parameters:
* `hash` - The file hash
```json
{
"type": "ipfs",
"hash": "XXX"
}
```
## GraphQL
**`GraphQL`**
[GraphQL](https://graphql.org/) is a query language for APIs and a runtime for fulfilling those queries with your existing data.
Parameters:
* `url` - Server endpoint url, required
* `query` - The query to be executed, required
* `headers` - Additional HTTP headers, optional
```
{
  "type": "graphql",
  "url": "http://172.15.0.15:8000/subgraphs/name/oceanprotocol/ocean-subgraph",
  "headers": {
    "Authorization": "Bearer 123",
    "APIKEY": "124"
  },
  "query": """query{
    nfts(orderBy: createdTimestamp, orderDirection: desc){
      id
      symbol
      createdTimestamp
    }
  }"""
}
```
## Smart Contract Data
Use a smart contract as data source.
Parameters:
* `chainId` - The chainId used to query the contract, required
* `address` - The smartcontract address, required
* `abi` - The function abi (NOT the entire contract abi), required
```json
{
  "type": "smartcontract",
  "chainId": 1,
  "address": "0x8149276f275EEFAc110D74AFE8AFECEaeC7d1593",
  "abi": {
    "inputs": [],
    "name": "swapOceanFee",
    "outputs": [{ "internalType": "uint256", "name": "", "type": "uint256" }],
    "stateMutability": "view",
    "type": "function"
  }
}
```
## Arweave
[Arweave](https://www.arweave.org/) is a decentralized storage network that allows files to be stored permanently across a distributed network of computers.
Parameters:
* `transactionId` - The transaction identifier
```json
{
  "type": "arweave",
  "transactionId": "a4qJoQZa1poIv5guEzkfgZYSAD0uYm7Vw4zm_tCswVQ"
}
```
First-class integrations to be supported in the future: **`Filecoin`**, **`Storj`**, **`SQL`**
A service can contain multiple files, using multiple storage types.
Example:
```json
{
"datatokenAddress": "0x1",
"nftAddress": "0x2",
"files": [
{
"type": "url",
"url": "https://url.com/file1.csv",
"method": "GET"
},
{
"type": "ipfs",
"hash": "XXXX"
}
]
}
```
To get information about the files after encryption, the `/fileinfo` endpoint of _Provider_ returns, for a given DID, an array of file metadata (based on the file type):
```json
[
{
"type": "url",
"contentLength": 100,
"contentType": "application/json"
},
{
"type": "ipfs",
"contentLength": 130,
"contentType": "application/text"
}
]
```
This endpoint only exposes metadata about the files, never the file URLs themselves. The only way to decrypt them is to exchange at least 1 datatoken, according to the respective service's pricing scheme.