
GITBOOK-492: Update the Metadata page

Ana Loznianu 2023-06-13 13:41:12 +00:00 committed by gitbook-bot
parent 48b6e5efe3
commit 72e86e78ae
No known key found for this signature in database
GPG Key ID: 07D2180C7B12D0FF
10 changed files with 105 additions and 123 deletions

Binary file not shown.

(new image added, 7.5 MiB)

View File

@@ -30,7 +30,7 @@
* [Harvest More Yield Data Farming](user-guides/how-to-data-farm.md)
* [Claim Rewards Data Farming](user-guides/claim-ocean-rewards.md)
* [Liquidity Pools \[deprecated\]](user-guides/remove-liquidity-using-etherscan.md)
* [👨💻 👨💻 Developers](developers/README.md)
* [👨💻 Developers](developers/README.md)
* [Architecture Overview](developers/contracts/architecture.md)
* [Contracts](developers/contracts/README.md)
* [Data NFTs](developers/contracts/data-nfts.md)

View File

@@ -24,7 +24,7 @@ If you're part of an established organization or a growing startup, you'll a
We have always been super encouraging of anyone who wishes to build a dApp on top of Ocean or to fork Ocean Market and make their own data marketplace. With the V4 release, we have taken this to the next level and introduced more opportunities and even more fee customization options.
Unlike in V3, where the fee collection was limited to the consume action with a fixed value of 0.1%, V4 empowers dApp operators like yourself to have greater flexibility and control over the fees you can charge. This means you can tailor the fee structure to suit your specific needs and ensure the sustainability of your project. **V4 smart contracts enable you to collect a fee not only on consume, but also on fixed-rate exchange, and you can also set the fee value.** For more detailed information regarding the fees, we invite you to visit the [fees](fees.md) page.
Unlike in V3, where the fee collection was limited to the consume action with a fixed value of 0.1%, V4 empowers dApp operators like yourself to have greater flexibility and control over the fees you can charge. This means you can tailor the fee structure to suit your specific needs and ensure the sustainability of your project. **V4 smart contracts enable you to collect a fee not only on consume, but also on fixed-rate exchange, and you can also set the fee value.** For more detailed information regarding the fees, we invite you to visit the [fees](contracts/fees.md) page.
Another new opportunity is using your own **ERC20** token in your dApp, where it's used as the unit of exchange. This is fully supported and can be a great way to ensure the sustainability of your project.
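To make the fee flexibility concrete, here is a minimal sketch of how a dApp might reason about a payment split. The fee names and percentages are placeholders, not the exact parameters of the V4 contracts; the fees page remains the authoritative reference.

```python
# Illustrative only: `consume_market_fee` and `publish_market_fee` are
# placeholder names for dApp-configured fees, not the contracts' exact
# parameters. The real split is enforced on-chain by the V4 contracts.

def split_payment(price, consume_market_fee, publish_market_fee):
    """Split a payment of `price` base tokens between the publisher and the markets."""
    consume_cut = price * consume_market_fee   # fee chosen by the consuming dApp
    publish_cut = price * publish_market_fee   # fee chosen by the publishing market
    return {
        "consume_market": consume_cut,
        "publish_market": publish_cut,
        "publisher": price - consume_cut - publish_cut,
    }

# Example: a 100-token purchase with a 1% consume fee and a 0.5% publish fee.
print(split_payment(100.0, 0.01, 0.005))
```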

View File

@@ -26,7 +26,7 @@ By utilizing ERC721 tokens, Ocean V4 **grants data creators greater flexibility
Ocean V4 brings forth enhanced opportunities for marketplace operators, creating a conducive environment for the emergence of a thriving market of **third-party Providers**.
With Ocean V4, marketplace operators can unlock additional benefits. Firstly, the V4 smart contracts empower marketplace operators to collect [fees](../fees.md) not only during **data consumption** but also through **fixed-rate exchanges**. This expanded revenue model allows operators to derive more value from the ecosystem. Moreover, in Ocean V4, the marketplace operator has the authority to determine the fee value, providing them with **increased control** over their pricing strategies.
With Ocean V4, marketplace operators can unlock additional benefits. Firstly, the V4 smart contracts empower marketplace operators to collect [fees](fees.md) not only during **data consumption** but also through **fixed-rate exchanges**. This expanded revenue model allows operators to derive more value from the ecosystem. Moreover, in Ocean V4, the marketplace operator has the authority to determine the fee value, providing them with **increased control** over their pricing strategies.
In addition to empowering marketplace operators, Ocean V4 facilitates the participation of third-party [Providers](../provider/) who can offer compute services in exchange for a fee. This paves the way for the development of a diverse marketplace of Providers. This model supports both centralized trusted providers, where data publishers and consumers have established trust relationships, as well as trustless providers that leverage decentralization or other privacy-preserving mechanisms.
@@ -39,7 +39,7 @@ Key features of the V4 smart contracts:
* Interoperability with the NFT ecosystem (and DeFi & DAO tools).
* Allows new data [NFT & datatoken templates](datatoken-templates.md), for flexibility and future-proofing.
* Besides base data IP, you can use data NFTs to **implement comments & ratings, verifiable claims, identity credentials, and social media posts**. They can point to parent data NFTs, enabling the nesting of comments on comments, or replies to tweets. All on-chain, GDPR-compliant, easily searched, with js & py drivers 🤯
* Introduce an advanced [Fee](../fees.md) structure both for Marketplace and Provider runners 💰
* Introduce an advanced [Fee](fees.md) structure both for Marketplace and Provider runners 💰
* [Roles](roles.md) Administration: there are now multiple roles for a more flexible administration both at [NFT](data-nfts.md) and [ERC20](datatokens.md) levels 👥
* When the NFT is transferred, it auto-updates all permissions, e.g. who receives payment, or who can mint derivative ERC20 datatokens.
* Key-value store in the NFT contract: NFT contract can be used to store custom key-value pairs (ERC725Y standard) enabling applications like soulbound tokens and Sybil protection approaches 🗃️
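As a sketch of the key-value store idea: the snippet below reads and writes a custom key on a data NFT with web3.py. The RPC endpoint, NFT address, and the `setNewData`/`getData` names in the ABI fragment are assumptions for illustration; check the deployed ERC721 template for the exact interface.

```python
from web3 import Web3

# Hypothetical RPC endpoint and data NFT address -- replace with real values.
w3 = Web3(Web3.HTTPProvider("http://localhost:8545"))
DATA_NFT_ADDRESS = "0x0000000000000000000000000000000000000000"

# Minimal ABI fragment; assumes ERC725Y-style setNewData/getData entry points.
ERC725Y_ABI = [
    {"name": "setNewData", "type": "function", "stateMutability": "nonpayable",
     "inputs": [{"name": "_key", "type": "bytes32"}, {"name": "_value", "type": "bytes"}],
     "outputs": []},
    {"name": "getData", "type": "function", "stateMutability": "view",
     "inputs": [{"name": "_key", "type": "bytes32"}],
     "outputs": [{"name": "", "type": "bytes"}]},
]

nft = w3.eth.contract(address=DATA_NFT_ADDRESS, abi=ERC725Y_ABI)
key = Web3.keccak(text="my-app.profile")   # arbitrary application-defined key

# Write (requires an unlocked, funded account on the node):
# nft.functions.setNewData(key, b'{"handle": "almond-board"}').transact({"from": w3.eth.accounts[0]})

value = nft.functions.getData(key).call()  # read the stored bytes back
print(value)
```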

View File

@@ -7,13 +7,13 @@ description: Ocean Protocol Architecture Adventure!
Embark on an exploration of the innovative realm of Ocean Protocol, where data flows seamlessly and AI achieves new heights. Dive into the intricately layered architecture that converges data and services, fostering a harmonious collaboration. Let us delve deep and uncover the profound design of Ocean Protocol.🐬
<figure><img src="../.gitbook/assets/OP High Level Architecture.jpg" alt=""><figcaption><p>Overview of the Ocean Protocol Architecture</p></figcaption></figure>
<figure><img src="../../.gitbook/assets/OP High Level Architecture.jpg" alt=""><figcaption><p>Overview of the Ocean Protocol Architecture</p></figcaption></figure>
### Layer 1: The Foundational Blockchain Layer
At the core of Ocean Protocol lies the robust [Blockchain Layer](contracts/). Powered by blockchain technology, this layer ensures secure and transparent transactions. It forms the bedrock of decentralized trust, where data providers and consumers come together to trade valuable assets.&#x20;
At the core of Ocean Protocol lies the robust [Blockchain Layer](./). Powered by blockchain technology, this layer ensures secure and transparent transactions. It forms the bedrock of decentralized trust, where data providers and consumers come together to trade valuable assets.&#x20;
The [smart contracts](contracts/) are deployed on the Ethereum mainnet and other compatible [networks](../discover/networks/). The libraries encapsulate the calls to these smart contracts and provide features like publishing new assets, facilitating consumption, managing pricing, and much more. To explore the contracts in more depth, go ahead to the [contracts](contracts/) section.
The [smart contracts](./) are deployed on the Ethereum mainnet and other compatible [networks](../../discover/networks/). The libraries encapsulate the calls to these smart contracts and provide features like publishing new assets, facilitating consumption, managing pricing, and much more. To explore the contracts in more depth, go ahead to the [contracts](./) section.
### Layer 2: The Empowering Middle Layer
@@ -21,32 +21,32 @@ Above the smart contracts, you'll find essential [libraries](architecture.md#lib
#### Libraries
These libraries include [Ocean.js](broken-reference), a JavaScript library, and [Ocean.py](ocean.py/), a Python library. They serve as powerful tools for developers, enabling integration and interaction with the protocol.
These libraries include [Ocean.js](broken-reference), a JavaScript library, and [Ocean.py](../ocean.py/), a Python library. They serve as powerful tools for developers, enabling integration and interaction with the protocol.
1. [Ocean.js](broken-reference): Ocean.js is a JavaScript library that serves as a powerful tool for developers looking to integrate their applications with the Ocean Protocol ecosystem. Designed to facilitate interaction with the protocol, Ocean.js provides a comprehensive set of functionalities, including data tokenization, asset management, and smart contract interaction. Ocean.js simplifies the process of implementing data access controls, building dApps, and exploring data sets within a decentralized environment.&#x20;
2. [Ocean.py](ocean.py/): Ocean.py is a Python library that empowers developers to integrate their applications with the Ocean Protocol ecosystem. With its rich set of functionalities, Ocean.py provides a comprehensive toolkit for interacting with the protocol. Developers and [data scientists](../data-science/) can leverage Ocean.py to perform a wide range of tasks, including data tokenization, asset management, and smart contract interactions. This library serves as a bridge between Python and the decentralized world of Ocean Protocol, enabling you to harness the power of decentralized data.
2. [Ocean.py](../ocean.py/): Ocean.py is a Python library that empowers developers to integrate their applications with the Ocean Protocol ecosystem. With its rich set of functionalities, Ocean.py provides a comprehensive toolkit for interacting with the protocol. Developers and [data scientists](../../data-science/) can leverage Ocean.py to perform a wide range of tasks, including data tokenization, asset management, and smart contract interactions. This library serves as a bridge between Python and the decentralized world of Ocean Protocol, enabling you to harness the power of decentralized data.
#### Middleware components
Additionally, in supporting the discovery process, middleware components come into play:
1. [Aquarius](aquarius/): Aquarius acts as a metadata cache, enhancing search efficiency by caching on-chain data into Elasticsearch. By accelerating metadata retrieval, Aquarius enables faster and more efficient data discovery.
2. [Provider](provider/): The Provider component plays a crucial role in facilitating various operations within the ecosystem. It assists in asset downloading, handles [DDO](ddo-specification.md) (Decentralized Data Object) encryption, and establishes communication with the operator-service for Compute-to-Data jobs. This ensures secure and streamlined interactions between different participants.
3. [Subgraph](subgraph/): The Subgraph is an off-chain service that utilizes GraphQL to offer efficient access to information related to datatokens, users, and balances. By leveraging the subgraph, data retrieval becomes faster compared to an on-chain query. This enhances the overall performance and responsiveness of applications that rely on accessing this information.
1. [Aquarius](../aquarius/): Aquarius acts as a metadata cache, enhancing search efficiency by caching on-chain data into Elasticsearch. By accelerating metadata retrieval, Aquarius enables faster and more efficient data discovery.
2. [Provider](../provider/): The Provider component plays a crucial role in facilitating various operations within the ecosystem. It assists in asset downloading, handles [DDO](../ddo-specification.md) (Decentralized Data Object) encryption, and establishes communication with the operator-service for Compute-to-Data jobs. This ensures secure and streamlined interactions between different participants.
3. [Subgraph](../subgraph/): The Subgraph is an off-chain service that utilizes GraphQL to offer efficient access to information related to datatokens, users, and balances. By leveraging the subgraph, data retrieval becomes faster compared to an on-chain query. This enhances the overall performance and responsiveness of applications that rely on accessing this information.
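For example, reading datatoken information through the Subgraph is a single GraphQL request over HTTP. In the sketch below, the endpoint URL and the entity/field names are assumptions based on a typical deployment; see the Subgraph section for the exact schema and the endpoint for your network.

```python
import requests

# Assumed subgraph endpoint -- pick the deployment that matches your network.
SUBGRAPH_URL = (
    "https://v4.subgraph.mainnet.oceanprotocol.com"
    "/subgraphs/name/oceanprotocol/ocean-subgraph"
)

# Simple GraphQL query; entity and field names may differ in the live schema.
query = """
{
  datatokens(first: 5) {
    id
    name
    symbol
  }
}
"""

response = requests.post(SUBGRAPH_URL, json={"query": query}, timeout=30)
response.raise_for_status()
print(response.json()["data"]["datatokens"])
```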
#### Compute-to-Data
[Compute-to-Data](compute-to-data/) (C2D) represents a groundbreaking paradigm within the Ocean Protocol ecosystem, revolutionizing the way data is processed and analyzed. With C2D, the traditional approach of moving data to the computation is inverted, ensuring privacy and security. Instead, algorithms are securely transported to the data sources, enabling computation to be performed locally, without the need to expose sensitive data. This innovative framework facilitates collaborative data analysis while preserving data privacy, making it ideal for scenarios where data owners want to retain control over their valuable assets. C2D provides a powerful tool for enabling secure and privacy-preserving data analysis and encourages collaboration among data providers, ensuring the utilization of valuable data resources while maintaining strict privacy protocols.
[Compute-to-Data](../compute-to-data/) (C2D) represents a groundbreaking paradigm within the Ocean Protocol ecosystem, revolutionizing the way data is processed and analyzed. With C2D, the traditional approach of moving data to the computation is inverted, ensuring privacy and security. Instead, algorithms are securely transported to the data sources, enabling computation to be performed locally, without the need to expose sensitive data. This innovative framework facilitates collaborative data analysis while preserving data privacy, making it ideal for scenarios where data owners want to retain control over their valuable assets. C2D provides a powerful tool for enabling secure and privacy-preserving data analysis and encourages collaboration among data providers, ensuring the utilization of valuable data resources while maintaining strict privacy protocols.
### Layer 3: The Accessible Application Layer
Here, the ocean comes alive with a vibrant ecosystem of dApps, marketplaces, and more. This layer hosts a variety of user-friendly interfaces, applications, and tools, inviting data scientists and curious explorers alike to access, explore, and contribute to the ocean's treasures.&#x20;
Prominently featured within this layer is [Ocean Market](../user-guides/using-ocean-market.md), a hub where data enthusiasts and industry stakeholders converge to discover, trade, and unlock the inherent value of data assets. Beyond Ocean Market, the Application Layer hosts a diverse ecosystem of specialized applications and marketplaces, each catering to unique use cases and industries. Empowered by the capabilities of Ocean Protocol, these applications facilitate advanced data exploration, analytics, and collaborative ventures, revolutionizing the way data is accessed, shared, and monetized.&#x20;
Prominently featured within this layer is [Ocean Market](../../user-guides/using-ocean-market.md), a hub where data enthusiasts and industry stakeholders converge to discover, trade, and unlock the inherent value of data assets. Beyond Ocean Market, the Application Layer hosts a diverse ecosystem of specialized applications and marketplaces, each catering to unique use cases and industries. Empowered by the capabilities of Ocean Protocol, these applications facilitate advanced data exploration, analytics, and collaborative ventures, revolutionizing the way data is accessed, shared, and monetized.&#x20;
### Layer 4: The Friendly Wallets
At the top of the Ocean Protocol ecosystem, we find the esteemed [Web 3 Wallets](../discover/wallets/), the gateway for users to immerse themselves in the world of decentralized data transactions. These wallets serve as trusted companions, enabling users to seamlessly transact within the ecosystem, purchase and sell data NFTs, and acquire valuable datatokens. For a more detailed exploration of Web 3 Wallets and their capabilities, you can refer to the [wallet intro page](../discover/wallets/).
At the top of the Ocean Protocol ecosystem, we find the esteemed [Web 3 Wallets](../../discover/wallets/), the gateway for users to immerse themselves in the world of decentralized data transactions. These wallets serve as trusted companions, enabling users to seamlessly transact within the ecosystem, purchase and sell data NFTs, and acquire valuable datatokens. For a more detailed exploration of Web 3 Wallets and their capabilities, you can refer to the [wallet intro page](../../discover/wallets/).

View File

@@ -6,12 +6,12 @@ description: Choose the revenue model during asset publishing.
Ocean Protocol offers you flexible and customizable pricing options to monetize your valuable data assets. You have two main pricing models to choose from:&#x20;
* [Fixed pricing ](asset-pricing.md#fixed-pricing)
* [Free pricing](asset-pricing.md#free-pricing)
* [Fixed pricing ](pricing-schemas.md#fixed-pricing)
* [Free pricing](pricing-schemas.md#free-pricing)
These models are designed to cater to your specific needs and ensure a smooth experience for data consumers.
The price of an asset is determined by the number of tokens (this can be Ocean Tokens or any ERC20 token configured when publishing the asset) a buyer must pay to access the data. When users pay the tokens, they get a _datatoken_ in their wallets, a tokenized representation of the access right stored on the blockchain. To read more about datatokens and data NFTs, click [here](contracts/datanft-and-datatoken.md).
The price of an asset is determined by the number of tokens (this can be Ocean Tokens or any ERC20 token configured when publishing the asset) a buyer must pay to access the data. When users pay the tokens, they get a _datatoken_ in their wallets, a tokenized representation of the access right stored on the blockchain. To read more about datatokens and data NFTs, click [here](datanft-and-datatoken.md).
To provide you with even greater flexibility in monetizing your data assets, Ocean Protocol allows you to customize the pricing schema by configuring your own ERC20 token when publishing the asset. This means that instead of using Ocean Tokens as the pricing currency, you can utilize your own token, aligning the pricing structure with your specific requirements and preferences.
@@ -116,13 +116,13 @@ IERC721Template(erc721Address).removeFromCreateERC20List(address(this));
</details>
{% hint style="info" %}
There are two templates available: [ERC20Template](contracts/datatoken-templates.md#regular-template) and [ERC20TemplateEnterprise](contracts/datatoken-templates.md#enterprise-template).
There are two templates available: [ERC20Template](datatoken-templates.md#regular-template) and [ERC20TemplateEnterprise](datatoken-templates.md#enterprise-template).
In the case of [ERC20TemplateEnterprise](contracts/datatoken-templates.md#enterprise-template), when you deploy a fixed rate exchange, the funds generated as revenue are automatically sent to the owner's address. The owner receives the revenue without any manual intervention.
In the case of [ERC20TemplateEnterprise](datatoken-templates.md#enterprise-template), when you deploy a fixed rate exchange, the funds generated as revenue are automatically sent to the owner's address. The owner receives the revenue without any manual intervention.
On the other hand, with [ERC20Template](contracts/datatoken-templates.md#regular-template), for a fixed rate exchange, the revenue is available at the fixed rate exchange level. The owner or the payment collector has the authority to manually retrieve the revenue.
On the other hand, with [ERC20Template](datatoken-templates.md#regular-template), for a fixed rate exchange, the revenue is available at the fixed rate exchange level. The owner or the payment collector has the authority to manually retrieve the revenue.
{% endhint %}
### Free pricing
@@ -182,8 +182,8 @@ function createNftWithErc20WithDispenser(
</details>
To make the most of these pricing models, you can rely on user-friendly libraries such as [Ocean.js](ocean.js/) and [Ocean.py](ocean.py/), specifically developed for interacting with Ocean Protocol.&#x20;
To make the most of these pricing models, you can rely on user-friendly libraries such as [Ocean.js](../ocean.js/) and [Ocean.py](../ocean.py/), specifically developed for interacting with Ocean Protocol.&#x20;
With Ocean.js, you can use the [createFRE()](ocean.js/publish.md) function to effortlessly deploy a data NFT (non-fungible token) and datatoken with a fixed-rate exchange pricing model. Similarly, in Ocean.py, the [create\_url\_asset()](ocean.py/publish-flow.md#create-an-asset-and-pricing-schema-simultaneously) function allows you to create an asset with fixed pricing. These libraries simplify the process of interacting with Ocean Protocol, managing pricing, and handling asset creation.
With Ocean.js, you can use the [createFRE()](../ocean.js/publish.md) function to effortlessly deploy a data NFT (non-fungible token) and datatoken with a fixed-rate exchange pricing model. Similarly, in Ocean.py, the [create\_url\_asset()](../ocean.py/publish-flow.md#create-an-asset-and-pricing-schema-simultaneously) function allows you to create an asset with fixed pricing. These libraries simplify the process of interacting with Ocean Protocol, managing pricing, and handling asset creation.
By taking advantage of Ocean Protocol's pricing options and leveraging the capabilities of [Ocean.js](ocean.js/) and [Ocean.py](ocean.py/) (or by using the [Market](../user-guides/using-ocean-market.md)), you can effectively monetize your data assets while ensuring transparent and seamless access for data consumers.&#x20;
By taking advantage of Ocean Protocol's pricing options and leveraging the capabilities of [Ocean.js](../ocean.js/) and [Ocean.py](../ocean.py/) (or by using the [Market](../../user-guides/using-ocean-market.md)), you can effectively monetize your data assets while ensuring transparent and seamless access for data consumers.&#x20;
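As a rough sketch of the ocean.py flow for fixed pricing (import paths, function signatures, and the network name follow the ocean.py v4 quickstarts from memory and may differ in your installed version, so treat the publish flow page as authoritative):

```python
# Minimal sketch, assuming ocean.py v4+, a funded account, and that brownie
# is already connected to the target network. Imports and call signatures
# are assumptions -- verify them against the ocean.py docs you use.
import os
from brownie.network import accounts
from ocean_lib.example_config import get_config_dict
from ocean_lib.ocean.ocean import Ocean
from ocean_lib.ocean.util import to_wei

config = get_config_dict("mumbai")              # pick your target network
ocean = Ocean(config)

accounts.clear()
alice = accounts.add(os.getenv("PRIVATE_KEY"))  # the publisher's account

# Publish a URL asset: creates the data NFT, datatoken, and DDO in one call.
name, url = "Spanish almond production statistics", "https://example.com/almonds.csv"
data_nft, datatoken, ddo = ocean.assets.create_url_asset(name, url, {"from": alice})

# Fixed pricing: a fixed-rate exchange selling 1 datatoken for 3 OCEAN.
exchange = datatoken.create_exchange({"from": alice}, to_wei(3), ocean.OCEAN_address)
print(ddo.did)
```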

View File

@@ -1,21 +1,20 @@
---
title: Fine-Grained Permissions
description: Fine-Grained Permissions Using Role-Based Access Control. You can Control who can publish, buy or browse data
description: >-
Fine-Grained Permissions Using Role-Based Access Control. You can Control who
can publish, buy or browse data
---
# Fine-Grained Permissions
A large part of Ocean is about access control, which is primarily handled by datatokens. Users can access a resource (e.g. a file) by redeeming datatokens for that resource. We recognize that enterprises and other users often need more precise ways to specify and manage access, and we have introduced fine-grained permissions for these use cases.
Fine-grained permissions mean that access can be controlled precisely at two levels:
A large part of Ocean is about access control, which is primarily handled by datatokens. Users can access a resource (e.g. a file) by redeeming datatokens for that resource. We recognize that enterprises and other users often need more precise ways to specify and manage access, and we have introduced fine-grained permissions for these use cases. Fine-grained permissions mean that access can be controlled precisely at two levels:
- [Marketplace-level permissions](./market-level-permissions) for browsing, downloading or publishing within a marketplace frontend.
- [Asset-level permissions](./asset-level-permissions) on downloading a specific asset.
* [Marketplace-level permissions](fg-permissions.md#market-level-permissions) for browsing, downloading or publishing within a marketplace frontend.
* [Asset-level permissions](fg-permissions.md#asset-level-restrictions) on downloading a specific asset.
The fine-grained permissions features are designed to work in forks of Ocean Market. We have not enabled them in Ocean Market itself, to keep Ocean Market open for everyone to use. On the front end, the permissions features are easily enabled by setting environment variables.
## Introduction
### Introduction
Some datasets need to be restricted to appropriately credentialed users. In this situation there is tension:
@@ -30,54 +29,50 @@ We can port this model into Ocean, where (a) is a datatoken, and (b) is a creden
The credential-based restrictions are implemented in two ways: at the market level and at the asset level. Access to the market is restricted on a role basis; the user's identity is attached to a role via the role-based access control (RBAC) server. Access to individual assets is restricted via allow and deny lists, which list the Ethereum addresses of the users who can and cannot access the asset within the DDO.
## Asset-Level Restrictions
### Asset-Level Restrictions
For asset-level restrictions Ocean supports allow and deny lists. Allow and deny lists are advanced features that allow publishers to control access to individual data assets. Publishers can restrict assets so that they can only be accessed by approved users (allow lists) or they can restrict assets so that they can be accessed by anyone except certain users (deny lists).
When an allow-list is in place, a consumer can only access the resource if they have a datatoken and one of the credentials in the "allow" list of the DDO. Ocean also has complementary deny functionality: if a consumer is on the "deny" list, they will not be allowed to access the resource.
Initially, the only credential supported is Ethereum public addresses. To be fair, it's more a pointer to an individual than a credential, but it has a low-complexity implementation, so it makes a good starting point. For extensibility, the Ocean metadata schema enables specification of other types of credentials like W3C Verifiable Credentials and more. When this gets implemented, asset-level permissions will be properly RBAC too.
Since asset-level permissions are in the DDO, and the DDO is controlled by the publisher, asset-level restrictions are controlled by the publisher.
Initially, the only credential supported is Ethereum public addresses. To be fair, it's more a pointer to an individual than a credential, but it has a low-complexity implementation, so it makes a good starting point. For extensibility, the Ocean metadata schema enables specification of other types of credentials like W3C Verifiable Credentials and more. When this gets implemented, asset-level permissions will be properly RBAC too. Since asset-level permissions are in the DDO, and the DDO is controlled by the publisher, asset-level restrictions are controlled by the publisher.
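A minimal sketch of the allow/deny check, using a `credentials` object shaped like the one in Ocean DDOs (confirm the exact field names against the DDO specification):

```python
def is_allowed(ddo_credentials, consumer_address):
    """Return True if `consumer_address` passes the DDO's allow/deny lists."""
    address = consumer_address.lower()

    def addresses(rule_list):
        # Each rule is assumed to look like {"type": "address", "values": [...]}.
        return {
            value.lower()
            for rule in rule_list
            if rule.get("type") == "address"
            for value in rule.get("values", [])
        }

    if address in addresses(ddo_credentials.get("deny", [])):
        return False                     # explicitly denied
    allow = addresses(ddo_credentials.get("allow", []))
    if allow and address not in allow:
        return False                     # allow-list present and consumer not on it
    return True                          # no restriction applies


credentials = {
    "allow": [{"type": "address", "values": ["0xAbC0000000000000000000000000000000000001"]}],
    "deny": [{"type": "address", "values": ["0xBad0000000000000000000000000000000000002"]}],
}
print(is_allowed(credentials, "0xabc0000000000000000000000000000000000001"))  # True
```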
## Market-Level Permissions
### Market-Level Permissions
For market-level permissions, Ocean implements a role-based access control server (RBAC server). It implements restrictions at the user level, based on the user's role (credentials). The RBAC server is run & controlled by the marketplace owner. Therefore, permissions at this level are at the discretion of the marketplace owner.
The RBAC server is the primary mechanism for restricting your users' ability to publish, buy, or browse assets in the market.
### Roles
#### Roles
The RBAC server defines four different roles:
- Admin
- Publisher
- Consumer
- User
* Admin
* Publisher
* Consumer
* User
#### Admin/ Publisher
**Admin/ Publisher**
Currently users with either the admin or publisher roles will be able to use the Market without any restrictions. They can publish, buy and browse datasets.
#### Consumer
**Consumer**
A user with the consumer role is able to browse datasets, purchase them, trade datatokens, and contribute to data pools. However, they are not able to publish datasets.
#### Users
**Users**
Users are able to browse and search datasets but they are not able to purchase datasets, trade datatokens, or contribute to data pools. They are also not able to publish datasets.
#### Address without a role
**Address without a role**
If a user attempts to view the data market without a role, or without a wallet connected, they will not be able to view or search any of the datasets.
#### No wallet connected
**No wallet connected**
When the RBAC server is enabled on the market, users are required to have a wallet connected to browse the datasets.
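Summarizing the roles above as code (this is just a restatement of the descriptions, not the RBAC server's internal data model):

```python
# Which actions each RBAC role may perform, as described above.
ROLE_CAPABILITIES = {
    "admin":     {"browse", "consume", "publish"},
    "publisher": {"browse", "consume", "publish"},
    "consumer":  {"browse", "consume"},
    "user":      {"browse"},
    None:        set(),   # no role / no wallet connected: nothing is visible
}

def may(role, action):
    return action in ROLE_CAPABILITIES.get(role, set())

print(may("consumer", "publish"))  # False
print(may("user", "browse"))       # True
```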
### Mapping roles to addresses
#### Mapping roles to addresses
Currently, there are two ways that the RBAC server can be configured to map user roles to Ethereum addresses. The RBAC server is also built in such a way that it is easy for you to add your own authorization service. The two existing methods are:
@@ -91,11 +86,11 @@ Alternatively, if you are not already using Keycloak, the easiest way to map use
You can configure both of these methods of mapping user roles to Ethereum addresses at the same time. In this case, requests to your RBAC server should specify which auth service they are using, e.g. `"authService": "json"` or `"authService": "keycloak"`.
#### Default Auth service
**Default Auth service**
Additionally, you can set an environment variable within the RBAC server that specifies the default authorization method, e.g. `DEFAULT_AUTH_SERVICE = "json"`. When this variable is specified, requests sent to your RBAC server don't need to include an `authService` and they will automatically use the default authorization method.
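For illustration, a permission check against a running RBAC server could look like the sketch below. Apart from the `authService` field discussed above and the `http://localhost:49160` URL used in the Docker setup further down, the payload fields are assumptions; consult the RBAC server's own documentation for the exact request schema.

```python
import requests

# Assumed request shape -- only `authService` comes from the text above; the
# other fields are illustrative and may differ from the real RBAC API.
RBAC_URL = "http://localhost:49160"   # URL used in the Docker setup below

payload = {
    "eventType": "publish",           # action being checked (assumed field name)
    "authService": "json",            # or "keycloak"
    "credentials": {                  # assumed shape of the credentials object
        "type": "address",
        "value": "0x0000000000000000000000000000000000000000",
    },
}

response = requests.post(RBAC_URL, json=payload, timeout=10)
print(response.status_code, response.text)   # expect a permission verdict
```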
### Running the RBAC server locally
#### Running the RBAC server locally
You can start running the RBAC server by following these steps:
@@ -124,11 +119,11 @@ npm run build
npm run start
```
### Running the RBAC in Docker
#### Running the RBAC in Docker
When you are ready to deploy the RBAC server:
1. Replace the KEYCLOAK_URL in the Dockerfile with the correct URL for your hosting of [Keycloak](https://www.keycloak.org/).
1. Replace the KEYCLOAK\_URL in the Dockerfile with the correct URL for your hosting of [Keycloak](https://www.keycloak.org/).
2. Run the following command to build the RBAC service in a Docker container:
```Bash
@@ -142,4 +137,3 @@ npm run start:docker
```
4. Now you are ready to send requests to the RBAC server via Postman. Make sure to replace the URL with `http://localhost:49160` in your requests.

View File

@@ -1,8 +1,12 @@
---
description: How can you enhance data discovery?
---
# Metadata
Imagine you're searching for data on Spanish almond production within a dApp operating within the Ocean ecosystem, managed by a European fruit and nut association. This hypothetical dApp may host a vast collection of datasets, making it essential to have a way to identify the relevant ones. One effective approach is to have **metadata** associated with each dataset to serve as valuable information about the data itself.
Imagine you're searching for data on Spanish almond production within a dApp operating within the Ocean ecosystem, managed by a European fruit and nut association. This hypothetical dApp may host a vast collection of datasets, making it essential to have a way to identify the relevant ones. One effective approach is to have metadata associated with each dataset to serve as valuable information about the data itself.
<figure><img src="../.gitbook/assets/data_everywhere.gif" alt=""><figcaption><p>Data discovery</p></figcaption></figure>
Metadata plays a **crucial role** in asset **discovery**, providing essential information such as **asset type, name, creation date, and licensing details**. Each data asset can have a [decentralized identifier (DID)](identifiers.md) that resolves to a DID document ([DDO](ddo-specification.md)) containing the associated metadata. The DDO is essentially a [JSON](https://www.json.org/) document that fills in the metadata fields. To understand how to work with Ocean DIDs, you can refer to the [DID documentation](identifiers.md). For a more comprehensive understanding of the metadata structure, the [DDO Specification](ddo-specification.md) documentation provides in-depth information.
@@ -13,7 +17,6 @@ In general, any dApp within the Ocean ecosystem is required to store metadata fo
* **datePublished**, e.g. “2022-11-10T12:32:15Z”
* **author**, e.g. “Spanish Almond Board”
* **license**, e.g. “SAB Data License v4”
* **price**, e.g. “0”
* technical information about the **files**, such as the content type.
Other metadata might also be available. For example:
@@ -23,70 +26,55 @@ Other metadata might also be available. For example:
* **description**, e.g. “2002 Italian almond production statistics for 14 varieties and 20 regions.”
* **additionalInformation** can be used to store any other facts about the asset.
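Put together, the metadata for the almond example could look like the JSON-style object below (shown as a Python dict; the field names follow the lists above, and the authoritative set of attributes is defined in the DDO specification):

```python
import json

metadata = {
    "type": "dataset",
    "name": "Spanish almond production statistics",
    "dateCreated": "2022-11-10T12:32:15Z",
    "datePublished": "2022-11-10T12:32:15Z",
    "author": "Spanish Almond Board",
    "license": "SAB Data License v4",
    "description": "Almond production statistics by variety and region.",
    "additionalInformation": {"varieties": 14, "regions": 20},
}

print(json.dumps(metadata, indent=2))
```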
### **Overview**
DIDs and DDOs follow the [specification defined by the World Wide Web Consortium (W3C)](https://w3c-ccg.github.io/did-spec/).
```solidity
/**
* @dev setMetaData
* Creates or update Metadata for Aqua(emit event)
Also, updates the METADATA_DECRYPTOR key
* @param _metaDataState metadata state
* @param _metaDataDecryptorUrl decryptor URL
* @param _metaDataDecryptorAddress decryptor public key
* @param flags flags used by Aquarius
* @param data data used by Aquarius
* @param _metaDataHash hash of clear data (before the encryption, if any)
* @param _metadataProofs optional signatures of entities who validated data (before the encryption, if any)
*/
function set metadata(uint8 _metaDataState, string calldata _metaDataDecryptorUrl
, string calldata _metaDataDecryptorAddress, bytes calldata flags,
bytes calldata data,bytes32 _metaDataHash, metaDataProof[] memory _metadataProofs) external {
require(
permissions[msg.sender].updateMetadata,
"ERC721Template: NOT METADATA_ROLE"
);
_setMetaData(_metaDataState, _metaDataDecryptorUrl, _metaDataDecryptorAddress,flags,
data,_metaDataHash, _metadataProofs);
}TODO: Add information ab
```
### Overview
This document describes how Ocean assets follow the DID/DDO specification, such that Ocean assets can inherit DID/DDO benefits and enhance interoperability. DIDs and DDOs follow the [specification defined by the World Wide Web Consortium (W3C)](https://w3c-ccg.github.io/did-spec/).
**Decentralized identifiers** (DIDs) are a type of identifier that enable verifiable, decentralized digital identity. Each DID is associated with a unique entity, and DIDs may represent humans, objects, and more. A **DID Document** (DDO) is a JSON blob that holds information about the DID. Given a DID, a _resolver_ will return the DDO of that DID.
Decentralized identifiers (DIDs) are a type of identifier that enable verifiable, decentralized digital identity. Each DID is associated with a unique entity, and DIDs may represent humans, objects, and more.
A DID Document (DDO) is a JSON blob that holds information about the DID. Given a DID, a _resolver_ will return the DDO of that DID.
#### Rules for DID & DDO
An _asset_ in Ocean represents a downloadable file, compute service, or similar. Each asset is a _resource_ under the control of a _publisher_. The Ocean network itself does _not_ store the actual resource (e.g. files).
An _asset_ has a DID and DDO. The DDO should include [metadata](did-ddo.md#metadata) about the asset, and define access in at least one [service](did-ddo.md#services). Only _owners_ or _delegated users_ can modify the DDO.
All DDOs are stored on-chain in encrypted form to be fully GDPR-compatible. A metadata cache like _Aquarius_ can help in reading, decrypting, and searching through encrypted DDO data from the chain. Because the file URLs are encrypted on top of the full DDO encryption, returning unencrypted DDOs e.g. via an API is safe to do as the file URLs will still stay encrypted.
All DDOs are stored on-chain in encrypted form to be fully GDPR-compatible. A metadata cache like [_Aquarius_](aquarius/) can help in reading, decrypting, and searching through encrypted DDO data from the chain. Because the file URLs are encrypted on top of the full DDO encryption, returning unencrypted DDOs e.g. via an API is safe to do as the file URLs will still stay encrypted.
#### Publishing & Retrieving DDOs
The DDO is stored on-chain as part of the NFT contract, in encrypted form using the private key of the _Provider_. To resolve it, a metadata cache like _Aquarius_ must query the Provider to decrypt the DDO.
The DDO is stored on-chain as part of the NFT contract, in encrypted form using the private key of the [_Provider_](provider/). To resolve it, a metadata cache like [_Aquarius_](aquarius/) must query the Provider to decrypt the DDO.
Here is the flow:
<figure><img src="../.gitbook/assets/DDO Flow.jpg" alt=""><figcaption><p>DDO Flow</p></figcaption></figure>
To set up the metadata for an asset, you'll need to call the [**setMetaData**](https://github.com/oceanprotocol/contracts/blob/9e29194d910f28a4f0ef17ce6dc8a70741f63309/contracts/templates/ERC721Template.sol#L247) function at the contract level.&#x20;
* [**\_metaDataState**](ddo-specification.md#state) - Each asset has a state, which is held by the NFT contract. One of the following: Active(0), End-of-life(1), Deprecated(2), Revoked(3)....
* **\_metaDataDecryptorUrl** - You create the DDO and then the Provider encrypts it with its private key. Only that Provider can decrypt it.
* **\_metaDataDecryptorAddress** - The decryptor address.
* **flags** - Additional information to represent the state of the data. One of the following values: 0 - plain text, 1 - compressed, 2 - encrypted. Used by Aquarius.
* **data -** The [DDO](ddo-specification.md) of the asset. You create the DDO as a JSON, send it to the [Provider](provider/) that encrypts it, and then you set it up at the contract level.
* **\_metaDataHash** - Hash of the clear data **generated before the encryption.** It is used by the Provider to check the validity of the data after decryption.
* **\_metadataProofs** - Array with signatures of entities who validated data (before the encryption). Pass an empty array if you don't have any.
```solidity
function setMetaData(uint8 _metaDataState, string calldata _metaDataDecryptorUrl
, string calldata _metaDataDecryptorAddress, bytes calldata flags,
bytes calldata data,bytes32 _metaDataHash, metaDataProof[] memory _metadataProofs) external {
require(
permissions[msg.sender].updateMetadata,
"ERC721Template: NOT METADATA_ROLE"
);
_setMetaData(_metaDataState, _metaDataDecryptorUrl, _metaDataDecryptorAddress,flags,
data,_metaDataHash, _metadataProofs);
}
```
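A sketch of calling `setMetaData` from Python with web3.py, matching the Solidity signature above. The RPC URL, ABI file path, and account handling are assumptions; in practice you first send the plain DDO to your Provider, which returns the encrypted bytes used as `data`.

```python
import json
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("http://localhost:8545"))        # assumed RPC endpoint
nft_address = "0x0000000000000000000000000000000000000000"   # your data NFT address

# Assumes the ERC721Template ABI is available locally.
with open("ERC721Template.json") as f:
    abi = json.load(f)["abi"]
nft = w3.eth.contract(address=nft_address, abi=abi)

encrypted_ddo = b"..."          # DDO bytes as returned by the Provider's encrypt step
ddo_hash = Web3.keccak(b"...")  # hash of the plain DDO, computed before encryption

tx = nft.functions.setMetaData(
    0,                                             # _metaDataState: 0 = Active
    "https://provider.example.com",                # _metaDataDecryptorUrl: your Provider
    "0x0000000000000000000000000000000000000000",  # _metaDataDecryptorAddress
    b"\x02",                                       # flags: 2 = encrypted
    encrypted_ddo,                                 # data: the encrypted DDO
    ddo_hash,                                      # _metaDataHash
    [],                                            # _metadataProofs: none
).transact({"from": w3.eth.accounts[0]})           # caller needs the metadata-update role
print(tx.hex())
```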
* add details of the parameters of the setmetadata function
* when you make the ddo, you build it as json then the provider encrypts it
* Flags - info if it is encrypted or not.&#x20;
* We use a certain ddo structure but nobody stops you to do your own ddo that has the structure you want. You need your own aqua
* example on how to call it from ocean.py and ocean.js
{% hint style="info" %}
While we utilize a specific DDO structure, you have the flexibility to customize it according to your unique requirements. However, to enable seamless processing, it is essential to have your own Aquarius instance that can handle your modified DDO.
{% endhint %}
You'll find more information about DIDs on the [Identifiers](identifiers.md) page.

View File

@@ -1,6 +1,6 @@
# Publish
This tutorial guides you through the process of creating your own data NFT and a datatoken using Ocean libraries. To learn more about data NFTs and datatokens, please refer to [this page](../contracts/datanft-and-datatoken.md). Ocean Protocol supports different pricing schemes which can be set while publishing an asset. Please refer to [this page](../asset-pricing.md) for more details on pricing schemes.
This tutorial guides you through the process of creating your own data NFT and a datatoken using Ocean libraries. To learn more about data NFTs and datatokens, please refer to [this page](../contracts/datanft-and-datatoken.md). Ocean Protocol supports different pricing schemes which can be set while publishing an asset. Please refer to [this page](../contracts/pricing-schemas.md) for more details on pricing schemes.
#### Prerequisites

View File

@@ -4,7 +4,7 @@ description: 'Discover the World of NFTs: Retrieving a List of Fixed-rate exchan
# Get fixed-rate exchanges
Having gained knowledge about fetching lists of data NFTs and datatokens and extracting specific information about each, let's now explore the process of retrieving the information of fixed-rate exchanges. A fixed-rate exchange refers to a mechanism where data assets can be traded at a predetermined rate or price. These exchanges offer stability and predictability in data transactions, enabling users to securely and reliably exchange data assets based on fixed rates. If you need a refresher on fixed-rate exchanges, visit the [asset pricing](../asset-pricing.md#fixed-pricing) page.
Having gained knowledge about fetching lists of data NFTs and datatokens and extracting specific information about each, let's now explore the process of retrieving the information of fixed-rate exchanges. A fixed-rate exchange refers to a mechanism where data assets can be traded at a predetermined rate or price. These exchanges offer stability and predictability in data transactions, enabling users to securely and reliably exchange data assets based on fixed rates. If you need a refresher on fixed-rate exchanges, visit the [asset pricing](../contracts/pricing-schemas.md#fixed-pricing) page.