
Merge pull request #1001 from oceanprotocol/issue-960-change-wordings

Issue 960 change wordings
Akshay 2022-05-26 16:54:16 +02:00 committed by GitHub
commit e275e948b7
7 changed files with 19 additions and 19 deletions

View File

@@ -11,7 +11,7 @@ Here is the Ocean architecture.
Here's an overview of the figure.
-- The top layer is **applications** like Ocean Market. With these apps, users can onboard services like data, algorithms, compute-to-data into crypto (publish and mint data NFTs and datatokens), hold datatokens as assets (data wallets), discover assets, and buy/sell datatokens for a fixed or auto-determined price (data marketplaces), and consume data services (consume datatokens).
+- The top layer is **applications** like Ocean Market. With these apps, users can onboard services like data, algorithms, compute-to-data into crypto (publish and mint data NFTs and datatokens), hold datatokens as assets (data wallets), discover assets, and buy/sell datatokens for a fixed or auto-determined price (data marketplaces), and use data services (spend datatokens).
- Below are **libraries** used by the applications: Ocean.js (JavaScript library) and Ocean.py (Python library). This also includes middleware to assist discovery:
- **Aquarius**: Provides a metadata cache for faster search by caching on-chain data into Elasticsearch.
- **Provider**: Facilitates downloading assets, DDO encryption, and communicating with `operator-service` for Compute-to-Data jobs.
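For orientation, here is a minimal sketch of how an application might query an Aquarius metadata cache for an asset's DDO. The host and endpoint path are assumptions made for illustration; check the Aquarius deployment you target rather than treating this as its documented API.

```typescript
// Minimal sketch: look up an asset's DDO in an Aquarius metadata cache.
// The host and endpoint path below are assumptions, not a guaranteed API.
const AQUARIUS_URL = "https://aquarius.example.org"; // placeholder Aquarius host

async function fetchDdo(did: string): Promise<unknown> {
  // Assumed endpoint shape: /api/aquarius/assets/ddo/<did>
  const res = await fetch(`${AQUARIUS_URL}/api/aquarius/assets/ddo/${did}`);
  if (!res.ok) throw new Error(`Aquarius returned HTTP ${res.status}`);
  return res.json();
}

// Example usage: fetchDdo("did:op:0123abcd").then((ddo) => console.log(ddo));
```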
@@ -23,19 +23,19 @@ Here's an overview of the figure.
Data NFTs are based on the [ERC721](https://eips.ethereum.org/EIPS/eip-721) standard. The publisher can use a marketplace or the client libraries to deploy a new data NFT contract. To save gas fees, it uses the [ERC1167](https://eips.ethereum.org/EIPS/eip-1167) proxy approach on the **ERC721 template**. The publisher can then assign the manager role to other Ethereum addresses, who can deploy new datatoken contracts and even mint them. Each datatoken contract is associated with one data NFT contract.
Click [here](/concepts/datanft-and-datatoken/) to read more about data NFTs and datatokens.
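To make the proxy approach concrete, the sketch below assembles the standard ERC1167 minimal-proxy creation code for a given template address. The template address is a placeholder; in practice the Ocean factory contracts perform this cloning on-chain, so this is only meant to show why each new clone is cheap to deploy.

```typescript
// Sketch only: assembling ERC1167 minimal-proxy creation code for a template.
// The template address is a placeholder; Ocean's factories do this on-chain.
const TEMPLATE = "0x1111111111111111111111111111111111111111"; // placeholder

function erc1167CreationCode(template: string): string {
  const addr = template.toLowerCase().replace(/^0x/, "");
  if (addr.length !== 40) throw new Error("expected a 20-byte address");
  // Standard EIP-1167 bytecode, with the implementation address spliced in.
  return (
    "0x3d602d80600a3d3981f3" +        // init code: copies the runtime code and returns it
    "363d3d373d3d3d363d73" + addr +   // runtime code: forwards every call via delegatecall
    "5af43d82803e903d91602b57fd5bf3"  // performs the delegatecall, then returns or reverts
  );
}

console.log(erc1167CreationCode(TEMPLATE));
```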
-ERC721 data NFTs represent holding copyright/base IP of a data asset, and ERC20 datatokens represent licenses to consume the data asset.
+ERC721 data NFTs represent holding copyright/base IP of a data asset, and ERC20 datatokens represent licenses to access the asset by downloading the content or running Compute-to-Data jobs.
A datatoken represents the asset that the publisher wants to monetize. The asset can be a dataset or an algorithm. The publisher holds the asset in Google Drive, Dropbox, AWS S3, on their phone, on their home server, etc. The publisher can optionally use IPFS for a content-addressable URL. Or, instead of a file, the publisher may run a compute-to-data service.
In the **publish** step, the publisher invokes the **Ocean Datatoken Factory** to deploy a new datatoken to the chain. To save gas fees, it uses the [ERC1167](https://eips.ethereum.org/EIPS/eip-1167) proxy approach on the **ERC20 datatoken template**. The publisher then mints datatokens.
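As a rough illustration of the mint step (a sketch, not the ocean.js/ocean.py API), the snippet below assumes an already-deployed datatoken exposing a `mint(address,uint256)` function; the RPC URL, addresses, and key handling are placeholders.

```typescript
// Hedged sketch of the publisher minting datatokens after the factory deploy.
// Addresses, RPC URL, and key are placeholders; the mint(address,uint256)
// signature is an assumption about the deployed datatoken template.
import { ethers } from "ethers";

const RPC_URL = "https://polygon-rpc.com";                                // placeholder RPC endpoint
const DATATOKEN_ADDRESS = "0x2222222222222222222222222222222222222222";  // placeholder datatoken
const MINT_ABI = ["function mint(address account, uint256 value)"];

async function mintDatatokens(publisherKey: string, to: string, amount: string) {
  const provider = new ethers.providers.JsonRpcProvider(RPC_URL);
  const publisher = new ethers.Wallet(publisherKey, provider);
  const datatoken = new ethers.Contract(DATATOKEN_ADDRESS, MINT_ABI, publisher);
  // Datatokens use 18 decimals, so amount "100" mints 100.0 tokens.
  const tx = await datatoken.mint(to, ethers.utils.parseUnits(amount, 18));
  await tx.wait();
}
```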
-The publisher runs their own **Ocean Provider** or can use one deployed by Ocean Protocol. In the **consume** step, Provider software needs to retrieve the data service URL given a datatoken address. One approach would be for the publisher to run a database. However, this adds another dependency. To avoid this, the Provider encrypts the URL on-chain.
+The publisher runs their own **Ocean Provider** or can use one deployed by Ocean Protocol. In the **download** step or while running a C2D job, the Provider software needs to retrieve the data service URL given a datatoken address. One approach would be for the publisher to run a database. However, this adds another dependency. To avoid this, the Provider encrypts the URL, which then gets published on-chain.
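The exact scheme Provider uses is an implementation detail; purely to illustrate the idea that only an encrypted URL ends up on-chain, here is a self-contained AES-256-GCM sketch using Node's built-in crypto module, with a locally generated key standing in for the Provider's key material.

```typescript
// Illustrative only: encrypt a service URL so that only ciphertext needs to be
// published on-chain. This is NOT Provider's actual scheme, just the concept.
import { randomBytes, createCipheriv, createDecipheriv } from "crypto";

const key = randomBytes(32); // stands in for the Provider's key material
const url = "https://example.org/datasets/weather-2022.csv"; // placeholder URL

function encryptUrl(plain: string): { iv: string; tag: string; data: string } {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plain, "utf8"), cipher.final()]);
  return { iv: iv.toString("hex"), tag: cipher.getAuthTag().toString("hex"), data: data.toString("hex") };
}

function decryptUrl(enc: { iv: string; tag: string; data: string }): string {
  const decipher = createDecipheriv("aes-256-gcm", key, Buffer.from(enc.iv, "hex"));
  decipher.setAuthTag(Buffer.from(enc.tag, "hex"));
  return Buffer.concat([decipher.update(Buffer.from(enc.data, "hex")), decipher.final()]).toString("utf8");
}

const onChainBlob = encryptUrl(url);  // this ciphertext is what gets published
console.log(decryptUrl(onChainBlob)); // only the key holder recovers the URL
```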
-To initiate the **consume** step, the data consumer sends 1.0 datatokens to the Provider wallet. Then they make a service request to the Provider. The Provider loads the encrypted URL, decrypts it, and provisions the requested service (send static data, or enable a compute-to-data job).
+To initiate the **download** step, the data buyer sends 1.0 datatokens to the Provider wallet. Then they make a service request to the Provider. The Provider loads the encrypted URL, decrypts it, and provisions the requested service (send static data, or enable a compute-to-data job).
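Roughly, the buyer-side flow looks like the sketch below: an ERC20 transfer of exactly 1.0 datatokens, followed by a request to the Provider. The Provider endpoint and query parameters are assumptions made for illustration, not the documented Provider API.

```typescript
// Hedged sketch of the buyer side: pay 1.0 datatokens, then request the service.
// The Provider endpoint and query parameters below are assumptions.
import { ethers } from "ethers";

const PROVIDER_URL = "https://provider.example.org";                      // placeholder Provider
const DATATOKEN = "0x2222222222222222222222222222222222222222";          // placeholder datatoken
const PROVIDER_WALLET = "0x3333333333333333333333333333333333333333";    // placeholder wallet
const ERC20_ABI = ["function transfer(address to, uint256 amount) returns (bool)"];

async function buyAndDownload(buyerKey: string, did: string) {
  const provider = new ethers.providers.JsonRpcProvider("https://polygon-rpc.com");
  const buyer = new ethers.Wallet(buyerKey, provider);
  const datatoken = new ethers.Contract(DATATOKEN, ERC20_ABI, buyer);

  // Step 1: send exactly 1.0 datatokens (18 decimals) to the Provider wallet.
  const tx = await datatoken.transfer(PROVIDER_WALLET, ethers.utils.parseUnits("1.0", 18));
  await tx.wait();

  // Step 2: ask the Provider for the service, referencing the payment tx.
  const res = await fetch(`${PROVIDER_URL}/download?did=${did}&txId=${tx.hash}`);
  if (!res.ok) throw new Error(`Provider refused: ${res.status}`);
  return res.arrayBuffer(); // the static data, or a handle for a compute-to-data job
}
```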
Instead of running a Provider themselves, the publisher can have a 3rd party like Ocean Market run it. While more convenient, it means that the 3rd party has custody of the private encryption/decryption key (more centralized). Ocean will support more service types and URL custody options in the future.
-**Ocean JavaScript and Python libraries** act as drivers for the lower-level contracts. Each library integrates with Ocean Provider to provision & consume data services, and Ocean Aquarius for metadata.
+**Ocean JavaScript and Python libraries** act as drivers for the lower-level contracts. Each library integrates with Ocean Provider to provision & access data services, and Ocean Aquarius for metadata.
<repo name="provider"></repo>
<repo name="ocean.js"></repo>
@@ -77,4 +77,4 @@ The ERC20 nature of datatokens eases composability with other Ethereum tools and
## Actor Identities
-Actors like data providers and consumers have Ethereum addresses, aka web3 accounts. These are managed by crypto wallets, as one would expect. For most use cases, this is all that's needed. There are cases where the Ocean community could layer on protocols like [Verifiable Credentials](https://www.w3.org/TR/vc-data-model/) or tools like [3Box](https://3box.io/).
+Actors like data providers and buyers have Ethereum addresses, aka web3 accounts. These are managed by crypto wallets, as one would expect. For most use cases, this is all that's needed. There are cases where the Ocean community could layer on protocols like [Verifiable Credentials](https://www.w3.org/TR/vc-data-model/) or tools like [3Box](https://3box.io/).

View File

@@ -1,11 +1,11 @@
---
title: Data NFTs and Datatokens
-description: In Ocean Protocol, ERC721 data NFTs represent holding copyright/base IP of a data asset, and ERC20 datatokens represent licenses to consume the assets.
+description: In Ocean Protocol, ERC721 data NFTs represent holding copyright/base IP of a data asset, and ERC20 datatokens represent licenses to access the assets.
---
A non-fungible token stored on the blockchain represents a unique asset. NFTs can represent images, videos, digital art, or any piece of information. NFTs can be traded, and they allow the transfer of copyright/base IP. [EIP-721](https://eips.ethereum.org/EIPS/eip-721) defines an interface for handling NFTs on EVM-compatible blockchains. The creator of the NFT can deploy a new contract on Ethereum, or on any blockchain supporting an NFT-compatible interface, and can transfer ownership of the copyright/base IP through transfer transactions.
-Fungible tokens represent fungible assets. If you have 5 ETH and Alice has 5 ETH, you and Alice could swap your ETH and your final holdings remain the same. They're apples-to-apples. Licenses (contracts) to consume a copyrighted asset are naturally fungible - they can be swapped with each other.
+Fungible tokens represent fungible assets. If you have 5 ETH and Alice has 5 ETH, you and Alice could swap your ETH and your final holdings remain the same. They're apples-to-apples. Licenses (contracts) to access a copyrighted asset are naturally fungible - they can be swapped with each other.
![Data NFT and datatoken](images/datanft-and-datatoken.png)
@@ -14,7 +14,7 @@ Fungible tokens represent fungible assets. If you have 5 ETH and Alice has 5 ET
The image above describes how ERC721 data NFTs, ERC20 datatokens, and AMMs relate.
- Bottom: The publisher deploys an ERC721 data NFT contract representing the base IP for the data asset. They are now the manager of the data NFT.
-- Middle: The manager then deploys an ERC20 datatoken contract against the data NFT. The ERC20 represents a license with specific terms like "can consume for the next 3 days". They could even publish further ERC20 datatoken contracts, to represent different license terms or for compute-to-data.
+- Middle: The manager then deploys an ERC20 datatoken contract against the data NFT. The ERC20 represents a license with specific terms like "can download for the next 3 days". They could even publish further ERC20 datatoken contracts, to represent different license terms or for compute-to-data.
- Top: The manager then deploys a pool of the datatoken and OCEAN (or H2O), adds initial liquidity, and receives ERC20 pool tokens in return. Others may also add liquidity to receive pool tokens, i.e. become liquidity providers (LPs).
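To see how the pool in the top layer turns liquidity into a datatoken price, here is a toy constant-product sketch. Ocean's actual pools may use different (for example, weighted) math, so treat the formulas and numbers as illustrative only.

```typescript
// Illustration only: how an AMM derives a datatoken price from pool reserves.
// A simple constant-product (x * y = k) pool is assumed; real pools may differ.
function spotPriceInOcean(oceanReserve: number, datatokenReserve: number): number {
  return oceanReserve / datatokenReserve;
}

function buyDatatokens(oceanIn: number, oceanReserve: number, datatokenReserve: number): number {
  // Constant product: (oceanReserve + oceanIn) * (datatokenReserve - out) = k
  const k = oceanReserve * datatokenReserve;
  return datatokenReserve - k / (oceanReserve + oceanIn);
}

// A pool seeded with 5,000 OCEAN and 100 datatokens prices 1 datatoken at 50 OCEAN.
console.log(spotPriceInOcean(5000, 100));   // 50
console.log(buyDatatokens(500, 5000, 100)); // about 9.09 datatokens received for 500 OCEAN
```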
## Terminology
@@ -38,7 +38,7 @@ ERC721 tokens are non-fungible, thus cannot be used for automatic price discover
Here's an example.
- In step 1, Alice **publishes** her dataset with Ocean: this means deploying an ERC721 data NFT contract (claiming copyright/base IP), then an ERC20 datatoken contract (license against base IP).
-- In step 2, she **mints** some ERC20 datatokens and **transfers** 1.0 of them to Bob's wallet; now he has a license to be able to consume that dataset.
+- In step 2, she **mints** some ERC20 datatokens and **transfers** 1.0 of them to Bob's wallet; now he has a license to be able to download that dataset.
## Other References

View File

@@ -36,7 +36,7 @@ Based on the use case of the marketplace, the marketplace owner can decide if th
Consume fees (aka. Order fees) are charged when a user holding a datatoken exchanges it for the right to download an asset or to start a compute job that uses the asset.
-These are the fees that are applied whenever a user consumes an asset:
+These are the fees that are applied whenever a user pays to access an asset:
- Consume Market Consumption Fee
- Publisher Market Consumption Fee
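As a toy example of how the two order fees stack on top of the datatoken payment (the percentages below are invented for illustration, not Ocean's defaults):

```typescript
// Toy example only: the fee percentages are invented for illustration, not
// Ocean's actual defaults. It shows how the two order fees add to the cost.
const consumeMarketFee = 0.001;    // 0.1% taken by the market where the order happens (assumed)
const publisherMarketFee = 0.001;  // 0.1% taken by the market where the asset was published (assumed)

function totalOrderCost(datatokenPrice: number): number {
  const fees = datatokenPrice * (consumeMarketFee + publisherMarketFee);
  return datatokenPrice + fees;
}

console.log(totalOrderCost(50)); // 50.1, i.e. 50 for the datatoken plus 0.1 in order fees
```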

View File

@@ -13,7 +13,7 @@ Ocean Protocol provides tools for developers to _build data markets_, and to _ma
**Manage datatokens and data NFTs for use in DeFi.** Use Ocean [JavaScript](https://github.com/oceanprotocol/ocean.js) or [Python](https://github.com/oceanprotocol/ocean.py) drivers to manage data NFTs and datatokens:
-- _Publish and consume data services:_ downloadable files or compute-to-data. Use Ocean to deploy a new [ERC721](https://github.com/ethereum/EIPs/blob/master/EIPS/eip-721.md) and [ERC20](https://github.com/ethereum/EIPs/blob/7f4f0377730f5fc266824084188cc17cf246932e/EIPS/eip-20.md) datatoken contract for each data service, then mint datatokens.
+- _Publish and access data services:_ downloadable files or compute-to-data. Use Ocean to deploy a new [ERC721](https://github.com/ethereum/EIPs/blob/master/EIPS/eip-721.md) and [ERC20](https://github.com/ethereum/EIPs/blob/7f4f0377730f5fc266824084188cc17cf246932e/EIPS/eip-20.md) datatoken contract for each data service, then mint datatokens.
- _Transfer datatokens_ to another owner (or approve & transferFrom).
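For the approve & transferFrom path mentioned above, a minimal ethers.js sketch follows; the RPC URL, addresses, and keys are placeholders, while `approve` and `transferFrom` are the standard ERC20 functions that datatokens implement.

```typescript
// Minimal sketch of the ERC20 approve & transferFrom flow for a datatoken.
// RPC URL, addresses, and keys are placeholders; the two functions are
// standard ERC20, which datatokens implement.
import { ethers } from "ethers";

const ERC20_ABI = [
  "function approve(address spender, uint256 amount) returns (bool)",
  "function transferFrom(address from, address to, uint256 amount) returns (bool)",
];

async function approveThenPull(datatokenAddr: string, ownerKey: string, spenderKey: string, to: string) {
  const provider = new ethers.providers.JsonRpcProvider("https://polygon-rpc.com"); // placeholder
  const owner = new ethers.Wallet(ownerKey, provider);
  const spender = new ethers.Wallet(spenderKey, provider);
  const amount = ethers.utils.parseUnits("1.0", 18);

  // The owner grants the spender an allowance of 1.0 datatokens...
  await (await new ethers.Contract(datatokenAddr, ERC20_ABI, owner).approve(spender.address, amount)).wait();
  // ...and the spender later pulls them to the destination address.
  await (await new ethers.Contract(datatokenAddr, ERC20_ABI, spender).transferFrom(owner.address, to, amount)).wait();
}
```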

View File

@@ -17,7 +17,7 @@ Since asset-level permissions are in the DDO, and the DDO is controlled by the p
Allow and deny lists are not enabled by default in Ocean Market. You need to edit the environment variables to enable this feature in your fork of Ocean Market:
- To enable allow and deny lists, you need to add the following environment variable to your .env file in your fork of Ocean Market: `GATSBY_ALLOW_ADVANCED_SETTINGS="true"`
-- Publishers in your market will now have the ability to restrict who can consume their datasets.
+- Publishers in your market will now have the ability to restrict who can buy their datasets.
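In a fork, the flag is simply read from the build environment. Below is a hedged sketch of how such a gate typically looks in the front end; only the variable name comes from the step above, and the helper and component names are made up.

```typescript
// Illustrative gate only: the variable name comes from the step above; the
// helper and the component referenced in the comment are hypothetical.
export function advancedSettingsEnabled(): boolean {
  // Gatsby exposes GATSBY_-prefixed variables from .env at build time.
  return process.env.GATSBY_ALLOW_ADVANCED_SETTINGS === "true";
}

// e.g. in the publish form: only render the allow/deny list editor when enabled.
// {advancedSettingsEnabled() && <AllowDenyListEditor />}   // hypothetical component
```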
## Usage

View File

@@ -1,13 +1,13 @@
---
title: Market-Level Permissions
-description: Control who can publish, consume or browse data
+description: Control who can publish, buy or browse data
---
## Introduction
For market-level permissions, Ocean implements a role-based access control server (RBAC server). It implements restrictions at the user level, based on the user's role (credentials). The RBAC server is run & controlled by the marketplace owner. Therefore, permissions at this level are at the discretion of the marketplace owner.
-The RBAC server is the primary mechanism for restricting your users' ability to publish, consume, or browse assets in the market.
+The RBAC server is the primary mechanism for restricting your users' ability to publish, buy, or browse assets in the market.
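A hedged sketch of the kind of check such a server performs is shown below; the role and action names mirror this page, while the data shapes and the fourth role's name are assumptions rather than the RBAC server's documented API.

```typescript
// Hedged sketch of a role-based permission check of the kind the RBAC server
// performs. Role/action names mirror this page; the shapes are assumptions.
type Action = "publish" | "buy" | "browse";
type Role = "admin" | "publisher" | "consumer" | "user";

const allowedActions: Record<Role, Action[]> = {
  admin: ["publish", "buy", "browse"],
  publisher: ["publish", "buy", "browse"],
  consumer: ["buy", "browse"],   // assumption: consumers can buy and browse but not publish
  user: ["browse"],              // assumption: the remaining role can only browse
};

export function isPermitted(role: Role, action: Action): boolean {
  return allowedActions[role].includes(action);
}

// isPermitted("consumer", "publish") === false
```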
## Roles
@@ -20,7 +20,7 @@ The RBAC server defines four different roles:
### Admin/ Publisher
-Currently users with either the admin or publisher roles will be able to use the Market without any restrictions. They can publish, consume and browse datasets.
+Currently users with either the admin or publisher roles will be able to use the Market without any restrictions. They can publish, buy and browse datasets.
### Consumer

View File

@@ -1,15 +1,15 @@
---
title: Fine-Grained Permissions
-description: Control who can publish, consume or browse data
+description: Control who can publish, buy or browse data
---
A large part of Ocean is about access control, which is primarily handled by datatokens. Users can access a resource (e.g. a file) by redeeming datatokens for that resource. We recognize that enterprises and other users often need more precise ways to specify and manage access, and we have introduced fine-grained permissions for these use cases.
Fine-grained permissions mean that access can be controlled precisely at two levels:
-- [Marketplace-level permissions](./market-level-permissions) for browsing, consuming or publishing within a marketplace frontend.
+- [Marketplace-level permissions](./market-level-permissions) for browsing, downloading or publishing within a marketplace frontend.
-- [Asset-level permissions](./asset-level-permissions) on consuming a specific asset.
+- [Asset-level permissions](./asset-level-permissions) on downloading a specific asset.
The fine-grained permissions features are designed to work in forks of Ocean Market. We have not enabled them in Ocean Market itself, to keep Ocean Market open for everyone to use. On the front end, the permissions features are easily enabled by setting environment variables.