
Issue-#701: V4 architecture diagram

Akshay 2021-11-28 17:05:54 +01:00
parent e3d3bcb283
commit d171ef7f28
3 changed files with 16 additions and 6 deletions


@@ -1,24 +1,34 @@
---
title: Architecture Overview
description: Simplicity and Interoperability via a DataNFT Core
---

## Overview

Here is the Ocean architecture.

![Ocean Protocol tools architecture](images/architecture.png)

Here's an overview of the figure.
- The top layer is **applications** like Ocean Market. With these apps, users can onboard services such as data, algorithms, and compute-to-data into crypto (publish and mint DataNFTs and Datatokens), hold datatokens as assets (data wallets), discover assets and buy / sell datatokens at a fixed or auto-determined price (data marketplaces), and consume data services (consume datatokens).
- Below that are **libraries** used by the applications: Ocean.js (JavaScript library) and Ocean.py (Python library). This layer also includes middleware to assist discovery:
  - **Aquarius**: Provides a metadata cache for faster search by caching on-chain data into Elasticsearch.
  - **Provider**: Facilitates downloading assets, DDO encryption, and communication with `operator-service` for Compute-to-Data jobs.
  - **TheGraph**: A 3rd-party tool that indexes on-chain data.

  Developers can use these libraries and middleware to build their own custom applications and marketplaces; see the discovery sketch after this list.
- The lowest level has the **smart contracts** used by the libraries. They're deployed on Ethereum mainnet and other compatible networks. To see the list of supported networks, click [here](/concepts/networks/).
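As a minimal illustration of the discovery middleware, the sketch below queries an Aquarius metadata cache over plain HTTP for a single asset's DDO. The base URL, the example DID, and the endpoint path are assumptions made for illustration; check the documentation of the Aquarius instance you target for its exact routes.

```python
# Minimal discovery sketch (assumed Aquarius base URL, DID, and endpoint path).
import requests

AQUARIUS_URL = "https://aquarius.example.org"  # placeholder instance, not a real deployment
did = "did:op:abc123"                          # placeholder DID for illustration

# Ask the metadata cache for the DDO (metadata document) of one asset.
response = requests.get(f"{AQUARIUS_URL}/api/v1/aquarius/assets/ddo/{did}", timeout=10)
response.raise_for_status()
ddo = response.json()

print(ddo.get("id"), ddo.get("service", []))
```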
Left to right are groupings of functionality: tools for datatokens, tools for markets (including pools), tools to consume data services and for metadata, and external ERC20 and ERC721 tools.

The rest of this page elaborates.
## DataNFT
DataNFTs are based on the [ERC721](https://eips.ethereum.org/EIPS/eip-721) standard. The publisher can use a marketplace or the client libraries to deploy a new DataNFT contract. To save gas fees, deployment uses the [ERC1167](https://eips.ethereum.org/EIPS/eip-1167) proxy approach on the **ERC721 template**. Each DataNFT has a unique identifier. The publisher can then assign the manager role to other Ethereum addresses, which can deploy new Datatoken contracts and even mint them. Each Datatoken contract is associated with exactly one DataNFT contract.
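As a rough sketch of that deployment flow, the snippet below uses web3.py to call a factory contract that clones the ERC721 template behind an ERC1167 proxy. The RPC endpoint, factory address, ABI fragment, and function name are illustrative assumptions, not the canonical Ocean contract interface.

```python
# Illustrative only: the RPC URL, factory address, ABI, and function name are assumptions.
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example.org"))  # assumed JSON-RPC endpoint

# Hypothetical minimal ABI for a factory that clones the ERC721 template (ERC1167).
FACTORY_ABI = [{
    "name": "deployERC721Contract",  # assumed function name
    "type": "function",
    "stateMutability": "nonpayable",
    "inputs": [
        {"name": "name", "type": "string"},
        {"name": "symbol", "type": "string"},
        {"name": "templateIndex", "type": "uint256"},
    ],
    "outputs": [{"name": "token", "type": "address"}],
}]

factory = w3.eth.contract(
    address="0x" + "00" * 20,  # placeholder; replace with the factory on your network
    abi=FACTORY_ABI,
)

# Each call deploys a cheap ERC1167 proxy pointing at the ERC721 template,
# giving the publisher a new DataNFT contract with its own name and symbol.
tx_hash = factory.functions.deployERC721Contract("My Data NFT", "DATANFT-1", 1).transact(
    {"from": w3.eth.accounts[0]}  # assumes an unlocked account on the node
)
receipt = w3.eth.wait_for_transaction_receipt(tx_hash)
print("DataNFT deployed in tx:", receipt["transactionHash"].hex())
```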
Click [here](/concepts/nft/) to read more about DataNFTs and Datatokens.
## Datatokens & Access Control Tools ## Datatokens & Access Control Tools
The publisher actor holds the dataset in Google Drive, Dropbox, AWS S3, on their phone, on their home server, etc. The dataset has a URL. The publisher can optionally use IPFS for a content-addressable URL. Or instead of a file, the publisher may run a compute-to-data service. The publisher actor holds the dataset in Google Drive, Dropbox, AWS S3, on their phone, on their home server, etc. The dataset has a URL. The publisher can optionally use IPFS for a content-addressable URL. Or instead of a file, the publisher may run a compute-to-data service.
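For the IPFS option, here is a minimal sketch, assuming a local IPFS daemon exposing the standard HTTP API on port 5001 and a hypothetical `dataset.csv` file: it adds the file to IPFS and builds a content-addressable URL from the returned CID.

```python
# Assumes a local IPFS daemon on the default HTTP API port and a local dataset.csv file.
import requests

with open("dataset.csv", "rb") as f:
    response = requests.post(
        "http://127.0.0.1:5001/api/v0/add",  # standard IPFS HTTP API "add" endpoint
        files={"file": f},
        timeout=30,
    )
response.raise_for_status()
cid = response.json()["Hash"]

# A content-addressable URL the publisher can use instead of a provider-specific link.
print(f"ipfs://{cid}")
```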

Binary file not shown. Before: 24 KiB

Binary file not shown. After: 62 KiB