mirror of https://github.com/oceanprotocol/docs.git synced 2024-11-26 19:49:26 +01:00

GITBOOK-493: change request with no subject merged in GitBook

mariacarmina.cretu 2023-06-13 12:13:08 +00:00 committed by gitbook-bot
parent 96b2811307
commit d82fc95a6a
9 changed files with 513 additions and 413 deletions


* [Harvest More Yield Data Farming](user-guides/how-to-data-farm.md)
* [Claim Rewards Data Farming](user-guides/claim-ocean-rewards.md)
* [Liquidity Pools \[deprecated\]](user-guides/remove-liquidity-using-etherscan.md)
* [👨💻 Developers](developers/README.md)
* [Architecture Overview](developers/contracts/architecture.md)
* [Contracts](developers/contracts/README.md)
* [Data NFTs](developers/contracts/data-nfts.md)


Well, brace yourselves for some exhilarating news! Introducing ocean.py, a Python library.
### Overview
ocean.py serves as a connection layer bridging the V4 smart contracts and various components such as [Provider](https://github.com/oceanprotocol/provider), [Aquarius](https://github.com/oceanprotocol/aquarius), and [Compute to Data engine](https://github.com/oceanprotocol/operator-service) within Ocean Protocol. This pythonic library brings all these elements together, facilitating seamless integration and interaction. By acting as an intermediary, ocean.py enables developers to easily leverage the functionalities offered by Ocean Protocol, making it a valuable tool in building applications and solutions that utilize decentralized data marketplaces. Its purpose is to simplify the process of connecting with smart contracts and accessing services provided by Provider, Aquarius, and Compute to Data engine, providing a convenient and efficient development experience for users.
#### Architectural point of view
Oh, the wondrous world of ocean.py! Imagine a playful octopus with eight arms, each one specialized in a unique task. 🐙
At its heart, ocean.py is like the conductor of an underwater orchestra, guiding different marine creatures (modules) to work together harmoniously. It's an open-source library that makes swimming in the vast sea of data a breeze! 🌊
The head of our library is the "[Ocean](technical-details.md)" class. It oversees everything and keeps track of the data flow.
Now, let's take a closer look at those amazing branches:
1. **Data Discovery Branch**: This branch acts as an intrepid explorer, delving into the vast seas of data to discover valuable datasets stored in the Ocean Protocol ecosystem. It navigates through metadata and identifies the hidden treasures.
2. **Data Access Branch**: Just like a skilled locksmith, this branch unlocks the doors to the datasets, facilitating seamless access and retrieval. It interacts with the Ocean Protocol's smart contracts to securely fetch the desired data.
3. **Data Cleaning Branch**: Here comes the meticulous cleaner! This branch ensures that the fetched data is pristine and free from any impurities. It scrubs away any inconsistencies or inaccuracies, leaving behind sparkling clean datasets.
4. **Data Transformation Branch**: Transforming data is like wielding magic, and this arm is the magician! It performs enchanting operations on the data, such as reformatting, reorganizing, or even enriching it, making it ready for the next steps.
5. **Model Training Branch**: This branch employs machine learning techniques to train models using the transformed data. It collaborates with the Ocean smart contracts to optimize the training process.
6. **Model Evaluation Branch**: It's time for a performance assessment! This branch thoroughly examines the trained models, assessing their accuracy, robustness, and compliance with predefined metrics. It ensures that our models are as reliable as a trustworthy companion.
7. **Model Deployment Branch**: Now, it's time to set our trained models free into the ocean of opportunities! This branch interacts with the Ocean smart contracts to deploy the models, making them accessible for utilization within the Ocean ecosystem.
8. **Model Monitoring Branch**: This branch monitors their behavior, tracks their performance, and detects any anomalies that may arise. It collaborates with the Ocean smart contracts to ensure the models swim smoothly.
So, in the realm of ocean.py's integration with Ocean Protocol's smart contracts, the eight versatile branches embark on an exciting journey, discovering, accessing, cleaning, transforming, training, evaluating, deploying, and monitoring data and models. Together, they form a powerful team, navigating the depths of the Ocean ecosystem. 🌊🐙
### ocean.py Strengths 💪
If you prefer video format, please check this video below, otherwise let's move on!
To kickstart your adventure with ocean.py, we set out the following steps to get you zooming ahead in no time!
1. [Install Ocean](install.md) 📥
2. Setup 🛠️ — [Remote ](remote-setup.md)(Win, MacOS, Linux) — or [Local ](local-setup.md)(Linux only)
3. [Publish asset](publish-flow.md), post for free / for sale, dispense it / buy it, and [consume ](consume-flow.md)it
4. Run algorithms through [Compute-to-Data flow](compute-flow.md) using Ocean environment.
After these quickstart steps, the main [README](https://github.com/oceanprotocol/ocean.py/blob/main/README.md) points to several other use cases, such as [Predict-ETH](https://github.com/oceanprotocol/predict-eth), [Data Farming](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/df.md), on-chain key-value stores ([public](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/key-value-public.md) or [private](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/key-value-private.md)), and other types of data assets ([REST API](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/publish-flow-restapi.md), [GraphQL](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/publish-flow-graphql.md), [on-chain](https://github.com/oceanprotocol/ocean.py/blob/main/READMEs/publish-flow-onchain.md)).


---
description: Technical details about Datatoken functions
---
# Datatoken Interface Tech Details
`Datatoken contract interface` is like the superhero that kicks off the action-packed adventure of contract calls! It's here to save the day by empowering us to unleash the mighty powers of dispensers, fixed rate exchanges, and initializing orders. On this page, we present the utility functions that set you off on your Ocean journey.
### Create Dispenser
* **create\_dispenser**(`self`, `tx_dict: dict`, `max_tokens: Optional[Union[int, str]] = None`, `max_balance: Optional[Union[int, str]] = None`, `with_mint: Optional[bool] = True`)
Through a datatoken, you can deploy a new dispenser schema, which is used for creating free assets because its behaviour is similar to a faucet. ⛲
It is implemented in `DatatokenBase` and inherited by `Datatoken2`, so it can be called on both instances.
**Parameters**
* `tx_dict` - is the configuration `dictionary` for that specific transaction. Usually for `development` we include just the `from` wallet, but for remote networks, you can provide gas fees, required confirmations for that block etc. For more info, check [Brownie docs](https://eth-brownie.readthedocs.io/en/stable/).
* `max_tokens` - maximum amount of tokens to dispense in wei. The default is a large number.
* `max_balance` - maximum balance of requester in wei. The default is a large number.
* `with_mint` - boolean; `True` (the default) allows the dispenser to act as a minter.
**Returns**
`str`
Return value is a hex string which denotes the transaction hash of dispenser deployment.
**Defined in**
[models/datatoken.py](https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean\_lib/models/datatoken.py#LL336C5-L377C18)
<details>
<summary>Source code</summary>
```python
@enforce_types
def create_dispenser(
    self,
    tx_dict: dict,
    max_tokens: Optional[Union[int, str]] = None,
    max_balance: Optional[Union[int, str]] = None,
    with_mint: Optional[bool] = True,
):
    """
    For this datatoken, create a dispenser faucet for free tokens.

    This wraps the smart contract method Datatoken.createDispenser()
    with a simpler interface.

    :param: max_tokens - max # tokens to dispense, in wei
    :param: max_balance - max balance of requester
    :tx_dict: e.g. {"from": alice_wallet}
    :return: tx
    """
    # already created, so nothing to do
    if self.dispenser_status().active:
        return

    # set max_tokens, max_balance if needed
    max_tokens = max_tokens or MAX_UINT256
    max_balance = max_balance or MAX_UINT256

    # args for contract tx
    dispenser_addr = get_address_of_type(self.config_dict, "Dispenser")
    with_mint = with_mint  # True -> can always mint more
    allowed_swapper = ZERO_ADDRESS  # 0 -> so anyone can call dispense

    # do contract tx
    tx = self.createDispenser(
        dispenser_addr,
        max_tokens,
        max_balance,
        with_mint,
        allowed_swapper,
        tx_dict,
    )
    return tx
```
</details>
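For a feel of the wei defaulting described above, here is a standalone sketch (not the ocean.py implementation; `MAX_UINT256` mirrors the library's constant):

```python
MAX_UINT256 = 2**256 - 1  # mirrors ocean.py's MAX_UINT256 constant

def resolve_dispenser_limits(max_tokens=None, max_balance=None):
    # Mirror create_dispenser's defaulting: unset limits become MAX_UINT256 (wei)
    return (max_tokens or MAX_UINT256, max_balance or MAX_UINT256)

# Unset limits fall back to the maximum; explicit wei values pass through.
assert resolve_dispenser_limits() == (MAX_UINT256, MAX_UINT256)
assert resolve_dispenser_limits(10**18, 50) == (10**18, 50)
```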


Let's start interacting with the python library by first installing it.
Ahoy there, matey! 🌊⚓️ When it comes to setting up ocean.py locally, we're diving into the world of Docker containers. These clever containers hold our trusty local blockchain nodes (Ganache) and the mighty Ocean middleware (Aquarius metadata cache and Provider to aid in consuming data assets). But fear not, for a smooth sailing experience, you'll need to ensure the following Docker components are shipshape and ready to go:
1. [Docker](https://docs.docker.com/engine/install/) 🐳
2. [Docker Compose](https://docs.docker.com/compose/install/) 🛠️
3. Oh, and don't forget to [allow those non-root users](https://www.thegeekdiary.com/run-docker-as-a-non-root-user/) to join in on the fun! 🙅‍♂️
So hoist the anchor, prepare your Docker crew, and let's embark on an exciting voyage with ocean.py! 🚢⛵️


---
description: Local setup for running & testing ocean.py
---
# Local Setup
On this page, we continue our journey from the [installation part](install.md) and set up for local testing. Local setup means we will use Ganache as the local blockchain where we can execute transactions, with all the services pointing to this network.
⚠️ Ocean local setup uses Docker, which is fine for Linux/Ubuntu but plays badly with MacOS and Windows. If you are on these, you'll want [remote setup](remote-setup.md).
Here are the steps for configuring ocean.py on the Ganache network using barge:


---
description: Technical details about OceanAssets functions
---
# Ocean Assets Tech Details
Through this class we can publish different types of assets & consume them to make 💲💲💲
### Create URL Asset


---
description: Technical details about OceanCompute functions
---
# Ocean Compute Tech Details
Using this class, we can manipulate a compute job: run it on an Ocean environment and retrieve the results after the execution finishes.
### Start Compute Job
* **start**(`self`, `consumer_wallet`, `dataset: ComputeInput`, `compute_environment: str`, `algorithm: Optional[ComputeInput] = None`, `algorithm_meta: Optional[AlgorithmMetadata] = None`, `algorithm_algocustomdata: Optional[dict] = None`, `additional_datasets: List[ComputeInput] = []`) -> `str`
Starts a compute job.
It can be called within Ocean Compute class.
**Parameters**
* `consumer_wallet` - the `Brownie account` of the consumer who pays for & starts the compute job.
* `dataset` - `ComputeInput` object, which must include the DDO and service.
* `compute_environment` - `string` that represents the ID of the chosen C2D environment.
* `additional_datasets` - list of `ComputeInput` objects for additional datasets, in case the compute job is started for multiple datasets.
**Optional parameters**
* `algorithm` - `ComputeInput` object, which must include the DDO and service for the algorithm.
* `algorithm_meta` - alternatively, provide just the algorithm metadata as `AlgorithmMetadata`.
* `algorithm_algocustomdata` - additional user data for the algorithm, as a dictionary.
**Returns**
`str`
Returns a string type job ID.
**Defined in**
[ocean/ocean\_compute.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean\_lib/ocean/ocean\_compute.py#LL32C4-L70C33)
<details>
<summary>Source code</summary>
```python
@enforce_types
def start(
    self,
    consumer_wallet,
    dataset: ComputeInput,
    compute_environment: str,
    algorithm: Optional[ComputeInput] = None,
    algorithm_meta: Optional[AlgorithmMetadata] = None,
    algorithm_algocustomdata: Optional[dict] = None,
    additional_datasets: List[ComputeInput] = [],
) -> str:
    metadata_cache_uri = self._config_dict.get("METADATA_CACHE_URI")
    ddo = Aquarius.get_instance(metadata_cache_uri).get_ddo(dataset.did)

    service = ddo.get_service_by_id(dataset.service_id)
    assert (
        ServiceTypes.CLOUD_COMPUTE == service.type
    ), "service at serviceId is not of type compute service."

    consumable_result = is_consumable(
        ddo,
        service,
        {"type": "address", "value": consumer_wallet.address},
        with_connectivity_check=True,
    )
    if consumable_result != ConsumableCodes.OK:
        raise AssetNotConsumable(consumable_result)

    # Start compute job
    job_info = self._data_provider.start_compute_job(
        dataset_compute_service=service,
        consumer=consumer_wallet,
        dataset=dataset,
        compute_environment=compute_environment,
        algorithm=algorithm,
        algorithm_meta=algorithm_meta,
        algorithm_custom_data=algorithm_algocustomdata,
        input_datasets=additional_datasets,
    )

    return job_info["jobId"]
```
</details>
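Before dispatching anything, `start` verifies that the chosen service is actually a compute service. A standalone sketch of that guard (the `FakeService` stand-in below is hypothetical, not the ocean.py `Service` class):

```python
from dataclasses import dataclass

CLOUD_COMPUTE = "compute"  # mirrors ServiceTypes.CLOUD_COMPUTE

@dataclass
class FakeService:  # hypothetical stand-in for ocean.py's Service
    id: str
    type: str

def check_compute_service(service) -> None:
    # Same guard as in OceanCompute.start()
    assert (
        CLOUD_COMPUTE == service.type
    ), "service at serviceId is not of type compute service."

check_compute_service(FakeService(id="1", type="compute"))  # passes
```

Passing a non-compute service (e.g. a download/access service) trips the assertion instead of silently starting a bad job.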
### Compute Job Status
* **status**(`self`, `ddo: DDO`, `service: Service`, `job_id: str`, `wallet`) -> `Dict[str, Any]`
Gets status of the compute job.
It can be called within Ocean Compute class.
**Parameters**
* `ddo` - DDO offering the compute service of this job
* `service` - Service object of compute
* `job_id` - ID of the compute job
* `wallet` - Brownie account which initiated the compute job
**Returns**
`Dict[str, Any]`
A dictionary which contains the status for an existing compute job, keys are `(ok, status, statusText)`.
**Defined in**
[ocean/ocean\_compute.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean\_lib/ocean/ocean\_compute.py#LL72C5-L88C24)
<details>
<summary>Source code</summary>
{% code overflow="wrap" %}
```python
@enforce_types
def status(self, ddo: DDO, service: Service, job_id: str, wallet) -> Dict[str, Any]:
    """
    Gets job status.

    :param ddo: DDO offering the compute service of this job
    :param service: compute service of this job
    :param job_id: str id of the compute job
    :param wallet: Wallet instance
    :return: dict the status for an existing compute job, keys are (ok, status, statusText)
    """
    job_info = self._data_provider.compute_job_status(
        ddo.did, job_id, service, wallet
    )
    job_info.update({"ok": job_info.get("status") not in (31, 32, None)})

    return job_info
```
{% endcode %}
</details>
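The `ok` key above is derived from the raw Provider status: codes 31, 32, or a missing status indicate failure. That derivation in isolation (a sketch, not the library code):

```python
FAILURE_CODES = (31, 32, None)  # provider status codes treated as failures

def with_ok_flag(job_info: dict) -> dict:
    # Same update as in OceanCompute.status(): ok <=> status is not a failure code
    out = dict(job_info)
    out["ok"] = out.get("status") not in FAILURE_CODES
    return out

assert with_ok_flag({"status": 70, "statusText": "Job finished"})["ok"] is True
assert with_ok_flag({"status": 31, "statusText": "Error"})["ok"] is False
```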
### Compute Job Result
* **result**(`self`, `ddo: DDO`, `service: Service`, `job_id: str`, `index: int`, `wallet` ) -> `Dict[str, Any]`
Gets compute job result.
It can be called within Ocean Compute class.
**Parameters**
* `ddo` - DDO offering the compute service of this job
* `service` - Service object of compute
* `job_id` - ID of the compute job
* `index` - compute result index
* `wallet` - Brownie account which initiated the compute job
**Returns**
`Dict[str, Any]`
A dictionary which contains the results/logs urls for an existing compute job, keys are `(did, urls, logs)`.
**Defined in**
[ocean/ocean\_compute.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean\_lib/ocean/ocean\_compute.py#LL90C5-L106C22)
<details>
<summary>Source code</summary>
{% code overflow="wrap" %}
```python
@enforce_types
def result(
    self, ddo: DDO, service: Service, job_id: str, index: int, wallet
) -> Dict[str, Any]:
    """
    Gets job result.

    :param ddo: DDO offering the compute service of this job
    :param service: compute service of this job
    :param job_id: str id of the compute job
    :param index: compute result index
    :param wallet: Wallet instance
    :return: dict the results/logs urls for an existing compute job, keys are (did, urls, logs)
    """
    result = self._data_provider.compute_job_result(job_id, index, service, wallet)

    return result
```
{% endcode %}
</details>
### Compute Job Result Logs
* **compute\_job\_result\_logs**(`self`, `ddo: DDO`, `service: Service`, `job_id: str`, `wallet`, `log_type="output"`) -> `Dict[str, Any]`
Gets the job output, if it exists.
It can be called within Ocean Compute class.
**Parameters**
* `ddo` - DDO offering the compute service of this job
* `service` - Service object of compute
* `job_id` - ID of the compute job
* `wallet` - Brownie account which initiated the compute job
* `log_type` - string which selects what kind of logs to display. Default "output"
**Returns**
`Dict[str, Any]`
A dictionary which includes the results/logs urls for an existing compute job, keys are `(did, urls, logs)`.
**Defined in**
[ocean/ocean\_compute.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean\_lib/ocean/ocean\_compute.py#LL108C5-L130C22)
<details>
<summary>Source code</summary>
{% code overflow="wrap" %}
```python
@enforce_types
def compute_job_result_logs(
    self,
    ddo: DDO,
    service: Service,
    job_id: str,
    wallet,
    log_type="output",
) -> Dict[str, Any]:
    """
    Gets job output if exists.

    :param ddo: DDO offering the compute service of this job
    :param service: compute service of this job
    :param job_id: str id of the compute job
    :param wallet: Wallet instance
    :return: dict the results/logs urls for an existing compute job, keys are (did, urls, logs)
    """
    result = self._data_provider.compute_job_result_logs(
        ddo, job_id, service, wallet, log_type
    )

    return result
```
{% endcode %}
</details>
### Stop Compute Job
* **stop**(`self`, `ddo: DDO`, `service: Service`, `job_id: str`, `wallet`) -> `Dict[str, Any]`
Attempts to stop the running compute job.
It can be called within Ocean Compute class.
**Parameters**
* `ddo` - DDO offering the compute service of this job
* `service` - Service object of compute
* `job_id` - ID of the compute job
* `wallet` - Brownie account which initiated the compute job
**Returns**
`Dict[str, Any]`
A dictionary which contains the status for the stopped compute job, keys are `(ok, status, statusText)`.
**Defined in**
[ocean/ocean\_compute.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean\_lib/ocean/ocean\_compute.py#LL132C5-L146C24)
<details>
<summary>Source code</summary>
{% code overflow="wrap" %}
```python
@enforce_types
def stop(self, ddo: DDO, service: Service, job_id: str, wallet) -> Dict[str, Any]:
    """
    Attempt to stop the running compute job.

    :param ddo: DDO offering the compute service of this job
    :param job_id: str id of the compute job
    :param wallet: Wallet instance
    :return: dict the status for the stopped compute job, keys are (ok, status, statusText)
    """
    job_info = self._data_provider.stop_compute_job(
        ddo.did, job_id, service, wallet
    )
    job_info.update({"ok": job_info.get("status") not in (31, 32, None)})

    return job_info
```
{% endcode %}
</details>
### Get Priced C2D Environments
* **get\_c2d\_environments**(`self`, `service_endpoint: str`, `chain_id: int`)
Get list of compute environments.
It can be called within Ocean Compute class.
**Parameters**
* `service_endpoint` - `string` Provider URL that is stored in the compute service.
* `chain_id` - `int`; since Provider is multichain, `chain_id` is required to specify the network for your environment.
**Returns**
`list`
A list of objects containing information about each compute environment. Each environment has the following keys: `(id, feeToken, priceMin, consumerAddress, lastSeen, namespace, status)`.
**Defined in**
[ocean/ocean\_compute.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean\_lib/ocean/ocean\_compute.py#LL148C4-L150C84)
<details>
<summary>Source code</summary>
{% code overflow="wrap" %}
```python
@enforce_types
def get_c2d_environments(self, service_endpoint: str, chain_id: int):
    return DataServiceProvider.get_c2d_environments(service_endpoint, chain_id)
```
{% endcode %}
</details>
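Each entry in the returned list carries the keys listed above. As a small illustration of working with that list (the sample environments below are made up; only their shape follows the documented keys):

```python
SAMPLE_ENVS = [  # made-up sample data; real entries come from the Provider
    {"id": "env-small", "priceMin": "0", "status": "active"},
    {"id": "env-large", "priceMin": "5", "status": "active"},
]

def env_by_id(environments, env_id):
    # Linear scan for the environment whose id matches
    matches = [env for env in environments if env["id"] == env_id]
    return matches[0] if matches else None

assert env_by_id(SAMPLE_ENVS, "env-large")["priceMin"] == "5"
assert env_by_id(SAMPLE_ENVS, "missing") is None
```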
### Get Free C2D Environments
* **get\_free\_c2d\_environment**(`self`, `service_endpoint: str`, `chain_id`)
Get list of free compute environments.
Note that not all Providers contain free environments (`priceMin = 0`).
It can be called within Ocean Compute class.
**Parameters**
* `service_endpoint` - `string` Provider URL that is stored in the compute service.
* `chain_id` - `int`; since Provider is multichain, `chain_id` is required to specify the network for your environment.
**Returns**
`list`
A list of objects containing information about each compute environment. Each environment has the following keys: `(id, feeToken, priceMin, consumerAddress, lastSeen, namespace, status)`.
**Defined in**
[ocean/ocean\_compute.py](https://github.com/oceanprotocol/ocean.py/blob/main/ocean\_lib/ocean/ocean\_compute.py#LL152C5-L155C87)
<details>
<summary>Source code</summary>
{% code overflow="wrap" %}
```python
@enforce_types
def get_free_c2d_environment(self, service_endpoint: str, chain_id):
    environments = self.get_c2d_environments(service_endpoint, chain_id)

    return next(env for env in environments if float(env["priceMin"]) == float(0))
```
{% endcode %}
</details>
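Because the selection above uses `next(...)` without a default, it raises `StopIteration` when the Provider exposes no free environment. A standalone sketch of the same selection over made-up data:

```python
def pick_free_env(environments):
    # Same selection as get_free_c2d_environment: first env with priceMin == 0
    return next(env for env in environments if float(env["priceMin"]) == 0.0)

envs = [{"id": "paid", "priceMin": "5"}, {"id": "free", "priceMin": "0"}]
assert pick_free_env(envs)["id"] == "free"

try:
    pick_free_env([{"id": "paid", "priceMin": "5"}])
except StopIteration:
    print("no free environment on this Provider")
```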

View File

---
description: Remote setup for running & testing ocean.py
---
# Remote Setup
This setup does not use barge; it uses a remote chain to do the transactions. When the network URL is specified & configured, ocean.py will use the components (such as Provider, Aquarius, C2D) corresponding to that blockchain.
Here, we do setup for Mumbai, the testnet for Polygon. It's similar for other remote chains.
Here, we will:
1. Configure Brownie networks
2. Create two accounts - `REMOTE_TEST_PRIVATE_KEY1` and `2`
3. Get test MATIC on Mumbai
4. Get test OCEAN on Mumbai
5. Set envvars
6. Set up Alice and Bob wallets in Python
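Steps 5 and 6 boil down to exporting the two private keys and reading them back in Python. A minimal sketch (the keys below are dummies for illustration only; in the real flow you would feed them to Brownie's `accounts.add()`):

```python
import os

# Dummy keys for illustration only; never hard-code or commit real private keys
os.environ["REMOTE_TEST_PRIVATE_KEY1"] = "0x" + "11" * 32
os.environ["REMOTE_TEST_PRIVATE_KEY2"] = "0x" + "22" * 32

alice_key = os.environ["REMOTE_TEST_PRIVATE_KEY1"]
bob_key = os.environ["REMOTE_TEST_PRIVATE_KEY2"]

assert alice_key != bob_key, "Alice and Bob need distinct accounts"
```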
You can bypass manually: just edit your brownie network config file.
Or you can bypass via the command line. The following command replaces Infura RPCs with public ones in `network-config.yaml`:
* Linux users: in the console:
{% code overflow="wrap" %}
```bash
sed -i 's#https://polygon-mainnet.infura.io/v3/$WEB3_INFURA_PROJECT_ID#https://polygon-rpc.com/#g; s#https://polygon-mumbai.infura.io/v3/$WEB3_INFURA_PROJECT_ID#https://rpc-mumbai.maticvigil.com#g' ~/.brownie/network-config.yaml
```
{% endcode %}
* MacOS users: you can achieve the same thing with `gnu-sed` and the `gsed` command. (Or just manually edit the file.)
* For Windows: you might need something similar to [powershell](https://www.marek.tokyo/2020/01/remove-string-from-file-in-windows-10.html). (Or just manually edit the file.)
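As a cross-platform alternative to `sed`/`gsed`/PowerShell, the same substitution can be scripted in Python; the function below operates on the config file's text, and the commented lines show how one might apply it to `~/.brownie/network-config.yaml`:

```python
from pathlib import Path

RPC_SWAPS = {
    "https://polygon-mainnet.infura.io/v3/$WEB3_INFURA_PROJECT_ID": "https://polygon-rpc.com/",
    "https://polygon-mumbai.infura.io/v3/$WEB3_INFURA_PROJECT_ID": "https://rpc-mumbai.maticvigil.com",
}

def swap_rpcs(text: str) -> str:
    # Replace Infura RPC URLs with public ones, mirroring the sed command above
    for infura_url, public_url in RPC_SWAPS.items():
        text = text.replace(infura_url, public_url)
    return text

# To edit the real file:
# cfg = Path.home() / ".brownie" / "network-config.yaml"
# cfg.write_text(swap_rpcs(cfg.read_text()))
```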
Now, you have two EVM accounts (address & private key). Save them somewhere safe.
These accounts will work on any EVM-based chain: production chains like Eth mainnet and Polygon, and testnets like Goerli and Mumbai. Here, we'll use them for Mumbai.
### 3. Get (test) MATIC on Mumbai
We need a network's native token to pay for transactions on the network. [ETH](https://ethereum.org/en/get-eth/) is the native token for Ethereum mainnet; [MATIC](https://polygon.technology/matic-token/) is the native token for Polygon, and [(test) MATIC](https://faucet.polygon.technology/) is the native token for Mumbai.
To get free (test) MATIC on Mumbai:
1. Go to the faucet [https://faucet.polygon.technology/](https://faucet.polygon.technology/). Ensure you've selected "Mumbai" network and "MATIC" token.
2. Request funds for ADDRESS1
You can confirm receiving funds by going to the following url, and seeing your reported MATIC balance: `https://mumbai.polygonscan.com/address/<ADDRESS1 or ADDRESS2>`
### 4. Get (test) OCEAN on Mumbai
[OCEAN](https://oceanprotocol.com/token) can be used as a data payment token, and locked into veOCEAN for Data Farming / curation. The READMEs show how to use OCEAN in both cases.
* OCEAN is an ERC20 token with a finite supply, rooted in Ethereum mainnet at address [`0x967da4048cD07aB37855c090aAF366e4ce1b9F48`](https://etherscan.io/token/0x967da4048cD07aB37855c090aAF366e4ce1b9F48).
* OCEAN on other production chains derives from the Ethereum mainnet OCEAN. OCEAN on Polygon (mOCEAN) is at [`0x282d8efce846a88b159800bd4130ad77443fa1a1`](https://polygonscan.com/token/0x282d8efce846a88b159800bd4130ad77443fa1a1).
* (Test) OCEAN is on each testnet. Test OCEAN on Mumbai is at [`0xd8992Ed72C445c35Cb4A2be468568Ed1079357c8`](https://mumbai.polygonscan.com/token/0xd8992Ed72C445c35Cb4A2be468568Ed1079357c8).
To get free (test) OCEAN on Mumbai:
1. Go to the faucet [https://faucet.mumbai.oceanprotocol.com/](https://faucet.mumbai.oceanprotocol.com/)
2. Request funds for ADDRESS1


---
description: Technical details about most used ocean.py functions
---
# Ocean Instance Tech Details
At the beginning of most flows, we create an `ocean` object, which is an instance of class [`Ocean`](https://github.com/oceanprotocol/ocean.py/blob/main/ocean\_lib/ocean/ocean.py). It exposes useful information, including the following:
* properties for config & OCEAN token
* contract objects retrieval
* users' orders
* provider fees
### Constructor
A tuple which contains the data NFT, datatoken and the data asset.
</details>
### Ocean Compute
<details>
<summary><a href="https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean_compute.py#LL32C4-L70C33"><code>ocean.compute.start( self, consumer_wallet, dataset: ComputeInput, compute_environment: str, algorithm: Optional[ComputeInput] = None, algorithm_meta: Optional[AlgorithmMetadata] = None, algorithm_algocustomdata: Optional[dict] = None, additional_datasets: List[ComputeInput] = []) -> str</code></a></summary>
Starts a compute job.
It can be called within Ocean Compute class.
Params:
* `consumer_wallet` - the `Brownie account` of consumer who pays & starts for compute job.
* `dataset` - `ComputeInput` object, each of them includes mandatory the DDO and service.
* `compute_environment` - `string` that represents the ID from the chosen C2D environment.
* `additional_datasets` - list of `ComputeInput` objects for additional datasets in case of starting a compute job for multiple datasets.
Optional params:
* `algorithm` - `ComputeInput` object, each of them includes mandatory the DDO and service for algorithm.
* `algorithm_meta` - either provide just the algorithm metadata as `AlgorithmMetadata.`
* `algorithm_algocustomedata` - additional user data for the algorithm as dictionary.
Return:
Returns a string type job ID.
```python
@enforce_types
def start(
self,
consumer_wallet,
dataset: ComputeInput,
compute_environment: str,
algorithm: Optional[ComputeInput] = None,
algorithm_meta: Optional[AlgorithmMetadata] = None,
algorithm_algocustomdata: Optional[dict] = None,
additional_datasets: List[ComputeInput] = [],
) -> str:
metadata_cache_uri = self._config_dict.get("METADATA_CACHE_URI")
ddo = Aquarius.get_instance(metadata_cache_uri).get_ddo(dataset.did)
service = ddo.get_service_by_id(dataset.service_id)
assert (
ServiceTypes.CLOUD_COMPUTE == service.type
), "service at serviceId is not of type compute service."
consumable_result = is_consumable(
ddo,
service,
{"type": "address", "value": consumer_wallet.address},
with_connectivity_check=True,
)
if consumable_result != ConsumableCodes.OK:
raise AssetNotConsumable(consumable_result)
# Start compute job
job_info = self._data_provider.start_compute_job(
dataset_compute_service=service,
consumer=consumer_wallet,
dataset=dataset,
compute_environment=compute_environment,
algorithm=algorithm,
algorithm_meta=algorithm_meta,
algorithm_custom_data=algorithm_algocustomdata,
input_datasets=additional_datasets,
)
return job_info["jobId"]
```
</details>
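Before dispatching the job, `start` verifies that the selected service is of type compute, as shown in the method body above. A minimal pure-Python sketch of that guard — the `Service` class and `CLOUD_COMPUTE` value below are illustrative stand-ins, not the real ocean.py types:

```python
# Sketch of the service-type guard performed by ocean.compute.start.
# Service and CLOUD_COMPUTE are illustrative stand-ins for ocean.py's types.
from dataclasses import dataclass

CLOUD_COMPUTE = "compute"  # service type expected by start()

@dataclass
class Service:
    id: str
    type: str

def assert_compute_service(service: Service) -> None:
    # start() raises if the service at serviceId is not a compute service
    assert service.type == CLOUD_COMPUTE, \
        "service at serviceId is not of type compute service."

assert_compute_service(Service(id="1", type="compute"))  # passes silently
try:
    assert_compute_service(Service(id="2", type="access"))
except AssertionError as e:
    print(e)  # service at serviceId is not of type compute service.
```

In the real method, a failed check raises before any provider call is made, so no fees are paid for a misconfigured request.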
<details>
<summary><a href="https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean_compute.py#LL72C5-L88C24"><code>ocean.compute.status(self, ddo: DDO, service: Service, job_id: str, wallet) -> Dict[str, Any]</code></a></summary>
Gets status of the compute job.
It can be called within the Ocean Compute class.
Params:
* `ddo` - DDO offering the compute service of this job
* `service` - Service object of compute
* `job_id` - ID of the compute job
* `wallet` - Brownie account which initiated the compute job
Return:
A dictionary which contains the status for an existing compute job, keys are `(ok, status, statusText)`.
{% code overflow="wrap" %}
```python
@enforce_types
def status(self, ddo: DDO, service: Service, job_id: str, wallet) -> Dict[str, Any]:
"""
Gets job status.
:param ddo: DDO offering the compute service of this job
:param service: compute service of this job
:param job_id: str id of the compute job
:param wallet: Wallet instance
:return: dict the status for an existing compute job, keys are (ok, status, statusText)
"""
job_info = self._data_provider.compute_job_status(
ddo.did, job_id, service, wallet
)
job_info.update({"ok": job_info.get("status") not in (31, 32, None)})
return job_info
```
{% endcode %}
</details>
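The `ok` key returned by `status` is derived from the raw status code: codes 31, 32, or a missing status mean the job is not OK, mirroring the `job_info.update(...)` call in the method body above. A self-contained sketch of that mapping (the helper name is ours, not part of ocean.py):

```python
# Sketch of how status() derives the "ok" flag from the provider response.
# Status codes 31 and 32 (and a missing status) indicate failure.
from typing import Any, Dict, Optional

def with_ok_flag(job_info: Dict[str, Any]) -> Dict[str, Any]:
    status: Optional[int] = job_info.get("status")
    job_info["ok"] = status not in (31, 32, None)
    return job_info

print(with_ok_flag({"status": 70, "statusText": "Job finished"})["ok"])  # True
print(with_ok_flag({"status": 31, "statusText": "Job failed"})["ok"])    # False
```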
<details>
<summary><a href="https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean_compute.py#LL90C5-L106C22"><code>ocean.compute.result( self, ddo: DDO, service: Service, job_id: str, index: int, wallet ) -> Dict[str, Any]</code></a></summary>
Gets compute job result.
It can be called within the Ocean Compute class.
Params:
* `ddo` - DDO offering the compute service of this job
* `service` - Service object of compute
* `job_id` - ID of the compute job
* `index` - compute result index
* `wallet` - Brownie account which initiated the compute job
Return:
A dictionary which contains the results/logs urls for an existing compute job, keys are `(did, urls, logs)`.
{% code overflow="wrap" %}
```python
@enforce_types
def result(
self, ddo: DDO, service: Service, job_id: str, index: int, wallet
) -> Dict[str, Any]:
"""
Gets job result.
:param ddo: DDO offering the compute service of this job
:param service: compute service of this job
:param job_id: str id of the compute job
:param index: compute result index
:param wallet: Wallet instance
:return: dict the results/logs urls for an existing compute job, keys are (did, urls, logs)
"""
result = self._data_provider.compute_job_result(job_id, index, service, wallet)
return result
```
{% endcode %}
</details>
<details>
<summary><a href="https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean_compute.py#LL108C5-L130C22"><code>ocean.compute.compute_job_result_logs( self, ddo: DDO, service: Service, job_id: str, wallet, log_type="output", ) -> Dict[str, Any]</code></a></summary>
Gets the job output, if it exists.
It can be called within the Ocean Compute class.
Params:
* `ddo` - DDO offering the compute service of this job
* `service` - Service object of compute
* `job_id` - ID of the compute job
* `wallet` - Brownie account which initiated the compute job
* `log_type` - string which selects what kind of logs to display; defaults to `"output"`
Return:
A dictionary which includes the results/logs urls for an existing compute job, keys are `(did, urls, logs)`.
{% code overflow="wrap" %}
```python
@enforce_types
def compute_job_result_logs(
self,
ddo: DDO,
service: Service,
job_id: str,
wallet,
log_type="output",
) -> Dict[str, Any]:
"""
Gets job output if exists.
:param ddo: DDO offering the compute service of this job
:param service: compute service of this job
:param job_id: str id of the compute job
:param wallet: Wallet instance
:return: dict the results/logs urls for an existing compute job, keys are (did, urls, logs)
"""
result = self._data_provider.compute_job_result_logs(
ddo, job_id, service, wallet, log_type
)
return result
```
{% endcode %}
</details>
<details>
<summary><a href="https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean_compute.py#LL132C5-L146C24"><code>ocean.compute.stop(self, ddo: DDO, service: Service, job_id: str, wallet) -> Dict[str, Any]</code></a></summary>
Attempts to stop the running compute job.
It can be called within the Ocean Compute class.
Params:
* `ddo` - DDO offering the compute service of this job
* `service` - Service object of compute
* `job_id` - ID of the compute job
* `wallet` - Brownie account which initiated the compute job
Return:
A dictionary which contains the status for the stopped compute job, keys are `(ok, status, statusText)`.
{% code overflow="wrap" %}
```python
@enforce_types
def stop(self, ddo: DDO, service: Service, job_id: str, wallet) -> Dict[str, Any]:
"""
Attempt to stop the running compute job.
:param ddo: DDO offering the compute service of this job
:param job_id: str id of the compute job
:param wallet: Wallet instance
:return: dict the status for the stopped compute job, keys are (ok, status, statusText)
"""
job_info = self._data_provider.stop_compute_job(
ddo.did, job_id, service, wallet
)
job_info.update({"ok": job_info.get("status") not in (31, 32, None)})
return job_info
```
{% endcode %}
</details>
<details>
<summary><a href="https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean_compute.py#LL148C4-L150C84"><code>ocean.compute.get_c2d_environments(self, service_endpoint: str, chain_id: int)</code></a></summary>
Gets the list of compute environments.
It can be called within the Ocean Compute class.
Params:
* `service_endpoint` - string Provider URL that is stored in the compute service.
* `chain_id` - since Provider is multichain, `chain_id` (an `int`) is required to specify the network for your environment.
Return:
A list of objects containing information about each compute environment. Each compute environment has the following keys: `(id, feeToken, priceMin, consumerAddress, lastSeen, namespace, status)`.
{% code overflow="wrap" %}
```python
@enforce_types
def get_c2d_environments(self, service_endpoint: str, chain_id: int):
return DataServiceProvider.get_c2d_environments(service_endpoint, chain_id)
```
{% endcode %}
</details>
<details>
<summary><a href="https://github.com/oceanprotocol/ocean.py/blob/main/ocean_lib/ocean/ocean_compute.py#LL152C5-L155C87"><code>ocean.compute.get_free_c2d_environment(self, service_endpoint: str, chain_id)</code></a></summary>
Gets the first free compute environment, if one exists.
Note that not all Providers offer free environments (`priceMin = 0`).
It can be called within the Ocean Compute class.
Params:
* `service_endpoint` - string Provider URL that is stored in the compute service.
* `chain_id` - since Provider is multichain, `chain_id` (an `int`) is required to specify the network for your environment.
Return:
An object containing information about the free compute environment, with the following keys: `(id, feeToken, priceMin, consumerAddress, lastSeen, namespace, status)`.
{% code overflow="wrap" %}
```python
@enforce_types
def get_free_c2d_environment(self, service_endpoint: str, chain_id):
environments = self.get_c2d_environments(service_endpoint, chain_id)
return next(env for env in environments if float(env["priceMin"]) == float(0))
```
{% endcode %}
</details>
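As the method body shows, the selection is simply the first environment whose `priceMin` is zero. A self-contained sketch with sample environment dicts — the sample `id` and `priceMin` values are illustrative, and real environments carry additional keys such as `feeToken` and `consumerAddress`:

```python
# Sketch of the selection done by get_free_c2d_environment: pick the first
# environment whose priceMin is 0. Sample data below is illustrative only.
envs = [
    {"id": "paid-gpu-env", "priceMin": "2.5"},
    {"id": "free-cpu-env", "priceMin": "0"},
]

free_env = next(env for env in envs if float(env["priceMin"]) == 0.0)
print(free_env["id"])  # free-cpu-env
```

Note that if the Provider has no free environment, `next` raises `StopIteration`, so callers may want to guard with a default, e.g. `next((...), None)`.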
### Datatoken Interface
Dispenser utils:
<details>
<summary><a href="https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#LL336C5-L377C18"><code>datatoken.create_dispenser(self, tx_dict: dict, max_tokens: Optional[Union[int, str]] = None, max_balance: Optional[Union[int, str]] = None, with_mint: Optional[bool] = True)</code></a></summary>
Through a datatoken, you can deploy a new dispenser schema, which is used for creating free assets because its behaviour is similar to a faucet. ⛲
It is implemented in `DatatokenBase` and inherited by `Datatoken2`, so it can be called on instances of either class.
Each parameter has the following meaning:
1. `tx_dict` - the configuration `dictionary` for that specific transaction. Usually for `development` we include just the `from` wallet, but for remote networks you can provide gas fees, required confirmations for the block, etc. For more info, check the [Brownie docs](https://eth-brownie.readthedocs.io/en/stable/).
2. `max_tokens` - maximum amount of tokens to dispense in wei. The default is a large number.
3. `max_balance` - maximum balance of the requester in wei. The default is a large number.
4. `with_mint` - boolean, `True` by default, which allows the dispenser to mint the datatoken.
The return value is a hex string denoting the transaction hash of the dispenser deployment.
```python
@enforce_types
def create_dispenser(
self,
tx_dict: dict,
max_tokens: Optional[Union[int, str]] = None,
max_balance: Optional[Union[int, str]] = None,
with_mint: Optional[bool] = True,
):
"""
For this datatoken, create a dispenser faucet for free tokens.
This wraps the smart contract method Datatoken.createDispenser()
with a simpler interface.
:param: max_tokens - max # tokens to dispense, in wei
:param: max_balance - max balance of requester
:tx_dict: e.g. {"from": alice_wallet}
:return: tx
"""
# already created, so nothing to do
if self.dispenser_status().active:
return
# set max_tokens, max_balance if needed
max_tokens = max_tokens or MAX_UINT256
max_balance = max_balance or MAX_UINT256
# args for contract tx
dispenser_addr = get_address_of_type(self.config_dict, "Dispenser")
with_mint = with_mint # True -> can always mint more
allowed_swapper = ZERO_ADDRESS # 0 -> so anyone can call dispense
# do contract tx
tx = self.createDispenser(
dispenser_addr,
max_tokens,
max_balance,
with_mint,
allowed_swapper,
tx_dict,
)
return tx
```
</details>
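When the caller omits `max_tokens` or `max_balance`, `create_dispenser` falls back to `MAX_UINT256` using Python's `or` idiom, as the method body above shows. A pure-Python sketch of that defaulting (the helper name is ours, not part of ocean.py):

```python
# Sketch of the default-cap logic in create_dispenser: a falsy max_tokens
# or max_balance is replaced by MAX_UINT256, an effectively unlimited cap.
from typing import Optional, Tuple, Union

MAX_UINT256 = 2**256 - 1  # largest value a uint256 contract argument can hold

def resolve_caps(
    max_tokens: Optional[Union[int, str]] = None,
    max_balance: Optional[Union[int, str]] = None,
) -> Tuple[Union[int, str], Union[int, str]]:
    return (max_tokens or MAX_UINT256, max_balance or MAX_UINT256)

print(resolve_caps() == (MAX_UINT256, MAX_UINT256))  # True
print(resolve_caps(max_tokens=100)[0])               # 100
```

One consequence of the `or` idiom: passing an explicit `0` is treated the same as `None` and becomes `MAX_UINT256`.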
<details>
<summary><a href="https://github.com/oceanprotocol/ocean.py/blob/4aa12afd8a933d64bc2ed68d1e5359d0b9ae62f9/ocean_lib/models/datatoken.py#LL379C5-L400C18"><code>datatoken.dispense(self, amount: Union[int, str], tx_dict: dict)</code></a></summary>
This function is used to obtain free datatokens for a user who wants to start an order.