
---
description: Ocean Protocol is built by data scientists, for data scientists.
cover: ../.gitbook/assets/cover/data_science_banner.png
coverY: 0
---

# 📊 Data Science

Ocean Protocol - Built to protect your precious.

## Why should data scientists use Ocean Protocol?

Ocean Protocol is built for data scientists to monetize data effectively and complete the "Data Value Creation Loop". These open-source tools tackle some of the hardest problems data scientists face: how to sell data anonymously, how to sell compute jobs on datasets, how to control access to data, and more. By leveraging blockchain architecture, Ocean gains several tactical advantages over Web2 approaches to these data-sharing problems.

## What are some use cases for Ocean Protocol?

* Enable trustless transactions (i.e. buy, sell, and transfer data)
* Trace data provenance and consumption
* Token gate a website or dApp using datatokens
* Deploy a decentralized data marketplace
* Sell algorithmic compute jobs on private datasets
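The token-gating use case above boils down to an ERC-20 balance check: the site grants access only if the visitor's wallet holds enough datatokens. Here is a minimal plain-Python sketch of that gating logic; the `Datatoken` class, wallet addresses, and `MIN_BALANCE` threshold are illustrative stand-ins for an on-chain `balanceOf` call, not Ocean APIs:

```python
# Toy stand-in for an ERC-20 datatoken: tracks balances per wallet address.
# In production, the gate would query the datatoken contract's balanceOf
# (e.g. via web3.py); this simulation only shows the access-control logic.

MIN_BALANCE = 1.0  # assumed policy: hold at least 1 datatoken to enter


class Datatoken:
    def __init__(self):
        self.balances = {}

    def mint(self, wallet: str, amount: float) -> None:
        self.balances[wallet] = self.balances.get(wallet, 0.0) + amount

    def balance_of(self, wallet: str) -> float:
        return self.balances.get(wallet, 0.0)


def is_access_allowed(token: Datatoken, wallet: str) -> bool:
    """Grant access only if the wallet holds enough datatokens."""
    return token.balance_of(wallet) >= MIN_BALANCE


token = Datatoken()
token.mint("0xAlice", 1.0)

print(is_access_allowed(token, "0xAlice"))  # holder: access granted
print(is_access_allowed(token, "0xBob"))    # non-holder: access denied
```

The same check works for any gated resource (a page, an API route, a dApp feature): the resource server never needs to know who the user is, only that their wallet holds the token.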

## How to design an ML system using Ocean Protocol?

The first step is to tokenize data into data NFTs and datatokens on the blockchain. The Ocean Market offers a no-code way to tokenize data, and the ocean.py and ocean.js libraries give data scientists code options. Data scientists can then build sophisticated ML systems on top of the tokenized data using composable Ocean Protocol tools. ML models can use a variety of Ocean smart contracts, including Ocean's Compute-to-Data, to carry model outputs all the way to last-mile delivery for businesses.
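To make the tokenization step concrete, here is a toy model of the relationship between the two asset types covered in the bullets below: one data NFT represents the base IP of a dataset, and it can deploy one or more datatoken contracts, each granting access under its own terms. This is a plain-Python sketch of that hierarchy under assumed names (`DataNFT`, `create_datatoken`), not the actual ERC-721/ERC-20 contracts or the ocean.py API:

```python
# Toy model of Ocean's two tokenized asset types:
#   DataNFT   ~ ERC-721-like token representing the dataset's base IP
#   Datatoken ~ ERC-20-like token representing one kind of access right
# Class and method names are illustrative, not Ocean contract interfaces.

class Datatoken:
    def __init__(self, symbol: str):
        self.symbol = symbol
        self.balances = {}

    def mint(self, to: str, amount: float) -> None:
        self.balances[to] = self.balances.get(to, 0.0) + amount


class DataNFT:
    def __init__(self, name: str, owner: str):
        self.name = name
        self.owner = owner
        self.datatokens = []

    def create_datatoken(self, symbol: str) -> Datatoken:
        # One data NFT can deploy many datatokens, each with its own
        # access terms (e.g. direct download vs. compute-only).
        dt = Datatoken(symbol)
        self.datatokens.append(dt)
        return dt


nft = DataNFT("Weather dataset v1", owner="0xAlice")
dt_download = nft.create_datatoken("DL-1")   # hypothetical download access
dt_compute = nft.create_datatoken("C2D-1")   # hypothetical compute access
dt_download.mint("0xBob", 1.0)               # Bob acquires download access
```

The key design point the sketch captures: ownership of the asset (the NFT) is separate from access to the asset (datatoken balances), so the publisher can sell many kinds of access without ever transferring the underlying IP.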

* Learn the difference between Ocean Protocol data NFTs and datatokens, the two types of tokenized data assets you need to start building your ML systems.
* Discover Ocean's Compute-to-Data engine, which solves the difficult problem of selling algorithmic compute jobs on your datasets without revealing the contents of the algorithm or dataset to the consumer.
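The Compute-to-Data idea can be sketched in a few lines: the algorithm travels to the data, runs in the data provider's environment, and only the result leaves; the raw records are never exposed to the consumer. The following is a simplified simulation of that flow (no real Ocean provider, sandboxing, or payment; `run_compute_job` and the sample dataset are illustrative assumptions):

```python
# Simplified Compute-to-Data simulation: the consumer submits an algorithm,
# the dataset stays with the provider, and only the computed result returns.

from statistics import mean
from typing import Callable, Sequence

# Provider-side: this data is never transferred to the consumer.
PRIVATE_DATASET = [12.1, 14.8, 13.3, 15.0, 12.9]


def run_compute_job(algorithm: Callable[[Sequence[float]], float]) -> float:
    """Execute the consumer's algorithm next to the data and return
    only its output, never the dataset itself."""
    return algorithm(PRIVATE_DATASET)


# Consumer-side: submit an algorithm, receive only the aggregate result.
result = run_compute_job(mean)
print(result)  # -> 13.62; the mean leaves, the raw values do not
```

In the real system the provider also vets which algorithms may run against a dataset, so a publisher can monetize private data repeatedly while keeping it private.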