
The Future of AI is on Blockchain

Traders Magazine Online News, August 30, 2018

Sebastian Wurst

At the beginning of my professional career, I worked as a Data Scientist, and one of my early projects was the analysis of raw human genome data from patients with Alzheimer’s Disease. Many things were painful back then: we had to recruit participants one by one to enroll in our study, sequencing their genomes to get the data cost us more than 1m from a research grant, we had to set up a costly compute cluster ourselves, and even simple regression analyses took days to finish (per iteration). I especially remember working for weeks on engineering our data structures, optimizing database settings, and manually rewriting the analysis algorithms (because we exceeded our RAM limits), first just to get the analysis to run at all, and then to make it finish in days instead of months. A lot has changed since then.

The three most prominent enterprise technologies today are without doubt AI, blockchain, and IoT, and the driving factor behind them is data; people even go so far as to proclaim that “data is the new oil”. New technologies enable the collection, sharing, and analysis of data, and the automation of decisions based on it, in ways that haven’t been possible before, forming what is essentially a data value chain.

Out of the three, blockchain technology is what assembles the pieces, and a whole ecosystem of data-driven blockchain projects is emerging. This decentralized ecosystem is set to help incentivize people to contribute data, technical resources, and effort:
  • 1st generation projects have been focusing on creating the data infrastructure to connect and integrate data, e.g. IOTA, IoT Chain, or IoTex for data from connected IoT devices, or Streamr for streaming data.
  • 2nd generation projects have been working on creating data marketplaces, e.g. Ocean Protocol, SingularityNet, or Fysical, and crowd data annotation platforms, e.g. Gems or Dbrain.
  • With solutions covering these first steps of the data value chain maturing, my friends @sherm8n and Rahul started working on Raven Protocol, one of the first 3rd generation projects, which will close an important gap at the analysis stage: compute resources for AI training.

A recent OpenAI report showed that “the amount of compute used in the largest AI training runs has been increasing exponentially with a 3.5-month doubling time”; that amounts to a 300,000x increase since 2012.

OpenAI Report: AI and Compute
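
As a rough sanity check of those figures, here is a back-of-the-envelope sketch (my own, not taken from the report): it assumes a constant 3.5-month doubling time and roughly five and a half years between 2012 and the report’s publication.

  # Back-of-the-envelope check of the compute growth figures above.
  # Assumes a constant 3.5-month doubling time; the time window is approximate.
  import math

  doubling_time_months = 3.5
  months_elapsed = 5.5 * 12  # roughly 2012 to mid-2018

  doublings = months_elapsed / doubling_time_months
  growth = 2 ** doublings
  print(f"{doublings:.1f} doublings -> roughly {growth:,.0f}x more compute")

  # Working backwards: how long would a 300,000x increase take at this rate?
  doublings_needed = math.log2(300_000)
  years_needed = doublings_needed * doubling_time_months / 12
  print(f"300,000x needs {doublings_needed:.1f} doublings, about {years_needed:.1f} years")

At this rate, roughly 19 doublings fit into that window, which lands in the same order of magnitude as the 300,000x increase the report cites.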

The immediate consequences from this are:

  • Higher costs, as compute usage is increasing faster than supply
  • Longer lead times for new solutions, as model training takes longer
  • Increased market entry barriers based on access to funding & resources
