
Oracles

Learn to use Harbinger and Chainlink with Tezos


The need to introduce external data

In computability theory, the term oracle describes an entity that can solve problems a Turing machine cannot. In blockchain, the term oracle appears in a similar role.

One of the main challenges when developing tools and applications on blockchain protocols is how to introduce external, off-chain data that cannot be deterministically verified: there is no on-chain way to check whether the introduced data is correct or even malicious. Introducing off-chain data can therefore be unreliable and have severe security implications. So how do we approach the issue of off-chain data reliability?

As you probably already guessed, the answer is simple: oracles.

In the blockchain context, an oracle is a third-party service capable of verifying external data, i.e. a provider that lets contracts query an external data source. Because external data cannot be verified on-chain, an oracle is necessary to uphold reliability. Oracles introduce reliable data about external events onto the blockchain, which makes it possible to consume external API services and trigger actions based on them.

As blockchain ecosystems evolve and the number of Decentralised Finance (DeFi) use cases, protocols, and applications rises, the need to incorporate off-chain data has increased. A thriving ecosystem needs oracles.

info icon

Remember, the term DeFi covers numerous applications built on top of protocols aimed at decentralising financial services and reducing the number of intermediaries involved in financial processes.

So, there are use cases for which access to some off-chain data is necessary, for example recent market data. For such cases, we can use an oracle like Chainlink or Harbinger. Let's take a closer look!

Chainlink provides a decentralised network formed by oracle nodes that are maintained by independent operators. We therefore do not depend on a single node but on a network of nodes: the data becomes more trustworthy and there is no central point of failure.

In short, Chainlink connects smart contracts with off-chain data through its decentralised oracle network, which provides access to off-chain data in a secure and reliable way.

info icon

If you want to take a closer look at Chainlink's source code, we can recommend the Chainlink GitHub repository.

Combining institutional-grade contracts with secure and reliable off-chain data

A cooperation between SmartPy.io and Cryptonomic has made Chainlink natively available to Tezos developers. Tezos and its ecosystem can thus profit from leveraging Chainlink's oracle network to enable applications that rely on trustworthy external data.

Chainlink's oracle network brings the benefit of integrating mature, reliable, and secure oracle solutions. The network builds on a growing number of independent, security-reviewed, and Sybil-resistant node operators. Being able to work with a shared node set also makes integration more seamless and less prone to attacks. Furthermore, Chainlink has serviced smart contracts worth millions of dollars across multiple projects and chains, i.e. its performance has been tested, and it is used by a number of DeFi applications and blockchain protocols.

Another major argument for using Chainlink is that it provides market data for many price feeds, each served by at least seven nodes. The price feed nodes query data from different data aggregators, so the data level is decentralised too. The nodes' answers are then aggregated and pushed on-chain as a price update.
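To illustrate the principle only (the actual aggregation logic is Chainlink's own), such an update can be thought of as a robust aggregate, for example the median, of the individual node answers:

from statistics import median

# Hypothetical answers reported by seven independent oracle nodes (in USD).
node_answers = [2.49, 2.51, 2.50, 2.52, 2.48, 2.50, 2.51]

# Aggregating with the median keeps the final update robust against a few
# outliers or misbehaving nodes.
print(median(node_answers))  # 2.50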

Chainlink also provides long-term viability and flexibility. Its modular design relies on components that can be upgraded individually, which provides flexibility and makes it a long-term option.

Leveraging Chainlink allows for the development of fully-integrated contracts for new applications built on Tezos, which can base the execution of their functions on external data included on-chain. In the end, this opens up the innovative potential of blockchain applications, especially financial ones.

tip icon

Please have a look at this sample project. The smart contracts that integrate Chainlink are written in SmartPy; they are deployed and invoked with ConseilJS.

First, let's have a look at the SmartPy Chainlink template.

There you will find:

  • an oracle contract,
  • an escrow contract,
  • a client contract, and
  • a token contract.

You can see that there is an FA2 implementation of the Chainlink token, LINK. It is used to pay node operators for retrieving data for smart contracts:


# Oracle Requests are paid with tokens handled in an FA2 contract.
# The FA2 contract template can be extended by a proxy entry point to
# ensure transfers to Oracle and payments are synchronized.

FA2 = sp.import_template("FA2.py")

...

class Link_token(FA2.FA2):
    def __init__(self, admin, config, metadata, token_is_proxy = True):
        self.token_is_proxy = token_is_proxy
        if token_is_proxy:
            self.proxy = sp.entry_point(proxy)
        FA2.FA2_core.__init__(self, config, metadata, paused = False, administrator = admin)
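If you want to experiment with the token on its own, it can be originated in a SmartPy test scenario roughly as follows. This is only a sketch: it assumes the FA2 template's default FA2_config options and uses a placeholder metadata URL.

import smartpy as sp

FA2 = sp.import_template("FA2.py")

# Link_token is the class shown above, taken from the Chainlink template.

@sp.add_test(name = "Originate LINK token")
def test():
    scenario = sp.test_scenario()
    admin    = sp.test_account("Administrator")
    token    = Link_token(
        admin    = admin.address,
        config   = FA2.FA2_config(),  # assumed default FA2 options
        metadata = sp.utils.metadata_of_url("https://example.com/metadata.json"),  # placeholder URL
    )
    scenario += token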

Let's have a look at the client contract:


# Client class is both requester and receiver
# As a consequence, overall architecture is simplified

class Client(Client_requester, Client_receiver):
    def __init__(self, escrow,
                       oracle,
                       token_contract,
                       job_id,
                       admin,
                       verify_answer_validity = False,
                       token_is_escrow = False,
                       token_address = None,
                       token_is_proxy = True):

It inherits from the classes Client_requester and Client_receiver. This simplifies things because one less contract needs to be deployed. The Client_requester can send and pay for requests to the oracle via the escrow contract; the Client_receiver expects to be called back by the oracle contract.
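The receiving side of that pattern can be sketched in a few lines of SmartPy. This is not the template's code, just a minimal illustration of an entry point that only accepts results delivered by the oracle contract:

import smartpy as sp

class ReceiverSketch(sp.Contract):
    def __init__(self, oracle):
        # Remember which oracle may deliver results; no value stored yet.
        self.init(oracle = oracle, value = sp.none)

    @sp.entry_point
    def set_value(self, value):
        sp.set_type(value, sp.TInt)
        # Reject calls from anybody but the oracle contract.
        sp.verify(sp.sender == self.data.oracle, message = "NOT_ORACLE")
        self.data.value = sp.some(value)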

Among other things, the template comments describe the request and result types:

    A Request contains:
    tag (sp.TString): `REQUEST_TAG`
    oracle (sp.TAddress): address of the Oracle
    target (sp.TAddress): address (and entry point) of the target
    job_id (sp.TBytes): job_id as required by Oracle
    parameters (TParameters): Optional parameters
    cancel_timeout (sp.TTimestamp): Time after which the Client can cancel the request
    fulfill_timeout (sp.TTimestamp): Time after which the Oracle cannot fulfil anymore
    client_request_id (sp.TNat): Id of the request. Client shall never emits two requests with same id for its own safety.
    
    A Result contains:
    tag (sp.TString): `RESULT_TAG`
    client (sp.TAddress): address of the client requester
    client_request_id (sp.TNat): Id of the request
    result (TValue): result value computed by the Oracle
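Expressed as SmartPy types, those shapes look roughly like the following sketch. The template's actual TRequest and TResult definitions may differ; in particular, TParameters and TValue are template-specific, so simple stand-ins are used here:

import smartpy as sp

# Sketch of the request shape described in the comments above.
TRequest_sketch = sp.TRecord(
    tag               = sp.TString,
    oracle            = sp.TAddress,
    target            = sp.TAddress,
    job_id            = sp.TBytes,
    parameters        = sp.TOption(sp.TBytes),   # stand-in for TParameters
    cancel_timeout    = sp.TTimestamp,
    fulfill_timeout   = sp.TTimestamp,
    client_request_id = sp.TNat,
)

# Sketch of the result shape.
TResult_sketch = sp.TRecord(
    tag               = sp.TString,
    client            = sp.TAddress,
    client_request_id = sp.TNat,
    result            = sp.TInt,                 # stand-in for TValue
)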

Let's have a look at the oracle contract:

# Oracle is the on-chain incarnation of an Oracle provider.
# It maintains a queue of requests of type `TRequest` containing
# a sender, a target entry point and parameters.
class Oracle(sp.Contract):
    def __init__(self,
                 admin,
                 escrow_contract,
                 escrow_address      = None,
                 min_cancel_timeout  = 5,
                 min_fulfill_timeout = 5,
                 min_amount          = 0):
        self.escrow_contract = escrow_contract
        if escrow_address is None:
            escrow_address = escrow_contract.address
        self.init(
            setup = sp.record(
                admin               = admin,
                active              = True,
                min_cancel_timeout  = min_cancel_timeout,
                min_fulfill_timeout = min_fulfill_timeout,
                min_amount          = min_amount,
                escrow              = escrow_address
                ),
            next_id = 0,
            reverse_requests = sp.big_map(tkey = sp.TRecord(client = sp.TAddress, client_request_id = sp.TNat), tvalue = sp.TNat),
            requests = sp.big_map(tkey = sp.TNat, tvalue = TRequest),
        )

There you can see the map of requests, requests = sp.big_map(tkey = sp.TNat, tvalue = TRequest).

At the time of writing, each data provider needs to deploy its own oracle contract, so there is no proxy. This contract has two important entry points:

    @sp.entry_point
    def create_request(self, params):
      ...
      data.requests[data.next_id] = request.copy

and

    @sp.entry_point
    def fulfill_request(self, request_id, result, force):
      ...

The create_request entry point will be called by the escrow contract and fulfill_request by an external source.
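A rough sketch of that access pattern is shown below. The template's real entry points carry much more logic (timeouts, amounts, request bookkeeping), and modelling the data provider as a single admin address is an assumption made here for brevity:

import smartpy as sp

class OracleAccessSketch(sp.Contract):
    def __init__(self, admin, escrow):
        self.init(admin = admin, escrow = escrow, next_id = 0)

    @sp.entry_point
    def create_request(self, params):
        sp.set_type(params, sp.TBytes)  # stand-in for the real request record
        # Only the escrow contract may register new requests.
        sp.verify(sp.sender == self.data.escrow, message = "NOT_ESCROW")
        self.data.next_id += 1

    @sp.entry_point
    def fulfill_request(self, request_id, result, force):
        sp.set_type(request_id, sp.TNat)
        sp.set_type(result, sp.TInt)    # stand-in for the real result type
        sp.set_type(force, sp.TBool)
        # Only the data provider (modelled here as an admin) may answer.
        sp.verify(sp.sender == self.data.admin, message = "NOT_ADMIN")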

Finally, let's have a look at the escrow contract comments:

# Escrow receive the request and payment and transmits the request to the oracle
# The requester can cancel the request and recover his payment after a timeout
# If the Oracle answers before a cancel:
#    the Escrow sends the reward to the Oracle and the answer to Target
class Escrow(sp.Contract):
  ...

So, the client will call the escrow, which will forward a request to the oracle. Because the escrow contract handles the payments and validation, it increases trust in the system.
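The forwarding step itself can be sketched as an inter-contract call. Again, this is not the template's escrow code; the send_request entry point name and the sp.TBytes stand-in for the request type are assumptions:

import smartpy as sp

class EscrowSketch(sp.Contract):
    def __init__(self, oracle):
        self.init(oracle = oracle)

    @sp.entry_point
    def send_request(self, request):
        sp.set_type(request, sp.TBytes)  # stand-in for the real TRequest
        # (In the real contract the token payment would be locked here first.)
        oracle_ep = sp.contract(sp.TBytes, self.data.oracle,
                                entry_point = "create_request").open_some()
        # Relay the request to the oracle's create_request entry point.
        sp.transfer(request, sp.mutez(0), oracle_ep)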

Let's look into this in a test from the template:

        ##########
        # Test 1 #
        ##########
        scenario.h2("requester1 sends a request that gets fulfilled")
        requester1_balance = compute_balance(scenario, link_token, requester1.address)
        escrow_balance     = compute_balance(scenario, link_token, escrow.address)
        scenario.h3("A request")

        scenario += requester1.request_value(request_value_params).run(sender = requester1_admin)

        request_id += 1
        # Founds should be locked
        requester1_new_balance = compute_balance(scenario, link_token, requester1.address)
        escrow_new_balance     = compute_balance(scenario, link_token, escrow.address)
        scenario.verify(sp.as_nat(requester1_balance - amount) == requester1_new_balance)
        scenario.verify(escrow_balance + amount == escrow_new_balance)

        scenario.h3("Ledger")
        scenario.show(link_token.data.ledger)

        oracle_balance = compute_balance(scenario, link_token, oracle.address)
        escrow_balance = compute_balance(scenario, link_token, escrow.address)
        # Request must be registered in oracle
        request_key = sp.record(client = requester1.address, client_request_id = request_id)
        scenario.verify(oracle.data.reverse_requests.contains(request_key))

        scenario.h3("Oracle consumes the request")

        scenario += oracle.fulfill_request(request_id = request_id - 1, result = value_int(2_500_000), force = False).run(sender = oracle1)

        # Founds must be unlocked
        oracle_new_balance = compute_balance(scenario, link_token, oracle.address)
        escrow_new_balance = compute_balance(scenario, link_token, escrow.address)
        scenario.verify(oracle_balance + amount == oracle_new_balance)
        scenario.verify(sp.as_nat(escrow_balance - amount) == escrow_new_balance)
        # Request must be removed from oracle
        scenario.verify(~oracle.data.reverse_requests.contains(request_key))
        # Receiver must have registered the result
        scenario.verify_equal(receiver1.data.value.open_some(), 2_500_000)

So there are two calls (and a lot of validations because this is a test):

        scenario += requester1.request_value(request_value_params).run(sender = requester1_admin)

and

        scenario += oracle.fulfill_request(request_id = request_id - 1, result = value_int(2_500_000), force = False).run(sender = oracle1)

These are the calls you will need to make from your clients once all contracts are deployed.

Chainlink contracts

The data provider will:

  • check the storage of the oracle contract, and
  • process a request and call fulfill_request on the oracle contract.

The data seeker will:

  • call request_value with the token amount and timeout parameters, and
  • check the storage for the requested value.
tip icon

You will also need to deploy a faucet to get some tokens to test this. Have a look at the Writing Smart Contracts section to see how you can deploy the contracts with SmartPy and also at the Developing Clients section to see how you can fetch the storage and call the entry points of deployed contracts.

reading icon

Harbinger: Bringing a price oracle to the Tezos ecosystem

For many DeFi applications to run seamlessly, a trustworthy and reliable price feed is vital. Among other things, a price feed enables derivatives, futures smart contracts, and insurance.

It comes down to a question of trust. Because DeFi applications involve financial assets whose value depends strongly on the accuracy, reliability, and safety of price data, it is important to integrate a price oracle on protocols like Tezos; otherwise, maliciously manipulated data can create unreliable and untrustworthy price feeds. Harbinger is such a price oracle for digital assets. It provides a set of tools and reference contracts with which anyone can deploy a price oracle on Tezos.

Harbinger allows you to fetch signed price feeds built from exchange market data. An exchange, or a signer acting on its behalf, signs the price data with its private key, and a so-called poster then includes the signed prices on-chain.

info icon

Exchanges can provide Harbinger-compatible price feeds at low cost through a reference signer.

The poster "posts" the signed price on-chain. Once a smart contract is deployed that relies on the signed price, the signing key is checked, and if correct, the contract deploys using it. This ensures that only trustworthy, i.e. "good", data is used.

How signed price data gets on-chain

Let's take a step-by-step look at how Harbinger works and at its different conceptual elements.

Harbinger price data process
tip icon

Do you need a Harbinger quickstart? Take a look at this.

Harbinger consists of three main components:

  • the contracts that keep track of the price data,
  • the signers, who retrieve and sign price data, and
  • the posters, who retrieve price data from the signers and push it to the oracle contract.

In Harbinger, a reference signer is used to support market data APIs from exchanges like Coinbase, Binance, and OKEx. The architecture allows several exchanges to provide signed price data for the same asset, so multiple data sources are possible for the same price. This increases the trust and reliability of the price data, as a medianised price can be calculated.

Using the reference signer increases security: it deploys independently of the exchange's network and servers, does not have to be maintained, and has a secure caching API gateway. Posters cannot request a price from the APIs more than once per minute.

To provide reliable and up-to-date price data, posters get signed prices either from a signer or directly from an exchange. A poster can post the prices on-chain at any time. Updating prices is thus decentralised: anyone can post price data from multiple sources, and posting does not have to rely on a single poster.

As mentioned before, a price is only accepted by the storage contract if the data carries a valid signature from the expected signer.

Posters can run as Serverless Framework applications or as a command-line process that fetches price data at regular intervals.
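As an illustration only, a tiny command-line poster could look like the sketch below; the signer URL and the post_to_oracle helper are hypothetical placeholders, not part of Harbinger's actual tooling:

import time
import requests  # third-party HTTP client

SIGNER_URL = "https://example.com/prices/XTZ-USD"  # hypothetical signer endpoint

def post_to_oracle(signed_price):
    # Hypothetical placeholder: push the signed price to the oracle contract,
    # e.g. via the Harbinger tooling or a Tezos client library.
    print("posting", signed_price)

while True:
    # Fetch the latest signed candle from the signer...
    signed_price = requests.get(SIGNER_URL, timeout = 10).json()
    # ...and post it on-chain; anyone can run such a poster.
    post_to_oracle(signed_price)
    time.sleep(60)  # the signer's API serves a fresh price only about once per minute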

Harbinger's price storage contracts and normaliser contract

Developers can leverage Harbinger while tuning the contract implementations with regard to gas costs, storage use, and data points. This is done through two reference contracts:

  • the price storage contract: stores the newest price data for each market at a given point in time, and
  • the normaliser contract: normalises a configurable number of data points for one market over a configurable range of updates.

The price storage contract is written in SmartPy and compiled to Michelson. The contract is initialised with the signer's public key. The newest price data is retrieved from a signer and stored in the contract.

Once data is posted using the price storage contract, it can then be pushed to the normaliser contract either by the same poster or another entity.

The normaliser contract is also a reference contract. It normalises price data across multiple updates by calculating a volume-weighted average price. The contract can be configured with regard to the type of weighted price it calculates and the number of data points that are included when normalising the price data.
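To make the calculation concrete, here is a plain-Python sketch of a volume-weighted average price over a few recent updates; the on-chain normaliser performs the equivalent computation over its configured number of data points:

# Each update carries a close price and the traded volume for its period.
updates = [
    {"close": 2.51, "volume": 1200},
    {"close": 2.49, "volume": 800},
    {"close": 2.53, "volume": 1500},
]

vwap = sum(u["close"] * u["volume"] for u in updates) / sum(u["volume"] for u in updates)
print(round(vwap, 4))  # volume-weighted average of the three close prices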

Posting the newest price data and updating a normaliser contract can both be performed by invoking a callback to the storage contract, so this happens in a single atomic transaction instead of two.

How to use Harbinger

Basically, you can deploy the oracle and normaliser contracts with the Harbinger CLI.

$ npm i -g @tacoinfra/harbinger-cli

If you deploy the contract, you will need to pass a public key for the signer.

tip icon

There are already contracts deployed for:

You can see the public keys of the signer in the storage.

Notice that only a signer can update a price feed.

Then you can use the Harbinger Serverless Price Feed Signer to sign price feeds. The CLI can also update the contract with a signed price. In addition, you can use the Harbinger Serverless Poster, which can update the price automatically.

tip icon

If you don't want to use those high level tools, have a look at the harbinger-lib repository.

In the price storage contract, each item will have:

  • oracle data for asset,
  • period start,
  • period end,
  • open,
  • high,
  • low, and
  • close,

as you can see here for XTZ:

tip icon

Have a look at the Developing Clients section to see how you can fetch the storage. Notice that it might be better to fetch the rates from the normaliser contract because of the volume-weighted average price it contains.
