Apibara Blog

News and updates about Apibara


26 Sept 2023

Apibara Changelog #6

Testing should help developers move faster without breaking things. When building indexers, it often feels like the opposite. That's why we decided to add a built-in tool to Apibara to quickly test your indexers.

Apibara testing tool

If you want to start testing your indexers, update the Apibara CLI to the latest version with:

curl -sL https://install.apibara.com | bash

The new apibara test command implements snapshot testing for Apibara indexers. The first time you run a test, it fetches real data from a DNA stream, runs your indexer on it, and stores everything relevant (configuration, input stream, and output) to a snapshot file. You can inspect this file with any text editor and check that the output matches your expectations (pro tip: you can manually edit the expected result to your liking). On subsequent runs, the test command replays the stream data from the snapshot file and compares the indexer's output with what is stored in the snapshot. If the output matches, the test succeeds. If it doesn't, the CLI prints an error message showing the difference between the expected and actual results.

The test command provides options to customize the input stream, such as replaying a specific block range. You can also overwrite an existing snapshot file. We will publish a more detailed testing tutorial in the upcoming days. You can read more by running apibara test --help.
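As a quick sketch, a typical session looks like this (we assume here that the command takes the indexer script as its argument; run apibara test --help for the authoritative usage):

# First run: fetch real data from the DNA stream and record a snapshot.
apibara test indexer.ts

# Subsequent runs: replay the snapshot and compare the indexer's output.
apibara test indexer.ts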

19 Sept 2023

Apibara Changelog #5

This week, we released an update to all Apibara integrations. The update adds a new gRPC service to query an indexer's status and progress.

Upgrading is easy. Check which integrations you have installed:

$ apibara plugins list
# NAME      KIND   VERSION
# mongo     sink   0.3.0
# postgres  sink   0.3.0
# console   sink   0.3.0
# parquet   sink   0.3.0

Then, upgrade each plugin. For example:

$ apibara plugins install sink-mongo

Every time you run an indexer, the status server will automatically start in the background. The server binds to a random port to allow you to run multiple indexers simultaneously, so look for the following message to find out how to reach your server.

INFO apibara_sink_common::status: status server listening on 0.0.0.0:8118

Alternatively, specify an address and port with the --status-server-address flag, for example --status-server-address=0.0.0.0:8118.

While the indexer is running, query its state using a gRPC client. In this example, we use grpcurl to query it from the command line. The gRPC service definition is available on GitHub (don't forget to star and subscribe while you're there) so you can generate a client in your favourite language!

# The status server supports reflection!
$ grpcurl -plaintext localhost:8118 list
apibara.sink.v1.Status
grpc.reflection.v1alpha.ServerReflection
# The only method exposed is `GetStatus`
$ grpcurl -plaintext localhost:8118 list apibara.sink.v1.Status
apibara.sink.v1.Status.GetStatus
# Call this method to get the current status
$ grpcurl -plaintext localhost:8118 apibara.sink.v1.Status.GetStatus
{
  "status": "SINK_STATUS_RUNNING",
  "startingBlock": "3129",
  "currentBlock": "4248",
  "headBlock": "241673"
}
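If you prefer querying the status server from code, here is a minimal Node.js sketch using @grpc/grpc-js and @grpc/proto-loader. The status.proto file path is an assumption: download the service definition from the Apibara GitHub repository first.

import * as grpc from "@grpc/grpc-js";
import * as protoLoader from "@grpc/proto-loader";

// Load the service definition (file path is illustrative).
const definition = protoLoader.loadSync("status.proto");
const proto = grpc.loadPackageDefinition(definition) as any;

// Connect to the address printed in the indexer logs.
const client = new proto.apibara.sink.v1.Status(
  "localhost:8118",
  grpc.credentials.createInsecure(),
);

// Same call as the grpcurl example above.
client.GetStatus({}, (err: Error | null, status: unknown) => {
  if (err) throw err;
  console.log(status);
});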

This API is the last piece needed before working on the new runner abstraction. The runner API enables developers to start, stop and query indexers through a single API. It's like docker-compose but for indexers.

12 Sept 2023

Apibara Changelog #4

Connect the PostgreSQL integration to any cloud provider

This week, we released version 0.3.0 of the PostgreSQL integration. This version supports secure TLS connections between the indexer and the database, so you can synchronize onchain data to hosted PostgreSQL services such as Amazon RDS, Google CloudSQL, Supabase, and Neon.

Head over to the documentation to learn more about the new TLS options and how to connect Apibara to your database.
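As a rough sketch, connecting to a hosted database typically means enabling TLS in a standard libpq-style connection string (the host and credentials below are placeholders; the exact sink options for certificates and verification modes are described in the documentation):

postgres://user:password@myproject.example.com:5432/mydb?sslmode=require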

Indexer Typescript package updates

We updated the @apibara/indexer package with minor fixes to transaction types. Users should update to the latest release for better type-checking and autocomplete support.

5 Sept 2023

Apibara Changelog #3

New endpoint to monitor DNA servers' status

This week, we released version 1.1.2 of the Starknet DNA service. This version brings a new Status gRPC method to query the ingestion state of the DNA server: use it to find the most recent block in the chain and the latest block ingested by the node.

Users running their own Starknet DNA server are encouraged to upgrade their container images to quay.io/apibara/starknet:1.1.2.

New @apibara/indexer Typescript package release

We released an update to the indexer Typescript library. This version fixes some typing issues, especially for Starknet's Filter and Block types. Users can upgrade by changing their imports to point to the new release.

import { Filter } from "https://esm.sh/@apibara/indexer@0.2.0/starknet";

28 Aug 2023

Apibara Changelog #2

Entity mode for the MongoDB integration

This week, we are launching a new entity mode for the MongoDB integration. This new mode gives you even more power, enabling you to insert new entities and update the ones you previously inserted into Mongo.

Before you can try out entity mode, you need to update the MongoDB integration:

apibara plugins install sink-mongo

To enable entity mode, set the entityMode option to true. You must then change your transform function to return a list of entity operations: JSON objects with an entity property that selects which entities to update, and an update property containing a Mongo update operation or Mongo pipeline to apply to them.
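In its simplest form, an entity operation looks like this (the values here are illustrative):

import type { Block } from "https://esm.sh/@apibara/indexer/starknet";

export default function transform(_block: Block) {
  // `entity` selects which document(s) to update; `update` is any Mongo
  // update document or aggregation pipeline applied to them.
  return [{
    entity: { tokenId: "1" },
    update: { "$set": { owner: "0x1234" } },
  }];
}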

For example, the following operation updates the owner of an NFT token and, at the same time, increases the transfer count on the same token.

import { uint256 } from "https://esm.sh/starknet";
import type { Block } from "https://esm.sh/@apibara/indexer/starknet";

export default function transform({ header, events }: Block) {
  const tokenIdCount = new Map<string, number>();
  const tokenIdToOwner = new Map<string, string>();

  for (const { event } of events) {
    const dest = event.data[1];
    const tokenId = uint256.uint256ToBN({
      low: event.data[2],
      high: event.data[3],
    });
    tokenIdToOwner.set(tokenId.toString(), dest);
    tokenIdCount.set(
      tokenId.toString(),
      (tokenIdCount.get(tokenId.toString()) ?? 0) + 1,
    );
  }

  return [...tokenIdToOwner.entries()].map(([tokenId, owner]) => ({
    entity: {
      tokenId,
    },
    update: {
      "$set": {
        tokenId,
        owner,
        updateAt: header?.timestamp,
        updateAtBlock: header?.blockNumber,
      },
      "$inc": {
        transferCount: tokenIdCount.get(tokenId) ?? 0,
      },
    },
  }));
}

You can read more about entity mode, including details about its implementation, in our documentation. Looking forward to seeing what you're going to build with it!

25 Aug 2023

Indexing NFT Metadata with Apibara and Inngest

In this tutorial, we are going to show how to index NFT metadata by leveraging a serverless job queue like Inngest.

Implementing a robust and scalable NFT metadata indexer is hard; your indexer needs to take the following into consideration:

  • you need to "discover" NFT tokens by listening to onchain activity.
  • fetching the NFT token URL is slow because it involves a JSON-RPC call.
  • the NFT metadata server may be temporarily or permanently unavailable.
  • you need to take into account rate limits for both the JSON-RPC server and the metadata servers.
  • you want to concurrently fetch the metadata for as many tokens as possible to speed up indexing.

Luckily, all of the issues above can be solved using modern developer tools like Apibara and Inngest.

Apibara is an open-source platform to build indexers. Our philosophy is to focus on streaming and transforming data and then sending the result to third-party integrations. In this case, we use Apibara to trigger jobs in a task queue.

Inngest is a serverless task queue: you start by implementing durable tasks using Javascript or Typescript. Durable tasks are functions composed of one or more steps (for example, fetch the token URL, then fetch the metadata). Inngest runs each step in order, automatically retrying a step if it fails. With Inngest, you can implement complex workflows without having to worry about scheduling or retries.

In the next sections, you will learn how to:

  • set up Deno and Inngest.
  • implement a durable task that fetches NFT metadata.
  • write an Apibara indexer that triggers the task for every minted token.

Before we begin, you should visit the getting started guide to learn how to install and configure Apibara.

The image below contains the reference architecture of what we are going to build in this tutorial:

  • an indexer streams data from a DNA server.
  • the indexer uses onchain data to determine which NFT needs indexing and invokes a new Inngest task.
  • Inngest schedules workers to index the NFT metadata.

Combining Apibara with Inngest

As always, the source code for this tutorial is available on GitHub.

Setting up Deno & Inngest

For this tutorial, we are going to use Deno as the Javascript runtime. Refer to this guide to set up Deno on your machine. Note that you can follow along using Node.js if you prefer.

We start by creating a src/inngest folder to contain all Inngest-related code.

We create a file src/inngest/client.ts that contains the definition of our Inngest client, together with the schema of the events that trigger our tasks. Notice that since we are running Inngest locally, we use the "local" eventKey.

import { EventSchemas, Inngest } from "https://esm.sh/inngest";

type Events = {
  "nft/mint": {
    data: {
      address: string;
      tokenId: string;
    };
  };
};

export const inngest = new Inngest({
  name: "NFT Metadata Tutorial",
  eventKey: "local",
  schemas: new EventSchemas().fromRecord<Events>(),
});

The next step is to create a file containing the definition of the task we want to run. We do that in src/inngest/fetch_metadata.ts. You can learn more about writing Inngest functions in the official documentation.

import { inngest } from "./client.ts";

export const fetchMetadata = inngest.createFunction(
  { name: "fetchMetadata" },
  { event: "nft/mint" },
  async ({ event, step }) => {
    // ⚡ Use `step.run` to run a step that may fail. Inngest will
    // automatically retry it if it fails.
    const metadataUrl = await step.run("Fetch token URL", () => {
      // Here we could fetch the metadata URL from the node using an RPC call.
      return `https://cloud.argent-api.com/v1/moments/metadata/1/${event.data.tokenId}`;
    });

    const metadata = await step.run("Fetch metadata", async () => {
      const response = await fetch(metadataUrl);
      return await response.json();
    });

    return {
      event,
      body: metadata,
    };
  },
);

The last step is to create the HTTP server that we will use later to start new tasks. In this case we use express, but you can integrate with other frameworks such as Next.js. We implement the server in src/server.ts:

import express from "https://esm.sh/express";
import { serve } from "https://esm.sh/inngest/express";
import { inngest } from "./inngest/client.ts";
import { fetchMetadata } from "./inngest/fetch_metadata.ts";

const app = express();

// @ts-ignore - express types are wrong
app.use(express.json());
app.use("/api/inngest", serve(inngest, [fetchMetadata]));

app.get("/health", (_req, res) => {
  res.send("OK");
});

app.listen(8000, () => {
  console.log("Started server on port 8000");
});

Starting Inngest

We are now ready to start the Inngest server:

  • From the root of your project, run deno run --allow-all src/server.ts to start the express server.
  • In another terminal, start the Inngest UI with npx inngest-cli@latest dev -u http://localhost:8000/api/inngest and then visit http://127.0.0.1:8288.

If you navigate to the "Apps" section, you should see the application we defined in src/inngest/client.ts.

Inngest UI

We are now ready to invoke Inngest functions using Apibara.

Trigger functions with Apibara

We are going to write an Apibara indexer to invoke Inngest functions. Inngest provides an HTTP endpoint where we can send events (like the nft/mint event we defined) to trigger the metadata-fetching function from the previous section. We are going to use the Webhook integration to invoke this endpoint for each minted NFT.

For this tutorial, we are going to use the "Argent: Xplorer" collection as an example, but you can use the same strategy on any NFT collection.

We are going to create a src/indexer.ts file. This file contains the indexer configuration and a transform function (more on this later). We configure the indexer to receive Transfer events from the 0x01b2...3066 smart contract, starting at block 54,900 (when the contract was deployed). Finally, we configure the sink: in this case, we use the webhook sink to send the data returned by the transform function to the HTTP endpoint specified in the configuration. We turn on the raw option to send data to the endpoint exactly as it's returned by the transform function.

import { hash, uint256 } from "https://esm.sh/starknet";
import type { Config } from "https://esm.sh/@apibara/indexer";
import type {
  Starknet,
  Block,
  BlockHeader,
  EventWithTransaction,
} from "https://esm.sh/@apibara/indexer/starknet";
import type { Webhook } from "https://esm.sh/@apibara/indexer/sink/webhook";

export const config: Config<Starknet, Webhook> = {
  streamUrl: "https://mainnet.starknet.a5a.ch",
  startingBlock: 54_900,
  network: "starknet",
  filter: {
    header: {
      weak: true,
    },
    events: [
      {
        fromAddress:
          "0x01b22f7a9d18754c994ae0ee9adb4628d414232e3ebd748c386ac286f86c3066",
        keys: [hash.getSelectorFromName("Transfer")],
      },
    ],
  },
  sinkType: "webhook",
  sinkOptions: {
    targetUrl: "http://localhost:8288/e/env_key",
    raw: true,
  },
};

As we mentioned earlier, Apibara uses the transform function exported by the script to transform each Starknet block into a piece of data that is specific to your application. In this case, we want to perform the following:

  • decode each Transfer event in the block.
  • keep only mint events, that is, transfers from the 0x0 address.
  • return one nft/mint event payload for each minted token.

Note that we can schedule multiple tasks by sending a list of event payloads.

Add the following code at the end of src/indexer.ts. Since an Apibara indexer is just regular Typescript, you can continue using any library you already use and share code with your frontend.

export default function transform({ header, events }: Block) {
  return events.flatMap((event) => transferToTask(header!, event));
}

function transferToTask(
  _header: BlockHeader,
  { event }: EventWithTransaction,
) {
  // Mint events are transfers from the 0x0 address.
  const from = BigInt(event.data[0]);
  if (from !== 0n) {
    return [];
  }

  const tokenId = uint256.uint256ToBN({
    low: event.data[2],
    high: event.data[3],
  }).toString();

  return [{
    name: "nft/mint",
    data: {
      address: event.fromAddress,
      tokenId,
    },
  }];
}

Now you can run the indexer with apibara run src/indexer.ts -A <dna-token>, where <dna-token> is your Apibara DNA authentication token (you can create one in the Apibara dashboard). You will see your indexer going through Starknet events block by block and pushing new tasks to Inngest.

You can see all function invocations in the Inngest UI. Select one event to see the function steps in real-time, together with their return values.

Inngest UI with events

What's next

This tutorial showed how to get started integrating Inngest with Apibara. If you want to take it further and use it for your project, you can explore fetching the token URL directly from your contract with a JSON-RPC call (instead of hardcoding the metadata endpoint), handling transfers and burns in addition to mints, or storing the fetched metadata in a database using one of the other Apibara integrations.

21 Aug 2023

Apibara Changelog #1

New indexer Typescript SDK

You can now write type-safe indexers using the new Typescript SDK! Use it by importing the @apibara/indexer package from your favourite CDN (like esm.sh or Skypack) and then adding types to your variables and functions.

import type { Config } from "https://esm.sh/@apibara/indexer@0.1.2";
import type {
  Block,
  Starknet,
} from "https://esm.sh/@apibara/indexer@0.1.2/starknet";
import type { Console } from "https://esm.sh/@apibara/indexer@0.1.2/sink/console";

export const config: Config<Starknet, Console> = {
  streamUrl: "https://goerli.starknet.a5a.ch",
  network: "starknet",
  filter: {
    header: {},
  },
  startingBlock: 800_000,
  sinkType: "console",
  sinkOptions: {},
};

export default function transform(block: Block) {
  return block;
}

Integrations updates

We updated the integrations based on your feedback. You can update using the apibara CLI.

Start by listing all the integrations you installed:

$ apibara plugins list
# NAME      KIND   VERSION
# mongo     sink   0.1.0
# postgres  sink   0.1.0
# webhook   sink   0.1.0
# console   sink   0.1.0
# parquet   sink   0.1.0

Then update them one by one:

$ apibara plugins install sink-console

Check that the upgrade was successful.

$ apibara plugins list
# NAME      KIND   VERSION
# mongo     sink   0.2.0
# postgres  sink   0.2.0
# webhook   sink   0.2.0
# console   sink   0.2.0
# parquet   sink   0.2.0

Changes to the indexer transform function

We changed the indexer’s transform function to accept a single block at a time. Previously, this function was invoked with an entire batch of data; talking with early adopters, we realised this behaviour was confusing.

Upgrading your indexers is easy; change your transform function as follows:

diff --git a/script.ts b/script.ts
index 999ba82..ea9667a 100644
--- a/old.ts
+++ b/new.ts
@@ -1,8 +1,5 @@
-export default function transform(batch: Block[]) {
-  return batch.flatMap(transformBlock);
-}
-
-function transformBlock(block: Block) {
+export default function transform(block: Block) {
   // transform a single block
   return block;
 }

Disk persistence for development

Before this release, developers had only two options for persisting the indexer state between restarts:

  • No persistence: the indexer would restart from the beginning every time it was launched. This behaviour is acceptable for the early stages of development but becomes cumbersome later in the development lifecycle.
  • Etcd persistence: store the state in a distributed key-value store and ensure that only one copy of the same indexer runs simultaneously. This is excellent for production usage, but overkill for development.

This release adds a third option:

  • Disk persistence: store the indexer’s state in a file inside a user-specified folder. Developers can restart an indexer simply by deleting its state file. Notice that this model doesn’t ensure that only one copy of the indexer is running at the same time, and so it’s not recommended for production usage. You can enable this option by running your indexer with the --persist-to-fs=<dir> option.
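For example, a development run with disk persistence might look like this (the state directory name here is our choice):

apibara run src/indexer.ts --persist-to-fs=.apibara

Deleting the .apibara directory resets the indexer to its starting block.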

Apibara Installer

We fixed a bug installing the CLI tool on macOS.

Starknet DNA Service

We updated the Starknet DNA Service to work better with nodes implementing the Starknet JSON-RPC Spec v0.4. Here’s how to upgrade if you’re running a Starknet DNA service:

  • Ensure you’re running Pathfinder v0.7.2 or newer
  • Ensure you point the Starknet DNA Service to the RPC v0.4 endpoint (e.g. http://<pathfinder-ip>/rpc/v0.4)
  • Upgrade the apibara/starknet Docker image to v1.1.1

15 Aug 2023

Apibara indexers preview

We are excited to release the first iteration of the Apibara command line tool. This tool is the first step in overhauling the Apibara developer experience to reduce the time needed to build production-grade indexers. It enables developers to easily synchronize onchain data with any offchain service they use: from databases like PostgreSQL and MongoDB to any service that accepts webhooks.

Over the past year, we worked with dozens of teams to understand how they consume onchain data and build applications. We learned that all projects are different, so we wanted a tool that enables them to keep using the tools they already know and love.

The new indexers are built on top of the DNA streams and provide a higher-level developer experience for building indexers.

The new CLI is the main entry point to Apibara: use it to run indexers and manage integrations. Installation is as simple as:

curl -sL https://install.apibara.com | bash

Indexers are implemented in Javascript or Typescript. Apibara embeds a Deno runtime to execute the code users provide on each batch of data it receives in the stream. Thanks to Deno, the indexer scripts are self-contained, and you can run them in a single command. Apibara doesn’t require you to manage half a dozen configuration files.

For example, the following code is enough to index all ERC-20 transfers to a PostgreSQL database. Apibara takes care of all the low-level details such as chain reorganizations.

import { hash, uint256 } from "https://esm.run/starknet@5.14";
import { formatUnits } from "https://esm.run/viem@1.4";

const DECIMALS = 18;

const filter = {
  // Only request header if any event matches.
  header: {
    weak: true,
  },
  events: [
    {
      fromAddress:
        "0x049D36570D4e46f48e99674bd3fcc84644DdD6b96F7C741B1562B82f9e004dC7",
      keys: [
        hash.getSelectorFromName("Transfer"),
      ],
    },
  ],
};

export const config = {
  streamUrl: "https://goerli.starknet.a5a.ch",
  startingBlock: 800_000,
  network: "starknet",
  filter,
  sinkType: "postgres",
  sinkOptions: {
    tableName: "transfers",
  },
};

// Transform each batch of data by decoding the transfers in each block.
export default function transform(batch) {
  return batch.flatMap(decodeTransfersInBlock);
}

function decodeTransfersInBlock({ header, events }) {
  const { blockNumber, blockHash, timestamp } = header;
  return events.map(({ event, receipt }) => {
    const { transactionHash } = receipt;
    const transferId = `${transactionHash}_${event.index}`;
    const [fromAddress, toAddress, amountLow, amountHigh] = event.data;
    const amountRaw = uint256.uint256ToBN({ low: amountLow, high: amountHigh });
    const amount = formatUnits(amountRaw, DECIMALS);
    // Convert to snake_case because it works better with postgres.
    return {
      network: "starknet-goerli",
      symbol: "ETH",
      block_hash: blockHash,
      block_number: +blockNumber,
      block_timestamp: timestamp,
      transaction_hash: transactionHash,
      transfer_id: transferId,
      from_address: fromAddress,
      to_address: toAddress,
      amount: +amount,
      amount_raw: amountRaw.toString(),
    };
  });
}

After data is streamed and transformed, it’s sent to the downstream integration. As of today, Apibara ships with 4 integrations:

  • PostgreSQL: mirror onchain data to a database table.
  • MongoDB: mirror onchain data to a collection.
  • Webhook: invoke a webhook every time a new block is produced.
  • Parquet: generate datasets from onchain data.

This is just the first step in a new journey for Apibara. Over the following weeks, we will launch the following products:

  • A hosted service to deploy your indexers to. Develop your indexers locally with the tools presented today; when it comes time to deploy to production, we will take care of it.
  • A new testing framework for indexers. Record live data streams and replay them offline to test your transformation step.
  • A Typescript library to help develop type-safe indexers.
  • More integrations. From high-tech databases like ClickHouse to low-tech solutions like Google Sheets, Apibara can integrate with any app.

Head over to the getting started page to learn how to set up and run your first indexer in less than 10 minutes.
