Apibara Blog

News and updates about Apibara

Follow us on Twitter

16 Jan 2023

New protocol version released

Today we release the next version of the Apibara streaming protocol. This version lets developers stream exactly the data they need, resulting in indexers that sync up to 100x faster than with the previous version.

Traditionally, developers build indexers by requesting data from the node through a protocol like JSON-RPC. This protocol is not designed for moving large amounts of data, so these indexers can take days to fully sync, preventing teams from iterating quickly on their product.

Apibara solves this by providing an efficient binary protocol to filter and stream any data available in the node directly into your application. You can choose what data you need and stream it using one of our open-source SDKs.

StarkNet is the first network we support. You can access any combination of the following data:

  • Block headers.
  • Transactions and their receipts.
  • Events, together with the transaction that emitted them.
  • Messages from L2 to L1.
  • State updates, including changes in storage, contracts declared and deployed, and nonce updates.

The protocol's previous version will keep running until the end of February 2023. Existing Apibara users are encouraged to migrate to the new protocol since it brings significant improvements in speed and data consumption. If you need help, don't hesitate to get in touch over Discord.

To get started using Apibara, head over to our documentation or the Python or Typescript SDKs.

31 Dec 2022

Apibara protocol v1alpha2 preview

Today we release a preview of the next version of the Apibara protocol, also known as v1alpha2. We are excited for this release to bring many new features that improve developer experience and performance. For the first time, developers have a powerful API to stream any on-chain data directly into their application.

```typescript
// Imports omitted: `FieldElement`, `Filter`, and `v1alpha2` come from the
// Apibara SDK packages, `StreamClient` from `@apibara/protocol`, and
// `hash` from `starknet` (starknet.js).
const address = FieldElement.fromBigInt(
  '0x049d36570d4e46f48e99674bd3fcc84644ddd6b96f7c741b1562b82f9e004dc7'
)
const transferKey = [
  FieldElement.fromBigInt(hash.getSelectorFromName('Transfer'))
]

// Stream Transfer events emitted by the contract, together with
// the contract's storage diffs.
const filter = Filter.create()
  .addEvent((ev) => ev.withFromAddress(address).withKeys(transferKey))
  .withStateUpdate((su) =>
    su.addStorageDiff((st) => st.withContractAddress(address))
  )
  .encode()

const client = new StreamClient({
  url: 'mainnet.starknet.a5a.ch',
}).connect()

client.configure({
  filter,
  batchSize: 20,
  finality: v1alpha2.DataFinality.DATA_STATUS_FINALIZED,
})

for await (const message of client) {
  if (message.data?.data) {
    handleBatch(client, message.data.endCursor, message.data.data)
  }
}
```

The most significant change of this release is that developers can now decide what type of data to receive, including:

  • transactions, filtered by transaction type and parameters like sender or class hash,
  • events, filtered by contract address, keys and event data,
  • state updates, including changes in storage variables.

The protocol optimizes data for developer happiness and efficiency. For example, events include their transaction and transaction receipt since this data is often accessed together.

Developers can now choose three levels of data finality:

  • Finalized: only receive finalized data that will always be valid. On StarkNet, this means data accepted on L1.
  • Accepted: receive data from the canonical chain and notifications about chain reorganizations.
  • Pending: like accepted, but also receive pending data before it is included in an accepted block.
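The three finality levels imply different storage strategies for a consumer: finalized data can be persisted permanently, accepted data may still be invalidated by a reorg, and pending data is only a prediction. The following is a minimal, SDK-independent sketch of routing batches by finality; the `state` layout and `handle_batch` function are illustrative, not part of the Apibara API.

```python
from enum import Enum

class DataFinality(Enum):
    PENDING = "pending"
    ACCEPTED = "accepted"
    FINALIZED = "finalized"

def handle_batch(state, finality, blocks):
    """Route a batch of blocks based on its finality level."""
    if finality is DataFinality.PENDING:
        # Pending data may never land on chain: keep only the latest view.
        state["pending"] = list(blocks)
    elif finality is DataFinality.ACCEPTED:
        # Accepted data is canonical but may still be reorged away.
        state["accepted"].extend(blocks)
        state["pending"] = []
    else:
        # Finalized data is accepted on L1 and will always be valid.
        state["finalized"].extend(blocks)
    return state
```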

You can start playing with the new protocol today, connecting to the streams directly over gRPC or using the Typescript SDK. The alpha 2 version deployed today supports finalized data only. In the upcoming weeks, we will release an update that achieves feature parity with the current protocol.

The new streams are available at the following addresses:

  • mainnet.starknet.a5a.ch: StarkNet mainnet
  • goerli.starknet.a5a.ch: StarkNet Goerli testnet

26 Nov 2022

Optimistic updates with pending blocks

This week we released an update to the StarkNet streams to support sending pending blocks to clients that wish to receive them. Clients receive pending blocks in between regular blocks; note that a client may receive multiple pending blocks one after the other.

Sequence showing multiple pending blocks

This new feature is used in the Python SDK to implement optimistic updates. You can update the state of your dapp before the transaction is accepted on-chain!
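As a sketch of the idea, consider a dapp showing a mint counter. Since consecutive pending blocks each build on the last accepted block, the pending view should be overwritten, not accumulated, and an accepted block discards the prediction entirely. The `OptimisticCounter` class below is a hypothetical illustration, not part of the SDK.

```python
class OptimisticCounter:
    """Tracks e.g. the number of NFTs minted, with an optimistic pending view."""

    def __init__(self):
        self.confirmed = 0
        self.pending = 0

    def on_block(self, mints: int):
        # An accepted block replaces whatever the pending view predicted.
        self.confirmed += mints
        self.pending = 0

    def on_pending_block(self, mints: int):
        # Each pending block is a fresh prediction on top of the last
        # accepted block, so consecutive pending blocks overwrite each other.
        self.pending = mints

    @property
    def value(self) -> int:
        return self.confirmed + self.pending
```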

This is incredibly useful for applications that require low-latency updates to improve their user experience:

  • NFT drops, showing the number of NFTs left to mint in real time.
  • On-chain games, updating the state of the game as soon as a player moves.
  • Social, showing posts and comments in real-time.
  • DeFi, showing more up-to-date prices.

If you want to implement optimistic updates in your indexer, head over to the Python SDK documentation to learn how to get started.

14 Nov 2022

Apibara Typescript SDK

This week we released a new package to consume Apibara streams from Node.js. Developers can now write Apibara indexers using Typescript or Javascript.

The @apibara/protocol package contains a client to connect to any Apibara stream. Once connected, the client receives historical blocks until it reaches the chain's tip. When streaming live data, the client is informed of chain reorganizations to invalidate the application's data and keep the off-chain state consistent with the on-chain state.
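The invalidation logic is simple to sketch in a language-agnostic way: keep the data you derive keyed by the block that produced it, and when a reorg notification arrives, drop everything above the new chain tip. Both functions below are illustrative helpers, not part of the SDK.

```python
def apply_block(state: dict, block_number: int, records: list):
    """Store derived records keyed by the block that produced them."""
    state[block_number] = records

def invalidate(state: dict, new_tip: int):
    """Handle a reorg notification: drop data derived from orphaned blocks."""
    for number in [n for n in state if n > new_tip]:
        del state[number]
```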

We provide Typescript definitions for StarkNet data in the @apibara/starknet package. Use these type definitions to parse Apibara messages into Typescript objects.

```typescript
import { NodeClient, credentials } from '@apibara/protocol'
import { StreamMessagesResponse__Output } from '@apibara/protocol/dist/proto/apibara/node/v1alpha1/StreamMessagesResponse'
import { Block } from '@apibara/starknet'

async function main() {
  // Connect to the StarkNet Goerli stream over TLS.
  const node = new NodeClient('goerli.starknet.stream.apibara.com:443', credentials.createSsl())
  const messages = node.streamMessages({})

  return new Promise((resolve, reject) => {
    messages.on('end', resolve)
    messages.on('error', reject)
    messages.on('data', (data: StreamMessagesResponse__Output) => {
      const value = data.data?.data?.value
      if (value) {
        // Messages carry raw bytes; decode them into typed StarkNet blocks.
        const block = Block.decode(value)
        console.log(`${block.blockNumber} ${block.transactions.length}`)
      }
    })
  })
}

main()
  .then(() => process.exit(0))
  .catch(console.error)
```

You can find more information and examples in our documentation or on GitHub.

Prisma ORM integration

The next step is integrating the Typescript SDK with Prisma ORM. Prisma is the best Typescript ORM: it generates a type-safe database client from your application's model definitions. Prisma supports relational databases like PostgreSQL, MySQL, and SQLite, as well as NoSQL databases like MongoDB. The integration enables chain-aware storage for all Prisma models, so developers can easily roll back their application state after a chain reorganization.
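The core of chain-aware storage can be sketched in plain SQL: tag every row with the block that produced it, and roll back by deleting rows from orphaned blocks. The table name and the `_chain_block` column below are illustrative assumptions, not the integration's actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE transfer (tx_hash TEXT, amount INTEGER, _chain_block INTEGER)"
)

def insert_transfer(tx_hash: str, amount: int, block_number: int):
    # Tag every row with the block it was derived from.
    conn.execute(
        "INSERT INTO transfer VALUES (?, ?, ?)", (tx_hash, amount, block_number)
    )

def rollback_to(block_number: int):
    # After a reorg, delete every row written by an orphaned block.
    conn.execute("DELETE FROM transfer WHERE _chain_block > ?", (block_number,))
```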

23 Oct 2022

Composable streams are here

The latest release of Apibara introduces composable streams. Developers can create new Rust applications whose job is to transform streams of data into new streams.

Composable streams

Composable streams are one of Apibara's defining features; they are the building blocks for many other features, like:

  • Stage caching. By dividing complex workflows into multiple streams, each stage is cached. Development cycles become faster so that you can ship sooner.
  • Building common-good streams. Common data, like token transfers, is shared between multiple applications: no need to duplicate work by having applications compute the same data over and over. Our long-term vision is to decentralize streams so that developers can freely build on top of them.
  • Easy to deploy services. We can now provide a way for developers to create streams without writing any code.
  • Real-time UIs. Create interfaces that always show the most recent data to improve user experience and trust in your application.
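The idea of transforming a stream into a new stream maps naturally onto generator pipelines. The sketch below, with hypothetical block and event shapes, shows one stage extracting transfer events and a second stage folding them into running balances; each stage consumes the previous one and can be cached or shared independently.

```python
def transfer_stream(blocks):
    # Stage 1: turn a stream of raw blocks into a stream of Transfer events.
    for block in blocks:
        for event in block["events"]:
            if event["name"] == "Transfer":
                yield event

def balance_stream(transfers):
    # Stage 2: consume the transfer stream and emit running balances.
    balances = {}
    for t in transfers:
        balances[t["to"]] = balances.get(t["to"], 0) + t["amount"]
        yield dict(balances)
```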

Developers can start building composable streams today using the Rust SDK we released. We will release a gRPC application interface for teams that want to leverage their existing skills and infrastructure.

Rust application

We're going to ETH Lisbon

Our team is going to attend ETH Lisbon next week! If you would like to meet and chat about builders' needs, DM us on Twitter.

3 Oct 2022

Dynamic event filters

This week's release of the Python SDK brings dynamic event filters! Dynamic event filters enable developers to add event filters while indexing, for example in response to another event.

Use cases for dynamic event filters include indexing:

  • DEX pools that are created by pool factories,
  • game rounds that are created by the main instance of the on-chain game.

To try out this feature, upgrade the apibara package to version 0.5.4.

After that, you can subscribe to new events directly in your event handler. For example, the following snippet shows how to decode a PairCreated event from a Uniswap V2-like DEX and then subscribe to the Swap and Sync events on the newly created pair.

```python
async def handle_events(info: Info, block_events: NewEvents):
    for event in block_events.events:
        if event.name == "PairCreated":
            pair_created = decode_event(pair_created_decoder, event.data)
            info.add_event_filters(filters=[
                EventFilter.from_event_name("Swap", address=pair_created.pair),
                EventFilter.from_event_name("Sync", address=pair_created.pair),
            ])
```

You can find more information about dynamic filters in the documentation.

15 Sept 2022

Apibara streams are here

Today we are releasing the new version of Apibara and the Python SDK.

This release introduces Apibara streams: real-time and composable streams of web3 and web2 data. These changes significantly improve the Apibara developer experience:

  • Faster indexing: the protocol can already stream tens of thousands of blocks per minute.
  • More flexible SDK: it's now possible to add features like whole-block indexing and dynamic event filters.
  • Simpler development: developers can get started building in only a few minutes, without depending on Docker.

All users are encouraged to upgrade today; we published a guide on how to migrate from version 0.4 of the Python SDK to the new version 0.5.

What's next

The Apibara streaming protocol is independent of the blockchain being indexed, which enables us to add support for multiple chains in the near future.

The new version also enables us to soon start offering free hosting for all Apibara indexers. Subscribe to this blog to be notified as soon as it becomes available!

18 Jul 2022

EVM Compatibility

We believe that multi-chain applications are the future, so we are making Apibara the best tool for building them. We started our journey by indexing StarkNet events; now we are extending our reach to any EVM-compatible chain. Existing tools silo data in separate chain-specific databases, and in our experience developers spend significant effort integrating these sources together. We reduce the time needed to build multi-chain services by allowing multiple indexers to run side by side.

Chain-aware Document Storage [Preview]

Apibara now provides an API to store documents in a NoSQL database. Chain-aware means that Apibara stores each document together with information about the block it was derived from. This enables the service to automatically invalidate data on chain reorganizations and roll back to the previous database state. From the developer's point of view, an indexer becomes like any other event-driven microservice: they don't need to handle low-level web3 details.

We are going to EthCC!

We are going to be in Paris for EthCC and StarkNetCC! If you'd like to meet us, just DM us on Twitter. We are excited to spend a week surrounded by developers, see what everyone is working on, and share ideas with the amazing Ethereum community.

Open-source on-chain data streaming platform secured by Zero Knowledge proofs.

Resources

Blog

© 2023 GNC Labs Limited. All rights reserved.