Attention: The code in this repository is intended for experimental use only and is not fully tested, documented, or supported by SingleStore. Visit the SingleStore Forums to ask questions about this repository.
SingleStore is great at loading data in Avro format. This sample shows both how to load specific files once and how to create a pipeline that continuously loads Avro files.
Load a file once:
Continuously load lots of files with a pipeline:
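These two approaches map to two SQL statements: a one-time `LOAD DATA`, and a `CREATE PIPELINE` for continuous ingestion. The sketch below is illustrative only — the table name, field mappings, Kafka endpoint, and registry URL are hypothetical; the statements this sample actually uses are in `init.sql`:

```sql
-- One-time load of a single Avro file (hypothetical table and fields).
LOAD DATA INFILE '/data/books.avro'
INTO TABLE books
FORMAT AVRO
( id <- %::id,
  title <- %::title );

-- Continuous ingestion: a pipeline reading Avro messages from a Kafka topic,
-- resolving schemas through the registry (hostnames and topic are hypothetical).
CREATE PIPELINE books_pipeline AS
LOAD DATA KAFKA 'kafka:9092/books'
INTO TABLE books
FORMAT AVRO
SCHEMA REGISTRY 'http://schema-registry:8081'
( id <- %::id,
  title <- %::title );

START PIPELINE books_pipeline;
```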
- Sign up for a free-tier SingleStore license at https://www.singlestore.com/free-software/
- Start a SingleStore cluster, a Kafka cluster, and an Avro schema registry. You may use a Managed Service cluster, or you can start everything with Docker:

  ```
  docker-compose up
  ```

  Note: this terminal will keep running, showing console output from Kafka, Zookeeper, the Avro registry, and SingleStore.
- Adjust the schema registry URL in `avro-consumer.js` and `avro-producer.js`.
- Install npm packages from a terminal:

  ```
  npm install
  ```
- Create and publish the schema in the Avro registry from a terminal:

  ```
  node registry-create-schema.js
  ```
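The registry script publishes an Avro schema, expressed as JSON, to the registry's REST API. Below is a minimal record schema of the kind `registry-create-schema.js` might register — the record name and fields here are hypothetical, not necessarily the repo's actual schema:

```json
{
  "type": "record",
  "name": "Book",
  "fields": [
    { "name": "id",    "type": "int" },
    { "name": "title", "type": "string" }
  ]
}
```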
- Create some Avro files from a terminal:

  ```
  node avro-producer.js
  ```
- Optional: you can run `node avro-consumer.js` to read these binary files and log them to the terminal.
- Copy `init.sql` into your favorite code editor.
- Run each part of `init.sql` to ingest the Avro files into the table.
- When you're done, shut down the Docker containers from a terminal:

  ```
  docker-compose down
  ```