SingleStore AVRO Sample

Attention: The code in this repository is intended for experimental use only and is not fully tested, documented, or supported by SingleStore. Visit the SingleStore Forums to ask questions about this repository.

SingleStore is great at loading data in Avro format. This sample shows both how to load specific files once and how to create a pipeline that continuously loads Avro files.
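The two ingestion paths boil down to a one-time LOAD DATA statement and a CREATE PIPELINE statement. A rough sketch, with hypothetical table, path, and field names — the actual statements for this sample live in init.sql:

```sql
-- One-time load of a single Avro file (path, table, and mappings are hypothetical).
LOAD DATA INFILE '/data/books.avro'
INTO TABLE books
FORMAT AVRO
( id <- %::id, title <- %::title );

-- Continuous ingestion: a pipeline that picks up new Avro files as they appear.
CREATE PIPELINE books_pipeline AS
LOAD DATA FS '/data/*.avro'
INTO TABLE books
FORMAT AVRO
( id <- %::id, title <- %::title );

START PIPELINE books_pipeline;
```

The `field <- %::path` mappings pull named fields out of each Avro record into table columns.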

Watch

Load a file once:

Load Avro file into SingleStore

Continuously load lots of files with a pipeline:

SingleStore Pipelines load Avro files

To Use

  1. Sign up for a free-tier SingleStore license at https://www.singlestore.com/free-software/

  2. Start a SingleStore cluster, a Kafka cluster, and an Avro schema registry. You can use a Managed Service cluster, or you can start everything locally with Docker: docker-compose up. Note: this terminal will continue running, showing you console output from Kafka, ZooKeeper, the Avro registry, and SingleStore.

  3. Adjust the schema registry URL in avro-consumer.js and avro-producer.js.

  4. Install npm packages from a terminal:

    npm install
  5. Create and publish the schema in the Avro registry from a terminal:

    node registry-create-schema.js
  6. Create some Avro files from a terminal:

    node avro-producer.js
  7. Optional: run node avro-consumer.js to read the binary files back and log their contents to the terminal.

  8. Open init.sql in your favorite code editor.

  9. Run each part of init.sql to ingest the Avro files into the table.

  10. When you're done, shut down the Docker containers from a terminal:

    docker-compose down
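Steps 5 and 6 above revolve around a Confluent-style schema registry: a schema is registered under a subject, then used to encode records. A minimal sketch of the registration half, assuming a registry at localhost:8081 — the subject, record, and field names here are hypothetical, not the ones used by registry-create-schema.js:

```javascript
// Hypothetical Avro record schema; the real one lives in registry-create-schema.js.
const schema = {
  type: 'record',
  name: 'SensorReading',
  fields: [
    { name: 'id', type: 'int' },
    { name: 'value', type: 'double' },
  ],
};

// The registry's REST API expects the schema itself as a JSON-encoded string
// inside the request body, so the record definition is stringified twice.
const body = JSON.stringify({ schema: JSON.stringify(schema) });
console.log(body);

// Publishing requires a running registry, e.g.:
// fetch('http://localhost:8081/subjects/sensor-value/versions', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/vnd.schemaregistry.v1+json' },
//   body,
// }).then((res) => res.json()).then(console.log);
```

The double stringify is easy to miss: the outer JSON is the request envelope, the inner string is the Avro schema document itself.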