This is an example of using Akka Streams and Alpakka to stream:
- a MongoDB collection to AWS S3.
- a file of Extended JSON documents stored on AWS S3 back into a MongoDB collection.
This example contains the full runnable code presented in a two-part article:
- Crafting production-ready Backup as a Service solution using Akka Streams
- Crafting production-ready Backup as a Service solution using Akka Streams: part 2

It is built with:
- Akka Streams
- Alpakka S3 connector
- MongoDB Reactive Streams driver
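For a flavor of how these pieces fit together, here is a minimal sketch of the backup direction: documents are read from MongoDB through the Reactive Streams driver, serialized to Extended JSON (one document per line), and pushed to S3 with Alpakka's multipart upload. The bucket and key names are placeholders, and the wiring is a simplification of what the repository's code actually does:

```scala
import akka.actor.ActorSystem
import akka.stream.alpakka.s3.scaladsl.S3
import akka.stream.scaladsl.Source
import akka.util.ByteString
import com.mongodb.reactivestreams.client.MongoClients

object BackupSketch extends App {
  implicit val system: ActorSystem = ActorSystem("backup")
  import system.dispatcher

  val collection = MongoClients.create("mongodb://localhost:27017")
    .getDatabase("CookieDB")
    .getCollection("cookies")

  // Each document becomes one line of Extended JSON in the uploaded file.
  // Bucket and key are placeholders.
  val done = Source.fromPublisher(collection.find())
    .map(doc => ByteString(doc.toJson + "\n"))
    .runWith(S3.multipartUpload("my-backup-bucket", "cookies.json"))

  done.onComplete { result =>
    println(result)
    system.terminate()
  }
}
```

Multipart upload lets the stream hand chunks to S3 as they arrive, so the whole collection never has to fit in memory.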
The scenario located in Main is the following:
- Perform backup to S3.
- Drop the collection.
- Perform restore.
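The restore step is essentially the backup pipeline reversed. Below is a minimal sketch, assuming the Alpakka 1.x-style S3.download API (newer releases also provide S3.getObject), the same placeholder bucket and key as above, and an arbitrary batch size of 100 for insertMany:

```scala
import akka.actor.ActorSystem
import akka.stream.alpakka.s3.scaladsl.S3
import akka.stream.scaladsl.{Framing, Sink, Source}
import akka.util.ByteString
import com.mongodb.reactivestreams.client.MongoClients
import org.bson.Document
import scala.jdk.CollectionConverters._

object RestoreSketch extends App {
  implicit val system: ActorSystem = ActorSystem("restore")
  import system.dispatcher

  val collection = MongoClients.create("mongodb://localhost:27017")
    .getDatabase("CookieDB")
    .getCollection("cookies")

  // Placeholder bucket/key, matching the backup sketch above.
  val done = S3.download("my-backup-bucket", "cookies.json")
    .flatMapConcat {
      case Some((data, _)) => data // the object's bytes
      case None            => Source.failed(new NoSuchElementException("backup not found"))
    }
    // Split the byte stream back into one-document-per-line frames.
    .via(Framing.delimiter(ByteString("\n"), maximumFrameLength = 1 << 20, allowTruncation = true))
    .map(line => Document.parse(line.utf8String)) // Extended JSON -> BSON document
    .grouped(100)                                 // arbitrary batch size
    .mapAsync(1) { batch =>
      Source.fromPublisher(collection.insertMany(batch.asJava)).runWith(Sink.ignore)
    }
    .runWith(Sink.ignore)

  done.onComplete { result =>
    println(result)
    system.terminate()
  }
}
```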
In order to run it:
- Go to application.conf and fill in the missing properties (a sketch of typical settings follows this list).
- Run MongoDB locally and prepare the database and collection with some documents. You can quickly run both MongoDB and the Mongo shell using Docker.

  Run MongoDB:
  ```
  docker run -p 27017:27017 --name backup-mongo -d mongo
  ```
  Run the Mongo shell:
  ```
  docker run -it --net host --rm mongo sh -c 'exec mongo "localhost:27017"'
  ```
  Insert a document:
  ```
  > use CookieDB
  switched to db CookieDB
  > db.cookies.insert({"name" : "cookie1", "delicious" : true})
  WriteResult({ "nInserted" : 1 })
  ```
- Run the application with `sbt run`.
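The exact properties expected in application.conf depend on the project, but with a recent Alpakka release the S3 credentials and region typically live under the alpakka.s3 block; the values below are placeholders:

```
alpakka.s3 {
  aws {
    credentials {
      provider = static
      access-key-id = "AKIA..."     # placeholder
      secret-access-key = "..."     # placeholder
    }
    region {
      provider = static
      default-region = "eu-west-1"  # pick your bucket's region
    }
  }
}
```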
The following snippet presents a basic set of MongoDB shell commands useful when playing with the backup/restore streams.
```
> show dbs
admin  0.000GB
local  0.000GB
> use CookieDB
switched to db CookieDB
> db.cookies.insert({"name" : "cookie1", "delicious" : true})
WriteResult({ "nInserted" : 1 })
> show collections
cookies
> db.cookies.find()
{ "_id" : ObjectId("599b0d9a266a67c9516e0245"), "name" : "cookie1", "delicious" : true }
> db.dropDatabase()
{ "dropped" : "CookieDB", "ok" : 1 }
```