# MongoDB JSON Data

A dedicated repository of collections to practice with and use in MongoDB.
## List of small datasets

| Name | Size | Data type | How to import |
|---|---|---|---|
|  | 610 KB | zip → dump folder | `mongorestore` |
|  | 3.1 MB | JSON | `mongoimport` |
|  | 731 KB | zip → JSON files | `mongoimport` |
|  | 92 KB | JSON | `mongoimport` |
|  | 35 KB | JSON | `mongoimport` |
|  | 454 KB | JSON | `mongoimport` |
|  | 2.8 KB | JSON | `mongoimport` |
|  | 329 KB | JSON | `mongoimport` |
|  | 2.3 MB | JSON | `mongoimport` |
|  | 666 KB | JSON | `mongoimport` |
|  | 470 KB | JSON | `mongoimport` |
|  | 525 KB | JSON | `mongoimport` |
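As a rough guide, the plain JSON files can be loaded with `mongoimport` and the zipped dump folders restored with `mongorestore`. The sketch below uses placeholder file, database, and collection names rather than the actual dataset names; depending on the file, `mongoimport` may also need `--jsonArray`.

```sh
# JSON file: import into a database/collection of your choice
# (file, db, and collection names here are placeholders)
mongoimport --db test --collection books --file books.json

# zip containing a dump folder: unzip, then restore the dump directory
unzip dataset.zip
mongorestore ./dump
```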
## List of bigger datasets

| Name | Size | Data type | How to import |
|---|---|---|---|
|  | 21 MB | zip → gzipped dump | `mongorestore --gzip` |
|  | 24 MB | JSON | `mongoimport` |
|  | 75 MB | JSON | `mongoimport` |
|  | 85 MB | zip → dump folder | `mongorestore` |
|  | 232 MB | JSON | `mongoimport` |
|  | 55 MB | RAR archive (confusingly named .zip) → dump folder | `mongorestore` |
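The gzipped dump and the RAR archive need slightly different steps. Here is a sketch with placeholder archive names (the actual file names are not shown here):

```sh
# zip containing a gzipped dump: unzip, then restore with --gzip
unzip big-dataset.zip
mongorestore --gzip ./dump

# RAR archive shipped with a .zip extension: extract with unrar, then restore
unrar x archive.zip
mongorestore ./dump
```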
## List of other datasets

| Name | Size | Data type |
|---|---|---|
|  | 423 MB | Email server tarball (slow download server) |
## Import in MongoDB

Use the provided `import.sh` script to insert the "small" and the "bigger" datasets. You can see the help and the options with `import.sh --help`.

Current features:

- Docker support: automatically starts a MongoDB instance in Docker for you.
- `--small`: only insert the smallest dataset, for a quick data import (cool for live demos); see the usage sketch below.
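A minimal usage sketch, assuming the script is executable and run from the repository root:

```sh
# Show the help and all available options
./import.sh --help

# Import only the smallest dataset (handy for live demos)
./import.sh --small
```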
Requirements:

- Docker, if you use the Docker option.
- MongoDB (`mongoimport`, `mongorestore`)
- `unzip`
- `unrar` (for the Enron dataset)
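A quick way to check that the required tools are on your `PATH` before running the script (a sketch; installation methods vary by platform):

```sh
# Report any missing prerequisite; install the missing ones with your
# platform's package manager or from the MongoDB downloads page.
for tool in docker mongoimport mongorestore unzip unrar; do
  command -v "$tool" >/dev/null 2>&1 || echo "missing: $tool"
done
```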
## Contributing

Feel free to open a pull request to add your collection files to the list.