A dedicated repository of ready-made collections to practice with and use in MongoDB.
Small datasets:

Name | Size | Data type | How to import
---|---|---|---
 | 610 KB | zip → dump folder | mongorestore
 | 3.1 MB | JSON | mongoimport
 | 731 KB | zip → JSON files | mongoimport
 | 92 KB | JSON | mongoimport
 | 35 KB | JSON | mongoimport
 | 454 KB | JSON | mongoimport
 | 2.8 KB | JSON | mongoimport
 | 329 KB | JSON | mongoimport
 | 2.3 MB | JSON | mongoimport
 | 666 KB | JSON | mongoimport
 | 470 KB | JSON | mongoimport
 | 525 KB | JSON | mongoimport
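For the JSON datasets, a minimal mongoimport sketch; the database, collection, and file names here are hypothetical placeholders:

```bash
# Import a newline-delimited JSON file into test.books;
# --drop replaces any existing collection so the import is repeatable
mongoimport --db test --collection books --drop --file books.json

# If the file holds a single JSON array rather than one document
# per line, add --jsonArray
mongoimport --db test --collection books --drop --file books.json --jsonArray
```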
Bigger datasets:

Name | Size | Data type | How to import
---|---|---|---
 | 21 MB | zip → gzipped dump | mongorestore --gzip
 | 24 MB | JSON | mongoimport
 | 75 MB | JSON | mongoimport
 | 85 MB | zip → dump folder | mongorestore
 | 232 MB | JSON | mongoimport
 | 55 MB | RAR (confusingly named .zip) → dump folder | mongorestore
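For the zip → dump datasets, a mongorestore sketch; the archive and folder names are hypothetical:

```bash
# Plain dump folder produced by mongodump
unzip dataset.zip
mongorestore dump/

# Gzipped dump (the 21 MB dataset above)
mongorestore --gzip dump/

# The 55 MB archive is actually RAR despite the .zip extension,
# so unrar is needed instead of unzip
unrar x enron.zip
mongorestore dump/
```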
Huge datasets:

Name | Size | Data type
---|---|---
 | 423 MB | Email server tarball (slow download server)
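The tarball is downloaded and unpacked by hand; a sketch with a placeholder URL and file name:

```bash
# -C - lets curl resume the transfer if the slow server drops the connection
curl -C - -O http://example.com/emails.tar.gz
tar -xzf emails.tar.gz
```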
Use the provided `import.sh` script to insert the "small" and the "bigger" datasets; a usage sketch follows the feature list below. You can see the help and the options with `import.sh --help`.
- Docker support: automatically starts a MongoDB instance in Docker for you.
- `--small`: only inserts the smallest dataset for a quick data import (cool for live demos).
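A minimal usage sketch, assuming the script runs from the repository root; the exact name of the Docker flag is not shown above, so `--docker` is an assumption:

```bash
# List all options the script accepts
./import.sh --help

# Quick demo: insert only the smallest dataset
./import.sh --small

# Assumed flag name: have the script start MongoDB in Docker for you
./import.sh --docker
```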
Requirements:

- Docker, if you use the Docker option
- MongoDB tools (mongoimport, mongorestore)
- unzip
- unrar (for the Enron dataset)
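A quick sanity check that the prerequisites are installed (a sketch; trim the list to the options you actually use):

```bash
# Report any required tool missing from PATH
for tool in docker mongoimport mongorestore unzip unrar; do
  command -v "$tool" >/dev/null 2>&1 || echo "missing: $tool"
done
```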