Atlas Checks
The Atlas Checks framework and standalone application are tools that enable quality assurance of Atlas data files. For more information on the Atlas mapping file format, please see the Atlas project on GitHub.
Starting with Atlas Checks
Please see the contributing guidelines!
Requirements
To run Atlas Checks you will need Git, a Java Development Kit, and Gradle installed; the commands below use Git to fetch the project and Gradle to build and run it.
Run Atlas Checks
To start working with Checks follow the steps below:
- Clone the Atlas Checks project using the following command:
git clone https://github.com/osmlab/atlas-checks.git
- Switch to the newly created directory:
cd atlas-checks
- Execute:
gradle run
This command builds and runs Atlas Checks with all the default options against the country Anguilla. GeoJSON output will be produced that contains all the results found during the run. For more information on running Atlas Checks as a standalone application, click here.
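Each result in the output is a GeoJSON feature. As a minimal sketch of inspecting a result file from the shell — the file name and the property keys in the sample are illustrative assumptions, not the actual paths or schema produced by a run — you can count flags with standard tools:

```shell
# Write a tiny sample in the shape of a GeoJSON flag file.
# (Illustrative only -- real Atlas Checks output carries more properties.)
cat > sample-flags.geojson <<'EOF'
{"type": "FeatureCollection", "features": [
  {"type": "Feature",
   "properties": {"check": "ExampleCheck", "instructions": "Example flag"},
   "geometry": {"type": "Point", "coordinates": [-63.05, 18.22]}}
]}
EOF

# Count the flagged features (the trailing comma keeps the pattern from
# also matching the "FeatureCollection" line).
grep -c '"type": "Feature",' sample-flags.geojson   # -> 1
```

The same pattern works against whichever GeoJSON file your own run produces.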
Working with Configuration
See configuration docs for more information about the configuration files that can be used to define specific details around the Atlas Checks application.
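As a rough illustration of the idea — the check name and keys below are hypothetical, so consult the configuration docs for the real schema — a configuration file lets you enable and tune individual checks with JSON:

```json
{
  "ExampleCheck": {
    "enabled": true,
    "threshold": 10.0
  }
}
```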
Running Atlas Checks in Spark Cluster
Atlas Checks has been developed to take advantage of distributed computing by running the checks in Spark. For more information on Spark, see spark.apache.org. Running Atlas Checks locally already executes within a local Spark environment on your machine, so running in a cluster is simply a matter of updating the configuration. For more information, see Running Atlas Checks in a Spark Cluster.
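Because the local run already goes through Spark, pointing at a cluster is mostly a configuration change. The properties below are standard Spark settings and the values are examples only — see the Spark cluster docs for the options Atlas Checks actually reads:

```properties
# Standard Spark properties (illustrative values, not project defaults).
spark.master           spark://my-cluster:7077
spark.executor.memory  4g
spark.executor.cores   2
```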
Developing your own Atlas Checks
See the development docs for more information about developing new Atlas Checks and the associated best practices.
Docker Sandbox
We have also built a Docker sandbox that you can use to execute the current checks. For more information, click here.