
dd-trace-py


ddtrace is Datadog's tracing library for Python. It is used to trace requests as they flow across web servers, databases and microservices so that developers have great visibility into bottlenecks and troublesome requests.
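As a quick illustration, here is a minimal sketch of manual instrumentation with the tracer; the service, resource and span names below are placeholders chosen for the example:

from ddtrace import tracer

# Create a span that measures this block of work. The names here
# are placeholders, not real service identifiers.
with tracer.trace("web.request", service="example-app", resource="GET /home"):
    pass  # your application code: query the database, call services, ...

Most applications do not need manual spans at all: running under ddtrace-run instruments supported libraries automatically.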

Getting Started

For a basic product overview, installation and quick start, check out our setup documentation.

For more advanced usage and configuration, check out our API documentation.

For descriptions of terminology used in APM, take a look at the official documentation.

Development

Testing

Environment

The test suite requires many backing services such as PostgreSQL, MySQL, Redis and more. We use docker and docker-compose to run these services in our CI and for local development. To run the test matrix, install docker and docker-compose using the instructions provided for your platform, then launch the services with:

$ docker-compose up -d
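You can verify that the services started correctly with the standard docker-compose status command:

$ docker-compose ps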

Running Tests in Docker

Once your docker-compose environment is running, you can run the test runner image:

$ docker-compose run --rm testrunner

This drops you into a bash shell, where you can run tests just as you would in your local environment:

$ tox -e '{py35,py36}-redis{210}'
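The braces are tox's generative environment syntax: the command above expands to the redis 2.10 suite under both Python 3.5 and 3.6. To list every environment defined in tox.ini, use tox's standard listing flag:

$ tox -l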

We also provide a shell script that executes commands inside the test container.

For example, to run the tests for redis-py 2.10 on Python 3.5 and 3.6:

$ ./scripts/ddtest tox -e '{py35,py36}-redis{210}'

If you want to run a list of tox environments (as CircleCI does) based on a pattern, use the following command:

$ scripts/ddtest scripts/run-tox-scenario '^futures_contrib-'

Continuous Integration

We use CircleCI 2.0 for our continuous integration.

Configuration

The CI tests are configured through config.yml.

Running Locally

The CI tests can be run locally using the circleci CLI. More information about the CLI can be found at https://circleci.com/docs/2.0/local-cli/.

After installing the circleci CLI, you can run jobs by name. For example:

$ circleci build --job django

Benchmarking

When two or more approaches must be compared, please write a benchmark in the benchmark.py module so that we can measure the efficiency of each algorithm. To run your benchmarks:

$ python -m tests.benchmark
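As an illustration only, a comparison benchmark has roughly the following shape; the two functions below are hypothetical and are not part of dd-trace-py, they simply show the kind of head-to-head measurement benchmark.py is meant to hold:

import timeit

# Hypothetical example: compare two ways of building a tag string.
# Neither function exists in dd-trace-py; they only illustrate the
# kind of comparison a benchmark should make.

def concat_with_plus(parts):
    out = ""
    for p in parts:
        out = out + "," + p
    return out

def concat_with_join(parts):
    return ",".join(parts)

PARTS = ["service:web", "env:prod", "version:1.0"] * 100

if __name__ == "__main__":
    for fn in (concat_with_plus, concat_with_join):
        elapsed = timeit.timeit(lambda: fn(PARTS), number=10000)
        print("%s: %.3fs" % (fn.__name__, elapsed))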