This repository contains Python samples that show how to integrate with Azure DevOps and Team Foundation Server (TFS) using the Azure DevOps Python API.
As of January 2021, we're no longer actively maintaining this repo. Feel free to continue using it for inspiration or examples. We won't be updating or adding any samples, though.
Samples are organized by "area" (service) and "resource" within the `samples` package.
Each sample module shows various ways to interact with Azure DevOps and TFS.
Resources may have multiple samples, since there are often multiple ways to query for a given resource.
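As a rough sketch of what each client sample does under the hood (not a verbatim excerpt from this repo), the `azure-devops` package builds typed clients from a `Connection` authenticated with a PAT; the URL and token below are placeholders:

```python
from azure.devops.connection import Connection
from msrest.authentication import BasicAuthentication

# Placeholder values; substitute your own organization URL and personal access token.
organization_url = "https://dev.azure.com/fabrikam"
personal_access_token = "ABC123"

# Authenticate with the PAT and create a connection to the organization.
credentials = BasicAuthentication("", personal_access_token)
connection = Connection(base_url=organization_url, creds=credentials)

# Get an area-specific client (here: core) and list the first page of projects.
core_client = connection.clients.get_core_client()
for project in core_client.get_projects().value:
    print(project.name)
```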
- Clone this repository and `cd` into it
- Create a virtual environment: `python3 -m venv env && . env/bin/activate && pip install -r requirements.txt`

Now you can run `runner.py` with no arguments to see available options.
VERY IMPORTANT: some samples are destructive! It is recommended that you run these samples against a test organization.
- Get a personal access token.
- Store the PAT and the organization URL you'll be running samples against (note: some samples are destructive, so use a test organization):

  ```
  python runner.py config url --set-to https://dev.azure.com/fabrikam
  python runner.py config pat --set-to ABC123
  ```

  - If you don't want your PAT persisted to a file, you can put it in an environment variable called `AZURE_DEVOPS_PAT` instead (a short sketch of reading it appears after the run examples below).
- Run `python runner.py run {area} {resource}` with the two required arguments:
  - `{area}`: API area (currently `core`, `git`, and `work_item_tracking`) to run the client samples for. Use `all` to include all areas.
  - `{resource}`: API resource to run the client samples for. Use `all` to include all resources.
  - You can optionally pass `--url {url}` to override your configured URL.
Examples:

```
python runner.py run all all
python runner.py run work_item_tracking all
python runner.py run git pullrequests
```

Run all Git samples against a different URL than the one configured (in this case, a TFS on-premises collection):

```
python runner.py run git all --url https://mytfs:8080/tfs/testcollection
```
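If you use the `AZURE_DEVOPS_PAT` environment variable instead of the config file, a minimal sketch of reading the token from Python is shown below (how `runner.py` itself resolves the token is an assumption here, not documented behavior):

```python
import os

# Assumes the token was exported beforehand, e.g. `export AZURE_DEVOPS_PAT=ABC123`.
pat = os.environ.get("AZURE_DEVOPS_PAT")
if not pat:
    raise SystemExit("Set AZURE_DEVOPS_PAT or persist a PAT via `python runner.py config pat`.")
```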
To persist the HTTP request/response as JSON for each client sample method that is run, set the `--output-path {value}` argument. For example:

```
python runner.py run all all --output-path ~/temp/http-output
```
This creates a folder for each area, a folder for each resource under the area folder, and a file for each client sample method that was run. The name of the JSON file is determined by the name of the client sample method. For example:
```
|-- temp
    |-- http-output
        |-- git
            |-- refs
            |   |-- get_refs.json
            |   |-- ...
            |-- repositories
                |-- get_repositories.json
                |-- ...
```
Note: certain HTTP headers, like `Authorization`, are removed for security/privacy purposes.
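If you want to inspect one of the captured calls afterwards, it is just a JSON file on disk. The path below simply mirrors the example layout above (adjust it to your own `--output-path`), and no particular field names are assumed:

```python
import json
from pathlib import Path

# Mirrors the example output layout above; adjust to your own --output-path.
capture_file = Path.home() / "temp" / "http-output" / "git" / "refs" / "get_refs.json"

with capture_file.open() as f:
    capture = json.load(f)

# Show the top-level structure without assuming specific field names.
print(list(capture.keys()))
```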
You can run `runner.py list` to see what sample areas and resources are available.
We also provide a Jupyter notebook for running the samples. It opens in your web browser, where you can enter a URL and authentication token and choose which samples you wish to run.
- Clone this repository and `cd` into it
- Create a virtual environment: `python3 -m venv env && . env/bin/activate && pip install -r requirements.jupyter.txt`
- Get a personal access token.
- Run `jupyter notebook`. In the resulting web browser, click API Samples.ipynb.
- Click Run in the top cell. Scroll down and you'll see a form where you can enter your organization or TFS collection URL and PAT, and choose which samples to run.
IMPORTANT: some samples are destructive. It is recommended that you first run the samples against a test account.
This repo is no longer maintained and is therefore not accepting new contributions.
This project welcomes contributions and suggestions.
Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution.
For details, visit https://cla.microsoft.com.
When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment).
Simply follow the instructions provided by the bot.
You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
See detailed instructions on how to contribute a sample.