Dependabot Information Scraper for GitHub
These two scripts scrape and parse, respectively, Dependabot alert information for GitHub repositories belonging to an organization.
The primary data points parsed are the counts of open, fixed, and dismissed vulnerabilities, and the ecosystem (programming-language package system, e.g. pip or npm) of each vulnerability.
Requirements:
- Bash or Zsh shell
- GitHub CLI (gh)
- To read all repos, a GitHub token with the security_events scope is required for private repositories.
- jq
- Python 3 - developed and tested with Python 3.10; likely to work with Python 3.6 and above (f-strings are used in print statements).
Log in to GitHub via the gh CLI:
```
gh auth login
```
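If the token gh is using lacks the security_events scope, it can be added with `gh auth refresh -s security_events`.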
Run the scraper against an organization:
```
./get_all_dependabot.sh <name of organization>
```
E.g.: `./get_all_dependabot.sh procurify`
Then run the parser. Use dependa2.py rather than dependa.py; it is a better implementation with less use of loops:
```
python3 dependa2.py
```
- Output (CSV) files are written to the current folder.
- JSON files for each repo are saved to the ./output folder, in the event manual review is needed (see the sketch after these notes). This data can also be viewed via GitHub, assuming appropriate permissions are granted.
- jq is unnecessary for either the bash or the Python script; it is used only to provide convenient, human-readable review of the JSON files if needed, e.g. `jq . output/<repo>.json`. (Otherwise each JSON response is written as a single line.)
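For programmatic review of the saved files, tallying the key data points looks roughly like the following. This is a minimal sketch, not the dependa2.py implementation; it assumes each file in ./output holds the JSON array of Dependabot alerts the GitHub REST API returns for one repo.

```python
import json
from collections import Counter
from pathlib import Path

state_counts = Counter()
ecosystem_counts = Counter()

for path in Path("output").glob("*.json"):
    # Assumption: each file holds one repo's list of alert objects.
    alerts = json.loads(path.read_text())
    for alert in alerts:
        # Per the REST schema, "state" is open, fixed, dismissed, or auto_dismissed.
        state_counts[alert["state"]] += 1
        # The ecosystem (pip, npm, maven, ...) sits under dependency.package.
        ecosystem_counts[alert["dependency"]["package"]["ecosystem"]] += 1

print(f"States: {dict(state_counts)}")
print(f"Ecosystems: {dict(ecosystem_counts)}")
```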
Optimization considerations:
- Remove the dependency on the gh CLI and amalgamate both scripts into a single Python script (potentially run as an AWS Lambda, executed via a scheduled EventBridge event, with results forwarded to a platform such as Slack); see the sketch after this list.
- Provide a method to name input/output files and folders via command-line parameters.
- Optimize the code (reduce repetitive code).
- Generate graphics with Plotly or an alternative Python graphing module.
- Add docstrings and type hints to the Repo class, its methods, and the functions.
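A minimal sketch of that single-script direction, assuming the requests library and a GITHUB_TOKEN environment variable carrying the security_events scope (the endpoint paths are the documented GitHub REST API; pagination of the alerts call is omitted for brevity, and the example organization is taken from the usage above):

```python
import os

import requests

API = "https://api.github.com"
HEADERS = {
    "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
    "Accept": "application/vnd.github+json",
}

def list_org_repos(org: str) -> list[str]:
    """Return the names of all repos in the organization, page by page."""
    repos: list[str] = []
    page = 1
    while True:
        resp = requests.get(
            f"{API}/orgs/{org}/repos",
            headers=HEADERS,
            params={"per_page": 100, "page": page},
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            return repos
        repos.extend(r["name"] for r in batch)
        page += 1

def repo_alerts(org: str, repo: str) -> list[dict]:
    """Return (the first page of) Dependabot alerts for one repo."""
    resp = requests.get(
        f"{API}/repos/{org}/{repo}/dependabot/alerts",
        headers=HEADERS,
        params={"per_page": 100},
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    org = "procurify"
    for name in list_org_repos(org):
        print(name, len(repo_alerts(org, name)))
```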
References:
- GitHub CLI login
- List organization repos
- List Dependabot alerts
- Working with Dependabot
- GitHub Dependabot Blog
Released under the GPLv3
For concerns or questions, open an issue. For improvements, please submit a pull request.