How the .gov domain space is doing at best practices and federal requirements.
Other forks of the project in use include:
Pulse is a Flask app written in Python 3. We recommend pyenv for easy Python version management.
- Install dependencies:

  ```bash
  pip install -r requirements.txt
  gem install sass bourbon neat bitters
  ```
- If editing styles during development, keep the Sass auto-compiling with:

  ```bash
  make watch
  ```
- And to run the app in development, use:

  ```bash
  make debug
  ```

  This will run the app with `DEBUG` mode on, showing full error messages in-browser when they occur.
To initialize the dataset with the latest production scan data and database, there's a convenience target:

```bash
make data_init
```

This will download (using `curl`) the current live production database and scan data to the local `data/` directory.
The site can be deployed (by someone with credentials to the right server) through Fabric, which requires Python 2.

The Fabric script expects an SSH configuration entry named `pulse`, which you should already have defined with the right hostname and key.
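A minimal sketch of such an SSH config entry (the hostname, user, and key path below are placeholders; substitute your actual server details):

```
# ~/.ssh/config (values below are placeholders)
Host pulse
  HostName your-server.example.gov
  User deploy
  IdentityFile ~/.ssh/pulse_deploy_key
```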
To deploy to staging, switch to a Python 2 virtualenv with `fabric` installed, and run:

```bash
make staging
```

This will `cd` into `deploy/` and run `fab deploy`.
To deploy to production, activate a Python 2 environment with `fabric` installed and run:

```bash
make production
```

This will run the Fabric command to deploy to production.
The command to update the data in Pulse and publish it to production is simple:

```bash
python -m data.update
```
But you will need to do some setup first.
Download and set up `domain-scan` from GitHub. `domain-scan` in turn requires `pshtt` and `ssllabs-scan`; these currently both need to be cloned from GitHub and set up individually.
Pulse requires you to set one environment variable:

- `DOMAIN_SCAN_PATH`: A path to `domain-scan`'s `scan` binary.
However, `domain-scan` may need you to set a couple of others if the binaries it uses aren't on your PATH:

- `PSHTT_PATH`: Path to the `pshtt_cli` binary.
- `SSLLABS_PATH`: Path to the `ssllabs-scan` binary.
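Putting it together, a typical shell setup might look like the following. All paths here are examples, not documented locations; point each variable at wherever you actually cloned the tools:

```shell
# Example only: these paths assume the tools were cloned under ~/code.
export DOMAIN_SCAN_PATH="$HOME/code/domain-scan/scan"

# Only needed if these binaries aren't already on your PATH:
export PSHTT_PATH="$HOME/code/pshtt/pshtt_cli"
export SSLLABS_PATH="$HOME/code/ssllabs-scan/ssllabs-scan"
```

Adding these lines to your shell profile saves re-exporting them before each scan.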
To publish the resulting data to the production S3 bucket, install the official AWS CLI:

```bash
pip install awscli
```

And link it to AWS credentials that allow authorized write access to the `pulse.cio.gov` S3 bucket.
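One common way to supply those credentials is a `~/.aws/credentials` file; the values below are placeholders, not real keys:

```
# ~/.aws/credentials (placeholder values)
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```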
From the Pulse root directory, run:

```bash
python -m data.update
```
This will kick off the `domain-scan` scanning process for HTTP/HTTPS and DAP participation, using the `.gov` domain list specified in `meta.yml` as the base set of domains to scan.
Then it will run the scan data through post-processing to produce some JSON and CSV files the Pulse front-end uses to render data.
Finally, this data will be uploaded to the production S3 bucket.
This project is an initial pass - there is much more information that can be represented in dashboards to great effect. Below are some further ideas for future work on this project. Feel free to add your ideas here, too.
- For the DAP Dashboard:
  - Number of pages from a domain reporting into DAP
  - Number or list of subdomains from a domain reporting into DAP
  - Test the deeper config options that the DAP snippet should be employing, such as IP anonymization, event tracking, demographics turned off, and ?????. (Possibly using a headless browser.)
- Does the site require “www”? Does it require not using “www”?
- Load time (server-side)
- More of the scans in observatory.mozilla.org
- Scan for SPF records
- Mobile friendliness (poss. using Google's Mobile Friendly Test)
- Mixed content detection (linking to insecure resources)
- Use of third party services
- STARTTLS email server encryption
- 508 compliance (poss. with http://pa11y.org/)
- Any other items listed in the OMB letter to OGP passing along .gov domain issuance
- Lighter or fun things - like how many domains start with each letter of the alphabet, what the last 10 that came out were, etc.
- 2FA or Connect.gov? - Not sure how it would work, but note Section 3's requirement in this EO
- Anything from/with itdashboard.gov
- Site hosting details
- open source
- Look at what Ben tracked
- IPv6
- DNSSEC
- https://monitor.dnsops.gov/
- What else can we get from Verisign?
This project is in the worldwide public domain. As stated in CONTRIBUTING:

> This project is in the public domain within the United States, and copyright and related rights in the work worldwide are waived through the CC0 1.0 Universal public domain dedication.
>
> All contributions to this project will be released under the CC0 dedication. By submitting a pull request, you are agreeing to comply with this waiver of copyright interest.