Scrapes the UNC Charlotte Parking Availability page using Beautiful Soup. The Python script runs every 30 minutes via Windows Task Scheduler: it scrapes the page for a specific HTML tag, then parses the data and strips unnecessary components. Using MySQL Connector/Python, a self-contained Python driver for communicating with MySQL servers, it sends the current unavailable capacity to a MySQL database for long-term storage and for analyzing historical trends.
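The scrape-and-store flow above can be sketched roughly as below. The HTML snippet, tag names, and table/column names are hypothetical stand-ins (the real page's markup will differ), and the storage step takes an already-open database cursor rather than opening a connection itself:

```python
from bs4 import BeautifulSoup

# Hypothetical markup mimicking the availability page; the real
# page's tags and class names are almost certainly different.
sample_html = """
<div class="decks">
  <span class="deck-name">Union Deck</span><span class="deck-avail">62%</span>
  <span class="deck-name">West Deck</span><span class="deck-avail">15%</span>
</div>
"""

def parse_availability(html):
    """Return a list of (deck name, percent available) tuples."""
    soup = BeautifulSoup(html, "html.parser")
    names = [t.get_text() for t in soup.find_all("span", class_="deck-name")]
    # Strip the trailing "%" so the value can be stored as an integer
    pcts = [int(t.get_text().rstrip("%"))
            for t in soup.find_all("span", class_="deck-avail")]
    return list(zip(names, pcts))

def store_availability(cursor, records):
    """Insert the parsed rows via MySQL Connector/Python's cursor API.

    The table and column names here are assumptions for illustration.
    """
    cursor.executemany(
        "INSERT INTO parking_availability (deck, available_pct) VALUES (%s, %s)",
        records,
    )

print(parse_availability(sample_html))
```

Passing the cursor into `store_availability` keeps the parsing logic testable without a live MySQL server.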
The next step is to add code that queries the stored data for a github.io page, where historical trends can be plotted.
I wrote this when I was first learning Python, had never used a database before, and had never used Beautiful Soup. The code could be refactored to remove many redundant lines, but it works well enough that I have not needed to.