Mini-Projects
This repository contains all the snippets of code I create while learning different Python packages.
1. Interactive Dictionary:
An interactive dictionary that takes any word from a user, extracts data from a JSON dataset, and returns the definition of the word. It also accounts for misspelled words.
What I learned:
- Working with JSON
- difflib package
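The close-match lookup can be sketched as follows. The inline `data` dict stands in for the project's JSON dataset, and the 0.6 similarity cutoff (difflib's default) is an illustrative choice:

```python
import difflib

# Tiny in-memory stand-in for the JSON dataset; the real project loads
# a data.json file with json.load(). Entries here are illustrative.
data = {
    "rain": ["Precipitation in the form of liquid water drops."],
    "rainbow": ["An arch of colours visible in the sky."],
}

def define(word, dataset):
    """Return definitions for a word, falling back to close matches."""
    word = word.lower()
    if word in dataset:
        return dataset[word]
    # Suggest the closest spelling when the exact word is missing.
    matches = difflib.get_close_matches(word, dataset.keys(), n=1, cutoff=0.6)
    if matches:
        return dataset[matches[0]]
    return ["Word not found."]
```

The interactive version wraps this in an `input()` loop and asks the user to confirm the suggested spelling before showing its definition.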
2. Website Blocker:
A Python script that runs in the background to block access to certain websites during set intervals of time, to boost productivity and avoid distraction. It can be scheduled to auto-start at boot with Windows Task Scheduler, and the script can be edited to work with different websites and different intervals. Note: it needs to be run with elevated privileges.
What I learned:
- System file manipulation
- datetime package
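The core idea can be sketched as a pure function over the hosts-file text. The site list, working hours, and function name here are illustrative; the real script reads and rewrites the system hosts file in place (which is why it needs elevated privileges):

```python
from datetime import datetime

REDIRECT = "127.0.0.1"
# Hypothetical site list and working hours; edit to suit.
SITES = ["www.facebook.com", "facebook.com"]
WORK_START, WORK_END = 9, 17

def updated_hosts(content, sites, now):
    """Return new hosts-file text for the given time.

    During working hours each site gets a 127.0.0.1 entry so the browser
    cannot reach it; outside working hours the entries are stripped out.
    The caller writes the result back to the system hosts file.
    """
    lines = content.splitlines(keepends=True)
    if WORK_START <= now.hour < WORK_END:
        for site in sites:
            if not any(site in line for line in lines):
                lines.append(f"{REDIRECT} {site}\n")
    else:
        lines = [l for l in lines if not any(s in l for s in sites)]
    return "".join(lines)
```

The background loop simply calls this every few seconds with `datetime.now()` and the current hosts-file contents.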
3. Webmap Population:
A Python-based webmap that uses Folium to implement the webmap functionality, with various Folium features adding polygonal layers based on world population data. It is a layered map and uses OpenStreetMap tiles and GeoJSON for presenting the datasets.
What I learned:
- folium library
- layer control
- GeoJson
4. Live Website:
A website deployed on Heroku with the help of Python and Flask. The webpage itself is just a shell; the main emphasis of this project was on Flask and deployment. You can find the website at https://devan.herokuapp.com/
What I learned:
- Flask
- Virtual Environment for Python
- Deploying a website on Heroku
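The Flask side reduces to a small app along these lines; the route and return string are placeholders, since the deployed site serves its own page:

```python
from flask import Flask

app = Flask(__name__)

@app.route("/")
def home():
    # The deployed site renders an HTML template; a plain string
    # keeps this sketch self-contained.
    return "Hello from Flask!"

if __name__ == "__main__":
    # Local development server; on Heroku the app is launched by a
    # WSGI server declared in the Procfile instead.
    app.run()
```

Deployment then comes down to a virtual environment with pinned dependencies (`requirements.txt`) plus a Procfile telling Heroku how to start the app.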
5. Database Application:
A desktop database application that uses Python and Tkinter to wrap the backend code in a GUI, bundled together into a single executable file. It uses SQLite3, so the database works on any machine. The application is designed to gather book details and modify them.
What I learned:
- Tkinter
- sqlite3
- PostgreSQL
- pyinstaller: packaging Python files into a single executable
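The SQLite backend that the Tkinter front end would call can be sketched like this; the `book` table columns are an assumed layout for book details:

```python
import sqlite3

class BookDatabase:
    """Backend layer for the book-details GUI (illustrative schema)."""

    def __init__(self, path="books.db"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS book "
            "(id INTEGER PRIMARY KEY, title TEXT, author TEXT, "
            "year INTEGER, isbn INTEGER)"
        )
        self.conn.commit()

    def insert(self, title, author, year, isbn):
        self.conn.execute(
            "INSERT INTO book (title, author, year, isbn) VALUES (?, ?, ?, ?)",
            (title, author, year, isbn),
        )
        self.conn.commit()

    def view(self):
        return self.conn.execute("SELECT * FROM book").fetchall()

    def delete(self, book_id):
        self.conn.execute("DELETE FROM book WHERE id = ?", (book_id,))
        self.conn.commit()
```

The Tkinter window binds buttons like "Add entry", "View all", and "Delete" to these methods, and PyInstaller then bundles the whole thing into one executable.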
6. Web Scraper:
A simple web-scraping application that scrapes property listings from Magicbricks.com and converts them into a dataset with the help of pandas. It is missing crawlers; I will turn this into a standalone application, which you can find on my profile. Feel free to edit the Magicbricks URL in the requests section to generate a CSV file of listings for any area on Magicbricks.
What I learned:
- Scraping source code for data
- beautifulsoup4
- requests
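The scrape-and-tabulate step can be sketched as below. The CSS class names and URL here are hypothetical and must be adapted to the live Magicbricks markup:

```python
import pandas as pd
import requests
from bs4 import BeautifulSoup

# Hypothetical URL; swap in any Magicbricks locality search page.
URL = "https://www.magicbricks.com/property-for-sale/"
HEADERS = {"User-Agent": "Mozilla/5.0"}  # many sites reject bare clients

def parse_listings(html):
    """Extract (title, price) rows from a listings page into a DataFrame."""
    soup = BeautifulSoup(html, "html.parser")
    rows = []
    # "listing" and "price" are placeholder class names; inspect the
    # real page source to find the actual selectors.
    for card in soup.find_all("div", class_="listing"):
        rows.append({
            "title": card.find("h2").get_text(strip=True),
            "price": card.find("span", class_="price").get_text(strip=True),
        })
    return pd.DataFrame(rows)

def scrape(url=URL):
    """Fetch the page, parse it, and dump the dataset to CSV."""
    response = requests.get(url, headers=HEADERS)
    df = parse_listings(response.text)
    df.to_csv("listings.csv", index=False)
    return df
```

Separating `parse_listings` from the network call keeps the parsing logic testable against saved HTML, which also helps when the site's markup changes.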