============
Yes, yes. Yet another New Year's resolution, if you want to call it that. It's about time for me to get acclimated to Python. I find Python to be quite a nice language, but I have never written any real application with it, so these are my notes as I go through some studies and real things I'm building with Python.
The list of days below logs the days I'm coding in Python, mainly outside this repository. The days not logged here are in the commits.
Started a simple Python app hosted on Heroku to sell my motorcycle. commit
Started using Flask/Jinja2 for cache busting, and set up requirements. commit commit
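A common way to do cache busting with Flask/Jinja2 is to append a short content hash to static asset URLs so browsers re-fetch a file only when it actually changes. Here's a minimal, framework-agnostic sketch of that idea; in a real Flask app this would typically be wired into `url_for` via a context processor, and the function name here is hypothetical:

```python
import hashlib


def cache_bust(url: str, content: bytes) -> str:
    """Append ?v=<first 8 hex chars of the content's MD5> to a static URL.

    When the file's bytes change, the query string changes, so cached
    copies of the old version are bypassed.
    """
    digest = hashlib.md5(content).hexdigest()[:8]
    return f"{url}?v={digest}"
```

In a template this would render something like `/static/app.css?v=3f2a9c01`, and the hash stays stable until the file is edited.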
Migrating the solshal.com scraping stack from JS (Scraperjs) to a service written in Python (Scrapy).
After some experimentation and research I decided that Scrapy is much more robust than what I need, so now I'm experimenting with BeautifulSoup4, and it seems like that's what I'm going with for now.
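For reference, here is a minimal BeautifulSoup4 sketch of the kind of extraction a scraper like this does: pulling the title and meta description out of an HTML page. The function name and return shape are my own assumptions, not the actual solshal-scraper code:

```python
from bs4 import BeautifulSoup


def extract_metadata(html: str) -> dict:
    """Parse an HTML page and return its title and meta description."""
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.string if soup.title else None
    desc_tag = soup.find("meta", attrs={"name": "description"})
    description = desc_tag["content"] if desc_tag else None
    return {"title": title, "description": description}
```

Unlike Scrapy, there's no crawling framework here, just parsing, which is why BeautifulSoup4 fits a small single-page scraper better.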
Continued using BeautifulSoup4 and did some more testing to make sure it will perform as well as or better than today's Node.js scraper.
Finished writing the solshal-scraper service in Python using BeautifulSoup4; next I will move it to Docker.
I haven't been feeling well since last night; a painful headache kept annoying me all day. For that reason I'm not spending too much time on the computer tonight and decided to only set up the virtualenv for the solshal-scraper service, which was something I hadn't done yet.
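The virtualenv setup for a service like this is usually just a few commands. This is a hypothetical sketch, assuming a `requirements.txt` in the project root:

```shell
python3 -m venv venv             # create an isolated environment in ./venv
. venv/bin/activate              # activate it in this shell
pip install -r requirements.txt  # install the service's pinned dependencies
pip freeze                       # verify what ended up installed
```

The `venv` directory normally goes into `.gitignore` since it can be recreated from `requirements.txt` on any machine.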
Today I dockerized the solshal-scraper written in Python and started the integration between solshal-app and solshal-scraper.
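Dockerizing a small Python service like this usually comes down to a short Dockerfile. This is a plausible sketch, not the actual one; the file names, port, and entry point are all assumptions:

```dockerfile
FROM python:3-slim

WORKDIR /app

# Install dependencies first so this layer is cached between code changes
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the service's code
COPY . .

EXPOSE 5000
CMD ["python", "app.py"]
```

Copying `requirements.txt` before the source code means rebuilding after a code-only change reuses the cached dependency layer.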
Kept working on the integration of the Solshal and solshal-scraper services, trying some security options and going through some cases where the scraper can fail and how the main service will handle those failures.
Finished testing solshal and solshal-scraper, added solshal-scraper to Docker, and released the image on Docker Hub. Started working on the main solshal docker-compose file to accommodate the solshal-scraper service.
Updated the docker-compose file to accommodate the changes and decided to improve and update the entire Docker infrastructure and image dependencies. Rewriting the scraper in Python made the Docker infrastructure much simpler because the Node scraper had dependencies on native modules.
Ran into some problems updating to the latest MongoDB and connecting/linking the solshal service to the solshal-scraper service. Pushed a minor change to the scraper service: weblancaster/solshal-scraper#1.
Went back to try to fix the linking between solshal and solshal-scraper when running with docker-compose; still haven't figured it out.
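For this kind of linking problem, the usual fix is that Compose puts all services on a shared network where each service's name resolves as a hostname, so one service reaches another by service name rather than by localhost or a hard-coded IP. A sketch of what that might look like here; the service names, ports, and environment variable are assumptions based on this log, not the real compose file:

```yaml
version: "2"
services:
  solshal-web:
    build: .
    ports:
      - "3000:3000"
    depends_on:
      - solshal-scraper
    environment:
      # On the compose network, the scraper is reachable by its
      # service name (hostname and port here are assumptions)
      SCRAPER_URL: http://solshal-scraper:5000

  solshal-scraper:
    image: weblancaster/solshal-scraper
```

A common gotcha is the app connecting to `localhost` inside its own container instead of the other service's name, which fails even when both containers are healthy.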
Started looking at writing the solshal-digest service in Python, more specifically cron jobs in Python.
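One way to run recurring jobs in pure Python is the stdlib `sched` module; real deployments often use system cron or a scheduling library instead. A small sketch, where the job body is just a stand-in:

```python
import sched
import time


def run_periodically(job, interval: float, iterations: int) -> None:
    """Run `job` every `interval` seconds, `iterations` times, then return."""
    scheduler = sched.scheduler(time.time, time.sleep)
    for i in range(iterations):
        # schedule each run relative to now; run() blocks until all fire
        scheduler.enter(interval * (i + 1), 1, job)
    scheduler.run()


results = []
run_periodically(lambda: results.append("digest sent"), 0.01, 3)
```

This blocks the process while it runs, which is fine for a dedicated digest worker; a long-lived web service would want a separate process or thread for it.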
Unit tests in Python using pytest; at first I just want to run a simple test.
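A first pytest test can be as small as a plain function whose name starts with `test_`; pytest discovers and runs it with no boilerplate. The function under test here is a hypothetical stand-in, not actual solshal-scraper code:

```python
# test_scraper.py — run with: pytest test_scraper.py


def normalize_url(url: str) -> str:
    """Strip surrounding whitespace and trailing slashes from a URL."""
    return url.strip().rstrip("/")


def test_normalize_url():
    assert normalize_url(" https://solshal.com/ ") == "https://solshal.com"
```

Plain `assert` statements are all pytest needs; it rewrites them to show both sides of a failed comparison.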
Started adding unit tests to solshal-scraper using pytest, and I will probably need something to mock methods, the requests library, and Flask.
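For the mocking part, the stdlib `unittest.mock` works fine alongside pytest (pytest's `monkeypatch` fixture is an alternative). A sketch of patching out the HTTP call so a scraper function can be tested without the network; `http_get` and `fetch_title` are hypothetical stand-ins, and `http_get` is deliberately left unimplemented so this sketch has no third-party dependency:

```python
import re
from unittest import mock


def http_get(url: str) -> str:
    """Stand-in for requests.get(url).text; the real service would
    call the requests library here."""
    raise NotImplementedError("no network access in this sketch")


def fetch_title(url: str):
    """Fetch a page and pull out its <title> (hypothetical scraper logic)."""
    html = http_get(url)
    match = re.search(r"<title>(.*?)</title>", html)
    return match.group(1) if match else None


def test_fetch_title():
    fake_html = "<html><head><title>Solshal</title></head></html>"
    # Patch http_get in this module so no real request is made
    with mock.patch(f"{__name__}.http_get", return_value=fake_html):
        assert fetch_title("https://example.com") == "Solshal"
```

The key detail is patching the name where it is *looked up* (this module), not where the underlying library is defined.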
Stayed up late (started the day/night) fixing/improving/updating the docker-compose file so the solshal-web and solshal-scraper services can communicate with each other. I also pushed a good amount of improvements/fixes to solshal-scraper: weblancaster/solshal-scraper#2.
Started building another service in Python for solshal.com, this time a bookmark importer service where users will be able to import bookmarks from browsers (Chrome) to Solshal.com. This is the setup and start: https://github.com/weblancaster/solshal-bookmark-importer.
Started writing the code to get the data from Chrome and format it into the Solshal collection format.
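Chrome keeps its bookmarks in a JSON file (named `Bookmarks`, inside the browser profile directory) with nested folder/url nodes under `"roots"`. A sketch of flattening that tree; the "Solshal collection format" shown here (a list of `{"title", "url"}` dicts) is my assumption, not the importer's real output shape:

```python
import json


def walk(node):
    """Yield (name, url) for every bookmark under `node`, recursively."""
    if node.get("type") == "url":
        yield node.get("name"), node.get("url")
    for child in node.get("children", []):
        yield from walk(child)


def to_solshal(bookmarks_json: str) -> list:
    """Convert Chrome's Bookmarks JSON into a flat list of items."""
    data = json.loads(bookmarks_json)
    items = []
    # Chrome groups bookmarks under roots like "bookmark_bar" and "other"
    for root in data.get("roots", {}).values():
        for name, url in walk(root):
            items.append({"title": name, "url": url})
    return items
```

Recursing over `children` handles arbitrarily nested bookmark folders without any special casing.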
Implemented the initial algorithm for solshal-bookmark-importer (weblancaster/solshal-bookmark-importer#1) and made some accommodations/changes in the main solshal service to utilize solshal-bookmark-importer.
Minor improvements to the solshal-bookmark-importer service to accommodate the implementation in the main Solshal service.