Resources for Joey Hejna's ML Workshop.
Hack:Now Version: https://docs.google.com/presentation/d/196mQECjAryaqI1vNII3c4LrNM1H_Nl2sEt5w7Mhzp0s/edit?usp=sharing
Slightly Reduced: https://docs.google.com/presentation/d/1aDlGjjBSpZU91YW6_it-tSgQYDBZbD_R8_XGWcmSuoE/edit?usp=sharing
Original Full: https://docs.google.com/presentation/d/1cpxkLCDK6ikYC1ja6Txg4YWEo-P6sS7fNUQT6mA6y44/edit?usp=sharing
I have made Colab versions of the original workshop! Now you don't need to install anything locally; just use the "open in colab" links at the top of the workshop files marked "colab". Afterwards, you can make a copy to your Drive.
To run the workshop notebook, click on `workshop.ipynb`, then open it in Colab, then make a copy.
If you want to run everything locally, use these directions. To start, clone this repository. We'll be working out of it for the workshop.
To follow good practice, we'll be working out of a virtual environment, for the following reasons:
- It leaves your global Python installation untouched.
- It ensures that we are all working with exactly the same packages.
- It prevents versioning conflicts. Specifically, we're using TensorFlow 2.
- Make sure you have Python 3.6 or Python 3.7 on your computer.
- Make sure you have `pip` for `python3` installed on your computer. As I have both `python` and `python3` on my computer, for me this is `pip3`.
- Then, install the virtualenv package with `pip3 install virtualenv`. Ideally, this is the only package you have in your system-level Python installation. You can see which pip packages you currently have installed by running `pip freeze`. Ideally, the output should only contain `virtualenv`, though that's probably not the case.
- Create a virtual environment called `venv` in the GitHub repository by running `python3 -m virtualenv venv`.
- Activate the virtual environment by running `source venv/bin/activate`. You can always deactivate it with `deactivate`.
- Install all the packages needed for the workshop by running `pip install -r requirements.txt`.
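The steps above can be recapped in a few commands. This is a sketch, not the workshop's official script, and it uses Python's stdlib `venv` module as an alternative if you'd rather not install the virtualenv package globally:

```shell
# Create the virtual environment (stdlib alternative to virtualenv;
# the workshop itself uses `python3 -m virtualenv venv`).
python3 -m venv venv

# Activate it; your prompt should now show (venv).
source venv/bin/activate

# Install the workshop dependencies inside the environment.
if [ -f requirements.txt ]; then
    pip install -r requirements.txt
fi

# When you're done, leave the environment with:
#   deactivate
```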
Finally, we'll be using some datasets during the workshop, and it's best if you have them downloaded beforehand. To make sure the datasets are on your system, run `python load_datasets.py` with your virtual environment active. You should see some datasets being downloaded through the Keras package. That's all you need to do!
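If you want to double-check that the downloads landed, here is a small stdlib-only sketch. It assumes Keras's default cache location (`~/.keras/datasets`), which is where Keras stores downloaded datasets unless configured otherwise:

```python
from pathlib import Path


def list_keras_datasets(cache_dir: Path = Path.home() / ".keras" / "datasets"):
    """Return the names of dataset files Keras has cached locally, if any."""
    if not cache_dir.is_dir():
        return []
    return sorted(p.name for p in cache_dir.iterdir())


if __name__ == "__main__":
    cached = list_keras_datasets()
    if cached:
        print("Found cached datasets:", ", ".join(cached))
    else:
        print("No datasets cached yet; run `python load_datasets.py` first.")
```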
We will work out of Jupyter notebooks. Run `jupyter notebook` in the virtual environment from the `mlworkshop` directory.