
ML-Course-Notes

Notes for the machine learning course at the UCSF Library.

Visuals

Visualizations are included for the following algorithms:

Decision Tree
Random Forest
Logistic Regression
Support Vector Machine
Neural Net
Naive Bayes

Exercises

Swap out the Random Forest for a different ML algorithm. With both left at their default settings, how do the two compare?
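
As a starting point, here is a minimal sketch of that comparison. It assumes a scikit-learn text-classification setup (vectorizer plus classifier) and uses tiny placeholder texts and labels rather than the course data.

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Placeholder texts and labels -- swap in the course data here.
texts = [
    "the library offers data science workshops",
    "machine learning models need training data",
    "borrow books from the library catalog",
    "random forests combine many decision trees",
    "the reading room is open on weekends",
    "neural networks learn from labeled examples",
    "interlibrary loan requests take a few days",
    "logistic regression predicts class probabilities",
]
labels = [0, 1, 0, 1, 0, 1, 0, 1]  # 0 = library topics, 1 = machine learning topics

# Same vectorizer, two classifiers left at their default settings.
for clf in (RandomForestClassifier(), LogisticRegression()):
    pipeline = make_pipeline(TfidfVectorizer(), clf)
    scores = cross_val_score(pipeline, texts, labels, cv=2)
    print(type(clf).__name__, round(scores.mean(), 2))
```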

See if you can get better performance from the Random Forest by changing some of its parameters. What does n_estimators do?
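
A minimal sketch of varying n_estimators, using toy numeric data rather than the course dataset: n_estimators sets how many trees the forest builds (scikit-learn's default is 100).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Toy numeric data just to illustrate the parameter.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# n_estimators controls how many decision trees are built and averaged.
for n in (5, 50, 500):
    forest = RandomForestClassifier(n_estimators=n, random_state=0).fit(X, y)
    print(f"n_estimators={n}: {len(forest.estimators_)} trees, "
          f"training accuracy {forest.score(X, y):.2f}")
```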

Take a look at the parameters for a vectorizer. What does ngram_range do? Can you view the n-grams it produces?
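
A minimal sketch with scikit-learn's CountVectorizer (the course notebook may use a different vectorizer, such as TfidfVectorizer); the documents here are placeholders.

```python
from sklearn.feature_extraction.text import CountVectorizer

docs = ["the quick brown fox", "the lazy brown dog"]  # placeholder documents

# (1, 1) keeps single words only; (1, 2) adds two-word phrases (bigrams) as well.
for ngram_range in [(1, 1), (1, 2)]:
    vectorizer = CountVectorizer(ngram_range=ngram_range)
    vectorizer.fit(docs)
    print(ngram_range, list(vectorizer.get_feature_names_out()))
```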

Try programming a rules-based (as opposed to machine learning) approach. Can you beat these algorithms?
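
A minimal sketch of what a rules-based baseline could look like: hand-written keyword rules instead of a trained model. The keywords, labels, and example texts are placeholders to adapt to the course data.

```python
# Hand-written rules instead of a trained model.
def rule_based_classifier(text):
    text = text.lower()
    if "forest" in text or "regression" in text or "network" in text:
        return 1  # machine learning topic
    return 0      # everything else

examples = [
    ("random forests combine many decision trees", 1),
    ("the reading room is open on weekends", 0),
    ("logistic regression predicts class probabilities", 1),
    ("borrow books from the library catalog", 0),
]
correct = sum(rule_based_classifier(text) == label for text, label in examples)
print(f"rule-based accuracy: {correct / len(examples):.2f}")
```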