Reading Notes

Some notes on the things that I read

Books

Summarizing books in Markdown doesn't always work well, so sometimes I'll link to Jupyter notebooks instead.

OpenIntro Statistics (book | notes)

OpenIntro Statistics is a freely available textbook that's meant to introduce readers to statistics. The book makes few assumptions about readers' prior knowledge and stays at a fairly basic level. It's a good first introduction to significance tests and statistical modelling.

Reinforcement Learning: An Introduction (book | interactive notes)

Reinforcement Learning is a subarea of Machine Learning where there's no supervisor that tells us the optimal answer or behaviour. Instead, the feedback is delayed and we only receive a numerical rating of our actions. Reinforcement Learning: An Introduction is the canonical book on Reinforcement Learning and gives a good overview of the field. The notes consist of Jupyter notebooks that explain and show implementations of most algorithms from the first two parts of the book.
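
For flavor, here is a minimal sketch of the kind of thing those notebooks cover: an epsilon-greedy bandit agent, the simplest setting from the book's first part. It is my own illustration rather than code from the notes; the point is only that the agent is never told the correct action and learns purely from a noisy numerical reward.

```python
import random

# Minimal epsilon-greedy bandit sketch (illustrative, not from the book's notes).
# The agent never sees a "correct" answer; it only receives a numerical reward
# and keeps a running estimate of each action's value.
def run_bandit(true_means, steps=1000, epsilon=0.1):
    n_actions = len(true_means)
    estimates = [0.0] * n_actions  # estimated value of each action
    counts = [0] * n_actions       # how often each action was tried

    for _ in range(steps):
        # Explore with probability epsilon, otherwise exploit the current best estimate.
        if random.random() < epsilon:
            action = random.randrange(n_actions)
        else:
            action = max(range(n_actions), key=lambda a: estimates[a])

        # The environment returns only a noisy numerical reward.
        reward = random.gauss(true_means[action], 1.0)

        # Incremental sample-average update of the chosen action's value estimate.
        counts[action] += 1
        estimates[action] += (reward - estimates[action]) / counts[action]

    return estimates

# The estimates should converge towards the true means, here roughly [0.2, 0.5, 0.8].
print(run_bandit([0.2, 0.5, 0.8]))
```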

A Tour of C++ (book | notes)

Bjarne Stroustrup, the creator of C++, wrote a ~1300-page book on C++. That book supposedly covers most of what there is to know about C++ and starts with an overview of the fundamentals. A Tour of C++ is a much shorter book based on this overview. It is meant for people who are already very familiar with programming in general and might already have used some C++. The book is great for quickly getting up to speed with C++. There are updated editions for the newest C++ versions, and the book contains some interesting insights from Stroustrup.

Game Theory: A Nontechnical Introduction (book | notes)

Game theory is the science of decision making. It can be used to mathematically formalize how strategies should be chosen, how voting power can be measured, or how players might cooperate. I was looking for a short introduction to the subject. In particular, I wanted a book that would be faster to read than a textbook but still had mathematical rigor. After looking for such a book for quite a long time, I settled on Game Theory: A Nontechnical Introduction. It did not contain that much math but still seemed like the closest thing to what I was looking for.

A Random Walk Down Wall Street (book | notes)

This investment book is split into four parts. The first part describes how people historically lost money in bubbles. The second and third parts give an overview of how academics and Wall Street, respectively, go about investing. In the last part, the author gives his personal recommendations for how to invest. The first edition of this book, published in 1973, was influential in recommending that something like index funds ought to be created, and it popularized the approach of investing broadly in the market.

Secrets of Sand Hill Road: Venture Capital and How to Get It (book | notes)

Named after the road that most Silicon-Valley-based Venture Capital (VC) firms are located on, this book gives a thorough overview of how VC works. It explains what kinds of companies VC funds, what the life cycle of a VC-funded company is like, and how VC funds are managed. I especially liked that the book talks a lot about the incentives various players have and about how decisions affect share dilution.

Never Split the Difference: Negotiating as If Your Life Depended on It (book | notes)

Written by the FBI's former head of hostage negotiation, this book describes the art of negotiation. It outlines fundamentals to pay attention to during negotiations and uses a lot of stories to emphasize these points.

Software Engineering at Google: Lessons Learned from Programming Over Time (book | notes)

This book gives a broad overview of software engineering techniques used by Google. There's lots of incredible advice and I genuinely think that reading this taught me more than my software engineering class in university did.

The Man Who Solved the Market: How Jim Simons Launched the Quant Revolution (book | notes)

Jim Simons was a math professor who created the world's most successful hedge fund and was the first to apply early forms of machine learning to investing. This is his story.

Flour Water Salt Yeast: The Fundamentals of Artisan Bread and Pizza (book | notes)

This is a guide on how to bake bread and pizza. It starts with the basics and is an ideal book for learning to bake bread.

Drive: The Surprising Truth About What Motivates Us (book | notes)

There are different kinds of motivation: biological, extrinsic, and intrinsic. While intrinsic motivation is the strongest, it can only grow under certain conditions.

Game Programming Patterns (book | notes)

This book discusses 19 design patterns that are particularly useful for game development. It is much more fun to read than design pattern books I have seen previously, and the patterns are applicable to much more than just games.

Who Am I?: And If So, How Many? (book | notes | review)

Who Am I? And If So, How Many? is an introduction to philosophy’s biggest topics. Each chapter of the book discusses a different question, ranging from the meaning of life to ethical questions. The author not only discusses the philosophical questions very well but also narrates the history of how they came to be.

Papers

  1. The Unreasonable Effectiveness of Data
  2. Exploring the structure of a real-time, arbitrary neural artistic stylization network (neural style transfer)
  3. Why Google Stores Billions of Lines of Code in a Single Repository (Google's monorepo)
  4. Are you living in a computer simulation? (simulation argument)
  5. Software Engineering at Google
  6. TensorFlow: A system for large-scale machine learning (2016 whitepaper)
  7. TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems (2015 whitepaper)
  8. Generative Adversarial Nets
  9. MapReduce: Simplified Data Processing on Large Clusters
  10. FlumeJava: Easy, Efficient Data-Parallel Pipelines (Google's MapReduce successor)
  11. Neural Machine Translation by Jointly Learning to Align and Translate (attention)
  12. Reflections on Trusting Trust (Thompson’s Turing Award lecture)
  13. A Decomposable Attention Model for Natural Language Inference
  14. Attention Is All You Need (Transformer architecture)
  15. An Improved Data Stream Summary: The Count-Min Sketch and its Applications
  16. What is Data Sketching, and Why Should I Care?

Conventions in this repository

  • All citations are in MLA format

Papers

  • After the title, the paper is linked as a [pdf]
  • Next, the paper is cited in the MLA format in italics
  • This is followed by a short abstract of around three sentences
  • The remaining headings roughly follow the structure of the paper