
Primary language: Jupyter Notebook. License: MIT.

Silverpond AI Internship Test

Hello! Hola! Kon'nichiwa! Ni-hao! Hallå! Aloha! !مرحبا

We'd like you to prepare some code in a Jupyter Notebook that explains a concept in machine learning/AI/deep learning that interests you. You may use TensorFlow, PyTorch, or any other framework of your choice, as long as your work is in notebook form. We're assuming you know some Python and how to make your way around a notebook; if not, it's worth learning!

As a team we're particularly interested in, and will be looking for:

  • Good explanations,
  • Modular and idiomatic code,
  • Clear goals,
  • Tying-back final output to original goals,
  • Different viewpoints,
  • Fundamental understanding,
  • Visual explanations,
  • Awareness of knowledge boundaries ("What do I know about how much I know?").

As a reference, your notebook need not be any longer than the prerequisite notebook we use for the deep learning workshop:

We would like this notebook to be hosted on your own GitHub (or other public git source control system; we want to know you can use Git!). Don't worry if it's the only thing on your account!

Please try not to spend more than 6 hours (we'd aim for 2-4 hours); so either explain a concept you're already familiar with, or perform a brief investigation into the open areas of an interesting technique. If you get stuck, documenting your current understanding and future directions of investigation would be a nice way to conclude. Also, feel free to get in contact at any time to chat through your thoughts; it's encouraged!

Potential example projects:

Here are some ideas that should indicate the general direction we're interested in. Please feel free to come up with your own!

  1. Playing with convolutions: A notebook that looks at convolutions, purely independently of deep learning, and tests a few different varieties on images.

  2. Algebra on auto-encoders: A notebook that loads up an existing auto-encoder model and performs some algebra on its latent vectors, to demonstrate what its capabilities are.

  3. t-SNE: A notebook that takes a model that produces some latent vector z, performs t-SNE on it, and then explains what t-SNE is doing.

  4. Bayesian probability: A notebook that demonstrates how Bayesian probability works, with some examples of prior updates and computations of the posterior in a few different scenarios.

  5. Bias in ML: A notebook that explains how bias can be found in ML models, and highlights a few ways that people are attempting to approach this problem.

  6. Precision scores: An overview of the different measures that people use to assess the capability of different ML models.

  7. An Exploration of the TensorFlow Tensor: A review of what Tensors are in the context of TensorFlow, the various operations that can be performed on them, and what their effect is.

  8. Learning about Fully Connected Networks: A tour through your own learning of how to build a few fully-connected networks in PyTorch or TensorFlow; perhaps with some investigation into how many parameters they have as the input grows, or similar visual explanation of what is going on under the hood.

  9. TensorBoard Visualisation: An example of using TensorBoard for visualising different graphs, images, and perhaps even some potential embeddings! It doesn't necessarily need to use a real neural network; it just needs to demonstrate a few features of TensorBoard itself.
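To give a flavour of the scale we have in mind, idea 1 could start as small as a hand-rolled 2D convolution in plain NumPy. This is only an illustrative sketch (the image and kernel are made up), not a prescription for your notebook:

```python
import numpy as np

def convolve2d(image, kernel):
    """Naive 2D convolution: no padding, stride 1."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    # Flip the kernel so this is a true convolution, not cross-correlation.
    k = kernel[::-1, ::-1]
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * k)
    return out

# A tiny synthetic "image" with a vertical edge down the middle.
image = np.zeros((8, 8))
image[:, 4:] = 1.0

# A simple horizontal-gradient (edge-detecting) kernel.
kernel = np.array([[-1.0, 0.0, 1.0]] * 3)

edges = convolve2d(image, kernel)
print(edges)  # non-zero only in the columns straddling the edge
```

A notebook version might then compare a few kernels (blur, sharpen, edge) on a real photo and visualise the outputs side by side.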
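Similarly, idea 4 can begin with a single worked prior-to-posterior update. The numbers below (a 1% base rate and a 95%-sensitive test) are just a hypothetical diagnostic-test scenario:

```python
from fractions import Fraction

prior = Fraction(1, 100)          # P(disease): 1% base rate
sensitivity = Fraction(95, 100)   # P(positive | disease)
false_pos = Fraction(5, 100)      # P(positive | no disease)

# Bayes' rule: P(disease | positive) = P(pos | d) * P(d) / P(pos)
evidence = sensitivity * prior + false_pos * (1 - prior)
posterior = sensitivity * prior / evidence

print(posterior, float(posterior))  # 19/118, roughly 0.161
```

A full notebook could repeat the update with different priors, or chain several updates as new evidence arrives.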
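And for idea 6, the core measures are small enough to compute by hand before reaching for a library. The toy labels and predictions here are invented for illustration:

```python
# Toy binary classification results.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # true positives
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # false positives
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # false negatives

precision = tp / (tp + fp)                        # of predicted positives, how many were right
recall = tp / (tp + fn)                           # of actual positives, how many were found
f1 = 2 * precision * recall / (precision + recall)

print(precision, recall, f1)  # 0.75 0.75 0.75
```

A notebook on this topic might go on to contrast these with accuracy on an imbalanced dataset, where accuracy is misleading.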

Notes

If you wish, you could configure your repo so that it can be executed in the MyBinder.org environment, but this isn't necessary!

Fun Datasets

I personally quite like "Fashion-MNIST", if you wanted to play around with that:

Magenta is another interesting place to look, but it can be quite involved to get up and running with:

The TensorFlow Playground also contains some fun inspiration:

CIFAR-10 is a classic image dataset that could be fun to explore:

And if that's not enough, consider the Computer Vision Online list of datasets:

If you find other interesting ones, let us know!