Annotation 1: This collection is still gathering deep learning tutorials of all kinds. Even in its current, partial state, it can help you get started with Deep Learning and improve your skills.
Annotation 2: Some of the video tutorials were collected by JiQiZhiXin. In some countries and regions, most of the videos may require special network access (e.g., a VPN) to view.
TUTORIAL | DESCRIPTION |
---|---|
Deep Learning SIMPLIFIED | Are you overwhelmed by overly-technical explanations of Deep Learning? If so, this series will bring you up to speed on this fast-growing field – without any of the math or code. Deep Learning is an important subfield of Artificial Intelligence (AI) that connects various topics like Machine Learning, Neural Networks, and Classification. The field has advanced significantly over the years due to the works of giants like Andrew Ng, Geoff Hinton, Yann LeCun, Adam Gibson, and Andrej Karpathy. Many companies have also invested heavily in Deep Learning and AI research - Google with DeepMind and its Driverless car, nVidia with CUDA and GPU computing, and recently Toyota with its new plan to allocate one billion dollars to AI research. You've probably looked up videos on YouTube and found that most of them contain too much math for a beginner. The few videos that promise to just present concepts are usually still too high level for someone getting started. Any videos that show complicated code just make these problems worse for the viewers. There’s nothing wrong with technical explanations, and to go far in this field you must understand them at some point. However, Deep Learning is a complex topic with a lot of information, so it can be difficult to know where to begin and what path to follow. The goal of this series is to give you a road map with enough detail that you’ll understand the important concepts, but not so much detail that you’ll feel overwhelmed. The hope is to further explain the concepts that you already know and bring to light the concepts that you need to know. In the end, you’ll be able to decide whether or not to invest additional time on this topic. So while the math and the code are important, you will see neither in this series. The focus is on the intuition behind Deep Learning – what it is, how to use it, who’s behind it, and why it’s important. You'll first get an overview of Deep Learning and a brief introduction of how to choose between different models. Then we'll see some use cases. After that, we’ll discuss various Deep Learning tools including important software libraries and platforms where you can build your own Deep Nets. |
Bay Area Deep Learning School Day 1 at CEMEX auditorium, Stanford | Day 1 of Bay Area Deep Learning School featuring speakers Hugo Larochelle, Andrej Karpathy, Richard Socher, Sherry Moore, Ruslan Salakhutdinov and Andrew Ng. |
Bay Area Deep Learning School Day 2 at CEMEX auditorium, Stanford | Day 2 of Bay Area Deep Learning School featuring speakers John Schulman, Pascal Lamblin, Adam Coates, Alex Wiltschko, Quoc Le and Yoshua Bengio. |
Tutorial: Deep Learning | Deep Learning allows computational models composed of multiple processing layers to learn representations of data with multiple levels of abstraction. These methods have dramatically improved the state-of-the-art in speech recognition, visual object recognition, object detection, and many other domains such as drug discovery and genomics. Deep learning discovers intricate structure in large datasets by using the back-propagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer. Deep convolutional nets have brought about dramatic improvements in processing images, video, speech and audio, while recurrent nets have shone on sequential data such as text and speech. Representation learning is a set of methods that allows a machine to be fed with raw data and to automatically discover the representations needed for detection or classification. Deep learning methods are representation learning methods with multiple levels of representation, obtained by composing simple but non-linear modules that each transform the representation at one level (starting with the raw input) into a representation at a higher, slightly more abstract level. This tutorial will introduce the fundamentals of deep learning, discuss applications, and close with challenges ahead. (A minimal NumPy back-propagation sketch follows this table.) |
Deep Learning with Neural Networks and TensorFlow Introduction | Welcome to a new section in our Machine Learning Tutorial series: Deep Learning with Neural Networks and TensorFlow. The artificial neural network is a biologically-inspired approach to machine learning, intended to mimic your brain (a biological neural network). The artificial neural network, which I will now just refer to as a neural network, is not a new concept. The idea has been around since the 1940s and has had a few ups and downs, most notably in comparison with the Support Vector Machine (SVM). The neural network was popular up until the mid 90s, when it was shown that the SVM, using a then newly publicized technique known as the "Kernel Trick" (the technique itself was thought up long before it was actually put to use), was capable of working with non-linearly separable datasets. With this, the Support Vector Machine catapulted to the front again, leaving neural nets behind, and little of interest happened until about 2011, when Deep Neural Networks began to take hold and outperform the Support Vector Machine, thanks to new techniques, the availability of huge datasets, and much more powerful computers. |
Neural Networks for Machine Learning | Learn about artificial neural networks and how they're being used for machine learning, as applied to speech and object recognition, image segmentation, modeling language and human motion, etc. We'll emphasize both the basic algorithms and the practical tricks needed to get them to work well. This course contains the same content presented on Coursera beginning in 2013. It is not a continuation or update of the original course. It has been adapted for the new platform. Please be advised that the course is suited for an intermediate level learner - comfortable with calculus and with experience programming (Python). |
Build a TensorFlow Image Classifier in 5 Min | In this episode we're going to train our own image classifier to detect Darth Vader images. The challenge for this episode is to create your own Image Classifier that would be a useful tool for scientists. Just post a clone of this repo that includes your retrained Inception Model (label it output_graph.pb). If it's too big for GitHub, just upload it to DropBox and post the link in your GitHub README. I'm going to judge all of them and the winner gets a shoutout from me in a future video, as well as a signed copy of my book 'Decentralized Applications'. |
Build a Neural Net in 4 Minutes | Curious just how much neural networks are inspired by brain architecture? Take some time to learn about the human brain! This is my favorite intro to neuroscience course. |
Neural Network that Changes Everything - Computerphile | Years of work down the drain, the convolutional neural network is a step change in image classification accuracy. Image Analyst Dr Mike Pound explains what it does. |
Wide & Deep Learning with TensorFlow - Machine Learning | Wide & Deep Learning combines the power of memorization and generalization by jointly training wide linear models and deep neural networks. We've open-sourced the implementation with an easy-to-use API in TensorFlow. It's effective for generic large-scale regression and classification problems with sparse inputs, such as recommender systems, search, ranking problems and more. We hope you find it useful in your machine learning projects. (A hedged TensorFlow sketch of the wide & deep setup follows this table.) |
Deep Learning - Computerphile | Google, Facebook & Amazon all use deep learning methods, but how does it work? Research Fellow & Deep Learning Expert Brais Martinez explains. |
Deep Learning Demystified | An explanation for deep neural networks with no fancy math, no computer jargon. For slides, related posts and other videos, check out the blog post. |
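
The "Tutorial: Deep Learning" entry above describes the core mechanism: each layer transforms the previous layer's representation, and back-propagation tells the machine how to adjust its internal parameters. As a rough, self-contained illustration of that idea (not code from the tutorial itself), here is a two-layer network trained with back-propagation in plain NumPy; the XOR data, layer sizes, learning rate, and iteration count are arbitrary choices for the sketch:

```python
# A tiny two-layer network trained with back-propagation in plain NumPy.
# It only illustrates the "each layer transforms the previous representation,
# back-propagation adjusts the parameters" idea; all settings are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)     # input -> hidden
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)     # hidden -> output
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    # Forward pass: each layer computes a new representation of its input.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: propagate the error gradient from the output layer back.
    d_out = (out - y) * out * (1 - out)           # squared-error loss gradient
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient descent step on every parameter.
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))   # should end up close to [[0], [1], [1], [0]]
```
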
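The "Wide & Deep Learning with TensorFlow" entry refers to the jointly trained wide-linear-plus-deep setup that Google open-sourced in TensorFlow. Below is a minimal, hedged sketch of that setup using the TF 1.x estimator API (`tf.estimator.DNNLinearCombinedClassifier`); the feature names, vocabulary, toy input pipeline, and hyperparameters are invented for illustration and are not taken from the official tutorial, and module paths have moved between TensorFlow releases:

```python
# A hedged sketch of the wide & deep setup with the TF 1.x estimator API.
# The features ("age", "occupation"), vocabulary, toy input pipeline, and
# hyperparameters are invented for illustration; treat this as a sketch
# rather than the canonical usage from the official tutorial.
import tensorflow as tf

# "Wide" side: sparse/categorical columns let a linear model memorize
# specific feature co-occurrences.
occupation = tf.feature_column.categorical_column_with_vocabulary_list(
    "occupation", ["tech", "sales", "other"])
wide_columns = [occupation]

# "Deep" side: dense columns (plus embeddings of the sparse ones) let a
# neural network generalize to unseen feature combinations.
age = tf.feature_column.numeric_column("age")
deep_columns = [age, tf.feature_column.embedding_column(occupation, dimension=8)]

model = tf.estimator.DNNLinearCombinedClassifier(
    linear_feature_columns=wide_columns,
    dnn_feature_columns=deep_columns,
    dnn_hidden_units=[64, 32])

def input_fn():
    # Tiny in-memory batch standing in for a real input pipeline.
    features = {"age": tf.constant([[25.0], [40.0]]),
                "occupation": tf.constant([["tech"], ["sales"]])}
    labels = tf.constant([[1], [0]])
    return features, labels

model.train(input_fn=input_fn, steps=10)
```
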
TUTORIAL | DESCRIPTION |
---|---|
Recurrent Neural Networks Yoshua Bengio | The Montreal Deep Learning Summer School drew experts and practitioners from many backgrounds and age groups. The tutorials aim to give attendees a solid foundational understanding of deep learning and neural networks. They include Yoshua Bengio on recurrent neural networks, Surya Ganguli on theoretical neuroscience and deep learning theory, Sumit Chopra on reasoning and attention, Jeff Dean on large-scale machine learning with TensorFlow, Ruslan Salakhutdinov on learning deep generative models, Ryan Olson on GPU programming for deep learning, and many other talks. |
Geoffrey French - Deep learning tutorial - advanced techniques | Some of the more advanced deep learning techniques to help you get the best out of it in a practical setting. The main focus is on computer vision and image processing. In the last few years, deep neural networks have been used to generate state-of-the-art results in image classification, segmentation and object detection. They have also successfully been used for speech recognition. In this tutorial we build on the basics, demonstrating techniques that are useful in a practical setting. Along with tips and tricks found to be useful, we will discuss the following: active learning (training a neural network using less training data); using pre-trained networks (taking the body of a pre-trained network, e.g. an ImageNet network, and re-using it for localisation or for locating objects not in the original training set); and having fun with neural networks (some of the fun techniques that have been demonstrated in the last couple of years). |
Andrew Ng - Deep Learning in Practice: Speech Recognition and Beyond | Andrew Ng needs no further introduction; everyone knows his contributions to deep learning, and he was one of the first people in the world to recognize its potential. In this one-on-one conversation, he shares his experience from his deep learning research and the technological advances deep learning has brought. He notes that progress in big data is disrupting today's industries. Watch this video to learn more about the future of deep learning and data science. |
Google's Deep Mind Explained! - Self Learning A.I. | AlphaGo's victory over former Go world champion Lee Sedol was a historic moment. Every time machines surpass humans, a new round of social progress follows. Google DeepMind says it is bringing next-generation artificial intelligence and its ambitions to the effort of building systems smart enough to act on their own. This video explains the origins of DeepMind and the kind of changes it could bring to the field of artificial intelligence. |
NVIDIA at CES 2016 - Self Driving Cars and Deep Learning GPUs | NVIDIA CEO Jensen Huang shares how deep learning and research are changing the face of self-driving cars, and the story of how they are becoming a reality. He opens by introducing the world's first artificial intelligence supercomputer designed by NVIDIA for self-driving cars. He also explains how GPUs are being used to meet the demands of deep neural networks and big data. How are deep learning and artificial intelligence turning the impossible into the possible? This video will open your mind. |
9 Cool Deep Learning Applications | Two Minute Papers | Machine learning provides us an incredible set of tools. If you have a difficult problem at hand, you don't need to hand craft an algorithm for it. It finds out by itself what is important about the problem and tries to solve it on its own. In this video, you'll see a number of incredible applications of different machine learning techniques (neural networks, deep learning, convolutional neural networks and more). |
Deep Learning Program Learns to Paint | Two Minute Papers | Artificial neural networks were inspired by the human brain and simulate how neurons behave when they are shown a sensory input (e.g., images, sounds, etc). They are known to be excellent tools for image recognition, and many other problems beyond that - they also excel at weather prediction, breast cancer cell mitosis detection, brain image segmentation and toxicity prediction, among many others. Deep learning means that we use an artificial neural network with multiple layers, making it even more powerful for more difficult tasks. This time neural networks have been shown to be apt at reproducing the artistic style of many famous painters, such as Vincent van Gogh and Pablo Picasso, among many others. All the user needs to do is provide an input photograph and a target image from which the artistic style will be learned. |
Tutorial: Introduction to Reinforcement Learning with Function Approximation | Reinforcement learning is a body of theory and techniques for optimal sequential decision making developed in the last thirty years primarily within the machine learning and operations research communities, and which has separately become important in psychology and neuroscience. This tutorial will develop an intuitive understanding of the underlying formal problem (Markov decision processes) and its core solution methods, including dynamic programming, Monte Carlo methods, and temporal-difference learning. It will focus on how these methods have been combined with parametric function approximation, including deep learning, to find good approximate solutions to problems that are otherwise too large to be addressed at all. Finally, it will briefly survey some recent developments in function approximation, eligibility traces, and off-policy learning. (A toy tabular Q-learning sketch follows this table.) |
Deep Reinforcement Terrain Learning | Two Minute Papers | In this piece of work, a combination of deep learning and reinforcement learning is presented which has proven to be useful in solving many extremely difficult tasks. Google DeepMind built a system that can play Atari games at a superhuman level using this technique that is also referred to as Deep Q-Learning. This time, it was used to teach digital creatures to walk and overcome challenging terrain arrangements. |
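
The reinforcement learning tutorial above builds up from Markov decision processes to temporal-difference learning before adding function approximation, and the Deep Q-Learning entry applies the same idea with a deep network. The following toy sketch shows plain tabular Q-learning (no function approximation or deep network) on a 5-state chain; the environment, rewards, and hyperparameters are invented for illustration and do not come from either resource:

```python
# A toy tabular Q-learning agent, sketching the MDP / temporal-difference
# ideas from the tutorial above. The 5-state chain environment and all
# hyperparameters are invented for illustration.
import random

N_STATES = 5          # states 0..4; state 4 is terminal and pays reward 1
ACTIONS = [-1, +1]    # step left or right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    """Deterministic chain dynamics: reward 1 only on reaching the last state."""
    nxt = min(max(state + action, 0), N_STATES - 1)
    done = nxt == N_STATES - 1
    return nxt, (1.0 if done else 0.0), done

def greedy(state):
    # Break ties randomly so the initial all-zero table still explores.
    best = max(Q[(state, a)] for a in ACTIONS)
    return random.choice([a for a in ACTIONS if Q[(state, a)] == best])

for _ in range(200):                      # episodes
    s = 0
    for _ in range(100):                  # step cap per episode
        a = random.choice(ACTIONS) if random.random() < EPS else greedy(s)
        s2, r, done = step(s, a)
        # Temporal-difference (Q-learning) update toward the bootstrapped target.
        target = r if done else r + GAMMA * max(Q[(s2, x)] for x in ACTIONS)
        Q[(s, a)] += ALPHA * (target - Q[(s, a)])
        s = s2
        if done:
            break

# Greedy action per non-terminal state: should be +1 (head for the goal).
print({s: greedy(s) for s in range(N_STATES - 1)})
```
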
(E)BOOK | DESCRIPTION |
---|---|
Neural Networks and Deep Learning | Neural networks and deep learning currently provide the best solutions to many problems in image recognition, speech recognition, and natural language processing. This book will teach you many of the core concepts behind neural networks and deep learning. |
Neural Networks and Deep Learning - Translation in Chinese | A Chinese translation of Neural Networks and Deep Learning. |
(E)BOOK | DESCRIPTION |
---|---|
Neural Networks and Deep Learning | This tutorial will teach you the main ideas of Unsupervised Feature Learning and Deep Learning. By working through it, you will also get to implement several feature learning/deep learning algorithms, get to see them work for yourself, and learn how to apply/adapt these ideas to new problems. This tutorial assumes a basic knowledge of machine learning (specifically, familiarity with the ideas of supervised learning, logistic regression, gradient descent). If you are not familiar with these ideas, we suggest you go to this Machine Learning course and complete sections II, III, IV (up to Logistic Regression) first. (A short NumPy logistic-regression sketch follows this table.) |
Deep Learning | Often said to be the best and most authoritative book on Deep Learning. |
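
The tutorial entry above assumes familiarity with supervised learning, logistic regression, and gradient descent. As a quick refresher (this sketch is not an exercise from the tutorial), here is logistic regression trained with batch gradient descent in NumPy; the toy data, learning rate, and iteration count are invented for illustration:

```python
# A minimal NumPy sketch of the logistic-regression-plus-gradient-descent
# prerequisite mentioned above. Toy data and hyperparameters are invented.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                     # toy 2-D inputs
y = (X[:, 0] + X[:, 1] > 0).astype(float)         # linearly separable labels

w = np.zeros(2)                                   # weights
b = 0.0                                           # bias
lr = 0.1                                          # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(1000):
    p = sigmoid(X @ w + b)                        # predicted probabilities
    grad_w = X.T @ (p - y) / len(y)               # gradient of mean cross-entropy
    grad_b = np.mean(p - y)
    w -= lr * grad_w                              # gradient descent step
    b -= lr * grad_b

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == (y > 0.5))
print(f"training accuracy: {accuracy:.2f}")
```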