deep learning notes download
weixin_39820780
2019-10-06 05:30:27
These are the course notes from Andrew Ng's machine learning course, covering the main algorithms from the machine learning portion.
Related download link:
//download.csdn.net/download/weixin_39303862/10697372?utm_source=bbsseo
Deep Learning Toolbox Release Notes
Deep Learning Tutorial Release 0.1
Deep Learning is a new area of Machine Learning research, which has been introduced with the objective of moving Machine Learning closer to one of its original goals: Artificial Intelligence. See these course notes for a brief introduction to Machine Learning for AI and an introduction to Deep Learning algorithms.
Deep Learning: introductory papers on deep learning
1. Surveys. First are the survey papers: "Representation Learning: A Review and New Perspectives" (2013) and "Deep Learning in Neural Networks: An Overview" (2015). The more recent of the two has been uploaded.
3. Distributed computing. The papers on distributed computing address the practical problem of computational capacity. There are two papers from 2012, Building High-level Features Using Large Scale Unsupervised Learning and Large Scale Distributed Deep Networks. The latter is the stronger of the two: it was the first to mention using GPUs to speed up deep learning computation, and it roughly describes how to program a deep learning framework for parallel computation across multiple GPGPUs. That paper has therefore been uploaded.
4. Specific algorithms. Then come the representative papers on specific algorithms, covering K-means, single-layer unsupervised networks, convolutional networks (CNNs), multi-stage architectures, Maxout, and reinforcement learning, listed as follows:
2006: Notes on Convolutional Neural Networks
2009: What is the Best Multi-Stage Architecture for Object Recognition
2011: An Analysis of Single-Layer Networks in Unsupervised Feature Learning
2012: Learning Feature Representations with K-means
2012: Sparse Filtering (which also covers RBMs, auto-encoders, etc.)
2014: Improving deep neural network acoustic models using generalized maxout networks
2014: Adolescent-specific patterns of behavior and neural activity during social reinforcement learning
2015: Reinforcement learning models and their neural correlates: An activation likelihood estimation meta-analysis, and Human-level control through deep reinforcement learning
Udemy - Deep Learning: Convolutional Neural Networks in Python
https://www.udemy.com/deep-learning-convolutional-neural-networks-theano-tensorflow/
Deep Learning: Convolutional Neural Networks in Python
Computer Vision and Data Science and Machine Learning combined! In Theano and TensorFlow
Created by Lazy Programmer Inc. Last updated 5/2017. English.

What Will I Learn?
Understand convolution
Understand how convolution can be applied to audio effects
Understand how convolution can be applied to image effects
Implement Gaussian blur and edge detection in code
Implement a simple echo effect in code
Understand how convolution helps image classification
Understand and explain the architecture of a convolutional neural network (CNN)
Implement a convolutional neural network in Theano
Implement a convolutional neural network in TensorFlow

Requirements
Install Python, Numpy, Scipy, Matplotlib, Scikit Learn, Theano, and TensorFlow
Learn about backpropagation from Deep Learning in Python part 1
Learn about Theano and TensorFlow implementations of Neural Networks from Deep Learning part 2

Description
This is the 3rd part in my Data Science and Machine Learning series on Deep Learning in Python. At this point, you already know a lot about neural networks and deep learning, including not just the basics like backpropagation, but how to improve it using modern techniques like momentum and adaptive learning rates. You’ve already written deep neural networks in Theano and TensorFlow, and you know how to run code using the GPU.

This course is all about how to use deep learning for computer vision using convolutional neural networks. These are the state of the art when it comes to image classification, and they beat vanilla deep networks at tasks like MNIST.

In this course we are going to up the ante and look at the StreetView House Number (SVHN) dataset – which uses larger color images at various angles – so things are going to get tougher both computationally and in terms of the difficulty of the classification task. But we will show that convolutional neural networks, or CNNs, are capable of handling the challenge!

Because convolution is such a central part of this type of neural network, we are going to go in-depth on this topic. It has more applications than you might imagine, such as modeling artificial organs like the pancreas and the heart. I’m going to show you how to build convolutional filters that can be applied to audio, like the echo effect, and I’m going to show you how to build filters for image effects, like the Gaussian blur and edge detection (a brief illustrative sketch of these filters appears right after this description). We will also do some biology and talk about how convolutional neural networks have been inspired by the animal visual cortex.

After describing the architecture of a convolutional neural network, we will jump straight into code, and I will show you how to extend the deep neural networks we built last time (in part 2) with just a few new functions to turn them into CNNs (a rough sketch of such a network in modern TensorFlow appears at the end of this course listing). We will then test their performance and show how convolutional neural networks written in both Theano and TensorFlow can outperform the accuracy of a plain neural network on the StreetView House Number dataset.

All the materials for this course are FREE. You can download and install Python, Numpy, Scipy, Theano, and TensorFlow with simple commands shown in previous courses.

This course focuses on “how to build and understand“, not just “how to use”. Anyone can learn to use an API in 15 minutes after reading some documentation. It’s not about “remembering facts”, it’s about “seeing for yourself” via experimentation. It will teach you how to visualize what’s happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.
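The filters mentioned above (Gaussian blur, edge detection, and the audio echo) are all instances of the same convolution operation. Below is a minimal sketch of them in Numpy/Scipy; it is illustrative only and not the course's code, and the kernel sizes, sigma, and delay/decay values are arbitrary choices.

```python
# Minimal sketch of the image/audio filters mentioned above, expressed as plain
# convolutions. Assumes a grayscale image as a 2D float array and a 1D audio signal.
import numpy as np
from scipy.signal import convolve2d

def gaussian_kernel(size=5, sigma=1.0):
    """Build a normalized 2D Gaussian kernel."""
    ax = np.arange(size) - (size - 1) / 2.0
    xx, yy = np.meshgrid(ax, ax)
    kernel = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return kernel / kernel.sum()

# A simple edge-detection kernel (Laplacian); Sobel filters would also work.
laplacian = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

def blur(image, size=5, sigma=1.0):
    """Gaussian blur = convolve the image with a Gaussian kernel."""
    return convolve2d(image, gaussian_kernel(size, sigma), mode="same", boundary="symm")

def edges(image):
    """Edge detection = convolve the image with a Laplacian kernel."""
    return convolve2d(image, laplacian, mode="same", boundary="symm")

# The echo effect is the same idea in 1D: convolve the signal with an impulse
# response containing a delayed, attenuated spike.
def echo(signal, delay_samples=8000, decay=0.5):
    impulse = np.zeros(delay_samples + 1)
    impulse[0] = 1.0       # the original signal
    impulse[-1] = decay    # a quieter copy, delay_samples later
    return np.convolve(signal, impulse)
```

The point of these hand-built filters, as the description suggests, is intuition: a CNN learns its kernels from data instead of having them fixed by hand.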
NOTES: All the code for this course can be downloaded from my github: /lazyprogrammer/machine_learning_examples, in the directory: cnn_class. Make sure you always “git pull” so you have the latest version!

HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:
calculus
linear algebra
probability
Python coding: if/else, loops, lists, dicts, sets
Numpy coding: matrix and vector operations, loading a CSV file
Can write a feedforward neural network in Theano and TensorFlow

TIPS (for getting through the course):
Watch it at 2x.
Take handwritten notes. This will drastically increase your ability to retain the information.
Write down the equations. If you don’t, I guarantee it will just look like gibberish.
Ask lots of questions on the discussion board. The more the better!
Realize that most exercises will take you days or weeks to complete.
Write code yourself, don’t just sit there and look at my code.

USEFUL COURSE ORDERING:
(The Numpy Stack in Python)
Linear Regression in Python
Logistic Regression in Python
(Supervised Machine Learning in Python)
(Bayesian Machine Learning in Python: A/B Testing)
Deep Learning in Python
Practical Deep Learning in Theano and TensorFlow
(Supervised Machine Learning in Python 2: Ensemble Methods)
Convolutional Neural Networks in Python
(Easy NLP)
(Cluster Analysis and Unsupervised Machine Learning)
Unsupervised Deep Learning
(Hidden Markov Models)
Recurrent Neural Networks in Python
Artificial Intelligence: Reinforcement Learning in Python
Natural Language Processing with Deep Learning in Python

Who is the target audience?
Students and professional computer scientists
Software engineers
Data scientists who work on computer vision tasks
Those who want to apply deep learning to images
Those who want to expand their knowledge of deep learning past vanilla deep networks
People who don’t know what backpropagation is or how it works should not take this course, but instead, take parts 1 and 2.
People who are not comfortable with Theano and TensorFlow basics should take part 2 before taking this course.
Udemy - Deep Learning: Recurrent Neural Networks in Python
https://www.udemy.com/deep-learning-recurrent-neural-networks-in-python/
Deep Learning: Recurrent Neural Networks in Python
GRU, LSTM, + more modern deep learning, machine learning, and data science for sequences
Created by Lazy Programmer Inc. Last updated 5/2017. English.

What Will I Learn?
Understand the simple recurrent unit (Elman unit)
Understand the GRU (gated recurrent unit)
Understand the LSTM (long short-term memory unit)
Write various recurrent networks in Theano
Understand backpropagation through time
Understand how to mitigate the vanishing gradient problem
Solve the XOR and parity problems using a recurrent neural network
Use recurrent neural networks for language modeling
Use RNNs for generating text, like poetry
Visualize word embeddings and look for patterns in word vector representations

Requirements
Calculus
Linear algebra
Python, Numpy, Matplotlib
Write a neural network in Theano
Understand backpropagation
Probability (conditional and joint distributions)
Write a neural network in Tensorflow

Description
Like the course I just released on Hidden Markov Models, Recurrent Neural Networks are all about learning sequences – but whereas Markov Models are limited by the Markov assumption, Recurrent Neural Networks are not – and as a result, they are more expressive and more powerful than anything we’ve seen on tasks that we haven’t made progress on in decades.

So what’s going to be in this course, and how will it build on the previous neural network courses and Hidden Markov Models?

In the first section of the course we are going to add the concept of time to our neural networks. I’ll introduce you to the Simple Recurrent Unit, also known as the Elman unit. We are going to revisit the XOR problem, but we’re going to extend it so that it becomes the parity problem – you’ll see that regular feedforward neural networks will have trouble solving this problem, but recurrent networks will work because the key is to treat the input as a sequence (a small Numpy sketch of such a unit appears right after this description).

In the next section of the course, we are going to revisit one of the most popular applications of recurrent neural networks – language modeling. You saw when we studied Markov Models that we could do things like generate poetry, and it didn’t look too bad. We could even discriminate between 2 different poets just from the sequence of parts-of-speech tags they used. In this course, we are going to extend our language model so that it no longer makes the Markov assumption.

Another popular application of neural networks for language is word vectors or word embeddings. The most common technique for this is called Word2Vec, but I’ll show you how recurrent neural networks can also be used for creating word vectors.

In the section after, we’ll look at the very popular LSTM, or long short-term memory unit, and the more modern and efficient GRU, or gated recurrent unit, which has been proven to yield comparable performance. We’ll apply these to some more practical problems, such as learning a language model from Wikipedia data and visualizing the word embeddings we get as a result (a rough sketch of such a language model appears at the end of this course listing).

All of the materials required for this course can be downloaded and installed for FREE. We will do most of our work in Numpy, Matplotlib, and Theano. I am always available to answer your questions and help you along your data science journey.

This course focuses on “how to build and understand“, not just “how to use”. Anyone can learn to use an API in 15 minutes after reading some documentation. It’s not about “remembering facts”, it’s about “seeing for yourself” via experimentation. It will teach you how to visualize what’s happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you. See you in class!
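The first section of the course, as described above, introduces the Simple Recurrent (Elman) unit and reframes parity as a sequence task. Here is a minimal sketch of such a unit's forward pass in Numpy; it is my own illustration, not the course's code, and the hidden size, weight scales, and running-parity targets are arbitrary choices. Training it with backpropagation through time is what the course itself covers.

```python
# Sketch of an Elman unit: h_t = tanh(Wx·x_t + Wh·h_{t-1} + bh), with a sigmoid
# readout. Parity becomes a sequence task: the target at each step is the
# running XOR of the bits seen so far.
import numpy as np

rng = np.random.default_rng(0)

def init_params(input_dim=1, hidden_dim=4):
    """Small random weights for an Elman unit with a scalar sigmoid output."""
    return {
        "Wx": rng.normal(scale=0.5, size=(hidden_dim, input_dim)),
        "Wh": rng.normal(scale=0.5, size=(hidden_dim, hidden_dim)),
        "bh": np.zeros(hidden_dim),
        "Wo": rng.normal(scale=0.5, size=hidden_dim),
        "bo": 0.0,
    }

def forward(params, bits):
    """Run the recurrence over a sequence of bits; return the per-step outputs."""
    h = np.zeros_like(params["bh"])    # initial hidden state
    outputs = []
    for x in bits:
        h = np.tanh(params["Wx"] @ np.array([x]) + params["Wh"] @ h + params["bh"])
        y = 1.0 / (1.0 + np.exp(-(params["Wo"] @ h + params["bo"])))
        outputs.append(y)
    return np.array(outputs)

# Parity as a sequence problem: the label after each bit is the XOR so far.
bits = [1, 0, 1, 1, 0, 1]
targets = np.cumsum(bits) % 2          # running parity: 1 1 0 1 1 0
print(forward(init_params(), bits))    # untrained outputs; training (BPTT) would fit them to `targets`
```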
NOTES: All the code for this course can be downloaded from my github: /lazyprogrammer/machine_learning_examples, in the directory: rnn_class. Make sure you always “git pull” so you have the latest version!

HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:
calculus
linear algebra
probability (conditional and joint distributions)
Python coding: if/else, loops, lists, dicts, sets
Numpy coding: matrix and vector operations, loading a CSV file
Deep learning: backpropagation, XOR problem
Can write a neural network in Theano and Tensorflow

TIPS (for getting through the course):
Watch it at 2x.
Take handwritten notes. This will drastically increase your ability to retain the information.
Write down the equations. If you don’t, I guarantee it will just look like gibberish.
Ask lots of questions on the discussion board. The more the better!
Realize that most exercises will take you days or weeks to complete.
Write code yourself, don’t just sit there and look at my code.

USEFUL COURSE ORDERING:
(The Numpy Stack in Python)
Linear Regression in Python
Logistic Regression in Python
(Supervised Machine Learning in Python)
(Bayesian Machine Learning in Python: A/B Testing)
Deep Learning in Python
Practical Deep Learning in Theano and TensorFlow
(Supervised Machine Learning in Python 2: Ensemble Methods)
Convolutional Neural Networks in Python
(Easy NLP)
(Cluster Analysis and Unsupervised Machine Learning)
Unsupervised Deep Learning
(Hidden Markov Models)
Recurrent Neural Networks in Python
Artificial Intelligence: Reinforcement Learning in Python
Natural Language Processing with Deep Learning in Python

Who is the target audience?
If you want to level up with deep learning, take this course.
If you are a student or professional who wants to apply deep learning to time series or sequence data, take this course.
If you want to learn about word embeddings and language modeling, take this course.
If you want to improve the performance you got with Hidden Markov Models, take this course.
If you’re interested in the techniques that led to new developments in machine translation, take this course.
If you have no idea about deep learning, don’t take this course, take the prerequisites.
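For the later sections on the LSTM/GRU and language modeling described above, here is a rough sketch of a next-word language model with learned word embeddings. The course implements the recurrent units by hand in Theano; this uses tf.keras for brevity, and the vocabulary size, embedding dimension, and hidden width are hypothetical choices.

```python
# Rough sketch of an LSTM next-word language model with learned word embeddings.
# Illustrative only; sizes are arbitrary and the training data here is random.
import numpy as np
import tensorflow as tf

VOCAB_SIZE = 2000   # hypothetical vocabulary size
EMBED_DIM = 64      # word-embedding dimension (these rows are the "word vectors")
HIDDEN = 128

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),          # learned word embeddings
    tf.keras.layers.LSTM(HIDDEN, return_sequences=True),       # a GRU layer could be swapped in here
    tf.keras.layers.Dense(VOCAB_SIZE, activation="softmax"),   # predict the next word at every step
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Toy usage: inputs are word-id sequences, targets are the same sequences shifted by one.
sequences = np.random.randint(0, VOCAB_SIZE, size=(32, 20))
inputs, targets = sequences[:, :-1], sequences[:, 1:]
model.fit(inputs, targets, epochs=1, verbose=0)

# The rows of the Embedding layer's weight matrix are the word vectors one can
# visualize, as described in the course summary above.
word_vectors = model.layers[0].get_weights()[0]   # shape: (VOCAB_SIZE, EMBED_DIM)
```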