Multilayered artificial neural networks are becoming a pervasive tool in a host of application domains.
At the heart of this deep learning revolution are familiar concepts from applied and computational mathematics, notably calculus, partial differential equations, linear algebra, and approximation/optimization theory.
This material assumes little mathematical knowledge beyond freshman calculus, and provides links to help you refresh the necessary math where needed.
Note that you do not need to understand this material before you start training and using deep learning models in practice; rather, it is for those who are already familiar with the basics of neural networks and wish to deepen their understanding of the underlying math.
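As a concrete instance of the calculus and linear algebra described above working together, here is a minimal sketch (the data and learning rate are illustrative assumptions, not taken from any of the papers discussed) of training a single linear neuron by gradient descent:

```python
import numpy as np

# Illustrative synthetic data: learn y = 2*x1 - 3*x2 from noisy samples.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = X @ np.array([2.0, -3.0]) + 0.01 * rng.normal(size=100)

w = np.zeros(2)   # weights of the neuron
lr = 0.1          # learning rate

for _ in range(200):
    # Gradient of the mean-squared-error loss, derived with basic
    # calculus and expressed with linear algebra.
    grad = 2 * X.T @ (X @ w - y) / len(y)
    w -= lr * grad

print(w)  # close to [2, -3]
```

The update rule is nothing more than "move the weights a small step against the gradient," which is the optimization-theory core of how deep networks are trained.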
While watching a recent ACM-sponsored webinar, “Break Into AI: A Q&A with Andrew Ng on Building a Career in Machine Learning,” I learned that Dr. Ng routinely carries around a folder of research papers that he can draw from when there is a lull in his active schedule, such as when he’s riding in an Uber.

In one paper, Bangalore-based PES University researchers describe an alternative to backpropagation that does not use gradient descent; instead, they devise a new algorithm to find the error in the weights and biases of an artificial neuron.

Against a background of considerable progress in areas such as speech recognition, image recognition, and game playing, AI contrarian Gary Marcus of New York University presents ten concerns about deep learning, and suggests that it must be supplemented by other techniques if the field’s long-term goals are to be reached.

“The Matrix Calculus You Need For Deep Learning” is a wonderful resource that explains all the matrix calculus you need in order to understand the operation of deep neural networks (and to read most of the other papers on this list).

“Group Normalization” addresses a limitation of Batch Normalization (BN), a milestone technique in the development of deep learning that has enabled a wide variety of networks to train. Normalizing along the batch dimension introduces problems: BN’s error increases rapidly as the batch size shrinks, because the batch statistics are estimated inaccurately.

As an academic researcher in a previous life, I like to maintain ties to the research community while working in the data science field. I feel that a firm understanding of the origins of the technologies I use in my consulting work (AI, machine learning, and deep learning) helps me establish a foundational perspective on how things work behind the scenes.
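The group normalization idea mentioned above can be sketched in a few lines of NumPy. This is a minimal sketch, not the paper's implementation: the tensor shapes, group count, and omission of the learnable scale/shift parameters are simplifying assumptions. The key point is that the statistics are computed per sample, so they do not degrade as the batch gets smaller:

```python
import numpy as np

def group_norm(x, num_groups=2, eps=1e-5):
    """Normalize activations within channel groups of each sample.

    x: array of shape (N, C, H, W). Unlike batch normalization, the
    mean and variance are computed per sample (over one group of
    channels), so they are independent of the batch size N.
    """
    n, c, h, w = x.shape
    assert c % num_groups == 0, "channels must divide evenly into groups"
    g = x.reshape(n, num_groups, c // num_groups, h, w)
    mean = g.mean(axis=(2, 3, 4), keepdims=True)
    var = g.var(axis=(2, 3, 4), keepdims=True)
    g_norm = (g - mean) / np.sqrt(var + eps)
    return g_norm.reshape(n, c, h, w)

x = np.random.randn(4, 6, 8, 8)   # batch of 4 samples, 6 channels
y = group_norm(x, num_groups=3)
# Each (sample, group) slice now has ~zero mean and ~unit variance,
# no matter how many samples are in the batch.
```

Because nothing here aggregates over the batch axis, the normalization behaves identically for a batch of 1 or of 1,000, which is exactly the failure mode of batch normalization that the paper targets.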