
Mathematics of deep learning: an introduction

By: Berlyand, Leonid
Contributor(s): Jabin, Pierre-Emmanuel
Material type: Text
Series: De Gruyter Graduate
Publication details: Berlin: Walter de Gruyter GmbH, 2023
Description: vi, 126 p.
ISBN:
  • 9783111024318
Subject(s):
DDC classification:
  • 006.31 BER
Summary: The goal of this book is to provide a mathematical perspective on some key elements of the so-called deep neural networks (DNNs). Much of the interest in deep learning has focused on the implementation of DNN-based algorithms. Our hope is that this compact textbook will offer a complementary point of view that emphasizes the underlying mathematical ideas. We believe that a more foundational perspective will help to answer important questions that have so far received only empirical answers.

The material is based on a one-semester course, "Introduction to Mathematics of Deep Learning", for senior undergraduate mathematics majors and first-year graduate students in mathematics. Our goal is to introduce basic concepts from deep learning in a rigorous mathematical fashion, e.g., by giving mathematical definitions of deep neural networks (DNNs), loss functions, the backpropagation algorithm, etc. For each concept we attempt to identify the simplest setting that minimizes technicalities but still contains the key mathematics.

Accessible to students with no prior knowledge of deep learning.
Focuses on the foundational mathematics of deep learning.
Provides quick access to key deep learning techniques.
Includes relevant examples that readers can relate to easily.

About the authors:
Leonid Berlyand joined the Pennsylvania State University in 1991, where he is currently a Professor of Mathematics and a member of the Materials Research Institute. He is a founding co-director of the Penn State Centers for Interdisciplinary Mathematics and for Mathematics of Living and Mimetic Matter. He is known for his work at the interface between mathematics and other disciplines such as physics, materials science, the life sciences, and, most recently, computer science. He has co-authored Getting Acquainted with Homogenization and Multiscale (Birkhäuser, 2018) and Introduction to the Network Approximation Method for Materials Modeling (Cambridge University Press, 2012). His interdisciplinary work has received research awards from leading research agencies in the USA, such as the NSF, the US Department of Energy, and the National Institutes of Health, as well as internationally (the Bi-National Science Foundation and NATO). Most recently, his work was recognized with a 2021 Humboldt Research Award. His teaching excellence was recognized with the C. I. Noll Award for Excellence in Teaching from the Eberly College of Science at Penn State.

Pierre-Emmanuel Jabin has been a Professor of Mathematics at the Pennsylvania State University since August 2020. Previously he was a Professor at the University of Maryland from 2011 to 2020, where he also served as director of the Center for Scientific Computation and Mathematical Modeling from 2016 to 2020. Jabin's work in applied mathematics is internationally recognized; he has made seminal contributions to the theory and applications of many-particle/multi-agent systems, together with advection and transport phenomena. Jabin was an invited speaker at the International Congress of Mathematicians in Rio de Janeiro in 2018.

(https://www.degruyter.com/document/doi/10.1515/9783111025551/html?lang=de&srsltid=AfmBOoonOixucJh25soTfm0dzqj9PZE6_Ewkh5-XnNuqpL7AWA2A-Xzp#overview)
Holdings
Item type: Book
Current library: Indian Institute of Management LRC (General Stacks)
Collection: IT & Decision Sciences
Call number: 006.31 BER
Copy number: 1
Status: Available
Barcode: 006826

Table of contents:
1 About this book (p. 1)
2 Introduction to machine learning: what and why? (p. 2)
3 Classification problem (p. 4)
4 The fundamentals of artificial neural networks (p. 6)
5 Supervised, unsupervised, and semisupervised learning (p. 19)
6 The regression problem (p. 24)
7 Support vector machine (p. 40)
8 Gradient descent method in the training of DNNs (p. 52)
9 Backpropagation (p. 67)
10 Convolutional neural networks (p. 93)
A Review of the chain rule (p. 119)
Bibliography (p. 121)
Index

[https://www.degruyter.com/document/doi/10.1515/9783111025551/html?lang=de&srsltid=AfmBOoonOixucJh25soTfm0dzqj9PZE6_Ewkh5-XnNuqpL7AWA2A-Xzp#contents]


