000 01948nam a22002177a 4500
999 _c3865
_d3865
005 20221122121349.0
008 221122b ||||| |||| 00| 0 eng d
020 _a9781484247211
082 _a006.31
_bMIC
100 _aMichelucci, Umberto
_99086
245 _aApplied deep learning:
_ba case-based approach to understanding deep neural networks
250 _a2nd ed.
260 _aNew York
_bApress
_c2022
300 _axxi, 410 p.
365 _aINR
_b1199.00
520 _aWork with advanced topics in deep learning, such as optimization algorithms, hyper-parameter tuning, dropout, and error analysis, as well as strategies to address typical problems encountered when training deep neural networks. You’ll begin by studying activation functions, mostly in the context of a single neuron (ReLU, sigmoid, and Swish), seeing how to perform linear and logistic regression using TensorFlow, and choosing the right cost function. The next section covers more complex neural network architectures with several layers and neurons and explores the problem of random weight initialization. An entire chapter is dedicated to a complete overview of neural network error analysis, giving examples of solving problems originating from variance, bias, overfitting, and datasets coming from different distributions. Applied Deep Learning also discusses how to implement logistic regression completely from scratch without using any Python library except NumPy, to let you appreciate how libraries such as TensorFlow allow quick and efficient experimentation. Case studies for each method are included to put the theory into practice. You’ll discover tips and tricks for writing optimized Python code (for example, vectorizing loops with NumPy).
650 _aMachine learning
_92343
650 _aNeural networks (Computer science)
_92344
650 _aPython (Computer program language)
_910208
942 _2ddc
_cBK