Deep Learning Now!
Chapter 1: Introduction to AI
History of AI Timeline
1943 - Shallow neural networks (McCulloch-Pitts neuron)
- Models the neuron as a fixed function of its inputs
- No learning involved
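The 1943 model can be sketched in a few lines of pure Python; the weights and threshold below are illustrative choices (here implementing logical AND), not values from the original paper:

```python
# A McCulloch-Pitts neuron: fixed weights and a fixed threshold, no learning.
def mp_neuron(inputs, weights, threshold):
    # Fire (output 1) iff the weighted sum of inputs reaches the threshold.
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# With weights (1, 1) and threshold 2, the neuron computes logical AND.
print(mp_neuron((1, 1), (1, 1), 2))  # 1
print(mp_neuron((1, 0), (1, 1), 2))  # 0
```

Because the weights are hand-set rather than learned, the neuron can only ever compute the function its designer built in.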
1958 - The Perceptron
- Rosenblatt's perceptron learns its weights from labeled examples
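The key advance over 1943 is the learning rule: weights are adjusted by the prediction error. A minimal sketch in pure Python, learning logical AND (the function name, learning rate, and epoch count are illustrative):

```python
# Perceptron learning rule: w <- w + lr * (target - prediction) * x
def train_perceptron(samples, epochs=10, lr=1.0):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, target in samples:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - pred
            # Only misclassified samples (err != 0) change the weights.
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Logical AND is linearly separable, so the perceptron converges on it.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
preds = [1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0 for x, _ in data]
print(preds)  # [0, 0, 0, 1]
```

The perceptron's limitation (no hidden layers, so only linearly separable functions like AND, not XOR) is part of what triggered the slowdown covered next.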
1969-1982 - Slow development of AI (the first AI winter)
- Backpropagation algorithm developed
- 1974 - Werbos applies backpropagation to neural networks
1982-1995 - Revival of neural networks
- 1982 - Hopfield network
- Convolutional Neural Networks
- Recurrent Neural Networks
- 1989 - Handwriting recognition for postal codes (LeCun)
1997 - Long Short-Term Memory (LSTM)
2006 - Deep multi-layer neural networks become trainable (Hinton's deep belief networks)
2011 - ReLU activation function
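ReLU is simply max(0, x), which is cheap to compute and keeps gradients from shrinking on positive inputs the way sigmoid-style activations do:

```python
# ReLU (rectified linear unit): pass positive values through, zero out the rest.
def relu(x):
    return max(0.0, x)

print([relu(v) for v in (-2.0, -0.5, 0.0, 3.0)])  # [0.0, 0.0, 0.0, 3.0]
```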
2012 - Dropout technique to prevent overfitting
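Dropout randomly zeroes activations during training so the network cannot rely on any single unit. A sketch of the common "inverted dropout" formulation (the function name and default rate are illustrative):

```python
import random

# Inverted dropout: during training, zero each activation with probability p
# and scale survivors by 1/(1-p) so the expected activation is unchanged;
# at inference time, activations pass through untouched.
def dropout(activations, p=0.5, training=True):
    if not training:
        return list(activations)
    return [0.0 if random.random() < p else a / (1 - p)
            for a in activations]
```

Scaling at training time (rather than at inference) means the inference path needs no special handling, which is why this variant is the one frameworks typically implement.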
2014 - GANs (Generative Adversarial Networks) introduced
2015 - TensorFlow open-sourced by Google (version 1.0 followed in 2017)
2019 - OpenAI releases GPT-2
- TensorFlow 2.0 released
Characteristics of Deep Learning
- Data Volume
- Deep models need ever-larger datasets as tasks grow more complex
- Computing Power
- Training deep networks requires substantial computational power (typically GPUs)
- Network Scale
- The number of layers and neurons grows with task complexity
Applications
Computer Vision
- Image classification
- Object detection
- Semantic segmentation
- Image captioning
- Image generation
NLP (Natural Language Processing)
- Machine translation, text classification, question answering
Reinforcement Learning
- Game Playing
- Robotics
- Autonomous Driving
Deep Learning Frameworks
- Theano
- TensorFlow
- Scikit-learn
- No GPU acceleration
- Caffe
- Its successor Caffe2 was merged into PyTorch
- Torch
- Based on Lua
- MXNet
- PyTorch
- Keras
- High-level API; TensorFlow is a supported backend
Keras is a high-level API specification.
TensorFlow ships an implementation of it called tf.keras.
TensorFlow
Don't use TensorFlow 1.x; use TensorFlow 2.x. TensorFlow 1.x code is not compatible with TensorFlow 2.x, and the 1.x API is also more verbose.
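A minimal tf.keras model to show what the TensorFlow 2.x style looks like; this sketch assumes TensorFlow 2.x is installed, and the layer sizes are illustrative:

```python
import tensorflow as tf

# Sequential API: layers are stacked in order. Input is a 4-feature vector,
# output is a softmax over 3 classes (sizes chosen for illustration).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```

Compare this to TensorFlow 1.x, where the same model would require manually building a graph and running it inside a session.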