#########
1.Brain analogies
#############
2.Modeling one neuron
Biological motivation and connections
neuron, synapses, dendrites, axon
activation function: sigmoid function
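A minimal sketch of this coarse neuron model in NumPy: inputs arriving on the dendrites are combined as a weighted sum plus a bias in the cell body, and the sigmoid activation gives the firing rate. The class and variable names (Neuron, w, b) are illustrative, not from any particular library.

```python
import numpy as np

class Neuron:
    """Coarse model of a single neuron: weighted sum of inputs -> sigmoid."""
    def __init__(self, num_inputs):
        self.w = np.random.randn(num_inputs) * 0.01  # synaptic weights
        self.b = 0.0                                  # bias

    def forward(self, inputs):
        cell_body_sum = np.dot(self.w, inputs) + self.b      # dendrites -> cell body
        firing_rate = 1.0 / (1.0 + np.exp(-cell_body_sum))   # sigmoid activation
        return firing_rate

# example: one neuron with 3 inputs
neuron = Neuron(3)
print(neuron.forward(np.array([0.5, -1.2, 3.0])))
```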
#############
3.Coarse model
commonly used activation functions (sketched in code after this list)
1.Sigmoid function
2.Tanh
3.ReLU (Rectified Linear Unit)
4.Leaky ReLU
5.Maxout
6.TLDR
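The listed activations can be written out in a few lines of NumPy. These are illustrative definitions rather than framework APIs, and maxout is shown only in its simplest two-piece form (the elementwise max of two linear pre-activations).

```python
import numpy as np

def sigmoid(x):
    # squashes input to (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # squashes input to (-1, 1), zero-centered
    return np.tanh(x)

def relu(x):
    # thresholds activations at zero
    return np.maximum(0, x)

def leaky_relu(x, alpha=0.01):
    # small non-zero slope for negative inputs
    return np.where(x > 0, x, alpha * x)

def maxout(z1, z2):
    # elementwise max of two pre-activations, e.g. w1.x + b1 and w2.x + b2
    return np.maximum(z1, z2)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(sigmoid(x), tanh(x), relu(x), leaky_relu(x))
```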
Neural Network architectures
Layer-wise organization
1.Neural Networks as neurons in graphs
2.Naming conventions
3.Output layer
4.Sizing neural networks
a neural network with at least one hidden layer can approximate any continuous function (universal approximation).
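A rough sketch of layer-wise organization and sizing in NumPy: a fully-connected net expressed as repeated matrix multiplies with ReLU on the hidden layers, plus a count of the learnable parameters (weights and biases). The layer sizes here (3 inputs, two hidden layers of 4 units, 1 output) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# illustrative sizes: 3 inputs, two hidden layers of 4 units, 1 output unit
sizes = [3, 4, 4, 1]
params = []
for n_in, n_out in zip(sizes[:-1], sizes[1:]):
    W = rng.standard_normal((n_out, n_in)) * 0.01  # weight matrix of one layer
    b = np.zeros(n_out)                            # bias vector of one layer
    params.append((W, b))

def forward(x, params):
    """Each hidden layer computes ReLU(W x + b); the output layer is linear."""
    h = x
    for i, (W, b) in enumerate(params):
        z = W @ h + b
        h = np.maximum(0, z) if i < len(params) - 1 else z
    return h

# sizing: count learnable parameters (weights + biases)
n_params = sum(W.size + b.size for W, b in params)
print(forward(rng.standard_normal(3), params), "params:", n_params)  # (4*3+4) + (4*4+4) + (1*4+1) = 41
```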
How to set number of layers and their sizes
problem: overfitting; remedies: L2 regularization, dropout, input noise
generalization capability
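A hedged sketch of two of the remedies named above: an L2 penalty added to the data loss, and (inverted) dropout applied to a hidden layer during training. The names (l2_penalty, reg, p) and constants are illustrative, not taken from a specific library.

```python
import numpy as np

def l2_penalty(weight_matrices, reg=1e-3):
    # L2 regularization: penalize large weights; this term is added to the data loss
    return 0.5 * reg * sum(np.sum(W * W) for W in weight_matrices)

def dropout(h, p=0.5, train=True):
    # inverted dropout: randomly zero units during training, rescale to keep the expected value
    if not train:
        return h
    mask = (np.random.rand(*h.shape) < p) / p
    return h * mask

h = np.ones(4)
print(dropout(h, p=0.5), l2_penalty([np.ones((2, 2))], reg=1e-3))
```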