LeNet-5

Structure:

Layers in order (f = filter size, s = stride):

  • conv (f=5x5, s=1)
  • pooling (f=2, s=2)
  • conv (f=5x5, s=1)
  • pooling (f=2, s=2)
  • fully connected layer (120)
  • fully connected layer (84)

5 layers: 2 convolutional layers, 2 fully connected layers, and 1 output layer. In total, about 60K parameters to learn.
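A minimal sketch of this layer stack in PyTorch (PyTorch is an assumption here, not part of the notes; 32x32 grayscale inputs, 10 output classes, and tanh activations follow the original LeNet-5 setup):

```python
import torch
import torch.nn as nn

class LeNet5(nn.Module):
    """Sketch of the LeNet-5 stack: 2 conv + 2 pooling + 2 FC + output."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5, stride=1),   # 32x32x1 -> 28x28x6
            nn.Tanh(),
            nn.AvgPool2d(kernel_size=2, stride=2),       # 28x28x6 -> 14x14x6
            nn.Conv2d(6, 16, kernel_size=5, stride=1),   # 14x14x6 -> 10x10x16
            nn.Tanh(),
            nn.AvgPool2d(kernel_size=2, stride=2),       # 10x10x16 -> 5x5x16
        )
        self.classifier = nn.Sequential(
            nn.Linear(16 * 5 * 5, 120),                  # fully connected layer (120)
            nn.Tanh(),
            nn.Linear(120, 84),                          # fully connected layer (84)
            nn.Tanh(),
            nn.Linear(84, num_classes),                  # output layer
        )

    def forward(self, x):
        x = self.features(x)
        x = torch.flatten(x, 1)
        return self.classifier(x)

model = LeNet5()
print(sum(p.numel() for p in model.parameters()))  # ~61.7K, i.e. roughly 60K parameters
```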

AlexNet

8 layers: 5 convolutional layers, 2 fully connected layers, and 1 output layer. In total, about 60M parameters.
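As a quick sanity check on that parameter count, torchvision (assumed to be installed; it is not mentioned in the notes) ships an AlexNet implementation whose parameters can simply be counted:

```python
import torchvision.models as models

# Instantiate AlexNet with random weights and count the learnable parameters.
alexnet = models.alexnet()
num_params = sum(p.numel() for p in alexnet.parameters())
print(f"AlexNet parameters: {num_params / 1e6:.1f}M")  # ~61M, i.e. on the order of 60M
```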

VGG-16

Convolutions: 3x3, stride 1; max-pooling: 2x2, stride 2. 16 layers: 13 convolutional layers, 2 fully connected layers, and 1 output layer. In total, about 138M parameters.
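The convolutional part repeats one simple pattern: a few 3x3, stride-1 convolutions (with "same" padding) followed by a 2x2, stride-2 max-pool. A sketch of such a block in PyTorch (the helper name and channel sizes below are illustrative, not from the notes):

```python
import torch.nn as nn

def vgg_block(in_channels, out_channels, num_convs):
    """One VGG-style block: num_convs 3x3 convs (stride 1, same padding), then 2x2 max-pool."""
    layers = []
    for _ in range(num_convs):
        layers += [nn.Conv2d(in_channels, out_channels, kernel_size=3, stride=1, padding=1),
                   nn.ReLU(inplace=True)]
        in_channels = out_channels
    layers.append(nn.MaxPool2d(kernel_size=2, stride=2))  # halves the spatial resolution
    return nn.Sequential(*layers)

# VGG-16 stacks five such blocks (64, 128, 256, 512, 512 channels with
# 2, 2, 3, 3, 3 convolutions respectively), giving the 13 conv layers in total.
block1 = vgg_block(3, 64, num_convs=2)
```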

ResNets

Skip connections make it possible to train much deeper networks.
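A minimal sketch of a residual block with an identity skip connection, assuming the input and output have the same shape so no projection is needed (PyTorch again, as an assumed framework):

```python
import torch.nn as nn
import torch.nn.functional as F

class ResidualBlock(nn.Module):
    """Basic residual block: output = ReLU(F(x) + x), where F(x) is two 3x3 convs."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)  # the skip connection: add the input back to the block's output
```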

Inception Networks (GoogLeNet)

Uses 1x1 convolutions as a "bottleneck" to reduce computational cost before larger convolutions.
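A sketch of the bottleneck idea, using an illustrative example (a 28x28x192 input mapped to 32 channels with a 5x5 convolution; these sizes are chosen for the arithmetic, they are not fixed by the notes):

```python
import torch.nn as nn

# Direct path: 5x5 conv from 192 to 32 channels on a 28x28 feature map.
# Multiplications ~ 28*28*32 outputs * (5*5*192) each ≈ 120M.
direct = nn.Conv2d(192, 32, kernel_size=5, padding=2)

# Bottleneck path: 1x1 conv down to 16 channels, then the same 5x5 conv.
# Multiplications ~ 28*28*16*192 + 28*28*32*(5*5*16) ≈ 2.4M + 10M ≈ 12.4M.
bottleneck = nn.Sequential(
    nn.Conv2d(192, 16, kernel_size=1),            # 1x1 "bottleneck" shrinks the channel count
    nn.Conv2d(16, 32, kernel_size=5, padding=2),  # cheaper 5x5 conv on far fewer channels
)
```

Roughly a tenfold reduction in multiplications for the same output shape, which is why Inception modules place 1x1 convolutions in front of their 3x3 and 5x5 branches.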
