CNN - Case studies (1)

Why look at case studies?

It is the quickest, easiest way to gain intuition: by reading others' code and architectures. We'll look at a few networks such as LeNet-5, AlexNet, VGG-16, ResNet, and the Inception network (GoogLeNet).


LeNet-5 (Classic)

AlexNet

VGG-16


ResNet

ResNets are built from residual blocks.
Residual blocks were introduced to address the vanishing and exploding gradient problems that make very deep networks hard to train.
A skip connection (or shortcut) takes a[l] and feeds it two layers deeper, adding it to z[l+2] before the activation, so the block computes a[l+2] = g(z[l+2] + a[l]). This lets you train much deeper networks.
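A minimal NumPy sketch of a two-layer residual block, assuming fully connected layers with ReLU activations (the function and parameter names here are my own, not from any particular library):

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def residual_block(a_l, W1, b1, W2, b2):
    """Two-layer residual block: a[l] skips over both layers
    and is added to z[l+2] before the final activation."""
    z1 = W1 @ a_l + b1
    a1 = relu(z1)
    z2 = W2 @ a1 + b2
    return relu(z2 + a_l)   # skip connection: a[l+2] = g(z[l+2] + a[l])
```

The only difference from a plain two-layer stack is the `+ a_l` inside the final activation.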

case_studies_4

In plain networks (without residual connections), training error eventually starts going back up as you add more layers; with ResNets, it keeps decreasing even for very deep networks.

case_studies_5

Why do Resnets work well?

It's because it is easy for a residual block to learn the identity function: if the weights and biases of the added layers shrink toward zero, then a[l+2] = g(a[l]) = a[l] (with ReLU activations), so at worst the extra block leaves the error rate unchanged. Without residual connections, a plain deep network struggles even to learn the identity, so adding layers can make things worse.
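This "easy identity" argument can be checked numerically. In the sketch below (my own illustration, reusing the block form a[l+2] = g(z[l+2] + a[l])), the second layer's weights and bias are set to zero, and the block's output collapses exactly to its input:

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

rng = np.random.default_rng(0)
a_l = relu(rng.standard_normal(4))      # ReLU activations are non-negative
W1, b1 = rng.standard_normal((4, 4)), rng.standard_normal(4)
W2, b2 = np.zeros((4, 4)), np.zeros(4)  # e.g. weight decay pushed them to zero

z2 = W2 @ relu(W1 @ a_l + b1) + b2      # z[l+2] collapses to 0
a_out = relu(z2 + a_l)                  # = relu(a_l) = a_l: the identity
assert np.allclose(a_out, a_l)
```

With W2 = 0 and b2 = 0, whatever the first layer computes is discarded and the shortcut carries a[l] through unchanged, which is why stacking residual blocks can't easily hurt training error.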

case_studies_6

These are examples of residual blocks composed together to form a ResNet.

case_studies_7