ReLU Layer

Definition

The ReLU Layer thresholds a feature map, or a general input vector, element-wise at zero: negative values are set to zero, and non-negative values pass through unchanged.

Functionality

Because ReLU is often the only non-linear Layer in a network, it prevents the whole network from collapsing into a single matrix operation: a composition of linear layers is itself just another linear layer, so without ReLU the network could only represent linear functions. For the same reason, ReLU is what gives the network its expressive power.
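To make the collapse argument concrete, here is a small NumPy sketch; the matrix shapes and random seed are arbitrary choices for illustration:

    import numpy as np

    rng = np.random.default_rng(0)
    W1 = rng.standard_normal((4, 3))   # first linear layer
    W2 = rng.standard_normal((2, 4))   # second linear layer
    x = rng.standard_normal(3)

    # Without a non-linearity, two linear layers collapse into one matrix:
    collapsed = W2 @ W1
    assert np.allclose(W2 @ (W1 @ x), collapsed @ x)

    # Inserting ReLU between them breaks the collapse:
    relu = lambda v: np.maximum(v, 0.0)
    y = W2 @ relu(W1 @ x)   # in general, y differs from collapsed @ x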

Input: a feature map or a general input vector x

Layer: f(x) = max(0, x), applied element-wise

Output: a tensor of the same shape as the input, with every negative entry replaced by zero
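A minimal sketch of the forward pass in NumPy, with a toy feature map made up for the example:

    import numpy as np

    # A toy 2x3 feature map with mixed signs:
    fmap = np.array([[-1.5, 0.0, 2.0],
                     [ 3.0, -0.5, 1.0]])

    out = np.maximum(fmap, 0.0)   # element-wise max(0, x)
    print(out)
    # [[0. 0. 2.]
    #  [3. 0. 1.]]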

Use Cases

The ReLU Layer is mostly used right after a Convolution Layer, thresholding the convolution's output feature map before it is passed on.
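A minimal sketch of this placement, assuming PyTorch; the channel counts, kernel size, and input shape are arbitrary values chosen for the example:

    import torch
    import torch.nn as nn

    # Convolution followed immediately by ReLU:
    block = nn.Sequential(
        nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1),
        nn.ReLU(),
    )

    x = torch.randn(1, 3, 32, 32)   # a batch of one 3-channel 32x32 image
    y = block(x)
    print(y.shape)                  # torch.Size([1, 16, 32, 32])
    assert (y >= 0).all()           # every activation is non-negative after ReLU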

Caveats

Dead ReLU

If a unit's pre-activation becomes negative for every input, for example after a large gradient update to its bias, the unit outputs zero everywhere and its gradient is zero as well, so it stops learning permanently. Variants such as Leaky ReLU keep a small slope on the negative side to avoid this.
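A small NumPy sketch of why a dead unit stops learning, and how Leaky ReLU keeps a gradient on the negative side (the slope alpha=0.01 is just a common default):

    import numpy as np

    def relu_grad(x):
        # Gradient of ReLU: 1 for positive inputs, 0 otherwise.
        return (x > 0).astype(float)

    def leaky_relu_grad(x, alpha=0.01):
        # Leaky ReLU keeps a small slope alpha for negative inputs.
        return np.where(x > 0, 1.0, alpha)

    pre = np.array([-2.0, -0.3, 0.5, 1.2])
    print(relu_grad(pre))         # [0. 0. 1. 1.]       -> negative units get no gradient
    print(leaky_relu_grad(pre))   # [0.01 0.01 1. 1.]   -> negative units still learn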

