https://arxiv.org/pdf/1312.4400.pdf
> 3.2 Global Average Pooling

> ・・・
> However, the fully connected layers are prone to overfitting, thus hampering the generalization ability
> of the overall network. Dropout is proposed by Hinton et al. [5] as a regularizer that randomly
> sets half of the activations in the fully connected layers to zero during training. It improves the
> generalization ability and largely prevents overfitting [4].
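For concreteness, here is a minimal sketch of that kind of dropout-regularized fully connected head in PyTorch. This is my own illustration, not code from the paper; the feature size (1024), hidden width (512), and class count (10) are all hypothetical. `nn.Dropout(p=0.5)` zeroes each activation with probability 0.5 during training and acts as the identity at inference.

```python
import torch.nn as nn

# Hypothetical fully connected classifier head with dropout (p=0.5),
# in the spirit of Hinton et al. [5]; all sizes are illustrative only.
fc_head = nn.Sequential(
    nn.Flatten(),
    nn.Linear(1024, 512),  # assumed flattened feature size of 1024
    nn.ReLU(),
    nn.Dropout(p=0.5),     # randomly zeroes half the activations during training
    nn.Linear(512, 10),    # assumed 10 output classes
)
```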

> In this paper, we propose another strategy called global average pooling to replace the traditional
> fully connected layers in CNNs.
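As a rough sketch of what that replacement looks like (again my own illustration, assuming PyTorch; the 192 input channels and 10 classes are assumptions, not values from the paper): the last convolution produces one feature map per class, global average pooling reduces each map to a single scalar, and the resulting vector goes straight to softmax with no fully connected layers in between.

```python
import torch
import torch.nn as nn

# Sketch of a global-average-pooling head: one feature map per class,
# each map averaged to a single confidence value, no fully connected layers.
num_classes = 10                                 # assumed class count
gap_head = nn.Sequential(
    nn.Conv2d(192, num_classes, kernel_size=1),  # 192 input channels is an assumption
    nn.AdaptiveAvgPool2d(1),                     # average each feature map down to 1x1
    nn.Flatten(),                                # (N, num_classes) logits, softmax applied in the loss
)

x = torch.randn(8, 192, 8, 8)                    # dummy batch of feature maps
print(gap_head(x).shape)                         # torch.Size([8, 10])
```

Because the pooling layer has no parameters, this head cannot overfit the way a large fully connected layer can, which is the paper's stated motivation for the swap.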