ArtificialIntelligence/PaperReading(14)
[Paper reading] DenseNet
DenseNet Abstract Recent work has shown that convolutional networks can be substantially deeper, more accurate, and efficient to train if they contain shorter connections between layers close to the input and those close to the output. In this paper, we embrace this observation and introduce the Dense Convolutional Network (DenseNet), which connects each layer to every other layer in a feed-forw..
2023.08.22 -
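The dense connectivity the abstract describes can be sketched in plain Python. This is a minimal sketch under loose assumptions: each "layer" is a toy function (a stand-in for a conv layer) that receives the collected outputs of every preceding layer; `dense_block` and `toy_layer` are hypothetical names, not DenseNet's actual API.

```python
# Toy sketch of DenseNet-style connectivity: layer i receives the
# feature maps of all layers 0..i-1 (here, a list of float lists).
def dense_block(x, layers):
    # features accumulates the input plus every layer's output
    features = [x]
    for layer in layers:
        # each layer sees the concatenation of all earlier feature maps
        out = layer(features)
        features.append(out)
    return features

# hypothetical stand-in for a conv layer: element-wise sum of all inputs
def toy_layer(features):
    length = len(features[0])
    return [sum(f[i] for f in features) for i in range(length)]

feats = dense_block([1.0, 2.0], [toy_layer, toy_layer])
# feats -> [[1.0, 2.0], [1.0, 2.0], [2.0, 4.0]]
```

With L layers, every layer is wired to every later one, giving L(L+1)/2 direct connections in a block, which is the "connects each layer to every other layer" idea in the excerpt.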
[Paper reading] GoogLeNet
Inception Module The fundamental way of solving both issues would be by ultimately moving from fully connected to sparsely connected architectures, even inside the convolutions. Adding an alternative parallel pooling path. As these “Inception modules” are stacked on top of each other, their output correlation statistics are bound to vary; as features of higher abstraction are captured by higher l..
2023.08.18 -
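The Inception module's parallel structure can be sketched with toy branches. A minimal sketch, assuming each branch is a plain function on a feature list; `conv_like` and `pool_like` are hypothetical stand-ins for the module's conv filters and its alternative pooling path, not the paper's actual layers.

```python
# Toy sketch of an Inception-style module: several branches process the
# same input in parallel, then their outputs are concatenated.
def inception(x, branches):
    out = []
    for branch in branches:
        # each parallel branch sees the same input...
        out.extend(branch(x))
    # ...and the results are concatenated along the channel axis
    return out

conv_like = lambda v: [t + 1 for t in v]  # hypothetical conv branch
pool_like = lambda v: [max(v)]            # hypothetical pooling path
y = inception([1, 2, 3], [conv_like, pool_like])
# y -> [2, 3, 4, 3]
```

Concatenating branches of different receptive-field sizes is what lets the stacked modules capture the varying correlation statistics the excerpt mentions.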
[Paper reading] ResNet
ResNet Intro Shortcut connections add neither extra parameters nor computational complexity. 100 ~ 1000 layers / 3.57% Conclusion Pros: The idea is original. A simple idea made it possible to stack the network very deep. It solves the optimization problem and improves performance. Cons: Why does using these skip connections yield better performance? Doesn't this change the definition and direction of conventional gradient descent? Possible improvements: Try skipping other functions as well. Why does it have to be - x, + x? What difference would there be if it proceeded in the order + x, - x? Where did the concept of the residual first appear (relate..
2023.08.16 -
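The shortcut connection noted above is easy to see in a toy sketch: the block computes y = F(x) + x, where the identity shortcut contributes no parameters and only a single addition. `residual_block` and `double` are hypothetical names for illustration, not ResNet's actual layers.

```python
# Toy sketch of a residual (shortcut) connection: y = F(x) + x.
def residual_block(x, f):
    # the identity shortcut adds the input back element-wise,
    # introducing no extra parameters
    return [fi + xi for fi, xi in zip(f(x), x)]

double = lambda v: [2 * t for t in v]  # hypothetical stand-in for F(x)
y = residual_block([1, 2, 3], double)  # 2x + x = 3x -> [3, 6, 9]
```

Because the shortcut passes x through unchanged, the stacked layers only need to learn the residual F(x) = y - x, which is the reparameterization that makes very deep networks (the 100 ~ 1000 layers above) trainable.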
[Paper reading] VGGNet 2023.08.16