[GoogleML] Deep Neural Network

2023. 9. 11. 22:23 · ArtificialIntelligence/2023GoogleMLBootcamp


Deep L-layer Neural Network


๋™๊ทธ๋ผ๋ฏธ๊ฐ€ ์žˆ๋Š”, hidden layer ์ˆ˜๋งŒ ์„ผ๋‹ค


The input layer has as many nodes as the input dimension.

For a single input image, this can be interpreted as the number of features.


Forward Propagation in a Deep Network


Stacking all m training examples horizontally (as columns) and processing them at once -> the vectorized version.


4๊ฐœ์˜ layer์— ๋Œ€ํ•ด ์ž‘์—…์ด ์ด๋ฃจ์–ด์ง„๋‹ค.


Getting your Matrix Dimensions Right


์ฐจ์›์— ๋Œ€ํ•œ ์ด์•ผ๊ธฐ


Why Deep Representations?

Why do deep layers work so well?

Nodes close to the input layer detect edges (low-level features, such as horizontal and vertical strokes).

The middle layers detect parts of a face, such as eyes and a nose.

The last layers recognize whole faces (high-level features).


Audio recognition follows the same pattern: detection proceeds from low-level to high-level features.

The human brain is said to perceive through a similar process.

The advantage: learning can happen hierarchically.


gate ์ˆ˜๊ฐ€ ์ ๋‹ค (๊ณ„์‚ฐ ๋ณต์žก๋„๊ฐ€ log n) 


Shallower networks need very many nodes instead (exponentially many, in the worst case).

So stacking multiple layers is more effective.

Stacking layers very deep is the recent trend.
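The circuit-theory intuition can be illustrated with parity (XOR of n bits): a pairwise reduction tree needs only ceil(log2 n) levels, whereas a single hidden layer would need a number of units exponential in n. A small sketch of the tree reduction:

```python
def parity_tree_depth(bits):
    """XOR all bits by pairwise reduction; count the tree levels used."""
    depth = 0
    while len(bits) > 1:
        # combine neighbors in parallel: one level of XOR gates
        bits = [bits[i] ^ bits[i + 1] for i in range(0, len(bits) - 1, 2)] + \
               (bits[-1:] if len(bits) % 2 else [])
        depth += 1
    return bits[0], depth

result, depth = parity_tree_depth([1, 0, 1, 1, 0, 1, 0, 0])
print(result, depth)  # parity of 8 bits computed in log2(8) = 3 levels
```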


Building Blocks of Deep Neural Networks


Cache Z[l] during the forward pass.

The goal of backpropagation is to go from layer l back to layer l-1 (propagating the derivatives).

We backpropagate in order to obtain the gradients of the parameters W (and b).


Understanding the forward and backward computations from the diagram.


Along with Z, W and b are also stored in the cache.


Forward and Backward Propagation

The forward pass.


Backpropagation, in both component-wise and vectorized form.


The formula for computing da[l].

In practice, the difficulty more often comes from the data than from writing the code.
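One backward step can be sketched as follows — a minimal version assuming a ReLU layer, with the cache holding (A_prev, W, b, Z) from the forward pass as the notes describe:

```python
import numpy as np

def backward_step(dA, cache):
    """One layer of backprop for a ReLU layer.

    cache holds (A_prev, W, b, Z) saved during the forward pass.
    Returns dA_prev along with the parameter gradients dW, db.
    """
    A_prev, W, b, Z = cache
    m = A_prev.shape[1]
    dZ = dA * (Z > 0)                       # ReLU derivative: g'(Z) = 1[Z > 0]
    dW = (dZ @ A_prev.T) / m                # dW[l] = (1/m) dZ[l] A[l-1]^T
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ                      # dA[l-1] = W[l]^T dZ[l]
    return dA_prev, dW, db

# Shapes on an assumed layer with n[l] = 3, n[l-1] = 4, m = 5:
rng = np.random.default_rng(1)
cache = (rng.standard_normal((4, 5)),       # A_prev
         rng.standard_normal((3, 4)),       # W
         np.zeros((3, 1)),                  # b
         rng.standard_normal((3, 5)))       # Z
dA_prev, dW, db = backward_step(rng.standard_normal((3, 5)), cache)
print(dA_prev.shape, dW.shape, db.shape)  # (4, 5) (3, 4) (3, 1)
```

Note how each gradient has the same shape as the quantity it differentiates, which is exactly the dimension check from the earlier section.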


Parameters vs Hyperparameters

The knobs we control, which in turn determine the parameters W and b, are the hyperparameters:

learning rate, number of iterations, number of layers, number of units per layer, choice of activation function.


Make an assumption (idea) -> implement it in code -> revise the hypothesis based on the experimental results (back to the idea).

It is an empirical process; when first searching for hyperparameters, try values over a range.
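"Try values over a range" usually starts with a coarse sweep on a log scale, since learning rates span orders of magnitude; the range and the toy objective below are assumptions for illustration:

```python
import numpy as np

# Toy stand-in for a validation loss, minimized near lr = 0.1.
def validation_loss(lr):
    return (np.log10(lr) + 1.0) ** 2

# Coarse sweep on a log scale, as is typical for learning rates.
candidates = 10.0 ** np.linspace(-4, 0, 9)  # 1e-4 ... 1e0
best_lr = min(candidates, key=validation_loss)
print(best_lr)  # 0.1 wins under this toy objective
```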


Clarification For: What does this have to do with the brain?


What does this have to do with the brain?

human brain and neural network