
๋‹ค์ด๋น™ํ•˜๋Š”.. AIํ•˜๋Š”.. ์ž„์”จ

All Posts (40)

Handling NetCDF for Marine Meteorology (2) (2025. 2. 20.)
Please note that this post assumes basic Python knowledge and keeps the code to a minimum, focusing on the features used most often. The concept of dimensional data: what is NetCDF (Network Common Data Form) data? It is widely used in ocean and atmospheric science. It stores multidimensional data, such as atmospheric and oceanic variables over time and space, in a scientific format organized as arrays or matrices. Ocean numerical model output holds the values of a 3-D spatial field over time and usually contains several variables (an h value over x and y, i.e., gridded surface data), along with extra quantities such as wave height, wave direction, and wave period over space and time. Observational data should carry metadata such as the station name, standard time, depth layer, and instrument name; model data should carry the time and spatial reference, value units, fill_value, and so on..
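The series works in Python, so here is a minimal sketch of opening and inspecting a NetCDF file, assuming the netCDF4 package; the file name ocean_model.nc and the variable name hs are hypothetical, not taken from the post.

    # Minimal sketch: inspect a NetCDF file with the netCDF4 package.
    # "ocean_model.nc" and the variable name "hs" are hypothetical.
    from netCDF4 import Dataset

    with Dataset("ocean_model.nc") as nc:   # opens read-only by default
        print(nc.dimensions)                # e.g. time, lat, lon
        print(nc.variables.keys())          # variables stored in the file
        hs = nc.variables["hs"]             # hypothetical wave-height variable
        print(hs.units, hs.shape)           # unit metadata and array shape
        first_step = hs[0, :, :]            # values at the first time step
                                            # (assuming dims time, lat, lon)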
[Math] Understanding Deep-Learning Training Methods (2025. 1. 20.)
This post is a rewrite, partly as a study exercise, of notes I kept in Notion during my boot camp. It is my own interpretation (it may be wrong; corrections welcome). ** No lecture materials are used ** ** Commercial use is prohibited ** Today's keywords: neural network, softmax, activation function, backpropagation, chain rule. Nonlinear model, the neural network: the data X is expressed as the product of x with a weight matrix W that maps it into another space, plus a bias b (the intercept). The output vector's dimension goes from d to p, so connecting x to the output O takes p linear models. The softmax function lets the model output be interpreted as probabilities: for classification problems the prediction is the model output passed through softmax, softmax(o) = softmax(Wx + b). To train..
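To make the softmax(Wx + b) step concrete, here is a small numpy sketch; the dimensions d = 3 and p = 5 and the random data are illustrative assumptions, not values from the post.

    # Numpy sketch of a linear model followed by softmax.
    import numpy as np

    def softmax(o):
        # subtract the row-wise max before exponentiating, for numerical stability
        z = o - o.max(axis=-1, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=-1, keepdims=True)

    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 3))   # 4 samples with input dimension d = 3
    W = rng.normal(size=(3, 5))   # weight matrix mapping d = 3 to p = 5
    b = np.zeros(5)               # bias (intercept) term
    probs = softmax(x @ W + b)    # each row is now a probability distribution
    print(probs.sum(axis=1))      # -> [1. 1. 1. 1.]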
[Math] Gradient Descent (the spicy flavor of calculus) (2025. 1. 8.)
This post is a rewrite, partly as a study exercise, of notes I kept in Notion during my boot camp. It is my own interpretation (it may be wrong; corrections welcome). ** No lecture materials are used ** ** Commercial use is prohibited ** Today's keywords: differentiation, slope, gradient descent, L2 norm, squared L2 norm, SGD. With np.linalg.pinv you can treat the data as a linear model and recover the linear regression equation; the goal there is to minimize the L2 norm. L2 norm: we must find the coefficient B that minimizes ||answer - prediction||², the squared L2 norm of the difference between the two vectors. Gradient descent is the algorithm that finds the B minimizing this objective. The B that minimizes the L2 norm and the B that minimizes the squared L2 norm are the same, so for simpler computation using the squared L2 norm is more..
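Here is a minimal numpy sketch contrasting the np.linalg.pinv closed-form fit with gradient descent on the squared L2 objective; the synthetic data, learning rate, and iteration count are illustrative assumptions.

    # Least squares two ways: pseudoinverse vs. gradient descent on ||y - Xb||^2.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))                  # design matrix
    true_b = np.array([1.5, -2.0, 0.5])
    y = X @ true_b + 0.01 * rng.normal(size=100)   # targets with small noise

    b_pinv = np.linalg.pinv(X) @ y                 # closed-form least-squares fit

    b = np.zeros(3)                                # gradient descent from zero
    lr = 0.01
    for _ in range(1000):
        grad = -2 * X.T @ (y - X @ b) / len(y)     # gradient of the mean squared error
        b -= lr * grad

    print(b_pinv, b)                               # both approach true_b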
[Math] Gradient Descent (the mild flavor of calculus) (2024. 12. 16.)
This post is a rewrite, partly as a study exercise, of notes I kept in Notion during my boot camp. It is my own interpretation (it may be wrong; corrections welcome). ** No lecture materials are used ** ** Commercial use is prohibited ** Today's keywords: differentiation, slope, gradient descent, partial derivative, gradient vector. Differentiation measures how a function's value changes as its variable moves, and it is used heavily in optimization. The derivative is the slope of the tangent line, which requires the function to be smooth (continuous). If you know the tangent slope at a point, you can increase the function value by adding the derivative and decrease it by subtracting it: when the derivative is negative (the curve rises to the left), x + f'(x) < x moves the point left, and when the derivative is positive (the curve rises to the right), x + f'(x) > x moves the point right, so adding the derivative always increases the function value and subtracting it decreases the value. sympy.s..
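Since the summary cuts off at sympy, here is a minimal sketch of the rule above (subtract the derivative to descend), using sympy.diff for the symbolic derivative; the function f and the learning rate are illustrative assumptions.

    # Sympy sketch: descend a convex function by subtracting its derivative.
    import sympy as sym

    x = sym.Symbol("x")
    f = x**2 + 2*x + 3        # smooth convex function, minimum at x = -1
    fprime = sym.diff(f, x)   # symbolic derivative: 2*x + 2

    val, lr = 10.0, 0.1       # start far from the minimum
    for _ in range(100):
        val -= lr * float(fprime.subs(x, val))   # subtract the derivative to descend

    print(val)                # converges toward -1.0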
๋ฐ˜์‘ํ˜•

์ตœ๊ทผ๋Œ“๊ธ€

์ตœ๊ทผ๊ธ€

skin by ยฉ 2024 ttutta