KimAnt 🥦

  • SohyeonKim (363)
    • ComputerScience (106)
      • ProcessingInMemory (6)
      • FaultTolerance (2)
      • OperatingSystem (21)
      • FreeBSD (23)
      • DesignPattern (1)
      • ComputerNetwork (12)
      • FullStackProgramming (17)
      • DockerKubernetes (16)
      • Database (5)
    • ArtificialIntelligence (72)
      • ECCV2024 (11)
      • WRTNCampusLeader (4)
      • PaperReading (14)
      • 2023GoogleMLBootcamp (33)
      • DeepLearning (10)
    • Programming (27)
      • Swift (17)
      • JAVA (3)
      • CodingTest (2)
      • Algorithms (5)
    • Experiences (37)
      • KIST Europe Internship (15)
      • Activities (8)
      • Competition (6)
      • International (7)
      • Startup (1)
    • iOS (41)
      • AppProject (10)
      • AppleDeveloperAcademy@POSTE.. (9)
      • CoreMLCreateML (8)
      • MC3Puhaha (4)
      • NC2Textinit (10)
      • MACSpaceOver (0)
    • GitHub (5)
    • UniversityMakeUsChallenge (23)
      • UMCActivities (3)
      • UMCiOS (12)
      • UMCServer (7)
      • BonheurAppProject (1)
    • Science (33)
      • 2022GWNRSummer (13)
      • 2023GWNRWinter (8)
      • 2024GWNRWinter (2)
      • Biology (6)
    • Etc (16)
      • StudyPlanner (13)

Tags

process biohybrid swift app docker OS machine learning kernel Programming deep learning Google ios server gravitational waves 3D PRINTING Container numerical relativity Apple umc AI

Gradient Descent (1)

  • [GoogleML] Optimization Algorithms

    Mini-batch Gradient Descent Understanding Mini-batch Gradient Descent Batch gradient descent also takes a long time. Mini-batch is a hybrid of the two (batch and stochastic), with a mini-batch size that is neither too large nor too small: 1. it keeps vectorization 2. there is no need to wait for a full pass over the whole set. Guidelines: 1. 2,000 examples or fewer -> full batch 2. large datasets -> pick one of 64 / 128 / 512 3. take care to fit GPU / CPU memory. Exponentially Weighted Averages Understanding Exponentially Weighted Averages Bias Correction in Exponentially Weighted Averages as t grows.. (a short sketch of both ideas follows below)

    2023.09.20
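
The preview above touches on two techniques from the post: choosing a mini-batch size (full batch up to roughly 2,000 examples, otherwise 64 / 128 / 512 sized to fit GPU/CPU memory) and exponentially weighted averages with bias correction. The snippet below is a minimal Python/NumPy sketch of both ideas, not code from the post itself; the helper names make_minibatches and ewa_bias_corrected, the column-per-example data layout, and the default values of batch_size and beta are illustrative assumptions.

import numpy as np

def make_minibatches(X, Y, batch_size=64, seed=0):
    """Shuffle (X, Y) and split into mini-batches.

    Assumes X has shape (n_features, m) and Y has shape (1, m),
    i.e. one training example per column.
    """
    rng = np.random.default_rng(seed)
    m = X.shape[1]
    perm = rng.permutation(m)
    X_shuf, Y_shuf = X[:, perm], Y[:, perm]
    # The last mini-batch may be smaller than batch_size; that is fine.
    return [(X_shuf[:, i:i + batch_size], Y_shuf[:, i:i + batch_size])
            for i in range(0, m, batch_size)]

def ewa_bias_corrected(thetas, beta=0.9):
    """Exponentially weighted average v_t = beta * v_{t-1} + (1 - beta) * theta_t,
    returned as the bias-corrected estimate v_t / (1 - beta ** t)."""
    v = 0.0
    out = []
    for t, theta in enumerate(thetas, start=1):
        v = beta * v + (1 - beta) * theta
        out.append(v / (1 - beta ** t))  # correction matters most while t is small
    return out

if __name__ == "__main__":
    X = np.random.randn(3, 2000)                  # 2,000 examples, 3 features
    Y = (np.random.rand(1, 2000) > 0.5).astype(float)
    batches = make_minibatches(X, Y, batch_size=128)
    print(len(batches), batches[0][0].shape)      # 16 mini-batches, first of shape (3, 128)
    print(ewa_bias_corrected([1.0, 2.0, 3.0]))    # corrected estimates start near the data, not at 0

Without the division by (1 - beta ** t), the running average v starts at 0 and underestimates the early values; the correction shrinks to a no-op as t grows, which is presumably what the truncated "as t grows.." in the preview refers to.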