KimAnt 🥦
  • SohyeonKim (370)
    • ComputerScience (113)
      • ProcessingInMemory (8)
      • FaultTolerance (6)
      • OperatingSystem (22)
      • FreeBSD (23)
      • DesignPattern (1)
      • ComputerNetwork (12)
      • FullStackProgramming (17)
      • DockerKubernetes (16)
      • Database (5)
    • ArtificialIntelligence (72)
      • ECCV2024 (11)
      • WRTNCampusLeader (4)
      • PaperReading (14)
      • 2023GoogleMLBootcamp (33)
      • DeepLearning (10)
    • Programming (27)
      • Swift (17)
      • JAVA (3)
      • CodingTest (2)
      • Algorithms (5)
    • Experiences (37)
      • KIST Europe Internship (15)
      • Activities (8)
      • Competition (6)
      • International (7)
      • Startup (1)
    • iOS (41)
      • AppProject (10)
      • AppleDeveloperAcademy@POSTE.. (9)
      • CoreMLCreateML (8)
      • MC3Puhaha (4)
      • NC2Textinit (10)
      • MACSpaceOver (0)
    • GitHub (5)
    • UniversityMakeUsChallenge (23)
      • UMCActivities (3)
      • UMCiOS (12)
      • UMCServer (7)
      • BonheurAppProject (1)
    • Science (33)
      • 2022GWNRSummer (13)
      • 2023GWNRWinter (8)
      • 2024GWNRWinter (2)
      • Biology (6)
    • Etc (16)
      • StudyPlanner (13)
Cost(1)

  • [GoogleML] Logistic Regression as a Neural Network

    W is the only parameter, an nx-dimensional vector; b is a real number. Loss function: the error on a single training example. Cost function: the cost of your parameters (the average error of parameters W and b over the entire dataset). Gradient Descent: follow the slope of the function. Derivatives: for a straight line (a linear function), the function's increase is 3 times the variable's increase regardless of the value of a, so the derivative is constant at 3. Computation Graph; Derivatives with a Computation Graph; Logistic Regression Gradient Descent; Gradient Descent on m Exam..

    2023.09.05
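The excerpt above summarizes the loss on a single example, the cost averaged over all m examples, and the gradient-descent update. A minimal sketch in plain Python, assuming the standard logistic (cross-entropy) loss from the course; the function names here are illustrative:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost_and_gradients(w, b, X, y):
    """Average cross-entropy cost and its gradients over m examples.
    w: nx-dim weight vector, b: real-number bias,
    X: list of nx-dim feature vectors, y: list of 0/1 labels."""
    m = len(X)
    nx = len(w)
    dw = [0.0] * nx
    db = 0.0
    cost = 0.0
    for x_i, y_i in zip(X, y):
        z = sum(w_j * x_j for w_j, x_j in zip(w, x_i)) + b
        a = sigmoid(z)
        # per-example loss, accumulated into the average cost
        cost += -(y_i * math.log(a) + (1 - y_i) * math.log(1 - a))
        dz = a - y_i  # dL/dz for the logistic loss
        for j in range(nx):
            dw[j] += x_i[j] * dz
        db += dz
    return cost / m, [g / m for g in dw], db / m

def gradient_descent(X, y, lr=0.1, steps=1000):
    """Repeatedly step w and b against the averaged gradients."""
    nx = len(X[0])
    w, b = [0.0] * nx, 0.0
    for _ in range(steps):
        _, dw, db = cost_and_gradients(w, b, X, y)
        w = [w_j - lr * g for w_j, g in zip(w, dw)]
        b -= lr * db
    return w, b
```

For example, fitting a small linearly separable dataset (an AND-style labeling) should drive the cost down and classify all four points correctly after a few thousand steps.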