by Kwansik Yoo
~1 min read

Summary πŸ€™

λ”₯λŸ¬λ‹μ—μ„œ ν•™μŠ΅ν•œλ‹€λŠ” 것은 무엇인지, 또 μ–΄λ–€ 역사적 νλ¦„μ†μ—μ„œ λ°œμ „ν•΄μ™”λŠ”μ§€λ₯Ό κ°€λ³κ²Œ μ•Œμ•„λ³΄μž.


Index πŸ‘€

  • AI-ML-DL
  • Key Components
  • Historical Review

AI-ML-DL


AI encompasses ML, and ML encompasses DL. Each category can be summarized briefly as follows.

  • AI (Artificial Intelligence): technology that imitates human thinking
  • ML (Machine Learning): technology that learns from data
  • DL (Deep Learning): technology that learns using Neural Networks


Key Components


λ”₯λŸ¬λ‹μ„ μ΄λ£¨λŠ” μ£Όμš”ν•œ μ»΄ν¬λ„ŒνŠΈλŠ” λ‹€μŒκ³Ό 같이 λΆ„λ₯˜ν•  수 μžˆλ‹€.

  1. Data
    Defined by the problem you are trying to solve, for example:
    • Classification
    • Semantic Segmentation
    • Detection
    • Pose Estimation
    • Visual QnA
  2. Model
    The architecture that learns from the data.
    • AlexNet
    • GoogLeNet
    • ResNet
    • DenseNet
    • LSTM
    • Deep AutoEncoder
    • GAN
  3. Loss
    The error between the model's prediction and the ground truth, i.e., the cost to be minimized.
    • MSE (Mean Squared Error): Regression
    • CE (Cross Entropy): Classification
    • MLE (Maximum Likelihood Estimation): Probabilistic
  4. Algorithm
    The optimization algorithm that minimizes the loss.
    • SGD
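
To make these four components concrete, here is a minimal sketch that wires them together for a toy regression problem: synthetic data, a one-layer linear model, an MSE loss, and plain SGD updates. Everything here (the synthetic data, the learning rate, the variable names) is made up for illustration; a real deep learning setup would use a much larger model and a framework such as PyTorch.

```python
import numpy as np

# --- 1. Data: a toy regression problem (hypothetical synthetic data) ---
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                # 100 samples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)  # noisy targets

# --- 2. Model: a single linear layer, y_hat = X w + b ---
w = np.zeros(3)
b = 0.0

# --- 3. Loss: mean squared error (MSE), the quantity we want to minimize ---
def mse(y_hat, y):
    return np.mean((y_hat - y) ** 2)

# --- 4. Algorithm: plain mini-batch SGD on the MSE gradient ---
lr = 0.1
for step in range(200):
    idx = rng.choice(len(X), size=16, replace=False)  # sample a mini-batch
    Xb, yb = X[idx], y[idx]
    y_hat = Xb @ w + b
    grad_w = 2 * Xb.T @ (y_hat - yb) / len(yb)        # dL/dw
    grad_b = 2 * np.mean(y_hat - yb)                  # dL/db
    w -= lr * grad_w
    b -= lr * grad_b

print("learned w:", w)                    # should end up close to true_w
print("final MSE:", mse(X @ w + b, y))
```

Replacing the MSE loss with cross entropy and the linear model with one of the architectures above (e.g., ResNet) gives a typical classification setup; the Data / Model / Loss / Algorithm breakdown stays the same.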


Historical Review

Deep Learning’s Most Important Ideas - A Brief Historical Review (Denny Britz, 2020-07-29)

  • 2012 - AlexNet
  • 2013 - DQN
  • 2014 - Encoder/Decoder(NMT), Adam
  • 2015 - GAN (famously conceived in a bar), ResNet (made training very deep networks feasible)
  • 2017 - Transformer (Attention is All you need)
  • 2018 - BERT(fine-tuned NLP models)
  • 2019 - GPT-3(OpenAI)
  • 2020 - Self-Supervised Learning (learning from unlabeled data), e.g., SimCLR (A Simple Framework for Contrastive Learning of Visual Representations)