Summary
Let's take a quick look at what it means to learn in deep learning, and at the historical flow in which it has developed.
Index
AI-ML-DL
AI encompasses ML, and ML encompasses DL. Each category can be summarized briefly as follows.
- AI (Artificial Intelligence): technology that imitates human thinking
- ML (Machine Learning): technology that learns from data
- DL (Deep Learning): technology that learns using neural networks
Key Components
The key components that make up deep learning can be grouped as follows; a minimal code sketch wiring them together appears after the list.
- Data
Defined by the problem you are trying to solve.
- Classification
- Semantic Segmentation
- Detection
- Pose Estimation
- Visual QnA
- Model
The model used for learning.
- AlexNet
- GoogLeNet
- ResNet
- DenseNet
- LSTM
- Deep AutoEncoder
- GAN
- Loss
The error with respect to the ground truth, i.e., the cost; the goal is to minimize it.
- MSE (Mean Squared Error): Regression
- CE (Cross Entropy): Classification
- MLE (Maximum Likelihood Estimation): Probabilistic modeling
- Algorithm
The algorithm used to optimize performance, i.e., to minimize the loss.
- SGD
Historical Review
Deep Learning's Most Important Ideas - A Brief Historical Review (Denny Britz, 2020-07-29)
- 2012 - AlexNet
- 2013 - DQN
- 2014 - Encoder/Decoder(NMT), Adam
- 2015 - GAN (the idea famously came out of a bar), ResNet (made DL possible)
- 2017 - Transformer (Attention Is All You Need)
- 2018 - BERT(fine-tuned NLP models)
- 2019 - Big language models (GPT-3, OpenAI)
- 2020 - Self-Supervised Learning (unsupervised learning); SimCLR (A Simple Framework for Contrastive Learning of Visual Representations)