Meta Learning (Part 2): Gradient Descent as LSTM
Hung-yi Lee

Recall the gradient descent loop: start from initial parameters, then repeat "compute the gradient on a batch of training data, update the parameters" under a fixed network structure. Part 1 learned the initialization; can we learn more than the initialization parameters? Notice that the learning algorithm itself looks like an RNN: at each step it consumes new data and updates an internal state.

Recurrent Neural Network
An RNN applies the same function f at every time step: (y_1, h_1) = f(h_0, x_1), (y_2, h_2) = f(h_1, x_2), (y_3, h_3) = f(h_2, x_3), and so on. No matter how long the input/output sequence is, we only need one function f; h_t and h_{t-1} are vectors with the same dimension.
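As a minimal sketch of "one function f, any sequence length" (the sizes, weight names, and random data are my own, not from the slides):

```python
import numpy as np

def rnn_step(h, x, W_h, W_x, W_y):
    """One shared RNN step: new state and output from old state and input."""
    h_new = np.tanh(W_h @ h + W_x @ x)   # h_t has the same dimension as h_{t-1}
    y = W_y @ h_new
    return h_new, y

rng = np.random.default_rng(0)
dim_h, dim_x = 4, 3
W_h = rng.normal(size=(dim_h, dim_h))
W_x = rng.normal(size=(dim_h, dim_x))
W_y = rng.normal(size=(2, dim_h))

h = np.zeros(dim_h)                      # h_0
for x in rng.normal(size=(5, dim_x)):    # 5 steps here, but any length works
    h, y = rnn_step(h, x, W_h, W_x, W_y)
```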

LSTM
An LSTM splits the recurrent state in two: the cell state c changes slowly (c_t is c_{t-1} with something added), while the hidden state h changes fast (h_t and h_{t-1} can be very different).

A naive RNN computes h_t = σ(W [x_t ; h_{t-1}]) and y_t = σ(W' h_t). An LSTM instead computes four quantities from x_t and h_{t-1}:

z = tanh(W [x_t ; h_{t-1}])      (candidate update)
z^i = σ(W^i [x_t ; h_{t-1}])     (input gate)
z^f = σ(W^f [x_t ; h_{t-1}])     (forget gate)
z^o = σ(W^o [x_t ; h_{t-1}])     (output gate)
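A minimal numpy sketch of one LSTM step in this notation (shapes are assumed; the cell and hidden updates reviewed on the next slide are included for completeness):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def lstm_step(c_prev, h_prev, x, W, Wi, Wf, Wo):
    """One LSTM step: c changes slowly (additive), h can change a lot."""
    v = np.concatenate([x, h_prev])   # [x_t ; h_{t-1}]
    z  = np.tanh(W  @ v)              # candidate update
    zi = sigmoid(Wi @ v)              # input gate
    zf = sigmoid(Wf @ v)              # forget gate
    zo = sigmoid(Wo @ v)              # output gate
    c = zf * c_prev + zi * z          # c_t = z^f ⊙ c_{t-1} + z^i ⊙ z
    h = zo * np.tanh(c)               # h_t = z^o ⊙ tanh(c_t)
    return c, h
```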

Review: LSTM
The gates combine as:

c_t = z^f ⊙ c_{t-1} + z^i ⊙ z
h_t = z^o ⊙ tanh(c_t)
y_t = σ(W' h_t)

Unrolled over time, the same cell maps (c_{t-1}, h_{t-1}, x_t) to (c_t, h_t, y_t), then (c_t, h_t, x_{t+1}) to (c_{t+1}, h_{t+1}, y_{t+1}), and so on.

Similar to gradient-descent-based algorithms
Compare the cell update with a gradient descent step θ_t = θ_{t-1} - η ∇_θ ℓ. Let c_t play the role of θ_t, fix the forget gate z^f to all "1"s, fix the input gate z^i to the learning rate η, and feed in z = -∇_θ ℓ: the LSTM update becomes exactly gradient descent (see the derivation below). If we instead let these gates be learned, z^i acts as a dynamic learning rate, and a forget gate slightly below 1 shrinks θ_{t-1}, which is something like regularization (weight decay). Other choices are possible as well.
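Spelling out the substitution (a short derivation, not verbatim from the slides):

```latex
\begin{aligned}
c_t &= z^f \odot c_{t-1} + z^i \odot z
  && \text{(LSTM cell update)}\\
\theta_t &= \mathbf{1} \odot \theta_{t-1} + \eta \odot (-\nabla_\theta \ell)
  && (c \to \theta,\ z^f \to \mathbf{1},\ z^i \to \eta,\ z \to -\nabla_\theta \ell)\\
&= \theta_{t-1} - \eta \nabla_\theta \ell
  && \text{(vanilla gradient descent)}
\end{aligned}
```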

LSTM for Gradient Descent
Replace the hand-designed update rule with a learnable "LSTM" and unroll it for, say, 3 training steps: at each step, sample a batch from the training set, compute the gradient at the current parameters, and feed it to the "LSTM", which outputs the next parameters. After the last step, the final parameters are evaluated on testing data, and that test loss is what the "LSTM" is trained to minimize. A typical LSTM is trained on given input/output sequences; this one learns to minimize.

Two caveats. First, in a typical LSTM, the memory c and the inputs x are independent, but here the parameters (playing the role of c) determine the next gradient (playing the role of x); this independence assumption is violated, and in practice the dependence is ignored when backpropagating. Second, the "LSTM" used has only one cell, shared across all parameters: every parameter coordinate is updated by the same small network.
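Below is a toy PyTorch sketch of this training loop, assuming a tiny linear-regression learner; LearnedOptimizer, learner_loss and all sizes are invented for illustration, not the lecture's actual implementation:

```python
import torch
import torch.nn as nn

class LearnedOptimizer(nn.Module):
    """The learnable "LSTM": one tiny cell shared by every parameter coordinate."""
    def __init__(self, hidden=10):
        super().__init__()
        self.cell = nn.LSTMCell(1, hidden)  # input: one gradient coordinate
        self.out = nn.Linear(hidden, 1)     # output: one parameter update

    def step(self, theta, grad, hc):
        g = grad.reshape(-1, 1)                    # one row per coordinate
        h, c = self.cell(g, hc)
        update = self.out(h).reshape(theta.shape)  # learned stand-in for -eta*grad
        return theta + update, (h, c)

def learner_loss(theta, x, y):
    # Hypothetical learner: linear regression with squared error.
    return ((x @ theta - y) ** 2).mean()

opt_net = LearnedOptimizer()
meta_opt = torch.optim.Adam(opt_net.parameters(), lr=1e-3)

for _ in range(100):                                  # meta-training iterations
    w_true = torch.randn(5)                           # a fresh random task
    x_train, x_test = torch.randn(3, 20, 5), torch.randn(20, 5)
    y_train, y_test = x_train @ w_true, x_test @ w_true

    theta = torch.zeros(5, requires_grad=True)        # the learner's parameters
    hc = (torch.zeros(5, 10), torch.zeros(5, 10))     # per-coordinate cell state
    for t in range(3):                                # unroll 3 training steps
        loss = learner_loss(theta, x_train[t], y_train[t])
        # create_graph=False keeps the gradient itself out of the graph,
        # i.e. the "independence assumption" from the slides.
        grad, = torch.autograd.grad(loss, theta)
        theta, hc = opt_net.step(theta, grad, hc)

    meta_loss = learner_loss(theta, x_test, y_test)   # loss on testing data
    meta_opt.zero_grad()
    meta_loss.backward()            # trains the optimizer, not the learner
    meta_opt.step()
```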

Sharing one cell keeps the model size reasonable no matter how large the learner is, and it parallels typical gradient descent, where all the parameters use the same update rule. It also means the training and testing model architectures can be different.

Real Implementation / Experimental Results
(The results link on the slide is truncated to "https:/" in this copy.) One further observation motivates a refinement: in practice the update depends on not only the current gradient, but previous gradients as well (think of momentum or Adam).
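For reference, the standard momentum update (my example, not from the slides) shows how previous gradients enter:

```latex
\begin{aligned}
m_t &= \lambda\, m_{t-1} + \eta\, \nabla_\theta \ell(\theta_{t-1}) \\
\theta_t &= \theta_{t-1} - m_t
\end{aligned}
```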

LSTM for Gradient Descent (v2)
Add a second layer of "LSTM" between the gradients and the parameter-updating cell, again unrolled over the 3 training steps and trained on the testing-data loss. Its memory m can store previous gradients, so the update can depend on the gradient history, not just the current gradient.

Experimental Results: arxiv.org/abs/1606.04474 (Andrychowicz et al., "Learning to learn by gradient descent by gradient descent").

Meta Learning (Part 3)
Hung-yi Lee

Even more crazy idea
Why learn an update rule at all? Train one network that does learning and prediction in a single forward pass.
Input: training data and their labels, plus the testing data.
Output: the predicted label of the testing data.
For example, given labeled training images (cat, cat, dog) and a testing image, the network directly outputs the testing image's label.

Face Verification
Example application: unlock your phone by face (the slide's demo link is truncated in this copy). Registration is the collection of training data, and since each user provides only a few photos, each task is a few-shot learning problem.

We therefore have many training tasks and held-out testing tasks. In each task, training means seeing a registered face, and testing means judging a query face, with label Yes (same person) or No (different person). Meta learning: use the training tasks to learn a network that takes a registered face (train) and a query face (test) and outputs Yes or No for a person it has never seen.

The same approach works for speaker verification.

Siamese Network
Two CNNs with shared weights embed the training (registered) face and the testing (query) face, and a similarity function compares the two embeddings to produce a score. A large score means Yes, the two faces are the same person; a small score means No, they are different.
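A minimal Siamese sketch (the tiny CNN, the 1x32x32 image size, and the dot-product similarity are my assumptions):

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Shared CNN: both faces go through the *same* weights."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 64),
        )

    def forward(self, x):
        return self.net(x)

def siamese_score(enc, face1, face2):
    # One encoder, used twice -> shared weights ("Siamese").
    e1, e2 = enc(face1), enc(face2)
    return (e1 * e2).sum(dim=1)      # large score -> same, small -> different

enc = Encoder()
train_face = torch.randn(4, 1, 32, 32)   # registered faces
test_face = torch.randn(4, 1, 32, 32)    # query faces
score = siamese_score(enc, train_face, test_face)
same = score > 0.0                       # the threshold is a placeholder
```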

Siamese Network - Intuitive Explanation
One view: it is just a binary classification problem, "Are they the same?" The training set consists of face pairs labeled same/different, and at test time the network answers Same or Different for a new pair (Face1, Face2).

Another view: the CNN learns an embedding for faces (e.g. it learns to ignore the background) such that faces of the same person are placed as close as possible in the embedding space, while faces of different people end up far away.

To learn more
What kind of distance between embeddings should we use? Some pointers:
SphereFace: Deep Hypersphere Embedding for Face Recognition
Additive Margin Softmax for Face Verification
ArcFace: Additive Angular Margin Loss for Deep Face Recognition
Triplet loss: Deep Metric Learning using Triplet Network
FaceNet: A Unified Embedding for Face Recognition and Clustering
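As one concrete example of such a distance-based objective, here is the generic triplet loss (margin and shapes assumed; not any one paper's exact variant):

```python
import torch
import torch.nn.functional as F

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Pull same-person embeddings together, push different ones apart."""
    d_pos = F.pairwise_distance(anchor, positive)   # same person: should be small
    d_neg = F.pairwise_distance(anchor, negative)   # different person: should be large
    return F.relu(d_pos - d_neg + margin).mean()

emb = torch.randn(8, 64)   # hypothetical embeddings from the shared CNN
loss = triplet_loss(emb, torch.randn(8, 64), torch.randn(8, 64))
```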

N-way Few/One-shot Learning
Example: 5-way 1-shot. The training data contains five images, each representing one class (on the slide, the five sisters Ichika, Nino, Miku, Yotsuba and Itsuki from The Quintessential Quintuplets); the network performs learning + prediction in one pass, taking those five labeled images plus a testing image and outputting its class (here, Miku).

Prototypical Network (arxiv.org/abs/1703.05175)
A CNN embeds each training image and the testing image; the testing embedding's similarity to each class embedding (the class prototype, i.e. the mean embedding when a class has several shots) gives the predicted class.

Matching Network (arxiv.org/abs/1606.04080)
Similar, but the training examples are processed by a bidirectional LSTM before the similarities are computed, thereby considering the relationship among the training examples.
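A sketch of the prototypical-network classification step (the linear embedding is a stand-in for the CNN, and the 5-way 1-shot shapes are assumed):

```python
import torch

def prototypical_predict(embed, support, support_labels, query, n_way):
    """Classify queries by distance to class prototypes (mean embeddings)."""
    z_support = embed(support)                      # (n_shot * n_way, d)
    z_query = embed(query)                          # (n_query, d)
    prototypes = torch.stack([
        z_support[support_labels == k].mean(dim=0)  # prototype = class mean
        for k in range(n_way)
    ])                                              # (n_way, d)
    dists = torch.cdist(z_query, prototypes)        # Euclidean distances
    return (-dists).softmax(dim=1)                  # nearer prototype -> higher prob

embed = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(32 * 32, 64))
support = torch.randn(5, 1, 32, 32)                 # 5-way 1-shot support set
labels = torch.arange(5)                            # one image per class
query = torch.randn(3, 1, 32, 32)
probs = prototypical_predict(embed, support, labels, query, n_way=5)
```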
