AI Modeler Literacy Handbook (20): Basic Modeling Skills - Dimensionality Reduction with SAE

By Gao Huantang (高煥堂), President of the AI Modelers Club

Contents:
- Why do we need dimensionality reduction?
- Putting SAE (stacked autoencoders) to work
- Hands-on example 1: a single-layer AE + a classifier
- Hands-on example 2: a Stacked AE + a classifier

*** This article is excerpted from the author's serialized column in the Beijing magazine 電子產品世界 (Electronic Engineering & Product World).

(2021-2023) From the same column in Electronic Engineering & Product World, by Gao Huantang (professor at Ming Chuan University and Chang Gung University, Taipei): for example, an installment on recommendation systems covering 1. Understanding collaborative-filtering (CF) recommendation systems and 2. Using traditional CF recommendation algorithms. *** More articles are available.

Why Do We Need Dimensionality Reduction?

Understanding the curse of dimensionality

- Because ML constantly runs into the curse of dimensionality: the series of computational difficulties that arise when working with high-dimensional data.
- From the space-mapping relationship introduced in the previous lesson, we know that the more features a training dataset has, the higher the dimensionality of the computational space it maps to.
- A dataset with a large number of features, typically on the order of hundreds or more, is called high-dimensional data.
- The curse of dimensionality refers to the strange phenomena that tend to occur when we try to analyze data in high-dimensional spaces.
- The term was first coined by Richard E. Bellman while studying optimization problems, to describe the various difficulties encountered as the dimensionality of the computational space grows, for example when analyzing spaces of thousands of dimensions.
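These "strange phenomena" can be seen directly in code. The sketch below (a minimal NumPy demonstration; the sample count and dimensions are arbitrary, not from the article) shows distance concentration: as the dimensionality grows, the nearest and the farthest neighbor of a point become almost equally far away, which undermines distance-based methods such as k-nearest neighbors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Distance concentration: the relative gap between the nearest and the
# farthest neighbour shrinks as the dimensionality d grows.
for d in (2, 100, 10_000):
    points = rng.random((500, d))                    # 500 points in [0, 1]^d
    dists = np.linalg.norm(points[1:] - points[0], axis=1)
    spread = (dists.max() - dists.min()) / dists.mean()
    print(f"d={d:>6}: relative spread = {spread:.3f}")
```

A typical run prints a relative spread around 1 or larger at d=2, but only a few percent at d=10,000: in very high dimensions, "nearest" and "farthest" barely differ.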

Academician Weinan E: machine learning untangles the curse of dimensionality
(163.com/dy/article/GEDN0IP30514R9P4.html)

- Professor Weinan E says: "What does the curse of dimensionality mean? It means that as the number of variables, that is, the dimensionality, increases, the computational complexity grows exponentially. Mathematically there is also a fundamental difficulty: polynomials are not an effective tool in high dimensions."
- Professor Weinan E says: "What may break this deadlock is deep learning."

Taking GWAS (genome-wide association studies) as an example

- In genomic medicine, for example, gene-sequencing data has a distinctive property: the number of features is extremely large. In AI, every feature maps to one dimension of the computational space, so the computational space becomes extremely high-dimensional.
- Today, the dimensionality-reduction function of the ML autoencoder model yields latent-space vectors that still retain the important features of the original data space.
- High-speed computation can then be carried out in the low-dimensional latent space.

Using SAE for dimensionality reduction

- For example, the SAE (Stacked Autoencoder) is the most commonly used ML model for this purpose. It is characterized by deep hidden layers and their neurons.

The challenge: high-dimensional computation

- "Although these types of machine learning algorithm (i.e., RF) are competent in handling complex correlations and interactions among a small number of features, they do not scale to larger numbers of SNPs, which is the case when using GWAS data (genotypes of almost one million SNPs and thousands of samples)."
- In other words, although methods such as random forests can handle complex relationships among a small number of features (a low-dimensional space), they do not scale to the huge feature counts of GWAS analysis, which typically involves thousands of samples and up to a million SNP features.
- "The high dimensionality present in genetic data makes it computationally difficult to exhaustively evaluate all SNP combinations."
- The high dimensionality of genetic data makes it computationally difficult to evaluate every possible combination of SNPs.

A new approach: unsupervised deep learning

- "Using unsupervised deep learning (DL) algorithms seems appealing since it exhibits the potential to deal with big data and the detection of complex features and associated interactions."
- Unsupervised deep-learning algorithms are appealing because of their great potential for handling massive, high-dimensional data and for detecting complex features and their combined interactions.

What deep learning offers

- "With regards to the discovery of SNP-to-SNP interactions, deep learning (DL) has shown promise."

GWAS and machine learning: taking the AE (Autoencoder) as an example

- "An autoencoder (AE) is an unsupervised learning method that can be used to train output values Y to be similar to input values X using backpropagation." In other words, an AE is trained with backpropagation so that its output Y comes very close to its input X.
- Its architecture runs: Encoder -> latent layer -> Decoder -> Y.

The AE's first function: dimensionality reduction

- An AE (autoencoder) is a neural network that implements both encoding and decoding: the Encoder compresses (reduces the dimensionality of) the original data, and the Decoder reconstructs the original data from the compressed form.
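The AE just described can be sketched in a few lines of NumPy: a linear Encoder, a linear Decoder, target T = X, and plain gradient descent standing in for Keras-style backpropagation. The data, layer sizes, learning rate, and step count below are all illustrative, not from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 samples with 8 features each, already in the 0..1 range.
X = rng.random((200, 8))

# Linear Encoder (8 -> 3) and Decoder (3 -> 8), initialized small.
W_enc = rng.normal(0.0, 0.1, (8, 3))
W_dec = rng.normal(0.0, 0.1, (3, 8))

lr = 0.05
for _ in range(2000):
    H = X @ W_enc              # encode: project into the 3-D latent space
    err = H @ W_dec - X        # reconstruction error against target T = X
    # Gradient-descent updates for the mean squared reconstruction error.
    W_dec -= lr * H.T @ err / len(X)
    W_enc -= lr * X.T @ (err @ W_dec.T) / len(X)

mse = ((X @ W_enc @ W_dec - X) ** 2).mean()
print(f"reconstruction MSE through the 3-D bottleneck: {mse:.4f}")
```

Even though all 8 features are squeezed through a 3-unit latent bottleneck, the reconstruction error ends up far below the error of a trivial all-zeros prediction, which is the whole point of using the latent vector in place of the raw input.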

Putting SAE to Work

SAE architecture: stacking multiple AEs

- Multiple AEs are stacked layer by layer to produce a stacked autoencoder (SAE).
- The Encoder of each AE model carries out one stage of dimensionality reduction.

Step 1: train the first AE

- Start by training a single-layer AE.

Then train the second AE

- Once the first AE is trained, use the first AE's hidden layer to train a second single-layer AE.
- Continuing this loop, a Stacked Autoencoder of arbitrary depth can be created.

Step 2: transfer learning

- Take the Encoder out of each trained AE and stack them together (Stacked Autoencoders: Encoder 1, Encoder 2, and so on).

Step 3: stack them into a dimensionality-reduction model

- The stacked Encoders form a multi-layer, deep dimensionality-reduction model.

Step 4: attach a classifier

- The reduced-dimension output becomes the input of a classifier, that is, an AI classification model.

More combinations

- The pieces can also be combined into a single deep-learning model.
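Steps 1 through 3 can be sketched end to end: train one small AE at a time, feed its latent codes to the next, then stack the trained Encoders into one deep dimensionality-reduction model. The sketch reuses a NumPy-style linear AE; the layer sizes 16 -> 8 -> 4 -> 2 and all data are illustrative, not from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_linear_ae(X, latent_dim, lr=0.05, steps=1500):
    """Train one linear AE on X; return its Encoder weights and codes."""
    n, d = X.shape
    W_enc = rng.normal(0.0, 0.1, (d, latent_dim))
    W_dec = rng.normal(0.0, 0.1, (latent_dim, d))
    for _ in range(steps):
        H = X @ W_enc
        err = H @ W_dec - X            # target T = X
        W_dec -= lr * H.T @ err / n
        W_enc -= lr * X.T @ (err @ W_dec.T) / n
    return W_enc, X @ W_enc

# Greedy layer-wise training: each AE learns to reconstruct the
# previous AE's latent codes.
X = rng.random((300, 16))
encoders, codes = [], X
for latent_dim in (8, 4, 2):
    W, codes = train_linear_ae(codes, latent_dim)
    encoders.append(W)

# "Transfer learning": stacking the trained Encoders yields the SAE's
# deep dimensionality-reduction model (16 -> 8 -> 4 -> 2).
Z = X
for W in encoders:
    Z = Z @ W
print(Z.shape)
```

The list of Encoder weights is exactly what step 4 would hand to a classifier: the classifier trains on the 2-dimensional codes Z instead of the raw 16-dimensional input.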

Transferring the weights

- The transferred weights and biases (W&B) become the starting values of the integrated model.
- Learning mode 1: keep the transferred layers fixed and update only the classifier's weights.
- Learning mode 2: fine-tune all of the weights.
- (Figures: Autoencoders 1-3 stacked in front of a multilayer perceptron, trained in successive phases; images adapted from Google Images.)

Putting SAE to work | More combinations: SAE + ARM

Reference: Casimiro A. Curbelo Montanez, Paul Fergus, Carl Chalmers, Nurul Hashimah Ahamed Hassain Malim, Basma Abdulaimma, Denis Reilly, and Francesco Falciani, "SAERMA: Stacked Autoencoder Rule Mining Algorithm for the Interpretation of Epistatic Interactions in GWAS for Extreme Obesity," IEEE Access, 2020. DOI: 10.1109/ACCESS.2020.3002923. (Liverpool John Moores University; Universiti Sains Malaysia; University of Liverpool.)

- "Our proposed approach extends GWAS by combining deep learning stacked autoencoders (SAEs) and association rule mining (ARM) to identify epistatic interactions between SNPs." That is, the authors propose combining SAE and ARM techniques to extend GWAS.
- "Following traditional GWAS quality control and association analysis, the most significant SNPs are selected and used in the subsequent analysis to investigate epistasis." After conventional GWAS quality control and association analysis, the most significant SNP features are selected for the follow-up modeling that probes epistatic interactions.
- "Therefore, SAEs are combined with association rule mining to describe what SNPs and associated interactions contribute to classification results."
- "Association rules are implemented to reveal biologically relevant associations between SNPs. If SNPs frequently appear together, there is an underlying relationship between them." ARM reveals biologically meaningful associations among SNPs: SNPs that frequently co-occur share an underlying relationship.

Putting SAE to work | More combinations: SAE + MLP

Reference: Basma Abdulaimma, Paul Fergus, and Carl Chalmers (Liverpool John Moores University, Byrom Street, Liverpool, L3 3AF, UK), "Extracting Epistatic Interactions in Type 2 Diabetes Genome-Wide Data Using Stacked Autoencoder."

- "In this paper, we consider the application of deep learning stacked autoencoders to model epistatic interactions between SNPs and fine-tune a fully connected multi-layer perceptron (MLP)." The paper uses an SAE to model epistatic interactions between SNPs and then fine-tunes a fully connected MLP.
- "This is the first study of its kind to combine unsupervised learning built on stacked autoencoders with an MLP classifier for the classification of T2D using GWAS data." It is the first study to combine an SAE with an MLP classifier for classifying T2D from GWAS data.

Putting SAE to work | More combinations: SAE + ANN classifier

Reference: Casimiro A. Curbelo Montanez, Paul Fergus, Carl Chalmers, and Jade Hind, "Analysis of Extremely Obese Individuals Using Deep Learning Stacked Autoencoders and Genome-Wide Genetic Data."

- "In this paper a DL stacked autoencoder (SAE) is used to deal with nonlinearity present in SNP-SNP interactions and to initialize a multi-layer feedforward artificial neural network (ANN) classifier." The SAE handles the nonlinearity of SNP-SNP interactions and is also used to initialize a multi-layer feedforward ANN classifier.
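The association-rule idea cited in this section, that SNPs which frequently appear together point to an underlying relationship, can be illustrated with a minimal frequent-pair count. This is a toy sketch of the mechanism only, not the SAERMA algorithm; the sample sets, SNP names, and support threshold are invented.

```python
from itertools import combinations
from collections import Counter

# Toy "transactions": each set lists the SNPs flagged in one sample.
samples = [
    {"rs1", "rs7", "rs9"},
    {"rs1", "rs7"},
    {"rs2", "rs9"},
    {"rs1", "rs7", "rs2"},
    {"rs1", "rs7", "rs9"},
]

# Count how often each SNP pair co-occurs (the pair's "support").
pair_counts = Counter()
for s in samples:
    pair_counts.update(combinations(sorted(s), 2))

min_support = 3  # keep pairs seen in at least 3 of the 5 samples
frequent = {p: c for p, c in pair_counts.items() if c >= min_support}
print(frequent)  # {('rs1', 'rs7'): 4}
```

Real ARM implementations (Apriori, FP-growth) prune the combinatorial search instead of enumerating every pair, but the notion of support shown here is the same.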

Putting SAE to work | More combinations: SAE + ANN classifier (continued)

- "Hence, we performed unsupervised feature extraction in a set of 2465 SNPs stacking four single layer AEs with 2000-1000-500-50 hidden units."
- That is, an SAE stacked from four single-layer AEs, whose hidden layers have 2000, 1000, 500, and 50 neurons respectively, performs feature extraction on the 2465 SNPs.
- (Fig. 2. Proposed SAE. Features are compressed from 2465 to 50 using four single-layer AEs.)
- "Four multi-layer feedforward ANN (softmax) were trained with the compressed hidden units considered in the SAE." Multi-layer feedforward ANNs with a softmax output were trained on the compressed hidden units produced by the SAE.

The effect of the SAE + ANN classifier

- "This demonstrates the potential of using our deep learning methodology to abstract large, complex and unstructured data into latent representations capable of capturing the epistatic effect between SNPs in GWAS."
- "Using deep learning stacked autoencoders to initialize the multi-layer feedforward ANN classifier outperformed the results obtained in our previous study using the same dataset." Initializing the multi-layer ANN classifier from the SAE beat the authors' earlier results on the same dataset.
- "However, compressing the features using SAEs makes [it] very difficult to identify what information from the 2465 SNPs contributed to the compress[ed] hidden units, mainly due to the lack of interpretation of deep learning models which act as a black box." Because a deep-learning model acts as a black box, it is hard to tell which of the 2465 SNPs contribute to the compressed hidden units.
- "This limitation fosters the need to create robust methods for the interpretation of deep learning networks."
- "We combined common genetic tools and techniques for QC and association analysis with deep learning to capture relevant information and the epistatic interactions between SNPs." Hence the authors combine standard genetic QC and association-analysis tools with deep learning to capture the relevant information and the epistatic interactions between SNPs.

Hands-on Example 1 | Starting from a simple AE model

- Start from a simple AE: Encoder -> latent layer -> Decoder.

Hands-on Example 1 | Training this AE model

(Spreadsheet demo: the training data and the AE, built in worksheet cells.)

- From the data-association viewpoint, the training data X is a 10 x 4 table of single-digit values (columns x0 to x3; for example, rows such as [6, 8, 2, 1] and [1, 2, 7, 9]).
- On the Encoder side, the input X is first normalized into the 0..1 range by computing X/10.
- On the Decoder side, the target values T are set equal to X (T = X), which is the defining trick of an autoencoder.
- The AE's latent space H has three units (h0, h1, h2): the Encoder maps the 4-dimensional normalized input onto them, and the Decoder maps them back to the 4-dimensional target.
- Press the [Learn] button and the AE starts training. Afterwards the sheet shows, for every sample, the latent-space vector H and the predicted values Y, which closely approximate the inputs X, together with the learned Encoder and Decoder weights and biases (W&B).
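What the spreadsheet evaluates on every pass is simply the AE's two-layer forward computation. With hypothetical weights of the article's shapes (4 inputs, 3 latent units, 4 outputs; the actual trained values live in the worksheet), one forward pass looks like:

```python
import numpy as np

def sigmoid(s):
    return 1 / (1 + np.exp(-s))

# One training sample from the worksheet, normalized by /10.
x = np.array([6, 8, 2, 1]) / 10

# Hypothetical Encoder / Decoder weights and biases (W&B),
# shaped 4 -> 3 -> 4 as in the article.
rng = np.random.default_rng(0)
W_enc, b_enc = rng.normal(size=(4, 3)), np.zeros(3)
W_dec, b_dec = rng.normal(size=(3, 4)), np.zeros(4)

h = x @ W_enc + b_enc            # latent vector H (linear activation)
y = sigmoid(h @ W_dec + b_dec)   # reconstruction Y (sigmoid activation)
print(h.shape, y.shape)
```

Pressing [Learn] then adjusts W_enc, b_enc, W_dec, and b_dec by backpropagation so that y approaches x.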

范例實(shí)踐-1設(shè)計(jì)一個(gè)分類器OO???設(shè)計(jì)一個(gè)分類器

范例實(shí)踐-1拿訓(xùn)練好的AE的潛藏空間向量來(lái)訓(xùn)練分類器分類器Is131416171819202122231A|BCDE|FGHTJKLMN|0X(輸入值)AEH(潛藏空間)Y(預(yù)測(cè)值)xOxlx2x3hOhlh2yoyiy2y0.10.801-0.02-1.52-0.960.0600,1,0820,1410.691-0.070.842-0.9050.033項(xiàng)。^0.70.9-1.8-0,990.11.2540.07oT\xl0-2.13-1.026pX2JN00.1L0080.0940.830.30.6-0.66-0.8860.040.6EncoderDecoder:1.0330.6111-0.4461.5820.416-1.412-1.090.28790.14441.6761.0080.7461.097-0.57-0.53L6590.742-0.390,024-0,603-0.769-0.67-2.004-1.1390.4909-1,0310.8266-0.3440.4658-0.505-0.283正規(guī)化(X/10)設(shè)定(T=X)學(xué)習(xí)A123456789101112131415161718WhBh19BC成xOr(=AEHxl/-rz0.801-0.0280.921-1.516-0.960.061.0820.1410.6190.791-0.070.8420.9050.0330.876-1.802-0.9990.2761.2540,0770.369-2.134-1.0260.2361.0080.0940.832-0.658-0.8860.049EWoBoI

目標(biāo)值TG101001011CH(潛藏空間)hOhlJK預(yù)測(cè)值Z(A類)(R類)(B類)(A類)(A類)(B類)(A類)(B類)(A類)(B類)匯入AEH學(xué)習(xí)匯入AE的潛藏空間向量

幵始訓(xùn)練分類器ABCDEFGHIJK1X(=AEH)CH(潛藏空間)目標(biāo)值預(yù)測(cè)值2xOxlx2hOhlTZ30.801-0.0280.921-1.526-1.9651(A類)14-1.516-0.960.06L583.1710(B類)051.0820.1410.619-1.44-2.241(B類)160.791-0.070.842-1396-1.8641(A類)170.9050.0330.876-1.575-2.1261(A類)18-1.802-0,9990.2761.5213.4930(B類)091.2540.0770.369-1202-2.2991(A類)1102134-L0260.2361.7964.1220(B類)0111.0080.0940.832-1.624-2.2861(A類)112-0.658-0.8860.0491.004L6460(B類)131415Wh-0,6319-1.7635Wo-1.0033816-0.63-0.2968-2,18254匯人AEH學(xué)習(xí)17-L2241-0.8982Bo-0.1752—18Bh0.090302663完成了!19范例實(shí)踐-1AE和分類器,兩者都訓(xùn)練好了分類器范例實(shí)踐-1設(shè)計(jì)一個(gè)整合模型-然后,將訓(xùn)練好的AE和分類器的W&B遷移到這個(gè)整合模型里。如下圖所示:

(Spreadsheet demo: the integrated model.)

- Transfer the W&B of the trained AE and of the trained classifier into the integrated model: the AE's Encoder W&B initialize the first layer, and the classifier's W&B initialize the remaining layers.
- Prepare the test data and press [Test (Predict)].
- The model first reduces the dimensionality: the 4-feature input X passes through the Encoder and becomes the 3-dimensional latent vector (X = AEH).
- It then classifies: the latent vector passes through the classifier's latent layer CH (h0, h1) to produce the predicted class Z.

A review of the steps so far

- Example 1 | Train the AE.
- Example 1 | Train the classifier.
- Example 1 | Design an integrated model and transfer the trained AE's and classifier's W&B into it.
- Example 1 | The integrated model is ready: dimensionality reduction -> classification.

Next, write a Python program that implements the model described above. The listing below reconstructs the article's vino_A08_00_model.py from the screenshots; entries of the data arrays that are illegible in the source are filled with illustrative values (marked in comments), and the imports are updated to run on tf.keras.

# vino_A08_00_model.py
import numpy as np
from tensorflow import keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Prepare the training data (10 samples x 4 features); rows marked
# "illustrative" are only partly legible in the source screenshots.
X = np.array([[6, 8, 2, 1],
              [1, 1, 6, 8],   # illustrative
              [9, 7, 2, 1],
              [7, 7, 1, 2],   # illustrative
              [8, 8, 2, 2],   # illustrative
              [2, 2, 7, 8],   # illustrative
              [6, 6, 1, 1],   # illustrative
              [1, 1, 8, 8],   # illustrative
              [8, 8, 2, 2],   # illustrative
              [2, 2, 3, 6]],  # illustrative
             dtype=np.float32)

h1 = None      # the AE's latent-space vectors
enWB = None    # the trained Encoder's weights & biases
c_wbh = None   # the classifier's hidden-layer W&B
c_wbo = None   # the classifier's output-layer W&B

def train_ae():
    """Build & train the AE (4 -> 3 -> 4) with target T = X."""
    global h1, enWB
    N, H, O, Epoch = 4, 3, 4, 3000
    dx = X / 10                    # normalize the X values into 0..1
    dt = dx.copy()                 # set the target T = X
    en_dh = Dense(H, activation='linear', name="en_dh", input_dim=N)
    en_d = Dense(O, activation='sigmoid', name="en_d")
    model = Sequential()
    model.add(en_dh)
    model.add(en_d)
    model.compile(loss=keras.losses.MSE,
                  optimizer=keras.optimizers.SGD(learning_rate=0.15),
                  metrics=['accuracy'])
    model.fit(dx, dt, batch_size=2, epochs=Epoch, verbose=0)
    enWB = en_dh.get_weights()
    # Expose the latent layer's output H.
    omodel = keras.Model(model.input, model.get_layer(index=0).output)
    h1 = omodel.predict(dx, verbose=0)

def train_classifier():
    """Train the classifier (3 -> 2 -> 1) on the AE's latent vectors."""
    global c_wbh, c_wbo
    N, H, O, Epoch = 3, 2, 1, 3000
    dx = h1                        # import the latent-space vectors
    dt = np.array([1, 0, 1, 1, 1, 0, 1, 0, 1, 0],
                  dtype=np.float32)  # class labels: 1 = A, 0 = B
    c_dh = Dense(H, activation='linear', name="c_dh", input_dim=N)
    c_d = Dense(O, activation='sigmoid', name="c_d")
    model = Sequential()
    model.add(c_dh)
    model.add(c_d)
    model.compile(loss=keras.losses.MSE,
                  optimizer=keras.optimizers.SGD(learning_rate=0.15),
                  metrics=['accuracy'])
    model.fit(dx, dt, batch_size=2, epochs=Epoch, verbose=0)
    c_wbh = c_dh.get_weights()
    c_wbo = c_d.get_weights()

def build_and_test_aec():
    """Build the integrated model (4 -> 3 -> 2 -> 1) from transferred W&B."""
    N, H1, H2, O = 4, 3, 2, 1
    dh1 = Dense(H1, activation='linear', name="dh1", input_dim=N)
    dh2 = Dense(H2, activation='linear', name="dh2")
    d = Dense(O, activation='sigmoid', name="result")
    model = Sequential()
    model.add(dh1)
    model.add(dh2)
    model.add(d)
    # Import the trained W&B: the Encoder's, then the classifier's.
    dh1.set_weights(enWB)
    dh2.set_weights(c_wbh)
    d.set_weights(c_wbo)
    # Prepare the test data (the second row is illustrative).
    TX = np.array([[6, 8, 2, 1], [1, 2, 6, 8], [9, 7, 2, 1], [1, 2, 7, 9]],
                  dtype=np.float32)
    tx = TX / 10
    print("\n----- Testing data -----\n", tx)
    omodel = keras.Model(model.input, model.get_layer(index=0).output)
    print("\n----- H1 -----\n", omodel.predict(tx, verbose=0))
    omodel = keras.Model(model.input, model.get_layer(index=1).output)
    print("\n----- H2 -----\n", omodel.predict(tx, verbose=0))
    z = model.predict(tx, verbose=0)
    print("\n----- Z -----\n", np.round(z, 1))

train_ae()
train_classifier()
build_and_test_aec()
# End

Hands-on Example 1 | Running the program

- The program prints the normalized testing data, the latent vectors H1 (the dimensionality-reduction step), the classifier's hidden activations H2 (the classification step), and the rounded class predictions Z, with 1 standing for class A and 0 for class B.
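The reason the W&B transfer works is that the integrated model computes exactly the composition classifier(Encoder(x)). A NumPy sketch of this identity, using hypothetical W&B with Example 1's layer shapes (4 -> 3 -> 2 -> 1):

```python
import numpy as np

def sigmoid(s):
    return 1 / (1 + np.exp(-s))

rng = np.random.default_rng(1)
# Hypothetical trained W&B with Example 1's shapes.
W_en, b_en = rng.normal(size=(4, 3)), rng.normal(size=3)  # Encoder 4 -> 3
W_ch, b_ch = rng.normal(size=(3, 2)), rng.normal(size=2)  # hidden 3 -> 2
W_co, b_co = rng.normal(size=(2, 1)), rng.normal(size=1)  # output 2 -> 1

def encoder(x):            # the AE's dimensionality-reduction half
    return x @ W_en + b_en

def classifier(h):         # trained on latent vectors
    return sigmoid((h @ W_ch + b_ch) @ W_co + b_co)

def integrated(x):         # the same W&B, stacked into one model
    h1 = x @ W_en + b_en
    h2 = h1 @ W_ch + b_ch
    return sigmoid(h2 @ W_co + b_co)

x = np.array([[0.6, 0.8, 0.2, 0.1]])
assert np.allclose(integrated(x), classifier(encoder(x)))
print("integrated(x) == classifier(encoder(x))")
```

Because the two computations are identical, copying the W&B over is all the "training" the integrated model needs (learning mode 1); optionally the whole stack can then be fine-tuned (learning mode 2).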

范例實(shí)踐范例實(shí)踐-2范例實(shí)踐范例實(shí)踐-2復(fù)習(xí):StackedAE-多個(gè)AE逐層堆棧以產(chǎn)生堆棧式自動(dòng)編碼器(SAE)。-一旦訓(xùn)練了簡(jiǎn)單AE,使用來(lái)自第一個(gè)AE的隱藏層來(lái)訓(xùn)練第二個(gè)AE。通過(guò)重復(fù)此過(guò)程,可以創(chuàng)建任意深度的SAE。-首先設(shè)計(jì)簡(jiǎn)單AE,如下圖:范例實(shí)踐范例實(shí)踐-2設(shè)計(jì)&訓(xùn)練AE_1設(shè)計(jì)一個(gè)分類器:分類器-SAE訓(xùn)練好了,拿其潛藏空間向量來(lái)訓(xùn)練分類器,如下圖:范例實(shí)踐-2I設(shè)計(jì)一個(gè)整合模型oo>W&B\ooo?W&B\\ooooo范例實(shí)踐-2已經(jīng)訓(xùn)練好了3個(gè)模型-一個(gè)是StackedAE(SAE)模型。-一個(gè)是分類器。-一個(gè)是從SAE和分類器遷移過(guò)來(lái)的整合模型。?其中,我們準(zhǔn)備導(dǎo)出整合模型和分類器,如下圖:SAE整合模型匯出■■■■>*.pb分類器匯出■■■■A*.pb范例實(shí)踐-2匯出給OpenVino來(lái)進(jìn)行推理接下來(lái),接下來(lái),撰寫Python程序來(lái)實(shí)踐上述的模型接下來(lái),接下來(lái),撰寫Python程序來(lái)實(shí)踐上述的模型FileEditFormatRunOptionsWindowHelp#vino_A08_01_modelpyImportnumpyasnpimportkerasfronikerHS<modelsijiiportfronikeras,1ayersimportfremkerHS.layersijiiportfromkeras<optimizersimportSGDkeras,modelsijiiportMode1fromkerasimp□11backendimport16nSorf1owastfSequentialDense,FlattenCanv2DfromasKdefsigniDid(s):return1/(1+np,exp(-s))array([353>_^u,p,^o-Jfli■-rL■-rL■-rL■-rL11oo11IX07-ro

262027837186oo121291-21-589準(zhǔn)備訓(xùn)練資料oR--HInnT-II-□XMN 音報(bào)sFileEditFormatRunOptionsWindowHelp#vino_A08_01_modelpyImportnumpyasnpimportkerasfronikerHS<modelsijiiportfronikeras,1ayersimportfremkerHS.layersijiiportfromkeras<optimizersimportSGDkeras,modelsijiiportMode1fromkerasimp□11backendimport16nSorf1owastfSequentialDense,FlattenCanv2DfromasKdeisigmoid(s):return1/(1+np,exp(-s))8,2,1,們7,2,8",X=np.array([[6,[1,[土口,[10,8.2,[1,2,L[0,3,8, r—■wU2,U%9,2,9,1,2,I,5,切adx=X/10h1- 1hl=Ncmt!wbl=Nonewb2=*'IoneSCjmodel=None# continued將X值轉(zhuǎn)變?yōu)??1 丿建立建立&訓(xùn)練AE1模型建立建立&訓(xùn)練AE1模型 deftrain_ae_l():一1_L 」l=111 1uxi,irimt6836Epoch=5000丿建立&訓(xùn)練AE1模型dt=dxncopy()#hiddenlayer #hiddenlayerdhl=DensetH,activatic>n=1Sigmoid1,najiie='dhl",Input_dijrt=N)dl=Dense]6,activation='si^moid',n&me="dl")model=Sequentia1()modelhadd(dhl)model.&dd(i1)pile(loss=keras,losses,MSE,optImizGr-keras.optijiilzeiS,SGD(11=0.15),metrics=[1accuracy1])modelhfit(dx,dt,3,Epoch,0)wbl=dhl.get_weights()omode1=keras.Mode1(mode1.inpu19jiiodel.^st_layer(index=0).c>utput)hl=omodelHpredict(dx)print(,R\nAE_ltrainedok.\n") continued

# --Continued itftrain_ae_l():dx,hl,wb1N=6S=3H=30=6Epoch=5000dt=dxncopy()it 定義AE_1模型 展幵訓(xùn)練#hiddenlayerdhl=DensetH,activatic>n=1Sigmoid1,najiie='dhl",Input_dijrt=N)dl=Dense]6,activation='si^moid',n&me="dl")model=Sequent定義AE_1模型 展幵訓(xùn)練#hiddenlayermodel.add]d1) modelrcoiaplle(Toss=I:eras,1modelrcoiaplle(Toss=I:eras,1osses.MSE,optImizGr-keras.optijiilzeiS,SGD(11=0.15),metrics=[1accuracy1]) wblmdlil.getweightTC7omode1=keras.Mode1(mode1.inpu19jiiodel.^st_layer(index=hl=omodelHpredict(dx)print(,R\nAE_ltrainedok.\n") continued 建立建立&訓(xùn)練AE1模型建立建立&訓(xùn)練AE1模型deftrain_ae_2():hl,0,optimizei=keras.optimize±2,SGD(1deftrain_ae_2():hl,0,optimizei=keras.optimize±2,SGD(11=0.15),metrics^t'accuracy1])N=3S=8H=20=3Epoch=5000dx=hl dt_=_fix,Copy()dh2=Dense(H,aciivation=*sigmoid',name="dh2"3input32=Dense(如activation='sigmoid'rname="d2")model=Sequentia1()model,add(dh2)model.add(d2)modelxompilet1qs$=keras,losses出$E,model.fit(dxrdt,3,Epoch,0)wb2=dh2.get_wei^htsf)omode1=keras,Mode1(mode1,input,jiiodel.^et_layer(index=0),c<utput)h2=omodelHpredict(dx)print("\nAE_2trainedok.\n") continued --Continueddx,Copy()dlm=N)Epochtrain,global定義2模型Epoch=5000dh2=Dense(H,aciivaiion=*sigmoid'3name="dh2d2=Dense(如activation='sigmoid'rnajne="d2")model=Sequentia1()model,add(dh2)model.add(d2)mou^isconrp11iou5~kc■iosu'cs*optimizei=keras.optimizeismetrics=['accuracy1])model,fit(dxwigiiTsr?omodel=kerasMode1(mode1.input,model.get_layer(indexh2=omodelHpredict(dx)print("\nAE_2trainedok.\n") continued 展開(kāi)訓(xùn)練建立&訓(xùn)練分類器 建立&訓(xùn)練分類器 建立&訓(xùn)練scmoaei 分類器eftrain_classifier():N=2H3=40=1dx鷺—dtt=np,array([lt0v1,1,l,OvQ,Q]tdtype^np.floatS2)# dh3=Dense(HJ,activation='Iinear',name="dh3",iiLpiit_dim=N)d3=Dense(O,activation='sigmoid'fname=,hsaec_resu11M)sc_mode1=Sequential()sc_mode1.add(dh3)sc_model.add(13)sc_pile(loss=keras,losses.MSE,optimizer=kexas.optimizeis.SGD(1r=0.15),metrics=[1accuracy1])sc_modeLfit(dxx,dtt,1,IODO,0)pirint("\nClass1fiertrained口k,\n")# Savedto*,pb 
sess=K,get_session()sess.run(tf.local_variables_initializer())frozengraph=Xf.graph_util.ConvGrt_variab16S_to_ConStants(sess,1fnget_default_5raph().as_graph_def(),[1saec_result/BiasAdd*])tf.io.write_^raphffrozen_^iaph,*C:/pb/"r"saec*pb*Pas_text=False) continued 建立建立&訓(xùn)練分類器建立&訓(xùn)練分類器d臼ftrain_class1fier():globalh2,sc_modelN=2H3=4dx鷺=h2dit=np. li;u7i;i「17H,U7U」,dtype=np7troadh3=Dense(HJ,activation='Iinear',name="dh3",iiLpiit_dim=N)d3=Dense(O,activation='sigmoid'fname=,hsaec_resu11M)sc_mode1=Sequential()sc_mode1.add(dh3)sc_model.add(13)sc_pile(loss=keras,losses.MSE,optimizer=keras.Optimizers.SGD(1r=0.15),metrics=[1iccuracy'])sc_modeLfit(dxx,dtt,1,IODO,0)pirint("\nClass1fiertrained口k,\n")# Saved to*,pb sess=K,get_session()sess.run(thlocal_variables_initializerf))frozengraph=tf.graph_util.ConvGrt_variab16S_to_ConStants(sess,tf.get_dyfault_graph(),as_giaph_def()r[1saec_result/BiasAdd*])tf.ionwrite_^raph(frozen_^raph,

*C:/pb/","saec.pb"Pas_text^False) cont inued- - ContInuedarray(F1mode11000raineiConstant展開(kāi)訓(xùn)練1,0,0.01.dtvDe=nDHfloat32)metrics=[1accuracyactivation^'Iinear',name=prlntContInuedarray(F1mode11000raineiConstant展開(kāi)訓(xùn)練1,0,0.01.dtvDe=nDHfloat32)metrics=[1accuracyactivation^'Iinear',name=prlnt("\nuTass# Saveddefault_graph(),as_giaph_def()[1saec_result/BiasAdT])write_^raphffrozen_^raph,*C:/pb/",hsaec,pb\as_text=Fal£p) continued=Dense(H5=Dense(O,activation='sigmoid',name="saec_resultmode1=Sequential()mode1.add(dh3)model.add(d3)定義分類器BBSS1on()rocal_variables_initializer())=pile(loss=keras,losses.MSE;,optimizer=keras.OptimizerS.SGD(1r=0train_classifier():h2,sc_model建立建立&訓(xùn)練分類器分類器 Continued Continued 將整合模型匯入*pb檔deftrain_classifier():■z1oba1h2,SC_modelN=2H3=40=1dx鷺—dtt=np.array([1,丄1,1,1,0,0,0],dtype=np,float32)# dh3=Dense(HJ,activation='Iinear',name="dh3",iiLpiit_dim=N)d3=Dense(O,activation='sigmoid'fname=,hsaec_resu11M)sc_mode1=Sequential()sc_mode1.add(dh3)sc_model.add(d3)sc_pile(loss=keras,losses.MSE,optimizer=kexas.optimizeis.SGD(1r=0.15),metrics=[1accuracy1])sc_modeLfit(dxx,dtt,1,IODO,0)print(''\nClassifiPrtrainednk,\n")savedioT.pu——^7 sess=K,get_session()sess.run(tf.local_variables_initializer())frozengraph=tf.graph_util.ConvGrt_variab16S_to_ConStants(sess,1fnget_default_5raph().as_graph_def(),[1saec^result/BiasAdd*])tf.io.write_^raphffrozen_^raph,*C:/pb/"t"saec*pb*PaB_text=False) continued — 分類器匯出*.pb建立建立&訓(xùn)練整合模型建立建立&訓(xùn)練整合模型Jr Lrirt+If已— defsae_model():,—ax—niynz■廠wnz廠onzN=6H4=3Txy<

aQ^-1111Go^flo11

?MirLrLrLrLrLTx0862180,6,L2,2,6,71137811建立整合模型,name="dh?,name="dh?",inputname="saeiesult")dim=W)dh4=Dense(H4,activation=d4=Dense(O,activation='smodel=Sequential()model.ad(1(dh4)model,add(d4)continued /隹備

測(cè)試資料 丿 # continued defsae_mo(lel():globaldx,hl,h2,wh2,bh2N=6H4=30=2Txy<0862180,&L2,2,6,711378111111oo111,7],2],8],6],7j,1]],dtype=nprfloat52) ) ,name="dh?■"?,name="dh?■"?input_dim=N)name="saeiesult")d4=Dense(O,activation='smodel=Sequential()model.ad(1(dh4)model.adl(d4) continued # continued defsae_mo(lel():globaldx,hl,h2,wh2,bh2N=6H4=30=2Txy<a08722671137811■■11oo■■0,1,7],8,7,2],6,8,8],2,1,6],3:7],1]].dtyps^np.float32)dh4=Dense(H4,activation=d4=Dense(O,activation='smodel=Sequential()model.ad(1(dh4)model.adl(d4),name=”dh3"name="saeiesult")定義整合模型 丿dim=N)將將AE模型的Encoder的W&B,遷移到整合模型dhd.SEtweiKhtS(wbl)ContInueddhd.SEtweiKhtS(wbl)d4.setweights(wb2)z4=modelpredi


評(píng)論

0/150

提交評(píng)論