Artificial Neural Networks and Applications — Zhihuishu (Zhidao) Final Exam Answers and Chapter Answers, 2024, Chang'an University

The commonly used transfer (activation) functions of a fully connected layer include Sigmoid and Softmax. ()
Answer: True
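
As a quick illustration of the two activation functions named above (a minimal NumPy sketch, not tied to any particular course code), Sigmoid squashes each value into (0, 1) independently, while Softmax normalizes a whole vector into a probability distribution:

    import numpy as np

    def sigmoid(x):
        # Element-wise logistic function: maps each value into (0, 1).
        return 1.0 / (1.0 + np.exp(-x))

    def softmax(x):
        # Subtract the max for numerical stability, then normalize to a distribution.
        e = np.exp(x - np.max(x))
        return e / e.sum()

    z = np.array([2.0, -1.0, 0.5])
    print(sigmoid(z))   # independent per-element activations
    print(softmax(z))   # non-negative values that sum to 1.0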

In 1976, Grossberg, a cognitive neuroscientist at Boston University, proposed a more complete perceptron model, adaptive resonance theory (ART), that is, supervised learning. ()
Answer: True

Through repeated learning, a SOFM neural network can make the spatial distribution density of its connection weights consistent with the probability distribution of the input patterns, so it can classify input patterns through the spatial distribution of the connection weights. ()
Answer: True

A gradient is a vector indicating the direction in which the directional derivative of a function at a given point reaches its maximum value. ()
Answer: True
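
In standard notation (added here for reference, not taken from the exam text), the gradient collects the partial derivatives of a scalar function, and the directional derivative along a unit vector u is largest when u points along the gradient:

    \nabla f(x) = \left(\frac{\partial f}{\partial x_1}, \ldots, \frac{\partial f}{\partial x_n}\right),
    \qquad
    D_u f(x) = \nabla f(x) \cdot u \le \lVert \nabla f(x) \rVert,

with equality when u = \nabla f(x) / \lVert \nabla f(x) \rVert.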

In the early 1990s, Edelman, the Nobel laureate, put forward three forms of the Darwinism model and established a neural network system theory involving group restriction, group selection and group competition. ()
Answer: True

Amari, a Japanese researcher, described credit assignment in biological neural networks in mathematical language. ()
Answer: True

After an Elman neural network is established in MATLAB, the weights and bias values of each layer are initialized according to the Nguyen-Widrow algorithm, and the initialization function is initnw. ()
Answer: True

In 1949, Hebb, a neuropsychologist, proposed the earliest learning rule for neural networks, which was used to adjust the weights between neurons. ()
Answer: True

Rosenblatt is one of the main founders of modern neural networks. His research aroused great interest and attention from many scholars and organizations. ()
Answer: True

Biological studies have shown that neurons are arranged in an orderly manner in the sensory channels of the human brain. Neurons in different regions have different functions and respond to different patterns of input information. ()
Answer: True

The FERNN algorithm can extract M-of-N rules or DNF rules after training the neural network many times. ()
Answer: True

In electrical synaptic transmission, chemical substances are transmitted through gap junctions between the presynaptic and postsynaptic neurons. The gap is usually very small, only 2-3 cm. ()
Answer: False

On June 23, 1960, Rosenblatt, a psychologist at Cornell University in the United States, developed a perceptron-based neural computer, Mark I, which could recognize English letters. ()
Answer: True

An action potential (AP) is a spike-like potential change on the basis of the resting membrane potential after a neuron is stimulated. It is the basic language of information transmission in the nervous system. ()
Answer: True

Overfitting means that the model selected during learning contains too few parameters, that is, the number of samples is very small. ()
Answer: False

Face detection and face recognition are the same concept, which can be used to detect the face in a scene. ()
Answer: False

The cell membrane on the surface of the axon is called the axolemma, and the cytoplasm inside is called the axoplasm. ()
Answer: True

The pooling operation can be divided into two ways: average pooling and maximum pooling. ()
Answer: True
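
A minimal NumPy sketch of the two pooling modes (illustrative only; a 2x2 window with stride 2 is assumed here):

    import numpy as np

    def pool2d(x, size=2, mode="max"):
        # Non-overlapping pooling with a size x size window and stride equal to size.
        h, w = x.shape
        out = np.zeros((h // size, w // size))
        for i in range(0, h - size + 1, size):
            for j in range(0, w - size + 1, size):
                window = x[i:i + size, j:j + size]
                out[i // size, j // size] = window.max() if mode == "max" else window.mean()
        return out

    x = np.arange(16, dtype=float).reshape(4, 4)
    print(pool2d(x, mode="max"))   # maximum pooling
    print(pool2d(x, mode="avg"))   # average pooling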

A Hopfield neural network can be described by a set of nonlinear differential equations. The Lyapunov function was introduced into the network for the first time, and thus the stability of the network was proved. ()
Answer: True
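
For reference (the standard textbook form, not quoted from the exam), the Lyapunov or energy function used to prove the stability of a discrete Hopfield network with symmetric weights w_ij and thresholds \theta_i is:

    E = -\frac{1}{2} \sum_{i} \sum_{j \ne i} w_{ij} s_i s_j + \sum_{i} \theta_i s_i

Under asynchronous updates with w_ij = w_ji and w_ii = 0, E never increases, so the network settles into a stable state.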

A Hopfield neural network is a kind of feedback neural network with the functions of learning and memory. Stability is one of the most important characteristics of this neural network model. ()
Answer: True

A Hopfield neural network is a kind of feedback network with an associative memory function. It can be divided into the discrete type and the continuous type. ()
Answer: True

A fully functional DCNN is usually composed of an input layer, hidden layers, and an output or classification layer. ()
Answer: True

The architecture of an RBF neural network is similar to that of a multilayer feedforward network; it is a three-layer feedforward neural network with a single hidden layer. The input layer is composed of signal source nodes, and the hidden layer is a single layer of neurons. ()
Answer: True

A SOFM neural network forms a self-organizing mapping space by introducing a grid and establishes topological connections among neurons. ()
Answer: True

An RBF neural network is a kind of neural network structure with good local approximation performance. Some scholars have proved that it can approximate any continuous function with arbitrary precision. ()
Answer: True

When the error between the actual output and the ideal output exceeds the expected value, it is necessary to enter the error back-propagation process. ()
Answer: True

When all the elements in a matrix are zero, the matrix is called an identity matrix. ()
Answer: False

The idea of orthogonal least squares (OLS) comes from the linear regression model. ()
Answer: True

The ion pump is one of the membrane transport proteins; it can drive ions across the cell membrane against the concentration gradient. ()
Answer: True

Different from the algorithm derivation of a BP network, a recurrent network generally adopts the ordered chain rule algorithm. ()
Answer: True

A deep convolutional generative adversarial network (DCGAN) adds convolutional neural networks to the generative model and the discriminative model, which greatly improves generation performance. ()
Answer: True

For the learning effect on the same training set, a different step size means a different number of weak-classifier iterations, and a larger step size means more weak-classifier iterations. ()
Answer: False

An ADALINE neural network reflects the linear mapping relationship between the input and output sample vector spaces. It uses the LMS algorithm; the mean square error of the network forms a parabolic (quadratic) surface, and the vertex of the surface is the optimal solution of the neural network. ()
Answer: True

The input layer generally refers to the neural network layer used for the input image. The hidden layers include the convolutional layer, the pooling layer and the fully connected layer. ()
Answer: True

The adaptive linear neuron (ADALINE) was proposed by Stanford professor Widrow and his graduate student Hoff in 1960, at almost the same time as the perceptron. ()
Answer: True

Artificial neural networks are a science and technology that abstract the neurons of the human brain from the perspective of information processing, form different network structures according to different connection modes, and establish behavioral mechanism models to simulate the action of neural networks. ()
Answer: True

The structure of an ANN is composed of a large number of processing units in ().
Answer: parallel

The strength of synaptic transmission is not fixed; its transmission efficiency can change. ()
Answer: True

The error function used in the BP neural network algorithm is ().
Answer: the mean square error function
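
A common form of that error function (the standard half-sum-of-squares over the output layer, written here for reference) is:

    E = \frac{1}{2} \sum_{k} (d_k - y_k)^2

where d_k is the desired output and y_k the actual output of output neuron k; the 1/2 factor is a convention that simplifies the derivative used in back-propagation.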

In recent years, deep learning has shown excellent performance in solving many problems such as visual recognition, speech recognition and natural language processing. ()
Answer: True

The input layer is responsible for receiving information from the outside world and transmitting it to the next layer of neurons. ()
Answer: True

The Hopfield neural network is called ().
Answer: associative memory

In the LMS learning algorithm, the weight change of the ADALINE neural network is inversely proportional to the input and the output error of the network. ()
Answer: False

The learning strategy of the perceptron is to select the model parameters w and b that minimize the loss function. ()
Answer: minimize

The description of the parameter "validator" of the wx.TextCtrl class is: the ().
Answer: validator of the control

When a window event occurs, the main event loop responds and assigns the appropriate event handler to the window event. ()
Answer: True

The description of the parameter "defaultDir" of the class wx.FileDialog is: ().
Answer: the default path

From the user's point of view, a wxPython program is idle for a large part of the time; when the user or an internal system action causes an event, the event drives the wxPython program to produce the corresponding action. ()
Answer: True

In the design of artificial neural network software based on wxPython, creating the GUI means building a framework to which various controls can be added to complete the design of the software's functions. ()
Answer: True

The description of the property "tooltipstring" is: the prompt that appears when the mouse is over the object. ()
Answer: True

It is necessary to determine the structure and parameters of the neural network, including the number of hidden layers, the number of neurons in the hidden layer and the training function. ()
Answer: True

The menu property bar has "label" and "tag". The label is equivalent to the tag value of the menu item, and the tag is the name displayed for the menu. ()
Answer: False

In large-scale system software design, we need to consider the logical structure and the physical structure of the software architecture. ()
Answer: True

The description of the property "string" is: the text displayed on the object. ()
Answer: True

The "netprod" in the network input module can be used for ().
Answer: dot multiplication or dot division

The neural network toolbox contains () module libraries.
Answer: five

The "dotrod" in the weight setting module is a normal dot product weight function. ()
Answer: False

The mathematical model of a single neuron is y = f(wx + b). ()
Answer: True
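
A minimal NumPy sketch of the single-neuron model y = f(wx + b) (illustrative only; a sigmoid is assumed here as the transfer function f):

    import numpy as np

    def neuron(x, w, b, f=lambda s: 1.0 / (1.0 + np.exp(-s))):
        # Weighted sum (the "summer"), followed by the transfer function f.
        s = np.dot(w, x) + b
        return f(s)

    x = np.array([0.5, -1.0, 2.0])   # input vector
    w = np.array([0.2, 0.4, -0.1])   # connection weights
    b = 0.3                          # bias
    print(neuron(x, w, b))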

The neuron model can be divided into three parts: the input module, the transfer function and the output module. ()
Answer: True

SOFM neural networks differ from other artificial neural networks in that they adopt competitive learning rather than the error-correction learning of back-propagation (similar to gradient descent), and in a sense they use neighborhood functions to preserve the topological properties of the input space. ()
Answer: True

The core layer of a SOFM neural network is the ().
Answer: competition layer

For a SOFM neural network, the competitive transfer function (CTF) response is 0 for the winning neuron and 1 for the other neurons. ()
Answer: False

When the input pattern presented to the network does not belong to any pattern in the network's training samples, the SOFM neural network can only classify it into the closest pattern. ()
Answer: True

In order to divide the input patterns into several classes, the distance between input pattern vectors should be measured according to their similarity; the () are usually used.
Answer: the Euclidean distance method and the cosine method
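
A small NumPy sketch of the two similarity measures named in the answer (Euclidean distance and the cosine method), for two pattern vectors:

    import numpy as np

    def euclidean_distance(a, b):
        # Smaller distance means more similar patterns.
        return np.linalg.norm(a - b)

    def cosine_similarity(a, b):
        # Values close to 1 mean the two patterns point in nearly the same direction.
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

    a = np.array([1.0, 2.0, 3.0])
    b = np.array([2.0, 2.0, 2.5])
    print(euclidean_distance(a, b))
    print(cosine_similarity(a, b))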

The most basic property of AdaBoost is that it continuously reduces the training error in the learning process, that is, the classification error rate on the training data set, until the weak classifiers are combined into the final ideal classifier. ()
Answer: True

Boosting is the general name of a class of algorithms. Their common ground is to construct a strong classifier from a group of weak classifiers. A weak classifier mainly refers to a classifier whose prediction accuracy is not high and is far below the ideal classification effect, while a strong classifier mainly refers to a classifier with high prediction accuracy. ()
Answer: True

The main purpose of adding a regularization term into the formula for calculating the strong classifier is to prevent overfitting of the AdaBoost algorithm; this term is usually called the step size in the algorithm. ()
Answer: True

Among the many improved boosting algorithms, the most successful one is the AdaBoost (adaptive boosting) algorithm proposed by Yoav Freund of the University of California San Diego and Robert Schapire of Princeton University in 1996. ()
Answer: True

The loss function of the AdaBoost algorithm is ().
Answer: the exponential function
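
In the usual formulation (standard notation, added for reference), AdaBoost minimizes the exponential loss of the combined classifier f(x):

    L(y, f(x)) = \exp(-y f(x)), \qquad y \in \{-1, +1\}

Minimizing the total exponential loss over the training set is what yields AdaBoost's familiar sample-weight and classifier-weight update formulas.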

The characteristic of the Elman neural network is that the output of the hidden layer is delayed and stored by the feedback layer, and the feedback is connected to the input of the hidden layer, which gives it the function of information storage. ()
Answer: True

The feedback layer is used to memorize the output value of the hidden-layer units at the previous time step and return it to the input. Therefore, the Elman neural network has a dynamic memory function. ()
Answer: True

In an Elman network, the transfer function of the feedback layer is a nonlinear function, and the transfer function of the output layer is a linear function. ()
Answer: True

The neurons in the hidden layer of an Elman network adopt the tangent S-type (tan-sigmoid) transfer function, while the output layer adopts the linear transfer function. If there are enough neurons in the feedback layer, the combination of these transfer functions can make the Elman neural network approximate any function with arbitrary precision in finite time. ()
Answer: True

The Elman neural network is a kind of dynamic recurrent network, which can be divided into full feedback and partial feedback. In the partial recurrent network, the feedforward connection weights can be modified, while the feedback connections are composed of a group of feedback units whose connection weights cannot be modified. ()
Answer: True

In an artificial neural network, the quality of modeling directly affects the performance of the generative model, but only a small amount of prior knowledge is needed for modeling the actual case. ()
Answer: False

A GAN mainly includes a generator G and a discriminator D. ()
Answer: True

The essence of the optimization process of D and G is to find the ().
Answer: minimax
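
The minimax game referred to above is usually written as follows (the standard GAN objective, given here for reference):

    \min_G \max_D V(D, G) =
    \mathbb{E}_{x \sim p_{\mathrm{data}}}[\log D(x)] +
    \mathbb{E}_{z \sim p_z}[\log(1 - D(G(z)))]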

Because the generative adversarial network does not need to distinguish the lower bound and approximate inference, it avoids the partition function calculation problem caused by the traditional repeated application of the Markov chain learning mechanism, and improves network efficiency. ()
Answer: True

From the perspective of artificial intelligence, a GAN uses one neural network to guide another neural network, which is a very novel idea. ()
Answer: True

In AlexNet, there are 650000 neurons with more than 600000 parameters distributed in five convolutional layers, three fully connected layers and a Softmax layer with 1000 categories. ()
Answer: False

Compared with the GPU, the CPU has a higher processing speed and has significant advantages in processing repetitive tasks. ()
Answer: False

VGGNet is composed of two parts, the convolutional layers and the fully connected layers, and can be regarded as a deepened version of AlexNet. ()
Answer: True

At present, DCNN has become one of the core algorithms in the field of image recognition, but it is unstable when there is a small amount of learning data. ()
Answer: False

In the field of target detection and classification, the task of the last layer of the neural network is to classify. ()
Answer: True

At present, researchers have successfully applied the Hopfield neural network to solve the traveling salesman problem (TSP), which is the most representative combinatorial optimization problem. ()
Answer: True

The order in which neurons adjust their states is not unique; a certain order can be specified, or the order can be selected randomly. The process of neuron state adjustment includes three situations: from 0 to 1, from 1 to 0, and unchanged. ()
Answer: True

The Hopfield neural network is a kind of neural network that combines a storage system and a binary system. It not only provides a model to simulate human memory, but also guarantees convergence to ().
Answer: a local minimum

In 1982, the American scientist John Joseph Hopfield put forward a kind of feedback neural network, the "Hopfield neural network", in his paper Neural Networks and Physical Systems with Emergent Collective Computational Abilities. ()
Answer: True

Under the excitation of an input x, a DHNN enters a dynamic evolution process until the state of each neuron no longer changes, at which point it reaches a stable state. This process is equivalent to the network's learning and memory process, and the final output of the network is the value of each neuron in the stable state. ()
Answer: True

In practical applications, the inverse of the correlation matrix and the correlation coefficients are not easy to obtain, so the approximate steepest descent method is needed in the algorithm design. The core idea is that the actual mean square error of the network is replaced by the mean square error at the k-th iteration. ()
Answer: True

The algorithm used in the single-layer ADALINE network is the LMS algorithm, which is similar to the perceptron algorithm and also belongs to supervised learning. ()
Answer: True

In terms of algorithm, the ADALINE neural network adopts the W-H learning rule, also known as the least mean square (LMS) algorithm. It was developed from the perceptron algorithm, and its convergence speed and accuracy are greatly improved. ()
Answer: True
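
A minimal sketch of the W-H (LMS) weight update for a single ADALINE, assuming a linear output and a learning rate lr (illustrative only, not the course's official code):

    import numpy as np

    def lms_step(w, b, x, d, lr=0.01):
        # Linear output, error, and the Widrow-Hoff (LMS) update proportional to error * input.
        y = np.dot(w, x) + b
        e = d - y
        w = w + 2 * lr * e * x
        b = b + 2 * lr * e
        return w, b, e

    w, b = np.zeros(3), 0.0
    x, d = np.array([1.0, 0.5, -0.2]), 0.8
    for _ in range(50):
        w, b, e = lms_step(w, b, x, d)
    print(w, b, e)   # the error shrinks toward zero as training repeats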

When there are multiple ADALINEs in the network, the adaptive linear neural network is also called Madaline, which means "many Adaline" neural networks. ()
Answer: True

ADALINE neural networks have both simple and multi-layer structures. They are flexible in practical applications and widely used in signal processing, system identification, pattern recognition and intelligent control. ()
Answer: True

At present, the RBF neural network has been successfully applied in nonlinear function approximation, time series analysis, data classification, pattern recognition, information processing, image processing, system modeling, control and fault diagnosis. ()
Answer: True

The RBF neural network is a novel and effective feedforward neural network, which has the best local approximation and globally optimal performance. ()
Answer: True

The basic idea of the RBF neural network is to use radial basis functions as the "basis" of the hidden-layer units to form the hidden-layer space; the hidden layer transforms the input vectors, mapping the low-dimensional input data into a high-dimensional space, so that a problem that is not linearly separable in the low-dimensional space can become linearly separable in the high-dimensional space. ()
Answer: True
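
A minimal sketch of the hidden-layer mapping described above, using Gaussian radial basis functions with assumed centers and width (illustrative only):

    import numpy as np

    def rbf_hidden(x, centers, sigma=1.0):
        # Each hidden unit responds to the distance between the input and its center.
        d = np.linalg.norm(centers - x, axis=1)
        return np.exp(-(d ** 2) / (2 * sigma ** 2))

    centers = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])  # assumed hidden-unit centers
    x = np.array([0.8, 0.9])
    phi = rbf_hidden(x, centers)        # higher-dimensional (here 3-D) hidden representation
    w = np.array([0.5, -0.2, 0.1])      # assumed output-layer weights
    print(phi, np.dot(w, phi))          # the network output is a linear combination of phi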

For the learning algorithm of the RBF neural network, the key problem is to determine the center parameters of the output-layer nodes reasonably. ()
Answer: False

Selecting the centers of an RBF neural network by self-organizing learning means selecting the centers by the k-means clustering method, which belongs to supervised learning. ()
Answer: False

In the standard BP neural network algorithm and the momentum BP algorithm, the learning rate is a constant that remains unchanged throughout the training process, and the performance of the learning algorithm is very sensitive to the choice of the learning rate. ()
Answer: True

The BP neural network has become one of the most representative algorithms in the field of artificial intelligence. It has been widely used in signal processing, pattern recognition, machine control (expert systems, data compression) and other fields. ()
Answer: True

The L-M algorithm was mainly proposed for super-large-scale neural networks, and it is very effective in practical applications. ()
Answer: False

In 1974, Paul Werbos of the National Science Foundation of the United States first proposed using the error back-propagation algorithm to train artificial neural networks in his doctoral dissertation at Harvard University, and deeply analyzed the possibility of applying it to neural networks, effectively solving the XOR problem that a single-layer perceptron cannot handle. ()
Answer: True

The output of a BP neural network is () of the neural network.
Answer: the output of the last layer

When the perceptron is learning, each sample is input into the neuron as a stimulus. The input signal is the feature vector of the sample, and the expected output is the category of the sample. When the output differs from the category, the synaptic weights and the bias value are adjusted until the output for each sample matches its category. ()
Answer: True

The perceptron learning algorithm is driven by misclassification, so the stochastic gradient descent method is used to optimize the loss function. ()
Answer: misclassification

The basic idea of the perceptron learning algorithm is to input samples into the network one by one, and adjust the weight matrix of the network according to the difference between the output result and the ideal output, that is, to solve the optimization problem of the loss function L(w, b). ()
Answer: True
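
A minimal sketch of the misclassification-driven perceptron update described above (stochastic gradient descent on L(w, b), with labels in {-1, +1}; the learning rate and the toy data are assumed):

    import numpy as np

    def perceptron_train(X, y, lr=0.1, epochs=20):
        # Adjust w and b only when a sample is misclassified: y_i * (w.x_i + b) <= 0.
        w, b = np.zeros(X.shape[1]), 0.0
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                if yi * (np.dot(w, xi) + b) <= 0:
                    w += lr * yi * xi
                    b += lr * yi
        return w, b

    X = np.array([[3.0, 3.0], [4.0, 3.0], [1.0, 1.0]])
    y = np.array([1, 1, -1])
    w, b = perceptron_train(X, y)
    print(w, b)   # parameters of a separating hyperplane for this toy set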

Answer: True

The perceptron is a single-layer neural network, or a single neuron, which is the smallest unit of a neural network. ()
Answer: True

The logarithmic S-type transfer function, namely the Sigmoid function, is also called the S-shaped growth curve in biology. ()
Answer: True

The function of the transfer function in a neuron is to map the output of the summer to a new output according to the specified functional relationship, and then completes
