
Keywords: Load forecast; Artificial Neural Network; backpropagation training; Matlab

1. Introduction

Load forecasting is vitally beneficial to the power system industries in many aspects. As an essential part of the smart grid, high accuracy in load forecasting is required to give exact information about power purchasing and generation in the electricity market, to prevent energy from being wasted and abused, and to keep the electricity price within a reasonable range. Factors such as seasonal differences, climate changes, weekends and holidays, disasters and political events, operation scenarios of the power plants, and faults occurring on the network all lead to changes in the load demand and generation.

Since 1990, the artificial neural network (ANN) has been researched for application to load forecasting. "ANNs are massively parallel networks of simple processing elements designed to emulate the functions and structure of the brain to solve very complex problems". Owing to these transcendent characteristics, the ANN is one of the most competent methods for practical work such as load forecasting. This paper concerns the behavior of artificial neural networks in load forecasting. An analysis of the factors affecting the load demand in Ontario, Canada is made to give an effective way to forecast the load in Ontario.

2. Back Propagation Network

2.1. Background

Because of its outstanding statistical and modeling capabilities, the ANN can deal with non-linear and complex problems in terms of classification or forecasting. As the problem is defined here, the relationship between the input and the target is non-linear and very complicated, so the ANN is an appropriate method to apply to forecasting the load situation. To apply an ANN to load forecasting, a network type needs to be selected, such as Feed-forward Back Propagation, Layer Recurrent, or Feed-forward time-delay. To date, back propagation is widely used in neural networks; it is a feed-forward network with continuously valued functions and supervised learning. It can match the input data and the corresponding output in an appropriate way to approximate a certain function, which is then used to achieve an expected goal on new data presented in the same manner as the input.

2.2. Architecture of the backpropagation algorithm

Figure 1 shows a single neuron model of the backpropagation algorithm.

Generally, the output is a function of the sum of the bias and the weight multiplied by the input. The activation function can be any kind of function; however, the generated output differs accordingly.
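The neuron computation just described can be sketched in a few lines of Python (an illustrative analogue; the paper itself works with Matlab, and the example values here are hypothetical):

```python
import numpy as np

def neuron_output(x, w, b, activation=np.tanh):
    """Single neuron: the activation function applied to the
    weighted sum of the inputs plus the bias."""
    return activation(np.dot(w, x) + b)

# Hypothetical example: two inputs, tan-sigmoid activation
x = np.array([0.5, -0.2])   # inputs
w = np.array([0.8, 0.3])    # weights
b = 0.1                     # bias
y = neuron_output(x, w, b)  # tanh(0.8*0.5 + 0.3*(-0.2) + 0.1) = tanh(0.44)
```

Swapping the `activation` argument changes only the shape of the output, exactly as the text notes.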

In a feed-forward network, in general, at least one hidden layer before the output layer is needed. A three-layer network is selected as the architecture, because this kind of architecture can approximate any function with a few discontinuities. The architecture with three layers is shown in Figure 2 below:

Figure 1. Neuron model of the backpropagation algorithm

Figure 2. Architecture of the three-layer feed-forward network

Basically, three activation functions are applied in the backpropagation algorithm, namely the Log-Sigmoid, the Tan-Sigmoid, and the Linear Transfer Function. The output range of each function is illustrated in Figure 3 below.

Figure 3. Activation functions applied in backpropagation: (a) Log-sigmoid (b) Tan-sigmoid (c) linear function
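These three functions correspond to the Matlab toolbox names logsig, tansig, and purelin; a minimal Python sketch of their definitions and output ranges (for illustration, not the toolbox code):

```python
import numpy as np

def logsig(n):
    """Log-sigmoid transfer function: squashes any input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-n))

def tansig(n):
    """Tan-sigmoid (hyperbolic tangent) transfer function: output in (-1, 1)."""
    return np.tanh(n)

def purelin(n):
    """Linear transfer function: passes the net input through unchanged,
    so the output is unbounded."""
    return n
```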

2.3. Training function selection

The training function algorithms employed are based on the backpropagation approach; these functions are integrated in the Matlab neural network toolbox.

Function name   Algorithm
trainb          Batch training with weight & bias learning rules
trainbfg        BFGS quasi-Newton backpropagation
trainbr         Bayesian regularization
trainc          Cyclical order incremental training w/ learning functions
traincgb        Powell-Beale conjugate gradient backpropagation
traincgf        Fletcher-Powell conjugate gradient backpropagation
traincgp        Polak-Ribiere conjugate gradient backpropagation
traingd         Gradient descent backpropagation
traingdm        Gradient descent with momentum backpropagation
traingda        Gradient descent with adaptive lr backpropagation
traingdx        Gradient descent w/ momentum & adaptive lr backpropagation
trainlm         Levenberg-Marquardt backpropagation
trainoss        One step secant backpropagation
trainr          Random order incremental training w/ learning functions
trainrp         Resilient backpropagation (Rprop)
trains          Sequential order incremental training w/ learning functions
trainscg        Scaled conjugate gradient backpropagation

TABLE 1. TRAINING FUNCTIONS IN MATLAB'S NN TOOLBOX

3. Training Procedures

3.1. Background analysis

The neural network training is based on the load demand and weather conditions in Ontario Province, which is located in the south of Canada. The region of Ontario can be divided into three parts according to the weather conditions: southwest, central and east, and north. The population is gathered around the southeastern part of the province, which includes two of the largest cities of Canada, Toronto and Ottawa.

3.2. Data Acquisition

The required training data can be divided into two parts: input vectors and output targets. For load forecasting, the input vectors for training include all the information on the factors affecting the load demand change, such as weather information, holidays or working days, faults occurring in the network and so on. The output targets are the real-time load scenarios, i.e., the demand presented at the same time as the input vectors change.

Owing to conditional restrictions, this study only considers the weather information and a logical flag for weekdays versus weekends as the factors affecting the load status. The factors affecting the load change used in this paper are listed below:

(1). Temperature (°C)

(2). Dew Point Temperature (°C)

(3). Relative Humidity (%)

(4). Wind speed (km/h)

(5). Wind Direction (10)

(6). Visibility (km)

(7). Atmospheric pressure (kPa)

(8). Logical flag for weekday or weekend

According to the information gathered above, the weather information in Toronto, taken as representative of the whole Ontario province, is chosen for data acquisition. The data was gathered hourly from the historical weather conditions kept by the weather stations. The load demand data also needs to be gathered hourly and correspondingly. In this paper, 2 years of weather data and load data are collected to train and test the created network.

3.3. Data Normalization

To prevent the simulated neurons from being driven too far into saturation, all of the gathered data needs to be normalized after acquisition. As in a per-unit system, each input and target value is divided by the maximum absolute value of the corresponding factor. Each value of the normalized data then lies within the range between -1 and +1 so that the ANN can handle the data easily. Besides, weekdays are represented as 1, and weekends are represented as 0.
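The per-unit-style scaling described above can be sketched as follows (an illustrative Python analogue; the sample values are hypothetical, with one column per factor):

```python
import numpy as np

def normalize_by_max_abs(data):
    """Scale each column (factor) by its own maximum absolute value,
    so every normalized value falls in [-1, +1]."""
    max_abs = np.max(np.abs(data), axis=0)
    return data / max_abs

# Hypothetical hourly samples: temperature (°C), wind speed (km/h), weekday flag
raw = np.array([[25.0, 10.0, 1.0],
                [-5.0, 40.0, 0.0],
                [15.0, 20.0, 1.0]])
scaled = normalize_by_max_abs(raw)   # e.g. -5 °C becomes -5/25 = -0.2
```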

3.4. Neural network creation

The toolbox in Matlab is used for training and simulating the neural network. The layout of the neural network consists of the number of neurons and layers, the connectivity of the layers, the activation functions, the error goal and so on. Setting the framework and parameters of the network depends on the practical situation, and the architecture of the ANN can be selected to achieve an optimized result. Matlab is one of the best simulation tools, providing visible windows. The three-layer architecture shown in Figure 2 above has been chosen for the simulation; it is adequate to approximate an arbitrary function if the nodes of the hidden layer are sufficient.

Because the practical input values range from -1 to +1, the transfer function of the first layer is set to tansig, the hyperbolic tangent sigmoid transfer function. The transfer function of the output layer is set to a linear function, which calculates a layer's output from its net input. There is one advantage to the linear output transfer function: because linear output neurons let the output take on any value, there is no difficulty in finding the differences between the output and the target.
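A forward pass through this tansig-hidden, linear-output architecture can be sketched in Python (illustrative only; the layer sizes below are hypothetical, since the paper does not fix the hidden-neuron count here, and the weights are random placeholders for whatever training produces):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 8 input factors, 20 hidden neurons, 1 output (the load)
n_in, n_hidden, n_out = 8, 20, 1
W1 = rng.standard_normal((n_hidden, n_in))   # input -> hidden weights
b1 = rng.standard_normal(n_hidden)           # hidden biases
W2 = rng.standard_normal((n_out, n_hidden))  # hidden -> output weights
b2 = rng.standard_normal(n_out)              # output biases

def forward(x):
    """Hidden layer uses tan-sigmoid; output layer is linear,
    so the forecast value is unbounded."""
    h = np.tanh(W1 @ x + b1)   # tansig hidden layer, outputs in (-1, 1)
    return W2 @ h + b2         # linear (purelin) output layer

x = rng.uniform(-1, 1, n_in)   # a normalized input vector
y = forward(x)
```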

The next step is the selection of the neurons and the training function.

Generally, trainbr and trainlm are the best choices among all of the training functions in the Matlab toolbox.

Trainlm (the Levenberg-Marquardt algorithm) is the fastest training algorithm for networks of moderate size. However, a big problem is that it needs the storage of some matrices, which can be quite large for certain problems. When the training set is large, the trainlm algorithm reduces the memory used, and it always computes the approximate Hessian matrix with n x n dimensions. Another drawback of trainlm is that over-fitting will occur when the number of neurons is too large; basically, the number of neurons should not be too large when the trainlm algorithm is employed in the network. Trainbr (Ba
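The memory cost mentioned above comes from the n x n approximate Hessian J^T J, where n is the number of weights. A minimal sketch of a single Levenberg-Marquardt update (illustrative Python, not the trainlm implementation; J, e, and mu denote the error Jacobian, the error vector, and the damping factor):

```python
import numpy as np

def lm_step(J, e, mu):
    """One Levenberg-Marquardt weight update:
    dw = (J^T J + mu*I)^(-1) J^T e.
    The n x n approximate Hessian J^T J is what must be stored,
    which grows quickly with the number of weights n."""
    n = J.shape[1]
    H = J.T @ J + mu * np.eye(n)     # damped approximate Hessian, n x n
    return np.linalg.solve(H, J.T @ e)
```

With a tiny example, J = I and e = (2, 4) with mu = 1, H = 2I, so the step solves to (1, 2).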
