A HMM-based adaptive fuzzy inference system for stock market forecasting
Neurocomputing 104 (2013) 10–25
Md. Rafiul Hassan (a,*), Kotagiri Ramamohanarao (b), Joarder Kamruzzaman (c), Mustafizur Rahman (b), M. Maruf Hossain (b)

a Department of Information and Computer Science, King Fahd University of Petroleum and Minerals, Dhahran 31261, Saudi Arabia
b Department of Computer Science and Software Engineering, The University of Melbourne, Victoria 3010, Australia
c Gippsland School of IT, Monash University, Churchill, VIC 3842, Australia
Article history: Received 24 March 2012; Received in revised form 12 July 2012; Accepted 12 September 2012; Available online 6 December 2012. Communicated by P. Zhang.

Keywords: Fuzzy system; Hidden Markov Model (HMM); Stock market forecasting; Log-likelihood value
Abstract

In this paper, we propose a new type of adaptive fuzzy inference system with a view to achieving improved performance for forecasting nonlinear time series data by dynamically adapting the fuzzy rules with the arrival of new data. The structure of the fuzzy model utilized in the proposed system is developed based on the log-likelihood value of each data vector generated by a trained Hidden Markov Model. As part of its adaptation process, our system checks and computes the parameter values and generates new fuzzy rules as required, in response to new observations, for obtaining better performance. In addition, it can also identify the most appropriate fuzzy rule in the system that covers the new data, and thus requires adapting the parameters of the corresponding rule only, while keeping the rest of the model unchanged. This intelligent adaptive behavior enables our adaptive fuzzy inference system (FIS) to outperform standard FISs. We evaluate the performance of the proposed approach for forecasting stock price indices. The experimental results demonstrate that our approach can predict a number of stock indices, e.g., the Dow Jones Industrial (DJI) index, the NASDAQ index, the Standard and Poor 500 (S&P 500) index and a few other indices from the UK (FTSE 100), Germany (DAX), Australia (AORD) and Japan (NIKKEI) stock markets, accurately compared with other existing computational and statistical methods.
© 2012 Elsevier B.V. All rights reserved.
1. Introduction
Adaptive online systems have great appeal in domains where events change dynamically. Typical examples include financial, manufacturing and control engineering. A system is termed adaptive if it can evolve according to changes in the characteristics of the problem. For instance, to model a chaotic time series where the values change randomly, the system should continuously update its knowledge and adapt itself. The aim of such a system is to improve performance through enhanced modelling of the changes in behavior. Different application areas of engineering, computer science and financial forecasting and analysis can benefit from using such kinds of adaptive systems.
An adaptive online learning system should possess the following criteria to be efficient and effective:

1. It should be able to capture the characteristics of new information as it becomes available;
2. The system should be able to represent an overall knowledge about the problem, without needing to memorize the large amount of raw data;
3. The system should be able to update its knowledge in real time and incrementally update its model;
4. The performance of the adaptive system should be better than that of the static offline system for nonstationary time series data.

Neural networks have been popular for supervised learning; however, it has been demonstrated by several studies [1–7] that these tools can be limited in their ability to be adaptive. In contrast, Fuzzy Logic can more easily be made adaptive [8], since new rules can be generated online and rule parameters can be modified in accordance with the new data. When generating an adaptive fuzzy model, performance is a crucial factor, particularly since increasing the number of rules may not always guarantee improved performance. However, changing the parameter values according to new data can potentially overcome the influence of the most distant past data in the model construction.
There exist a number of adaptive models which combine a neural-network-like structure to optimize the parameters of fuzzy rules. One example is the Adaptive Neuro Fuzzy Inference System (ANFIS) [8]. The limitation of this system is that it cannot adapt
* Corresponding author. Tel.: +61 3 8344 1408; fax: +61 3 9348 1184. E-mail addresses: hassan.rafiul@, mrhassan@kfupm.edu.sa (Md. R. Hassan).
0925-2312/$ - see front matter © 2012 Elsevier B.V. All rights reserved. doi:10.1016/j.neucom.2012.09.017
the structure of the fuzzy model once the fuzzy model has been built. The Evolving Fuzzy Neural Network (EFuNN) is another system, introduced in [9,10], which uses the evolving connectionist systems (ECOS) architecture to make the system evolve. In the dynamic version of EFuNN [11] the parameters are self-optimized. In EFuNN a new rule is generated if the distance between the new data vector and the cluster centres of each of the existing rules is greater than the predefined cluster radius R. Hence, the performance of the model depends on the optimal choice of R. Furthermore, the distance function between two fuzzy membership vectors works well for discretized data values but is not suitable for real continuous numbers. To adjust the rule parameters a feedback algorithm is used, which requires storage to keep the desired outputs.
Recently, the Dynamic Evolving Neuro Fuzzy Inference System (DENFIS) [12] has become popular, due to its adaptive and online learning nature. DENFIS is quite similar to EFuNN, except that in DENFIS, the position of the input vector in the input space is identified online and the output is dynamically computed, based on the set of fuzzy rules created during the earlier learning process. Rules are created and updated by partitioning the input space using an online evolving clustering method (ECM). In ECM, the distance between a data point and a cluster center is compared with a predefined threshold Dthr, which is then used to generate clusters and corresponding fuzzy rules. The threshold Dthr, which is effectively the radius of a cluster, must be statically defined and can affect the performance of the obtained model. DENFIS uses Euclidean distance [13] to measure the difference between two input vectors. However, Euclidean distance is not a suitable method to differentiate time series data patterns consisting of linear drift [14]. For example, the two time series data vectors D1: <0, 1, 2, 3, 4, 5, 6, 7, 8> and D2: <5, 6, 7, 8, 9, 10, 11, 12, 13> (as shown in bold in Fig. 1) have similar trends, although they are dissimilar in terms of Euclidean distance. For a time series application, since these two data vectors exhibit a similar pattern, they should belong to the same rule. Consequently, the performance of DENFIS usually degrades as it adapts to more new data when applied to forecasting non-linear time series data.
Another approach proposed in the literature for realizing adaptive Fuzzy Inference Systems is through leveraging evolutionary approaches, such as the Genetic Algorithm (GA). In [15] a GA-based approach for adapting and evolving fuzzy rules was proposed to achieve automated negotiation among relax-criteria negotiation agents in e-markets. An evolutionary approach for the automatic generation of an FIS was proposed in [16], where the structure and parameters of the FIS are generated through reinforcement learning
Fig. 1. Two similar data patterns with different Euclidean distance (ED). Here the ED between D1 and D2 is 15.
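The drawback illustrated in Fig. 1 can be reproduced numerically. The following illustrative Python sketch (not part of the original experiments) computes the Euclidean distance between D1 and D2, confirming the value of 15 despite the two series sharing the same trend; comparing first differences instead would recognize them as identical.

```python
import math

def euclidean_distance(a, b):
    # Pointwise Euclidean distance between two equal-length series.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

D1 = [0, 1, 2, 3, 4, 5, 6, 7, 8]
D2 = [5, 6, 7, 8, 9, 10, 11, 12, 13]

# Identical trend (both rise by 1 per step), yet the distance is large:
print(euclidean_distance(D1, D2))  # → 15.0

# First differences capture the shared trend exactly:
diff = lambda s: [b - a for a, b in zip(s, s[1:])]
print(euclidean_distance(diff(D1), diff(D2)))  # → 0.0
```

This is why a distance-based clustering of raw vectors, as in ECM, can assign linearly drifted but identically trending patterns to different rules.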
and the fuzzy rules are evolved via GA. In [17], a method for generating a Mamdani FIS was introduced, where the fuzzy model parameters are optimized by applying GA. Although GA is quite popular for developing an evolving fuzzy system, its inherent computational and time complexity makes this approach inapplicable to ever-changing non-linear chaotic time series data forecasting.
The Hidden Markov Model (HMM) can be applied to find similarities in the patterns of time series data [18–20]. In [21,22,19,23], an HMM–Fuzzy model was proposed by exploiting the ability of the HMM to capture pattern similarities, as well as the ease with which a fuzzy approach deals with adaptive systems. The HMM–Fuzzy model is an offline, data-driven fuzzy rule generation technique where the HMM's data pattern identification method is used to rank the data vectors and then fuzzy rules are generated. The reason for using the HMM is that it models a system in a way that assigns higher probability to the data vectors that represent the system than to the data vectors that represent minority scenarios of the system. Though these models have shown promising results, their performance in forecasting time series data is still inadequate and they are designed for offline learning only. To improve performance, a model needs to learn online, where new and recent data trends can be captured, making the model continuously adaptive.
In this paper, we propose a model called the Adaptive Fuzzy Inference System (AFIS), which consists of two phases. First, an initial fuzzy model is generated using a small number of training data vectors. To generate the initial fuzzy model, a HMM is trained and used to compute log-likelihood values for each of the data vectors. These log-likelihood values are then used to rank and group the data vectors to generate appropriate fuzzy rules, as described in Section 3.1.2. Second, the fuzzy model is conformed to the arrival of new data, making it a continuously adaptive online system. On observing new data, either the fuzzy rule that satisfies the data is identified using the HMM and is then adapted for the new data, or a new fuzzy rule is generated.
The proposed AFIS differs significantly from the models in our previous studies in a number of ways. First, AFIS is an online learning system while the others learn only offline. Second, AFIS is an adaptive model. In previous studies, once a model is built based on the available data, it remains unchanged, while in AFIS, intelligent online learning is used to adapt the initial model as new data arrives. In the latter case, the currently defined rule is fine-tuned to fit the new data and, if necessary, a new rule is generated. Third, in AFIS, the training dataset does not have to be large, and the model need not necessarily be trained with data having the characteristics of the unknown test data; rather, it can be trained incrementally as new data becomes available. All these features make AFIS very suitable for forecasting time series data, and it outperforms other existing methods in the literature, including our previous models, as demonstrated in Section 5.
The remainder of the paper is organized as follows. In Section 2, we briefly describe the preliminaries. Finally, in Section 6, we suggest future improvements and conclude the paper. The notations listed in Table 1 are used in describing the algorithms in the remaining part of the paper.
2. Preliminaries

In this section, we describe the preliminary concepts of the HMM, which are useful in understanding the proposed AFIS.

2.1. Hidden Markov Model

A Hidden Markov Model (HMM) is like a finite state machine that represents the structure or statistical regularities of
Table 1
List of notations.

N: Number of states of a HMM
M: Number of distinct symbols for a HMM
x: An input data vector / an input observation sequence
x_1, x_2, ..., x_k: Data features in data vector x
A: State transition probability matrix
S_i: i'th state of a HMM
Q: State sequence {q_1, q_2, ...}
q: Current state
q': The next state
a_ij: State transition probability from state S_i to S_j at time t
B: Observation emission probability matrix
b_Sj(c_k): Emission probability of observation symbol c_k from state S_j
π: Initial state distribution vector
λ: The hidden Markov model
R^l: l-dimensional space of continuous/real numbers
M_i^j: Membership function for the j'th rule of the i'th feature x_i
ω_j: The firing strength of the j'th rule
E(x_j): Prediction error for the data vector x_j
E_mse: Prediction error for the total training dataset in MSE
S_i: Scaling matrices for all states i
G: A probability distribution on [0, 1)
N(·): A multivariate Gaussian density function
μ: The mean vector for the Gaussian density function N(·)
Σ: The covariance matrix for the Gaussian density function N(·)
F_ij: Center of the membership function for the i'th feature of the j'th rule
s_ij: Steepness of the membership function for the i'th feature of the j'th rule
O: The set of observation symbols
c: A distinct observation symbol
w: A set of observation sequences
x_cont: An input data vector / an observation sequence of continuous real numbers
ll_i: The log-likelihood value of generating x_i given the HMM λ
k: Dimension of an input data vector/instance x
b: Coefficient vector of the consequent part of a fuzzy rule
D: Training dataset
sequences. HMMs have been applied for speech recognition since the early 1970s. We will first use the common urn-and-ball example to review the basic idea of HMMs. Suppose there are N urns containing colored balls, and there are M distinct colors of balls. Each urn has a (possibly) different distribution of colors. First, we pick an initial urn according to some probability. Second, we randomly pick a ball from the urn and then replace it. Third, we again select an urn according to a random selection process associated with the urns. We repeat Steps 2 and 3. In this example, we can regard the urns as states and the balls as observation symbols.
In HMMs, the states (in the above example, the urns) are not observable (i.e., hidden). Observations are probabilistic functions of state, and state transitions are probabilistic. More formally:

1. N, the number of states in the model. The set of states is denoted as S = {S_1, S_2, ..., S_N}.

2. M, the number of distinct observation symbols, i.e., the individual symbols. The set of symbols is denoted as O = {c_1, c_2, ..., c_M}.

3. The state transition probability distribution matrix A = [a_ij]; the values of a_ij are calculated as

   a_ij = Pr[q' = S_j | q = S_i],  1 ≤ i, j ≤ N,    (1)

   where q' is the next state, q is the current state, and S_j is the j'th state.

4. The observation symbol probability distribution matrix B = [b_Sj(c_k)], where c_k ∈ O and

   b_Sj(c_k) = Pr[c_k | q = S_j],  1 ≤ j ≤ N and 1 ≤ k ≤ M,    (2)

   where b_Sj(c) represents the emission probability of an observation symbol c in state S_j.

5. The initial state distribution vector π = {π_i}, where

   π_i = Pr[q_0 = S_i],  i ≤ N,    (3)

   where q_0 is the initial state.

6. λ: the entire model, λ = (A, B, π).
There are three basic problems associated with using HMMs [20]. First, given an observation sequence x = <x_1, x_2, ..., x_T>, x_i ∈ O, and a HMM λ = (A, B, π), compute Pr(x | λ). Second, given an observation sequence x = <x_1, x_2, ..., x_T>, x_i ∈ O, and a model λ, find the optimal state sequence Q = <q_1, q_2, ..., q_T>, q_i ∈ S. Third, given a set of observation sequences w, estimate the model parameters λ = (A, B, π) that maximize Pr(x_i | λ) for all x_i ∈ w.
Our goal in using HMMs is to rank the data vectors, which we will then use to generate fuzzy rules in a later phase. To achieve this, we need to solve both the third and the first problems described above. There is no known method to solve the third problem analytically, i.e., to adjust the model parameters to maximize the probability of the observation data vectors. The Baum–Welch algorithm [24] is an iterative procedure that can determine the parameters sub-optimally, similar to the expectation maximization (EM) method. It operates as follows: (1) let the initial model be λ_0; (2) compute a new λ based on λ_0 and the observation sequence x; (3) if log Pr(x | λ) − log Pr(x | λ_0) < τ, stop; else set λ_0 = λ and go to step 2 (τ is the minimum tolerance between two subsequent models).
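The stopping criterion above can be sketched as a generic training loop. This illustrative Python snippet is not the Baum–Welch algorithm itself: the `loglik_of` and `reestimate` callables stand in for the actual E/M computations, and the toy model below merely halves its gap to the optimum at each re-estimation.

```python
def train_until_converged(loglik_of, reestimate, model0, tol=1e-4, max_iter=100):
    """Baum-Welch-style loop: stop when the log-likelihood gain between two
    subsequent models drops below the tolerance `tol` (the paper's τ)."""
    model = model0
    prev_ll = loglik_of(model)
    for _ in range(max_iter):
        model = reestimate(model)
        ll = loglik_of(model)
        if ll - prev_ll < tol:   # log Pr(x|λ) − log Pr(x|λ0) < τ → stop
            break
        prev_ll = ll
    return model

# Toy stand-in: each re-estimation halves the model's gap to the optimum ll = 0.
model = train_until_converged(
    loglik_of=lambda m: -m["gap"],
    reestimate=lambda m: {"gap": m["gap"] / 2},
    model0={"gap": 8.0},
)
print(model["gap"] < 1e-3)  # → True
```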
The first problem can be solved using the forward–backward algorithm, where, given the HMM, the probability of generating a k-dimensional data vector <x_1, x_2, ..., x_k> is calculated using the following set of equations [20]:

Pr(x | λ) = Σ_{∀Q} Pr(x | Q, λ) Pr(Q | λ),    (4)

where Q = state sequence q_1, q_2, ..., q_k and q_i ∈ S (for an N-state HMM), λ = the HMM model, and x = input data vector <x_1, x_2, ..., x_k>, x_i ∈ O (observation sequence).

The values of Pr(x | Q, λ) and Pr(Q | λ) are calculated using the following equations [20]:

Pr(x | Q, λ) = Π_{i=1}^{k} Pr(x_i | q_i, λ) = b_{q_1}(x_1) b_{q_2}(x_2) ... b_{q_k}(x_k),    (5)

where b_{q_i}(x_i) = emission probability of the feature x_i from state q_i, and

Pr(Q | λ) = π_{q_1} · a_{q_1,q_2} · a_{q_2,q_3} ... a_{q_{k−1},q_k},    (6)

where π (with entries π_i) = prior probability vector, and a_{q_i,q_j} = transition probability from state q_i to state q_j.
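To make Eqs. (4)–(6) concrete, the following illustrative Python sketch (with toy parameters, not from the paper) computes Pr(x | λ) by brute-force enumeration of every state sequence Q. This is exponential in the sequence length, which is exactly why the forward–backward algorithm is used in practice; the enumeration serves only to show the three equations at work.

```python
from itertools import product

# A toy 2-state, 2-symbol HMM λ = (A, B, π); all numbers are illustrative.
A  = [[0.7, 0.3], [0.4, 0.6]]      # a[i][j] = Pr[next = Sj | current = Si]
B  = [[0.9, 0.1], [0.2, 0.8]]      # b[j][k] = Pr[symbol ck | state Sj]
pi = [0.6, 0.4]

def prob_obs(x):
    """Eq. (4): Pr(x|λ) = Σ_Q Pr(x|Q,λ) Pr(Q|λ), enumerating every
    state sequence Q (fine for tiny examples only)."""
    N, k = len(pi), len(x)
    total = 0.0
    for Q in product(range(N), repeat=k):
        p_Q = pi[Q[0]]                       # Eq. (6): π_{q1} · Π a
        for t in range(1, k):
            p_Q *= A[Q[t - 1]][Q[t]]
        p_x_given_Q = 1.0                    # Eq. (5): Π b_{q_t}(x_t)
        for t in range(k):
            p_x_given_Q *= B[Q[t]][x[t]]
        total += p_Q * p_x_given_Q
    return total

# Sanity checks: Pr of one symbol, and all length-3 sequences summing to 1.
print(round(prob_obs([0]), 4))                                        # → 0.62
print(round(sum(prob_obs(x) for x in product([0, 1], repeat=3)), 10)) # → 1.0
```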
So far we have described a HMM that deals with a sequence of discrete symbols. Most real-world problems, however, are continuous (e.g., speech signal recognition, human movement recognition and stock index prediction), and hence a HMM able to deal with continuous datasets is required. This can be achieved through a slight modification of the discrete HMM. The following section reviews how a HMM can be used for continuous data.

2.2. HMM for continuous data

There are a number of ways to generate a HMM that deals with continuous data. Firstly, the continuous dataset can be converted into a number of discrete sets by adopting a quantization technique. In fact, a number of studies, especially those dealing with continuous speech data [25], first translate the continuous features into a set of discrete symbols. Another approach is to map the discrete output
Fig. 2. Step-by-step example of the proposed model: (1) Convert univariate time series data into data vectors (window); (2) Feed the data vectors into a HMM; (3) Train the HMM using the expectation maximization algorithm; (4) Calculate the log-likelihood value for each of the training data vectors and rank them; (5) Group the data vectors based on the ranking; (6) Generate a set of fuzzy rules (considered as the fuzzy system) using the data vector groups; (7) Adapt the generated fuzzy system whenever a new data vector arrives; (8) Feed the new data vector into the trained HMM; (9) Compute the log-likelihood value l_new for the new data vector; (10) If l_new is not within the range of minimum and maximum log-likelihood values (i.e. ranking score) of the fuzzy system, create a new fuzzy rule; (11) Otherwise identify the rule where the new data vector fits in; and (12) Modify the selected fuzzy rule.
distribution b_j(k) to a continuous output probability density function. The advantage of doing this over quantization is that the inherent quantization error can be eliminated [26]. Hence, a HMM with a continuous output probability density function is less error-prone than a discrete HMM with quantized continuous data.

To re-estimate the HMM parameters, Baum et al. [27,28] described a generalization of the Baum–Welch algorithm to deal with such a continuous density function. A necessary condition is that the probability density functions must be strictly log-concave, which constrains the choice of continuous probability density function to the Gaussian, Poisson or Gamma distribution [26].
3. Adaptive fuzzy inference system

The proposed adaptive online fuzzy inference system has two phases (as illustrated in Fig. 2):

Phase 1: Initial fuzzy rule base generation
- Convert univariate time series data into data vectors (window)
- Feed the data vectors into a HMM
- Train the HMM using the expectation maximization algorithm
- Calculate the log-likelihood value for each of the training data vectors and rank them
- Group the data vectors based on the ranking (log-likelihood scores)
- Generate a set of fuzzy rules (considered as the fuzzy system) using the data vector groups

Phase 2: Adaptation of the rule parameters in response to the arrival of new data sequences, or online generation of a new rule, if required
- Feed the new data vector into the trained HMM
- Compute the log-likelihood value for the new data vector
- If the log-likelihood value does not fall within the range of minimum and maximum log-likelihood values (i.e. ranking score) of the fuzzy system, create a new fuzzy rule
- Else identify the rule where the new data vector fits in
- Adapt the selected fuzzy rule
In Phase 1, the initial fuzzy model is generated. In the process of generating the model, the extraction of appropriate and accurate fuzzy rules from data is a challenge. This is because, even for a small number of data features in the dataset, there are potentially a large number of rules that can be generated. There are several methods that can be employed for generating fuzzy rules representing the input–output relationship [29–31]. In AFIS, we follow the fuzzy rule generation approach of using a HMM's ability to identify and group similar patterns, following the studies [22,32,18,23]. The HMM considers the dependency between features, because it uses a Markov process, where it is assumed that the current state depends on the immediate past state. Details about the generation of the fuzzy rule base are provided in the sequel.

3.1. HMM–Fuzzy model
To generate the fuzzy rule base, a HMM is trained using the well-known Baum–Welch algorithm [27], and the trained HMM is then used to compute log-likelihood values for each of the data vectors. The HMM log-likelihood value is used as a guide to extract the appropriate fuzzy rules from the training dataset.

Let us consider the training dataset, which is a univariate time series, where the set of data vectors is obtained by choosing a fixed window size WT which slides forward with respect to time. Let D be a univariate time series of length T, i.e., D = <x_1, x_2, x_3, ..., x_T>, where x_i ∈ O for 1 ≤ i ≤ T. Table 2 shows the input data vectors and the corresponding desired outputs. The training data vectors are ranked using a trained HMM, and then the initial fuzzy model is built, as described in the following subsections.

It may be noted that we have used a fixed window size with uniform sampling of recent data (in our case with time lag 1) to predict future data, which is the standard practice in time series forecasting. A recent study by Minvydas and Kristina [33] showed promising results when non-uniform sampling instead of uniform sampling of data was used in forecasting. It requires the determination of an optimal set of time lags from the observed time series data. Since our focus in this work is to make the HMM–Fuzzy model adaptive for online forecasting, we stick to the standard approach here; however, the effect of non-uniform sampling on our model is worth investigating in future.
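The sliding-window construction described above (and tabulated in Table 2) can be sketched in a few lines of Python; the series values here are illustrative only.

```python
def make_windows(series, wt):
    """Build (input vector, desired output) pairs from a univariate series
    using a fixed window of length `wt` sliding forward with time lag 1."""
    return [(series[i:i + wt], series[i + wt])
            for i in range(len(series) - wt)]

D = [10, 11, 12, 13, 14, 15]          # toy weekly closing prices
pairs = make_windows(D, wt=4)
print(pairs[0])   # → ([10, 11, 12, 13], 14)
print(len(pairs)) # → 2
```

A series of length T with window size WT yields m = T − WT training pairs, matching the last row of Table 2.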
3.1.1. Ranking the data vectors

To partition the input data space, initially the data vectors are ranked using HMM log-likelihood scores. Since the data are continuous, a HMM for continuous data sequences (as described in Section 2.2) is used. Each data vector x_i has a HMM log-likelihood value ll_i. This value is the log of the probability of generating x_i, given the HMM λ:

ll_i = log(Pr(x_i | λ)) = log( Σ_{∀Q} π_{q_1} Π_{t=1}^{WT} a_{q_{t−1},q_t} N(x_t; μ_{q_t}, Σ_{q_t}) ).

These scores are thus used to rank the data vectors, considering the trained HMM as a reference point. This is depicted in the following scenario.
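As a sketch of this tolerance-based ranking (the grouping rule used in Example 1 below), the following illustrative Python snippet assigns the same rank to vectors whose log-likelihood scores lie within a tolerance of each other. The scores are assumed to have already been produced by a trained HMM; the tolerance value is an assumption for illustration.

```python
def rank_by_loglik(ll_values, tol=0.05):
    """Rank data vectors from their HMM log-likelihood scores: vectors whose
    scores fall within `tol` of each other share a rank (0 = highest)."""
    order = sorted(range(len(ll_values)), key=lambda i: ll_values[i], reverse=True)
    ranks = [0] * len(ll_values)
    rank, prev = 0, None
    for i in order:
        if prev is not None and prev - ll_values[i] > tol:
            rank += 1                 # score gap exceeds the tolerance → new rank
        ranks[i] = rank
        prev = ll_values[i]
    return ranks

# Mirrors Example 1: l1 and l3 are close, l2 is not.
print(rank_by_loglik([-0.50, -1.20, -0.52]))  # → [0, 1, 0]
```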
Example 1. Let us consider a dataset D, where x_i ∈ D for 1 ≤ i ≤ m; i represents the index of data vector x_i and x_ij is the j'th element of x_i, 1 ≤ j ≤ k. That is, each of the data vectors is k-dimensional (here the dimension of the data vector is the length
Table 2
The set of predictor data vectors and the desired outputs for a univariate dataset D of length T, where x_i ∈ D for 1 ≤ i ≤ T.

Data Vector 1: <x_1, x_2, ..., x_WT>                  Desired Output: x_{WT+1}
Data Vector 2: <x_2, x_3, ..., x_{WT+1}>              Desired Output: x_{WT+2}
Data Vector 3: <x_3, x_4, ..., x_{WT+2}>              Desired Output: x_{WT+3}
...
Data Vector m: <x_{T−WT}, x_{T−WT+1}, ..., x_{T−1}>   Desired Output: x_T
of the window size 'WT') and the dataset contains m data vectors. Assume that the dataset D represents the daily closing price of a stock: i.e. the i'th data vector <x_i1, x_i2, x_i3, x_i4> will be <x_day_i, x_day_{i+1}, x_day_{i+2}, x_day_{i+3}>. For this dataset D, we train a HMM λ = (A, B, π). Assume that the log-likelihood values for the three data vectors x_1, x_2, x_3 are l_1, l_2 and l_3, respectively. In the HMM–Fuzzy model, data vectors with close log-likelihood values are assigned the same rank. Let us assume that the values of l_1 and l_3 are very close, within a tolerance level. On the other hand, suppose the value of l_2 is not close to the values of l_1 and l_3. Thus, data vectors x_1 and x_3 will be assigned the same rank and data vector x_2 will be assigned a different rank.
3.1.2. Fuzzy rule inference

AFIS uses the Takagi–Sugeno (TS) type fuzzy inference model [34]. The model comprises a set of fuzzy rules such that each rule has two parts: a premise and a consequence. The consequence is usually a linear combination of all variables in an input space, and is usually denoted as a function of the input variables.

In AFIS, all fuzzy membership functions are radial basis functions which depend on two parameters, as given by the following equation:

M_i^j(x) = e^{−(1/2)((x_i − F_ij)/s_ij)^2},    (7)

where M_i^j(x) represents the membership function for the attribute x_i and the j'th fuzzy rule, F_ij is the center and s_ij is the steepness of the membership function for the i'th feature, i.e., x_i, in the dataset considered to generate the j'th rule.
In the fuzzy model, the non-linearity in the dataset is considered to be a combination of linear representations. As soon as representative straight lines are obtained for the non-linear data, the TS fuzzy model is generated and the membership function of each of the linear representations is derived. The mathematical equation for each of the linear representations is a first-order polynomial, which is used to represent a fuzzy rule in the model. The representation syntax for such a fuzzy rule is [35]:

j'th rule: If v_1 is M_1^j and v_2 is M_2^j and ... v_k is M_k^j Then ŷ_j is f_j(v_1, v_2, ..., v_k).

Here v_i ∈ V, 1 ≤ i ≤ k; V is a k-dimensional input data vector and M_i^j is the fuzzy relationship among the v_i's.
The linear function f_j(v_1, v_2, ..., v_k) is represented as follows:

ŷ_pred^j = v̂_{k+1} = b_0^j + b_1^j v_1 + b_2^j v_2 + ... + b_k^j v_k,    (8)

where ŷ_pred^j is the output predicted by the j'th rule and b_0^j, b_1^j, ..., b_k^j are the coefficients.
In the TS model, the consequence defining a linear mapping can be interpreted geometrically as a hyperplane in the input–output space [24], and the defuzzification of the model is computed as the weighted average of each rule's output, as represented in Eq. (9) [35]:

ŷ_pred = ( Σ_{j=1}^{c} ω_j · ŷ_pred^j ) / ( Σ_{j=1}^{c} ω_j ),    (9)

where ω_j = Π_{i=1}^{k} M_i^j (for a k-dimensional input data vector) and c = the total number of rules in the model.
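Eqs. (7)–(9) together define the full inference step. The following illustrative Python sketch wires them up for two toy rules; all rule parameters (centers, steepnesses, coefficients) are invented for the example and are not from the paper's experiments.

```python
import math

def membership(x_i, F_ij, s_ij):
    # Eq. (7): radial-basis membership with center F_ij and steepness s_ij.
    return math.exp(-0.5 * ((x_i - F_ij) / s_ij) ** 2)

def ts_predict(x, rules):
    """Eqs. (8)-(9): each rule is (centers F, steepnesses s, coefficients b);
    output is the firing-strength-weighted average of the rules' linear parts."""
    num = den = 0.0
    for F, s, b in rules:
        w = 1.0
        for i, xi in enumerate(x):
            w *= membership(xi, F[i], s[i])          # ω_j = Π_i M_i^j
        y_j = b[0] + sum(bi * xi for bi, xi in zip(b[1:], x))  # Eq. (8)
        num += w * y_j
        den += w
    return num / den                                  # Eq. (9)

# Two toy rules for a 2-feature input (illustrative numbers only).
rules = [
    ([0.0, 0.0], [1.0, 1.0], [0.0, 1.0, 1.0]),   # near the origin: y ≈ x1 + x2
    ([5.0, 5.0], [1.0, 1.0], [10.0, 0.0, 0.0]),  # near (5, 5):    y ≈ 10
]
print(round(ts_predict([0.0, 0.0], rules), 3))   # → 0.0
print(round(ts_predict([5.0, 5.0], rules), 3))   # → 10.0
```

Near each rule's center, that rule's firing strength dominates and the output follows its linear consequent.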
In AFIS, the least-squares estimation (LSE) [36,37] method is used to obtain the optimized parameter values of Eq. (8) in the consequent part of each fuzzy rule.

Let us assume that there are m data vectors for the j'th fuzzy rule. The coefficients b_i ∈ b, 0 ≤ i ≤ k, of Eq. (8) are obtained by applying the LSE formula (Eq. (10)):

b = C X^T y,    (10)
where C = (X^T X)^{−1},

X = [ x_11  x_12  ...  x_1k
      x_21  x_22  ...  x_2k
      ...   ...   ...  ...
      x_m1  x_m2  ...  x_mk ]

and y = [y_1 y_2 ... y_m]^T.

3.1.3. Fuzzy rule generation
In the process of generating the initial fuzzy rules, we divide the dataset using the log-likelihood score/rank of each data vector, through application of a divide-and-conquer approach.

To begin with, we create only one fuzzy rule that represents the entire input space of the training dataset. At this point, all the data vectors are considered to belong to one global group; therefore the log-likelihood value of each data vector does not have any role in generating the fuzzy rule. In the process of rule generation, we calculate the center F_i and steepness s_i, in order to define the membership function for each feature x_i in the dataset. Let us assume the dataset D is used to build the initial fuzzy model. The parameters {F, s} for the single generated rule, which satisfies the whole dataset D, are obtained as follows:

F_i = (1/m) Σ_{j=1}^{m} x_ij,    (11)

s_i = sqrt( (1/m) Σ_{j=1}^{m} (x_ij − F_i)^2 ),    (12)

where m = total data vectors in D and x_ij = i'th attribute of the j'th vector.
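Eqs. (11) and (12) are simply the per-feature mean and population standard deviation of the vectors in a group. A minimal illustrative sketch (toy numbers):

```python
import math

def rule_params(vectors):
    """Eqs. (11)-(12): per-feature center F_i (mean) and steepness s_i
    (population standard deviation) over the m data vectors of a group."""
    m, k = len(vectors), len(vectors[0])
    F = [sum(v[i] for v in vectors) / m for i in range(k)]
    s = [math.sqrt(sum((v[i] - F[i]) ** 2 for v in vectors) / m) for i in range(k)]
    return F, s

F, s = rule_params([[1.0, 10.0], [3.0, 14.0]])
print(F)  # → [2.0, 12.0]
print(s)  # → [1.0, 2.0]
```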
The parameter values of the consequent part are obtained using Eq. (10). The generated fuzzy rule is used to predict the output ŷ_j for each data vector x_j in the training dataset. The prediction error E(x_j) for each data vector is computed using Eq. (13):

E(x_j) = ŷ_j − y_j,    (13)

where ŷ_j is the value predicted using the generated fuzzy rule set and y_j is the actual value for the j'th data vector x_j.

The total mean squared error (MSE) E_mse for the training dataset (m = total training data vectors/instances) is obtained from Eq. (14):

E_mse = ( Σ_{j=1}^{m} E(x_j)^2 ) / m.    (14)

The prediction error E_mse is used to evaluate the performance of the developed model on the training dataset. If the error for the training dataset does not reduce further, the algorithm is terminated and no further rule is generated. Otherwise, the input training data is split into two groups with the help of the data vectors sorted according to their ranks. The splitting of the data is done by grouping the data vectors based on their ranks.
Initially, the split is done in such a way that the first group contains data vectors having comparatively higher rank than the data belonging to the other group. To achieve this, a parameter θ is introduced: the first θ% of the whole ranked dataset is considered to form one group, and the remaining data, i.e. the remaining (100 − θ)% of the whole ranked dataset, belongs to the other group. We create a new rule for each created partition; thus each split increases the number of rules by one. The prediction error E_mse for the training dataset is recalculated using the extracted rule set. At each step of increase in the number of rules, the convergence of the error is monitored. Rule generation is stopped when adding a rule does not yield further improvement in the prediction error E_mse.

In the case of further rule creation, the dataset in the second part is further split to extract more rules. The first θ% of the ranked dataset in the second part is selected to form a group, and the remaining part of the dataset in the second part of the whole dataset is considered as another group. A new rule is generated for each of the new partitions and the prediction error E_mse for the training dataset is recalculated using the extracted rule set. This process of rule generation continues until the prediction error E_mse for the training dataset reaches a plateau or there is no data in the last partitioned group to further split.

Each time a new rule (let us consider this new rule to be the j'th rule) is generated, the total number of data vectors n_j and the HMM log-likelihood value range (starting point: ll_j^start and end point: ll_j^end) considered to generate that j'th rule are stored to be used at a later stage by the system.
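The divide-and-conquer loop above can be sketched as follows. This is an illustrative simplification, not the paper's exact algorithm: each segment of the rank-sorted data stands for one rule, and as a stand-in for the fitted TS consequent a "rule" here simply predicts its segment's mean. The last segment keeps being split θ : (100 − θ) while the training MSE (Eq. (14)) keeps dropping.

```python
def generate_rules(targets, theta=0.5):
    """Sketch of Section 3.1.3: `targets` are the desired outputs of data
    vectors already sorted by HMM log-likelihood rank; returns the segment
    boundaries (one segment per rule) and the final training MSE."""
    def mse(bounds):
        err = 0.0
        for lo, hi in bounds:
            seg = targets[lo:hi]
            mean = sum(seg) / len(seg)          # stand-in rule output
            err += sum((t - mean) ** 2 for t in seg)
        return err / len(targets)               # Eq. (14)

    bounds = [(0, len(targets))]                # start with one global rule
    best = mse(bounds)
    while True:
        lo, hi = bounds[-1]
        cut = lo + max(1, int(theta * (hi - lo)))
        if cut >= hi:
            break                               # nothing left in the last group
        candidate = bounds[:-1] + [(lo, cut), (cut, hi)]
        err = mse(candidate)
        if err >= best:
            break                               # adding a rule no longer helps
        bounds, best = candidate, err
    return bounds, best

# Two regimes in rank order: the split recovers them with two "rules".
bounds, best = generate_rules([1, 1, 1, 1, 5, 5, 5, 5], theta=0.5)
print(bounds, best)  # → [(0, 4), (4, 8)] 0.0
```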
Example 2. Let us consider the dataset described in Example 1. First, one fuzzy rule is generated using all the data vectors in dataset D. We assume that the prediction error for the generated fuzzy rule is 0.9, which is greater than the best possible minimum prediction error of 0. To minimize the error further, the number of fuzzy rules is increased by one. The dataset is divided into two parts using the HMM log-likelihood value of each data vector, which serves as a rank score. The first part consists of the data vectors with log-likelihood values in the range of −0.0 to −1.5 (i.e. the first θ% of the data vectors of the whole dataset) and the second part contains the data vectors with log-likelihood values in the range of −1.5 to −3.5 (i.e. the remaining data vectors). Two fuzzy rules are generated using these two parts of the divided dataset, and we obtain a prediction error of 0.3 using the two fuzzy rules. Assuming that the error may be reduced further, additional rule generation is attempted. The dataset in the second partitioned group is split into two groups. At this step, the first θ% of the ranked data vectors of the second partitioned dataset is used as one group and the remaining data vectors from the second partitioned dataset are considered as another group. New rules are generated for each of the groups (the first partitioned group and the newly obtained two groups), so we have now obtained three fuzzy rules. Let us assume that the prediction error using these three fuzzy rules is 0.5. A further split of the last partitioned data produces a prediction error of 0.7. Let us assume that we have reached a point where it is not possible to further split the last partitioned data vectors. Now, the best minimum prediction error is 0.3, which was achieved using two fuzzy rules. The final fuzzy model is built using those two fuzzy rules and the rule generation process is terminated.

3.2. Adaptive fuzzy
To make the HMM–Fuzzy model adaptive to newly arriving data, which might become available after the model has been built, we first need to identify the rule that is to be adapted to reflect the new data behavior. This process therefore has two steps: (1) identifying the related rule, and (2) updating the selected rule's parameters.
3.2.1. Extracting the rule to be modified

When new data is available, the log-likelihood value for the new data is calculated. Based on the log-likelihood value, the corresponding rule, i.e. the one that was generated using data vectors with similar log-likelihood scores, is identified. Then that rule's parameters are adapted with the new data.
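This lookup uses the per-rule log-likelihood ranges stored during rule generation. An illustrative Python sketch (the range values below mirror Example 3 and are not from real data):

```python
def select_rule(ll_new, rule_ranges):
    """Pick the rule whose stored log-likelihood range covers the new vector's
    score; return None when the score falls outside every range, signalling
    that a new rule is needed (Section 3.2.3). `rule_ranges` holds one
    (ll_start, ll_end) pair per rule, with ll_start ≥ ll_end."""
    for j, (ll_start, ll_end) in enumerate(rule_ranges):
        if ll_end <= ll_new <= ll_start:
            return j
    return None

# Mirrors Example 3: three rules covering [-0.0, -0.6], [-0.61, -0.7], [-0.71, -3.0].
ranges = [(-0.0, -0.6), (-0.61, -0.7), (-0.71, -3.0)]
print(select_rule(-0.9, ranges))   # → 2  (the third rule)
print(select_rule(-3.5, ranges))   # → None (outside all ranges → new rule)
```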
Example 3. Let us consider a dataset D with a total of n data vectors. For this dataset we train a HMM λ and produce the set of fuzzy rules using the HMM–Fuzzy approach. Let us assume n = 40 and we have three fuzzy rules. Among these rules, the first rule has been generated using the data with log-likelihood values in the range of −0.0 to −0.6, while the range for the second rule is between −0.61 and −0.7, and that for the third rule is between −0.71 and −3.0. Suppose the new data vector produces a log-likelihood value of −0.9. This value indicates that the new data vector will be covered by the third rule. Hence, to adapt the fuzzy model given the new data vector we must update the third rule.

Fig. 3. The selected rule and its adaptation.

3.2.2. Adapt the extracted rule(s)
Adaptation of the extracted rule given the new data vector is the process of adjusting the parameters of the rule. There are three parameters which need to be adjusted: (1) the linear parameters of the consequent part of the rule, (2) the center F of each membership function M, and (3) the steepness s of each membership function.

To adapt the linear parameters b to b* of the consequent part of the extracted rule upon the arrival of new data y_new, the formula for recursive LSE [37] is used, as in Eq. (15):

C* = C − (C x_new x_new^T C) / (1 + x_new^T C x_new),
b* = b + C* x_new (y_new − x_new^T b).    (15)

Here, x_new is the data vector that corresponds to the output y_new. Based on the newly available data x_new, the parameters F and s for each membership function of the selected rule are recalculated using Eqs. (16) and (17):

F*_ij = (1/(n_j + 1)) [x_new,i + n_j F_ij],    (16)

(s*_ij)^2 = ((n_j − 1)/(n_j + 1)) s_ij^2 + (n_j/(n_j + 1)) (x_new,i − F*_ij)^2,    (17)

where F_ij = current center defining the membership function M_i^j, F*_ij = the new center of the membership function M_i^j, j = the selected rule, n_j = number of data vectors that were used to generate rule j during initial model building, x_new,i = i'th feature of the data vector x_new, and M_i^j = membership function of the i'th feature of the selected j'th rule.

Fig. 3 shows the effect of adaptation on the membership functions of a selected rule.

Example 4. Considering the same setup described in Example 3, we now have the new data vector x_new and the selected rule rule_j. Now, we adjust the center F_ij and steepness s_ij of each membership function M_i^j of rule_j. Let us assume that F_j = <6.2, 5.7, 6.3, 6.5>, i.e., F_1j = 6.2, ..., F_4j = 6.5, s_j = <0.4, 0.7, 0.63, 0.49> and x_new = <5.5, 6.5, 6.7, ...>. Using Eqs. (16) and (17) we obtain the adjusted center values as F*_j = <6.16, 5.75, 6.28, 6.53>, and the adjusted steepness values as s*_j = <0.4488, 0.7502, 0.6550, 0.5213>.

3.2.3. Generate new rule

In AFIS, a new rule is also generated if required. If the HMM generates a log-likelihood value for a new data vector which does not fit in the existing data array (i.e., the new log-likelihood value falls outside the range of log-likelihood values that was used to generate the existing fuzzy model), a new rule needs to be generated. In this case, given the desired output for the new data vector, the LSE algorithm is used to obtain the linear parameters for the new rule. The parameters of the membership functions for the new rule are obtained from the new data vector itself. Since there is only one data vector pertaining to the new rule, the center value (F_new,i) and steepness value (s_new,i) of each membership function for this new rule are obtained from the newly arriving data vector. However, every time the system fetches a new data vector of a similar pattern, it adapts itself with the newly arriving data vector as described in Section 3.2.1.
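The recursive LSE update of Eq. (15) is a rank-one (Sherman–Morrison-style) update, so it reproduces the batch solution of Eq. (10) exactly. The following illustrative Python sketch (random toy data, not from the paper's experiments) verifies this by comparing the recursive result against refitting from scratch:

```python
import numpy as np

def recursive_lse_update(C, b, x_new, y_new):
    """Eq. (15): rank-one update of C = (XᵀX)⁻¹ and of the consequent
    coefficients b when one new (x_new, y_new) pair arrives."""
    x = x_new.reshape(-1, 1)
    C_new = C - (C @ x @ x.T @ C) / (1.0 + float(x.T @ C @ x))
    b_new = b + (C_new @ x).ravel() * (y_new - float(x_new @ b))
    return C_new, b_new

rng = np.random.default_rng(0)
true_b = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(10, 3))                      # 10 old data vectors
y = X @ true_b + rng.normal(scale=0.1, size=10)   # noisy targets
C = np.linalg.inv(X.T @ X)                        # batch C = (XᵀX)⁻¹
b = C @ X.T @ y                                   # batch b = C Xᵀ y (Eq. (10))

x_new = rng.normal(size=3)
y_new = float(x_new @ true_b)
C, b = recursive_lse_update(C, b, x_new, y_new)

# The recursive result coincides with refitting on all 11 vectors at once:
X_all = np.vstack([X, x_new])
y_all = np.append(y, y_new)
b_batch = np.linalg.inv(X_all.T @ X_all) @ X_all.T @ y_all
print(np.allclose(b, b_batch))  # → True
```

This exactness is what lets AFIS adapt a rule's consequent online without storing the past training data.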
4.Experimentdesignanddatasets
Ourexperimentsareconductedonrealstockdata.Thehard-wareusedisanAMD2.3GHzCPUwith4GBmemory.ProgramswerewritteninMatlabandrunusingWindowsVista.4.1.Datasets
Wehaveusedsevenleadingstockmarketindicesdatafromdifferentpartsoftheworld:DowJonesIndustrialAverage(DJI),NASDAQComposite(NASDAQ)andS&P500IndexRTH(S&P500)fromUSAstockmarket;FTSE100,DAXPerformanceIndex(GDAXIorDAX)fromEuropeanstockmarket;andAllOrdinaries(AORD)andNIKKEI225(N225orNIKKEI)fromAsianstockmarket[38].Wehaveusedthehistoricalpastweeklydataoftheabove-mentionedstockmarketindices.ThetimerangeforeachdatasetisdetailedinTable3.4.2.Datasetup
Weekly stock indices were used for model training and evaluation. As inputs to the system, we used four weeks as input variables and the fifth week as the predicted variable.
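This sliding-window setup can be sketched as follows (function and variable names are illustrative):

```python
import numpy as np

def make_windows(series, w=4):
    """Turn a weekly price series into (input, target) pairs:
    w consecutive weeks as predictors, the following week as the target."""
    X = np.array([series[i:i + w] for i in range(len(series) - w)])
    y = np.array(series[w:])
    return X, y

prices = [10.0, 10.5, 10.2, 10.8, 11.0, 11.3]   # toy weekly closes
X, y = make_windows(prices)
print(X.shape, y.shape)   # (2, 4) (2,)
```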
In AFIS, we build an initial model considering a small amount of data as training data. This initially built model is adapted with the arrival of new data following the adaptive process described in Section 3.2. To evaluate the online adaptive behavior of the proposed system, we built the initial model by varying the length of the training data set from as few as 60 data vectors to a maximum of 3000 data vectors. The remaining part of the data set was used as test data (completely unknown to the system). To make the comparison consistent, we built the other models using the same split of data into training and test sets. The other models we have tested are: DENFIS, Chiu's fuzzy model [39] (a fuzzy model generated using subtractive clustering) followed by the hybrid learning algorithm presented by Jang et al. [40], and the HMM–Fuzzy model.
Autoregressive Integrated Moving Average (ARIMA) is a popular technique for forecasting time series data. We compared the performance of AFIS with ARIMA, where we used the first 1000 data instances as training data and the remaining data instances as test data.
We have also generated predictions using a repetitively trained ARIMA, where the ARIMA is retrained every time a new data instance arrives. Each time, the ARIMA is trained using the new data instance along with 1000 data instances from the recent past. We term this ARIMA as repetitively trained ARIMA.
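The rolling-retrain loop can be sketched as below. As a stand-in for the paper's full ARIMA(p, d, f) models (the original experiments were in Matlab), a simple least-squares AR(1) fit is used here purely for illustration:

```python
import numpy as np

def ar1_fit(train):
    """Least-squares AR(1) fit: y_t = a * y_{t-1} + b (illustrative stand-in
    for a full ARIMA model)."""
    x, y = np.asarray(train[:-1]), np.asarray(train[1:])
    A = np.column_stack([x, np.ones_like(x)])
    (a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
    return a, b

def rolling_forecasts(series, window=1000):
    """Repetitive retraining: refit on the most recent `window` points each
    time a new observation arrives, then forecast one step ahead."""
    preds = []
    for t in range(window, len(series)):
        a, b = ar1_fit(series[t - window:t])
        preds.append(a * series[t - 1] + b)
    return preds

series = np.linspace(100.0, 120.0, 40)   # toy trending series
preds = rolling_forecasts(series, window=20)
```

The cost of this scheme is clear from the loop: one full refit per test point, which is what makes repetitive retraining expensive compared with an incremental update.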
We developed another fuzzy model by partitioning the training data randomly and then generating fuzzy rules for each of the partitions, in order to study the effectiveness of the initial partitioning of data using HMM in AFIS. The number of partitions was chosen to be the same as the number of fuzzy rules generated in AFIS. We refer to this model as Randomly Partitioned Fuzzy Rule Generation (RPFRG). We made this model adaptive by adapting its parameter values with arriving new data. To adapt the parameter
Table 3
Details of the data sets used in the experiment.

Stock name   From date     To date
DJI          01/10/1928    24/08/2009
NASDAQ       05/02/1971    24/08/2009
S&P 500      03/01/1950    24/08/2009
FTSE 100     02/04/1984    24/08/2009
DAX          26/11/1990    24/08/2009
AORD         03/08/1984    24/08/2009
NIKKEI       04/01/1984    24/08/2009
values, we have used the same methodology as in AFIS. We call this adaptive model Adaptive Fuzzy System followed by RPFRG (ARPFRG). In ARPFRG, the rule that needs to be adapted is selected randomly and the parameters are adapted given the new data vector following Eqs. (16) and (17). Details of these approaches are provided in the Appendix.
We developed a fuzzy model where fuzzy rules were generated following a k-means clustering algorithm [41]. In this model, the value of k is provided by the user prior to building the fuzzy model. Details of this approach are provided in Appendix C. In this study we refer to this model as fuzzy rule generation using k-means clustering. We modified this offline approach of generating a fuzzy model into an online adaptive system following the procedure described in Appendix D. The adaptive version of the fuzzy rule generation using k-means is referred to as the adaptive k-means fuzzy model.
Artificial Neural Network (ANN) is a popular forecasting tool. We used a three-layer (input-hidden-output) ANN trained by the backpropagation algorithm [42]. To determine the most suitable architecture, we trained the ANN by varying the number of hidden nodes from 5 to 35 and then selected the ANN which produced the best forecast performance.
4.3. Performance metrics

We have used three different metrics to evaluate the prediction models: Mean Absolute Percentage Error (MAPE), Normalized Root Mean Squared Error (NRMSE) and the t-test.
4.3.1. Mean Absolute Percentage Error (MAPE)
This value is calculated by first taking the absolute deviation between the actual value and the forecast value. The ratio of each deviation to its actual value is then summed. The percentage of the average of this total ratio is the mean absolute percentage error:

$$\mathrm{MAPE} = \frac{\sum_{i=1}^{r} \left| y_i - \hat{y}_i \right| / y_i}{r} \times 100\%, \qquad (18)$$

where $r$ = total number of test data vectors, $y_i$ = actual stock price on week $i$, and $\hat{y}_i$ = forecast stock price on week $i$.

4.3.2. Normalized Root Mean Squared Error (NRMSE)
This is the root mean squared error divided by the range of observed values:

$$\mathrm{NRMSE} = \frac{\sqrt{\frac{1}{r} \sum_{i=1}^{r} \left( y_i - \hat{y}_i \right)^{2}}}{y_{max} - y_{min}}, \qquad (19)$$

where $y_{max}$ and $y_{min}$ are the maximum and minimum values of $y_i$ within the $r$ test data vectors.
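Both error metrics can be sketched directly from their definitions (whether the squared errors are averaged before the square root follows the "root mean squared error" wording above and is an assumption about Eq. (19)):

```python
import numpy as np

def mape(y, y_hat):
    """Mean Absolute Percentage Error, per Eq. (18)."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    return np.mean(np.abs(y - y_hat) / y) * 100.0

def nrmse(y, y_hat):
    """RMSE normalized by the range of observed values, per Eq. (19)."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    rmse = np.sqrt(np.mean((y - y_hat) ** 2))
    return rmse / (y.max() - y.min())

y_true = [100.0, 110.0, 120.0, 130.0]   # toy actual weekly prices
y_pred = [102.0, 108.0, 121.0, 128.0]   # toy forecasts
print(mape(y_true, y_pred), nrmse(y_true, y_pred))
```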
4.3.3. t-test
The t-test is a statistical hypothesis test where the averages of two samples, the predicted output using AFIS and the predicted output using another fuzzy approach, are tested against the null hypothesis $H_0$. Let the two averages be $\bar{y}_{AFIS}$ and $\bar{y}_{\#}$, respectively. The null hypothesis is defined as

$$H_0 : \bar{y}_{AFIS} = \bar{y}_{\#}, \qquad (20)$$

where $\bar{y}_{AFIS}$ = average of predicted outputs from AFIS and $\bar{y}_{\#}$ = average of predicted outputs from any other approach. The t-value for the
[Figure 4: log-scale plots of MAPE vs. number of training data instances for each data set.]
Fig. 4. Comparison of the MAPE for all data sets for the forecasts generated using AFIS, HMM–Fuzzy, DENFIS and Chiu's fuzzy model [39]. (a) DJI, (b) NASDAQ, (c) S&P 500, (d) FTSE 100, (e) DAX, (f) AORD and (g) NIKKEI.
[Figure 5: plots of NRMSE vs. number of training data instances for each data set.]
Fig. 5. Comparison of the NRMSE for all data sets for the forecasts generated using AFIS, HMM–Fuzzy, DENFIS and Chiu's fuzzy model. (a) DJI, (b) NASDAQ, (c) S&P 500, (d) FTSE 100, (e) DAX, (f) AORD and (g) NIKKEI.
t-test is calculated as in Eq. (21):

$$t = \frac{\bar{y}_{AFIS} - \bar{y}_{\#}}{\sqrt{\left( \mathrm{var}(y_{AFIS}) + \mathrm{var}(\hat{y}_{\#}) \right) / n}}, \qquad (21)$$
where $n$ = total number of samples. The t-value is used to determine the significance level of the difference between the two data samples. This significance level is known as the p-value. A p-value ≤ 0.05 indicates that the null hypothesis is rejected within the 95% confidence level, and hence the differences in prediction are statistically significant.

4.4. Choice of parameters
Following the studies [43,22,19,21,23], the number of states in the HMM for the stocks is chosen as 5, as the number of input features is 5 (i.e., the window size $W_T$ = 5) in the data set. We generated forecasts by varying the window size from 3 to 6 and noticed insignificant variation in forecast performance. All the experimental results reported here were generated using the window size $W_T$ = 5. The initial parameter values of the HMM are chosen by following the same steps as in Hassan et al. [19]. We identified the parameter values of ARIMA through analysis of the training data using the Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC). The parameter values ($\theta$) of the HMM–Fuzzy model (where the first 1000 data instances were used as training data) and ARIMA ($p$, $d$, $f$) are listed in Table 6. In HMM–Fuzzy, for choosing the
parameter value $\theta$, we used 90% of the training data to generate fuzzy rules and used the remaining 10% of the training data to monitor the performance of the generated fuzzy model. We selected a $\theta$ that produced the minimum error for this 10% of data, by varying the value of $\theta$ from 10% to 90%. This $\theta$ is then used to generate the fuzzy model using the full training data set. It should be noted that we used the training data only for selecting an optimal $\theta$ and generating the initial fuzzy model, while keeping the test data completely unknown to the system. The same approach was used to select the parameter values (i.e., radius of clusters) to generate Chiu's fuzzy model. The model obtained using Chiu's subtractive clustering technique was tuned using the hybrid learning algorithm presented by Jang et al. [40] with 500 epochs.
5. Results and discussion
Graphs in Figs. 4 and 5 show the performance in MAPE and NRMSE, respectively, for the seven stocks considered in this paper. As can be seen from these graphs, the forecast performance of the proposed adaptive fuzzy inference system (AFIS) on the stock market data sets clearly outperforms that of all the other reported competing fuzzy models. From observations of the figures, two important aspects are evident: (a) better performance is achieved by AFIS irrespective of the length of the training data set; especially for the DJI, NASDAQ and S&P 500 data sets, the MAPE and NRMSE attained by AFIS are significantly lower than the others; (b) AFIS achieves its superior performance with only a very short
Table 4
Comparison of the p-values of the t-test for all data sets. p-values (vs. AFIS) within 95% confidence; all differences are statistically significant.

Stock name   HMM–Fuzzy       Chiu's model    DENFIS
DJI          1.092x10^-3     1.006x10^-7     2.009x10^-9
NASDAQ       0.049           2.017x10^-3     2.171x10^-7
S&P 500      0.0341          0.0243          0.0092
FTSE 100     3.056x10^-3     2.336x10^-5     3.336x10^-13
DAX          0.0154          0.0064          0.0015
AORD         2.931x10^-36    0.0137          3.987x10^-13
NIKKEI       0.02685         0.0471          5.289x10^-90
[Figure 6: log-scale NRMSE and MAPE vs. number of training instances; series: AFIS, RPFRG, ARPFRG.]
Fig. 6. Performance comparison among HMM–Fuzzy, AFIS, randomly partitioned fuzzy rule generation (RPFRG) and the adaptive fuzzy system followed by randomly partitioned fuzzy rules (ARPFRG) for the DJI stock index. (a) Performance metric: NRMSE and (b) performance metric: MAPE.
length of training data. For example, for the NIKKEI series, all the fuzzy models produced a minimum consistent MAPE and NRMSE starting from a training data length of 200 (as seen in Figs. 4(g) and 5(g)). For this stock, AFIS produced even better performance starting from a training data length of 60 and onwards.
To further analyze the results, we have conducted a paired t-test with 5% significance level (i.e., 95% confidence level) between AFIS and the other considered techniques. As shown in Table 4, the computed p-values between the values predicted using AFIS and those using HMM–Fuzzy, Chiu's subtractive-clustering-based fuzzy model and DENFIS are much less than 0.05. The fact that the performance of AFIS is far better than the other fuzzy systems (Figs. 4 and 5), together with the small p-values (i.e., p-value < 0.05), statistically signifies that AFIS is capable of forecasting time series data significantly better than HMM–Fuzzy, Chiu's fuzzy model and DENFIS for the stock data considered in our experiment.
To analyze what makes AFIS such an efficient forecast approach, first we generated fuzzy rules using a scheme of randomly partitioning the training data. Generated rules are also adapted as soon as new data arrives, following a random process as stated in Section 4. Fig. 6 provides the performance results of Randomly Partitioned Fuzzy Rule Generation (RPFRG) and the Adaptive Fuzzy System followed by RPFRG (ARPFRG), along with HMM–Fuzzy and AFIS, for the DJI stock index. Fig. 6 shows that AFIS is clearly able to model the behavior of the stock series. For example, the MAPE of AFIS is 1.93 for a training data length of 700, whereas for the same training data ARPFRG attains a MAPE value of nearly 430,000 (see Fig. 6b). It is worth mentioning here that, due to the randomness introduced in generating fuzzy rules and in identifying the rule that needs to be adapted, the MAPE values for both RPFRG and ARPFRG are much higher than that of AFIS. Second, we generated a fuzzy model using a k-means clustering algorithm [41] and its adaptive version. In the adaptive k-means fuzzy model, with the arrival of new data vectors, the initial fuzzy model generated using the k-means algorithm is adapted by coupling it with the intelligent dynamic adaptive approach described in Section 3.2.1. As shown in Table 5, the performance of the adaptive k-means fuzzy model is significantly better than that of the non-adaptive fuzzy model. This signifies the importance of the proposed adaptive approach in improving forecast accuracy. More interestingly, even though the adaptive approach yields better results, the performance of AFIS is far better than the performance of the adaptive k-means fuzzy model in terms of MAPE. Hence, these findings substantiate that both the effectiveness of partitioning data using HMM and the intelligent adaptive approach contributed to the improved performance of AFIS.
In the literature, ANN has been used to forecast time series data, e.g., stock market prediction by Atsalakis et al. [44] and foreign currency exchange rate forecasting by Kamruzzaman et al. [45]. Table 6 provides a comparison between AFIS and ANN. The forecast performance of AFIS is significantly better than that of ANN. The poor performance of ANN is due to its inability to cope with new data vectors: the data set on which an ANN has been trained may not reflect the characteristics of a new data vector. Retraining an ANN with new data is time consuming, and hence not suitable for time series data like the stock market, where the trend may change considerably from that of the past. Evidently the better performance of AFIS is due to its intelligent ability to adapt to new data.
ARIMA is one of the most widely used techniques to predict time series data. ARIMA is an offline process where the initial model is built using the available training data set. Once the model is built, it does not adapt itself with the arrival of new data. Table 6 shows the performance comparison between AFIS and ARIMA for the seven stock indices. To make the comparison consistent, the performance of the repetitively trained ARIMA is also presented. Once again, AFIS outperforms standard ARIMA. The performance of AFIS is also slightly better than the repetitively trained ARIMA, except in the case of NIKKEI, for which the repetitively trained ARIMA performs slightly better. This was not surprising, as this ARIMA is retrained with new data and thus exhibits adaptiveness. However, ARIMA is significantly worse in terms of its computational performance, as
Table 5
Performance comparison among AFIS, the fuzzy model generated using k-means and its adaptive version (trained for each new data instance; the first 1000 data instances used for training and the remaining data for testing).

Stock name   AFIS                Fuzzy rule generation   Adaptive k-means      # of
             NRMSE    MAPE       using k-means           fuzzy model           fuzzy rules
                                 NRMSE    MAPE           NRMSE    MAPE
DJI          0.0087   1.5216     3.402    42.2622        0.013    2.4216       3
NASDAQ       0.0170   2.2276     0.3446   60.773         0.0216   3.4422       3
S&P 500      0.0102   1.6291     0.4287   46.623         0.011    1.704        5
FTSE 100     0.0396   1.7005     0.04     1.7003         0.04     1.6976       4
DAX          0.0735   3.6791     0.3402   42.0948        0.0121   2.2278       2
AORD         0.0251   1.5668     0.3446   60.773         0.0216   3.4422       3
NIKKEI       0.0259   2.4377     0.0339   2.4259         0.0339   2.426        3
Table 6
Performance comparison among AFIS, ARIMA and Artificial Neural Network (trained for each new data instance; the first 1000 data instances used for training and the remaining data for testing).

Stock name   θ     AFIS              ARIMA                         Repetitively        Artificial Neural
                   NRMSE    MAPE     NRMSE    MAPE      (p,d,f)    trained ARIMA       Network
                                                                   NRMSE    MAPE       NRMSE    MAPE     # Nodes
DJI          0.8   0.0087   1.5216   0.3429   78.4625   3,1,3      0.0090   1.5697     0.3184   42.7288  35
NASDAQ       0.9   0.0170   2.2276   0.2720   50.9447   1,2,0      0.0174   2.3953     0.3142   60.5563  10
S&P 500      0.7   0.0102   1.6291   0.3407   46.9004   4,1,2      0.0105   1.6411     0.3225   32.9437  10
FTSE 100     0.8   0.0396   1.7005   0.4475   20.7286   1,1,2      0.0404   1.7213     0.118    2.7618   10
DAX          0.8   0.0735   3.6791   0.3432   79.1502   1,1,4      0.0742   3.7993     0.3225   46.7498  15
AORD         0.9   0.0251   1.5668   0.2736   51.5942   1,1,0      0.0312   1.6817     0.2603   47.1483  15
NIKKEI       0.9   0.0259   2.4377   0.3955   25.8386   1,1,3      0.0237   2.3701     0.0411   2.4377   20
shown in Table 7. On average, AFIS is between 4 and 13 times faster than the repetitively trained ARIMA. The average execution time to generate a prediction using AFIS is almost consistent (4–5 ms) across the seven stock indices, as shown in Table 8, whereas the time to generate a prediction using the repetitively trained ARIMA varies from 19 ms to 57 ms.
Furthermore, unlike the repetitively trained ARIMA, AFIS does not need to retrain and rebuild the model for every new observation. Instead, AFIS adapts only its structure and coefficients dynamically with every arriving new data instance, thus providing the best performance compared with the other competing models.
The above results demonstrate the capability of our method in yielding better forecasting for stock market data. In addition, we did further experiments to assess its efficacy on other time series data. Fig. 7 shows the forecast values and the actual values of monthly electricity production in Australia (data available at [46]). As shown in the figure, AFIS can better follow the trend of the time series in comparison with the offline fuzzy models, e.g., the HMM–Fuzzy model and Chiu's fuzzy model. This is because each of the fuzzy models was trained using a small data set (Jan 1956–Oct 1964, length 100) and hence, as time goes on, the offline fuzzy models cannot produce a reasonable forecast for the new data. On the other hand, AFIS employs its intelligent adaptive ability with the arrival of new data. Forecast errors in terms of the other performance metrics, shown in Table 9, also demonstrate the superiority of AFIS compared to the other models, even when the length of the training data is small.
Table 10 summarizes the characteristics that are required for a perfect online adaptive system. AFIS has all the characteristics, while DENFIS satisfies three criteria in the list and the repetitively trained ARIMA satisfies only two criteria. Moreover,
Table 7
Execution time comparison between AFIS and the repetitively trained ARIMA (the first 1000 data instances used for building the initial model and the remaining data for testing; this experiment was executed 10 times for each of the stocks and the average performance along with performance variation is reported here).

Stock name   Length of      Length of   AFIS                                      Repetitively trained ARIMA                Speedup per
             training data  test data   Initial build (s)   Predict+adapt (s)     Initial build (s)   Predict+adapt (s)     data prediction
                                        mean±std            mean±std              mean±std            mean±std
DJI          1000           3216        3.209±0.3512        14.0915±0.4251        2.68±0.1312         137.8300±3.8289       9.79
NASDAQ       1000           1007        3.315±0.1324        4.8941±0.3393         2.63±0.3532         25.7742±0.6556        5.27
S&P 500      1000           2042        2.998±0.5121        9.0808±0.4734         2.58±0.4512         116.9670±1.8735       12.87
FTSE 100     1000           321         3.126±0.4142        1.6265±0.3381         2.71±0.4417         7.5625±0.1253         4.65
DAX          890            84          3.391±0.3531        0.4156±0.2332         2.25±0.3931         2.6600±0.1867         6.40
AORD         1000           303         3.326±0.1367        1.5275±0.2993         2.55±0.6671         6.0240±0.08514        3.94
NIKKEI       1000           324         3.253±0.4851        1.6441±0.8474         2.57±0.3985         6.2243±0.6285         3.79
Table 8
Execution time to generate a prediction.

Stock name   Time to predict the next data (ms)        Speedup
             AFIS    Repetitively trained ARIMA
DJI          4.38    42.86                             9.79
NASDAQ       4.86    25.60                             5.27
S&P 500      4.45    57.28                             12.87
FTSE 100     5.07    23.56                             4.65
DAX          4.95    31.67                             6.40
AORD         5.04    19.88                             3.94
NIKKEI       5.07    19.21                             3.79
[Figure 7: time series plot, y-axis in million kilowatt hours.]
Fig. 7. Forecast values vs. actual values, where forecasts are computed using AFIS, the HMM–Fuzzy model, Chiu's fuzzy model and DENFIS, for the monthly electricity production in Australia. (Training data: Jan 1956–Oct 1964; test data: Nov 1964–Aug 1995.)
Table 9
Performance comparison among AFIS, HMM–Fuzzy, DENFIS and Chiu's model (by varying the length of training data: 100 and 200) for the monthly electricity production in Australia (million kilowatt hours, Jan 1956–Aug 1995).

Training data         Test data             AFIS             HMM–Fuzzy         Chiu's model      DENFIS
From      To          From      To          NRMSE   MAPE     NRMSE   MAPE      NRMSE   MAPE      NRMSE   MAPE
Jan 1956  Oct 1964    Nov 1964  Aug 1995    0.0686  7.5400   0.4898  38.0055   0.2958  19.0174   0.4686  50.1938
Jan 1956  Feb 1973    Mar 1973  Aug 1995    0.0507  4.5667   0.0610  5.1254    0.0611  5.1157    0.4180  32.5049
Table 10
Comparison of adaptive online learning systems based on the desired characteristics (✓ = yes, × = no).

Desired characteristics                                                           AFIS   Rep. trained ARIMA   DENFIS
Can it capture any new information as it becomes available?                       ✓      ✓                    ✓
Does the system represent the overall knowledge about the problem without
memorizing a large amount of representative data?                                 ✓      ×                    ✓
Is the system able to update, in real time, any knowledge observed in recent
data that was not previously considered during building of the initial system,
and thereby avoid rebuilding a new system when there is no change in the model?   ✓      ×                    ✓
Is the performance of the system significantly better than a static system?       ✓      ✓                    ×
the performance of DENFIS is much worse than that of AFIS on all the seven stock indices, as demonstrated in our experiment.

6. Conclusion

In this paper, a new adaptive fuzzy inference system (AFIS) has been proposed and developed with a view to achieving improved performance by dynamically adapting to the arrival of new data. The structure of the base fuzzy model of our proposed system is developed based on the HMM log-likelihood of each data vector. The rationale of using an HMM is to model the underlying system and use this HMM to rank the data vectors accordingly. Fuzzy rules are then generated after grouping the data vectors that have higher log-likelihood values than the others. These initially generated rules are adjusted dynamically every time new data is observed. Due to the intelligent adaptation mechanism in AFIS, it performs better than other existing competing models (both static fuzzy models and dynamic models, i.e., static ARIMA, HMM–Fuzzy, Chiu's model, DENFIS and the repetitively trained ARIMA). Such a dynamic adaptive fuzzy inference system has many potential applications in computer science, financial engineering and control engineering, where the value of an event continuously changes with time.

Appendix A. Randomly Partitioned Fuzzy Rule Generation (RPFRG)
In this approach to fuzzy rule generation, we assume that the number of fuzzy rules to be generated for a given data set is known to the user prior to building the fuzzy model. Let the number of fuzzy rules be k. The training input (predictor) data is partitioned into k groups, where each data vector belongs to exactly one of the k groups. Each data vector is put into one of the k groups randomly. Consider a set of input predictor data vectors X1 to X33 and a value of k of 3. For each data vector Xi, a random value in the range 1–3 is generated, and the data vector is assigned to the corresponding group. Assume that a random value of 3 is generated for X1 and 2 for X2; then X1 and X2 are assigned to Group 3 and Group 2, respectively. The partitioning of the input data vectors is as follows:

Group 1: X3 X4 X7 X9 X10 X16 X20 X29 X33
Group 2: X2 X5 X6 X8 X11 X14 X17 X19 X23 X24 X28 X31
Group 3: X1 X12 X13 X15 X18 X21 X22 X25 X26 X27 X30 X32

Since data are partitioned into k groups by choosing their group labels using randomly generated numbers, we call this partitioning randomly partitioned. Once the data have been partitioned into k groups, an individual fuzzy rule is generated for each group of data following Section 3.1.2. This approach is referred to as randomly partitioned fuzzy rule generation (RPFRG).
Appendix B. Adaptive Randomly Partitioned Fuzzy Rule Generation (ARPFRG)

In this approach, the fuzzy model generated in Appendix A is transformed into an adaptive system where the rule structure is adapted with the arrival of a new input data vector x_new. As soon as a data vector x_new is available, the system either chooses a fuzzy rule among the existing k fuzzy rules generated in Appendix A and adapts that rule, or generates a new fuzzy rule in the RPFRG fuzzy model. Here, the rule that needs to be adapted is chosen by generating a random number between 1 and k, where k is the total number of fuzzy rules in the RPFRG fuzzy model.
Let us consider that the new input data vector is ⟨5.9, 1.2⟩ while the desired output for this input is 0.9. In the process of adapting the RPFRG fuzzy model, an integer value in the range 1–3 (since we consider k = 3) is generated. Let us assume that the randomly generated integer value is 2. Thus, Rule 2 is chosen to be adapted given the new input data vector. Note that the parameters of Rule 2 before adaptation are as follows:

M2,1: F = 4.8633 and s = 2.3628
M2,2: F = 3.1875 and s = 1.2521
Now, following Eqs. (15)–(17), the new parameters (i.e., after adapting the rule given the new input data vector) are as follows:

M2,1: F* = 4.9431 and s* = 2.2804
M2,2: F* = 3.0346 and s* = 1.3195
The effect of adaptation on the fuzzy membership functions of Rule 2 will be similar to the one illustrated in Fig. 3. In a similar way, the new values of C* and b* are computed following Eq. (15). Once the rule is adapted, the resultant fuzzy model becomes more suitable for forecasting new data compared to the non-adaptive static system (e.g., RPFRG). This approach is referred to as adaptive randomly partitioned fuzzy rule generation (ARPFRG).
Appendix C. Fuzzy rule generation using k-means clustering

In this approach, a k-means clustering algorithm [41] is applied to partition the input data, assuming that the number of fuzzy rules (k) to be generated for the given data is known to the user prior to building the fuzzy model. Once the data are partitioned into k clusters, a total of k fuzzy rules are generated following Section 3.1.2, where each rule corresponds to one cluster. We refer to this approach as the k-means fuzzy model.
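The cluster-based rule generation, together with the nearest-center rule selection used by its adaptive version (Appendix D), might be sketched as below. The per-cluster mean/standard-deviation parameterization of a rule is an assumed stand-in for the procedure of Section 3.1.2:

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain Lloyd's k-means (an illustrative stand-in for [41])."""
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dist.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

def rules_from_clusters(X, labels, k):
    """One fuzzy rule per cluster: centers = cluster mean, steepness =
    per-feature standard deviation (an assumed parameterization)."""
    X = np.asarray(X, dtype=float)
    return [{"centers": X[labels == j].mean(axis=0),
             "steepness": X[labels == j].std(axis=0)} for j in range(k)]

def nearest_rule(x_new, cluster_centers):
    """Adaptive version: adapt the rule whose cluster center is closest
    to x_new in Euclidean distance."""
    d = np.linalg.norm(np.asarray(cluster_centers, dtype=float)
                       - np.asarray(x_new, dtype=float), axis=1)
    return int(d.argmin()) + 1   # rules numbered from 1

print(nearest_rule([5.5, 4.5], [[1.0, 1.0], [5.0, 5.0], [9.0, 9.0]]))  # -> 2
```

The selected rule's parameters would then be adapted via Eqs. (15)–(17), as in the worked example of Appendix D.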
Appendix D. Adaptive k-means fuzzy model

In this approach, the k-means fuzzy model is made adaptive by coupling the k-means partitioning with the dynamic adaptive fuzzification described in Section 3.2.2. In the process of adaptation, to choose the fuzzy rule that needs to be adapted given a new data vector x_new, the minimum Euclidean distance between the cluster centers and x_new is used. Consider three rules generated from three clusters. The distances from x_new to each of the cluster centers are computed. Assume these distances are 2.3, 0.9 and 4.5 from the center points of cluster 1, cluster 2 and cluster 3, respectively. Since 0.9 is the minimum distance, Rule 2 is selected to have its parameters adapted. Adaptation of the rule parameters is accomplished following Eqs. (15)–(17). We refer to this approach as the adaptive k-means fuzzy model.

References
[1] Liang Xun, Rong-Chang Chen, Jian Yang, An architecture-adaptive neural network online control system, Neural Comput. Appl. 17 (4) (2008) 413–423.
[2] A. Robins, Sequential learning in neural networks: a review and a discussion of pseudorehearsal based methods, Intell. Data Anal. 8 (3) (2004) 301–322.
[3] Y. Zhi-Gang, S. Shen-Min, D. Guang-Ren, R. Pei, Robust adaptive neural networks with an online learning technique for robot control, in: Advances in Neural Networks (Part I–III: ISNN 2006: Third International Symposium on Neural Networks), 2006, pp. 1153–1159.
[4] J.A.S. Freeman, D. Saad, Online learning in radial basis function networks, Neural Comput. 9 (7) (1997) 1601–1622.
[5] R.M. French, Semi-destructive representations and catastrophic forgetting in connectionist networks, Connection Sci. 1 (1992) 365–377.
[6] T.M. Heskes, B. Kappen, On-line learning processes in artificial neural networks, in: J. Taylor (Ed.), Mathematical Approaches to Neural Networks, Elsevier, Amsterdam, 1993, pp. 199–233.
[7] G.A. Rummery, M. Niranjan, On-line Q-learning using Connectionist Systems, Technical Report CUED/F-INFENG/TR166, Cambridge University Engineering Department, 1994.
[8] J.S.R. Jang, ANFIS: adaptive-network-based fuzzy inference system, IEEE Trans. Syst. Man Cybern. 23 (1993) 651–663.
[9] N. Kasabov, Evolving fuzzy neural networks: algorithms, applications and biological motivation, in: Methodologies for the Conception, Design and Application of Soft Computing, World Scientific, Singapore, 1998.
[10] N. Kasabov, Evolving fuzzy neural networks: theory and applications for on-line adaptive prediction, decision making and control, Australian Journal of Intelligent Information Processing Systems (1998) 154–160.
[11] N. Kasabov, Evolving fuzzy neural networks for online, adaptive, knowledge-based learning, IEEE Trans. Syst. Man Cybern. B 31 (2001) 902–918.
[12] N.K. Kasabov, Q. Song, DENFIS: dynamic evolving neural-fuzzy inference system and its application for time series prediction, IEEE Trans. Fuzzy Syst. 10 (2002) 144–154.
[13] M.M. Deza, E. Deza, Encyclopedia of Distances, Springer, Berlin, Heidelberg, 2009.
[14] Similarity search and outlier detection in time series, /IT/Similarity-Search-and-Outlier-Detection-in-Time-Series-4480.html.
[15] K.M. Sim, Evolving fuzzy rules for relaxed-criteria negotiation, IEEE Trans. Syst. Man Cybern. B 38 (6) (2008) 1486–1499.
[16] Y. Zhou, M.J. Er, An evolutionary approach toward dynamic self-generated fuzzy inference systems, IEEE Trans. Syst. Man Cybern. B 38 (4) (2008) 963–969.
[17] A. Elmzabi, M. Bellafkih, M. Ramdani, An adaptive fuzzy clustering approach for the network management, Int. J. Inf. Technol. 1 (3) (2007) 12–17.
[18] M.R. Hassan, B. Nath, M. Kirley, A fusion model of HMM, ANN and GA for stock market forecasting, Expert Syst. Appl. 31 (1) (2007) 171–180.
[19] M.R. Hassan, Hybrid HMM and Soft Computing Modeling with Applications to Time Series Analysis, Ph.D. Thesis, Department of Computer Science and Software Engineering, The University of Melbourne, 2007.
[20] L.R. Rabiner, A tutorial on Hidden Markov Models and selected applications in speech recognition, Proc. IEEE 77 (1989) 257–286.
[21] M.R. Hassan, A combination of HMM and fuzzy model for stock market forecasting, Neurocomputing 72 (16–18) (2009) 3439–3446.
[22] M.R. Hassan, B. Nath, M. Kirley, A HMM based fuzzy model for time series prediction, in: Proceedings of the FUZZ-IEEE Conference, 2006, pp. 9966–9974.
[23] M.R. Hassan, B. Nath, M. Kirley, J. Kamruzzaman, A hybrid of multiobjective evolutionary algorithm and HMM–Fuzzy model for time series prediction, Neurocomputing 81 (2012) 1–11.
[24] M. Mannle, Identifying rule-based TSK fuzzy models, in: Proceedings of EUFIT, 1999, pp. 286–299.
[25] H. Bahi, M. Sellami, Combination of vector quantization and Hidden Markov Models for Arabic speech recognition, in: ACS/IEEE Proceedings of the International Conference on Computer Systems and Applications, 2001, p. 0096.
[26] X. Huang, Y. Ariki, M. Jack, Hidden Markov Models for Speech Recognition, Edinburgh University Press, 1990.
[27] L.E. Baum, T. Petrie, G. Soules, N. Weiss, A maximization technique occurring in the statistical analysis of probabilistic functions of Markov chains, Ann. Math. Stat. 41 (1970) 164–171.
[28] L.E. Baum, An inequality and associated maximization technique in statistical estimation of probabilistic functions of Markov processes, Inequalities 3 (1972) 1–8.
[29] S.-M. Chen, S.-H. Lee, A new method for generating fuzzy rules from numerical data for handling classification problems, Appl. Artif. Intell. (2001) 645–664.
[30] P.P. Angelov, R.A. Buswell, Automatic generation of fuzzy rule-based models from data by genetic algorithms, Inf. Sci. (2003) 17–31.
[31] X.Z. Wang, Y.D. Wang, X.F. Xu, W.D. Ling, D.S. Yeung, A new approach to fuzzy rule generation: fuzzy extension matrix, Fuzzy Sets Syst. (2001) 291–306.
[32] M.R. Hassan, B. Nath, M. Kirley, A data clustering algorithm based on single Hidden Markov Model, in: Proceedings of the International Multiconference on Computer Science and Information Technology, 2006, pp. 57–66.
[33] M. Ragulskis, K. Lukoseviciute, Non-uniform attractor embedding for time series forecasting by fuzzy inference systems, Neurocomputing 72 (2009) 2618–2626.
[34] T. Takagi, M. Sugeno, Fuzzy identification of systems and its application to modeling and control, IEEE Trans. Syst. Man Cybern. (1985) 116–132.
[35] J. Zurada, Optimal Data Driven Rule Extraction using Adaptive Fuzzy-Neural Models, Ph.D. Dissertation, University of Louisville, 2002.
[36] A.E. Gaweda, J.M. Zurada, Data-driven linguistic modeling using relational fuzzy rules, IEEE Trans. Fuzzy Syst. 11 (2003) 121–134.
[37] G.C. Goodwin, K.S. Sin, Adaptive Filtering Prediction and Control, Prentice-Hall, Upper Saddle River, NJ, 1984.
[38] Yahoo Finance.
[39] S.L. Chiu, An efficient method for extracting fuzzy classification rules from high dimensional data, J. Adv. Comput. Intell. 1 (1997) 1–7.
[40] J.S.R. Jang, C.T. Sun, E. Mizutani, Neuro-Fuzzy and Soft Computing, Prentice Hall, Englewood Cliffs, NJ, 1997.
[41] J.A. Hartigan, M.A. Wong, Algorithm AS 136: a k-means clustering algorithm, Appl. Stat. 28 (1979) 100–108.
[42] D. Rumelhart, J. McClelland, Parallel Distributed Processing, MIT Press, 1986.
[43] M.R. Hassan, B. Nath, Stock market forecasting using Hidden Markov Model: a new approach, in: Proceedings of the International Conference on Intelligent System Design and Application, 2005, pp. 192–196.
[44] G. Atsalakis, K. Valavanis, Surveying stock market forecasting techniques, part II: soft computing methods, Expert Syst. Appl. 36 (3) (2009) 5932–5941.
[45] J. Kamruzzaman, R. Sarker, Forecasting of currency exchange rates using ANN: a case study, in: International Conference on Neural Networks and Signal Processing, 2003, pp. 793–797.
[46] Monthly electricity data, Time Series Data Library (TSDL), /data/set/22l0/monthly-electricity-production-in-australia-million-kilowatt-hours-jan-1956-aug-1995.
Md. Rafiul Hassan received a B.Sc. (Engg.) in Electronics and Computer Science from Shahjalal University of Science and Technology, Bangladesh, and a Ph.D. in Computer Science and Software Engineering from the University of Melbourne, Australia, in 2000 and 2007, respectively. Currently, he is a faculty member in the Department of Information and Computer Science, King Fahd University of Petroleum and Minerals, Saudi Arabia. His research interests include neural networks, fuzzy logic, evolutionary algorithms, Hidden Markov Models and support vector machines, with a particular focus on developing new data mining and machine learning techniques for the analysis and classification of biomedical data. He is currently involved in several research and development projects for effective prognosis and diagnosis of breast cancer from gene expression microarray data. He is the author of around 30 papers published in recognized international journals and conference proceedings. He is a member of the Melbourne University breast cancer research group, the Australian Society of Operations Research (ASOR), and the IEEE Computer Society, and is involved in several program committees of international conferences. He also serves as a reviewer for renowned journals such as BMC Breast Cancer, IEEE Transactions on Fuzzy Systems, Neurocomputing, Knowledge and Information Systems, Current Bioinformatics, Information Science, Digital Signal Processing, IEEE Transactions on Industrial Electronics and Computer Communications.
Kotagiri Ramamohanarao received the B.E. degree from Andhra University in 1972, the M.E. degree from the Indian Institute of Science in 1974, and the Ph.D. degree from Monash University in 1980. He joined the Department of Computer Science and Software Engineering at the University of Melbourne in 1980, was awarded the Alexander von Humboldt Fellowship in 1983, and was appointed a professor of computer science in 1989. He held several senior positions, such as head of the School of Electrical Engineering and Computer Science at the University of Melbourne, codirector of the Key Centre for Knowledge-Based Systems, and research director for the Cooperative Research Centre for Intelligent Decision Systems. He served as a member of the Australian Research Council Information Technology Panel. He also served on the editorial boards of the IEEE Transactions on Knowledge and Data Engineering, Computer Journal and the VLDB Journal. At present, he is also on the editorial boards of Universal Computer Science and the Journal of Knowledge and Information Systems. He served as a program committee member of several international conferences, including SIGMOD, IEEE ICDM, VLDB, ICLP, and ICDE. He was a program cochair of the VLDB, PAKDD, and DOOD conferences. He is a steering committee member of IEEE ICDM, DASFAA, and PAKDD. He is a fellow of the Institute of Engineers Australia, the Australian Academy of Technological Sciences and Engineering and the Australian Academy of Science. He is a recipient of the Centenary Medal for his contribution to computer science. He has published more than 200 research papers. His research interests are in the areas of database systems, logic-based systems, agent-oriented systems, information retrieval, data mining, and machine learning. He is currently working as a Professor at the University of Melbourne.

Mustafizur Rahman received his Ph.D. in Computer Science and Software Engineering from the University of Melbourne in August 2010. He completed a Graduate Certificate in Research Commercialization in 2009 from Melbourne Business School and a B.Sc. in Computer Science and Engineering in 2004 from Bangladesh University of Engineering and Technology (BUET). His research interests include scientific and business workflow management, scheduling in Grids and P2P systems, Cloud computing and autonomic systems. Mustafiz has contributed to the Gridbus Workflow Engine that facilitates users to execute scientific workflow applications on Grids. Mustafizur Rahman is currently working as a Consultant of Business Analytics and Optimization Service.
JoarderKamruzzamanreceivedaB.Sc.andM.Sc.inelectricalengineeringfromBangladeshUniversityofEngineering&Technology,Dhaka,Bangladeshin1986and1989respectively,andaPh.D.ininformationsystemengineeringfromMuroranInstituteofTech-nology,Japan,in1993.Currently,heisafacultymemberintheFacultyofInformationTechnology,MonashUniversity,Australia.Hisresearchinterestincludescomputernetworks,computationalintelli-gence,andbioinformatics.Hehaspublishedover150peer-reviewedpublicationswhichinclude40journalpapersand6bookchapters,andeditedtworeferencebooksoncomputationalintelligencetheoryandappli-cations.Heiscurrentlyservingasaprogramcommitteememberofanumberofinternational
conferences.
LineatIBM
Australia.
M. Maruf Hossain received the B.Sc. (Hons) degree from the University of Dhaka, Bangladesh, in 2000, the MIT degree from Deakin University, Australia, in 2005 and a Ph.D. in Computer Science from the University of Melbourne, Australia, in 2009. He is currently working as a senior data analyst at the Australian Transaction Reports and Analysis Centre, Australia. His research interests include data mining, machine learning, receiver operating characteristic curves, and classification.