

Forecasting Financial Time Series with Support Vector Machines Based on Dynamic Kernels

Johannes Mager
Institute of Computer Architectures, University of Passau, Germany
Email: mager@fim.uni-passau.de

Ulrich Paasche
Neural Research Center Munich GmbH, Munich, Germany
Email: ulrich.paasche@nrcm.de

Bernhard Sick
Institute of Computer Architectures, University of Passau, Germany
Email: sick@fim.uni-passau.de

Abstract—The technical analysis of financial time series, and in particular the prediction of future developments, is a challenging problem that has been addressed by many researchers and practitioners due to the possible profit. We provide a forecasting technique based on a certain machine learning paradigm, namely support vector machines (SVM). SVM have gained more and more importance for practical applications in recent years, as they have excellent generalization abilities due to the principle of structural risk minimization. However, standard kernel functions for SVM are not able to compare time series of variable length appropriately, i.e., when we assume that these time series must be scaled in a non-linear way. Therefore, we use the dynamic time warping (DTW) technique as a kernel function. We demonstrate for two financial time series (FDAX and FGBL futures) that excellent results can be obtained with this approach.

I. INTRODUCTION

The prediction of future stock market developments is a problem that has been attracting the attention of both practitioners and researchers for many decades. It can easily be seen that there are certain recurring patterns in the history of market prices, and there are various approaches for classifying them [1], [2]. A much harder task, however, is to recognize such patterns in the constantly evolving financial markets early and with sufficient reliability. Even worse: it is still heavily disputed whether chart patterns allow for a prediction of certain future events at all.

In this article, we propose a machine learning technique for forecasting financial time series which relies on the popular technique of support vector machines (SVM). Using a large historical set of real-world financial time series, we examine the performance of different variants and parameter settings. To further increase the prediction accuracy on time series, the standard kernels of the SVM are replaced by special dynamic kernel functions, which are adapted to analyzing temporal data. We will show that the utilization of these kernels results in a significantly better accuracy and that it becomes possible to outperform the market's overall development.

With the integration of this technique into a framework for technical analysis, Investox, it is also possible to evaluate the performance using a virtual trading agent on historical data and to use the system on "live" data feeds.

The article is organized as follows: In Section II, we provide a short insight into the principles of technical analysis and financial market data and discuss some related work. In Section III, SVM and dynamic kernel functions are introduced. Section IV follows with the experiments: We first explain the rationale behind the constructed data sets and set out the utilized error measures. Thereafter, the results of our experiments are documented. Finally, Section V summarizes the major insights and gives an outlook on future research.

II. FINANCIAL ANALYSIS

A. Principles of Technical Analysis

The analysis of financial markets can be divided into two big fields: Whereas fundamental analysis tries to analyze all economic factors of a company or a market in order to calculate the true value of a commercial paper, technical analysts assume that all important information for the paper's future development is already contained in its past behavior [3]. Therefore, future movements can be anticipated by thoroughly analyzing the stock's history and its inherent patterns [4]. While the principles of some of the techniques utilized for technical analysis date back to the 18th century, their validity has permanently been disputed. Most prominently, the efficient market hypothesis [5] states that financial markets are informationally efficient and, therefore, all past information is already contained in each stock's last value. As a result, it is claimed that techniques for technical analysis cannot perform better than a random walk on the chart or the overall development of the market. Despite all objections, it has still not been possible to prove the invalidity of technical analysis, and its techniques are gaining popularity among both investors and researchers.

B. Characteristics of Financial Market Data

The financial instruments used for our work are two futures, derivative instruments traded at the European derivatives exchange Eurex [6]. A futures contract gives the holder the obligation to buy (long position) or sell (short position) a specified underlying asset at a distinct date in the future and at a pre-specified price. This duality gives the trader the possibility to benefit from rising as well as from falling market prices [7].

As every transaction in a market varies the ratio of supply and demand, market prices can change in very small and irregular intervals. To facilitate analysis, the data is compressed into intervals of a certain size. Consequently, it is possible not only to identify a single price for each interval, but to extract additional information: the open, high, low, and close prices for this interval, called OHLC data (see Fig. 1).

Fig. 1. On the left side we see the market rate of a certain equity during one day. On the right side, these data have been compressed and depicted using the so-called candlestick layout: The upper and lower shadows mark the day's highest and lowest traded prices, whereas the body of the candle spans from the open to the close price. The color of the body illustrates the equity's development during the day: If the price went up, the body is white, and black otherwise.
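To make the compression step concrete, the following sketch builds daily OHLC bars from irregularly spaced trades. It is an illustration only, not the authors' tooling: the synthetic tick series and the use of pandas are assumptions.

```python
import numpy as np
import pandas as pd

# Hypothetical, irregularly spaced intraday trades for one instrument.
rng = np.random.default_rng(0)
times = pd.to_datetime("2007-01-02 09:00") + pd.to_timedelta(
    np.sort(rng.integers(0, 8 * 3600, size=500)), unit="s")
prices = pd.Series(100 + rng.normal(0, 0.05, size=500).cumsum(), index=times)

# Compress the irregular ticks into fixed intervals (here: one day),
# extracting the open, high, low, and close price of each interval.
ohlc = prices.resample("1D").ohlc().dropna()
print(ohlc)
```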

C. Related Research in the Field of Technical Analysis

Over the last 15 years, there has been a vast amount of scientific investigations into using machine learning methods for technical analysis.

[8], for example, use a backpropagation neural network (multilayer perceptron) with one hidden layer to predict the daily close prices of the stock index S&P 500 and compare the results to an ARIMA model. They show that although the neural network has a higher tolerance to market fluctuations, its output is too volatile to indicate long-term trends. A better suited approach is described in [9], which utilizes recurrent Elman neural networks [10] for forecasting foreign exchange prices. It is combined with a mechanism to automatically choose and optimize the network's parameters. As a result, it is highlighted that the forecasts do not differ as much between different models as between different input data. For only two out of five exchange rates (JPY/USD and GBP/USD), reliable predictions are possible, whereas for the other rates the prediction accuracy is similar to that of a naive forecast.

[11] uses a modified SVM model for regression with the (static) Gaussian kernel. By adjusting the regularization constant C with a weight function, recent errors are more heavily penalized than distant errors, thus increasing the influence of the most recent stock prices. In addition to that, [12] adds a similar weight function to the threshold ε, which limits the tolerance of Vapnik's ε-insensitive error function [13]. This approach helps to further reduce the complexity of the built model and the number of support vectors. Further emphasizing the need for thorough data preparation, [14] uses support vector classification combined with a variety of different pre-processing methods. As kernel functions, they use the polynomial kernel in addition to the Gaussian kernel function. Compared to a backpropagation network, the Gaussian version heavily increases the measured prediction accuracy.

Although all these articles were able to present some success in their experiments, the major flaw is obvious: With a static kernel function it is only possible to incorporate a certain (limited) amount of information about the chart's history. The inherent temporal structure of the data cannot be analyzed appropriately, leading to relatively poor and unstable prediction results.

III. SUPPORT VECTOR MACHINES WITH DYNAMIC KERNEL FUNCTIONS

A. Fundamentals of Support Vector Machines

In this article, cost-sensitive support vector machines (C-SVM) and ν-SVM are used to classify the time series, using characteristic attributes extracted from the time series as inputs. Basically, SVM use a hyperplane to separate two classes [15]–[18]. For classification problems that cannot be linearly separated in the input space, SVM find a solution using a non-linear mapping from the original input space into a high-dimensional so-called feature space, where an optimally separating hyperplane is searched. A hyperplane is called optimal if it has a maximal margin, where margin means the minimal distance from the separating hyperplane to the closest (mapped) data points, the so-called support vectors. The transformation is usually realized by non-linear kernel functions. C-SVM and ν-SVM both allow, but also minimize, misclassification.
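As a small illustration of the two soft-margin formulations mentioned above (not the authors' setup), the following sketch fits a C-SVM and a ν-SVM on a toy two-class problem using scikit-learn, which wraps LibSVM; the toy data and all parameter values are assumptions.

```python
import numpy as np
from sklearn.svm import SVC, NuSVC

# Toy two-class problem: the classes overlap, so both formulations
# trade off margin width against misclassification.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(+1, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

c_svc = SVC(kernel="rbf", C=10.0).fit(X, y)      # C-SVM: misclassification penalty C
nu_svc = NuSVC(kernel="rbf", nu=0.2).fit(X, y)   # nu-SVM: nu bounds the fraction
                                                 # of margin errors / support vectors
print(len(c_svc.support_), len(nu_svc.support_)) # number of support vectors
```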

Compared to the popular artificial neural networks, SVM have several key advantages: By describing the problem as a convex quadratic optimization problem, they are ensured to converge to a unique global optimum instead of only a possibly local optimum. Additionally, by minimizing the structural risk of misclassification, SVM are far less vulnerable to overfitting, one of the major drawbacks of standard neural networks.

B. Related Work in the Field of Dynamic Kernel Functions

An overview and comparison of methods for time series classification with SVM can be found in [19] or [20], for instance. One common method for classifying time series with SVM is to use one of the default static kernels (i.e., polynomial or Gaussian). This has successfully been done for speaker verification [21], phonetic classification [22], or instrument classification [23]. A big disadvantage of this approach is that static kernels are unable to deal with data of different length. Therefore, it is necessary to re-sample the time series to a common length, or to extract a fixed number of features, before static kernels can be applied. It is obvious that the re-sampling or the reduction to some extracted features induces a loss of information and is not very well suited to deal with time series of variable length, where a linear function for re-scaling is not applicable. A more sophisticated approach is to use methods that directly compare the data points of two time series in a more flexible way, for example with tangent distance [24], time alignment [25]–[27], or dynamic time warping kernels [28]. Also probabilistic models, such as HMM (hidden Markov models) and GMM (Gaussian mixture models), that are trained on the time series data can be used in combination with SVM. The so-called Fisher kernels have been widely used, e.g., for speech recognition [29], [30], speaker identification [31]–[33], or web audio classification [34]. [35], [36] used another similarity measure on GMM, the Kullback-Leibler divergence, for speaker identification and verification.

Altogether, we can state that dynamic kernel functions [20] incorporate temporal information directly into a support vector machine's kernel and use it for calculating the similarity between different input time series. Therefore, it becomes possible to also detect similarities between misaligned sequences or between sequences in which the contained patterns occur with a varying frequency.

C. Dynamic Time Warping as Kernel Function for SVM

In our work, we used a kernel based on the dynamic time warping (DTW) method, which has previously been utilized for handwriting and speech recognition in [27], [28]. We also rely on our own results described in [37].

Fig. 2. Example of the results obtained with DTW: The correspondence of points of two similar time series (one is drawn with a constant offset here) is indicated by connecting lines.

The DTW kernel takes two input time series and calculates their similarity by determining an optimal so-called warping path consisting of pairs of their respective points. Each point of one series is assigned to one or more points of the other series, obeying three constraints:

• The first and the last points of both series are assigned to each other.
• All assignments respect the series' temporal order.
• Every point of both series belongs to at least one assignment.

The warping path with the minimum sum of distances in its assignments is chosen as the optimal warping path. Other dynamic kernels, such as the longest common subsequence (LCSS) kernel we presented and investigated in [37], follow a similar approach.
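The following sketch implements the classical DTW distance under the three constraints listed above and turns it into a similarity value. The exponential (Gaussian-of-distance) construction and the parameter gamma are assumptions; the exact kernel used by the authors follows [28], [37] and is not reproduced here.

```python
import numpy as np

def dtw_distance(a, b):
    """Classical DTW: minimum summed point-wise distance over all warping
    paths that start/end at the first/last points, respect temporal order,
    and assign every point of both series to at least one pair."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def dtw_kernel(a, b, gamma=0.1):
    # One common way to turn the DTW distance into a similarity: a Gaussian
    # of the distance. This particular construction is an assumption (and is
    # not positive definite in general); gamma would have to be tuned.
    return np.exp(-gamma * dtw_distance(a, b))

s1 = np.sin(np.linspace(0, 3, 40))
s2 = np.sin(np.linspace(0, 3, 55)) + 0.1   # similar shape, different length
print(dtw_kernel(s1, s2))
```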

IV. TESTS AND EXPERIMENTS

A. Preparations and Data Set Construction

For our work, we used the SVM routines from the software package LibSVM [38]. The implementation of the dynamic kernel functions follows [37].

To compare the forecasting accuracy of the different models, a variety of different measures are used in the literature. However, [39] and [40] show that all of the popular measures are either not invariant to scaling or contain undefined intervals. Therefore, we used the mean absolute scaled error (MASE) as proposed by [40], which scales the measured error using the mean absolute error of a naive forecast (also called random walk). This forecasting technique simply assumes that the result for the next pattern equals the previous result.

If Y_t denotes the observation at time t ∈ {1, ..., n} and F_t is the forecast, we call e_t = Y_t − F_t the forecast error. The mean absolute scaled error is defined as the arithmetic mean of the forecast errors scaled by the average error of a random walk:

\[
\mathrm{MASE} = \operatorname{mean}\!\left( \frac{|e_t|}{\frac{1}{n-1}\sum_{i=2}^{n}\left|Y_i - Y_{i-1}\right|} \right). \tag{1}
\]

Consequently, a MASE smaller than 1.0 indicates that the forecasting method performs better than a naive forecast. Applied to the domain of technical analysis, we can see that constant MASE values smaller than 1.0 contradict the efficient market theory. Additionally, we specified the hit rate HITS of all forecasts, which simply is the percentage of correctly predicted trends in the chart:

\[
\mathrm{HITS} = \frac{\bigl|\{\, F_i \mid (Y_i - Y_{i-1})\cdot(F_i - F_{i-1}) > 0,\; i = 1,\ldots,n \,\}\bigr|}{n}. \tag{2}
\]
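A minimal sketch of the two error measures, assuming plain NumPy arrays of observations Y and forecasts F; only Eqs. (1) and (2) are implemented, not the authors' evaluation pipeline.

```python
import numpy as np

def mase(y, f):
    """Mean absolute scaled error, Eq. (1): forecast errors scaled by the
    in-sample mean absolute error of the naive (random walk) forecast."""
    y, f = np.asarray(y, float), np.asarray(f, float)
    naive_mae = np.mean(np.abs(np.diff(y)))      # mean |Y_i - Y_{i-1}|
    return np.mean(np.abs(y - f)) / naive_mae

def hit_rate(y, f):
    """Hit rate, Eq. (2): fraction of forecasts whose predicted trend
    (sign of the change) matches the observed trend."""
    y, f = np.asarray(y, float), np.asarray(f, float)
    hits = (np.diff(y) * np.diff(f)) > 0
    return hits.sum() / len(y)

y = [10.0, 10.5, 10.2, 10.8, 11.0]
f = [10.1, 10.4, 10.3, 10.6, 11.1]
print(mase(y, f), hit_rate(y, f))
```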

Fig. 3. In the diagram, we see how the history of the FDAX was divided into six different, overlapping series with a size of 1000 days each. The last 250 values of each part (approximately one year) were used to calculate the prediction accuracy of the developed system on this specific time series. As a result, a maximum number of 750 values was used for training.
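The evaluation scheme of Fig. 3 could be sketched as follows; since the paper does not state the exact offsets of the overlapping windows, the equal spacing used here (and the synthetic price history) are assumptions.

```python
import numpy as np

def overlapping_windows(series, window=1000, test=250, n_windows=6):
    """Split a price history into overlapping windows of `window` days,
    using the last `test` days of each window for evaluation (cf. Fig. 3).
    The spacing of the windows is an assumption."""
    series = np.asarray(series, float)
    step = (len(series) - window) // max(n_windows - 1, 1)
    for k in range(n_windows):
        chunk = series[k * step : k * step + window]
        yield chunk[:-test], chunk[-test:]       # (training part, test part)

history = np.cumsum(np.random.default_rng(2).normal(0, 1, 3000)) + 5000
for train, test in overlapping_windows(history):
    print(len(train), len(test))                 # 750 250
```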

For our experiments, we decided to use two popular futures: the FDAX future on the stock index DAX and the FGBL future on German government bonds. As all futures prices have a pre-defined end date and, therefore, contain periodic behavior and points of discontinuity, the data was manually adjusted. To minimize the impact of temporary anomalies, we decided to verify our results piecewise on the entire history of the two charts by dividing them into a total of 20 different time series of daily values (see Fig. 3). For all experiments, a daily compression of the data was used.

B. Experiment Setup and Results

The overall organization of the conducted experiments was made up of several parts: First of all, we examined the performance of several different input and output series. We then compared different kernel functions and determined their best parameter settings. In the following step, different variants of the SVM technique were compared. Finally, we investigated optimal settings for the total amount and the length of the input series used for training and prediction.

As output data, it is always possible to try to predict the actual close price of the next day. For using the prediction in a trading system, it is more interesting to predict an upcoming trend. This can be done using the rate of change ROC_n for a given period n on a time series Y:

\[
\mathrm{ROC}_n(Y_t) = 100 \cdot \frac{Y_t - Y_{t-n}}{Y_{t-n}}. \tag{3}
\]

Early experiments showed that the forecasting accuracy can be considerably increased using this pre-processing function.
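A minimal sketch of the ROC pre-processing of Eq. (3), together with the stacking of several periods into one multi-dimensional input vector as described in the next paragraph; the synthetic close series is an assumption.

```python
import numpy as np

def roc(y, n):
    """Rate of change over period n, Eq. (3): percentage deviation of the
    current price from the price n steps earlier."""
    y = np.asarray(y, float)
    return 100.0 * (y[n:] - y[:-n]) / y[:-n]

def roc_vector(y, periods=(1, 2, 3, 5, 8)):
    """Stack several ROC series into one multi-dimensional input
    (the 'ROC5' vector described in the next paragraph)."""
    y = np.asarray(y, float)
    longest = max(periods)
    # Align all series on the same (latest) time steps.
    return np.column_stack([roc(y, n)[longest - n:] for n in periods])

closes = np.cumsum(np.random.default_rng(3).normal(0, 1, 100)) + 100
X = roc_vector(closes)
print(X.shape)    # (92, 5)
```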

We conducted extensive tests, where we examined many different input time series and their performance in conjunction with the output series. The best results were achieved using a multidimensional input vector consisting of several rates of change with different periods. This vector incorporates the time series ROC1, ROC2, ROC3, ROC5, and ROC8, and will be denoted ROC5 in the following. As a result, the different values at each time express by which ratio the current price differs from a distinct price in the past. The results of our tests are set out in Table I.

TABLE I
The values show the prediction accuracy of a ν-support vector regression system using the dynamic time warping kernel for different input and output series: While the output series ROC2 and ROC5 describe ROC outputs with different periods, Close–Open denotes the deviation between a day's open and close prices. In contrast to the one-dimensional input series Close, OHLC4 and ROC5 are multi-dimensional inputs, built of the day's four OHLC values or different rates of change. The last row shows the performance of the naive forecasting method. As the error measure MASE is scaled by the error of the naive forecast, it always results in the value 1.

          Output →   ROC2              ROC5              Close–Open
Input ↓              MASE     HITS     MASE     HITS     MASE     HITS
Close                0.9535   0.4990   1.5131   0.4865   0.6439   0.4958
OHLC4                0.9613   0.5001   1.5299   0.4891   0.6454   0.4924
ROC5                 0.7756   0.7604   1.0941   0.8263   0.5364   0.7382
naive                1.0000   0.6727   1.0000   0.8027   1.0000   0.4829

In a second step, we compared the performance of SVM with different kernel functions. For these experiments, three different dynamic kernel functions taken from [41] were used: the dynamic time warping kernel (DTW) as well as the longest common subsequence kernels with global (LCSS global) and local scaling (LCSS local). The DTW kernel was not only considerably faster than its competitors; during the whole training, the LCSS kernels were not once able to outperform the prediction accuracy of the DTW kernel on the test data (see Fig. 4). Additionally, the LCSS kernels appeared to rely on specific attributes (features), whereas the DTW kernel showed good results for all data sets. For the choice of the SVM type, we conducted classification and regression experiments: Apart from the ε-SVR (support vector regression) [42] and the ν-SVR [43], we measured the performance of the C-SVC (support vector classification) [44] and the ν-SVC [43]. Instead of trying to predict actual values, these classification techniques were trained to classify the data into two categories: one for expected increasing (rising) and the other one for expected decreasing (falling) trends. As a result, we saw that the prediction results of the ν-SVR significantly outperformed all other variants, regarding both the MASE (for the regression types) and the hit rate, with the ε-SVR formulation ranking second.
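To illustrate the winning combination (ν-SVR with a DTW kernel), the following sketch feeds a precomputed DTW Gram matrix to scikit-learn's NuSVR, which wraps LibSVM. The authors instead used LibSVM directly with their own dynamic-kernel implementation [37], [38], so the kernel construction, the synthetic data, and all parameter values here are assumptions.

```python
import numpy as np
from sklearn.svm import NuSVR

def dtw_kernel(a, b, gamma=0.1):
    """Gaussian of the DTW distance (same sketch as in Section III-C)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return np.exp(-gamma * D[n, m])

def gram_matrix(rows, cols, gamma=0.1):
    """Pairwise DTW-kernel values between two collections of
    (possibly variable-length) input series."""
    return np.array([[dtw_kernel(a, b, gamma) for b in cols] for a in rows])

# Hypothetical data: each sample is a short window of pre-processed ROC
# values of varying length; the target is a future value (regression).
rng = np.random.default_rng(4)
windows = [rng.normal(0.0, 1.0, rng.integers(20, 30)) for _ in range(60)]
targets = rng.normal(0.0, 1.0, 60)

K_train = gram_matrix(windows[:50], windows[:50])
K_test = gram_matrix(windows[50:], windows[:50])

# nu-SVR with a precomputed (dynamic) kernel; scikit-learn's NuSVR wraps
# LibSVM, which the authors used directly.
model = NuSVR(kernel="precomputed", nu=0.5, C=1.0).fit(K_train, targets[:50])
print(model.predict(K_test)[:3])
```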

Finally, we conducted some experiments in which we varied the total amount and the length of the input time series of the SVM. Confirming the observation of [45], an increase in the amount of input information does not necessarily increase the prediction accuracy. Instead, we can see in Fig. 4 that a smaller amount of current information significantly improves the prediction accuracy compared to a large backlog of historical information.

C. Major Findings

For the input information, we stated that the sheer amount of historical data does not necessarily produce better results. Instead, the main focus should lie on thorough pre-processing routines to capture temporal patterns of different scale. In this regard, the applied technique of creating a multi-dimensional vector with rates of change of different magnitude worked exceptionally well.

Apart from that, our results clearly show the high potential of SVM with dynamic kernel functions in the area of financial time series forecasting. The DTW kernel was able to produce a hit rate of up to 70% over the whole history of both examined derivatives, compared to a hit rate of only 47% for the naive forecast. This is even more relevant as the hit rate directly corresponds to the input of common algorithmic trading systems, which trigger actions with each trend shift.

V. CONCLUSION AND OUTLOOK

In this article, a short introduction to the field of technical analysis of financial time series has been given, and the application of SVM with dynamic kernel functions in this domain has been examined. As we described, the developed technique has a high ability to predict future price movements.

Fig. 4. These graphs show the dependence of the prediction accuracy on the size of the time slot used for prediction and training. The kernel functions used are, from left to right: DTW, LCSS global, and LCSS local. The round marks in the diagrams denote the results with a training set of 75 periods, whereas square and triangular marks show the results for 150 and 300 periods.

Using dynamic kernel functions, it is possible to use a whole range of the preceding series and analyze it as a whole with the SVM's kernel. As we could show, this approach significantly increases the prediction accuracy and reliably performs better than a standard naive forecast.

For real-world experiments and applications of the developed system, an interface to the technical analysis software Investox [46] was created (see Fig. 5). Using this application, it becomes possible not only to verify the results on historical data using a virtual broker, but also to apply the system directly to current data inputs in a constantly evolving market environment (see also [47]).

In our future research, the performance of the developed system will be examined in different trading constellations. In contrast to the work on end-of-day data, the performance of the technique is also high enough to use it in the area of intra-day forecasting. This involves predictions in intervals of only several minutes, if not just seconds. In this environment of high uncertainty and constant trend shifts, very different requirements may apply. On the other hand, it is also possible not only to use the input of one pre-processed time series, but to combine different market prices for predicting a certain value. This kind of inter-market analysis may have the potential to detect fluctuations in a specific price and to rate their resulting influence on the target value at an early stage.

REFERENCES

[1] S. Nison, Japanese Candlestick Charting Techniques: A Contemporary Guide to the Ancient Investment Techniques for the Far East. Prentice Hall International, 1991.
[2] R. Prechter and A. Frost, Elliott Wave Principle: Key to Market Behavior. John Wiley & Sons, 1978.
[3] R. Freedman, Introduction to Financial Technology. Elsevier, 2006.
[4] L. Stevens, Essential Technical Analysis: Tools and Techniques to Spot Market Trends. John Wiley & Sons, 2002.
[5] E. Fama, "Efficient capital markets: a review of theory and empirical work," Journal of Finance, vol. 25, pp. 383–417, 1970.
[6] Eurex Frankfurt AG, "Eurex." [Online]. Available:
[7] J. Hull, Options, Futures, and Other Derivatives. Prentice-Hall, 2006.
[8] S.-i. Wu and R.-P. Lu, "Combining artificial neural networks and statistics for stock-market forecasting," in Proceedings of the 1993 ACM Conference on Computer Science, 1993, pp. 257–264.
[9] C.-M. Kuan and T. Liu, "Forecasting exchange rates using feedforward and recurrent neural networks," Journal of Applied Econometrics, vol. 10, pp. 347–364, 1995.
[10] J. Elman, "Finding structure in time," Cognitive Science, vol. 14, pp. 179–211, 1990.
[11] F. Tay and L. Cao, "Modified support vector machines in financial time series forecasting," Neurocomputing, vol. 48, pp. 847–861, 2002.
[12] L. Cao and F. Tay, "Support vector machine with adaptive parameters in financial time series forecasting," IEEE Transactions on Neural Networks, vol. 14, pp. 1506–1518, 2003.
[13] V. Vapnik, The Nature of Statistical Learning Theory. Springer, 1995.
[14] K.-j. Kim, "Financial time series forecasting using support vector machines," Neurocomputing, vol. 55, pp. 307–319, 2003.
[15] C. J. C. Burges, "A tutorial on support vector machines for pattern recognition," Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 121–167, 1998.
[16] V. N. Vapnik, "An overview of statistical learning theory," IEEE Transactions on Neural Networks, vol. 10, no. 5, pp. 988–999, 1999.
[17] B. Schölkopf, C. J. C. Burges, and A. J. Smola, Advances in Kernel Methods. Cambridge: MIT Press, 1998, ch. 1.
[18] P.-H. Chen, C.-J. Lin, and B. Schölkopf, "A tutorial on ν-support vector machines," Applied Stochastic Models in Business and Industry, vol. 21, pp. 111–136, 2005.
[19] V. Wan and S. Renals, "Evaluation of kernel methods for speaker verification and identification," in IEEE International Conference on Acoustics, Speech and Signal Processing, May 2002, pp. 669–672.
[20] S. Rueping, "SVM kernels for time series analysis," in LLWA 01 – Tagungsband der GI-Workshop-Woche Lernen – Lehren – Wissen – Adaptivität, Oct. 2001, pp. 43–50.
[21] V. Wan and W. M. Campbell, "Support vector machines for speaker verification and identification," in IEEE International Workshop on Neural Networks for Signal Processing, Dec. 2000, pp. 775–784.
[22] P. Clarkson and P. J. Moreno, "On the use of support vector machines for phonetic classification," in International Conference on Acoustics, Speech and Signal Processing, vol. 2, 1999, pp. 585–588.
[23] J. Marques and P. J. Moreno, "A study of musical instrument classification using Gaussian mixture models and support vector machines," HP Labs Technical Reports, Tech. Rep. CRL-99-4, 1999.
[24] B. Haasdonk and D. Keysers, "Tangent distance kernels for support vector machines," in 16th International Conference on Pattern Recognition (ICPR), vol. 2, 2002, pp. 864–868.
[25] S. Chakrabartty and Y. Deng, "Dynamic time alignment in support vector machines for recognition systems," Internal Report, The Johns Hopkins University, Baltimore, 2001.
[26] H. Shimodaira, K.-I. Noma, M. Nakai, and S. Sagayama, "Dynamic time-alignment kernel in support vector machine," in Neural Information Processing (NIPS 2001), 2001, pp. 921–928.

Fig. 5. The screenshot shows the integration of the developed system into the technical analysis software Investox. While the window in the foreground shows the progress of the calculation steps, the predicted values, and the test results, we can already see the predicted value as an oscillating line in the lower window. The constantly rising line above shows the financial performance of a forecasting system based on the predicted values.

[27] ——, "Support vector machine with dynamic time-alignment kernel for speech recognition," in European Conference on Speech Communication and Technology (Eurospeech), vol. 3, Sept. 2001, pp. 1841–1844.
[28] C. Bahlmann, B. Haasdonk, and H. Burkhardt, "On-line handwriting recognition with support vector machines – a kernel approach," in 8th International Workshop on Frontiers in Handwriting Recognition (IWFHR), 2002, pp. 49–54.
[29] N. Smith and M. Niranjan, "Data-dependent kernels in SVM classification of speech patterns," in 6th International Conference on Spoken Language Processing, 2000, pp. 297–300.
[30] N. Smith and M. Gales, "Speech recognition using SVMs," in NIPS, 2001, pp. 1197–1204.
[31] S. Fine, J. Navrátil, and R. A. Gopinath, "A hybrid GMM/SVM approach to speaker identification," in International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2001, pp. 417–420.
[32] V. Wan and S. Renals, "Speaker verification using sequence discriminant support vector machines," IEEE Transactions on Speech and Audio Processing, vol. 13, no. 2, pp. 203–210, Mar. 2005.
[33] ——, "SVMSVM: support vector machine speaker verification methodology," in IEEE International Conference on Acoustics, Speech and Signal Processing, vol. 2, Apr. 2003, pp. 221–224.
[34] P. J. Moreno and R. Rifkin, "Using the Fisher kernel method for web audio classification," in IEEE International Conference on Acoustics, Speech and Signal Processing, vol. 4, 2000, pp. 2417–2420.
[35] P. J. Moreno and P. P. Ho, "A new SVM approach to speaker identification and verification using probabilistic distance kernels," HP Laboratories Cambridge, Tech. Rep. HPL-2004-7, Jan. 2004.
[36] P. J. Moreno, P. P. Ho, and N. Vasconcelos, "A Kullback-Leibler divergence based kernel for SVM classification in multimedia applications," HP Laboratories Cambridge, Tech. Rep. HPL-2004-4, Jan. 2004.
[37] T. Gruber, C. Gruber, and B. Sick, "Online signature verification with new time series kernels for support vector machines," in Proceedings of the 2006 International Conference on Biometric Authentication, 2006, pp. 500–508.
[38] C.-C. Chang and C.-J. Lin, LIBSVM: a library for support vector machines, 2001. [Online]. Available: http://www.csie.ntu.edu.tw/~cjlin/libsvm/
[39] S. Armstrong and F. Collopy, "Error measures for generalizing about forecasting methods – empirical comparisons," International Journal of Forecasting, vol. 8, pp. 69–80, 1992.
[40] R. Hyndman and A. Koehler, "Another look at measures of forecast accuracy," International Journal of Forecasting, vol. 22, pp. 679–688, 2006.
[41] T. Gruber, "Methoden zur Zeitreihenverarbeitung mit Support Vector Machines und ihre Anwendung in der Unterschriftenverifikation," Master's thesis, Universität Passau, 2005.
[42] A. Smola, "Regression estimation with support vector learning machines," Technische Universität München, Tech. Rep., 1996.
[43] B. Schölkopf, A. Smola, R. Williamson, and P. Bartlett, "New support vector algorithms," Neural Computation, vol. 12, pp. 1083–1121, 2000.
[44] C. Cortes and V. Vapnik, "Support-vector networks," Machine Learning, vol. 20, pp. 273–297, 1995.
[45] S. Walczak, "An empirical analysis of data requirements for financial forecasting with neural networks," Journal of Management Information Systems, vol. 17, pp. 203–222, 2001.
[46] A. Knöpfel, "Investox Börsensoftware." [Online]. Available: http://investox.de
[47] U. Paasche, "Investox: Support Vector Machines mit dynamischen Kernels für Zeitreihenprognosen," 2007. [Online]. Available: http://www.nrcm.de/php/supportvectormachines.php
