
Matrix Methods in Data Mining and Pattern Recognition


Fundamentals of Algorithms

Editor-in-Chief: Nicholas J. Higham, University of Manchester

The SIAM series on Fundamentals of Algorithms is a collection of short user-oriented books on state-of-the-art numerical methods. Written by experts, the books provide readers with sufficient knowledge to choose an appropriate method for an application and to understand the method's strengths and limitations. The books cover a range of topics drawn from numerical analysis and scientific computing. The intended audiences are researchers and practitioners using the methods and upper level undergraduates in mathematics, engineering, and computational science.

Books in this series not only provide the mathematical background for a method or class of methods used in solving a specific problem but also explain how the method can be developed into an algorithm and translated into software. The books describe the range of applicability of a method and give guidance on troubleshooting solvers and interpreting results. The theory is presented at a level accessible to the practitioner. MATLAB® software is the preferred language for codes presented since it can be used across a wide variety of platforms and is an excellent environment for prototyping, testing, and problem solving.

The series is intended to provide guides to numerical algorithms that are readily accessible, contain practical advice not easily found elsewhere, and include understandable codes that implement the algorithms.

Editorial Board

Peter Benner, Technische Universität Chemnitz
Dianne P. O'Leary, University of Maryland
John R. Gilbert, University of California, Santa Barbara
Robert D. Russell, Simon Fraser University
Michael T. Heath, University of Illinois, Urbana-Champaign
Robert D. Skeel, Purdue University
C. T. Kelley, North Carolina State University
Danny Sorensen, Rice University
Cleve Moler, The MathWorks
Andrew J. Wathen, Oxford University
James G. Nagy, Emory University
Henry Wolkowicz, University of Waterloo

Series Volumes

Eldén, L., Matrix Methods in Data Mining and Pattern Recognition
Hansen, P. C., Nagy, J. G., and O'Leary, D. P., Deblurring Images: Matrices, Spectra, and Filtering
Davis, T. A., Direct Methods for Sparse Linear Systems
Kelley, C. T., Solving Nonlinear Equations with Newton's Method


Lars Eldén
Linköping University
Linköping, Sweden

Matrix Methods in Data Mining and Pattern Recognition

Society for Industrial and Applied Mathematics
Philadelphia


Copyright © 2007 by the Society for Industrial and Applied Mathematics.

10 9 8 7 6 5 4 3 2 1

All rights reserved. Printed in the United States of America. No part of this book may be reproduced, stored, or transmitted in any manner without the written permission of the publisher. For information, write to the Society for Industrial and Applied Mathematics, 3600 University City Science Center, Philadelphia, PA 19104-2688.

Trademarked names may be used in this book without the inclusion of a trademark symbol. These names are used in an editorial context only; no infringement of trademark is intended.

Google is a trademark of Google, Inc.

MATLAB is a registered trademark of The MathWorks, Inc. For MATLAB product information, please contact The MathWorks, Inc., 3 Apple Hill Drive, Natick, MA 01760-2098 USA, 508-647-7000, Fax: 508-647-7101, info@

Figures 6.2, 10.1, 10.7, 10.9, 10.11, 11.1, and 11.3 are from L. Eldén, Numerical linear algebra in data mining, Acta Numer., 15:327–384, 2006. Reprinted with the permission of Cambridge University Press.

Figures 14.1, 14.3, and 14.4 were constructed by the author from images appearing in P. N. Belhumeur, J. P. Hespanha, and D. J. Kriegman, Eigenfaces vs. fisherfaces: Recognition using class specific linear projection, IEEE Trans. Pattern Anal. Mach. Intell., 19:711–720, 1997.

Library of Congress Cataloging-in-Publication Data

Eldén, Lars, 1944-
Matrix methods in data mining and pattern recognition / Lars Eldén.
p. cm. — (Fundamentals of algorithms ; 04)
Includes bibliographical references and index.
ISBN 978-0-898716-26-9 (pbk. : alk. paper)
1. Data mining. 2. Pattern recognition systems—Mathematical models. 3. Algebras, Linear. I. Title.
QA76.9.D343 E52 2007
05.74—dc20
2006041348

SIAM is a registered trademark.


Contents

Preface

I  Linear Algebra Concepts and Matrix Decompositions

1  Vectors and Matrices in Data Mining and Pattern Recognition
   1.1  Data Mining and Pattern Recognition
   1.2  Vectors and Matrices
   1.3  Purpose of the Book
   1.4  Programming Environments
   1.5  Floating Point Computations
   1.6  Notation and Conventions

2  Vectors and Matrices
   2.1  Matrix-Vector Multiplication
   2.2  Matrix-Matrix Multiplication
   2.3  Inner Product and Vector Norms
   2.4  Matrix Norms
   2.5  Linear Independence: Bases
   2.6  The Rank of a Matrix

3  Linear Systems and Least Squares
   3.1  LU Decomposition
   3.2  Symmetric, Positive Definite Matrices
   3.3  Perturbation Theory and Condition Number
   3.4  Rounding Errors in Gaussian Elimination
   3.5  Banded Matrices
   3.6  The Least Squares Problem

4  Orthogonality
   4.1  Orthogonal Vectors and Matrices
   4.2  Elementary Orthogonal Matrices
   4.3  Number of Floating Point Operations
   4.4  Orthogonal Transformations in Floating Point Arithmetic

5  QR Decomposition
   5.1  Orthogonal Transformation to Triangular Form
   5.2  Solving the Least Squares Problem
   5.3  Computing or Not Computing Q
   5.4  Flop Count for QR Factorization
   5.5  Error in the Solution of the Least Squares Problem
   5.6  Updating the Solution of a Least Squares Problem

6  Singular Value Decomposition
   6.1  The Decomposition
   6.2  Fundamental Subspaces
   6.3  Matrix Approximation
   6.4  Principal Component Analysis
   6.5  Solving Least Squares Problems
   6.6  Condition Number and Perturbation Theory for the Least Squares Problem
   6.7  Rank-Deficient and Underdetermined Systems
   6.8  Computing the SVD
   6.9  Complete Orthogonal Decomposition

7  Reduced-Rank Least Squares Models
   7.1  Truncated SVD: Principal Component Regression
   7.2  A Krylov Subspace Method

8  Tensor Decomposition
   8.1  Introduction
   8.2  Basic Tensor Concepts
   8.3  A Tensor SVD
   8.4  Approximating a Tensor by HOSVD

9  Clustering and Nonnegative Matrix Factorization
   9.1  The k-Means Algorithm
   9.2  Nonnegative Matrix Factorization

II  Data Mining Applications

10  Classification of Handwritten Digits
    10.1  Handwritten Digits and a Simple Algorithm
    10.2  Classification Using SVD Bases
    10.3  Tangent Distance

11  Text Mining
    11.1  Preprocessing the Documents and Queries
    11.2  The Vector Space Model
    11.3  Latent Semantic Indexing
    11.4  Clustering
    11.5  Nonnegative Matrix Factorization
    11.6  LGK Bidiagonalization
    11.7  Average Performance

12  Page Ranking for a Web Search Engine
    12.1  Pagerank
    12.2  Random Walk and Markov Chains
    12.3  The Power Method for Pagerank Computation
    12.4  HITS

13  Automatic Key Word and Key Sentence Extraction
    13.1  Saliency Score
    13.2  Key Sentence Extraction from a Rank-k Approximation

14  Face Recognition Using Tensor SVD
    14.1  Tensor Representation
    14.2  Face Recognition
    14.3  Face Recognition with HOSVD Compression

III  Computing the Matrix Decompositions

15  Computing Eigenvalues and Singular Values
    15.1  Perturbation Theory
    15.2  The Power Method and Inverse Iteration
    15.3  Similarity Reduction to Tridiagonal Form
    15.4  The QR Algorithm for a Symmetric Tridiagonal Matrix
    15.5  Computing the SVD
    15.6  The Nonsymmetric Eigenvalue Problem
    15.7  Sparse Matrices
    15.8  The Arnoldi and Lanczos Methods
    15.9  Software

Bibliography

Index


Preface

The first version of this book was a set of lecture notes for a graduate course on data mining and applications in science and technology organized by the Swedish National Graduate School in Scientific Computing (NGSSC). Since then the material has been used and further developed for an undergraduate course on numerical algorithms for data mining and IT at Linköping University. This is a second course in scientific computing for computer science students.

The book is intended primarily for undergraduate students who have previously taken an introductory scientific computing/numerical analysis course. It may also be useful for early graduate students in various data mining and pattern recognition areas who need an introduction to linear algebra techniques.

The purpose of the book is to demonstrate that there are several very powerful numerical linear algebra techniques for solving problems in different areas of data mining and pattern recognition. To achieve this goal, it is necessary to present material that goes beyond what is normally covered in a first course in scientific computing (numerical analysis) at a Swedish university. On the other hand, since the book is application oriented, it is not possible to give a comprehensive treatment of the mathematical and numerical aspects of the linear algebra algorithms used.

The book has three parts. After a short introduction to a couple of areas of data mining and pattern recognition, linear algebra concepts and matrix decompositions are presented. I hope that this is enough for the student to use matrix decompositions in problem-solving environments such as MATLAB®. Some mathematical proofs are given, but the emphasis is on the existence and properties of the matrix decompositions rather than on how they are computed. In Part II, the linear algebra techniques are applied to data mining problems. Naturally, the data mining and pattern recognition repertoire is quite limited: I have chosen problem areas that are well suited for linear algebra techniques. In order to use intelligently the powerful software for computing matrix decompositions available in MATLAB, etc., some understanding of the underlying algorithms is necessary. A very short introduction to eigenvalue and singular value algorithms is given in Part III.

I have not had the ambition to write a book of recipes: "given a certain problem, here is an algorithm for its solution." That would be difficult, as the area is far too diverse to give clear-cut and simple solutions. Instead, my intention has been to give the student a set of tools that may be tried as they are but, more likely, that will need to be modified to be useful for a particular application. Some of the methods in the book are described using MATLAB scripts. They should not be considered as serious algorithms but rather as pseudocodes given for illustration purposes.

A collection of exercises and computer assignments are available at the book's Web page: /books/fa04.

The support from NGSSC for producing the original lecture notes is gratefully acknowledged. The lecture notes have been used by a couple of colleagues. Thanks are due to Gene Golub and Saara Hyvönen for helpful comments. Several of my own students have helped me to improve the presentation by pointing out inconsistencies and asking questions. I am indebted to Berkant Savas for letting me use results from his master's thesis in Chapter 10. Three anonymous referees read earlier versions of the book and made suggestions for improvements. Finally, I would like to thank Nick Higham, series editor at SIAM, for carefully reading the manuscript. His thoughtful advice helped me improve the contents and the presentation considerably.

Lars Eldén
Linköping, October 2006


Part I

Linear Algebra Concepts and Matrix Decompositions


Chapter 1

Vectors and Matrices in Data Mining and Pattern Recognition

1.1 Data Mining and Pattern Recognition

In modern society, huge amounts of data are collected and stored in computers so that useful information can later be extracted. Often it is not known at the time of collection what data will later be requested, and therefore the database is not designed to distill any particular information, but rather it is, to a large extent, unstructured. The science of extracting useful information from large data sets is usually referred to as "data mining," sometimes with the addition of "knowledge discovery."

Pattern recognition is often considered to be a technique separate from data mining, but its definition is related: "the act of taking in raw data and making an action based on the 'category' of the pattern" [31]. In this book we will not emphasize the differences between the concepts.

There are numerous application areas for data mining, ranging from e-business [10, 69] to bioinformatics [6], from scientific applications such as the classification of volcanos on Venus [21] to information retrieval [3] and Internet search engines [11].

Data mining is a truly interdisciplinary science, in which techniques from computer science, statistics and data analysis, linear algebra, and optimization are used, often in a rather eclectic manner. Due to the practical importance of the applications, there are now numerous books and surveys in the area [24, 25, 31, 35, 45, 46, 47, 49, 108].

It is not an exaggeration to state that everyday life is filled with situations in which we depend, often unknowingly, on advanced mathematical methods for data mining. Methods such as linear algebra and data analysis are basic ingredients in many data mining techniques. This book gives an introduction to the mathematical and numerical methods and their use in data mining and pattern recognition.


1.2 Vectors and Matrices

The following examples illustrate the use of vectors and matrices in data mining. These examples present the main data mining areas discussed in the book, and they will be described in more detail in Part II.

In many applications a matrix is just a rectangular array of data, and the elements are scalars, real numbers:

$$
A = \begin{pmatrix}
a_{11} & a_{12} & \cdots & a_{1n} \\
a_{21} & a_{22} & \cdots & a_{2n} \\
\vdots & \vdots &        & \vdots \\
a_{m1} & a_{m2} & \cdots & a_{mn}
\end{pmatrix} \in \mathbb{R}^{m \times n}.
$$

To treat the data by mathematical methods, some mathematical structure must be added. In the simplest case, the columns of the matrix are considered as vectors in R^m.

Example 1.1. Term-document matrices are used in information retrieval. Consider the following selection of five documents.¹ Keywords, which we call terms, are marked in boldface.²

Document 1: The Google™ matrix P is a model of the Internet.
Document 2: P_ij is nonzero if there is a link from Web page j to i.
Document 3: The Google matrix is used to rank all Web pages.
Document 4: The ranking is done by solving a matrix eigenvalue problem.
Document 5: England dropped out of the top 10 in the FIFA ranking.

If we count the frequency of terms in each document we get the following result:

Term         Doc 1   Doc 2   Doc 3   Doc 4   Doc 5
eigenvalue     0       0       0       1       0
England        0       0       0       0       1
FIFA           0       0       0       0       1
Google         1       0       1       0       0
Internet       1       0       0       0       0
link           0       1       0       0       0
matrix         1       0       1       1       0
page           0       1       1       0       0
rank           0       0       1       1       1
Web            0       1       1       0       0

¹ In Document 5, FIFA is the Fédération Internationale de Football Association. This document is clearly concerned with football (soccer). The document is a newspaper headline from 2005. After the 2006 World Cup, England came back into the top 10.

² To avoid making the example too large, we have ignored some words that would normally be considered as terms (keywords). Note also that only the stem of the word is significant: "ranking" is considered the same as "rank."


Thus each document is represented by a vector, or a point, in R^10, and we can organize all documents into a term-document matrix:

$$
A = \begin{pmatrix}
0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & 1 \\
0 & 0 & 0 & 0 & 1 \\
1 & 0 & 1 & 0 & 0 \\
1 & 0 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 & 0 \\
1 & 0 & 1 & 1 & 0 \\
0 & 1 & 1 & 0 & 0 \\
0 & 0 & 1 & 1 & 1 \\
0 & 1 & 1 & 0 & 0
\end{pmatrix}.
$$

Now assume that we want to find all documents that are relevant to the query "ranking of Web pages." This is represented by a query vector, constructed in a way analogous to the term-document matrix:

$$
q = \begin{pmatrix} 0 \\ 0 \\ 0 \\ 0 \\ 0 \\ 0 \\ 0 \\ 1 \\ 1 \\ 1 \end{pmatrix} \in \mathbb{R}^{10}.
$$

Thus the query itself is considered as a document. The information retrieval task can now be formulated as a mathematical problem: find the columns of A that are close to the vector q. To solve this problem we must use some distance measure in R^10.

In the information retrieval application it is common that the dimension m is large, of the order 10^6, say. Also, as most of the documents contain only a small fraction of the terms, most of the elements in the matrix are equal to zero. Such a matrix is called sparse.

Some methods for information retrieval use linear algebra techniques (e.g., singular value decomposition (SVD)) for data compression and retrieval enhancement. Vector space methods for information retrieval are presented in Chapter 11.
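As a concrete illustration of the query-matching step, the following MATLAB lines score the five documents against q using the cosine of the angle between q and each column of A (the cosine measure is the one used in the vector space model of Chapter 11). This is only a toy computation on the small example above.

    % Term-document matrix from the example (rows: terms in alphabetical order)
    A = [0 0 0 1 0;   % eigenvalue
         0 0 0 0 1;   % England
         0 0 0 0 1;   % FIFA
         1 0 1 0 0;   % Google
         1 0 0 0 0;   % Internet
         0 1 0 0 0;   % link
         1 0 1 1 0;   % matrix
         0 1 1 0 0;   % page
         0 0 1 1 1;   % rank
         0 1 1 0 0];  % Web

    q = [0 0 0 0 0 0 0 1 1 1]';                       % query "ranking of Web pages"

    cosines = (A'*q) ./ (sqrt(sum(A.^2))' * norm(q)); % cosine of the angle to each document
    [~, order] = sort(cosines, 'descend');            % documents ordered by relevance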

Often it is useful to consider the matrix not just as an array of numbers, or as a set of vectors, but also as a linear operator. Denote the columns of A by

$$
a_{\cdot j} = \begin{pmatrix} a_{1j} \\ a_{2j} \\ \vdots \\ a_{mj} \end{pmatrix}, \qquad j = 1, 2, \ldots, n,
$$


and write

$$
A = \begin{pmatrix} a_{\cdot 1} & a_{\cdot 2} & \cdots & a_{\cdot n} \end{pmatrix}.
$$

Then the linear transformation is defined by

$$
y = Ax = \begin{pmatrix} a_{\cdot 1} & a_{\cdot 2} & \cdots & a_{\cdot n} \end{pmatrix}
\begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}
= \sum_{j=1}^{n} x_j a_{\cdot j}.
$$

Example 1.2. The classification of handwritten digits is a model problem in pattern recognition. Here vectors are used to represent digits. The image of one digit is a 16 × 16 matrix of numbers, representing gray scale. It can also be represented as a vector in R^256, by stacking the columns of the matrix. A set of n digits (handwritten 3's, say) can then be represented by a matrix A ∈ R^{256×n}, and the columns of A span a subspace of R^256. We can compute an approximate basis of this subspace using the SVD A = UΣV^T. Three basis vectors of the "3-subspace" are illustrated in Figure 1.1.

Figure 1.1. Handwritten digits from the U.S. Postal Service database [47], and basis vectors for 3's (bottom).

Let b be a vector representing an unknown digit. We now want to classify (automatically, by computer) the unknown digit as one of the digits 0–9. Given a set of approximate basis vectors for 3's, u_1, u_2, ..., u_k, we can determine whether b is a 3 by checking if there is a linear combination of the basis vectors, $\sum_{j=1}^{k} x_j u_j$, such that

$$
\Bigl\| \, b - \sum_{j=1}^{k} x_j u_j \, \Bigr\|
$$

is small. Thus, here we compute the coordinates of b in the basis $\{u_j\}_{j=1}^{k}$.

In Chapter 10 we discuss methods for classification of handwritten digits.
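A minimal MATLAB sketch of this residual test might look as follows. The inputs A3 (a 256 × n matrix whose columns are training 3's) and b (the unknown digit stacked into a vector) are assumed to be given, and the cutoff k = 10 is an arbitrary illustrative choice.

    % A3: 256-by-n matrix whose columns are training images of handwritten 3's
    % b : 256-by-1 vector, the unknown digit with its image columns stacked
    k = 10;                        % number of basis vectors; an arbitrary choice here
    [U, S, V] = svd(A3, 'econ');   % thin SVD of the training set, A3 = U*S*V'
    Uk = U(:, 1:k);                % approximate basis u_1, ..., u_k of the "3-subspace"

    x = Uk'*b;                     % coordinates of b in the basis {u_j}
    res = norm(b - Uk*x);          % residual; a small value suggests that b is a 3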

The very idea of data mining is to extract useful information from large, often unstructured, sets of data. Therefore it is necessary that the methods used are efficient and often specially designed for large problems. In some data mining applications huge matrices occur.

Example 1.3. The task of extracting information from all Web pages available on the Internet is done by search engines. The core of the Google search engine is a matrix computation, probably the largest that is performed routinely [71]. The Google matrix P is of the order billions, i.e., close to the total number of Web pages on the Internet. The matrix is constructed based on the link structure of the Web, and element P_ij is nonzero if there is a link from Web page j to i.

The following small link graph illustrates a set of Web pages with outlinks and inlinks:

(Link graph with six Web pages, numbered 1–6; its links are encoded in the matrix P below.)

A corresponding link graph matrix is constructed so that the columns and rows represent Web pages and the nonzero elements in column j denote outlinks from Web page j. Here the matrix becomes

$$
P = \begin{pmatrix}
0            & \tfrac{1}{3} & 0 & 0 & 0            & 0 \\
\tfrac{1}{3} & 0            & 0 & 0 & 0            & 0 \\
0            & \tfrac{1}{3} & 0 & 0 & \tfrac{1}{3} & \tfrac{1}{2} \\
\tfrac{1}{3} & 0            & 0 & 0 & \tfrac{1}{3} & 0 \\
\tfrac{1}{3} & \tfrac{1}{3} & 0 & 0 & 0            & \tfrac{1}{2} \\
0            & 0            & 1 & 0 & \tfrac{1}{3} & 0
\end{pmatrix}.
$$

For a search engine to be useful, it must use a measure of quality of the Web pages. The Google matrix is used to rank all the pages. The ranking is done by solving an eigenvalue problem for P; see Chapter 12.
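For illustration, the matrix P above can be assembled in MATLAB from a list of links by scaling each column with the number of outlinks of the corresponding page. This is only a sketch for the six-page graph (page 4, which has no outlinks, is left as a zero column); it is not how a Web-scale matrix would be built.

    % Each row is one link (from page, to page) in the six-page graph above
    links = [1 2; 1 4; 1 5;
             2 1; 2 3; 2 5;
             3 6;
             5 3; 5 4; 5 6;
             6 3; 6 5];
    n = 6;

    Q = sparse(links(:,2), links(:,1), ones(size(links,1),1), n, n);  % Q(i,j) = 1 if j links to i
    outdeg = full(sum(Q, 1))';               % number of outlinks of each page
    d = zeros(n, 1);
    d(outdeg > 0) = 1 ./ outdeg(outdeg > 0); % inverse outdegrees; page 4 stays zero
    P = Q * spdiags(d, 0, n, n);             % each nonzero column of P now sums to one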

1.3 Purpose of the Book

The present book is meant to be not primarily a textbook in numerical linear algebra but rather an application-oriented introduction to some techniques in modern linear algebra, with the emphasis on data mining and pattern recognition. It depends heavily on the availability of an easy-to-use programming environment that implements the algorithms that we will present. Thus, instead of describing in detail the algorithms, we will give enough mathematical theory and numerical background information so that a reader can understand and use the powerful software that is embedded in a package like MATLAB [68].

For a more comprehensive presentation of numerical and algorithmic aspects of the matrix decompositions used in this book, see any of the recent textbooks [29, 42, 50, 92, 93, 97]. The solution of linear systems and eigenvalue problems for large and sparse systems is discussed at length in [4, 5]. For those who want to study the detailed implementation of numerical linear algebra algorithms, software in Fortran, C, and C++ is available for free via the Internet [1].

It will be assumed that the reader has studied introductory courses in linear algebra and scientific computing (numerical analysis). Familiarity with the basics of a matrix-oriented programming language like MATLAB should help one to follow the presentation.

1.4 Programming Environments

In this book we use MATLAB [68] to demonstrate the concepts and the algorithms. Our codes are not to be considered as software; instead they are intended to demonstrate the basic principles, and we have emphasized simplicity rather than efficiency and robustness. The codes should be used only for small experiments and never for production computations.

Even if we are using MATLAB, we want to emphasize that any programming environment that implements modern matrix computations can be used, e.g., Mathematica® [112] or a statistics package.

1.5 Floating Point Computations

1.5.1 Flop Counts

The execution times of different algorithms can sometimes be compared by counting the number of floating point operations, i.e., arithmetic operations with floating point numbers. In this book we follow the standard procedure [42] and count each operation separately, and we use the term flop for one operation. Thus the statement y = y + a*x, where the variables are scalars, counts as two flops.

It is customary to count only the highest-order term(s). We emphasize that flop counts are often very crude measures of efficiency and computing time and can even be misleading under certain circumstances. On modern computers, which invariably have memory hierarchies, the data access patterns are very important. Thus there are situations in which the execution times of algorithms with the same flop counts can vary by an order of magnitude.
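For example, computing the matrix-vector product y = Ax requires mn multiplications and roughly mn additions, i.e., about 2mn flops. The sketch below performs those same flops in two loop orders; because MATLAB stores matrices by columns, the two versions access memory differently, and their measured times can differ even though the flop counts are identical (how much depends on the machine and the MATLAB version).

    m = 2000; n = 2000;
    A = rand(m, n); x = rand(n, 1);

    y = zeros(m, 1);               % row-oriented: inner products with the rows of A
    tic
    for i = 1:m
        y(i) = A(i, :)*x;
    end
    toc

    y = zeros(m, 1);               % column-oriented: linear combination of the columns of A
    tic
    for j = 1:n
        y = y + x(j)*A(:, j);
    end
    toc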
