Chapter 9  Fundamental Limits in Information Theory
Problems (pp. 618-625): 9.3, 9.5, 9.10, 9.11
Chapter 9  Fundamental Limits in Information Theory
9.1  Introduction
9.2  Uncertainty, Information, and Entropy
9.3  Source-Coding Theorem
9.4  Data Compaction
9.5  Discrete Memoryless Channels
9.6  Mutual Information
9.7  Channel Capacity
9.8  Channel-Coding Theorem
9.9  Differential Entropy and Mutual Information for Continuous Ensembles
9.10 Information Capacity Theorem
9.11 Implications of the Information Capacity Theorem
9.12 Information Capacity of Colored Noise Channel
9.13 Rate Distortion Theory
9.14 Data Compression
9.15 Summary and Discussion

Main Topics:
- Entropy: the basic measure of information
- Source coding and data compaction
- Mutual information and channel capacity
- Channel coding
- Information capacity theorem
- Rate-distortion theory: source coding with a fidelity criterion

9.1 Introduction
Purpose of a communication system:
- carry information-bearing baseband signals from one place to another over a communication channel.
Requirements of a communication system:
- Efficient: source coding
- Reliable: error-control coding

Questions:
1. What is the irreducible complexity below which a signal cannot be compressed?
2. What is the ultimate transmission rate for reliable communication over a noisy channel?
To answer them, invoke information theory (Shannon, 1948): the mathematical modeling and analysis of communication systems.

Answers:
1. The entropy of a source.
2. The capacity of a channel.
A remarkable result: if (the entropy of the source) < (the capacity of the channel), then error-free communication over the channel can be achieved.

9.2 Uncertainty, Information, and Entropy

Uncertainty
Discrete memoryless source: modeled by a discrete random variable S whose successive symbols are statistically independent.
Source alphabet and statistics:
S = \{s_0, s_1, \ldots, s_{K-1}\}    (9.1)
P(S = s_k) = p_k,  k = 0, 1, \ldots, K-1    (9.2)
\sum_{k=0}^{K-1} p_k = 1    (9.3)

For the event S = s_k:
- before it occurs, there is an amount of uncertainty;
- when it occurs, there is an amount of surprise;
- after it has occurred, there is a gain in information (resolution of the uncertainty).
As the probability increases, the surprise and the information decrease.
e.g.: if p_k = 1, there is no surprise and no information; if p_k < p_i, then information(s_k) > information(s_i).
So the amount of information is related to the inverse of the probability of occurrence.

Amount of information:
I(s_k) = \log(1/p_k)
Properties:
1. I(s_k) = 0 for p_k = 1
2. I(s_k) \ge 0 for 0 \le p_k \le 1
3. I(s_k) > I(s_i) for p_k < p_i
4. I(s_k s_i) = I(s_k) + I(s_i) if s_k and s_i are statistically independent
For base 2, the unit is called the bit:
I(s_k) = \log_2(1/p_k) = -\log_2 p_k    (9.4)

Entropy: the mean of I(s_k).
It is a measure of the average information content per source symbol.
Definition:
H(S) = E[I(s_k)] = \sum_{k=0}^{K-1} p_k \log_2(1/p_k)    (9.9)
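As a quick illustration of Equations (9.4) and (9.9), the sketch below evaluates I(s_k) and H(S) for an assumed four-symbol source; the probabilities are illustrative only, not taken from the text.

```python
import math

def entropy(probs):
    """H(S) = sum_k p_k * log2(1/p_k), Equation (9.9); terms with p_k = 0 contribute nothing."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Assumed example source: four symbols with these probabilities.
p = [0.5, 0.25, 0.125, 0.125]
for k, pk in enumerate(p):
    print(f"I(s{k}) = {math.log2(1.0 / pk):.3f} bits")   # amount of information, Equation (9.4)
print(f"H(S) = {entropy(p):.3f} bits")                    # 1.75 bits per symbol for this source
```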
Some properties of entropy
Bounds:
0 \le H(S) \le \log_2 K    (9.10)
- Lower bound: H(S) = 0 if and only if p_k = 1 for some k (and all other probabilities are zero): no uncertainty.
- Upper bound: H(S) = \log_2 K if and only if p_k = 1/K for all k, i.e., all symbols equiprobable: maximum uncertainty. (This can be proved with the Lagrange multiplier method.)

Proof:
1. Lower bound: each term p_k \log_2(1/p_k) vanishes only when p_k = 0 or p_k = 1.
2. Upper bound: compare with a second distribution q_k = 1/K and use \sum_k p_k \log_2(q_k/p_k) \le 0, which follows from the inequality \log x \le x - 1 (Figure 9.1).
Figure 9.1  Graphs of the functions x - 1 and \log x versus x.

Example 9.1
Entropy of a Binary Memoryless Source
Symbol 0 occurs with probability p_0 and symbol 1 with probability p_1 = 1 - p_0. The entropy of the source is
H(S) = -p_0 \log_2 p_0 - (1 - p_0)\log_2(1 - p_0)  bits    (9.15)
The entropy function:
H(p_0) = -p_0 \log_2 p_0 - (1 - p_0)\log_2(1 - p_0)    (9.16)
(Figure 9.2)

Figure 9.2  Entropy function H(p_0).

Distinction between Equation (9.15) and Equation (9.16):
- The H(S) of Equation (9.15) gives the entropy of a discrete memoryless source with a particular source alphabet.
- The entropy function of Equation (9.16) is a function of the prior probability p_0 defined on the interval [0, 1].

Extension of a discrete memoryless source
Extended source: blocks, each consisting of n successive source symbols; the extended alphabet S^n has K^n distinct blocks. Because the source is memoryless, successive symbols are statistically independent, so the entropy is
H(S^n) = n H(S)    (9.17)

Example 9.2  Entropy of an extended source
Given the source alphabet and its symbol probabilities, compute the entropy of the source and the entropy of the extended source (by Equation (9.17) the latter is n times the former).

9.3 Source-Coding Theorem

1. Why? Efficient representation of the data generated by the source.
2. Need: knowledge of the statistics of the source.
3. Example:
Variable-length code:
- short codewords for frequent source symbols;
- long codewords for rare source symbols.
4. Requirements of an efficient source encoder:
- The codewords are in binary form.
- The source code is uniquely decodable.
5. Figure 9.3 shows a source encoding scheme.

Figure 9.3  Source encoding (the encoder output is a block of 0s and 1s).

Assume:
- source alphabet: K different symbols;
- probability of the kth symbol s_k: p_k, k = 0, 1, \ldots, K-1;
- binary codeword length assigned to symbol s_k: l_k.

Average code-word length (the average number of bits per source symbol):
\bar{L} = \sum_{k=0}^{K-1} p_k l_k    (9.18)
Coding efficiency:
\eta = L_{\min} / \bar{L}    (9.19)
where L_{\min} is the minimum possible value of \bar{L}. Note: the encoder is efficient when \eta approaches 1, i.e., when \bar{L} approaches L_{\min}.

How is the minimum value L_{\min} determined?
Answer: Shannon's first theorem, the source-coding theorem.
Given a discrete memoryless source of entropy H(S), the average code-word length \bar{L} for any distortionless source-encoding scheme is bounded as
\bar{L} \ge H(S)    (9.20)
With L_{\min} = H(S), the coding efficiency of Equation (9.19) becomes
\eta = H(S) / \bar{L}    (9.21)
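A minimal sketch of Equations (9.18)-(9.21) for an assumed source and an assumed set of codeword lengths (both hypothetical; they happen to form a matched code, so the efficiency comes out to 1):

```python
import math

p = [0.5, 0.25, 0.125, 0.125]      # assumed symbol probabilities
l = [1, 2, 3, 3]                    # assumed codeword lengths l_k, in bits

H = sum(pk * math.log2(1 / pk) for pk in p)       # source entropy H(S)
L_bar = sum(pk * lk for pk, lk in zip(p, l))      # average codeword length, Eq. (9.18)
eta = H / L_bar                                    # coding efficiency, Eq. (9.21)

print(f"H(S) = {H:.3f} bits, L_bar = {L_bar:.3f} bits, efficiency = {eta:.3f}")
# Here L_bar equals H(S), consistent with the bound L_bar >= H(S) of Eq. (9.20).
```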
9.4 Data Compaction

Why data compaction?
Signals generated by physical sources contain a significant amount of redundant information → not efficient.
Requirement of data compaction:
Not only efficient in terms of the average number of bits per symbol, but also exact in the sense that the original data can be reconstructed with no loss of information: lossless data compression.
Examples: prefix coding, Huffman coding, Lempel-Ziv coding.

9.4.1 Prefix Coding

Discrete memoryless source:
- alphabet \{s_0, s_1, \ldots, s_{K-1}\} and statistics \{p_0, p_1, \ldots, p_{K-1}\};
- requirement: the code must be uniquely decodable.
Definition: a prefix code is a code in which no codeword is the prefix of any other codeword.
The codeword of s_k is (m_{k1}, m_{k2}, \ldots, m_{kn}), where m_{ki} \in \{0, 1\} and n is the code-word length; the initial part (m_{k1}, \ldots, m_{ki}), i \le n, is called a prefix of the codeword.

Table 9.2:
Code I and Code III are not prefix codes; Code II is a prefix code.
Decoding: use a decision tree (Figure 9.4).
Procedure:
1. Start at the initial state.
2. Check the received bit. If it is 1, the decoder moves to the next decision point and repeats step 2; if it is 0, it moves to a terminal state (emitting a symbol) and returns to step 1.

Figure 9.4  Decision tree for Code II of Table 9.2.
e.g.: 1011111000... → s_1 s_3 s_2 s_0 s_0 ...
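The decision-tree decoding above can also be sketched as a table-driven decoder. The codeword assignment used here (s0 = 0, s1 = 10, s2 = 110, s3 = 111) is inferred from the worked decoding example, since Table 9.2 itself is not reproduced in the text; treat it as an assumption about Code II.

```python
# Prefix-code decoder: because no codeword is a prefix of another, a symbol can be
# emitted as soon as the accumulated bits match a codeword (instantaneous decoding).
code_ii = {"0": "s0", "10": "s1", "110": "s2", "111": "s3"}   # assumed Code II

def decode(bits, codebook):
    symbols, buffer = [], ""
    for b in bits:
        buffer += b
        if buffer in codebook:        # the end of a codeword is recognizable at once
            symbols.append(codebook[buffer])
            buffer = ""
    return symbols

print(decode("1011111000", code_ii))  # ['s1', 's3', 's2', 's0', 's0']
```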
Properties of prefix codes:
1. They are uniquely decodable.
2. They satisfy the Kraft-McMillan inequality
\sum_{k=0}^{K-1} 2^{-l_k} \le 1    (9.22)
where l_k is the codeword length (a numerical check of this inequality appears at the end of this subsection).
3. They are instantaneous codes: the end of a codeword is always recognizable.
Note: Properties 1 and 2 are only necessary conditions for a prefix code. (e.g., Code II and Code III both satisfy Properties 1 and 2, but only Code II is a prefix code.)
4. Given a discrete memoryless source of entropy H(S), a prefix code can be constructed with an average codeword length \bar{L} that is bounded as
H(S) \le \bar{L} < H(S) + 1    (9.23)

Special case:
The prefix code is matched to the source in that \bar{L} = H(S), under the condition p_k = 2^{-l_k} for all k.
Proof: substituting p_k = 2^{-l_k} into the definitions of H(S) and \bar{L} reduces both to \sum_k l_k 2^{-l_k}.

Extended prefix code:
The code can be matched to an arbitrary discrete memoryless source by using a high-order extension of the prefix code (at the price of increased decoding complexity).
Proof sketch: applying the bound (9.23) to the nth-order extended source gives
H(S^n) \le \bar{L}_n < H(S^n) + 1, i.e. H(S) \le \bar{L}_n / n < H(S) + 1/n,
so \bar{L}_n / n \to H(S) as n \to \infty, where \bar{L}_n is the average code-word length of the extended prefix code.
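The promised check of the Kraft-McMillan inequality (9.22), using the codeword lengths of the assumed Code II from the decoding sketch above:

```python
lengths = [1, 2, 3, 3]                       # l_k for the assumed Code II (0, 10, 110, 111)
kraft_sum = sum(2 ** (-lk) for lk in lengths)
print(kraft_sum, kraft_sum <= 1)             # 1.0 True -> the inequality is satisfied with equality
```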
9.4.2 Huffman Coding

An important class of prefix codes.
Basic idea: a sequence of bits roughly equal in length to the amount of information conveyed by the symbol is assigned to each symbol, so the average code-word length approaches the entropy.
Essence of the algorithm: replace the prescribed set of source statistics with a simpler one, step by step.
Encoding algorithm (a sketch of which appears below):
1. Splitting stage:
   (i) The source symbols are listed in order of decreasing probability.
   (ii) The two symbols of lowest probability are assigned a 0 and a 1.
2. Combine these two symbols into a new symbol whose probability is the sum of the two, and re-insert it into the list as in step 1.
3. Repeat step 2 until only two symbols are left. The code for each (original) source symbol is then found by working backward, tracing the sequence of 0s and 1s assigned to that symbol and to its successors.
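A compact sketch of the splitting/combining procedure using a priority queue. The five probabilities are an assumed example; they reproduce the average length of 2.2 bits quoted in Example 9.3 below, but the probabilities of that example are not shown in this text, so treat them as hypothetical. Tie-breaking (the "as high as possible" placement) is not enforced here, so individual codewords may differ from the textbook tree, although the average length of any optimal tree is the same.

```python
import heapq

def huffman(probs):
    """Build a binary Huffman code; returns {symbol_index: codeword}."""
    # Each heap entry: (probability, tie_breaker, {symbol: partial codeword}).
    heap = [(p, k, {k: ""}) for k, p in enumerate(probs)]
    heapq.heapify(heap)
    tie = len(probs)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)    # the two groups of lowest probability
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, tie, merged))   # combined symbol with summed probability
        tie += 1
    return heap[0][2]

p = [0.4, 0.2, 0.2, 0.1, 0.1]              # assumed five-symbol source
code = huffman(p)
L_bar = sum(p[s] * len(w) for s, w in code.items())
print(code)
print(f"average length = {L_bar:.2f} bits/symbol")    # 2.20 bits/symbol
```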
Example 9.3  Huffman Tree
Figure 9.5  (a) Example of the Huffman encoding algorithm (combined symbols placed as high as possible). (b) Source code.

The average code-word length is \bar{L} = 2.2 bits; the entropy is H(S) = 2.12193 bits.
Two observations:
- The average code-word length exceeds the entropy by only 3.67 percent.
- The average code-word length does indeed satisfy Equation (9.23).

Notes:
1. The encoding process is not unique.
   (i) The assignment of 0 and 1 to the last two source symbols is arbitrary → trivial differences.
   (ii) The placement of a combined symbol is ambiguous when its probability equals another probability in the list (as high or as low as possible?) → noticeable differences.
   Answer: placing the combined symbol as high as possible gives the smaller variance of the codeword lengths (high placement → variance ↓; low placement → variance ↑).
2. It requires a probabilistic model of the source (a drawback).

9.4.3 Lempel-Ziv Coding

Problems with the Huffman code:
1. It requires knowledge of a probabilistic model of the source; in practice, source statistics are not always known a priori.
2. Storage requirements prevent it from capturing the higher-order relationships between words and phrases in modeling text → the efficiency of the code drops.
Advantage of Lempel-Ziv coding:
It is intrinsically adaptive and simpler to implement than Huffman coding.
Basic idea of the Lempel-Ziv code:
Encoding in the Lempel-Ziv algorithm is accomplished by parsing the source data stream into segments that are the shortest subsequences not encountered previously.
For example (pp. 580):
input sequence 000101110010100101...
Assume the subsequences already stored are 0 and 1; the data to be parsed are 000101110010100101...
Result: the codebook of Figure 9.6 (a parsing sketch appears after the figure).
Figure 9.6  Illustrating the encoding process performed by the Lempel-Ziv algorithm on the binary sequence 000101110010100101...

Numerical positions:         1    2    3     4     5     6     7     8     9
Subsequences:                0    1    00    01    011   10    010   100   101
Numerical representations:             11    12    42    21    41    61    62
Binary encoded blocks:                 0010  0011  1001  0100  1000  1100  1101

Binary encoded representation of a subsequence = (binary pointer to the subsequence) + (innovation symbol).
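The promised parsing sketch: split the stream into the shortest subsequences not seen before, then emit each new subsequence as a pointer to its stored prefix plus the innovation bit. The 3-bit pointer width is hard-coded for this particular example.

```python
def lz_parse(bits, initial=("0", "1")):
    """Parse a binary string into the shortest subsequences not encountered previously,
    starting from an already-stored dictionary (here the single bits 0 and 1)."""
    book = list(initial)
    phrase = ""
    for b in bits:
        phrase += b
        if phrase not in book:          # shortest new subsequence found
            book.append(phrase)
            phrase = ""
    return book

book = lz_parse("000101110010100101")
print(book)   # ['0', '1', '00', '01', '011', '10', '010', '100', '101']

# Binary encoded block = (3-bit pointer to the prefix's numerical position) + (innovation bit),
# reproducing the blocks 0010, 0011, 1001, 0100, 1000, 1100, 1101 listed in Figure 9.6.
for sub in book[2:]:
    pointer = book.index(sub[:-1]) + 1   # 1-based position of the stored prefix
    print(sub, "->", format(pointer, "03b") + sub[-1])
```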
The decoder is just as simple as the encoder.
Basic concept: fixed-length codes are used to represent a variable number of source symbols → suitable for synchronous transmission.
In practice:
1. Fixed blocks 12 bits long are used → a codebook of 4096 entries.
2. It is the standard algorithm for file compression, achieving a compaction of approximately 55% for English text.

9.5 Discrete Memoryless Channels
A discrete memoryless channel is a statistical model with an input X and an output Y that is a noisy version of X; both X and Y are random variables (see Figure 9.7).
Definition:
- input alphabet \mathcal{X} = \{x_0, x_1, \ldots, x_{J-1}\};
- output alphabet \mathcal{Y} = \{y_0, y_1, \ldots, y_{K-1}\};
- transition probabilities p(y_k|x_j) = P(Y = y_k | X = x_j), with 0 \le p(y_k|x_j) \le 1 for all j and k.    (9.31), (9.32)

Figure 9.7  Discrete memoryless channel.
Discrete: both alphabets \mathcal{X} and \mathcal{Y} have finite sizes.
Memoryless: the current output symbol depends only on the current input symbol and not on any of the previous ones.

Channel matrix (or transition matrix):
P = [p(y_k|x_j)], a J-by-K matrix    (9.35)
Note: each row corresponds to a fixed channel input and each column to a fixed channel output; \sum_k p(y_k|x_j) = 1 for all j.

NOTE:
- joint probability distribution: p(x_j, y_k) = p(y_k|x_j) p(x_j);
- marginal probability distribution: p(y_k) = \sum_j p(y_k|x_j) p(x_j);
- input probability distribution: p(x_j) = P(X = x_j).

Example 9.4  Binary symmetric channel
Figure 9.8  Transition probability diagram of the binary symmetric channel.

9.6 Mutual Information
How can we measure the uncertainty about X after observing Y?
Answer: the conditional entropy
H(X|Y = y_k) = \sum_{j=0}^{J-1} p(x_j|y_k) \log_2(1/p(x_j|y_k))    (9.40)
and its mean
H(X|Y) = \sum_{k=0}^{K-1} H(X|Y = y_k) p(y_k) = \sum_k \sum_j p(x_j, y_k) \log_2(1/p(x_j|y_k))    (9.41)
which is the amount of uncertainty remaining about the channel input after the channel output has been observed.

Mutual information
H(X): uncertainty about the channel input before observing the output.
H(X|Y): uncertainty about the channel input after observing the output.
H(X) - H(X|Y): uncertainty about the channel input that is resolved by observing the channel output.
I(X;Y) = H(X) - H(X|Y)    (9.43)
I(Y;X) = H(Y) - H(Y|X)    (9.44)
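A small numerical sketch of Equations (9.40)-(9.43) for the binary symmetric channel of Example 9.4, with an assumed input distribution and an assumed transition probability:

```python
import math

def H(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

p_err = 0.1                          # assumed BSC transition probability
p_x = [0.5, 0.5]                     # assumed input distribution
P = [[1 - p_err, p_err],             # channel matrix: rows = inputs, columns = outputs
     [p_err, 1 - p_err]]

# joint p(x_j, y_k) = p(y_k|x_j) p(x_j) and output marginal p(y_k)
p_xy = [[P[j][k] * p_x[j] for k in range(2)] for j in range(2)]
p_y = [sum(p_xy[j][k] for j in range(2)) for k in range(2)]

# H(X|Y) = sum_{j,k} p(x_j, y_k) log2( p(y_k) / p(x_j, y_k) )
H_X_given_Y = sum(p_xy[j][k] * math.log2(p_y[k] / p_xy[j][k])
                  for j in range(2) for k in range(2) if p_xy[j][k] > 0)

I_XY = H(p_x) - H_X_given_Y          # I(X;Y) = H(X) - H(X|Y), Eq. (9.43)
print(f"I(X;Y) = {I_XY:.4f} bits")   # about 0.531 bits for p = 0.1 and equiprobable inputs
```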
9.6.1 Properties of Mutual Information

Property 1 (symmetric): I(X;Y) = I(Y;X)    (9.45)
Property 2 (nonnegative): I(X;Y) \ge 0    (9.50)
Property 3: mutual information is related to the joint entropy of the channel input and channel output by
I(X;Y) = H(X) + H(Y) - H(X,Y)    (9.54)

Figure 9.9  Illustrating the relations among the various channel entropies.

9.7 Channel Capacity

For a discrete memoryless channel:
I(X;Y) = \sum_{j=0}^{J-1} \sum_{k=0}^{K-1} p(x_j, y_k) \log_2\left(\frac{p(y_k|x_j)}{p(y_k)}\right)    (9.49)
The mutual information of a channel therefore depends not only on the channel but also on the way in which the channel is used (i.e., on the input distribution).

Definition:
We define the channel capacity of a discrete memoryless channel as the maximum mutual information I(X;Y) in any single use of the channel (i.e., signaling interval), where the maximization is over all possible input probability distributions \{p(x_j)\} on X:
C = \max_{\{p(x_j)\}} I(X;Y)    (9.59)
subject to p(x_j) \ge 0 for all j and \sum_j p(x_j) = 1.

Note:
1. C is measured in bits per channel use, or bits per transmission.
2. C is a function only of the transition probabilities, which define the channel.
3. The variational problem of finding the channel capacity C is in general a challenging task.

Example 9.5  Binary symmetric channel
Transition probability p (see Figure 9.8); the capacity is attained with equiprobable inputs:
C = 1 + p \log_2 p + (1-p)\log_2(1-p) = 1 - H(p)
(see Figure 9.10).
Observations:
1. Noise-free channel: p = 0, C = 1 (maximum value).
2. Useless channel: p = 1/2, C = 0 (minimum value).

Figure 9.10  Variation of the channel capacity of a binary symmetric channel with transition probability p.
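A direct evaluation of C = 1 - H(p) at a few assumed transition probabilities, confirming the two observations above:

```python
import math

def bsc_capacity(p):
    """C = 1 - H(p) bits per channel use, H(p) being the binary entropy function."""
    if p in (0.0, 1.0):
        return 1.0
    return 1.0 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

for p in (0.0, 0.1, 0.25, 0.5):
    print(f"p = {p}: C = {bsc_capacity(p):.4f} bits/channel use")
# p = 0 gives C = 1 (noise-free channel); p = 1/2 gives C = 0 (useless channel).
```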
9.8 Channel-Coding Theorem

Goal: increase the resistance of a digital communication system to channel noise.
Why? Noise causes errors.
Figure 9.11  Block diagram of a digital communication system.

Block codes (n, k); code rate r = k/n.
Question: Does there exist a channel coding scheme such that the probability that a message bit will be in error is less than any positive number \varepsilon (i.e., an arbitrarily small probability of error), and yet the channel coding scheme is efficient in that the code rate need not be too small?
Channel coding: introduce controlled redundancy to improve reliability.
Source coding: reduce redundancy to improve efficiency.

Answer: Shannon's second theorem (the channel coding theorem). For a source of entropy H(S) emitting one symbol every T_s seconds and a channel of capacity C used once every T_c seconds:
1. If H(S)/T_s \le C/T_c, there exists a coding scheme with arbitrarily small probability of error; C/T_c is called the critical rate.    (9.61)
2. If H(S)/T_s > C/T_c, no such scheme exists.    (9.62)
(In words: average information rate \le channel capacity per unit time.)
The theorem specifies the channel capacity C as a fundamental limit on the rate at which the transmission of reliable, error-free messages can take place over a discrete memoryless channel.

NOTE:
- It is an existence proof; it does not tell us how to construct a good code.
- It gives no precise result for the probability of symbol error P_e after decoding the channel output (as the length of the code increases, P_e → 0).
- Power and bandwidth constraints were hidden in the discussion presented here (they show up in the channel matrix P of the discrete memoryless channel).

Application of the channel coding theorem to binary symmetric channels:
The source emits 0s and 1s every T_s seconds with entropy 1 bit per symbol, so the information rate is 1/T_s bps. After encoding, the channel accepts one symbol every T_c seconds at code rate r = T_c/T_s, so the transmission rate is 1/T_c symbols/s. Then, if
1/T_s \le C/T_c
the probability of error can be made arbitrarily low by the use of a suitable channel encoding scheme. Since r = T_c/T_s, for r \le C there exists a code capable of achieving an arbitrarily low probability of error.

Example 9.6  Repetition code
For a binary symmetric channel with C = 0.9192, the channel coding theorem says that for any \varepsilon > 0 and r \le C there exists a code of length n large enough, with rate r and an appropriate decoding algorithm, such that P_e < \varepsilon. See Figure 9.12.

Figure 9.12  Illustrating the significance of the channel coding theorem.

Example 9.6 (cont.)  Repetition code (n, 1) with n = 2m + 1; if n = 3, then 0 → 000 and 1 → 111.
Decoding: majority rule; an error occurs when m + 1 or more bits are received incorrectly.
Average probability of error:
P_e = \sum_{i=m+1}^{n} \binom{n}{i} p^{i}(1-p)^{n-i}
Characteristic: an exchange of code rate for message reliability (Table 9.3: as r decreases, P_e decreases); a numerical sketch follows.
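The promised sketch of the majority-rule error probability for the (n, 1) repetition code. The transition probability p = 10^{-2} is an assumption; it happens to give C \approx 0.9192 for a BSC, matching the capacity quoted above.

```python
from math import comb

def repetition_error(n, p):
    """P_e = sum_{i=m+1}^{n} C(n, i) p^i (1-p)^(n-i), with n = 2m + 1 and majority decoding."""
    m = (n - 1) // 2
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(m + 1, n + 1))

p = 1e-2                                   # assumed BSC transition probability
for n in (1, 3, 5, 7):
    print(f"n = {n}: code rate r = 1/{n}, Pe = {repetition_error(n, p):.2e}")
# The code rate falls as 1/n while Pe shrinks: rate is being exchanged for reliability.
```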
9.9 Differential Entropy and Mutual Information for Continuous Ensembles

Let X be a continuous random variable with probability density function f_X(x). We have
h(X) = \int_{-\infty}^{\infty} f_X(x) \log_2(1/f_X(x)) dx    (9.66)
h(X) is called the differential entropy of X.
Note: it is not a measure of the randomness of X; it is different from the ordinary or absolute entropy.
Indeed, if X is viewed as lying in an interval of width \Delta x with probability approximately f_X(x_k)\Delta x, the ordinary entropy of the continuous random variable X contains an extra term -\log_2 \Delta x that grows without bound as \Delta x \to 0; only the differential term h(X) is retained as a useful measure.

For a continuous random vector X consisting of n random variables X_1, X_2, \ldots, X_n with joint probability density function f_X(x), the differential entropy is
h(X) = \int f_X(x) \log_2(1/f_X(x)) dx    (9.68)

Example 9.7  Uniform distribution
A random variable X is uniformly distributed over the interval (0, a), with probability density function
f_X(x) = 1/a for 0 < x < a, and 0 otherwise.
Then we get
h(X) = \log_2 a    (9.69)
Note: \log_2 a < 0 for a < 1. Unlike a discrete random variable, the differential entropy of a continuous random variable can be negative.

Example 9.8  Gaussian distribution
Let X and Y be random variables. Assume:
1. X and Y have the same mean \mu and the same variance \sigma^2.
2. X is Gaussian distributed:
f_X(x) = \frac{1}{\sqrt{2\pi}\sigma} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)
Using the fundamental inequality of Equation (9.12), Equations (9.70)-(9.77) show that
h(Y) \le h(X), with equality if and only if f_Y(x) = f_X(x),
and that the differential entropy of the Gaussian variable is
h(X) = \frac{1}{2}\log_2(2\pi e \sigma^2)

Summary (two entropic properties of a Gaussian random variable):
1. For a finite variance \sigma^2, the Gaussian random variable has the largest differential entropy attainable by any random variable.
2. The entropy of a Gaussian random variable X is uniquely determined by the variance of X (i.e., it is independent of the mean of X).
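A numerical check of the two examples above: h(X) = \log_2 a can indeed be negative for a < 1, and for the same variance the Gaussian differential entropy exceeds that of the uniform density. The specific value of a is illustrative.

```python
import math

def h_uniform(a):
    """Differential entropy of U(0, a): h(X) = log2(a), Eq. (9.69)."""
    return math.log2(a)

def h_gaussian(var):
    """Differential entropy of a Gaussian with variance var: (1/2) log2(2*pi*e*var)."""
    return 0.5 * math.log2(2 * math.pi * math.e * var)

a = 0.5
print(h_uniform(a))              # -1.0 bits: differential entropy can be negative
var = a**2 / 12                  # variance of U(0, a)
print(h_gaussian(var))           # about -0.75 bits: larger, as the Gaussian maximality property predicts
```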
9.9.1 Mutual Information

For a pair of continuous random variables X and Y, the mutual information is
I(X;Y) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(x,y) \log_2\left(\frac{f_X(x|y)}{f_X(x)}\right) dx\,dy    (9.78)
Properties:
I(X;Y) = I(Y;X)    (9.79)
I(X;Y) \ge 0    (9.80)
I(X;Y) = h(X) - h(X|Y) = h(Y) - h(Y|X)    (9.81)
where h(X) and h(Y) are the differential entropies of X and Y, h(X|Y) is the conditional differential entropy of X given Y, and h(Y|X) is the conditional differential entropy of Y given X. The conditional differential entropy is
h(X|Y) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(x,y) \log_2(1/f_X(x|y)) dx\,dy    (9.82)

9.10 Information Capacity Theorem

The information capacity theorem is stated for band-limited, power-limited Gaussian channels.
Let the signal X(t) be a zero-mean stationary process, band-limited to B hertz and transmitted over a noisy channel for T seconds. The number of samples is
K = 2BT    (9.83)
Let X_k, k = 1, 2, \ldots, K, be the continuous random variables obtained by uniform sampling of the process X(t) at the Nyquist rate of 2B samples per second.

Noise:
AWGN with zero mean, power spectral density N_0/2, band-limited to B hertz. The noise sample N_k is Gaussian with zero mean and variance given by
\sigma^2 = N_0 B    (9.84)
The samples of the received signal are
Y_k = X_k + N_k,  k = 1, 2, \ldots, K    (9.85)

Figure 9.13  Model of the discrete-time, memoryless Gaussian channel.

The cost assigned to each channel input is the power constraint
E[X_k^2] = P,  k = 1, 2, \ldots, K    (9.86)
where P is the average transmitted power.
The information capacity of the channel is the maximum of the mutual information between the channel input X_k and the channel output Y_k over all distributions on the input X_k that satisfy the power constraint of Equation (9.86):
C = \max_{f_{X_k}(x)} \{ I(X_k; Y_k) : E[X_k^2] = P \}    (9.87)

Since X_k and N_k are independent,
I(X_k; Y_k) = h(Y_k) - h(Y_k|X_k)    (9.88)
h(Y_k|X_k) = h(N_k)    (9.89)
I(X_k; Y_k) = h(Y_k) - h(N_k)    (9.90)
Maximizing I(X_k;Y_k) therefore requires maximizing h(Y_k). For h(Y_k) to be maximum, Y_k has to be a Gaussian random variable; that is, the samples of the received signal represent a noiselike process. Next, since N_k is Gaussian by assumption, the sample X_k of the transmitted signal must be Gaussian too. So
C = I(X_k; Y_k) with X_k Gaussian and E[X_k^2] = P    (9.91)
The maximization specified in Equation (9.87) is attained by choosing the samples of the transmitted signal from a noiselike process of average power P.

Three stages for the evaluation of the information capacity C:
1. The variance of Y_k is P + \sigma^2, so
h(Y_k) = \frac{1}{2}\log_2[2\pi e(P + \sigma^2)]    (9.92)
2. The variance of N_k is \sigma^2, so
h(N_k) = \frac{1}{2}\log_2(2\pi e \sigma^2)    (9.93)
3. The information capacity is
C = \frac{1}{2}\log_2\left(1 + \frac{P}{\sigma^2}\right)  bits per channel use    (9.94)
or, in the equivalent form (K/T times C),
C = B\log_2\left(1 + \frac{P}{N_0 B}\right)  bits per second    (9.95)

Shannon's third theorem, the information capacity theorem:
The information capacity of a continuous channel of bandwidth B hertz, perturbed by additive white Gaussian noise of power spectral density N_0/2 and limited in bandwidth to B, is given by
C = B\log_2\left(1 + \frac{P}{N_0 B}\right)
where P is the average transmitted power.
The channel capacity theorem defines the fundamental limit on the rate of error-free transmission for a power-limited, band-limited Gaussian channel. To approach this limit, the transmitted signal must have statistical properties approximating those of white Gaussian noise.
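A direct evaluation of Equation (9.95) for assumed values of the bandwidth, the noise density, and the signal-to-noise ratio:

```python
import math

def awgn_capacity(B, P, N0):
    """C = B * log2(1 + P / (N0 * B)) bits per second (information capacity theorem)."""
    return B * math.log2(1 + P / (N0 * B))

B = 3_000.0       # assumed channel bandwidth, Hz
N0 = 1e-8         # assumed value of N0, W/Hz
for snr_db in (0, 10, 20, 30):
    P = (10 ** (snr_db / 10)) * N0 * B       # transmit power giving SNR = P / (N0 * B)
    print(f"SNR = {snr_db:2d} dB: C = {awgn_capacity(B, P, N0):8.0f} bit/s")
```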
9.10.1 Sphere Packing

Purpose: to support the information capacity theorem.
Consider an encoding scheme that yields K codewords of length (number of bits) n, with power constraint nP, where P is the average power per bit. The received vector of n bits is Gaussian distributed, with mean equal to the transmitted codeword and variance equal to \sigma^2, the noise variance.
With high probability, the received vector lies inside a sphere of radius \sqrt{n\sigma^2} centered on the transmitted codeword. This sphere is itself contained in a larger sphere of radius \sqrt{n(P + \sigma^2)}, where n(P + \sigma^2) is the average power of the received vector. See Figure 9.14.

Figure 9.14  The sphere-packing problem.

Question: How many decoding spheres can be packed inside the large sphere of received vectors? In other words, how many codewords can we in fact choose?
First recognize that the volume of an n-dimensional sphere of radius r may be written as A_n r^n, where A_n is a scaling factor. Then:
1. The volume of the sphere of received vectors is A_n [n(P + \sigma^2)]^{n/2}.
2. The volume of a decoding sphere is A_n (n\sigma^2)^{n/2}.
The maximum number of nonintersecting decoding spheres that can be packed inside the sphere of possible received vectors is therefore
\frac{A_n [n(P+\sigma^2)]^{n/2}}{A_n (n\sigma^2)^{n/2}} = \left(1 + \frac{P}{\sigma^2}\right)^{n/2} = 2^{(n/2)\log_2(1 + P/\sigma^2)}    (9.96)

Example 9.9  Reconfiguration of a constellation for reduced power: 64-QAM (Figure 9.15).
Figure 9.15b has an advantage over Figure 9.15a: a smaller transmitted average signal energy per symbol for the same BER on an AWGN channel.
Figure 9.15  (a) Square 64-QAM constellation. (b) The most tightly coupled alternative to that of part (a).
At high SNR on an AWGN channel, and for the same BER, the squared Euclidean distances from the message points to the origin are smaller in (b) than in (a).

9.11 Implications of the Information Capacity Theorem

Ideal system: R_b = C.
The average transmitted power may be expressed as
P = E_b C    (9.97)
Accordingly, the ideal system is defined by
\frac{C}{B} = \log_2\left(1 + \frac{E_b}{N_0}\frac{C}{B}\right)    (9.98)
so that the signal energy-per-bit to noise power spectral density ratio is
\frac{E_b}{N_0} = \frac{2^{C/B} - 1}{C/B}    (9.99)
An ideal system is needed to assess the performance of a practical system.

Bandwidth-efficiency diagram: a plot of the bandwidth efficiency R_b/B versus E_b/N_0 (Figure 9.16), where the curve labeled "capacity boundary" corresponds to the ideal system for which R_b = C.
Observations:
1. For infinite bandwidth, E_b/N_0 approaches the limiting value
\left(\frac{E_b}{N_0}\right)_{\infty} = \ln 2 = 0.693    (9.100)
This value is called the Shannon limit for an AWGN channel, assuming a code rate of zero; it corresponds to -1.6 dB.
Figure 9.16  Bandwidth-efficiency diagram.

   The corresponding limiting value of the channel capacity is
   C_\infty = \lim_{B \to \infty} C = \frac{P}{N_0}\log_2 e    (9.101)
2. The capacity boundary is defined by the curve for the critical bit rate R_b = C:
   for R_b < C, error-free transmission is possible; for R_b > C, error-free transmission is not possible.
3. The diagram highlights potential trade-offs among E_b/N_0, R_b/B, and the probability of symbol error P_e.
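A sketch of the capacity boundary of Equations (9.98)-(9.100): the E_b/N_0 required at several assumed bandwidth efficiencies, together with the infinite-bandwidth (Shannon) limit of about -1.6 dB.

```python
import math

def ebn0_required(eta):
    """Eb/N0 on the capacity boundary for bandwidth efficiency eta = Rb/B, Eq. (9.99)."""
    return (2 ** eta - 1) / eta

for eta in (4.0, 2.0, 1.0, 0.5, 0.1, 0.01):
    ratio = ebn0_required(eta)
    print(f"Rb/B = {eta:5.2f}: Eb/N0 = {10 * math.log10(ratio):6.2f} dB")

# Limit as Rb/B -> 0: Eb/N0 -> ln 2, i.e. about -1.59 dB (the Shannon limit).
print(f"limit: {10 * math.log10(math.log(2)):.2f} dB")
```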
The system operates above threshold, so the average probability of error due to channel noise is negligible.
A codeword consists of n code elements, each having one of M possible discrete amplitude levels.
Noise margin: sufficiently large to maintain a negligible error rate due to channel noise.
Hence there must be a certain separation between these M possible discrete amplitude levels; call it k\sigma, where k is a constant, \sigma^2 is the noise variance, and B is the channel bandwidth.
The average transmitted power will be least if the amplitude range is symmetrical about zero. The discrete amplitude levels, normalized with respect to the separation k\sigma, then take the values \pm 1/2, \pm 3/2, \ldots, \pm (M-1)/2, and (assuming equal a priori probabilities) the average transmitted power is
P = \frac{k^2\sigma^2}{12}(M^2 - 1)    (9.102)

Let W hertz be the highest frequency component of the message signal, 2W the sampling rate, and L the number of representation levels of the quantizer (equally likely). The maximum rate of information transmission is
R = 2W\log_2 L  bits per second    (9.103)
For a unique coding process L = M^n, and solving Equation (9.102) for M gives (Equations (9.104)-(9.107))
R = 2nW\log_2 M = nW\log_2\left(1 + \frac{12P}{k^2\sigma^2}\right)
The channel bandwidth B required to transmit a rectangular pulse of duration 1/(2nW) is B = \kappa nW, where \kappa is a constant with a value lying between 1 and 2 (Equation (9.108)). Using \kappa = 1 (the minimum value),
R = B\log_2\left(1 + \frac{12P}{k^2\sigma^2}\right)
This has the same form as the information capacity theorem; the two are identical if the average transmitted power in the PCM system is increased by the factor k^2/12 compared with the ideal system. Power and bandwidth in a PCM system are thus exchanged on a logarithmic basis.