Possibility Theory and its applications: a retrospective and prospective view

D. Dubois, H. Prade
IRIT-CNRS, Université Paul Sabatier, 31062 TOULOUSE, FRANCE

Outline
- Basic definitions
- Pioneers
- Qualitative possibility theory
- Quantitative possibility theory

Possibility theory is an uncertainty theory devoted to the handling of incomplete information.
- It is similar to probability theory because it is based on set-functions.
- It differs by the use of a pair of dual set functions (possibility and necessity measures) instead of only one.
- It is not additive and makes sense on ordinal structures.
The name "Theory of Possibility" was coined by Zadeh in 1978.
The concept of possibility
- Feasibility: it is possible to do something (physical)
- Plausibility: it is possible that something occurs (epistemic)
- Consistency: compatible with what is known (logical)
- Permission: it is allowed to do something (deontic)
POSSIBILITY DISTRIBUTIONS (uncertainty)
- S: frame of discernment (set of "states of the world")
- x: ill-known description of the current state of affairs, taking its value on S
- L: plausibility scale: a totally ordered set of plausibility levels ([0, 1], a finite chain, the integers)

A possibility distribution πx attached to x is a mapping from S to L: ∀s, πx(s) ∈ L, such that ∃s, πx(s) = 1 (normalization).

Conventions:
- πx(s) = 0 iff x = s is impossible, totally excluded
- πx(s) = 1 iff x = s is normal, fully plausible, unsurprising
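As a minimal illustration (not from the slides; the state names and levels are hypothetical), a possibility distribution on a finite frame can be coded as a mapping into [0, 1], and the normalization condition checked directly:

```python
# A possibility distribution as a dict from states of S to levels in [0, 1].
# The states "s1", "s2", "s3" and their levels are hypothetical.
pi = {"s1": 1.0, "s2": 0.7, "s3": 0.0}

def is_normalized(pi):
    """Normalization: at least one state must be fully plausible (level 1)."""
    return max(pi.values()) == 1.0

assert is_normalized(pi)
assert pi["s3"] == 0.0  # convention: level 0 means x = s3 is totally excluded
```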
EXAMPLE: x = AGE OF PRESIDENT
If I do not know the age of the president, I may have statistics on presidents' ages… but generally not, or they may be irrelevant.
- Partial ignorance: 70 ≤ x ≤ 80 (sets, intervals) yields a uniform possibility distribution: π(x) = 1 if x ∈ [70, 80], = 0 otherwise.
- Partial ignorance with preferences: I may have reasons to believe that 72 > 71 ≈ 73 > 70 ≈ 74 > 75 > 76 > 77.
EXAMPLE: x = AGE OF PRESIDENT (continued)
- Linguistic information described by fuzzy sets: "he is old": π = μOLD.
- If I bet on the president's age, I may come up with a subjective probability! But this result is enforced by the setting of exchangeable bets (Dutch book argument). Actual information is…
A possibility distribution is the representation of a state of knowledge: a description of how we think the state of affairs is.

π′ is more specific than π in the wide sense if and only if π′ ≤ π. In other words, any value possible for π′ should be at least as possible for π; then π′ is more informative than π.
- COMPLETE KNOWLEDGE: the most specific ones: π(s0) = 1; π(s) = 0 otherwise.
- IGNORANCE: π(s) = 1, ∀s ∈ S.
POSSIBILITY AND NECESSITY OF AN EVENT
Given a possibility distribution on S (the normal values of x) and an event A, how confident are we that x ∈ A ⊆ S?
- Π(A) = max {π(s) : s ∈ A}: the degree of possibility of A.
- N(A) = 1 − Π(Aᶜ) = min {1 − π(s) : s ∉ A}: the degree of certainty (necessity) of A.
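A direct sketch of these two dual set functions on a finite frame (an illustration, not the slides' code; the state names and levels are hypothetical):

```python
def poss(A, pi):
    """Degree of possibility: Pi(A) = max of pi(s) over s in A (0 if A is empty)."""
    return max((pi[s] for s in A), default=0.0)

def nec(A, pi):
    """Degree of necessity: N(A) = 1 - Pi(complement of A)."""
    complement = set(pi) - set(A)
    return 1.0 - poss(complement, pi)

pi = {"s1": 1.0, "s2": 0.6, "s3": 0.2}
assert poss({"s2", "s3"}, pi) == 0.6
assert nec({"s1", "s2"}, pi) == 0.8   # 1 - pi("s3")
```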
Comparing the value of a quantity x to a threshold θ, when the value of x is only known to belong to an interval [a, b]. In this example, the available knowledge is modeled by π(x) = 1 if x ∈ [a, b], 0 otherwise. Let p = "x > θ" be the proposition to be checked:
i) if a > θ, then "x > θ" is certainly true: N(x > θ) = Π(x > θ) = 1;
ii) if b < θ, then "x > θ" is certainly false: N(x > θ) = Π(x > θ) = 0;
iii) if a ≤ θ ≤ b, then "x > θ" is possibly true or false: N(x > θ) = 0; Π(x > θ) = 1.
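The three cases can be checked mechanically; the sketch below (an illustration, not from the slides) returns the pair (N, Π) for the proposition "x > θ" given only x ∈ [a, b]:

```python
def compare_to_threshold(a, b, theta):
    """(N, Pi) of "x > theta" when all we know is x in [a, b]."""
    if a > theta:            # case i: certainly true
        return 1.0, 1.0
    if b < theta:            # case ii: certainly false
        return 0.0, 0.0
    return 0.0, 1.0          # case iii: a <= theta <= b, possibly true or false

assert compare_to_threshold(70, 80, 60) == (1.0, 1.0)
assert compare_to_threshold(70, 80, 90) == (0.0, 0.0)
assert compare_to_threshold(70, 80, 75) == (0.0, 1.0)
```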
Basic properties
- Π(A) = to what extent at least one element in A is consistent with π (= possible).
- N(A) = 1 − Π(Aᶜ) = to what extent no element outside A is possible = to what extent π implies A.
- Π(A ∪ B) = max(Π(A), Π(B)); N(A ∩ B) = min(N(A), N(B)).
Mind that most of the time:
- Π(A ∩ B) < min(Π(A), Π(B));
- N(A ∪ B) > max(N(A), N(B)).
Corollary: N(A) > 0 implies Π(A) = 1.
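A quick numeric check of maxitivity, the strict inequality for intersections, and the corollary, on a hypothetical two-state frame (illustrative sketch):

```python
def poss(A, pi):
    return max((pi[s] for s in A), default=0.0)

def nec(A, pi):
    return 1.0 - poss(set(pi) - set(A), pi)

pi = {"s1": 1.0, "s2": 0.4}   # hypothetical distribution
A, B = {"s1"}, {"s2"}
# Maxitivity holds exactly:
assert poss(A | B, pi) == max(poss(A, pi), poss(B, pi))
# ... while Pi(A ∩ B) can fall strictly below min(Pi(A), Pi(B)):
assert poss(A & B, pi) < min(poss(A, pi), poss(B, pi))
# Corollary: N(A) > 0 forces Pi(A) = 1.
assert nec(A, pi) > 0 and poss(A, pi) == 1.0
```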
Pioneers of possibility theory
In the 1950's, G.L.S. Shackle called "degree of potential surprize" of an event its degree of impossibility. Potential surprize is valued on a disbelief scale, namely a positive interval of the form [0, y*], where y* denotes the absolute rejection of the event to which it is assigned. The degree of surprize of an event is the degree of surprize of its least surprizing realization. He introduces a notion of conditional possibility.
Pioneers of possibility theory
In his 1973 book, the philosopher David Lewis considers a relation between possible worlds he calls "comparative possibility": for events A, B, C, A ≽ B implies C ∪ A ≽ C ∪ B. He relates this concept of possibility to a notion of similarity between possible worlds, for defining the truth conditions of counterfactual statements. These relations are the one and only ordinal counterparts to possibility measures.
Pioneers of possibility theory
The philosopher L. J. Cohen considered the problem of legal reasoning (1977): "Baconian probabilities", understood as degrees of provability. It is hard to prove someone guilty at the court of law by means of pure statistical arguments. A hypothesis and its negation cannot both have positive "provability". Such degrees of provability coincide with necessity measures.
Pioneers of possibility theory
Zadeh (1978) proposed an interpretation of membership functions of fuzzy sets as possibility distributions encoding flexible constraints induced by natural language statements.
- Relationship between possibility and probability: what is probable must preliminarily be possible.
- Refers to the idea of graded feasibility ("degrees of ease") rather than to the epistemic notion of plausibility.
- The key axiom of "maxitivity" for possibility measures is highlighted (also for fuzzy events).
Qualitative vs. quantitative possibility theories
Qualitative:
- comparative: a complete pre-ordering ≥π on U; a well-ordered partition of U: E1 > E2 > … > En
- absolute: πx(s) ∈ L = a finite chain, a complete lattice...
Quantitative: πx(s) ∈ [0, 1], integers... One must indicate where the numbers come from.
All theories agree on the fundamental maxitivity axiom: Π(A ∪ B) = max(Π(A), Π(B)). Theories diverge on the conditioning operation.
Ordinal possibilistic conditioning
A Bayesian-like equation: Π(A ∩ B) = min(Π(B | A), Π(A)).
Π(B | A) is the maximal solution to this equation:
- Π(B | A) = 1 if Π(A ∩ B) = Π(A), with A ∩ B ≠ ∅;
- Π(B | A) = Π(A ∩ B) otherwise;
and N(B | A) = 1 − Π(Bᶜ | A).
If Π(B | A) = Π(B) (independence), then Π(A ∩ B) = min(Π(A), Π(B)); but not the converse!
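A sketch of this min-based conditioning rule (an illustration, not the slides' code; the three-state distribution is hypothetical), which also checks the Bayesian-like equation:

```python
def poss(A, pi):
    return max((pi[s] for s in A), default=0.0)

def cond_poss(B, A, pi):
    """Ordinal conditioning: Pi(B | A) = 1 if Pi(A ∩ B) = Pi(A) > 0,
    and Pi(A ∩ B) otherwise (the maximal solution of the min equation)."""
    pab, pa = poss(A & B, pi), poss(A, pi)
    return 1.0 if pab == pa and pa > 0 else pab

pi = {"s1": 1.0, "s2": 0.5, "s3": 0.2}
A = {"s2", "s3"}
assert cond_poss({"s2"}, A, pi) == 1.0   # Pi(A ∩ B) = Pi(A) = 0.5
assert cond_poss({"s3"}, A, pi) == 0.2
# Pi(A ∩ B) = min(Pi(B | A), Pi(A)) holds:
assert poss(A & {"s3"}, pi) == min(cond_poss({"s3"}, A, pi), poss(A, pi))
```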
QUALITATIVE POSSIBILISTIC REASONING
The set of states of affairs is partitioned via π into a totally ordered set of clusters of equally plausible states: E1 (normal worlds) > E2 > ... > En+1 (impossible worlds).
ASSUMPTION: the current situation is normal. By default, the state of affairs is in E1.
N(A) > 0 iff Π(A) > Π(Aᶜ) iff A is true in all the normal situations. Then A is accepted as an expected truth. Accepted events are closed under deduction.
A CALCULUS OF PLAUSIBLE INFERENCE
Π(B) ≥ Π(C) means: comparing propositions on the basis of their most normal models.
ASSUMPTION for computing Π(B): the current situation is the most normal one where B is true.
PLAUSIBLE REASONING = "reasoning as if the current situation were normal", and jumping to accepted conclusions obtained from the normality assumption.
DIFFERENT FROM PROBABILISTIC REASONING BASED ON AVERAGING.
ACCEPTANCE IS DEFEASIBLE
If B is learned to be true, then the normal situations become the most plausible ones in B, and the accepted beliefs are revised accordingly. Accepting A in the context where B is true: N(A | B) > 0 iff Π(A ∩ B) > Π(Aᶜ ∩ B) (conditioning). One may have N(A) > 0 and N(Aᶜ | B) > 0: non-monotony.
PLAUSIBLE INFERENCE WITH A POSSIBILITY DISTRIBUTION
Given a non-dogmatic possibility distribution π on S (π(s) > 0, ∀s) and propositions A and B:
A |=π B iff Π(A ∩ B) > Π(A ∩ Bᶜ).
It means that B is true in all the most plausible worlds in A. This is a form of inference first proposed by Shoham in nonmonotonic reasoning.

Example (continued)
Pieces of knowledge like Δ = {b → f, p → b, p → ¬f} can be expressed by constraints:
- Π(b ∧ f) > Π(b ∧ ¬f)
- Π(p ∧ b) > Π(p ∧ ¬b)
- Π(p ∧ ¬f) > Π(p ∧ f)
The minimally specific π* ranks normal situations first: ¬p ∧ b ∧ f, ¬p ∧ ¬b; then abnormal situations: ¬f ∧ …; last, totally absurd situations: b ∧ f ∧ p, ¬b ∧ p.
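The birds/penguins example can be encoded directly; the sketch below is not from the slides, and the three numeric levels (1.0 / 0.5 / 0.1) are an arbitrary encoding of the ranking where only the order matters. It checks the nonmonotonic inference A |=π B iff Π(A ∩ B) > Π(A ∩ Bᶜ):

```python
# States are triples of truth values (b, f, p): bird, flies, penguin.
states = {(b, f, p) for b in (0, 1) for f in (0, 1) for p in (0, 1)}

def level(s):
    b, f, p = s
    if not p and ((b and f) or not b):
        return 1.0   # normal: non-penguin birds that fly, non-penguin non-birds
    if p and ((f and b) or not b):
        return 0.1   # absurd: flying penguins, non-bird penguins
    return 0.5       # abnormal: remaining non-flying situations

def poss(A):
    return max((level(s) for s in A), default=0.0)

def entails(A, B):
    """A |=pi B iff Pi(A ∩ B) > Pi(A ∩ not-B)."""
    return poss(A & B) > poss(A - B)

birds    = {s for s in states if s[0]}
flies    = {s for s in states if s[1]}
penguins = {s for s in states if s[2]}
assert entails(birds, flies)               # birds normally fly
assert entails(penguins, states - flies)   # ... but penguins normally do not
```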
Example (back to possibilistic logic)
Here → is material implication. Ranking of rules: b → f has less priority than the others; according to π*: N*(p → ¬f) = N*(p → b) > N*(b → f).
Possibilistic base: K = {(b → f, α), (p → b, β), (p → ¬f, β)}, with α < β.
Applications of qualitative possibility theory
- Exception-tolerant reasoning in rule bases
- Belief revision and inconsistency handling in deductive knowledge bases
- Handling priority in constraint-based reasoning
- Decision-making under uncertainty with qualitative criteria (scheduling)
- Abductive reasoning for diagnosis under poor causal knowledge (satellite faults, car engine test-benches)
ABSOLUTE APPROACH TO QUALITATIVE DECISION
A set of states S; a set of consequences X. A decision is a mapping f from S to X: f(s) is the consequence of decision f when the state is known to be s. Problem: rank-order the set of decisions in X^S when the state is ill-known and there is a utility function on X. This is Savage's framework.
ABSOLUTE APPROACH TO QUALITATIVE DECISION
Uncertainty on states is possibilistic: a function π: S → L, where L is a totally ordered plausibility scale.
Preference on consequences: a qualitative utility function μ: X → U:
- μ(x) = 0: totally rejected consequence
- μ(y) > μ(x): y preferred to x
- μ(x) = 1: preferred consequence
Possibilistic decision criteria
- Qualitative pessimistic utility (Whalen): UPES(f) = min {max(n(π(s)), μ(f(s))) : s ∈ S}, where n is the order-reversing map of V. Low utility: some plausible state has bad consequences.
- Qualitative optimistic utility (Yager): UOPT(f) = max {min(π(s), μ(f(s))) : s ∈ S}. High utility: some plausible states have good consequences.
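Both criteria are one-liners over a finite frame. In this sketch (not from the slides), the states, consequences, and the choice n(a) = 1 − a on V = [0, 1] are assumptions for illustration:

```python
def u_pes(f, pi, mu):
    """Pessimistic utility (Whalen): min over s of max(n(pi(s)), mu(f(s))),
    with n(a) = 1 - a as the order-reversing map."""
    return min(max(1 - pi[s], mu[f[s]]) for s in pi)

def u_opt(f, pi, mu):
    """Optimistic utility (Yager): max over s of min(pi(s), mu(f(s)))."""
    return max(min(pi[s], mu[f[s]]) for s in pi)

pi = {"s1": 1.0, "s2": 0.3}            # s1 plausible, s2 much less so
mu = {"good": 1.0, "bad": 0.0}
f  = {"s1": "good", "s2": "bad"}        # bad outcome only in the implausible state
assert u_pes(f, pi, mu) == 0.7          # max(1 - 0.3, 0) in s2
assert u_opt(f, pi, mu) == 1.0          # min(1, 1) in s1
```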
The pessimistic and optimistic utilities are well-known fuzzy pattern-matching indices:
- in fuzzy expert systems: μ = membership function of the rule condition; π = imprecision of the input fact
- in fuzzy databases: μ = membership function of the query; π = distribution of stored imprecise data
- in pattern recognition: μ = membership function of the attribute template; π = distribution of an ill-known object attribute
Assumption: the plausibility and preference scales L and U are commensurate. There exists a common scale V that contains both L and U, so that confidence and uncertainty levels can be compared (certainty equivalent of a lottery).
If only a subset E of plausible states is known, π = μE:
- UPES(f) = min {μ(f(s)) : s ∈ E} (utility of the worst consequence in E): the criterion of Wald under ignorance;
- UOPT(f) = max {μ(f(s)) : s ∈ E}.
On a linear state space: (illustration omitted)
Pessimistic qualitative utility of binary acts
xAy, with μ(x) > μ(y): xAy(s) = x if A occurs, = y if its complement occurs.
UPES(xAy) = median{μ(x), N(A), μ(y)}
Interpretation: if the agent is sure enough of A, it is as if the consequence is x: UPES(f) = μ(x). If he is not sure about A, it is as if the consequence is y: UPES(f) = μ(y). Otherwise, the utility reflects certainty: UPES(f) = N(A).
With UOPT(f): replace N(A) by Π(A).
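The median formula can be verified numerically with `statistics.median` on the three values (an illustrative sketch; the act, scales, and levels below are hypothetical):

```python
import statistics

def poss(A, pi):
    return max((pi[s] for s in A), default=0.0)

def nec(A, pi):
    return 1.0 - poss(set(pi) - set(A), pi)

def u_pes(f, pi, mu):
    return min(max(1 - pi[s], mu[f[s]]) for s in pi)

# Binary act xAy: consequence x on A, y elsewhere, with mu(x) > mu(y).
pi = {"s1": 1.0, "s2": 0.4}
mu = {"x": 0.9, "y": 0.1}
A  = {"s1"}
xAy = {s: ("x" if s in A else "y") for s in pi}
assert u_pes(xAy, pi, mu) == statistics.median([mu["x"], nec(A, pi), mu["y"]])
```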
Representation theorem for pessimistic possibilistic criteria
Suppose the preference relation ≽ on acts obeys the following properties:
- (X^S, ≽) is a complete preorder;
- there are two acts f, g such that f ≻ g;
- ∀A, ∀f, and constant acts x, y: x ≽ y implies xAf ≽ yAf;
- if f ≻ h and g ≻ h, then f ∧ g ≻ h;
- if x is constant, h ≻ x and h ≻ g imply h ≻ x ∧ g;
then there exists a finite chain L, an L-valued necessity measure on S, and an L-valued utility function u, such that ≽ is representable by the pessimistic possibilistic criterion UPES(f).
Merits and limitations of qualitative decision theory
- Provides a foundation for possibility theory: possibility theory is justified by observing how a decision-maker ranks acts.
- Applies to one-shot decisions (no compensation/accumulation effects in repeated decision steps).
- Presupposes that consecutive qualitative value levels are distant from each other (negligibility effects).

Quantitative possibility theory
Membership functions of fuzzy sets:
- natural language descriptions pertaining to numerical universes (fuzzy numbers)
- results of fuzzy clustering
- semantics: metrics, proximity to prototypes
Upper probability bound:
- random experiments with imprecise outcomes
- consonant approximations of convex probability sets
- semantics: frequentist, subjectivist (gambles)...
Quantitative possibility theory (continued)
- Orders of magnitude of very small probabilities: degrees of impossibility κ(A) ranging on the integers, with κ(A) = n iff P(A) = εⁿ.
- Likelihood functions (P(A | x), where x varies) behave like possibility distributions: ∀B, P(A | B) ≤ max {P(A | x) : x ∈ B}.
POSSIBILITY AS UPPER PROBABILITY
Given a numerical possibility distribution π, define
P(π) = {probabilities P | P(A) ≤ Π(A) for all A}.
Then it generally holds that
Π(A) = sup {P(A) | P ∈ P(π)}, N(A) = inf {P(A) | P ∈ P(π)}.
So π is a faithful representation of a family of probability measures.
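On a small finite frame, membership of P in P(π) can be tested exhaustively over all events (an illustrative sketch, not from the slides; the distributions are hypothetical):

```python
from itertools import chain, combinations

def poss(A, pi):
    return max((pi[s] for s in A), default=0.0)

def dominated(P, pi):
    """Check P(A) <= Pi(A) for every event A, with P and pi on a finite S."""
    S = list(pi)
    events = chain.from_iterable(combinations(S, r) for r in range(len(S) + 1))
    return all(sum(P[s] for s in A) <= poss(set(A), pi) + 1e-12 for A in events)

pi = {"s1": 1.0, "s2": 0.5, "s3": 0.5}
assert dominated({"s1": 0.5, "s2": 0.3, "s3": 0.2}, pi)
assert not dominated({"s1": 0.2, "s2": 0.5, "s3": 0.3}, pi)  # P({s2,s3}) > 0.5
```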
From confidence sets to possibility distributions
Consider a nested family of sets E1 ⊆ E2 ⊆ … ⊆ En, a set of positive numbers a1, …, an in [0, 1], and the family of probability functions P = {P | P(Ei) ≥ ai for all i}. P is always representable by means of a possibility measure. Its possibility distribution is precisely π = min_i max(μEi, 1 − ai).
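The formula π = min_i max(μEi, 1 − ai) translates directly; in this sketch (not from the slides) the nested sets and confidence levels are hypothetical:

```python
def pi_from_confidence_sets(E, a, universe):
    """pi(x) = min over i of max(membership of x in E_i, 1 - a_i),
    for nested sets E_1 ⊆ ... ⊆ E_n with P(E_i) >= a_i."""
    return {x: min(max(1.0 if x in Ei else 0.0, 1 - ai)
                   for Ei, ai in zip(E, a))
            for x in universe}

universe = {1, 2, 3, 4}
E = [{2}, {1, 2, 3}]              # nested: {2} ⊆ {1, 2, 3}
a = [0.5, 0.9]
pi = pi_from_confidence_sets(E, a, universe)
assert pi[2] == 1.0               # inside every E_i
assert pi[1] == 0.5               # outside E_1 only: 1 - 0.5
assert abs(pi[4] - 0.1) < 1e-9    # outside both: min(0.5, 1 - 0.9)
```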
Random set view
Let α1 > α2 > … > αn be the distinct values of π, Ai the corresponding cuts, and mi = αi − αi+1 (with αn+1 = 0); then m1 + … + mn = 1: a basic probability assignment (Shafer). Conversely, π(s) = Σ {mi : s ∈ Ai} (the one-point coverage function). Only in the consonant case can m be recalculated from π.
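In the consonant case, the masses can be recovered from the level cuts of π, and the one-point coverage function gives π back (illustrative sketch, hypothetical distribution):

```python
def masses_from_pi(pi):
    """Consonant basic assignment: focal sets are the cuts
    A_i = {s : pi(s) >= alpha_i}, with mass m_i = alpha_i - alpha_{i+1}."""
    levels = sorted(set(pi.values()), reverse=True) + [0.0]
    return [(frozenset(s for s in pi if pi[s] >= levels[i]),
             levels[i] - levels[i + 1])
            for i in range(len(levels) - 1)]

pi = {"s1": 1.0, "s2": 0.6, "s3": 0.6, "s4": 0.2}
m = dict(masses_from_pi(pi))
assert abs(sum(m.values()) - 1.0) < 1e-9      # masses sum to 1
# One-point coverage recovers pi: pi(s) = sum of m(A) over A containing s.
assert all(abs(sum(v for A, v in m.items() if s in A) - pi[s]) < 1e-9
           for s in pi)
```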
CONDITIONAL POSSIBILITY MEASURES
A Coxian axiom: Π(A ∩ C) = Π(A | C) ∗ Π(C), with ∗ = product. Then:
Π(A | C) = Π(A ∩ C)/Π(C); N(A | C) = 1 − Π(Aᶜ | C).
This is Dempster's rule of conditioning (it preserves maxitivity). For the revision of possibility distributions: minimal change of π when N(C) = 1. It improves the state of information (reduction of focal elements).
Bayesian possibilistic conditioning
Π(A |b C) = sup {P(A | C) : P ≤ Π, P(C) > 0}
N(A |b C) = inf {P(A | C) : P ≤ Π, P(C) > 0}
It is still a possibility measure. It can be shown that:
Π(A |b C) = Π(A ∩ C) / (Π(A ∩ C) + N(Aᶜ ∩ C))
N(A |b C) = 1 − Π(Aᶜ |b C) = N(A ∩ C) / (N(A ∩ C) + Π(Aᶜ ∩ C))
Useful for inference from generic knowledge based on observations.
Possibility-Probability transformations
Why?
- fusion of heterogeneous data
- decision-making: betting according to a possibility distribution leads to probability
- extraction of a representative value
- simplified non-parametric imprecise probabilistic models
POSS → PROB: Laplace's indifference principle: "All that is equipossible is equiprobable" = changing a uniform possibility distribution into a uniform probability distribution.
PROB → POSS: confidence intervals: replacing a probability distribution by an interval A with a confidence level c. It defines a possibility distribution:
π(x) = 1 if x ∈ A, = 1 − c if x ∉ A.
Elementary forms of probability-possibility transformations have existed for a long time.
Possibility-Probability transformations: BASIC PRINCIPLES
- Possibility-probability consistency: P ≤ Π.
- Preserving the ordering of events: P(A) ≥ P(B) implies Π(A) ≥ Π(B); or, for elementary events only, π(x) > π(x′) iff p(x) > p(x′) (order preservation).
- Informational criteria: from Π to P, preservation of symmetries (Shapley value rather than maximal entropy); from P to Π, optimize the information content (maximization or minimisation of specificity).
From OBJECTIVE probability to possibility
Rationale: given a probability p, try and preserve as much information as possible. Select a most specific element of the set PI(P) = {Π : Π ≥ P} of possibility measures dominating P, such that π(x) > π(x′) iff p(x) > p(x′). This may be weakened into: p(x) > p(x′) implies π(x) > π(x′). The result is πi = Σ {pj : j = i, …, n} (case of no ties).
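For a probability on a finite set with no ties, the transformation sorts p decreasingly and assigns each element its tail sum (illustrative sketch, hypothetical inputs):

```python
def prob_to_poss(p):
    """Optimal transform (no ties): after sorting p decreasingly,
    pi_i = p_i + p_{i+1} + ... + p_n (p is assumed to sum to 1)."""
    items = sorted(p.items(), key=lambda kv: kv[1], reverse=True)
    pi, tail = {}, 1.0
    for x, px in items:
        pi[x] = tail        # tail = sum of p_j from the current rank on
        tail -= px
    return pi

p = {"a": 0.5, "b": 0.3, "c": 0.2}
pi = prob_to_poss(p)
assert pi["a"] == 1.0
assert abs(pi["b"] - 0.5) < 1e-9    # 0.3 + 0.2
assert abs(pi["c"] - 0.2) < 1e-9
```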
From probability to possibility: continuous case
The possibility distribution π obtained by transforming p encodes the family of confidence intervals around the mode of p: the α-cut of π is the (1 − α)-confidence interval of p. The optimal symmetric transform of the uniform probability distribution is the triangular fuzzy number. The symmetric triangular fuzzy number (STFN) is a covering approximation of any probability with unimodal symmetric density p with the same mode. In other words, the α-cut of a STFN contains the (1 − α)-confidence interval of any such p.
IL = {x : p(x) ≥ θ} = [aL, aL + L] is the interval of length L with maximal probability. The most specific possibility distribution dominating p is the π such that ∀L > 0, π(aL) = π(aL + L) = 1 − P(IL).
Possibilistic view of probabilistic inequalities
- Chebyshev's inequality defines a possibility distribution that dominates any density with given mean and variance.
- The symmetric triangular fuzzy number (STFN) defines a possibility distribution that optimally dominates any symmetric density with given mode and bounded support.

From possibility to probability
Idea (Kaufmann, Yager, Chanas):
- pick a number α in [0, 1] at random;
- pick an element at random in the α-cut of π.
This is a generalized Laplacean indifference principle: change alpha-cuts into uniform probability distributions. Rationale: minimise arbitrariness by preserving the symmetry properties of the representation. The resulting probability distribution is:
- the centre of gravity of the polyhedron P(π);
- the pignistic transformation of belief functions (Smets);
- the Shapley value of the unanimity game in game theory.
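A sketch of this transformation in the finite, consonant case: each level cut receives the mass between consecutive levels and spreads it uniformly over its elements (this follows the cut-based recipe above; the inputs are hypothetical):

```python
def poss_to_prob(pi):
    """Pignistic-style transform: the cut at level alpha_i gets mass
    alpha_i - alpha_{i+1}, shared uniformly among its elements."""
    levels = sorted(set(pi.values()), reverse=True) + [0.0]
    p = {s: 0.0 for s in pi}
    for i in range(len(levels) - 1):
        cut = [s for s in pi if pi[s] >= levels[i]]
        for s in cut:
            p[s] += (levels[i] - levels[i + 1]) / len(cut)
    return p

pi = {"s1": 1.0, "s2": 0.5}
p = poss_to_prob(pi)
assert p["s1"] == 0.75 and p["s2"] == 0.25   # 0.5 + 0.5/2 and 0.5/2
assert abs(sum(p.values()) - 1.0) < 1e-9
```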
SUBJECTIVE POSSIBILITY DISTRIBUTIONS
Starting point: exploit the betting approach to subjective probability.
A critique: the agent is forced to be additive by the rules of exchangeable bets. For instance, the agent provides a uniform probability distribution on a finite set whether (s)he knows nothing about the concerned phenomenon, or (s)he knows the concerned phenomenon is purely random.
Idea: it is assumed that a subjective probability supplied by an agent is only a trace of the agent's beliefs.
SUBJECTIVE POSSIBILITY DISTRIBUTIONS
Assumption 1: beliefs can be modelled by belief functions (masses m(A) summing to 1, assigned to subsets A).
Assumption 2: the agent uses a probability function induced by his or her beliefs, via the pignistic transformation (Smets, 1990), or Shapley value.
Method: reconstruct the underlying belief function from the probability provided by the agent, by choosing among the isopignistic ones.
SUBJECTIVE POSSIBILITY DISTRIBUTIONS
There are clearly several belief functions with a prescribed Shapley value. Consider the least informative of those, in the sense of a non-specificity index (the expected cardinality of the random set): I(m) = Σ m(A)·card(A).
RESULT: the least informative belief function whose Shapley value is p is unique and consonant.
SUBJECTIVE POSSIBILITY DISTRIBUTIONS
The least specific belief function in the sense of maximizing I(m) is characterized by πi = Σ {min(pj, pi) : j = 1, …, n}. It is a probability-possibility transformation previously suggested in 1983: this is the unique possibility distribution whose Shapley value is p. It gives results that are less specific than the confidence interval approach to objective probability.
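The characterization π(xi) = Σj min(pj, pi) is a one-liner (illustrative sketch, hypothetical inputs; note it is less specific than the tail-sum transform, e.g. 0.8 vs. 0.5 for "b"):

```python
def subjective_poss(p):
    """pi(x_i) = sum over j of min(p_j, p_i): the unique possibility
    distribution whose Shapley (pignistic) value is p."""
    return {x: sum(min(py, px) for py in p.values()) for x, px in p.items()}

p = {"a": 0.5, "b": 0.3, "c": 0.2}
pi = subjective_poss(p)
assert pi["a"] == 1.0                 # 0.5 + 0.3 + 0.2
assert abs(pi["b"] - 0.8) < 1e-9      # 0.3 + 0.3 + 0.2
assert abs(pi["c"] - 0.6) < 1e-9      # 0.2 + 0.2 + 0.2
```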
Applications of quantitative possibility
- Representing incomplete probabilistic data for uncertainty propagation in computations (but fuzzy interval analysis based on the extension principle differs from conservative probabilistic risk analysis)
- Systematizing some statistical methods (confidence intervals, likelihood functions, probabilistic inequalities)
- Defuzzification based on the Choquet integral (linear with fuzzy number addition)
Applications of quantitative possibility
- Uncertain reasoning: possibilistic nets are a counterpart to Bayesian nets that copes with incomplete data; similar algorithmic properties under Dempster conditioning (Kruse's team)
- Data fusion: well suited for merging heterogeneous information on numerical data (linguistic, statistics, confidence intervals) (Blo…)
- Risk analysis: uncertainty propagation using fuzzy arithmetics, and random interval arithmetics when statistical data is incomplete (Lodwick, Ferson)
- Non-parametric conservative modelling of imprecision in measurements (Mauris)

Perspectives
Quantitative possibility is not as well understood as probability theory.
- Objective vs. subjective possibility (à la De Finetti)
- How to use possibilistic conditioning in inference tasks?
- Bridge the gap with statistics and the confidence interval literature (Fisher, likelihood reasoning)
- Higher-order modes of fuzzy intervals (variance, …) and links with fuzzy random variables
- Quantitative possibilistic expectations: decision-theoretic characterisation?

Conclusion
Possibility theory is a simple and versatile tool for modeling uncertainty:
- a unifying framework for modeling and merging linguistic knowledge and statistical data;
- useful to account for missing information in reasoning tasks and risk analysis;
- a bridge between logic-based AI and probabilistic reasoning.

Properties of inference |=π
- A |=π A if A ≠ ∅ (restricted reflexivity)
- if A ≠ ∅, then A |=π ∅ never holds (consistency preservation)
- the set {B : A |=π B} is deductively closed:
  - if A ⊆ B and C |=π A, then C |=π B (right weakening rule, RW)
  - if A |=π B and A |=π C, then A |=π B ∩ C (Right AND)

Properties of inference |=π (continued)
- if A |=π C and B |=π C, then A ∪ B |=π C (Left OR)
- if A |=π B and A ∩ B |=π C, then A |=π C (cut, weak transitivity). (But if A normally implies B, which normally implies C, then A may not imply C.)
- if A |=π B and A |=π Cᶜ does not hold, then A ∩ C |=π B (rational monotony, RM): if B is normally expected when A holds, then B is still expected when A ∩ C holds, unless Cᶜ was expected under A.