Histogram and Possibility Distribution



A method for converting a probability distribution into a possibility distribution was proposed by Dubois and Prade [Dubois and Prade, 1987a]; it is presented here.

A histogram defines a probability distribution P on a finite set $\Omega$. This distribution can be approximated by a possibility distribution in such a way that the probability of every event is bounded by its degrees of necessity and possibility: $\forall A \in \mathcal{P}(\Omega), \quad N(A) \leqslant P(A) \leqslant \Pi(A)$.

Conversely, given a possibility measure represented by nested focal elements $E_i$ together with their probabilistic weights $m(E_i)$, we can approximate it by a probability measure by interpreting each focal element $E_i$ as carrying a conditional probability $P(\cdot / E_i)$ uniformly distributed over $E_i$.

The probability weight associated with an element $\omega$ of the finite set $\Omega$ is therefore:


\begin{displaymath}\forall \omega \in \Omega, \quad p(\omega) = \sum_{i = 1}^r P(\omega / E_i)\, m(E_i) = \sum_{i \,:\, \omega \in E_i} \frac{m(E_i)}{\vert E_i\vert}\end{displaymath}
(47)


Although somewhat arbitrary, this choice selects one probability measure P in the class of those satisfying the inequalities $\forall A \in \mathcal{P}(\Omega), \quad N(A) \leqslant P(A) \leqslant \Pi(A)$.
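As an illustration of (47), here is a minimal Python sketch (the names probability_from_focal_elements, focal_elements and masses are illustrative, not taken from the reference): the mass of each focal element is spread uniformly over its elements and then accumulated per element.

    def probability_from_focal_elements(focal_elements, masses):
        """focal_elements: list of sets E_i; masses: list of m(E_i), summing to 1."""
        p = {}
        for E, m in zip(focal_elements, masses):
            share = m / len(E)              # m(E_i) / |E_i|
            for omega in E:
                p[omega] = p.get(omega, 0.0) + share
        return p

    # Nested (consonant) focal elements, as induced by a possibility distribution.
    focal_elements = [{"a"}, {"a", "b"}, {"a", "b", "c"}]
    masses = [0.5, 0.3, 0.2]
    print(probability_from_focal_elements(focal_elements, masses))
    # {'a': 0.716..., 'b': 0.216..., 'c': 0.066...}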

The probability weights $\{p(\omega_i) / i = 1,n \}$ can easily be computed from the possibility distribution $\{ \pi(\omega_i) / i = 1,n \}$:

 \begin{displaymath}p(\omega_i) = \sum_{j = i}^n \frac{1}{j} (\pi(\omega_j) - \pi(\omega_{j + 1}))\end{displaymath}
(48)


where $\pi(\omega_1) = 1 \geqslant \pi(\omega_2) \geqslant \ldots \geqslant \pi(\omega_{n + 1}) = 0$, and $\omega_{n + 1}$ is an artificial element ($\Omega$ contains n elements).
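A minimal Python sketch of (48), assuming the elements are already sorted so that $\pi(\omega_1) \geqslant \ldots \geqslant \pi(\omega_n)$; the code uses 0-based indices, hence the j + 1 in the denominator (the name probability_from_possibility is illustrative).

    def probability_from_possibility(pi):
        """pi: possibility degrees sorted in decreasing order, with pi[0] = 1."""
        n = len(pi)
        pi_ext = list(pi) + [0.0]          # artificial element omega_{n+1}
        return [sum((pi_ext[j] - pi_ext[j + 1]) / (j + 1) for j in range(i, n))
                for i in range(n)]

    pi = [1.0, 0.7, 0.2]
    p = probability_from_possibility(pi)
    print(p, sum(p))                       # the weights sum to 1 because pi[0] = 1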

It is easy to see that (48) defines a bijective transformation between the distributions p and $\pi$. Its inverse is given by:

 \begin{displaymath}\pi(\omega_i) = \sum_{j = 1}^n \min (p(\omega_i), p(\omega_j))\end{displaymath}
(49)
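The inverse transformation (49) can be sketched in the same way; together with the sketch of (48) above it illustrates the bijection through a round-trip check (the hard-coded values of p are simply the output of that sketch for pi = [1.0, 0.7, 0.2]).

    def possibility_from_probability(p):
        """Formula (49): pi(omega_i) = sum_j min(p(omega_i), p(omega_j))."""
        return [sum(min(p_i, p_j) for p_j in p) for p_i in p]

    p = [0.6166667, 0.3166667, 0.0666667]      # output of the sketch of (48)
    print(possibility_from_probability(p))     # approximately [1.0, 0.7, 0.2]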


This result makes it possible to define a fuzzy subset from a histogram while respecting the consistency condition $\forall A \in \mathcal{P}(\Omega), \quad N(A) \leqslant P(A) \leqslant \Pi(A)$.

By approximating the focal elements $E_i$ by singletons, we get:

\begin{displaymath}\Pi(\omega) = \sum_{p(\omega^\prime) \leqslant p(\omega)} p(\omega^\prime)\end{displaymath}
(50)


One can check that $\pi$, defined by (49), is less specific (hence less informative) than $\Pi$, since $\pi \geqslant \Pi$.
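A small numerical check of this comparison, on the same illustrative data: $\Pi$ computed from p by (50) versus $\pi$ computed by (49).

    p  = [0.6166667, 0.3166667, 0.0666667]
    pi = [sum(min(p_i, p_j) for p_j in p) for p_i in p]   # formula (49)
    Pi = [sum(q for q in p if q <= p_i) for p_i in p]     # formula (50)
    print(all(x >= y for x, y in zip(pi, Pi)))            # True: pi >= Pi pointwise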

The $\alpha $-cuts of the fuzzy subset A can be interpreted as ``confidence sets'' (in the sense of the confidence intervals of statistics) associated with the probability distribution p.

In particular, one can check that if $\alpha = \mu_A(\omega)$, then $N(A_{\bar{\alpha}}) = 1 - \alpha$, where $A_{\bar{\alpha}}$ denotes the strict $\alpha $-cut of A; since $N \leqslant P$, it follows that $P(A_{\bar{\alpha}}) \geqslant 1 - \alpha$, i.e. with probability at least $1 - \alpha$ the value of the random variable described by p lies in the strict $\alpha $-cut of A.
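A numerical check of this reading, under the illustrative assumption that $\mu_A$ is the distribution $\pi$ obtained from p by (49): for every attained level $\alpha$, the probability of the strict $\alpha $-cut is at least $1 - \alpha$.

    p    = [0.6166667, 0.3166667, 0.0666667]
    mu_A = [sum(min(p_i, p_j) for p_j in p) for p_i in p]   # mu_A = pi, by (49)

    for alpha in mu_A:
        # strict alpha-cut: elements whose membership is strictly greater than alpha
        cut_probability = sum(p_j for p_j, mu_j in zip(p, mu_A) if mu_j > alpha)
        print(alpha, cut_probability, cut_probability >= 1 - alpha - 1e-9)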


