Consider a set $E$ of elements. With Bayes' decision rule, it is possible to divide the elements of $E$ into $p$ classes $C_1, \dots, C_p$, from $n$ discriminant attributes $A_1, \dots, A_n$.
We must already have labelled examples of each class in order to choose typical values of its attributes.
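The text does not say how these typical values are chosen; a common choice, sketched below purely as an assumption, is to estimate each conditional probability $P(A_i = v \mid C_l)$ by its relative frequency among the labelled examples of class $C_l$. The function name `estimate_cond_prob` and the example data are hypothetical.

```python
from collections import Counter, defaultdict

def estimate_cond_prob(examples):
    """examples: list of (attribute_tuple, class_label).
    Returns cond_prob[cl][i][v], a relative-frequency estimate of P(Ai = v | Cl)."""
    by_class = defaultdict(list)
    for x, cl in examples:
        by_class[cl].append(x)
    cond_prob = {}
    for cl, xs in by_class.items():
        cond_prob[cl] = []
        for i in range(len(xs[0])):
            counts = Counter(x[i] for x in xs)
            cond_prob[cl].append({v: c / len(xs) for v, c in counts.items()})
    return cond_prob

# Hypothetical labelled examples with two discrete attributes.
examples = [((1, 0), "C1"), ((1, 1), "C1"), ((0, 0), "C2"), ((0, 1), "C2")]
print(estimate_cond_prob(examples))
```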
The probability of meeting an element $x$ having attribute $A_i$, given that we consider class $C_l$, will be denoted by $P(A_i \mid C_l)$.
If we put all these probabilities together for each attribute, we obtain the global probability of meeting element $x$, given that the class is $C_l$:

$$P(x \mid C_l) = \prod_{i=1}^{n} P(A_i \mid C_l) \qquad (5)$$
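As a minimal sketch of equation (5), assuming discrete attributes and a table `cond_prob[cl][i][v]` approximating $P(A_i = v \mid C_l)$ as in the estimation sketch above (names and data are hypothetical), the class-conditional probability is the product of the per-attribute probabilities:

```python
def likelihood(x, cl, cond_prob):
    """Return P(x | Cl) as the product of P(Ai | Cl) over all attributes (eq. 5)."""
    p = 1.0
    for i, value in enumerate(x):
        p *= cond_prob[cl][i].get(value, 0.0)  # attribute values never seen for Cl get 0 here
    return p

# Hypothetical table for one class with two binary attributes.
cond_prob = {"C1": [{0: 0.2, 1: 0.8}, {0: 0.6, 1: 0.4}]}
print(likelihood((1, 0), "C1", cond_prob))  # 0.8 * 0.6 = 0.48
```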
Classification must allow the class of an unknown element $x$ to be decided with the lowest risk of error. The decision in Bayes' theory chooses the class $C_l$ for which the a posteriori membership probability $P(C_l \mid x)$ is the highest:

$$P(C_l \mid x) = \max_{1 \le k \le p} P(C_k \mid x) \qquad (6)$$
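Equation (6) amounts to picking the class with the largest posterior probability. A short sketch, with hypothetical posterior values:

```python
def classify(posterior):
    """posterior: dict {class label: P(Cl | x)}. Return the class with the highest value (eq. 6)."""
    return max(posterior, key=posterior.get)

print(classify({"C1": 0.2, "C2": 0.7, "C3": 0.1}))  # "C2"
```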
According to Bayes' rule, the a posteriori probability of membership $P(C_l \mid x)$ is calculated from the a priori probabilities of membership of element $x$ to class $C_l$:

$$P(C_l \mid x) = \frac{P(x \mid C_l)\, P(C_l)}{P(x)} \qquad (7)$$
The denominator $P(x)$ is a normalization factor. It ensures that the probabilities $P(C_l \mid x)$ sum to 1 when $l$ varies.
Some classes appear more frequently than others, and $P(C_l)$ denotes the a priori probability of meeting class $C_l$. $P(x \mid C_l)$ denotes the conditional probability of meeting element $x$, given that we focus on class $C_l$ (i.e. given that class $C_l$ is true).
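Putting equations (5) to (7) together, the following self-contained sketch (with a hypothetical `cond_prob` table and made-up priors) computes the a posteriori probabilities and applies decision rule (6); the posteriors sum to 1 thanks to the normalization by $P(x)$:

```python
def posteriors(x, priors, cond_prob):
    """Return {cl: P(Cl | x)} from eq. (7), using the product likelihood of eq. (5)."""
    joint = {}
    for cl, prior in priors.items():
        lik = 1.0
        for i, value in enumerate(x):
            lik *= cond_prob[cl][i].get(value, 0.0)  # P(Ai | Cl)
        joint[cl] = lik * prior                      # P(x | Cl) * P(Cl)
    evidence = sum(joint.values())                   # P(x): the normalization factor
    return {cl: j / evidence for cl, j in joint.items()}

# Hypothetical two-class problem with two binary attributes.
priors = {"C1": 0.5, "C2": 0.5}
cond_prob = {
    "C1": [{0: 0.2, 1: 0.8}, {0: 0.6, 1: 0.4}],
    "C2": [{0: 0.7, 1: 0.3}, {0: 0.5, 1: 0.5}],
}
post = posteriors((1, 0), priors, cond_prob)
print(post)                      # the values sum to 1 over the classes
print(max(post, key=post.get))   # class chosen by decision rule (6)
```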