Robots operate in unstructured, unpredictable environments that change constantly, with many sources of uncertainty:
- External factors (changes in the environment, dynamic obstacles, weather conditions, etc.)
- Actuators (inaccuracies, vibration, high temperatures, etc.)
- Sensors (wrong readings, noise, measurement errors, limited accuracy)
- Errors in odometry and in the kinematic model
Because of this, robotics applications require a probabilistic rather than a deterministic approach. For example, in localization we cannot fully trust the sensors to tell us where the robot is. Instead, we can assign a probability to each position to account for the uncertainty, and update these probabilities whenever new sensor data arrives. The advantage is that accuracy improves as we accumulate more data, as the sketch below illustrates.
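As a minimal sketch of this idea, assume a 1-D corridor discretized into five cells, a known map of where the doors are, and a door sensor with made-up hit/miss rates (all numbers here are illustrative, not from any real sensor):

```python
import numpy as np

# 1-D corridor with 5 cells; doors at cells 0 and 3 (hypothetical map)
doors = np.array([1, 0, 0, 1, 0])

# No prior information: start with a uniform belief over all positions
belief = np.ones(5) / 5

def sense(belief, doors, measurement, p_hit=0.8, p_miss=0.2):
    """Bayes update: scale each cell by how well it explains the measurement."""
    likelihood = np.where(doors == measurement, p_hit, p_miss)
    belief = belief * likelihood
    return belief / belief.sum()  # normalize so probabilities sum to 1

belief = sense(belief, doors, measurement=1)  # robot's sensor reports a door
print(belief)  # cells 0 and 3 are now more likely than the rest
```

Each new measurement repeats the same update on the current belief, which is why the estimate sharpens as data accumulates.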
Random variable: a variable whose value is the outcome of a random experiment, drawn from a (here, finite) set of possible values, e.g.:
- Dice roll: $X = \{1, 2, 3, 4, 5, 6\}$
- Coin toss: $X = \{heads, tails\}$
We can assign probabilities to outcomes, e.g.
$$
\begin{align*}
P(X = heads) &= 1/2 \\
P(X = tails) &= 1/2
\end{align*}
$$
Important result: the sum of the probabilities of all possible outcomes must be 1 (100%):
$$
\sum_{x}{P(X=x)} = 1
$$
Also, the probability of any outcome must be a value between 0 (impossible) and 1 (certain):
$$
0 \leq P(X=x) \leq 1
$$
The combined probability of two independent events taking place can be calculated by multiplying the probabilities of each event:
$$
\begin{align*}
P(X=heads) &= 1/2 \\
P(Y=2) &= 1/6 \\
P(X=heads, Y=2) &= P(X=heads \cap Y=2) = 1/12
\end{align*}
$$
More generally:
$$
P(X \cap Y) = P(X) \cdot P(Y)
$$
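A quick sanity check of the product rule, enumerating the joint sample space of a coin and a die (a plain-Python sketch, nothing robot-specific):

```python
from fractions import Fraction
from itertools import product

coin = ["heads", "tails"]
die = [1, 2, 3, 4, 5, 6]

# Joint sample space of two independent experiments: all 12 pairs equally likely
outcomes = list(product(coin, die))
p_joint = Fraction(sum(1 for c, d in outcomes if c == "heads" and d == 2),
                   len(outcomes))

print(p_joint)                          # 1/12
print(Fraction(1, 2) * Fraction(1, 6))  # same: P(X=heads) * P(Y=2)
```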
When the probability of one event affects the probability of another event, we say the two events are dependent. For example, the probability of detecting an obstacle with a sensor depends on whether an obstacle is actually present.
The combined probability of two dependent events is not the product of the two probabilities. We introduce the conditional probability: the probability that an event Y happens given that another, dependent event X has already been observed:
$$
P(Y \mid X) = \frac{P(X \cap Y)}{P(X)}
$$
Equivalently, $P(X \cap Y) = P(Y \mid X) \cdot P(X)$.
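To make the sensor example concrete, conditional probabilities can be read off a joint probability table. The detection rates below are hypothetical, chosen only for illustration:

```python
# Hypothetical joint probabilities for (world state, sensor output),
# built as P(state, reading) = P(reading | state) * P(state)
p = {
    ("obstacle", "detect"): 0.27,      # 0.9 * 0.3
    ("obstacle", "no_detect"): 0.03,   # 0.1 * 0.3
    ("free", "detect"): 0.07,          # 0.1 * 0.7
    ("free", "no_detect"): 0.63,       # 0.9 * 0.7
}

# P(detect | obstacle) = P(obstacle and detect) / P(obstacle)
p_obstacle = p[("obstacle", "detect")] + p[("obstacle", "no_detect")]
p_detect_given_obstacle = p[("obstacle", "detect")] / p_obstacle
print(p_detect_given_obstacle)  # ~0.9
```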
The probability distribution function maps every value that a random variable can take to its likelihood. It can be discrete or continuous.
A uniform distribution assigns the same probability to all possible values. It is a good starting point or initial guess when there is no prior information.
e.g. the result of rolling two dice:
Mean: the expected (average) value:
$$
\begin{align*}
\mu &= \sum_{x} {x \cdot P(x)} \\
\mu &= 2 \cdot \frac{1}{36} + 3 \cdot \frac{2}{36} + ... = 7
\end{align*}
$$
Variance, a metric that describes how spread out the distribution is around the mean:
$$
\begin{align*}
\sigma^2 &= \sum_{x}{P(x) \cdot (x-\mu)^2} \\
\sigma^2 &= \frac{1}{36} \cdot (2-7)^2 + \frac{2}{36} \cdot (3-7)^2 + ... = 5.83
\end{align*}
$$
Standard deviation: a larger value means higher uncertainty:
$$
\sigma = \sqrt{\sigma^2} = 2.4
$$
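The same numbers can be reproduced by enumerating all 36 equally likely outcomes of the two-dice roll (a short plain-Python sketch):

```python
from itertools import product

# All 36 equally likely outcomes of rolling two dice
sums = [a + b for a, b in product(range(1, 7), repeat=2)]

mu = sum(sums) / len(sums)                          # mean: 7.0
var = sum((s - mu) ** 2 for s in sums) / len(sums)  # variance: ~5.83
std = var ** 0.5                                    # standard deviation: ~2.4

print(mu, var, std)
```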
Gaussian or Normal distributions have a simple equation that depends only on the mean and variance:
$$
f(x) = \frac{1}{\sigma \sqrt{2 \pi}} e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}
$$
The area under the curve must add up to 1.
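The formula translates directly into code; a crude Riemann sum (my choice here, for a dependency-free check) confirms the area under the curve is approximately 1:

```python
import math

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    """Normal distribution density, directly from the formula above."""
    return (1.0 / (sigma * math.sqrt(2 * math.pi))) * \
           math.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Riemann-sum check over [-10, 10] that the total area is ~1
dx = 0.001
area = sum(gaussian_pdf(-10 + i * dx) * dx for i in range(20000))
print(area)  # ~1.0
```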
Processes described by Gaussians are easy to combine. Given two Gaussian distributions $\mathcal{N}(\mu_1, \sigma_1^2)$ and $\mathcal{N}(\mu_2, \sigma_2^2)$ that describe the probability of two random processes A and B:
- The probability of A and B occurring (the normalized product of the two densities) is also a Gaussian:
$$
\mu = \frac{\mu_1 \sigma_2^2 + \mu_2 \sigma_1^2}{\sigma_1^2 + \sigma_2^2}, \qquad \sigma^2 = \frac{\sigma_1^2 \sigma_2^2}{\sigma_1^2 + \sigma_2^2}
$$
- The probability of A or B occurring (the sum of the two independent variables) is also a Gaussian:
$$
\mu = \mu_1 + \mu_2, \qquad \sigma^2 = \sigma_1^2 + \sigma_2^2
$$
A sketch of both rules follows below.
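A minimal sketch of both update rules in 1-D (the function names are mine; this is the form used in Kalman-style filtering):

```python
def multiply(mu1, var1, mu2, var2):
    """Fuse two Gaussian estimates (the 'and' case, e.g. a measurement update)."""
    mu = (mu1 * var2 + mu2 * var1) / (var1 + var2)
    var = (var1 * var2) / (var1 + var2)
    return mu, var

def add(mu1, var1, mu2, var2):
    """Add two independent Gaussian variables (the 'or' case, e.g. a motion update)."""
    return mu1 + mu2, var1 + var2

# Fusing a confident estimate with an uncertain one pulls the mean
# toward the confident one and always reduces the variance:
print(multiply(10.0, 1.0, 12.0, 4.0))  # (10.4, 0.8)
```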
Given an event A that can occur through n mutually exclusive events $B_1, ..., B_n$ that cover all possibilities, the total probability theorem states:
$$
P(A) = \sum_{i}{P(A \mid B_i) \, P(B_i)}
$$
Example: a factory has three machines producing items (Machine 1 produces 30% of the items, Machine 2: 50%, Machine 3: 20%), each with a different defect rate (Machine 1: 5%, Machine 2: 3%, Machine 3: 10%).
The probability of a defective item A in the factory's production can be calculated using the total probability theorem as the sum, over all machines, of the probability of a defect on that machine (its defect rate) times the probability that the item comes from that machine (its share of the total production):
$$
\begin{align*}
P(A) &= P(A \mid B_1)P(B_1) + P(A \mid B_2)P(B_2) + P(A \mid B_3)P(B_3) \\
&= 5\% \cdot 30\% + 3\% \cdot 50\% + 10\% \cdot 20\% = 5\%
\end{align*}
$$
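The same computation in a few lines of Python (machine shares and defect rates copied from the example above):

```python
shares = {"machine_1": 0.30, "machine_2": 0.50, "machine_3": 0.20}
defect_rates = {"machine_1": 0.05, "machine_2": 0.03, "machine_3": 0.10}

# Total probability theorem: sum over machines of P(defect | machine) * P(machine)
p_defect = sum(defect_rates[m] * shares[m] for m in shares)
print(p_defect)  # 0.05 -> 5%
```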
We know from set theory that the intersection is commutative, hence:
$$
\begin{align*}
P(A \cap B) &= P(B \cap A) \\
P(A \mid B)P(B) &= P(B \mid A)P(A)
\end{align*}
$$
Rearranging, we get Bayes' rule:
$$
P(A \mid B) = \frac{P(B \mid A)P(A)}{P(B)}
$$
which describes how to update the probability of an event when new information appears.
Definitions:
- Posterior probability P(A|B), "probability of A given B": the updated probability of event A once we observe a dependent event B
- Prior P(A): the initial guess that we want to update based on new observations
- Marginal probability P(B): the overall probability of observing event B at all
- Likelihood P(B|A), "probability of B given A": the probability of B assuming A occurred
We will use this to recursively improve our estimate of the robot's localization based on new sensor data.
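Applying Bayes' rule to the factory example: if we pick up a defective item, which machine most likely produced it? A short sketch reusing the numbers above:

```python
shares = {"machine_1": 0.30, "machine_2": 0.50, "machine_3": 0.20}
defect_rates = {"machine_1": 0.05, "machine_2": 0.03, "machine_3": 0.10}

# Marginal P(defect) from the total probability theorem
p_defect = sum(defect_rates[m] * shares[m] for m in shares)

# Posterior P(machine | defect) = P(defect | machine) * P(machine) / P(defect)
posterior = {m: defect_rates[m] * shares[m] / p_defect for m in shares}
print(posterior)  # machine_3 is the most likely source (40%) despite its 20% share
```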
Real sensors produce a noisy signal with added random error.
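A common way to model this is additive zero-mean Gaussian noise (the distance and noise level below are made-up values for illustration):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

true_distance = 2.5  # meters, ground truth (assumed)
sigma = 0.1          # sensor noise standard deviation (assumed)

# Simulate 100 range readings: true value plus zero-mean Gaussian noise
readings = true_distance + rng.normal(0.0, sigma, size=100)

# Averaging accumulated data improves the estimate, as argued above
print(readings.mean(), readings.std())
```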