**Exercise 1 – Bayesian Networks – Inference**

Prove that exact inference in Bayesian networks is #P-complete.
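One standard route, offered here only as a hint rather than the required proof, reduces #SAT to inference: given a CNF formula $\varphi$ over variables $X_1,\dots,X_n$, build a network with independent root nodes $P(X_i{=}1)=1/2$, one deterministic node per clause, and a deterministic conjunction node $Y$ over the clause nodes. Then

```latex
P(Y = 1) = \frac{\#\varphi}{2^{n}}
```

so computing a single marginal exactly counts the satisfying assignments of $\varphi$, which establishes #P-hardness; membership in #P completes the argument.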

**Exercise 2 – Bayesian Networks – Inference**

The figure shows a graphical model, with its conditional probability tables, about whether or not you will panic at an exam based on whether or not the course was boring (“B”), which was the key factor you used to decide whether or not to attend the lectures (“A”) and to revise by doing the exercises after each lecture (“R”).

Use the model to perform exact *inference* and answer the following queries:

- What is the probability that you will panic or not before the exam, given that you attended the lectures and revised after each lecture?
- What is the probability that you will panic or not before the exam?
- Your teacher saw you panicking at the exam and wants to use the model to work out the reason. Was it because you did not attend the lectures, or because you did not revise?
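The three queries above can be answered by enumeration over the full joint distribution. Since the figure's tables are not reproduced here, the following sketch uses placeholder CPT values for the assumed structure B → {A, R} → P; all numbers are assumptions and must be replaced by the values from the figure (and the exercise itself can be done in R along the same lines).

```python
# Exact inference by enumeration for an assumed B -> {A, R} -> P network.
# All CPT values below are placeholders, not the figure's actual numbers.
from itertools import product

P_B = {True: 0.5, False: 0.5}                    # P(B = value)
P_A = {True: 0.2, False: 0.8}                    # P(A = true | B)
P_R = {True: 0.3, False: 0.9}                    # P(R = true | B)
P_P = {(True, True): 0.1, (True, False): 0.5,    # P(Panic = true | A, R)
       (False, True): 0.4, (False, False): 0.9}

def joint(b, a, r, p):
    """Joint probability of one full assignment, via the chain rule."""
    pr = P_B[b]
    pr *= P_A[b] if a else 1 - P_A[b]
    pr *= P_R[b] if r else 1 - P_R[b]
    pr *= P_P[(a, r)] if p else 1 - P_P[(a, r)]
    return pr

def query(var, evidence):
    """P(var = true | evidence), summing the joint over all assignments."""
    num = den = 0.0
    for b, a, r, p in product([True, False], repeat=4):
        assignment = {"B": b, "A": a, "R": r, "P": p}
        if any(assignment[k] != v for k, v in evidence.items()):
            continue
        w = joint(b, a, r, p)
        den += w
        if assignment[var]:
            num += w
    return num / den

# Query 1: P(Panic | attended, revised)
print(query("P", {"A": True, "R": True}))
# Query 2: prior marginal P(Panic)
print(query("P", {}))
# Query 3: compare P(A = true | Panic) with P(R = true | Panic)
print(query("A", {"P": True}), query("R", {"P": True}))
```

With the placeholder numbers, the first query collapses to the single CPT entry P(Panic | A, R), since Panic's parents are both observed; the diagnostic third query requires the full sum.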

Repeat the inference for the last two queries by means of stochastic (approximate) inference, implementing in R the Prior-Sampling, Rejection-Sampling, Likelihood-Weighting, and Gibbs-Sampling (Markov chain Monte Carlo) algorithms.
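The four samplers can be sketched as follows. The exercise asks for R; this Python version, using the same placeholder CPTs as above (all values are assumptions, not the figure's numbers), only illustrates the logic and translates to R almost line by line.

```python
# Sampling-based inference sketches for an assumed B -> {A, R} -> P network.
# CPT values are placeholders; replace them with the figure's numbers.
import random

P_B = {True: 0.5, False: 0.5}                    # P(B = value)
P_A = {True: 0.2, False: 0.8}                    # P(A = true | B)
P_R = {True: 0.3, False: 0.9}                    # P(R = true | B)
P_P = {(True, True): 0.1, (True, False): 0.5,    # P(Panic = true | A, R)
       (False, True): 0.4, (False, False): 0.9}

def joint(s):
    """Joint probability of a full assignment dict."""
    pr = P_B[s["B"]]
    pr *= P_A[s["B"]] if s["A"] else 1 - P_A[s["B"]]
    pr *= P_R[s["B"]] if s["R"] else 1 - P_R[s["B"]]
    pr *= P_P[(s["A"], s["R"])] if s["P"] else 1 - P_P[(s["A"], s["R"])]
    return pr

def prior_sample(rng):
    """Sample B, A, R, P in topological order from their CPTs."""
    b = rng.random() < P_B[True]
    a = rng.random() < P_A[b]
    r = rng.random() < P_R[b]
    p = rng.random() < P_P[(a, r)]
    return {"B": b, "A": a, "R": r, "P": p}

def rejection_sampling(var, evidence, n, rng):
    """Keep only the prior samples that agree with the evidence."""
    hits = kept = 0
    for _ in range(n):
        s = prior_sample(rng)
        if all(s[k] == v for k, v in evidence.items()):
            kept += 1
            hits += s[var]
    return hits / kept if kept else float("nan")

def weighted_sample(evidence, rng):
    """Fix evidence variables and weight by their likelihood."""
    s, w = {}, 1.0
    def draw(name, p_true):
        nonlocal w
        if name in evidence:
            s[name] = evidence[name]
            w *= p_true if evidence[name] else 1 - p_true
        else:
            s[name] = rng.random() < p_true
    draw("B", P_B[True])
    draw("A", P_A[s["B"]])
    draw("R", P_R[s["B"]])
    draw("P", P_P[(s["A"], s["R"])])
    return s, w

def likelihood_weighting(var, evidence, n, rng):
    num = den = 0.0
    for _ in range(n):
        s, w = weighted_sample(evidence, rng)
        den += w
        if s[var]:
            num += w
    return num / den

def gibbs(var, evidence, n, rng, burn=100):
    """Resample each non-evidence variable from its full conditional,
    which is proportional to the joint with that variable toggled."""
    s = prior_sample(rng)
    s.update(evidence)
    free = [v for v in ("B", "A", "R", "P") if v not in evidence]
    hits = 0
    for i in range(n + burn):
        for name in free:
            s[name] = True
            pt = joint(s)
            s[name] = False
            pf = joint(s)
            s[name] = rng.random() < pt / (pt + pf)
        if i >= burn:
            hits += s[var]
    return hits / n
```

For example, `rejection_sampling("P", {}, 20000, random.Random(0))` estimates the prior marginal P(Panic), and `gibbs("A", {"P": True}, 5000, random.Random(0))` estimates the diagnostic query; all four estimators should converge to the exact-inference answers as the sample size grows.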