
Q1 True/False

20 Points

Q1.1

2 Points

There is some Bayes’ net structure over three variables which can represent any given probability distribution over those variables.

True

False

Q1.2

2 Points

For a Markov Chain Monte Carlo method to work properly, the posterior distribution being estimated must be Gaussian.

True

False

Q1.3

2 Points

Given any probability distribution over N variables, every Bayes’ net with N nodes can represent that distribution.

True

False

Q1.4

2 Points

When deploying a multi-agent system, all individual agents must share identical internal algorithms and models of the environment.

True

False

Q1.5

2 Points

All conditional independence properties of the probability distribution represented by a Bayes’ net are determined by the graph structure.

True

False

Q1.6

2 Points

Given a Bayes’ net with N nodes that is designed to represent any probability distribution over N variables, that same network can represent any marginal distribution over M variables, where M ≤ N.

True

False

Q1.7

2 Points

In the worst case, the complexity of inference could be exponential in the size (number of variables) of a Bayes’ net.

True

False

Q1.8

2 Points

The smoothing algorithm for HMMs returns the single most likely sequence of hidden states given the observations.

True

False

Q1.9

2 Points

A particle filter with n particles at time t, {p_t^1, p_t^2, …, p_t^n}, uses each p_t^i to generate a new particle for time t + 1.

True

False

Q1.10

2 Points

All Markov Chains have a stationary distribution.

True

False

Q2 Multiple Choice

20 Points

Select all that apply. 4 points each, 20 points total.

Q2.1

4 Points

Which of these types of problems would be appropriate for a Kalman filter?

Tracking a bouncing ball

Automating control of a space ship that has to dock with another

Calculating the likelihood that it is raining, given that I’ve seen an umbrella

Tracking a moving target to find its next location

Q2.2

4 Points

The Viterbi Algorithm would be used for which of these problems:

Decoding an audio stream into the most likely text

Transforming analog radio signals into bit sequences

Finding out the most likely explanation for the price of a stock

Q2.3

4 Points

Markov Chain Monte Carlo (algorithms such as Gibbs Sampling and Metropolis-Hastings):

i. uses samples which, in the long run, end up converging to the posterior probability

ii. is an approximation algorithm

iii. enumerates the complete joint probability table

iv. is similar to Hill Climbing

Q2.4

4 Points

Which of these are inference tasks in temporal models:

filtering

variable elimination

prediction

smoothing

normalization

Q2.5

4 Points

Works if there is a stationary distribution

Can be used to compute the Kalman Filter

Are examples of Markov Chain Monte Carlo algorithms

Use transition probabilities in the Bayes Net to compute next samples
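The convergence claim about Gibbs Sampling and Metropolis-Hastings above — that the samples end up converging to the target posterior in the long run — can be checked on a toy example. A minimal Metropolis sketch; the two-state space, target distribution, and sample count are all invented for illustration:

```python
import random

random.seed(0)  # deterministic run

# Invented toy target distribution over two states (not part of the exam problem).
target = {0: 0.3, 1: 0.7}

state = 0
counts = {0: 0, 1: 0}
n = 20000
for _ in range(n):
    proposal = 1 - state                                 # symmetric "flip" proposal
    accept = min(1.0, target[proposal] / target[state])  # Metropolis acceptance ratio
    if random.random() < accept:
        state = proposal
    counts[state] += 1

freq1 = counts[1] / n  # long-run frequency of state 1 approaches target[1] = 0.7
```

Note the sampler never enumerates a joint table; it only compares the target at two states at a time, which is why these methods scale as approximation algorithms.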

Q3 Bayes' Nets

21 Points

The Whizzo Chocolate Company (as described at https://www.youtube.com/watch?v=7-UssHVuCys) makes The Whizzo Quality Assortment, a candy box that has several flavors of candy: 30% are Crunchy Frog (which smell of pond water and contain only the finest of baby frogs) and 70% are Spring Surprise (which smell delightful, but when you pop one in your mouth steel bolts spring out and plunge straight through both cheeks). All candies start out round and look the same. Someone (who can smell) trims some of the candies so that they are square. Then, a second person who can’t smell wraps each candy in a red or brown wrapper. 80% of Spring Surprise candies are round and 74% have red wrappers. 90% of Crunchy Frog candies are square and 82% have brown wrappers. All candies are sold individually in sealed, identical, black boxes! You have just bought a box, but haven’t opened it yet.
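For studying questions like Q3.5 and Q3.6, the conditional probability tables implied by the story can be enumerated directly. A minimal sketch, under one assumed reading: Shape and Wrapper are conditionally independent given Flavor (the trimmer and the wrapper are different people acting separately). The variable names are mine:

```python
# CPTs read off the problem statement (assumption: Shape and Wrapper are
# conditionally independent given Flavor).
p_flavor = {"CF": 0.3, "SS": 0.7}
p_round  = {"CF": 0.1, "SS": 0.8}    # P(Shape=round | Flavor); CF is 90% square
p_red    = {"CF": 0.18, "SS": 0.74}  # P(Wrapper=red | Flavor); CF is 82% brown

# Marginal probability of a red wrapper, summing out Flavor.
p_wrapper_red = sum(p_flavor[f] * p_red[f] for f in p_flavor)

# Posterior of Spring Surprise given a round, red candy, by Bayes' rule.
joint = {f: p_flavor[f] * p_round[f] * p_red[f] for f in p_flavor}
posterior_ss = joint["SS"] / sum(joint.values())
```

Under these assumptions p_wrapper_red works out to 0.3·0.18 + 0.7·0.74 = 0.572; a different modeling assumption (for example, a network asserting Wrapper independent of Flavor) would give different numbers, so treat this as a sanity-check sketch rather than the graded answer.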

The following are potential Bayes' Nets for the relationships between Flavor, Wrapper, and Shape.

Q3.1

4 Points

Which network(s) can correctly represent P(Flavor, Wrapper, Shape)?

Network 1

Network 2

Network 3

Network 4

Network 5

Network 6

Q3.2

4 Points

Which network(s) assert(s) P(Wrapper | Shape) = P(Wrapper)?

Network 1

Network 2

Network 3

Network 4

Network 5

Network 6

Q3.3

4 Points

Which network(s) assert(s) P(Wrapper | Shape, Flavor) = P(Wrapper | Shape)?

Network 1

Network 2

Network 3

Network 4

Network 5

Network 6

Q3.4

4 Points

From the problem description, what independence relationships should hold for this problem? Which network is the best representation of this problem?

Network 1

Network 2

Network 3

Network 4

Network 5

Network 6

Q3.5

4 Points

What is the probability that the candy has a red wrapper?

0.785

0.95

0.635

0.8

Q3.6

1 Point

In the box is a round candy with a red wrapper. What is the probability that it is a Spring Surprise candy?

Q4 HMMs

15 Points

Suppose you are a robot navigating a maze (pictured above), where some of the cells are free and some are blocked. At each time step, you are occupying one of the free cells. You are equipped with sensors which give you noisy observations, (wU, wD, wL, wR), of the four cells adjacent to your current position (UP, DOWN, LEFT, and RIGHT respectively). Each wI is either FREE or BLOCKED, and is accurate 80% of the time, independently of the other sensors or your current position. Assume that if a cell is off the given grid it is treated as blocked.


Imagine that you have experienced a motor malfunction that causes you to randomly move to one of the four adjacent cells, each with probability 1/4 (25%). If you move towards a blocked cell, you hit the wall and stay where you are.
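The maze figure is not reproduced here, but the motion and sensor models above can still be sketched. The free-cell set below is a hypothetical 3×3 grid invented for illustration; the transition and observation functions follow the stated rules (random move with probability 1/4, stay in place on a wall hit, sensors correct 80% of the time):

```python
# Hypothetical maze: a 3x3 grid with one blocked cell, chosen only for
# illustration since the exam's figure is unavailable.
FREE = {(r, c) for r in range(1, 4) for c in range(1, 4)} - {(1, 3)}

def neighbors(cell):
    r, c = cell
    return {"U": (r - 1, c), "D": (r + 1, c), "L": (r, c - 1), "R": (r, c + 1)}

def transition(cell):
    """Motor malfunction: each adjacent cell with prob 1/4; a move into a
    blocked or off-grid cell leaves you where you are."""
    dist = {}
    for nxt in neighbors(cell).values():
        dest = nxt if nxt in FREE else cell
        dist[dest] = dist.get(dest, 0.0) + 0.25
    return dist

def obs_likelihood(cell, obs):
    """obs maps direction -> 'free'/'blocked'; each sensor is right 80% of the time."""
    p = 1.0
    for d, reading in obs.items():
        truth = "free" if neighbors(cell)[d] in FREE else "blocked"
        p *= 0.8 if reading == truth else 0.2
    return p

# One HMM filtering step from a known start at the centre cell (2, 2):
# predict with the motion model, then weight by the sensor reading and normalize.
obs = {"U": "free", "D": "blocked", "L": "blocked", "R": "blocked"}
predicted = transition((2, 2))
unnorm = {s: p * obs_likelihood(s, obs) for s, p in predicted.items()}
z = sum(unnorm.values())
belief = {s: p / z for s, p in unnorm.items()}
```

The actual exam answers depend on the real maze layout, so this only demonstrates the predict-then-update mechanics.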

Q4.1

5 Points

Suppose you start in the central cell in the Figure. One time step passes and you are now in a new, possibly different state, and your sensors indicate (free, blocked, blocked, blocked). Which states have a non-zero probability of being your new position?

(1,2), (2,2), (3,3)

(2,1), (2,2), (2,3)

(1,1), (2,2), (3,3)

(1,2), (2,2), (3,2)

Q4.2

5 Points

What is the value of P(S1 = (2,2) | WU, WD, WL, WR)?

0.65

0.0512

0.02

0.33

Q4.3

5 Points

Suppose that s0 is your starting state and that s1 and s2 are random variables indicating your state after the first and second time steps. Which Bayes Net best illustrates the relationships between each si and the sensor observations associated with that state?

Q5 More HMMs

20 Points

The weather example V2.0: The weather in College Park is notoriously fickle. For simplicity we only consider sun and rain, and we assume that the weather changes once per day. The weather satisfies the following transition probabilities: (a) When it rains, the probability of sun the following day is 0.6. (b) When the sun shines, the probability of rain the following day is 0.3.
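The stated probabilities determine the transition matrix directly, since each row must sum to 1: rain→rain = 1 − 0.6 = 0.4 and sun→sun = 1 − 0.3 = 0.7. A small sketch, which also shows that repeatedly stepping the chain converges to its stationary distribution:

```python
# Transition matrix over states ordered (rain, sun):
# T[i][j] = P(tomorrow = j | today = i), read off the stated probabilities.
T = [[0.4, 0.6],   # rain -> rain 0.4, rain -> sun 0.6
     [0.3, 0.7]]   # sun  -> rain 0.3, sun  -> sun 0.7

def step(dist, T):
    """Advance a (P(rain), P(sun)) distribution by one day."""
    return [sum(dist[i] * T[i][j] for i in range(2)) for j in range(2)]

# Repeated stepping (power iteration) converges to the stationary distribution,
# which for this chain is (1/3, 2/3).
dist = [1.0, 0.0]  # start: certain rain
for _ in range(100):
    dist = step(dist, T)
```

The row/column ordering here is my own choice; the exam's image options may order the states differently.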


5/14/22, 1:30 PM Submit Final Exam | Gradescope

Q5.1

5 Points

Which is the transition matrix for College Park weather?

Q5.2

5 Points

Given the correct transition matrix from the above, assume that we observe the weather over a ten-day period. In particular, we observe the following:

The sun shines on the first day.

It rains on day 5.

It rains on day 7.

The sun shines on day 10.

What is the probability of sun on day 6, P(s6), given these observations?


Q5.3

5 Points

Which is the most likely weather sequence on days 8 and 9? (HINT: Work out each of the four possibilities.)
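Following the hint, the four possibilities can be enumerated mechanically. A sketch, assuming the intended conditioning is rain on day 7 and sun on day 10 with days 8 and 9 unobserved, so that P(w8, w9 | r7, s10) ∝ P(w8 | rain) · P(w9 | w8) · P(sun | w9):

```python
# Transition probabilities from the problem statement: P[(today, tomorrow)].
P = {("rain", "rain"): 0.4, ("rain", "sun"): 0.6,
     ("sun", "rain"): 0.3, ("sun", "sun"): 0.7}

# Score each of the four (w8, w9) possibilities; normalization is not needed
# to find the argmax.
scores = {}
for w8 in ("rain", "sun"):
    for w9 in ("rain", "sun"):
        scores[(w8, w9)] = P[("rain", w8)] * P[(w8, w9)] * P[(w9, "sun")]

best = max(scores, key=scores.get)
```

This is the same comparison the Viterbi algorithm performs over a two-step window; with only four candidates, brute-force enumeration is simplest.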