EECS Bayesian Models Homework 1 Solution


 


Please read these instructions to ensure you receive full credit on your homework. Submit the written portion of your homework as a single PDF file through Courseworks (less than 5MB). In addition to your PDF write-up, submit all code written by you in their original extensions through Courseworks (e.g., .m, .r, .py, etc.). Any coding language is acceptable, but your code should be your own. Do not submit Jupyter or other notebooks, but the original source code only. Do not wrap your files in .rar, .zip, or .tar, and do not submit your write-up in .doc or other file types. Your grade will be based on the contents of one PDF file and the original source code. Additional files will be ignored. We will not run your code, so everything you are asked to show should be put in the PDF file. Show all work for full credit.

Late submission policy: Late homeworks will have 0.1% deducted from the final grade for each minute late. Your homework submission time will be based on the time of your last submission to Courseworks. Therefore, do not re-submit after midnight on the due date unless you are confident the new submission is significantly better, enough to overcompensate for the points lost. You can resubmit as much as you like, but each time you resubmit be sure to upload all files you want graded! Submission time is non-negotiable and will be based on the time you submitted your last file to Courseworks. The number of points deducted will be rounded to the nearest integer.

 

Problem 1. (10 points)

Your friend is on a gameshow and phones you for advice. She describes her situation as follows: There are three doors with a prize behind one of the doors and nothing behind the other two. She randomly picks one of the doors, but before opening it, the gameshow host opens one of the other two doors to show that it contains no prize. She wants to know whether she should stay with her original selection or switch doors. What is your suggestion? Calculate the relevant posterior probabilities to convince her that she should follow your advice.
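
Not part of the required posterior calculation, but once the probabilities are worked out by hand, a quick Monte Carlo simulation is a convenient way to double-check them. A minimal Python sketch, assuming the host always opens a no-prize door other than the contestant's pick (the function name and trial count are arbitrary choices):

```python
import random

def simulate_monty_hall(num_trials=100_000, switch=True, seed=0):
    """Estimate the win probability of the stay/switch strategies by simulation."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(num_trials):
        prize = rng.randrange(3)   # door hiding the prize
        pick = rng.randrange(3)    # contestant's initial pick
        # Host opens a door that is neither the contestant's pick nor the prize.
        opened = rng.choice([d for d in range(3) if d != pick and d != prize])
        if switch:
            # Switch to the one remaining closed door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == prize)
    return wins / num_trials

print("P(win | stay)   ~", simulate_monty_hall(switch=False))   # close to 1/3
print("P(win | switch) ~", simulate_monty_hall(switch=True))    # close to 2/3
```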

Problem 2. (15 points)

Let $\pi = (\pi_1, \dots, \pi_K)$, with $\pi_j \geq 0$ and $\sum_j \pi_j = 1$. Let $X_i \sim \text{Multinomial}(\pi)$, i.i.d. for $i = 1, \dots, N$. Find a conjugate prior for $\pi$, then calculate its posterior distribution and identify it by name.

What is the most obvious feature about the parameters of this posterior distribution?
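
The derivation itself is the point of the exercise, but a short numerical check can confirm the pattern of the answer. The sketch below assumes the conjugate prior turns out to be a Dirichlet, whose posterior is obtained by adding the observed category counts to the prior concentration parameters; the function name and array shapes are my own choices.

```python
import numpy as np

def dirichlet_posterior(alpha, X):
    """
    Conjugate update for pi ~ Dirichlet(alpha) with X_i ~ Multinomial(pi).

    alpha : (K,) prior concentration parameters
    X     : (N, K) array of multinomial count vectors, one row per observation

    Returns the posterior concentration parameters alpha + sum_i X_i,
    i.e. another Dirichlet whose parameters accumulate the observed counts.
    """
    alpha = np.asarray(alpha, dtype=float)
    counts = np.asarray(X).sum(axis=0)
    return alpha + counts

# Example: K = 3 categories, a uniform Dirichlet(1, 1, 1) prior,
# and two observed count vectors of 5 trials each.
alpha_post = dirichlet_posterior([1.0, 1.0, 1.0],
                                 [[3, 1, 1],
                                  [0, 4, 1]])
print(alpha_post)   # [4. 6. 3.]
```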

Problem 3. (30 points)

You are given a dataset $\{x_1, \dots, x_N\}$, where each $x \in \mathbb{N}$. You model it as i.i.d. $\text{Poisson}(\lambda)$. Since you don't know $\lambda$, you model it as $\lambda \sim \text{Gamma}(a, b)$.

a) Using Bayes rule, calculate the posterior of $\lambda$ and identify the distribution.

b) Using the posterior, calculate the predictive distribution on a new observation,

$$p(x^* \mid x_1, \dots, x_N) = \int_0^\infty p(x^* \mid \lambda)\, p(\lambda \mid x_1, \dots, x_N)\, d\lambda.$$

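Not required for the write-up, but a numerical sanity check of parts (a) and (b) is easy once the algebra is done. The sketch below assumes the shape/rate parameterization $\text{Gamma}(a, b)$ and that the predictive works out to a negative binomial, the standard result for this conjugate pair; the function names and the SciPy dependency are my own choices.

```python
import numpy as np
from scipy.stats import nbinom

def poisson_gamma_posterior(x, a=1.0, b=1.0):
    """Posterior of lambda under x_i ~ Poisson(lambda), lambda ~ Gamma(a, b).
    Returns the updated shape/rate pair (a + sum_i x_i, b + N)."""
    x = np.asarray(x)
    return a + x.sum(), b + len(x)

def predictive_pmf(x_new, a_post, b_post):
    """Predictive p(x* | x_1..x_N): negative binomial with r = a_post
    and success probability b_post / (b_post + 1)."""
    return nbinom.pmf(x_new, a_post, b_post / (b_post + 1.0))

# Example with toy counts and the a = b = 1 prior used later in Problem 4.
a_post, b_post = poisson_gamma_posterior([2, 0, 3, 1])
print(predictive_pmf(np.arange(5), a_post, b_post))
```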

Problem 4. (20 points)

In this problem you will use your derivations from Problem 3 to code a naive Bayes classifier for distinguishing spam from non-spam emails. The data is provided on Courseworks.

Each 54-dimensional vector $x$ has a label $y$, with $y = 0$ indicating "non-spam" and $y = 1$ indicating "spam". We model the $n$th feature vector of a spam email as

$$p(\vec{x}_n \mid \vec{\lambda}_1, y_n = 1) = \prod_{d=1}^{54} \text{Poisson}(x_{n,d} \mid \lambda_{1,d}),$$

and similarly for class 0. We model the labels as $y_n \sim \text{Bernoulli}(\pi)$. Assume independent gamma priors on all $\lambda_{1,d}$ and $\lambda_{0,d}$, as in Problem 3, with $a = 1$ and $b = 1$. For the label bias $\pi$ assume the prior $\pi \sim \text{Beta}(e, f)$ and set $e = f = 1$.

Let $(x^*, y^*)$ be a new test pair. The goal is to predict $y^*$ given $x^*$. To do this we use the predictive distribution under the posterior of the naive Bayes classifier. That is, for each possible label $y^* = y \in \{0, 1\}$ we compute

$$p(y^* = y \mid x^*, X, \vec{y}) \propto p(x^* \mid y^* = y, \{x_i : y_i = y\})\, p(y^* = y \mid \vec{y}),$$

where $X$ and $\vec{y}$ contain the $N$ training pairs of the form $(x_i, y_i)$. This can be calculated as follows:

$$p(x^* \mid y^* = y, \{x_i : y_i = y\}) = \prod_{d=1}^{54} \int_0^\infty p(x^*_d \mid \lambda_{y,d})\, p(\lambda_{y,d} \mid \{x_i : y_i = y\})\, d\lambda_{y,d}.$$

The results from Problem 3 can be directly applied here. Also, as discussed in the notes,

$$p(y^* = y \mid \vec{y}) = \int_0^1 p(y^* = y \mid \pi)\, p(\pi \mid \vec{y})\, d\pi,$$

which has the solutions

$$p(y^* = 1 \mid \vec{y}) = \frac{e + \sum_n \mathbb{1}(y_n = 1)}{N + e + f} \qquad \text{and} \qquad p(y^* = 0 \mid \vec{y}) = \frac{f + \sum_n \mathbb{1}(y_n = 0)}{N + e + f}.$$

a) Using the marginal distributions discussed above, implement this naive Bayes classifier for binary classification in your preferred language (an illustrative sketch follows part (d) below).

b) Make predictions for all data in the testing set by assigning the most probable label to each feature vector. In a $2 \times 2$ table, list the total number of spam classified as spam and non-spam classified as non-spam, as well as the off-diagonal values (i.e., a confusion matrix). Use the provided ground truth for this evaluation.
c) Pick three misclassified emails and for each email plot its features $x^*$ compared with $\mathbb{E}[\vec{\lambda}_1]$ and $\mathbb{E}[\vec{\lambda}_0]$, and give the predictive probabilities for that email. Mark the 54 points along the x-axis with their names in the readme file.

d) Pick the three most ambiguous predictions, i.e., the emails whose predictive probabilities are closest to 0.5. Show the same information for these three emails that you showed in Problem 4(c) above.
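
For reference only: a minimal sketch of how the marginal distributions above could be organized into the classifier of part (a), assuming the per-feature predictive from Problem 3 is a negative binomial and that the training and test data have already been loaded into NumPy arrays. The function names, the SciPy dependency, and the log-space arithmetic are implementation choices, not part of the assignment.

```python
import numpy as np
from scipy.special import gammaln

def fit_naive_bayes(X, y, a=1.0, b=1.0, e=1.0, f=1.0):
    """Compute the per-class Gamma posteriors and the Beta-smoothed label probabilities.
    X : (N, 54) array of non-negative integer features, y : (N,) labels in {0, 1}."""
    X, y = np.asarray(X), np.asarray(y)
    params = {}
    for c in (0, 1):
        Xc = X[y == c]
        shape = a + Xc.sum(axis=0)          # posterior shape, one entry per feature
        rate = b + Xc.shape[0]              # posterior rate, shared by all 54 features
        params[c] = (shape, rate)
    p1 = (e + np.sum(y == 1)) / (len(y) + e + f)   # p(y* = 1 | training labels)
    return params, np.array([1.0 - p1, p1])

def log_predictive(x, shape, rate):
    """log prod_d p(x_d | class data): negative binomial with r = shape, p = rate/(rate+1)."""
    x = np.asarray(x)
    return np.sum(gammaln(x + shape) - gammaln(shape) - gammaln(x + 1)
                  + shape * np.log(rate / (rate + 1.0))
                  - x * np.log(rate + 1.0))

def predict_proba(x, params, label_prior):
    """Predictive probability of each label for a single test vector x."""
    logp = np.array([log_predictive(x, *params[c]) + np.log(label_prior[c])
                     for c in (0, 1)])
    logp -= logp.max()                      # work in log space, then normalize
    p = np.exp(logp)
    return p / p.sum()

# Usage on hypothetical arrays X_train, y_train, X_test (not provided here):
#   params, label_prior = fit_naive_bayes(X_train, y_train)
#   probs = np.array([predict_proba(x, params, label_prior) for x in X_test])
#   y_hat = probs.argmax(axis=1)
```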

 

 

 

 
