Question

Let Θ be a Bernoulli random variable that indicates which one of two hypotheses is true, and let P(Θ=1)=p. Under the hypothesis Θ=0, the random variable X has a normal distribution with mean 0 and variance 1. Under the alternative hypothesis Θ=1, X has a normal distribution with mean 2 and variance 1.
a) Suppose for this part of the problem that p=2/3. The MAP rule can choose in favor of the hypothesis Θ=1 if and only if x ≥ c1. Find the value of c1.

b) For this part, assume again that p=2/3. Find the conditional probability of error for the MAP decision rule, given that the hypothesis Θ=0 is true, i.e. find P(error|Θ=0).

c) Find the overall (unconditional) probability of error associated with the MAP rule for p=1/2.

Homework Answers

Answer #1

a) Observe that if R1 is the rejection region of the hypothesis Θ=0, that is, the set of observations x for which the MAP rule decides in favor of Θ=1, then

R1 = { x : p·f_{X|Θ}(x|1) ≥ (1−p)·f_{X|Θ}(x|0) }.

Substituting the two normal densities (the common factor 1/√(2π) cancels) and taking logarithms, this condition becomes

ln(p/(1−p)) − (x−2)²/2 ≥ −x²/2, i.e. x ≥ 1 − (1/2)·ln(p/(1−p)).

Therefore, for p = 2/3,

c1 = 1 − (1/2)·ln 2 ≈ 0.6534.
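A quick numerical sanity check of this threshold, sketched in Python (the helper normal_pdf and the variable names are mine, not part of the original solution):

```python
# Sketch: numerically check the MAP threshold c1 for p = 2/3 (names are illustrative).
from math import exp, log, sqrt, pi

p = 2 / 3                                   # prior P(Theta = 1)

def normal_pdf(x, mu):
    """Density of N(mu, 1)."""
    return exp(-(x - mu) ** 2 / 2) / sqrt(2 * pi)

# Closed-form threshold derived above: decide Theta = 1 iff x >= c1.
c1 = 1 - 0.5 * log(p / (1 - p))
print(f"c1 = {c1:.4f}")                     # ~0.6534

# At the threshold the two weighted likelihoods coincide (up to rounding).
print(p * normal_pdf(c1, 2), (1 - p) * normal_pdf(c1, 0))
```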

b) The probability of error given that Θ=0 is true is the probability that X, which is then N(0,1), falls in the rejection region:

P(error|Θ=0) = P(X ≥ c1 | Θ=0) = 1 − Φ(0.6534) ≈ 0.2567,

where Φ denotes the standard normal CDF.
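The same number can be evaluated with only the standard library, as in this small sketch (std_normal_cdf is my own helper built on math.erf):

```python
# Sketch: P(error | Theta = 0) = P(X >= c1 | Theta = 0), with X ~ N(0, 1) under Theta = 0.
from math import erf, log, sqrt

def std_normal_cdf(x):
    """Standard normal CDF, written with math.erf."""
    return 0.5 * (1 + erf(x / sqrt(2)))

c1 = 1 - 0.5 * log(2)                       # threshold from part a), p = 2/3
print(f"P(error | Theta=0) = {1 - std_normal_cdf(c1):.4f}")   # ~0.2567
```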

c) Now the MAP rule for p = 1/2 would be (from the previous calculation, since ln(p/(1−p)) = 0) to decide Θ=1 if and only if x ≥ 1.

Therefore the unconditional probability of error is

P(error) = P(Θ=0)·P(X ≥ 1 | Θ=0) + P(Θ=1)·P(X < 1 | Θ=1) = (1/2)·(1 − Φ(1)) + (1/2)·(1 − Φ(1)) = 1 − Φ(1) ≈ 0.1587.
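A short Python sketch of this final calculation under the same assumptions; the Monte Carlo loop at the end is only an illustrative cross-check, not part of the original answer:

```python
# Sketch: overall MAP error probability when p = 1/2, i.e. threshold c = 1.
import random
from math import erf, sqrt

def std_normal_cdf(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

c = 1.0
err_given_0 = 1 - std_normal_cdf(c)         # X ~ N(0,1) but we decide Theta = 1
err_given_1 = std_normal_cdf(c - 2)         # X ~ N(2,1) but we decide Theta = 0
print(f"P(error) = {0.5 * err_given_0 + 0.5 * err_given_1:.4f}")   # ~0.1587

# Optional Monte Carlo cross-check of the same number (illustrative only).
n = 200_000
errors = 0
for _ in range(n):
    theta = random.random() < 0.5           # true hypothesis, P(Theta=1) = 1/2
    x = random.gauss(2.0 if theta else 0.0, 1.0)
    errors += (x >= c) != theta             # MAP decision vs. truth
print(f"simulated P(error) ~ {errors / n:.4f}")
```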
