# Probability

Homework 7, STA 4321 / STA 5325: Introduction to Probability
University of Florida, Spring 2021
Due: 5:00 pm on Monday, April 5th, 2021

All work must be shown for complete credit.

1. It is common for engineers to work with the "error function"

$$\operatorname{erf}(z) = \frac{2}{\sqrt{\pi}} \int_0^z \exp(-x^2)\, dx$$

instead of the standard normal probability distribution function $\Phi$, which we defined as:

$$\Phi(z) = \int_{-\infty}^{z} \frac{1}{\sqrt{2\pi}} \exp\!\left(-\frac{x^2}{2}\right) dx.$$

Show that the following relationship between $\Phi$ and the function $\operatorname{erf}$ holds for all $z$:

$$\Phi(z) = \frac{1}{2} + \frac{1}{2}\operatorname{erf}\!\left(\frac{z}{\sqrt{2}}\right).$$

2. Suppose that a pair of random variables $(X, Y)$ is uniformly distributed on the vertices of the square $[-1, 1] \times [-1, 1]$: i.e., the joint mass function places nonzero probability on $(-1, -1)$, $(1, -1)$, $(-1, 1)$, and $(1, 1)$, with each of these four points occurring with probability $p_{X,Y}(x, y) = \frac{1}{4}$.

   (a) Compute $P(X^2 + Y^2 < 1)$.
   (b) Compute $P(2X - Y > 0)$.
   (c) Compute $P(|X - Y| < 2)$.

3. Suppose $X \sim N(\mu, \sigma^2)$ for some $\mu \in \mathbb{R}$ and $\sigma > 0$ (that is, $X$ is normally distributed with mean $\mu$ and variance $\sigma^2$). Given some realization of $X$, a mathematician constructs a rectangle with length $L = |X|$ and width $W = 4|X|$. What is the expected value of the area of the rectangle?

4. Let $V$ be a random variable following the beta distribution with parameters $\alpha, \beta$. Specifically, the density of $V$ is

$$f_V(v) = \begin{cases} \dfrac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)}\, v^{\alpha - 1} (1 - v)^{\beta - 1} & 0 \le v \le 1 \\ 0 & \text{otherwise.} \end{cases}$$

Find $E(V^k)$ for an arbitrary integer $k$ without using moment generating functions. Your answer may be left in terms of quantities involving the $\Gamma$ function.

5. Suppose that the random variables $Y_1, Y_2$ have joint probability density function given by

$$f_{Y_1, Y_2}(y_1, y_2) = \begin{cases} 6 y_1^2 y_2 & 0 \le y_1 \le y_2,\ y_1 + y_2 \le 2 \\ 0 & \text{otherwise.} \end{cases}$$

   (a) Show that the marginal density of $Y_1$ is a beta distribution with parameters $\alpha = 3$ and $\beta = 2$.
   (b) Derive the marginal density of $Y_2$.
   (c) Derive the conditional density of $Y_2$ given $Y_1 = y_1$, $f_{Y_2 \mid Y_1 = y_1}$.
   (d) Find $P(Y_2 < 1.1 \mid Y_1 = 0.6)$.

6.
Derive the moment generating function of $Y$, a negative binomial random variable with $r = 10$ and success probability $p$ (i.e., $Y$ is the number of failures before the $r = 10$th success in a sequence of independent Bernoulli trials). You may use the following facts without proof:

   (a) If $X_1, X_2, \ldots, X_{10}$ are independent geometric random variables, each with success probability $p$, then we can write $Y = \sum_{i=1}^{10} X_i$.
   (b) If $U$ and $V$ are independent random variables, then for any function $g : \mathbb{R} \to \mathbb{R}$, $E[g(U)\,g(V)] = E[g(U)]\,E[g(V)]$ (assuming all relevant expected values exist).
   (c) For all constants $t$ and $r$ such that $|r \exp(t)| < 1$,

$$\sum_{k=0}^{\infty} [r \exp(t)]^k = \frac{1}{1 - r \exp(t)}.$$

Note that when $r$ is positive, $t < -\log(r) \implies |r \exp(t)| < 1$.
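The identity in Problem 1 can be sanity-checked numerically once a proof is in hand. The sketch below (using only Python's standard library; the integration cutoff $-10$, the step count, and the test points are arbitrary choices) compares a trapezoidal approximation of $\Phi(z)$ against $\tfrac12 + \tfrac12\operatorname{erf}(z/\sqrt{2})$:

```python
import math

def phi_numeric(z, lo=-10.0, n=100_000):
    # Trapezoidal integration of the standard normal pdf from lo to z;
    # lo = -10 truncates a tail of negligible mass.
    h = (z - lo) / n
    total = 0.0
    for i in range(n + 1):
        x = lo + i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * math.exp(-x * x / 2)
    return total * h / math.sqrt(2 * math.pi)

for z in (-1.5, 0.0, 0.7, 2.3):
    rhs = 0.5 + 0.5 * math.erf(z / math.sqrt(2))
    assert abs(phi_numeric(z) - rhs) < 1e-6
```

The check passing for a handful of points is evidence, not a proof; the algebraic substitution $x = u\sqrt{2}$ is still what the problem asks for.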
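Because the distribution in Problem 2 is supported on only four points, each probability can be checked by direct enumeration. A minimal sketch (the helper `prob` is not part of the assignment):

```python
from fractions import Fraction

# The four vertices of [-1, 1] x [-1, 1], each with probability 1/4.
vertices = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
p = Fraction(1, 4)

def prob(event):
    # Sum the mass of the vertices satisfying the event.
    return sum(p for (x, y) in vertices if event(x, y))

p_a = prob(lambda x, y: x**2 + y**2 < 1)
p_b = prob(lambda x, y: 2 * x - y > 0)
p_c = prob(lambda x, y: abs(x - y) < 2)
print(p_a, p_b, p_c)  # → 0 1/2 1/2
```

Note for (a) that every vertex satisfies $x^2 + y^2 = 2$, so the event is empty.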
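For Problem 3, a Monte Carlo estimate gives a numeric target to compare a derived closed form against. A minimal sketch; the parameter values, seed, and sample size are arbitrary illustrative choices:

```python
import random

random.seed(0)
mu, sigma = 1.3, 0.8  # arbitrary illustrative parameters
n = 200_000

# Area of the rectangle is L * W = |X| * (4|X|) = 4 X^2.
est = sum(4 * random.gauss(mu, sigma) ** 2 for _ in range(n)) / n
print(round(est, 2))  # Monte Carlo estimate of E[4 X^2]
```

Whatever expression you derive in terms of $\mu$ and $\sigma$ should reproduce this estimate for these parameter values up to Monte Carlo error.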
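For Problem 4, a numerically integrated moment gives concrete values to test a $\Gamma$-function expression against. A sketch using the standard library's `math.gamma`; the parameter choices $\alpha = 2$, $\beta = 3$ are arbitrary, and `beta_moment` is a helper name introduced here:

```python
import math

def beta_moment(alpha, beta, k, n=100_000):
    # E(V^k) by trapezoidal integration of v^k * f_V(v) over [0, 1].
    c = math.gamma(alpha + beta) / (math.gamma(alpha) * math.gamma(beta))
    h = 1.0 / n
    total = 0.0
    for i in range(n + 1):
        v = i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * v**k * c * v ** (alpha - 1) * (1 - v) ** (beta - 1)
    return total * h

# Sanity check: for alpha=2, beta=3 the mean is alpha/(alpha+beta) = 0.4.
print(round(beta_moment(2, 3, 1), 4))  # → 0.4
```

A candidate closed-form answer can be evaluated at the same $(\alpha, \beta, k)$ triples and compared.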
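Part (a) of Problem 5 can likewise be spot-checked numerically: for a fixed $y_1$, integrating the joint density over the $y_2$-slice of the support should reproduce the Beta(3, 2) density $12 y_1^2 (1 - y_1)$ claimed in the problem. A minimal sketch (`marginal_y1` is a helper name introduced here; the test points are arbitrary):

```python
def marginal_y1(y1, n=20_000):
    # Integrate the joint density 6 y1^2 y2 over y2 in [y1, 2 - y1]
    # (the support slice for this fixed 0 <= y1 <= 1), trapezoid rule.
    lo, hi = y1, 2 - y1
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        y2 = lo + i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * 6 * y1**2 * y2
    return total * h

# Compare against the Beta(3, 2) density stated in part (a).
for y1 in (0.2, 0.5, 0.8):
    assert abs(marginal_y1(y1) - 12 * y1**2 * (1 - y1)) < 1e-9
```

The trapezoid rule is exact here up to rounding because the integrand is linear in $y_2$; the analytic integration is still what part (a) requires.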
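Once you have derived the MGF in Problem 6, it can be checked against a direct truncated sum of $e^{tk}\,P(Y = k)$ using the negative binomial pmf $P(Y = k) = \binom{k + r - 1}{r - 1} p^{r} (1-p)^{k}$. The sketch below treats `mgf_closed` as a candidate expression to be verified, with arbitrary choices of $p$, truncation point, and test values of $t$ (all within the region of convergence from fact (c)):

```python
import math

def mgf_numeric(t, p, r=10, kmax=4000):
    # Truncated sum of e^{tk} P(Y = k). The factor q = (1 - p) e^t is
    # grouped so q**k decays geometrically instead of overflowing.
    q = (1 - p) * math.exp(t)
    return sum(math.comb(k + r - 1, r - 1) * p**r * q**k
               for k in range(kmax))

def mgf_closed(t, p, r=10):
    # Candidate closed form to verify, valid when (1 - p) e^t < 1.
    return (p / (1 - (1 - p) * math.exp(t))) ** r

p = 0.4
for t in (-0.5, 0.0, 0.2):
    assert abs(mgf_numeric(t, p) - mgf_closed(t, p)) < 1e-9
```

A useful extra check: any valid MGF must satisfy $M(0) = 1$, which both functions do here.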
