Tuesday, March 1, 2011
Review
Topic 3: Discrete Random Variables
- PMF, CDF
- Expected value and variance
Topic 4: Continuous Random Variables
- PDF, CDF
- Expected value and variance
Topic 5 Overview
Consider 2 or more random variables.
- Joint PMF or PDF
- Expected value and variance
Central Limit Theorem (CLT)
What is a Joint Distribution?
Discrete
Given 2 Bernoulli discrete random variables $X$ and $Y$ (not independent)
Joint Probability Mass Function: $p(x, y) = P(X = x \text{ and } Y = y)$, which satisfies (1) $p(x, y) \ge 0$ and (2) $\sum_x \sum_y p(x, y) = 1$
- Marginal PMF of X: $p_X(x) = \sum_y p(x, y)$ (also satisfies 1 & 2)
- Marginal PMF of Y: $p_Y(y) = \sum_x p(x, y)$ (also satisfies 1 & 2)
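As a quick sketch of how marginals fall out of a joint PMF in Python; the particular table of probabilities below is made up for illustration:

```python
# Assumed (made-up) joint PMF of two dependent Bernoulli rv's X and Y.
joint_pmf = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.2, (1, 1): 0.3,
}

# Property 2: all probabilities sum to 1.
assert abs(sum(joint_pmf.values()) - 1.0) < 1e-12

# Marginal PMF of X: sum the joint PMF over all y values.
p_X = {x: sum(p for (xi, y), p in joint_pmf.items() if xi == x) for x in (0, 1)}
# Marginal PMF of Y: sum the joint PMF over all x values.
p_Y = {y: sum(p for (x, yi), p in joint_pmf.items() if yi == y) for y in (0, 1)}

print(p_X)  # {0: 0.5, 1: 0.5}
print(p_Y)  # {0: 0.6, 1: 0.4}
```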
Continuous
Let $X$ and $Y$ be continuous random variables with joint PDF $f(x, y)$: $P(a \le X \le b,\ c \le Y \le d) = \int_a^b \int_c^d f(x, y)\, dy\, dx$
- Note that whichever variable comes first inside $P(\cdot)$ corresponds to the outer integral
Marginal PDFs: $f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy$ and $f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx$
Example
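A SymPy sketch of these computations, using an assumed joint PDF $f(x, y) = x + y$ on the unit square (chosen for illustration, not necessarily the lecture's example):

```python
import sympy as sp

x, y = sp.symbols("x y", nonnegative=True)

# Assumed joint PDF for illustration: f(x, y) = x + y on [0, 1] x [0, 1].
f = x + y

# It is a valid PDF: the double integral over the support is 1.
print(sp.integrate(f, (x, 0, 1), (y, 0, 1)))                  # 1

# Marginal PDFs: integrate out the other variable.
print(sp.integrate(f, (y, 0, 1)))                             # f_X(x) = x + 1/2
print(sp.integrate(f, (x, 0, 1)))                             # f_Y(y) = y + 1/2

# P(X <= 1/2, Y <= 1/2): inner integral over y, outer over x.
print(sp.integrate(f, (y, 0, sp.S.Half), (x, 0, sp.S.Half)))  # 1/8
```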
Independence
- $p(x, y) = p_X(x)\, p_Y(y)$ if discrete
- $f(x, y) = f_X(x)\, f_Y(y)$ if continuous
- You should be able to factor the joint PDF into two functions: one in $x$ alone and one in $y$ alone
For example, if $X$ and $Y$ are independent, $P(a \le X \le b,\ c \le Y \le d) = P(a \le X \le b) \cdot P(c \le Y \le d)$.
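A quick symbolic check of the factoring condition (a sketch; the two test densities are assumed for illustration):

```python
import sympy as sp

x, y = sp.symbols("x y", nonnegative=True)

def is_independent(f, x_lims, y_lims):
    """Check f(x, y) == f_X(x) * f_Y(y) on a rectangular support."""
    f_X = sp.integrate(f, (y, *y_lims))
    f_Y = sp.integrate(f, (x, *x_lims))
    return sp.simplify(f - f_X * f_Y) == 0

print(is_independent(4 * x * y, (0, 1), (0, 1)))  # True  -> factors as (2x)(2y)
print(is_independent(x + y,     (0, 1), (0, 1)))  # False -> cannot be factored
```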
Conditional Distributions
If $X$ and $Y$ are continuous, the result of the discrete formula $P(X = x \mid Y = y) = \frac{P(X = x,\ Y = y)}{P(Y = y)}$ will be $\frac{0}{0}$, since $Y$ is set to a certain constant number and a continuous rv takes any single value with probability 0.
The way to get around this is by looking at the joint PDF and marginal PDF: $f_{X \mid Y}(x \mid y) = \frac{f(x, y)}{f_Y(y)}$
After finding the symbolic conditional PDF, integrate over $x$ and plug in the observed value of $y$.
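A sketch of that recipe in SymPy, again with the assumed joint PDF $f(x, y) = x + y$ on the unit square:

```python
import sympy as sp

x, y = sp.symbols("x y", nonnegative=True)
f = x + y                                  # assumed joint PDF on the unit square

f_Y = sp.integrate(f, (x, 0, 1))           # marginal of Y: y + 1/2
f_cond = f / f_Y                           # symbolic conditional PDF f_{X|Y}(x|y)

# P(X <= 1/2 | Y = 1/4): integrate over x, then plug in the observed y.
prob = sp.integrate(f_cond, (x, 0, sp.S.Half)).subs(y, sp.Rational(1, 4))
print(sp.simplify(prob))                   # 1/3
```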
Example
Suppose $X$ and $Y$ denote the air pressure in the front tires of a car (left and right), and their joint PDF is given by $f(x, y) = K(x^2 + y^2)$ for $20 \le x \le 30$ and $20 \le y \le 30$ (and 0 otherwise).
The tires are supposed to be filled to 26 psi.
- What is $K$?
- do the double integral over the support, set it equal to 1, and solve for $K$
- What is the probability that both tires are underfilled?
- double-integrate from 20 to 26 in both variables (sketched below)
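A SymPy sketch of both parts, assuming the joint PDF takes the common textbook form $f(x, y) = K(x^2 + y^2)$ on $20 \le x, y \le 30$:

```python
import sympy as sp

x, y, K = sp.symbols("x y K", positive=True)
f = K * (x**2 + y**2)          # assumed form of the joint PDF on [20, 30]^2

# Step 1: double-integrate over the support, set equal to 1, solve for K.
total = sp.integrate(f, (x, 20, 30), (y, 20, 30))
K_val = sp.solve(sp.Eq(total, 1), K)[0]
print(K_val)                   # 3/380000

# Step 2: both tires underfilled -> integrate from 20 to 26 in both variables.
prob = sp.integrate(f.subs(K, K_val), (x, 20, 26), (y, 20, 26))
print(prob, float(prob))       # ~0.3024
```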
Lecture 12
Thursday, March 3, 2011
(continued from lecture 11)
Homework Example
Given 2 independent continuous rv's, each with a Uniform distribution between 0 and 3, find the probability of an event involving both of them.
We don't know $f(x, y)$ directly, but we do know the marginal PDFs: $f_X(x) = \frac{1}{3}$ for $0 \le x \le 3$, and $f_Y(y) = \frac{1}{3}$ for $0 \le y \le 3$.
By independence, the joint PDF is just the product of the two marginal PDFs: $f(x, y) = \frac{1}{3} \cdot \frac{1}{3} = \frac{1}{9}$ on the square.
Plot the support as a square with bottom-left corner at the origin and sides of length 3.
Solve the inequality inside $P(\cdot)$ for $y$ and plot the bounded area:
Find the area of the bounded region using either the graph or the integral above; the probability is that area times the constant joint PDF $\frac{1}{9}$.
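A Monte Carlo sketch of the same idea; the event $X + Y \le 3$ is assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# X, Y independent Uniform(0, 3); the joint PDF is 1/9 on the 3x3 square.
x = rng.uniform(0, 3, 1_000_000)
y = rng.uniform(0, 3, 1_000_000)

# Assumed event for illustration: X + Y <= 3, a triangle of area 4.5.
print(np.mean(x + y <= 3))            # ~0.5 = 4.5 * (1/9)
```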
Expected Value
For any arbitrary function $h(X, Y)$: $E[h(X, Y)] = \int \int h(x, y)\, f(x, y)\, dx\, dy$ (or a double sum for discrete rv's)
Easier way (linearity): $E[aX + bY] = a\,E[X] + b\,E[Y]$
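A sketch of the double-integral formula in SymPy, with $h(x, y) = xy$ and the assumed joint PDF $f(x, y) = x + y$ on the unit square:

```python
import sympy as sp

x, y = sp.symbols("x y", nonnegative=True)
f = x + y                       # assumed joint PDF on the unit square
h = x * y                       # arbitrary function of X and Y

# E[h(X, Y)] = double integral of h * f over the support.
print(sp.integrate(h * f, (x, 0, 1), (y, 0, 1)))   # 1/3
```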
Variance
$V(aX + bY) = a^2\, V(X) + b^2\, V(Y)$ ONLY IF X AND Y ARE INDEPENDENT!
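A quick simulation sketch of why independence matters here (the distributions are chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(1)

x = rng.normal(0, 2, 1_000_000)   # V(X) = 4
y = rng.normal(0, 3, 1_000_000)   # V(Y) = 9, independent of X

print(np.var(x + y))              # ~13 = V(X) + V(Y): independence holds
print(np.var(x + x))              # ~16, not 4 + 4 = 8: X isn't independent of itself
```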
(lecture 12)
Random Sample
A collection of independent random variables with the same distribution as the population (recall from Topic 1), used to estimate a parameter. The numbers we compute from the sample to estimate parameters are called statistics.
Central Limit Theorem
The original rv X may have any distribution.
If we let $E[X] = \mu$ and $V(X) = \sigma^2$, we know the following.
Suppose we pick a random sample $X_1$ through $X_n$ from $X$:
- As the sample size gets larger, the average of the sample approaches a normal distribution: $\bar{X} \approx N\!\left(\mu, \frac{\sigma^2}{n}\right)$
Proof
Let $X$ be a uniform distribution.
Take 15 random samples of size 30. The expected value of the sample average should be the mean of the population, and the sample averages should follow an approximately normal distribution.
The variance of the sample average should be the variance of the population divided by the sample size: $V(\bar{X}) = \frac{\sigma^2}{n}$
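A NumPy sketch of that demonstration, assuming the population is Uniform(0, 1) so that $\mu = \frac{1}{2}$ and $\sigma^2 = \frac{1}{12}$:

```python
import numpy as np

rng = np.random.default_rng(2)

# Population: X ~ Uniform(0, 1), so mu = 1/2 and sigma^2 = 1/12.
n, reps = 30, 15                      # 15 random samples, each of size 30
averages = rng.uniform(0, 1, (reps, n)).mean(axis=1)

print(averages.mean())                # ~0.5: the population mean
print(averages.var())                 # ~(1/12)/30: population variance / n
```

With only 15 repetitions the estimates are rough; increasing `reps` gives a much cleaner picture of the normal shape.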
Example 1
Flipping a fair coin ($X = 1$ for Heads, $X = 0$ for Tails), so $\mu = \frac{1}{2}$ and $\sigma^2 = \frac{1}{4}$
Therefore by the CLT, the average of 1000 flips will approach $N\!\left(\frac{1}{2}, \frac{1/4}{1000}\right)$
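A quick simulation check, using the fact that the number of heads in 1000 fair flips is Binomial(1000, 1/2):

```python
import numpy as np

rng = np.random.default_rng(3)

# Sample average of 1000 flips = (number of heads) / 1000, repeated many times.
averages = rng.binomial(1000, 0.5, size=100_000) / 1000

print(averages.mean())   # ~0.5:    mu
print(averages.std())    # ~0.0158: sqrt((1/4) / 1000)
```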
Example 2
Suppose the weight of 50-year-old males has a certain distribution with mean 150 lbs. and standard deviation 32 lbs. What is the approximate probability that the sample mean weight for a random sample of 64 men is at least 160 lbs?
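A sketch of the CLT solution, reading the question as $P(\bar{X} \ge 160)$: the standard error is $32/\sqrt{64} = 4$, so $z = (160 - 150)/4 = 2.5$:

```python
from scipy.stats import norm

mu, sigma, n = 150, 32, 64
se = sigma / n**0.5            # standard error of the sample mean: 32/8 = 4

# By the CLT, Xbar is approximately N(150, 4^2).
z = (160 - mu) / se            # z = 2.5
print(1 - norm.cdf(z))         # P(Xbar >= 160) ~ 0.0062
```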