STAT 211 Topic 3


Lecture 5

Lecture 5 Notes

Tuesday, February 1, 2011


Random Variables

A random variable is a function that maps each element in the sample space to a numeric value.

Note: random variables are always denoted by uppercase letters.

When flipping a coin: if we get heads, we assign 1; if we get tails, we assign 0.

That is, we define a random variable X so that X(H) = 1 and X(T) = 0
Discrete Variables
Take on a finite (or countably infinite) number of possible values
EX: gender (male or female); rolling a die (1-6)
Continuous Variables
Take on uncountably many possible values, typically any value along an interval

A random variable that takes only the values 0 and 1 is called a Bernoulli random variable


Probability Distribution

Probability Mass Function (PMF)

The PMF gives the probability that a trial will result in each possible outcome.

The probability mass function of a discrete random variable X is written with a lowercase p: p_X(x) = P(X = x).

Notice that p_X(x) ≥ 0 for every x (because we can't have negative probability) and that Σ_x p_X(x) = 1.

Example

Suppose six lots of components are ready to be shipped by a certain supplier. The number of defective components in each lot is as follows:

Lot              1  2  3  4  5  6
Defective Items  0  1  3  0  1  2

Let X be the number of defective items in a randomly selected lot. X can take the values 0, 1, 2, 3 with PMF

  x        0    1    2    3
  p_X(x)  2/6  2/6  1/6  1/6
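A minimal Python sketch (not part of the original notes) that builds this PMF directly from the lot data:

  from collections import Counter
  from fractions import Fraction

  # number of defective items in each of the six lots (from the table above)
  defectives = [0, 1, 3, 0, 1, 2]

  # empirical PMF: p_X(x) = (# of lots with x defectives) / (total # of lots)
  counts = Counter(defectives)
  pmf = {x: Fraction(c, len(defectives)) for x, c in sorted(counts.items())}

  # print the PMF: 0 and 1 each have probability 1/3, 2 and 3 each have probability 1/6
  for x, p in pmf.items():
      print(x, p)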

Cumulative Distribution Function (CDF)

The CDF gives the probability that an observed result will be at most some value x (e.g., the probability that rolling a die gives at most 3).

The CDF of a discrete random variable X with PMF p_X(x) is F_X(x) = P(X ≤ x) = Σ_{y ≤ x} p_X(y).

Example

Suppose a discrete random variable X gives the face value we get after rolling a fair die, so x = 1, 2, …, 6:

  x        1    2    3    4    5    6
  p_X(x)  1/6  1/6  1/6  1/6  1/6  1/6

Complex probability: P(2 ≤ X ≤ 4) = F(4) − F(1) = 4/6 − 1/6 = 3/6 = 1/2

In general, for an integer-valued X: P(a ≤ X ≤ b) = F(b) − F(a − 1)
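A short Python sketch (my addition) that verifies the calculation above from the die's PMF:

  from fractions import Fraction

  # PMF of a fair die: each face has probability 1/6
  pmf = {x: Fraction(1, 6) for x in range(1, 7)}

  # CDF: F(x) = P(X <= x)
  def F(x):
      return sum(p for value, p in pmf.items() if value <= x)

  # P(2 <= X <= 4) = F(4) - F(1)
  print(F(4) - F(1))  # 1/2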

Unfair Coin

The probabilities of landing on heads and on tails are not both 50%.

We can estimate the probability of heads from many, many, many coin flips: count the number of heads and divide by the total number of trials.


Example

Suppose 3 independent electronic components are connected in series. Let X_i be 1 if the i-th component works and 0 if it fails. Suppose that each component works with probability p.

Each X_i is a discrete (Bernoulli) random variable.

PMF

Let X = X_1 + X_2 + X_3 be the # of components that work: X can be 0, 1, 2, or 3

  x        0          1            2            3
  p_X(x)  (1 − p)^3  3p(1 − p)^2  3p^2(1 − p)  p^3

CDF

  x       0          1                        2        3
  F_X(x)  (1 − p)^3  (1 − p)^3 + 3p(1 − p)^2  1 − p^3  1

On average, what is the number of components that work?


Expected Value

The expected value (mean value) of a random variable is the sum of all possible values of X multiplied by their PMF; it is denoted by μ or E(X):

E(X) = Σ_x x · p_X(x)

From the previous example:

E(X) = 0 · (1 − p)^3 + 1 · 3p(1 − p)^2 + 2 · 3p^2(1 − p) + 3 · p^3 = 3p

Variance of Random Variable

The variance of a random variable with a given PMF is

V(X) = Σ_x (x − μ)^2 · p_X(x) = E(X^2) − [E(X)]^2

Given our previous example with E(X) = 3p, we can find the variance:

V(X) = E(X^2) − (3p)^2 = (3p + 6p^2) − 9p^2 = 3p(1 − p)
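A short Python check of these formulas (my addition; the value p = 0.4 is an arbitrary illustration):

  # assumed value of p, for illustration only
  p = 0.4

  # PMF of X = number of working components out of 3 independent ones
  pmf = {0: (1 - p)**3,
         1: 3 * p * (1 - p)**2,
         2: 3 * p**2 * (1 - p),
         3: p**3}

  mean = sum(x * prob for x, prob in pmf.items())
  var = sum((x - mean)**2 * prob for x, prob in pmf.items())

  print(mean, 3 * p)           # both 1.2
  print(var, 3 * p * (1 - p))  # both 0.72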

Lecture 6

Lecture 6 Notes

Thursday, February 3, 2011

Review Question

Suppose a local TV station sells 15-, 30-, and 60-second advertising spots. If X is the length (in seconds) of a randomly selected commercial, we are given the PMF:

  x (sec)  15   30   60
  p_X(x)   0.1  0.3  0.6

What is the average length of all commercials on the station? (Expected value)

What is the variance of the length among all commercials on the station?
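A quick Python check of both quantities (added here; not part of the original notes):

  # PMF of commercial length X in seconds, from the table above
  pmf = {15: 0.1, 30: 0.3, 60: 0.6}

  # expected value: E(X) = sum of x * p_X(x)
  mean = sum(x * p for x, p in pmf.items())

  # variance: V(X) = E(X^2) - [E(X)]^2
  ex2 = sum(x**2 * p for x, p in pmf.items())
  var = ex2 - mean**2

  print(mean)  # 46.5 seconds
  print(var)   # 290.25 seconds^2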

Functions of Random Variables

For any function g(X) of a discrete random variable X: E[g(X)] = Σ_x g(x) · p_X(x)

Special (linear) Cases:

E(aX + b) = a·E(X) + b
V(aX + b) = a^2·V(X)
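As an illustration (my own example, reusing the commercial-length PMF above), converting seconds to minutes with Y = X/60:

  pmf = {15: 0.1, 30: 0.3, 60: 0.6}  # commercial length X in seconds

  a, b = 1 / 60, 0                   # Y = aX + b converts seconds to minutes

  mean_x = sum(x * p for x, p in pmf.items())
  mean_y = sum((a * x + b) * p for x, p in pmf.items())

  print(mean_y)          # 0.775 minutes
  print(a * mean_x + b)  # the same: E(aX + b) = a·E(X) + b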


Binomial Distribution

A Binomial Experiment satisfies the following conditions:

  1. n = number of independent trials / components
  2. each trial results in either success or failure (X = # of successes)
  3. p = probability of success (the same from trial to trial)
Note: X can take the values 0, 1, …, n

Written as X ~ Bin(n, p), and if this is the case, the probability of getting x successes is:

P(X = x) = (n choose x) · p^x · (1 − p)^(n − x),  x = 0, 1, …, n

Expected Value of Binomial Distribution

E(X) = np and V(X) = np(1 − p)
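A minimal Python sketch of the binomial PMF (my addition; n = 4 and p = 0.5 are illustrative values), checking that it sums to 1 and that its mean equals np:

  from math import comb

  def binom_pmf(x, n, p):
      # P(X = x) for X ~ Bin(n, p)
      return comb(n, x) * p**x * (1 - p)**(n - x)

  n, p = 4, 0.5

  total = sum(binom_pmf(x, n, p) for x in range(n + 1))
  mean = sum(x * binom_pmf(x, n, p) for x in range(n + 1))

  print(total)        # 1.0
  print(mean, n * p)  # both 2.0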

Lecture 7

Lecture 7 Notes

Tuesday, February 8, 2011

Binomial Experiments (cont'd)

Example 1

The probability of getting heads by tossing a coin is p. Suppose we toss this coin 4 times:

  1. How many ways can we get only one success (heads)?
    There are (4 choose 1) = 4 ways: SFFF, FSFF, FFSF, FFFS
  2. What is the probability of getting 1 head in 4 flips?
    P(X = 1) = (4 choose 1) · p · (1 − p)^3 = 4p(1 − p)^3

Example 2

I asked 36 students in my past classes how many times they go out in a week. From the data, the fraction of students who go out at least 4 times per week is 5/36. Suppose I randomly select 5 students from among those 36:

  1. What is the chance that at least 3 of the 5 students go out at least 4 times a week?
    X = # of students that go out ≥ 4 times/wk among the 5 selected students
    X ~ Bin(n = 5, p = 5/36)
    P(X ≥ 3) = Σ_{x=3}^{5} (5 choose x) · (5/36)^x · (31/36)^(5 − x)
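A quick Python evaluation of this sum (my addition):

  from math import comb

  n, p = 5, 5 / 36   # from the example above

  def binom_pmf(x):
      return comb(n, x) * p**x * (1 - p)**(n - x)

  # P(X >= 3) = P(X = 3) + P(X = 4) + P(X = 5)
  prob = sum(binom_pmf(x) for x in range(3, n + 1))
  print(prob)  # ≈ 0.0215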

Example 3

Consider a system with 7 components serially connected. Suppose each component works with probability 0.4, independently of the others:

  1. What is the probability that at most 3 components work?
    X = # of components that work: X ~ Bin(n = 7, p = 0.4)
    P(X ≤ 3) = Σ_{x=0}^{3} (7 choose x) · (0.4)^x · (0.6)^(7 − x)
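A quick Python evaluation of this sum (my addition):

  from math import comb

  n, p = 7, 0.4   # from the example above

  def binom_pmf(x):
      return comb(n, x) * p**x * (1 - p)**(n - x)

  # P(X <= 3) = P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3)
  prob = sum(binom_pmf(x) for x in range(0, 4))
  print(prob)  # ≈ 0.710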


Poisson Distribution

The number of events happening in a fixed unit (of time, volume, area, etc.)

  • Even though the number of possible outcomes is infinite, X is still considered a discrete random variable
  • We need to be given an average rate per unit; we'll call it λ
  • Written as X ~ Poisson(λ), with PMF P(X = x) = e^(−λ) · λ^x / x!,  x = 0, 1, 2, …

Expected value = Variance = λ
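A minimal Python sketch of the Poisson PMF (my addition; λ = 1 is an illustrative rate), checking that it sums to 1 and that its mean equals λ:

  from math import exp, factorial

  def poisson_pmf(x, lam):
      # P(X = x) for X ~ Poisson(lam)
      return exp(-lam) * lam**x / factorial(x)

  lam = 1.0

  # the infinite sum is approximated closely by a large finite one
  total = sum(poisson_pmf(x, lam) for x in range(100))
  mean = sum(x * poisson_pmf(x, lam) for x in range(100))

  print(total, mean)  # ≈ 1.0, ≈ 1.0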

Example 1

It has been observed that the average number of traffic accidents on the Hollywood Freeway between 7 and 8 AM on Tuesday mornings is 1 per hour. What is the chance that there will be 2 accidents on the Freeway, on some specified Tuesday morning (per hour)?

Let X = # of accidents on the freeway in one hour on a Tuesday morning: X ~ Poisson(λ = 1)

P(X = 2) = e^(−1) · 1^2 / 2! = e^(−1) / 2 ≈ 0.184
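Checking this value in Python (my addition):

  from math import exp, factorial

  lam = 1
  print(exp(-lam) * lam**2 / factorial(2))  # ≈ 0.1839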

Example 2

Suppose the bacteria concentration in a river is 1 per 20 cc on average. If we draw 10 cc into a test tube:

  1. What is the chance that the sample contains 2 bacteria?
    X = # of bacteria in the test tube: X ~ Poisson(λ = 10/20 = 0.5)
    P(X = 2) = e^(−0.5) · (0.5)^2 / 2! ≈ 0.076
  2. What is the chance that the sample contains at least 2 bacteria?
    X ≥ 2 ⇒ P(X ≥ 2) = 1 − P(X ≤ 1) = 1 − [P(X = 0) + P(X = 1)] ≈ 0.090
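A quick Python check of both answers (my addition):

  from math import exp, factorial

  lam = 0.5   # 1 bacterium per 20 cc, scaled to a 10 cc sample

  def poisson_pmf(x):
      return exp(-lam) * lam**x / factorial(x)

  print(poisson_pmf(2))                         # ≈ 0.0758
  print(1 - (poisson_pmf(0) + poisson_pmf(1)))  # ≈ 0.0902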