CSCE 411 Lecture 24

Lecture Slides

Wednesday, October 24, 2012


The Monte Carlo Method: Estimating Pi

How do we calculate $\pi$?

Main idea: take a 2×2 square centered at $(0,0)$ with a circle of radius 1 inscribed in it. The area of the circle is $\pi r^2 = \pi$ and the area of the square is 4.


Imagine a drunk and blindfolded guy throwing darts at this square. Choose points uniformly at random in the square and determine whether each one lands inside the circle:

(Number of points in the circle) ÷ (total number of sample points) ≈ $\pi/4$.

Let $X$ equal 1 if a point lands in the circle, and 0 otherwise. Therefore $\Pr[X = 1] = \mathrm{E}[X] = \pi/4$.

How many darts does the drunk and blindfolded guy need to throw?

Let $S = \sum_{i=1}^{n} X_i$, where $X_i$ is the value of $X$ for the $i$th dart.

Thus $4S/n$ is an estimator for $\pi$.

The Chernoff bound gives us a good estimate of how close $4S/n$ is to $\pi$.

Therefore, the probability that $4S/n$ deviates significantly from $\pi$ decreases exponentially with the number of trials $n$.
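
A minimal Python sketch of the experiment (the function name and sample count are my own, not from the lecture):

import random

def estimate_pi(n):
    """Throw n random darts at the 2x2 square centered at the origin
    and count how many land inside the unit circle."""
    hits = 0
    for _ in range(n):
        x = random.uniform(-1.0, 1.0)
        y = random.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:   # dart landed inside the circle
            hits += 1
    return 4.0 * hits / n          # hits/n approximates pi/4, so scale by 4

print(estimate_pi(1_000_000))      # typically prints a value close to 3.14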

A Chernoff Bound

Let $X_1, X_2, \ldots, X_n$ be independent random variables such that $\Pr[X_i = 1] = p_i$ and $\Pr[X_i = 0] = 1 - p_i$.

Let $X = \sum_{i=1}^{n} X_i$.

For $0 < \delta \le 1$, we have

$\Pr\bigl[\,|X - \mathrm{E}[X]| \ge \delta\,\mathrm{E}[X]\,\bigr] \le 2 e^{-\mathrm{E}[X]\,\delta^2/3}$
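
As a rough application of this bound to the dart question above (using the bound exactly as stated here): the estimate $4S/n$ deviates from $\pi$ by at least $\delta\pi$ exactly when $S$ deviates from $\mathrm{E}[S] = n\pi/4$ by at least $\delta\,\mathrm{E}[S]$, so it suffices to pick $n$ with $2e^{-n\pi\delta^2/12} \le \epsilon$, i.e. $n \ge \frac{12}{\pi\delta^2}\ln\frac{2}{\epsilon}$. A small Python check (names are illustrative):

import math

def darts_needed(delta, eps):
    """Smallest n for which the stated Chernoff bound guarantees
    Pr[|estimate - pi| >= delta * pi] <= eps."""
    return math.ceil(12.0 / (math.pi * delta * delta) * math.log(2.0 / eps))

print(darts_needed(0.01, 0.01))    # roughly 200,000 darts for 1% relative error at 99% confidence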


Minimum Cut Algorithm

Lecture Handout

An example randomized algorithm

Start out with a connected, undirected, loop-free multigraph $G$ with $n$ vertices, where there can be multiple edges between any pair of vertices. For example:

G1 = [[a,b], [a,c], [a,d], [b,e], [b,d], [c,d], [c,d], [c,f], [d,e], [d,f], [e,f]]
G2 = [[a,b], [a,c], [a,d], [b,d], [c,d]]

A cut in the multigraph $G = (V, E)$ is a partition of the vertex set into two disjoint nonempty sets $V_1, V_2$, where $V = V_1 \cup V_2$. The size of the cut is the number of edges in $E$ that cross the cut, i.e. that have one endpoint in $V_1$ and the other in $V_2$.
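
A small Python helper (not from the handout; the name is my own) that computes the size of a cut from an edge list in the same format as G1 and G2:

def cut_size(edges, side):
    """Count edges with exactly one endpoint in `side` (the set V1)."""
    return sum(1 for u, v in edges if (u in side) != (v in side))

G1 = [['a','b'], ['a','c'], ['a','d'], ['b','e'], ['b','d'],
      ['c','d'], ['c','d'], ['c','f'], ['d','e'], ['d','f'], ['e','f']]
print(cut_size(G1, {'a', 'b', 'c'}))   # the cut ({a,b,c}, {d,e,f}) of G1 has size 6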

If $e = (u, v)$ is an edge of $G$, then $G/e$ is the multigraph obtained from $G$ by contracting the edge $e$: identify $u$ and $v$, replacing them with a single new node that inherits all edges incident to $u$ or $v$, and remove any resulting self-loops. For example:

G1/[c,d] = [[a,b], [a,cd], [a,cd], [b,e], [b,cd], [cd,f], [cd,f], [cd,e], [e,f]]
G2/[a,d] = [[ad,b], [ad,b], [ad,c], [ad,c]]
G2/[a,d]/[ad,b] = [[abd,c],[abd,c]]
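
One way to implement the contraction directly on these edge lists (a sketch; the function name is my own):

def contract(edges, u, v):
    """Contract the edge (u, v): relabel u and v as a single merged node,
    dropping the edges between them, which would become self-loops."""
    merged = u + v                        # e.g. contracting 'a' and 'd' yields 'ad'
    new_edges = []
    for a, b in edges:
        a = merged if a in (u, v) else a
        b = merged if b in (u, v) else b
        if a != b:                        # discard self-loops
            new_edges.append([a, b])
    return new_edges

G2 = [['a','b'], ['a','c'], ['a','d'], ['b','d'], ['c','d']]
print(contract(G2, 'a', 'd'))             # same multiset of edges as G2/[a,d] above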

Contract randomly chosen edges, one at a time, until only two vertices remain; the number of edges left between them is the size of a cut of the original graph. A single run can return a cut larger than the minimum, but repeating the algorithm many times and returning the smallest cut found over all runs makes the probability of a wrong answer small.
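
Putting this together, a self-contained Python sketch of the repeated random-contraction algorithm (usually credited to Karger); contract here is equivalent to the helper above:

import random

def contract(edges, u, v):
    """Merge u and v into one node and drop the edges between them (self-loops)."""
    merged = u + v
    return [[merged if x in (u, v) else x for x in e]
            for e in edges if {e[0], e[1]} != {u, v}]

def contract_until_two(edges):
    """Randomly contract edges until only two super-nodes remain; the
    surviving edges form a cut of the original graph."""
    while len({x for e in edges for x in e}) > 2:
        u, v = random.choice(edges)
        edges = contract(edges, u, v)
    return edges

def min_cut(edges, trials=100):
    """Repeat the contraction experiment and keep the smallest cut found."""
    return min(len(contract_until_two(edges)) for _ in range(trials))

G1 = [['a','b'], ['a','c'], ['a','d'], ['b','e'], ['b','d'],
      ['c','d'], ['c','d'], ['c','f'], ['d','e'], ['d','f'], ['e','f']]
print(min_cut(G1))   # G1's minimum cut has size 3 (e.g. the three edges at a), so this usually prints 3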