Wednesday, April 4, 2012
Linear Regression
Given inputs $x$ and outputs $y$, you think $y = f(x)$ for some function $f$, but know nothing about $f$ (this is where statistics comes in).
Observe a finite collection of data points $(x_1, y_1), \dots, (x_n, y_n)$.
The data is assumed to be "noisy" (not 100% true). We want to reconstruct $f$.
In general, we need two things:
- A class of functions
- A measure to be minimized
Linear Regression
Assume that our class is linear (i.e. $f(x) = m x + b$). We want the best choice for $m$ and $b$: least-squares linear regression.
Measure the deviation from the data with the sum of squared errors $E(m, b) = \sum_{i=1}^n \bigl(y_i - (m x_i + b)\bigr)^2$, a function that depends on $m$, $b$, and the data. Our goal is to minimize this function.
Calculate its gradient, set it equal to $0$, and solve for $m$ and $b$.
"this is on Wikipedia"
Quadratic Regression
Least-squares quadratic regression: assume $f(x) = a x^2 + b x + c$ and minimize $E(a, b, c) = \sum_{i=1}^n \bigl(y_i - (a x_i^2 + b x_i + c)\bigr)^2$.
Let $z = x^2$, so the model $a z + b x + c$ is linear in its coefficients and the same procedure applies: take the gradient of $E$ with respect to $a$, $b$, and $c$, set it to $0$, and solve the resulting linear system.
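A minimal numerical sketch of both fits, assuming NumPy; the sample data and variable names (`xs`, `ys`) are illustrative, not from the lecture:

```python
import numpy as np

# Noisy sample data (illustrative values only)
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
ys = np.array([1.1, 2.9, 5.2, 6.8, 9.1, 11.0])

# Least-squares linear fit: minimize sum (y_i - (m*x_i + b))^2
m, b = np.polyfit(xs, ys, deg=1)

# Least-squares quadratic fit: minimize sum (y_i - (a*x_i^2 + b*x_i + c))^2
a2, b2, c2 = np.polyfit(xs, ys, deg=2)

# Equivalent linear-algebra view of the linear fit: columns of the
# design matrix are [x, 1], and lstsq solves the normal equations.
A = np.column_stack([xs, np.ones_like(xs)])
(m_ls, b_ls), *_ = np.linalg.lstsq(A, ys, rcond=None)

print(m, b)        # slope and intercept from polyfit
print(m_ls, b_ls)  # same values from the normal-equation formulation
```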
Penalty Coefficient
Sometimes the data looks like it fits a certain form, but could be something else. Introduce a penalty coefficient to check the "fitness" of the model.
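One common way to make this concrete is to add a penalty term, weighted by a coefficient $\lambda \ge 0$, to the least-squares error so that extra model complexity must pay for itself; penalizing the quadratic model's coefficients is one illustrative choice:

$$
E_\lambda(a, b, c) = \sum_{i=1}^n \bigl(y_i - (a x_i^2 + b x_i + c)\bigr)^2 + \lambda\,\bigl(a^2 + b^2 + c^2\bigr).
$$

Larger $\lambda$ favors simpler fits, while $\lambda = 0$ recovers ordinary least squares.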
Line Integrals
Given a curve $C$ and a vector field $\vec{F}$, the line integral $\int_C \vec{F} \cdot d\vec{r}$ represents how much of the vector field points in the direction of the curve $C$, like Work in physics.
Written: calculate $\int_C \vec{F} \cdot d\vec{r}$ (vector field notation) or $\int_C P\,dx + Q\,dy$ (differential form notation), where $\vec{F} = (P, Q)$.
Example: $C$ represents the unit circle traversed counter-clockwise, and $\vec{F} = (P, Q)$ is a given vector field.
- Break $C$ into separate curve segments (if necessary)
- Parameterize each segment: here $x(t) = \cos t$, $y(t) = \sin t$, $0 \le t \le 2\pi$
- Determine the meaning of the differential: $d\vec{r} = \vec{r}\,'(t)\,dt = (-\sin t, \cos t)\,dt$
- Plug the vectors into the dot product: $\vec{F}(\vec{r}(t)) \cdot \vec{r}\,'(t)$
- Set up the integral with the internal components (use the chain rule): $\int_0^{2\pi} \vec{F}(\cos t, \sin t) \cdot (-\sin t, \cos t)\,dt$
- Evaluate the resulting single-variable integral (worked sketch below)
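As an illustrative choice (not necessarily the field intended in the example above), take $\vec{F}(x, y) = (-y, x)$. Then

$$
\int_C \vec{F} \cdot d\vec{r}
= \int_0^{2\pi} (-\sin t, \cos t) \cdot (-\sin t, \cos t)\,dt
= \int_0^{2\pi} \bigl(\sin^2 t + \cos^2 t\bigr)\,dt
= 2\pi.
$$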
Example
$C$ represents the triangle formed by $(0,0)$, $(1,0)$, and $(1,1)$. Set up an integral for each line segment. (Generally move counter-clockwise.)
Parameterization for each segment $C_i$, each with $0 \le t \le 1$ (the combined integral is assembled below):
- $C_1$ from $(0,0)$ to $(1,0)$: $x(t) = t$, $y(t) = 0$
- $C_2$ from $(1,0)$ to $(1,1)$: $x(t) = 1$, $y(t) = t$
- $C_3$ from $(1,1)$ to $(0,0)$: $x(t) = 1 - t$, $y(t) = 1 - t$
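With those pieces (writing $\vec{F} = (P, Q)$ generically), the line integral is the sum over the three segments:

$$
\int_C \vec{F} \cdot d\vec{r}
= \sum_{i=1}^{3} \int_0^1 \vec{F}\bigl(x_i(t), y_i(t)\bigr) \cdot \bigl(x_i'(t), y_i'(t)\bigr)\,dt
= \int_0^1 P(t, 0)\,dt + \int_0^1 Q(1, t)\,dt - \int_0^1 \bigl[P(1-t, 1-t) + Q(1-t, 1-t)\bigr]\,dt.
$$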
Example
Let $\vec{F}$ represent a force field (like gravity). If the force is constant, then $W = \vec{F} \cdot \vec{d}$, the force dotted with the displacement. If the force changes at each point, use the line integral $W = \int_C \vec{F} \cdot d\vec{r}$ to get a continuous sum of the work done at each point.
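For instance, with a constant gravitational force $\vec{F} = (0, -mg)$ (an illustrative choice) and any path $C$ from a point at height $y_0$ to a point at height $y_1$:

$$
W = \int_C \vec{F} \cdot d\vec{r} = \int_C (0, -mg) \cdot (dx, dy) = -mg \int_C dy = -mg\,(y_1 - y_0),
$$

so the work depends only on the change in height, consistent with the constant-force formula $W = \vec{F} \cdot \vec{d}$.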