CSCE 411 Lecture 11
Wednesday, September 19, 2012
Dynamic Programming
Matrix Chain Algorithm
Multiplying a p × q matrix with a q × r matrix: the naïve algorithm takes p·q·r multiplications and p·(q − 1)·r additions.
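A direct triple-loop multiply makes this count concrete (a minimal sketch, not from the lecture; the helper name multiply_count and the nested-array representation are my own):

```ruby
# Multiply a p × q matrix by a q × r matrix (nested arrays),
# counting the scalar multiplications performed along the way.
def multiply_count(a, b)
  rows, inner, cols = a.length, b.length, b[0].length
  c = Array.new(rows) { Array.new(cols, 0) }
  mults = 0
  (0...rows).each do |i|
    (0...cols).each do |j|
      (0...inner).each do |k|
        c[i][j] += a[i][k] * b[k][j]
        mults += 1               # one multiplication per (i, j, k) triple, so p·q·r in total
      end
    end
  end
  [c, mults]
end
```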
Chained multiplications can add up quickly:
Order Matters
Multiplying the following matrices:
- A : 30 × 1
- B : 1 × 40
- C : 40 × 10
- D : 10 × 25
in the order (AB)(CD) requires 41,200 multiplications, but in the order A((BC)D) requires only 1,400.
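Counting the scalar multiplications order by order (each p × q times q × r product costs p·q·r):
- (AB)(CD): 30·1·40 + 40·10·25 + 30·40·25 = 1,200 + 10,000 + 30,000 = 41,200
- A((BC)D): 1·40·10 + 1·10·25 + 30·1·25 = 400 + 250 + 750 = 1,400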
Matrix Chain Order Problem
Use dynamic programming to find the optimal chaining order that gives the minimum number of operations:
Given matrices A_1, A_2, …, A_n whose sizes are compatible for multiplication, say A_i has dimensions d[i-1] × d[i].
Define M(i, j) to be the minimum number of operations needed to compute A_i · A_{i+1} ⋯ A_j.
Goal: find M(1, n). Basis: M(i, i) = 0 (a single matrix needs no multiplications). For the recursive case (a short top-down sketch in code follows this list):
- Consider all possible ways to split A_i through A_j into two pieces
- Compare the best-case cost for computing each piece (plus the cost of multiplying the two resulting products)
- Take the best split point k with i ≤ k < j, where
  - the minimum cost to compute A_i ⋯ A_k is M(i, k)
  - the minimum cost to compute A_{k+1} ⋯ A_j is M(k+1, j)
  - the cost of computing the product of the two pieces is d[i-1] · d[k] · d[j]
so M(i, j) = min over i ≤ k < j of ( M(i, k) + M(k+1, j) + d[i-1]·d[k]·d[j] ).
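The recurrence can be written directly as a memoized recursive function. This is only a sketch under my own naming (m for the cost function, a Hash for the memo table); the lecture's bottom-up table-filling version appears below.

```ruby
# M(i, j): minimum scalar multiplications to compute A_i * ... * A_j,
# where matrix A_i has size d[i-1] × d[i].
def m(i, j, d, memo = {})
  return 0 if i == j                               # basis: a single matrix
  memo[[i, j]] ||= (i...j).map { |k|               # try every split point k
    m(i, k, d, memo) + m(k + 1, j, d, memo) + d[i - 1] * d[k] * d[j]
  }.min
end

d = [30, 1, 40, 10, 25]   # dimensions of A, B, C, D from the example above
m(1, 4, d)                # => 1400
```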
Find Dependencies
Fill in our table for M(i, j) (shown here for n = 5 matrices; only entries with i ≤ j are meaningful):

|   | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| 1 | 0 |   |   |   | (goal) |
| 2 | n/a | 0 |   |   |   |
| 3 | n/a | n/a | 0 |   |   |
| 4 | n/a | n/a | n/a | 0 |   |
| 5 | n/a | n/a | n/a | n/a | 0 |

Compute cells adjacent to values we already know first: work outward from the main diagonal, one diagonal at a time, until we reach the goal cell M(1, n). (A filled-in table for the A, B, C, D example follows.)
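For the four matrices A, B, C, D above (dimension array d = [30, 1, 40, 10, 25]), the recurrence fills the table as follows; each entry is the minimum number of scalar multiplications:

|   | 1 | 2 | 3 | 4 |
|---|---|---|---|---|
| 1 | 0 | 1,200 | 700 | 1,400 (goal) |
| 2 | n/a | 0 | 400 | 650 |
| 3 | n/a | n/a | 0 | 10,000 |
| 4 | n/a | n/a | n/a | 0 |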
Pseudocode
(1..n).each { |i| M[i][i] = 0 }        # basis: a single matrix costs nothing
(1..n-1).each do |diag|                # diagonals, in order of increasing chain length
  (1..n-diag).each do |i|              # rows with an entry on this diagonal
    j = i + diag                       # column corresponding to row i on this diagonal
    M[i][j] = Float::INFINITY
    (i..j-1).each do |k|               # try every split point (A_i..A_k)(A_{k+1}..A_j)
      M[i][j] = [M[i][j], M[i][k] + M[k+1][j] + d[i-1] * d[k] * d[j]].min
    end
  end
end
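The same loop, packaged as a self-contained Ruby method that can be run as-is (the name matrix_chain_cost and the 1-indexed nested Array for M are my own choices; the indices match the pseudocode above):

```ruby
# d[i-1] × d[i] is the size of matrix A_i, so d has n + 1 entries for n matrices.
def matrix_chain_cost(d)
  n = d.length - 1
  m = Array.new(n + 1) { Array.new(n + 1, 0) }   # m[i][j], 1-indexed; diagonal already 0
  (1..n - 1).each do |diag|
    (1..n - diag).each do |i|
      j = i + diag
      m[i][j] = Float::INFINITY
      (i..j - 1).each do |k|                     # best split (A_i..A_k)(A_{k+1}..A_j)
        cost = m[i][k] + m[k + 1][j] + d[i - 1] * d[k] * d[j]
        m[i][j] = cost if cost < m[i][j]
      end
    end
  end
  m[1][n]
end

puts matrix_chain_cost([30, 1, 40, 10, 25])   # => 1400 for the A, B, C, D example
```

This returns only the minimum cost; to recover the actual parenthesization one would also record the best split point k for each (i, j), which the pseudocode above does not track.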