       Next: Lecture 14 - data Up: No Title Previous: Lecture 12 - introduction

# Lecture 13 - dynamic programming applications

16.3-5 Give an algorithm to find the longest monotonically increasing subsequence of a sequence of n numbers.

Build an example first: (5, 2, 8, 7, 3, 1, 6, 4)

Ask yourself: what would you like to know about the first n-1 elements that would tell you the answer for the entire sequence?

1. The length of the longest increasing sequence in s_1, s_2, ..., s_{n-1}. (seems obvious)
2. The length of the longest increasing sequence that s_n will extend! (not as obvious - this is the idea!)

Let L_i be the length of the longest increasing sequence ending with the ith element:

    sequence  5  2  8  7  3  1  6  4
    L_i       1  1  2  2  2  1  3  3
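This table of L_i values can be filled in left to right by checking every earlier, smaller element. A minimal Python sketch (the function name `lis_lengths` is mine):

```python
def lis_lengths(seq):
    """L[i] = length of the longest increasing subsequence ending at seq[i]."""
    n = len(seq)
    L = [1] * n                      # every element alone is a sequence of length 1
    for i in range(n):
        for j in range(i):
            # seq[i] can extend any increasing sequence ending in a smaller value
            if seq[j] < seq[i] and L[j] + 1 > L[i]:
                L[i] = L[j] + 1
    return L

print(lis_lengths([5, 2, 8, 7, 3, 1, 6, 4]))  # [1, 1, 2, 2, 2, 1, 3, 3]
```

The two nested loops give the O(n^2) running time the exercise asks for.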

How do we compute L_i? L_i = 1 + max{ L_j : j < i and s_j < s_i }, with L_i = 1 when no such j exists. To find the longest sequence, we know it ends somewhere, so Length = max_{1 <= i <= n} L_i.

The Principle of Optimality

To use dynamic programming, the problem must observe the principle of optimality: whatever the initial state is, the remaining decisions must be optimal with regard to the state resulting from the first decision.

Combinatorial problems may have this property but may use too much memory/time to be efficient.

Example: The Traveling Salesman Problem

Let C(i, S) be the cost of the optimal tour from i to 1 that goes through each city in the set S exactly once. Then C(i, S) = min_{j in S} [ d(i, j) + C(j, S - {j}) ], with C(i, {}) = d(i, 1). Here the subproblems are indexed by an arbitrary subset S of the cities instead of a subinterval - hence exponentially many partial results.
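This subset-based recurrence is the Held-Karp algorithm. A sketch in Python, assuming a distance matrix `dist` with cities numbered 0..n-1 and city 0 as the start:

```python
from itertools import combinations

def held_karp(dist):
    """Exact TSP cost by dynamic programming over subsets (Held-Karp).
    C[(S, i)] = cost of the cheapest path that starts at city 0, visits
    every city in frozenset S, and ends at city i (i in S)."""
    n = len(dist)
    C = {(frozenset([i]), i): dist[0][i] for i in range(1, n)}
    for size in range(2, n):
        for S in combinations(range(1, n), size):
            Sf = frozenset(S)
            for i in S:
                # extend the best path over S - {i} by the final leg j -> i
                C[(Sf, i)] = min(C[(Sf - {i}, j)] + dist[j][i]
                                 for j in S if j != i)
    full = frozenset(range(1, n))
    return min(C[(full, i)] + dist[i][0] for i in range(1, n))

dist = [[0, 1, 15, 6],
        [2, 0, 7, 3],
        [9, 6, 0, 12],
        [10, 4, 8, 0]]
print(held_karp(dist))  # 21
```

The table holds O(n * 2^n) entries, so this is exponential, but far better than trying all n! tours.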

Still, with other ideas (some type of pruning or best-first search) it can be effective for combinatorial search.

When can you use Dynamic Programming?

Dynamic programming computes recurrences efficiently by storing partial results. Thus dynamic programming can only be efficient when there are not too many partial results to compute!
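To illustrate "storing partial results" (the example is mine, not from the lecture): the binomial coefficient recurrence has only O(n^2) distinct subproblems, so memoizing it turns an exponential recursion into a fast one.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def binom(n, k):
    """C(n, k) via the recurrence C(n, k) = C(n-1, k-1) + C(n-1, k).
    The cache stores each (n, k) pair once - only O(n^2) partial results."""
    if k == 0 or k == n:
        return 1
    return binom(n - 1, k - 1) + binom(n - 1, k)

print(binom(10, 5))  # 252
```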

There are n! permutations of an n-element set, so we cannot use dynamic programming to store the best solution for each subpermutation. There are 2^n subsets of an n-element set, so we cannot use dynamic programming to store the best solution for each subset.

However, there are only n(n-1)/2 contiguous substrings of a string, each described by a starting and ending point, so we can use it for string problems.

There are only n(n-1)/2 possible subtrees of a binary search tree, each described by a maximum and minimum key, so we can use it for optimizing binary search trees.

Dynamic programming works best on objects which are linearly ordered and cannot be rearranged - characters in a string, matrices in a chain, points around the boundary of a polygon, the left-to-right order of leaves in a search tree.

Whenever your objects are ordered in a left-to-right way, you should smell dynamic programming!

Minimum Length Triangulation

A triangulation of a polygon is a set of non-intersecting diagonals which partitions the polygon into triangles. The length of a triangulation is the sum of the diagonal lengths.

We seek to find the minimum length triangulation of a convex polygon (or part thereof). Once we identify the correct connecting vertex k, the polygon is partitioned into two smaller pieces, both of which must be triangulated optimally! Letting t[i, j] be the cost of triangulating the sub-polygon from vertex i to vertex j, t[i, j] = min_{i < k < j} ( t[i, k] + t[k, j] + d(v_i, v_k) + d(v_k, v_j) ), where d is the diagonal length (zero for polygon sides). Evaluation proceeds as in the matrix multiplication example - O(n^2) values of t, each of which takes O(j-i) time if we evaluate the sections in order of increasing size. What if there are points in the interior of the polygon?
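This recurrence can be sketched in Python for a convex polygon given as a list of vertex coordinates (the helper names are mine):

```python
from math import dist as euclid  # Python 3.8+

def diag(pts, a, b):
    """Length of chord (a, b); polygon sides (adjacent vertices,
    including the wrap-around edge) cost nothing."""
    n = len(pts)
    if abs(a - b) <= 1 or abs(a - b) == n - 1:
        return 0.0
    return euclid(pts[a], pts[b])

def min_triangulation(pts):
    """Minimum total diagonal length to triangulate a convex polygon.
    t[i][j] = cheapest triangulation of the sub-polygon v_i ... v_j."""
    n = len(pts)
    t = [[0.0] * n for _ in range(n)]
    for gap in range(2, n):              # sections in order of increasing size
        for i in range(n - gap):
            j = i + gap
            t[i][j] = min(t[i][k] + t[k][j]
                          + diag(pts, i, k) + diag(pts, k, j)
                          for k in range(i + 1, j))
    return t[0][n - 1]

print(min_triangulation([(0, 0), (1, 0), (1, 1), (0, 1)]))  # one diagonal: sqrt(2)
```

For the unit square the only choice is one of the two diagonals, so the answer is sqrt(2).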

Dynamic Programming and High Density Bar Codes

Symbol Technology has developed a new design for bar codes, PDF-417, which has a capacity of several hundred bytes. What is the best way to encode text for this design? They developed a complicated mode-switching data compression scheme. Latch commands permanently put you in a different mode. Shift commands temporarily put you in a different mode.

Originally, Symbol used a greedy algorithm to encode a string, making local decisions only. We realized that for any prefix, you want an optimal encoding that might leave you in each possible mode. Let M[i, j] be the cost of encoding the first i characters and ending up in mode j.
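A hedged sketch of this idea, with entirely hypothetical cost functions (the real PDF-417 mode tables and latch/shift costs are not given here). For each prefix we keep the cheapest cost ending in every mode, assuming encoding starts in the first listed mode:

```python
def best_encoding(text, modes, char_cost, switch_cost):
    """M[m] = cheapest cost of encoding the prefix seen so far, ending in mode m.
    char_cost(c, m): hypothetical cost of emitting character c in mode m
    (infinite if c is unavailable in m).
    switch_cost(p, m): hypothetical cost of switching from mode p to m
    (0 when p == m)."""
    INF = float("inf")
    M = {m: (0 if m == modes[0] else INF) for m in modes}  # assumed start mode
    for c in text:
        # best way to end in mode m: come from any previous mode p
        M = {m: min(M[p] + switch_cost(p, m) + char_cost(c, m)
                    for p in modes)
             for m in modes}
    return min(M.values())

# Toy example: letters cost 1 in "alpha" mode, digits cost 1 in "digit"
# mode, switching modes costs 2.
INF = float("inf")
modes = ["alpha", "digit"]
char_cost = lambda c, m: (1 if m == "alpha" else INF) if c.isalpha() \
                         else (1 if m == "digit" else INF)
switch_cost = lambda p, m: 0 if p == m else 2
print(best_encoding("AB12", modes, char_cost, switch_cost))  # 1+1+2+1+1 = 6
```

Unlike the greedy encoder, this considers every mode a prefix could end in, so a locally expensive switch can pay off later.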

Our simple dynamic programming algorithm improved the capacity of PDF-417 by a substantial margin on average!

Dynamic Programming and Morphing

Morphing is the problem of creating a smooth series of intermediate images given a starting and ending image.

The key problem is establishing a correspondence between features in the two images. You want to morph an eye to an eye, not an ear to an ear.

We can do this matching on a line-by-line basis. This should sound like string matching, but with a different set of operations:

• Full run match: We may match run i on top to run j on bottom for a cost which is a function of the difference in the lengths of the two runs and their positions.
• Merging runs: We may match a string of consecutive runs on top to a run on bottom. The cost will be a function of the number of runs, their relative positions, and lengths.
• Splitting runs: We may match a big run on top to a string of consecutive runs on the bottom. This is just the converse of the merge. Again, the cost will be a function of the number of runs, their relative positions, and lengths.
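The three operations above fit an edit-distance-style recurrence. A sketch under stated assumptions: rows are given as lists of run lengths, and `cost(a, b)` is a hypothetical user-supplied function charging for matching a group of top runs `a` against a group of bottom runs `b` (a full match, merge, or split depending on the group sizes):

```python
def match_runs(top, bottom, cost):
    """D[i][j] = cheapest matching of the first i top runs
    to the first j bottom runs."""
    INF = float("inf")
    m, n = len(top), len(bottom)
    D = [[INF] * (n + 1) for _ in range(m + 1)]
    D[0][0] = 0
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            # full run match: top run i against bottom run j
            best = D[i - 1][j - 1] + cost(top[i - 1:i], bottom[j - 1:j])
            # merge: k consecutive top runs against one bottom run
            for k in range(2, i + 1):
                best = min(best, D[i - k][j - 1]
                           + cost(top[i - k:i], bottom[j - 1:j]))
            # split: one top run against k consecutive bottom runs
            for k in range(2, j + 1):
                best = min(best, D[i - 1][j - k]
                           + cost(top[i - 1:i], bottom[j - k:j]))
            D[i][j] = best
    return D[m][n]

# Toy cost: difference in total length of the matched groups.
print(match_runs([4, 2], [6], lambda a, b: abs(sum(a) - sum(b))))  # 0 (merge)
```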

This algorithm was incorporated into a morphing system.

Algorithms
Mon Jun 2 09:21:39 EDT 1997