Finger exercises, to be discussed Wed Oct 2, 2002
-------------------------------------------------

Variations on "golden.c", discussed in class.

1. We do not have to maintain the array y[i] of values of f(x), if we
   are willing to run the occasional risk of re-evaluating f() at the
   same x values.  Modify "golden.c" by wiping out all of the stuff
   related to y[i] and eval[i], and simply decide how to shorten the
   interval from aprime to bprime by comparing the values
   f(aprime + c*(bprime-aprime)) and f(aprime + (1-c)*(bprime-aprime))
   in each iteration of the while loop in main().  (A sketch of one
   possible "almost-golden.c" appears at the end of this handout.)

   (a) You must pick a value of c between 0 and 1 to make the program
       work.  What do you think the optimal value of c would be?  Does
       anything prevent you from using, or coming close to, this value?

   (b) You may (or may not) want to change the stopping condition for
       the while loop as well (e.g. use a relative stopping criterion
       rather than an absolute one, or stop when the y-values stabilize
       rather than when the x-values stabilize).  What do you think
       makes sense here?

   (c) Compare the performance of this "almost-golden.c" program with
       golden.c itself.  (e.g. how many function evaluations does
       "almost-golden.c" require for the same decimal accuracy as
       "golden.c"?)

2. Write a program "bisection.c" that finds the maximum of f(x)
   (unimodal in [a,b]) by finding a root of the derivative
   df(x) = f'(x), based on your experience with "golden.c" and
   "almost-golden.c".  Compare the performance of the three programs.
   (A sketch also appears at the end of this handout.)

3. If you have time, please think about writing programs for some of
   the other univariate methods we are discussing: function iteration,
   Newton's method, regula falsi, and the Illinois method.  (A Newton
   sketch is included at the end as well.)
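
Sketch for exercise 1: a minimal "almost-golden.c".  The test function
f(x) = -(x - 1.7)^2, the bracket [0,4], the fraction C = 0.38, and the
tolerance are placeholders chosen only to make the sketch self-contained;
none of them come from golden.c (not reproduced here), so swap in your
own f(), bracket, c, and stopping test.

    /* almost-golden.c -- sketch for exercise 1.  No y[i] or eval[i]
       arrays: both interior points are re-evaluated in every
       iteration, at a cost of two f() calls per step instead of one. */

    #include <stdio.h>

    #define C   0.38        /* interior-point fraction, 0 < C < 0.5 */
    #define TOL 1.0e-8      /* absolute tolerance on the bracket width */

    /* placeholder unimodal function on [0,4]; replace with the f()
       used in golden.c */
    static double f(double x)
    {
        return -(x - 1.7) * (x - 1.7);
    }

    int main(void)
    {
        double aprime = 0.0, bprime = 4.0;   /* assumed bracket [a,b] */
        int    nevals = 0;

        while (bprime - aprime > TOL) {
            double xlo = aprime + C * (bprime - aprime);
            double xhi = aprime + (1.0 - C) * (bprime - aprime);
            double ylo = f(xlo);
            double yhi = f(xhi);
            nevals += 2;

            /* keep the sub-interval that must contain the maximum */
            if (ylo < yhi)
                aprime = xlo;   /* maximum cannot lie in [aprime, xlo] */
            else
                bprime = xhi;   /* maximum cannot lie in [xhi, bprime] */
        }

        printf("maximum near x = %.10f  (%d function evaluations)\n",
               0.5 * (aprime + bprime), nevals);
        return 0;
    }

One observation for part (a): the closer C gets to 0.5, the faster the
bracket shrinks per iteration (by a factor of 1 - C), but the two
interior points nearly coincide and roundoff makes the comparison of
f(xlo) and f(xhi) unreliable.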
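
Sketch for exercise 2: a minimal "bisection.c", under the assumption
that the derivative df() is available in closed form and changes sign
exactly once in [a,b] (true for a unimodal f with an interior maximum).
The function, bracket, and tolerance are again placeholders.

    /* bisection.c -- sketch for exercise 2.  Bisection on df() halves
       the bracket with one derivative evaluation per iteration. */

    #include <stdio.h>

    #define TOL 1.0e-8

    /* derivative of the placeholder f(x) = -(x - 1.7)^2 */
    static double df(double x)
    {
        return -2.0 * (x - 1.7);
    }

    int main(void)
    {
        double a = 0.0, b = 4.0;   /* assumed bracket: df(a) > 0 > df(b) */
        int    nevals = 0;

        while (b - a > TOL) {
            double mid = 0.5 * (a + b);
            nevals++;

            if (df(mid) > 0.0)
                a = mid;   /* still climbing: maximum lies to the right */
            else
                b = mid;   /* descending (or flat): maximum lies to the left */
        }

        printf("maximum near x = %.10f  (%d derivative evaluations)\n",
               0.5 * (a + b), nevals);
        return 0;
    }

Each bisection step shrinks the bracket by a factor of 2, versus
roughly 0.62 per step for golden-section search, but it needs df()
rather than f() alone; that trade-off is part of the comparison
exercise 2 asks for.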
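
Sketch toward exercise 3: Newton's method applied to the root of the
derivative, x_{k+1} = x_k - f'(x_k) / f''(x_k).  The test function
f(x) = sin(x) - 0.1*x, the starting guess, and the stopping rule are
assumptions made only so the sketch compiles and runs on its own.

    /* newton.c -- sketch toward exercise 3 (Newton's method).  Both
       the first and second derivatives must be supplied; convergence
       is fast but only local, so the starting guess matters. */

    #include <stdio.h>
    #include <math.h>

    #define TOL     1.0e-12
    #define MAXITER 50

    static double df(double x)  { return cos(x) - 0.1; } /* f(x) = sin(x) - 0.1*x */
    static double ddf(double x) { return -sin(x); }

    int main(void)
    {
        double x = 1.0;             /* assumed starting guess */
        double step;
        int    k = 0;

        do {
            step = df(x) / ddf(x);  /* Newton step for the root of df */
            x   -= step;
            k++;
        } while (fabs(step) > TOL && k < MAXITER);

        printf("stationary point near x = %.12f after %d iterations\n",
               x, k);
        return 0;
    }

(Compile with -lm.)  Regula falsi and the Illinois method fit the same
bisection-style loop from exercise 2, with the midpoint replaced by a
secant-based interpolation point.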