
Computational Methods in Optimization

David Gleich

Purdue University

Spring 2023

Course number CS-52000

Monday, Wednesday, Friday, 1:30-2:20pm

Location Lawson 1106


Homework 3

Please answer the following questions in complete sentences in a clearly prepared manuscript and submit the solution on Gradescope by the due date.

Remember that this is a graduate class. There may be elements of the problem statements that require you to fill in appropriate assumptions. You are also responsible for determining what evidence to include. An answer alone is rarely sufficient, but neither is an overly verbose description required. Use your judgement to focus your discussion on the most interesting pieces. The answer to "should I include 'something' in my solution?" will almost always be: Yes, if you think it helps support your answer.

Problem 0: Homework checklist

Problem 1: Convexity and least squares

  1. Show that $f(\mathbf{x}) = \|\mathbf{A}\mathbf{x} - \mathbf{b}\|_2^2$ is a convex function. Feel free to use the result proved on the last homework.

  2. Show that the null-space of a matrix $\mathbf{A}$ is a convex set. (A convex set satisfies the condition that, for every pair of points in the set, any point on the line segment joining those points is also in the set.) A numerical sanity check of both parts is sketched after this list.
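
The following is a minimal numerical spot check of both claims, not a proof. The matrices and vectors are random stand-ins, and `scipy.linalg.null_space` supplies a basis for a null space to sample from.

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(0)

# Random stand-ins for A and b; a spot check, not a proof.
A = rng.standard_normal((10, 4))
b = rng.standard_normal(10)

def f(x):
    return np.linalg.norm(A @ x - b) ** 2

# Part 1: convexity means f(t x + (1-t) y) <= t f(x) + (1-t) f(y).
x, y = rng.standard_normal(4), rng.standard_normal(4)
for t in np.linspace(0.0, 1.0, 11):
    assert f(t * x + (1 - t) * y) <= t * f(x) + (1 - t) * f(y) + 1e-10

# Part 2: a convex combination of null-space vectors stays in the null space.
M = rng.standard_normal((3, 5))   # a wide matrix with a 2-d null space
N = null_space(M)                 # columns form a basis for null(M)
u, v = N[:, 0], N[:, 1]
z = 0.3 * u + 0.7 * v             # a point on the segment between u and v
print(np.linalg.norm(M @ z))      # ~1e-16: z lies (numerically) in null(M)
```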

Problem 2: Ridge Regression

The Ridge Regression problem is a variant of least squares:
$$\operatorname*{minimize}_{\mathbf{x}} \ \|\mathbf{A}\mathbf{x} - \mathbf{b}\|_2^2 + \lambda \|\mathbf{x}\|_2^2.$$
This is also known as Tikhonov regularization.

  1. Show that this problem always has a unique solution for any $\mathbf{A}$ and $\mathbf{b}$ if $\lambda > 0$, using the theory discussed in class so far.

    This property is one aspect of the reason that Ridge Regression is used. It is also a common regularization method that can help avoid overfitting in a regression problem.

  2. Use the SVD of $\mathbf{A}$ to characterize the solution as a function of $\lambda$. (A numerical sketch of this characterization appears after this list.)

  3. What is the solution when $\lambda \to 0$?
    What is the solution when $\lambda \to \infty$?

  4. Give the solutions of the least squares problem using the sports teams data for a small and a large value of $\lambda$. (Here, you will not need the constraint that the sum of entries is 1.) Are there any notable differences from the ranking given in class?

  5. Suppose that you only want to regularize one component of the solution, say $x_n$, so that your optimization problem is
    $$\operatorname*{minimize}_{\mathbf{x}} \ \|\mathbf{A}\mathbf{x} - \mathbf{b}\|_2^2 + \lambda x_n^2.$$
    Show how to adapt your techniques in this problem to accomplish this goal.
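
As a companion to parts 2 and 3, here is a small sketch of the SVD characterization and its limits. The data are random stand-ins (not the sports teams data), and the formulation is the standard ridge problem stated above.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 3))   # random stand-in, not the sports data
b = rng.standard_normal(8)

def ridge_svd(A, b, lam):
    # With the thin SVD A = U diag(s) V^T, the ridge solution is
    # x(lam) = V diag(s / (s^2 + lam)) U^T b.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt.T @ ((s / (s**2 + lam)) * (U.T @ b))

# lam -> 0 recovers the ordinary least-squares solution;
# lam -> infinity shrinks the solution toward zero.
x_ls = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.linalg.norm(ridge_svd(A, b, 1e-12) - x_ls))   # ~0
print(np.linalg.norm(ridge_svd(A, b, 1e12)))           # ~0
```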

Problem 3: Thinking about constraints

Find the minimizer of $f(\mathbf{x})$ where we also impose one linear equality constraint (that is, we have added one linear constraint to the unconstrained problem) and justify, as concisely and correctly as you can, that you have found a solution. Do the terms local and global minimizer apply? If so, use one of them to describe your solution. Furthermore, evaluate the gradient of $f$ at your solution. What is notable about the gradient at your solution in comparison with the solution where we have no constraints? (A worked sketch with a stand-in objective and constraint follows.)
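
Because the specific objective and constraint come from the problem statement, this sketch uses hypothetical stand-ins: $f(\mathbf{x}) = \|\mathbf{x} - \mathbf{p}\|_2^2$ subject to $\mathbf{c}^T\mathbf{x} = d$. The pattern is the point: the Lagrangian stationarity conditions give one linear (KKT) system, and at the solution the gradient of $f$ is a multiple of the constraint normal rather than zero.

```python
import numpy as np

# Hypothetical stand-ins: f(x) = ||x - p||^2 subject to c^T x = d.
p = np.array([1.0, 2.0, 3.0])
c = np.array([1.0, 1.0, 1.0])
d = 1.0

# Lagrangian stationarity: 2(x - p) + mu*c = 0, together with c^T x = d,
# forms a single linear (KKT) system in (x, mu).
n = p.size
K = np.block([[2.0 * np.eye(n), c[:, None]],
              [c[None, :], np.zeros((1, 1))]])
rhs = np.concatenate([2.0 * p, [d]])
sol = np.linalg.solve(K, rhs)
x, mu = sol[:n], sol[n]

# grad f(x) = 2(x - p) is a multiple of c at the solution: orthogonal to
# the constraint set, not zero as in the unconstrained case.
print(x, mu)
print(2.0 * (x - p) + mu * c)   # ~0: stationarity of the Lagrangian
```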

Problem 4: Alternate formulations of Least Squares

Consider the constrained least squares problem:
$$\operatorname*{minimize}_{\mathbf{x}, \mathbf{r}} \ \|\mathbf{r}\|_2^2 \quad \text{subject to} \quad \mathbf{A}\mathbf{x} + \mathbf{r} = \mathbf{b},$$
where $\mathbf{A}$ is $m \times n$ with $m \ge n$ and $\operatorname{rank}(\mathbf{A}) = n$.

  1. Convert this problem into the standard constrained least squares form.

  2. Form the augmented system from the Lagrangian as we did in class.

  3. Manipulate this problem to arrive at the normal equations for a least-squares problem: $\mathbf{A}^T\mathbf{A}\mathbf{x} = \mathbf{A}^T\mathbf{b}$.
    Discuss any advantages of the systems at intermediate steps. (A numerical sketch of this reduction appears after this list.)
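
Here is a small numerical sketch, under the formulation assumed above (minimize $\|\mathbf{r}\|_2^2$ subject to $\mathbf{A}\mathbf{x} + \mathbf{r} = \mathbf{b}$, with random stand-in data), of the augmented system from the Lagrangian and its reduction to the normal equations.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 6, 3
A = rng.standard_normal((m, n))   # random stand-in, assumed full rank
b = rng.standard_normal(m)

# Augmented (KKT) system for: minimize ||r||^2 subject to A x + r = b.
# Stationarity of the Lagrangian makes the multiplier a multiple of r
# and leaves  [[I, A], [A^T, 0]] [r; x] = [b; 0].
K = np.block([[np.eye(m), A],
              [A.T, np.zeros((n, n))]])
rx = np.linalg.solve(K, np.concatenate([b, np.zeros(n)]))
r, x = rx[:m], rx[m:]

# Eliminating r = b - A x from the second block row recovers the
# normal equations A^T A x = A^T b.
x_ne = np.linalg.solve(A.T @ A, A.T @ b)
print(np.linalg.norm(x - x_ne))   # ~0
```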