A First Look at Gradient Descent
In this problem, we’re going to give a computational introduction to two fundamental ideas of modern machine learning:
- Minimizing a function is often a useful thing to do, and
- this can often be done by moving in directions guided by the derivative of the function.
For our first example, we’ll see how minimizing a function by iteratively moving in directions guided by its derivative can be used to solve a familiar math problem.
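Concretely, the iterative scheme referred to here is standard one-dimensional gradient descent: starting from a guess $x_0$ and with a step size $\alpha > 0$, repeat

```latex
x_{k+1} = x_k - \alpha \, f'(x_k)
```

Each step moves downhill: when $f'(x_k) > 0$ the update decreases $x$, and when $f'(x_k) < 0$ it increases $x$.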
Let
Part A
A critical point of a differentiable function $f$ is a point $x_0$ at which $f'(x_0) = 0$. Find the critical points of $f$.
Part B
Use the second derivative test to show that $f$ has a local minimum at the relevant critical point.
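As a reminder, the second derivative test (a standard calculus fact, not specific to this assignment) reads:

```latex
f'(x_0) = 0 \quad \text{and} \quad f''(x_0) > 0
\;\Longrightarrow\; x_0 \text{ is a local minimum of } f.
```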
Part C
Implement the following algorithm as a Python function:

Inputs: $a$, epsilon, alpha, and max_steps.

- Start with an initial guess $x_0$.
- Set $i = 0$.
- While $i <$ max_steps:
    - If $|f'(x_i)| <$ epsilon, break.
    - Otherwise, set $x_{i+1} = x_i - \alpha f'(x_i)$ and increment $i$.
- Return the current value of $x_i$.

You should use the formula for $f'$ that you derived in Part A.
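The loop above can be sketched as follows. Note that the definition of $f$ was lost from this copy of the assignment; the sketch below assumes, purely for illustration, the objective $f(x) = x^3/3 - ax$, whose derivative $f'(x) = x^2 - a$ vanishes at $x = \sqrt{a}$ and uses only elementary arithmetic. The default initial guess `x0 = 1.0` and the `max_steps` default are also assumptions, not part of the original text.

```python
def mystery_fun(a, epsilon, alpha, x0=1.0, max_steps=10_000):
    """Approximate sqrt(a) by gradient descent.

    Assumes the objective f(x) = x**3/3 - a*x, so that
    f'(x) = x**2 - a, which is zero exactly when x = sqrt(a).
    (The assignment's actual f was lost from this copy; this
    choice is an illustrative assumption.)
    """
    x = x0
    for _ in range(max_steps):
        grad = x * x - a          # f'(x), computed with only * and -
        if abs(grad) < epsilon:   # close enough to a critical point
            break
        x = x - alpha * grad      # gradient descent step
    return x
```

With this choice of $f$, the call `mystery_fun(a = 9, epsilon = 1e-8, alpha = 0.2)` converges to 3 in a handful of steps, while step sizes larger than about $1/\sqrt{a}$ make the iterates oscillate or diverge, which is one way to produce the failing setting that Part C asks for.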
Test your function like this:

```python
mystery_fun(a = 9, epsilon = 1e-8, alpha = 0.2)
```
Please show:

- One setting of the inputs for which your function returns a real number very close to the exact value of $\sqrt{a}$.
- One setting of the inputs for which your function fails to return a real number close to the exact value of $\sqrt{a}$ within the maximum number of steps.
Part D
Is it possible to compute the positive square root of a positive real number using only the operations of addition, subtraction, multiplication, and division?
© Phil Chodrow, 2025