Numerical Gradient Descent in MATLAB
Gradient descent is an iterative optimization algorithm for finding the minimum of a function. The gradient of a function points in the direction of steepest ascent, so by moving in the opposite direction, the direction of steepest descent, we hope to reach a local minimum of the function. Conceptually, it is like finding the lowest point in a valley by taking steps proportional to the slope of the hill at your current position. Below is a simple implementation in MATLAB.
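To see the core update rule x_new = x - alpha * f'(x) in isolation, here is a minimal one-dimensional sketch; the quadratic f(x) = (x - 3)^2, the learning rate, and the starting point are illustrative choices only, not part of the implementation below.

% Minimal 1-D gradient descent on f(x) = (x - 3)^2, whose minimum is at x = 3.
% The function, learning rate, and starting point are illustrative only.
df = @(x) 2 * (x - 3);    % derivative of f
x = 0;                    % starting point
alpha = 0.1;              % learning rate
for k = 1:100
    x = x - alpha * df(x);   % step against the gradient
end
disp(x)                   % approaches 3

The full example below applies the same update to a two-dimensional test function, the Rosenbrock function, and draws each step on a surface plot.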
function GradientDescent()
    % Define the function and its gradient
    syms x y
    f = @(x, y) (1 - x).^2 + 100 * (y - x.^2).^2; % Rosenbrock function (element-wise, so it also evaluates on grids)
    grad_f = matlabFunction(gradient(f(x, y), [x, y])); % Gradient as a numeric function of (x, y)
    % Initialize the parameters
    alpha = 0.001;     % Learning rate
    tolerance = 1e-6;  % Stopping criterion on the step size
    max_iter = 10000;  % Maximum number of iterations
    x = [-1; -1];      % Starting point (this numeric vector replaces the symbolic x, which is no longer needed)
    % Create the grid for the 3D plot
    [X, Y] = meshgrid(-2:0.1:2, -2:0.1:2);
    Z = f(X, Y);

    % Plot the function
    figure;
    surf(X, Y, Z);
    hold on;
    title('Gradient Descent');
    % Perform gradient descent
    for i = 1:max_iter
        % Evaluate the gradient at the current point
        g = grad_f(x(1), x(2));

        % Take a step in the direction of steepest descent
        x_new = x - alpha * g;

        % Stop once the step size falls below the tolerance
        if norm(x_new - x) < tolerance
            x = x_new;
            break;
        end

        % Plot the step on the surface
        Z_new = f(x_new(1), x_new(2));
        plot3([x(1), x_new(1)], [x(2), x_new(2)], [f(x(1), x(2)), Z_new], 'r-', 'LineWidth', 2);
        plot3(x_new(1), x_new(2), Z_new, 'ro', 'MarkerFaceColor', 'r', 'MarkerSize', 6);

        % Move to the new point
        x = x_new;
    end
    hold off;

    % Report where the iteration stopped
    fprintf('Stopped at (%.4f, %.4f) after %d iterations\n', x(1), x(2), i);
end
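The implementation above uses the Symbolic Math Toolbox (syms, gradient, matlabFunction) only to build grad_f. If that toolbox is not available, the gradient of the Rosenbrock function can be written out by hand instead; a minimal sketch of that replacement, assuming the same element-wise f as above:

% Hand-coded gradient of the Rosenbrock function (no Symbolic Math Toolbox needed).
% These two lines replace the syms / gradient / matlabFunction setup above.
f = @(x, y) (1 - x).^2 + 100 * (y - x.^2).^2;
grad_f = @(x, y) [-2 * (1 - x) - 400 * x .* (y - x.^2); ...
                  200 * (y - x.^2)];

The Rosenbrock function has its global minimum at (1, 1), where f(1, 1) = 0. With a learning rate of 0.001 the iterates move slowly along the function's curved valley, so it is normal for the loop to use many, or even all, of the 10000 iterations before the tolerance is met.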