This tutorial presents a brief survey of semidifferentials in the familiar context of finite-dimensional Euclidean space. This restriction exposes the most critical ideas, important connections to convexity and optimization, and a few novel applications. The text delves more deeply into these topics and is highly recommended for a systematic course and self-study. The main focus of this tutorial is on Hadamard semidifferentials. The Hadamard semidifferential is more general than the Fréchet differential, which is now dominant in undergraduate mathematics education. By slightly changing the definition of the forward directional derivative, the Hadamard semidifferential rescues the chain rule, enforces continuity, and permits differentiation across maxima and minima. The Hadamard semidifferential also plays well with convex analysis and naturally extends differentiation to smooth embedded submanifolds, topological vector spaces, and metric spaces of shapes and geometries. The current elementary exposition focuses on the more familiar territory of analysis in Euclidean spaces and applies the semidifferential to some representative problems in optimization and statistics. These include algorithms for proximal gradient descent, steepest descent in matrix completion, and variance components models. This tutorial will be of interest to students in advanced undergraduate and beginning graduate courses in the mathematical sciences.
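For readers who want to see precisely what "slightly changing the definition of the forward directional derivative" means, a standard formulation follows (the notation $df$ and $d_H f$ is chosen here for illustration and is not taken from the tutorial itself). The ordinary forward directional derivative fixes the direction $v$, while the Hadamard semidifferential requires the same limit to hold as the direction $w$ simultaneously converges to $v$:
\[
df(x; v) \;=\; \lim_{t \downarrow 0} \frac{f(x + t v) - f(x)}{t},
\qquad
d_H f(x; v) \;=\; \lim_{\substack{t \downarrow 0 \\ w \to v}} \frac{f(x + t w) - f(x)}{t}.
\]
This stronger limit controls the difference quotient along every curve entering $x$ with tangent direction $v$, which is what rescues the chain rule and forces continuity of $f$ at $x$.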