# Subderivatives and Differentiating the max Function

-- TOC --

## Basic Concepts

In mathematics, the subderivative, subgradient, and subdifferential generalize the derivative to convex functions that are not necessarily differentiable. Subderivatives arise in convex analysis, the study of convex functions, often in connection with convex optimization. For example, the absolute value function $$f(x)=|x|$$ is not differentiable at $$x=0$$. However, for any $$x_0$$ in the domain of a convex function one can draw a line through the point $$(x_0, f(x_0))$$ that everywhere either touches or lies below the graph of $$f$$. The slope of such a line is called a subderivative (because the line is under the graph of $$f$$).

Consider the function $$f(x)=|x|$$, which is convex. The subdifferential at the origin is the interval $$[-1, 1]$$. The subdifferential at any point $$x_0<0$$ is the singleton set $$\{-1\}$$, while the subdifferential at any point $$x_0>0$$ is the singleton set $$\{1\}$$.
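The case analysis above can be sketched directly in code. This is a minimal illustration (the function name `subdifferential_abs` is my own, not from the text), returning each subdifferential as a closed interval `(lo, hi)`:

```python
def subdifferential_abs(x0):
    """Subdifferential of f(x) = |x| at x0, as a closed interval (lo, hi).

    For x0 < 0 it is the singleton {-1}, for x0 > 0 the singleton {1},
    and at x0 = 0 it is the full interval [-1, 1].
    """
    if x0 < 0:
        return (-1.0, -1.0)
    if x0 > 0:
        return (1.0, 1.0)
    return (-1.0, 1.0)


# Any g in the interval satisfies the subgradient inequality
# |x| >= |x0| + g * (x - x0) for all x.
print(subdifferential_abs(-2.0))  # (-1.0, -1.0)
print(subdifferential_abs(0.0))   # (-1.0, 1.0)
```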

## Differentiating the max Function

The max function is defined as:

$$f(x,y) = \max(x,y)$$

$$\cfrac{\partial f}{\partial x}=1(x\ge y)$$

$$\cfrac{\partial f}{\partial y}=1(x\le y)$$

Here $$1(\cdot)$$ is the indicator function: it evaluates to 1 if the condition in parentheses holds, and to 0 otherwise. Note that at $$x=y$$ both partial derivatives equal 1; the max function is not differentiable there, and this is exactly where a subgradient takes the place of the derivative.
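The two indicator expressions translate directly into code. A minimal sketch (the name `max_grad` is my own), together with a finite-difference check away from the kink:

```python
def max_grad(x, y):
    """(Sub)gradient of f(x, y) = max(x, y) via indicator functions.

    df/dx = 1(x >= y), df/dy = 1(x <= y).
    At x == y both indicators fire; f is not differentiable there.
    """
    return (1.0 if x >= y else 0.0, 1.0 if x <= y else 0.0)


# Numerical check at a point where x != y, so f is differentiable:
h = 1e-6
x, y = 3.0, 1.0
num_dx = (max(x + h, y) - max(x - h, y)) / (2 * h)
print(max_grad(x, y), num_dx)  # (1.0, 0.0) and approximately 1.0
```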

Hint: strictly speaking, the SVM loss function is not differentiable, because each max term introduces a kink; the subgradient still exists everywhere, which is why gradient-based optimization works in practice.
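To make the hint concrete, here is a sketch of the multiclass SVM (hinge) loss for a single example and its subgradient, assuming the common formulation $$L = \sum_{j \ne y} \max(0, s_j - s_y + \Delta)$$ (the function name and the margin $$\Delta=1$$ default are my own choices, not from the text):

```python
import numpy as np

def svm_loss_grad(scores, y, delta=1.0):
    """Multiclass SVM loss for one example and a subgradient w.r.t. scores.

    scores: 1-D array of class scores; y: index of the correct class.
    The gradient uses the indicator 1(margin > 0); at margin == 0 the
    loss is not differentiable, and taking 0 there is one valid
    subgradient choice.
    """
    margins = scores - scores[y] + delta
    margins[y] = 0.0                        # the correct class contributes no margin term
    active = (margins > 0).astype(float)    # indicator 1(margin > 0), per class
    loss = np.sum(np.maximum(0.0, margins))
    grad = active.copy()
    grad[y] = -np.sum(active)               # correct class collects -1 per violated margin
    return loss, grad


loss, grad = svm_loss_grad(np.array([3.0, 1.0, 2.2]), y=0)
print(loss)  # approximately 0.2: only class 2 violates the margin
print(grad)  # [-1.  0.  1.]
```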