# Due on March 13

Support vector regression (SVR) extends the SVM to regression problems with a similar formulation: minimize $$\|{\bf w}\|^2$$ subject to margin constraints. For soft-margin SVR, a typical formulation for fitting $$y=f({\bf x})$$ to training data $${\bf x}_1,\cdots,{\bf x}_l$$ with targets $$y_1,\cdots,y_l$$ is as follows (cf. (3) in this tutorial), where $$C>0$$ trades off flatness of $$f$$ against deviations larger than $$\epsilon$$:

$$\min \frac{1}{2} \|{\bf w}\|^2 + C \sum_{i=1}^l (\xi_i+\xi_i^*)$$

s.t. $$\begin{cases} y_i - \langle {\bf w}, {\bf x}_i \rangle -b \le \epsilon +\xi_i \\ \langle {\bf w}, {\bf x}_i \rangle +b -y_i \le \epsilon +\xi_i^* \\ \xi_i, \xi_i^* \ge 0 \end{cases}$$
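As a starting point for the derivation (a sketch following the standard soft-margin treatment in the tutorial; the multiplier names $$\alpha_i, \alpha_i^*, \eta_i, \eta_i^*$$ are one common labeling), the Lagrangian of the primal above, with a nonnegative multiplier per constraint, is

$$L = \frac{1}{2}\|{\bf w}\|^2 + C \sum_{i=1}^l (\xi_i+\xi_i^*) - \sum_{i=1}^l \alpha_i\bigl(\epsilon + \xi_i - y_i + \langle {\bf w}, {\bf x}_i \rangle + b\bigr) - \sum_{i=1}^l \alpha_i^*\bigl(\epsilon + \xi_i^* + y_i - \langle {\bf w}, {\bf x}_i \rangle - b\bigr) - \sum_{i=1}^l (\eta_i \xi_i + \eta_i^* \xi_i^*)$$

Setting the partial derivatives with respect to the primal variables to zero,

$$\partial_b L = \sum_{i=1}^l (\alpha_i^* - \alpha_i) = 0, \qquad \partial_{\bf w} L = {\bf w} - \sum_{i=1}^l (\alpha_i - \alpha_i^*)\, {\bf x}_i = 0, \qquad \partial_{\xi_i^{(*)}} L = C - \alpha_i^{(*)} - \eta_i^{(*)} = 0,$$

and substituting these back into $$L$$ eliminates $${\bf w}$$, $$b$$, and $$\xi_i^{(*)}$$, yielding the dual.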

1. (10 points) Following the derivation in the tutorial, show that the dual of the optimization problem above is indeed (10) there, filling in any missing steps.

2. (10 points) Repeat Q3 of HW1 using support vector regression. You may use any package this time (e.g., scikit-learn's SVR implementation).
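For Q2, a minimal sketch of how scikit-learn's SVR can be used (the toy data and hyperparameter values here are placeholders; substitute the data and settings from Q3 of HW1):

```python
# Minimal sketch: fit scikit-learn's SVR on toy 1-D regression data.
# The data below is synthetic; replace it with the HW1 Q3 dataset.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 5, size=(40, 1)), axis=0)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)

# C and epsilon correspond to C and epsilon in the primal above;
# kernel="rbf" replaces the inner product <w, x> with an RBF kernel.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1)
model.fit(X, y)
y_pred = model.predict(X)
```

After fitting, `model.support_vectors_` holds the training points whose residual exceeds the $$\epsilon$$-tube (i.e., those with nonzero $$\alpha_i - \alpha_i^*$$).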