The Matrix Form of the Least Squares Method (LSM)

I. Cost function

For the cost function

$$J = \sum_{i=1}^n \| y_i - x_i^T \theta \|^2 \tag{1}$$

where $\theta$ ($m \times 1$) is the vector of unknown parameters, and $x_i^T$ ($1 \times m$) and $y_i$ ($1 \times 1$) are the collected data.
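As a quick illustration of equation (1), the cost is just the sum of squared residuals. The sketch below uses NumPy with small made-up data (the values and the `cost` helper are illustrative, not from the article):

```python
import numpy as np

def cost(theta, X, Y):
    """Sum of squared residuals: J = sum_i (y_i - x_i^T theta)^2, equation (1)."""
    r = Y - X @ theta          # residual vector, one entry per data point
    return float(r @ r)        # r^T r = sum of squared entries

# Toy data: each row of X is one x_i^T, each entry of Y is one y_i
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
Y = np.array([0.0, 1.0, 2.0])

# For theta = [0, 1] the model fits this data exactly, so J = 0
theta = np.array([0.0, 1.0])
```

A worse guess, e.g. `theta = [0, 0]`, leaves residuals `[0, 1, 2]` and a cost of 5.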

II. The optimal solution

The optimal estimate of $\theta$ is

$$\hat\theta = (X^T X)^{-1} X^T Y \tag{2}$$

where $Y$ ($n \times 1$) and $X$ ($n \times m$) are defined as

$$Y = \left[\begin{array}{c} y_1 \\ \vdots \\ y_n \end{array}\right]_{(n\times 1)} \qquad X = \left[\begin{array}{c} x_1^T \\ \vdots \\ x_n^T \end{array}\right]_{(n\times m)} \tag{3}$$
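Equation (2) can be checked numerically: stack the data into $X$ and $Y$ as in (3), form the normal-equation solution, and compare it against NumPy's built-in least-squares solver. The data below is synthetic, generated for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 3

# Synthetic data: rows of X are the x_i^T, Y holds the y_i (equation (3))
X = rng.standard_normal((n, m))
theta_true = np.array([1.0, -2.0, 0.5])
Y = X @ theta_true + 0.01 * rng.standard_normal(n)

# Normal-equation solution, equation (2)
theta_hat = np.linalg.inv(X.T @ X) @ X.T @ Y

# Cross-check with NumPy's least-squares solver
theta_ls, *_ = np.linalg.lstsq(X, Y, rcond=None)
```

In practice `np.linalg.lstsq` (or `np.linalg.solve(X.T @ X, X.T @ Y)`) is preferred over forming the explicit inverse, which is slower and numerically less stable.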

III. Proof

Omitted.
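For reference, the standard derivation is short. Writing the cost (1) in matrix form,

$$J = (Y - X\theta)^T (Y - X\theta),$$

and setting its gradient with respect to $\theta$ to zero gives

$$\frac{\partial J}{\partial \theta} = -2\, X^T (Y - X\theta) = 0 \;\Rightarrow\; X^T X \,\hat\theta = X^T Y \;\Rightarrow\; \hat\theta = (X^T X)^{-1} X^T Y,$$

which is equation (2), provided $X^T X$ is invertible (i.e. $X$ has full column rank).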

— over —
