Weighted linear regression is one of those things that one needs from time to time, yet it is not a built-in function in many common packages, including spreadsheet programs. On the other hand, the problem is not complicated enough to make it worth one's while to learn (or relearn!) a more sophisticated statistical software package: with modest effort, the formulae can be derived from first principles.

The problem can be stated as follows. Given a set of $N$ value pairs $(x_i,y_i)$ ($i=1,...,N$) and a set of weights $W_i$, we seek the values of $A$ and $B$ (the slope and intercept of the fitted line $y=Ax+B$) such that the following weighted sum $S$ is minimal:

\begin{align}
S=\sum_{i=1}^NW_i[(Ax_i+B)-y_i]^2=&A^2\sum W_ix_i^2+2AB\sum W_ix_i-2A\sum W_ix_iy_i+\nonumber\\
&B^2\sum W_i-2B\sum W_iy_i+\sum W_iy_i^2.
\end{align}

The requisite values of $A$ and $B$ can be computed by taking the partial derivatives of $S$ with respect to $A$ and $B$ and demanding that both be zero:

\begin{eqnarray}
\frac{\partial S}{\partial A}&=&2A\sum W_ix_i^2+2B\sum W_ix_i-2\sum W_ix_iy_i=0,\\
\frac{\partial S}{\partial B}&=&2A\sum W_ix_i+2B\sum W_i-2\sum W_iy_i=0,
\end{eqnarray}

or, in matrix form:

\begin{equation}
\begin{pmatrix}
\sum W_ix_i^2&\sum W_ix_i\\
\sum W_ix_i&\sum W_i
\end{pmatrix}
\begin{pmatrix}
A\\B
\end{pmatrix}=
\begin{pmatrix}
\sum W_ix_iy_i\\
\sum W_iy_i
\end{pmatrix}.
\end{equation}

This equation is solved by inverting the $2\times 2$ matrix:

\begin{equation}
\begin{pmatrix}A\\B
\end{pmatrix}=\frac{1}{\sum W_i\sum W_ix_i^2-(\sum W_ix_i)^2}
\begin{pmatrix}
\sum W_i&-\sum W_ix_i\\
-\sum W_ix_i&\sum W_ix_i^2
\end{pmatrix}
\begin{pmatrix}
\sum W_ix_iy_i\\
\sum W_iy_i
\end{pmatrix},
\end{equation}

which gives

\begin{eqnarray}
A&=&\frac{\sum W_i\sum W_ix_iy_i-\sum W_ix_i\sum W_iy_i}{\sum W_i\sum W_ix_i^2-(\sum W_ix_i)^2},\\
B&=&\frac{\sum W_ix_i^2\sum W_iy_i-\sum W_ix_i\sum W_ix_iy_i}{\sum W_i\sum W_ix_i^2-(\sum W_ix_i)^2},
\end{eqnarray}

which can be readily computed if the values of $\sum W_i$, $\sum W_ix_i$, $\sum W_ix_i^2$, $\sum W_iy_i$ and $\sum W_ix_iy_i$ are available. These, in turn, can be calculated in a cumulative fashion, allowing a weighted least squares calculation to take place even on a handheld calculator that lacks sufficient memory to store all individual $x_i$, $y_i$, and $W_i$ values.
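To make this concrete, here is a minimal Python sketch of the cumulative approach (the function name and the $(x,y,W)$ triple layout are arbitrary choices for illustration, not anything prescribed above). It accumulates the five sums in a single pass and then evaluates the closed-form expressions for $A$ and $B$:

```python
# A minimal sketch of cumulative weighted linear regression.
# Only five running sums are kept; the individual (x_i, y_i, W_i)
# values need not be stored.

def weighted_linear_fit(data):
    """data: iterable of (x, y, w) triples; returns (A, B) for y = A*x + B."""
    Sw = Swx = Swx2 = Swy = Swxy = 0.0
    for x, y, w in data:
        Sw   += w
        Swx  += w * x
        Swx2 += w * x * x
        Swy  += w * y
        Swxy += w * x * y
    d = Sw * Swx2 - Swx ** 2           # determinant of the 2x2 normal matrix
    A = (Sw * Swxy - Swx * Swy) / d    # slope
    B = (Swx2 * Swy - Swx * Swxy) / d  # intercept
    return A, B

# Example: points near y = 2x + 1, with the last (outlying) point down-weighted.
print(weighted_linear_fit([(0, 1.0, 1), (1, 3.1, 1), (2, 4.9, 1), (3, 10.0, 0.1)]))
```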

The method can be readily extended to polynomial regression of degree $n$. The function to be minimized in this case is

\begin{equation}
S=\sum_{i=1}^N W_i\left[\left(\sum_{j=1}^n A_jx_i^j+B\right)-y_i\right]^2.
\end{equation}

The partial derivatives with respect to $A_j$ and $B$ are as follows:

\begin{align}
\frac{\partial S}{\partial A_j}&=2\sum_k\left(A_k\sum_iW_ix_i^{j+k}\right)+2B\sum_iW_ix_i^j-2\sum_iW_ix_i^jy_i=0,\\
\frac{\partial S}{\partial B}&=2\sum_k\left(A_k\sum_iW_ix_i^k\right)+2B\sum_iW_i-2\sum_iW_iy_i=0.
\end{align}

In matrix form:

\begin{equation}
\begin{pmatrix}
\Sigma W_i&\Sigma W_ix_i&\Sigma W_ix_i^2&\cdots&\Sigma W_ix_i^n\\
\Sigma W_ix_i&\Sigma W_ix_i^2&\Sigma W_ix_i^3&\cdots&\Sigma W_ix_i^{n+1}\\
\Sigma W_ix_i^2&\Sigma W_ix_i^3&\Sigma W_ix_i^4&\cdots&\Sigma W_ix_i^{n+2}\\
\vdots&\vdots&\vdots&\ddots&\vdots\\
\Sigma W_ix_i^n&\Sigma W_ix_i^{n+1}&\Sigma W_ix_i^{n+2}&\cdots&\Sigma W_ix_i^{2n}
\end{pmatrix}
\begin{pmatrix}
B\\
A_1\\
A_2\\
\vdots\\
A_n
\end{pmatrix}=
\begin{pmatrix}
\Sigma W_iy_i\\
\Sigma W_ix_iy_i\\
\Sigma W_ix_i^2y_i\\
\vdots\\
\Sigma W_ix_i^ny_i
\end{pmatrix}.
\end{equation}

We could also write $B=A_0$, in which case the system takes the compact form:

\begin{equation}
\sum_{j=0}^n\left(A_j\sum_{i=1}^NW_ix_i^{j+k}\right)=\sum_{i=1}^NW_ix_i^ky_i,~~~(k=0,...,n).
\end{equation}
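The polynomial case, too, reduces to accumulating moments of the form $\Sigma W_ix_i^m$ and $\Sigma W_ix_i^ky_i$ and solving the resulting $(n+1)\times(n+1)$ linear system. Below is a short Python sketch of this, assuming NumPy is available for the linear solve; the function name and argument layout are again illustrative choices:

```python
# A sketch of weighted polynomial regression via the compact normal equations.
# With A_0 = B, coefficient k of the result multiplies x**k.
import numpy as np

def weighted_poly_fit(x, y, w, n):
    x, y, w = map(np.asarray, (x, y, w))
    # Moments S[m] = sum_i W_i x_i^m for m = 0 .. 2n
    S = np.array([np.sum(w * x**m) for m in range(2 * n + 1)])
    # Normal-equation matrix M[k, j] = sum_i W_i x_i^(j+k),
    # right-hand side t[k] = sum_i W_i x_i^k y_i
    M = np.array([[S[j + k] for j in range(n + 1)] for k in range(n + 1)])
    t = np.array([np.sum(w * x**k * y) for k in range(n + 1)])
    return np.linalg.solve(M, t)   # [A_0, A_1, ..., A_n]

# Example: quadratic fit (n = 2) with equal weights.
# coeffs = weighted_poly_fit([0, 1, 2, 3], [1, 2, 5, 10], [1, 1, 1, 1], 2)
```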