LeastSquares - Maple Help

LinearAlgebra

 LeastSquares
 least-squares solution to the equations A . x = B

 Calling Sequence LeastSquares(A, B, opt, t, c, options, meth)

Parameters

 A - Matrix, list, or set
 B - Matrix, column Vector, or set of variables
 opt - (optional) equation of the form optimize=true or optimize=false; specifies whether to optimize a parametrized solution
 t - (optional) equation of the form free=name; base name to use for free variables in parametrized solutions
 c - (optional) equation of the form conjugate=true or conjugate=false; specifies whether to use the Hermitian transpose in the case of non-floating-point data
 options - (optional); constructor options for the result object
 meth - (optional) equation of the form method=name, where name is one of QR or SVD; floating-point method to use

Description

 • For Matrix A and Vector B, the LeastSquares(A, B) function returns the Vector that best satisfies the equation A . x = B in the least-squares sense; that is, the Vector x which minimizes $\mathrm{Norm}\left(A·x-B,2\right)$.
 If B is a Matrix, then a Matrix is returned where its ith column is the least squares solution to $A·x={B}_{i}$, where ${B}_{i}$ is the ith column of the Matrix B.
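The minimization described above is language-neutral; as an illustration (in Python with NumPy, not the Maple implementation), the following sketch solves the same overdetermined system used in the Examples section and recovers the exact rational answer $\left(\frac{351}{625},\frac{62}{125}\right)$ in floating point:

```python
import numpy as np

# Overdetermined system: 3 equations, 2 unknowns (the same data as
# the Matrix A and Vector b in the Examples section).
A = np.array([[3.0, -2.0],
              [0.0,  3.0],
              [4.0,  4.0]])
b = np.array([1.0, 2.0, 4.0])

# lstsq returns the x minimizing the 2-norm of the residual A @ x - b.
x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
```

Here `x` is approximately `[0.5616, 0.496]`, matching the exact solution computed symbolically by Maple.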
 • Parameter A can also be a set of equations that describe the linear least-squares problem. In this case, B is the set of variables in which the equations in A occur.
 • The optimize option (opt) determines how the result is returned when the coefficient Matrix does not have full rank (so that there are infinitely many solutions).  If given as optimize=true, the solution with the minimal $2$-norm is returned.
 If the optimize option is omitted (or is given as optimize=false), then in the case of rational data a parametrized solution is returned whenever the coefficient Matrix is not full rank. In this case, the parametrization uses names based on the symbol given by the option t. If the option t is not provided, an unassigned symbol _t is generated.
 The condition optimize=true can be abbreviated as optimize.
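The distinction between the parametrized family of solutions and the minimal-norm solution can be sketched numerically (a Python/NumPy illustration, not Maple's algorithm), using the rank-deficient Matrix N and Vector z that appear in the Examples section, whose solutions form the family $[1+c,\,-2c,\,c]$:

```python
import numpy as np

# Rank-deficient system: infinitely many solutions exist,
# parametrized as [1 + c, -2c, c] for any scalar c.
N = np.array([[1.0, 3.0, 5.0],
              [2.0, 4.0, 6.0],
              [3.0, 5.0, 7.0]])
z = np.array([1.0, 2.0, 3.0])

# The pseudoinverse selects the solution of minimal 2-norm,
# which is what optimize=true returns in Maple.
x_min = np.linalg.pinv(N) @ z

# Any member of the parametrized family also reproduces z exactly,
# since z lies in the column space of N (c = 2 chosen arbitrarily).
c = 2.0
x_param = np.array([1.0 + c, -2.0 * c, c])
```

Both `x_min` and `x_param` satisfy N · x = z, but `x_min` (approximately `[0.833, 0.333, -0.167]`) has the smallest 2-norm among all solutions.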
 • The conjugate option (c) specifies whether to use HermitianTranspose instead of Transpose for problems without a complex data type. The default is true. The condition conjugate=true can be abbreviated as conjugate.
 • For floating-point numeric Matrices, a solution is computed by default using a singular value decomposition (SVD) method. If the method option is supplied as method=QR, the default is overridden and a QR decomposition is used instead.
 • For nonnumeric data, the solution is computed using the normal equations.
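The normal-equations and QR approaches mentioned above can be compared directly; the following Python/NumPy sketch (an illustration of the two standard techniques, not Maple's internal code) applies both to the same system and obtains the same solution:

```python
import numpy as np

A = np.array([[3.0, -2.0],
              [0.0,  3.0],
              [4.0,  4.0]])
b = np.array([1.0, 2.0, 4.0])

# Normal equations: solve A^T A x = A^T b directly.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# QR route: factor A = Q R, then solve the triangular system
# R x = Q^T b. Better conditioned than the normal equations.
Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)
```

For well-conditioned full-rank problems the two agree; the QR (and SVD) routes are preferred in floating point because forming AᵀA squares the condition number.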
 • If A is a list, then the elements of the list are taken as the factors of the coefficient Matrix A, arising from a prior factorization. These factors are interpreted, uniquely, as follows:
 – A list of Matrix, Vector[column] items, [QR, tau], is interpreted as the result of calling QRDecomposition with the output='NAG' option (the factors must contain data for a row-dominant Matrix).
 – A list of Matrix, Matrix items, [Q, R], is interpreted as the result of calling QRDecomposition with the output=['Q', 'R'] option (the factors must contain data for a row-dominant Matrix).
 – A list of Matrix, Vector[column], Matrix items, [U, S, Vt], is interpreted as the result of calling SingularValues with the output=['U', 'S', 'Vt'] option.
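The point of passing precomputed factors is that the expensive decomposition is done once and then reused for each right-hand side. As a Python/NumPy sketch of the [U, S, Vt] case (illustrative only, not the Maple code path):

```python
import numpy as np

# Same rank-deficient data as Nf and zf in the Examples section.
Nf = np.array([[1.0, 3.0, 5.0],
               [2.0, 4.0, 6.0],
               [3.0, 5.0, 7.0]])
zf = np.array([1.0, 2.0, 3.0])

# Precompute the factors once (analogous to passing [U, S, Vt] to
# LeastSquares); they can then be reused for any right-hand side.
U, S, Vt = np.linalg.svd(Nf)

# Invert only the singular values above a tolerance: the tiny third
# value is numerical noise from this rank-2 matrix and is dropped.
tol = 1e-10 * S[0]
S_inv = np.array([1.0 / s if s > tol else 0.0 for s in S])

# Least-squares solution from the factors: x = V S^+ U^T z.
x = Vt.T @ (S_inv * (U.T @ zf))
```

Dropping the negligible singular value makes this the minimal-norm solution, matching the SVD-based result shown in the Examples section.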
 • The constructor options provide additional information (readonly, shape, storage, order, datatype, and attributes) to the Vector constructor that builds the result. These options may also be provided in the form outputoptions=[...], where [...] represents a Maple list.  If a constructor option is provided in both the calling sequence directly and in an outputoptions option, the latter takes precedence (regardless of the order).
 • This function is part of the LinearAlgebra package, and so it can be used in the form LeastSquares(..) only after executing the command with(LinearAlgebra). However, it can always be accessed through the long form of the command by using LinearAlgebra[LeastSquares](..).

Examples

 > $\mathrm{with}\left(\mathrm{LinearAlgebra}\right):$
 > $A≔⟨⟨3,0,4⟩|⟨-2,3,4⟩⟩:$
 > $b≔⟨1,2,4⟩:$
 > $X≔\mathrm{LeastSquares}\left(A,b\right)$
 ${X}{≔}\left[\begin{array}{c}\frac{{351}}{{625}}\\ \frac{{62}}{{125}}\end{array}\right]$ (1)
 > $\mathrm{VectorNorm}\left(A·X-b\right)$
 $\frac{{64}}{{125}}$ (2)
 > $\mathrm{Af}≔\mathrm{Matrix}\left(A,\mathrm{datatype}=\mathrm{float}\right):$
 > $\mathrm{bf}≔\mathrm{Vector}\left(b,\mathrm{datatype}=\mathrm{float}\right):$
 > $\mathrm{QR},\mathrm{\tau }≔\mathrm{QRDecomposition}\left(\mathrm{Af},\mathrm{output}='\mathrm{NAG}'\right):$
 > $\mathrm{Xf}≔\mathrm{LeastSquares}\left(\left[\mathrm{QR},\mathrm{\tau }\right],\mathrm{bf}\right)$
 ${\mathrm{Xf}}{≔}\left[\begin{array}{c}{0.561600000000000}\\ {0.496000000000000}\end{array}\right]$ (3)
 > $E≔\left\{2x-y+2=0,mx+ny-3=0\right\}:$
 > $V≔\left\{x,y\right\}:$
 > $\mathrm{LeastSquares}\left(E,V\right)$
 $\left\{{x}{=}{-}\frac{{2}{}{n}{-}{3}}{{m}{+}{2}{}{n}}{,}{y}{=}\frac{{2}{}\left({m}{+}{3}\right)}{{m}{+}{2}{}{n}}\right\}$ (4)
 > $F≔⟨⟨1+I,4-3I,-I⟩|⟨3,I,1-I⟩⟩:$
 > $G≔⟨⟨0,2I,5⟩|⟨3+4I,0,-1+I⟩⟩:$
 > $\mathrm{LeastSquares}\left(F,G\right)$
 $\left[\begin{array}{cc}{-}\frac{{69}}{{331}}{+}\frac{{137}{}{I}}{{331}}& \frac{{89}}{{331}}{-}\frac{{26}{}{I}}{{331}}\\ \frac{{176}}{{331}}{+}\frac{{115}{}{I}}{{331}}& \frac{{190}}{{331}}{+}\frac{{348}{}{I}}{{331}}\end{array}\right]$ (5)
 > $\mathrm{Ft}≔\mathrm{HermitianTranspose}\left(F\right):$
 > $\mathrm{Gt}≔\mathrm{HermitianTranspose}\left(G\right):$
 > $\mathrm{LeastSquares}\left(\mathrm{Ft},\mathrm{Gt},\mathrm{optimize}\right)$
 $\left[\begin{array}{ccc}\frac{{243}}{{331}}{-}\frac{{349}{}{I}}{{331}}& \frac{{36}}{{331}}{-}\frac{{18}{}{I}}{{331}}& {-}\frac{{43}}{{331}}{+}\frac{{8}{}{I}}{{331}}\\ \frac{{62}}{{331}}{+}\frac{{109}{}{I}}{{331}}& {-}\frac{{74}}{{331}}{-}\frac{{92}{}{I}}{{331}}& \frac{{263}}{{331}}{-}\frac{{198}{}{I}}{{331}}\\ {-}\frac{{30}}{{331}}{-}\frac{{185}{}{I}}{{331}}& {-}\frac{{18}}{{331}}{-}\frac{{2}{}{I}}{{331}}& {-}\frac{{48}}{{331}}{-}\frac{{44}{}{I}}{{331}}\end{array}\right]$ (6)
 > $N≔⟨⟨1,2,3⟩|⟨3,4,5⟩|⟨5,6,7⟩⟩:$
 > $z≔⟨1,2,3⟩:$
 > $M≔\mathrm{LeastSquares}\left(N,z,\mathrm{free}=c\right)$
 ${M}{≔}\left[\begin{array}{c}{1}{+}{{c}}_{{3}}\\ {-}{2}{}{{c}}_{{3}}\\ {{c}}_{{3}}\end{array}\right]$ (7)
 > $N·M$
 $\left[\begin{array}{c}{1}\\ {2}\\ {3}\end{array}\right]$ (8)
 > $\mathrm{LeastSquares}\left(N,z,\mathrm{optimize}\right)$
 $\left[\begin{array}{c}\frac{{5}}{{6}}\\ \frac{{1}}{{3}}\\ {-}\frac{{1}}{{6}}\end{array}\right]$ (9)
 > $\mathrm{Nf}≔\mathrm{Matrix}\left(N,\mathrm{datatype}=\mathrm{float}\right):$
 > $\mathrm{zf}≔\mathrm{Vector}\left(z,\mathrm{datatype}=\mathrm{float}\right):$
 > $\mathrm{LeastSquares}\left(\mathrm{Nf},\mathrm{zf},\mathrm{method}='\mathrm{SVD}'\right)$
 $\left[\begin{array}{c}{0.833333333333333}\\ {0.333333333333334}\\ {-0.166666666666667}\end{array}\right]$ (10)
 > $U,S,\mathrm{Vt}≔\mathrm{SingularValues}\left(\mathrm{Nf},\mathrm{output}=\left['U','S','\mathrm{Vt}'\right]\right)$
 ${U}{,}{S}{,}{\mathrm{Vt}}{≔}\left[\begin{array}{ccc}{-0.446172160406770}& {0.796406765799543}& {0.408248290463864}\\ {-0.568626904944529}& {0.0999838802334592}& {-0.816496580927726}\\ {-0.691081649482289}& {-0.596439005332626}& {0.408248290463863}\end{array}\right]{,}\left[\begin{array}{c}{13.1593479966931}\\ {0.911899282777199}\\ {1.50611489982663}{×}{{10}}^{{-16}}\end{array}\right]{,}\left[\begin{array}{ccc}{-0.277876299012809}& {-0.537141532406176}& {-0.796406765799544}\\ {-0.869550513644997}& {-0.211689176619114}& {0.446172160406770}\\ {-0.408248290463863}& {0.816496580927726}& {-0.408248290463863}\end{array}\right]$ (11)
 > $\mathrm{LeastSquares}\left(\left[U,S,\mathrm{Vt}\right],\mathrm{zf}\right)$
 $\left[\begin{array}{c}{0.833333333333333}\\ {0.333333333333334}\\ {-0.166666666666667}\end{array}\right]$ (12)