| Safe Haskell | None |
|---|---|
| Language | Haskell98 |
Synopsis
- leastSquares :: (C vert, C horiz, C height, Eq height, C width, C nrhs, Floating a) => Full horiz Small height width a -> Full vert horiz height nrhs a -> Full vert horiz width nrhs a
- minimumNorm :: (C vert, C horiz, C height, Eq height, C width, C nrhs, Floating a) => Full Small vert height width a -> Full vert horiz height nrhs a -> Full vert horiz width nrhs a
- leastSquaresMinimumNormRCond :: (C vert, C horiz, C height, Eq height, C width, C nrhs, Floating a) => RealOf a -> Full horiz vert height width a -> Full vert horiz height nrhs a -> (Int, Full vert horiz width nrhs a)
- pseudoInverseRCond :: (C vert, C horiz, C height, C width, Floating a) => RealOf a -> Full vert horiz height width a -> (Int, Full horiz vert width height a)
- project :: (C height, Eq height, C width, Eq width, Floating a) => Wide height width a -> Vector height a -> Vector width a -> Vector width a
- leastSquaresConstraint :: (C height, Eq height, C width, Eq width, C constraints, Eq constraints, Floating a) => General height width a -> Vector height a -> Wide constraints width a -> Vector constraints a -> Vector width a
- gaussMarkovLinearModel :: (C height, Eq height, C width, Eq width, C opt, Eq opt, Floating a) => Tall height width a -> General height opt a -> Vector height a -> (Vector width a, Vector opt a)
- determinant :: (C sh, Floating a) => Square sh a -> a
- determinantAbsolute :: (C vert, C horiz, C height, C width, Floating a) => Full vert horiz height width a -> RealOf a
- complement :: (C height, C width, Floating a) => Tall height width a -> Tall height ShapeInt a
- affineSpanFromKernel :: (C width, Eq width, C height, Eq height, Floating a) => Wide height width a -> Vector height a -> (Tall width ShapeInt a, Vector width a)
- affineKernelFromSpan :: (C width, Eq width, C height, Eq height, Floating a) => Tall height width a -> Vector height a -> (Wide ShapeInt height a, Vector ShapeInt a)
- householder :: (C vert, C horiz, C height, C width, Floating a) => Full vert horiz height width a -> (Square height a, Full vert horiz height width a)
- householderTall :: (C vert, C height, C width, Floating a) => Full vert Small height width a -> (Full vert Small height width a, Upper width a)
Documentation
leastSquares :: (C vert, C horiz, C height, Eq height, C width, C nrhs, Floating a) => Full horiz Small height width a -> Full vert horiz height nrhs a -> Full vert horiz width nrhs a
If x = leastSquares a b, then x minimizes Vector.norm2 (multiply a x `sub` b).

Precondition: a must have full rank and height a >= width a.
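In conventional notation (writing A for a; a standard fact about full-rank least squares rather than anything specific to this binding), the solution is characterized by the normal equations:

$$x = \operatorname*{argmin}_x \|A x - b\|_2 \quad\Longleftrightarrow\quad A^\ast A\, x = A^\ast b$$

Since b may have several columns (nrhs), each column of b yields the corresponding column of x.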
minimumNorm :: (C vert, C horiz, C height, Eq height, C width, C nrhs, Floating a) => Full Small vert height width a -> Full vert horiz height nrhs a -> Full vert horiz width nrhs a
The vector x with x = minimumNorm a b is the vector with minimal Vector.norm2 x that satisfies multiply a x == b.

Precondition: a must have full rank and height a <= width a.
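For reference, when A has full row rank the minimum-norm solution has a closed form (standard linear algebra; the implementation presumably uses an orthogonal factorization rather than forming A Aᴴ explicitly):

$$x = A^\ast (A A^\ast)^{-1} b$$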
leastSquaresMinimumNormRCond :: (C vert, C horiz, C height, Eq height, C width, C nrhs, Floating a) => RealOf a -> Full horiz vert height width a -> Full vert horiz height nrhs a -> (Int, Full vert horiz width nrhs a)
If (rank,x) = leastSquaresMinimumNormRCond rcond a b, then x is the vector with minimum Vector.norm2 x that minimizes Vector.norm2 (a #*| x `sub` b).

Matrix a can have any rank, but you must specify the reciprocal condition number of the rank-truncated matrix.
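One standard way to make the truncation precise (this is how LAPACK's SVD-based least-squares drivers behave; that this binding uses exactly such a driver is an assumption) is a truncated singular value decomposition, where singular values not exceeding rcond times the largest one are treated as zero:

$$A = U \Sigma V^\ast, \qquad x = \sum_{i\,:\,\sigma_i > \mathit{rcond}\cdot\sigma_1} \frac{u_i^\ast b}{\sigma_i}\, v_i$$

The returned rank is the number of retained singular values.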
pseudoInverseRCond :: (C vert, C horiz, C height, C width, Floating a) => RealOf a -> Full vert horiz height width a -> (Int, Full horiz vert width height a)
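Judging from the name and the rcond convention of leastSquaresMinimumNormRCond (an assumption, since this function carries no comment), this presumably returns the effective rank together with the Moore-Penrose pseudoinverse of the rank-truncated matrix. For reference, the pseudoinverse A⁺ is characterized by the four Moore-Penrose conditions:

$$A A^{+} A = A, \qquad A^{+} A A^{+} = A^{+}, \qquad (A A^{+})^\ast = A A^{+}, \qquad (A^{+} A)^\ast = A^{+} A$$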
project :: (C height, Eq height, C width, Eq width, Floating a) => Wide height width a -> Vector height a -> Vector width a -> Vector width a
project b d x projects x onto the plane described by B*x = d.

b must have full rank.
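For a full-rank (wide) B this projection has a closed form, obtained by the usual Lagrange-multiplier argument (whether the implementation evaluates exactly this expression is not stated):

$$\mathit{project}(B, d, x) = x - B^\ast (B B^\ast)^{-1} (B x - d)$$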
leastSquaresConstraint :: (C height, Eq height, C width, Eq width, C constraints, Eq constraints, Floating a) => General height width a -> Vector height a -> Wide constraints width a -> Vector constraints a -> Vector width a
leastSquaresConstraint a c b d computes x with minimal || c - A*x ||_2 subject to the constraint B*x = d.

b must be wide, a===b must be tall, and both matrices must have full rank.
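This is the linear equality-constrained least squares (LSE) problem, in LAPACK solved by the xGGLSE driver (presumably the routine behind this binding):

$$\min_x \|c - A x\|_2 \quad\text{subject to}\quad B x = d$$

The preconditions mirror the solvability requirements: a wide full-rank B makes the constraint consistent for every d, and a tall full-rank a===b makes the minimizer unique.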
gaussMarkovLinearModel :: (C height, Eq height, C width, Eq width, C opt, Eq opt, Floating a) => Tall height width a -> General height opt a -> Vector height a -> (Vector width a, Vector opt a)
gaussMarkovLinearModel a b d computes (x,y) with minimal || y ||_2 subject to the constraint d = A*x + B*y.

a must be tall, a|||b must be wide, and both matrices must have full rank.
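This is the general Gauss-Markov linear model (GLM) problem, in LAPACK solved by the xGGGLM driver (again, presumably what is called here):

$$\min_{x,\,y} \|y\|_2 \quad\text{subject to}\quad d = A x + B y$$

For B equal to the identity this reduces to ordinary least squares; more generally, if B Bᴴ is the covariance of the noise in d, then x is the best linear unbiased estimator for the model.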
determinantAbsolute :: (C vert, C horiz, C height, C width, Floating a) => Full vert horiz height width a -> RealOf a
Gramian determinant - works also for non-square matrices, but is sensitive to transposition.
determinantAbsolute a = sqrt (Herm.determinant (Herm.gramian a))
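Assuming Herm.gramian a forms the Gramian from adjoint a (an assumption about the convention; the opposite convention merely swaps the roles of a and adjoint a), this reads in terms of singular values, for an m-by-n matrix A with m >= n:

$$\mathit{determinantAbsolute}(A) = \sqrt{\det(A^\ast A)} = \prod_{i=1}^{n} \sigma_i$$

For the transposed, wide orientation the Gramian is rank-deficient and the result degenerates to zero, hence the sensitivity to transposition.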
complement :: (C height, C width, Floating a) => Tall height width a -> Tall height ShapeInt a
For an m-by-n matrix a with m>=n, this function computes an m-by-(m-n) matrix b such that Matrix.multiply (adjoint b) a is a zero matrix.

The function does not try to compensate for a rank deficiency of a. That is, a|||b has full rank if and only if a has full rank.

For full-rank matrices you might also call this the kernel or nullspace.
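Restated in matrix notation (writing B for the result; the range identity is the full-rank reading noted above):

$$B^\ast A = 0, \qquad \operatorname{range}(B) = \operatorname{range}(A)^{\perp} \quad\text{if } A \text{ has full rank}$$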
affineSpanFromKernel :: (C width, Eq width, C height, Eq height, Floating a) => Wide height width a -> Vector height a -> (Tall width ShapeInt a, Vector width a)
affineSpanFromKernel a b == (c,d) means: an affine subspace is given implicitly as {x : a#*|x == b}; convert this into the explicit representation {c#*|y|+|d : y}.

Matrix a must have full rank, otherwise the explicit representation will miss dimensions and we cannot easily determine the origin d as a minimum norm solution.
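In matrix notation the result satisfies (merely restating the set equality above):

$$A C = 0, \qquad A d = b, \qquad \{x : A x = b\} = \{C y + d : y\}$$

Since d is a minimum-norm solution of A x = b, it lies in the range of Aᴴ and is therefore orthogonal to the columns of C.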
affineKernelFromSpan :: (C width, Eq width, C height, Eq height, Floating a) => Tall height width a -> Vector height a -> (Wide ShapeInt height a, Vector ShapeInt a)
This conversion is in some sense inverse to affineSpanFromKernel. However, it is not precisely inverse in either direction, because both affineSpanFromKernel and affineKernelFromSpan accept non-orthogonal matrices but always return orthogonal ones.

In affineKernelFromSpan c d, matrix c should have full rank, otherwise the implicit representation will miss dimensions.
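The "inverse up to orthogonalization" remark can be made concrete: assuming full rank throughout, a round trip reproduces the same affine subspace, only with an orthogonal representing matrix:

$$\{x : A x = b\} = \{x : A' x = b'\} \quad\text{where}\quad (A', b') = \mathit{affineKernelFromSpan}\,(\mathit{affineSpanFromKernel}\,(A, b))$$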