MultipleRegression Class
Least-squares multiple linear regression: finds the model parameters β that bring X*β (or the linear combination of the predictor arrays) as close as possible to the response Y, using the normal equations (Cholesky), a QR decomposition, or a singular value decomposition.
public static class MultipleRegression
The MultipleRegression type exposes the following members. Usage sketches for the matrix-based and sample-based overloads follow the table.
| Name | Description |
|---|---|
| `DirectMethod<T>(IEnumerable<Tuple<T[], T>>, Boolean, DirectRegressionMethod)` | Finds the model parameters β such that their linear combination with the predictor arrays in X comes as close as possible to the corresponding responses in Y, in the least-squares sense. The decomposition to use is selected by the DirectRegressionMethod argument. |
| `DirectMethod<T>(Matrix<T>, Matrix<T>, DirectRegressionMethod)` | Finds the model parameters β such that X*β with predictor matrix X comes as close as possible to the response Y, in the least-squares sense. The decomposition to use is selected by the DirectRegressionMethod argument. |
| `DirectMethod<T>(Matrix<T>, Vector<T>, DirectRegressionMethod)` | Finds the model parameters β such that X*β with predictor matrix X comes as close as possible to the response Y, in the least-squares sense. The decomposition to use is selected by the DirectRegressionMethod argument. |
| `DirectMethod<T>(T[][], T[], Boolean, DirectRegressionMethod)` | Finds the model parameters β such that their linear combination with the predictor arrays in X comes as close as possible to the corresponding responses in Y, in the least-squares sense. The decomposition to use is selected by the DirectRegressionMethod argument. |
| `NormalEquations<T>(IEnumerable<Tuple<T[], T>>, Boolean)` | Finds the model parameters β such that their linear combination with the predictor arrays in X comes as close as possible to the corresponding responses in Y, in the least-squares sense. Uses the Cholesky decomposition of the normal equations. |
| `NormalEquations<T>(IEnumerable<ValueTuple<T[], T>>, Boolean)` | Finds the model parameters β such that their linear combination with the predictor arrays in X comes as close as possible to the corresponding responses in Y, in the least-squares sense. Uses the Cholesky decomposition of the normal equations. |
| `NormalEquations<T>(Matrix<T>, Matrix<T>)` | Finds the model parameters β such that X*β with predictor matrix X comes as close as possible to the response Y, in the least-squares sense. Uses the Cholesky decomposition of the normal equations. |
| `NormalEquations<T>(Matrix<T>, Vector<T>)` | Finds the model parameters β such that X*β with predictor matrix X comes as close as possible to the response Y, in the least-squares sense. Uses the Cholesky decomposition of the normal equations. |
| `NormalEquations<T>(T[][], T[], Boolean)` | Finds the model parameters β such that their linear combination with the predictor arrays in X comes as close as possible to the corresponding responses in Y, in the least-squares sense. Uses the Cholesky decomposition of the normal equations. |
| `QR<T>(IEnumerable<Tuple<T[], T>>, Boolean)` | Finds the model parameters β such that their linear combination with the predictor arrays in X comes as close as possible to the corresponding responses in Y, in the least-squares sense. Uses an orthogonal (QR) decomposition; numerically more stable than the normal equations, but slower. |
| `QR<T>(IEnumerable<ValueTuple<T[], T>>, Boolean)` | Finds the model parameters β such that their linear combination with the predictor arrays in X comes as close as possible to the corresponding responses in Y, in the least-squares sense. Uses an orthogonal (QR) decomposition; numerically more stable than the normal equations, but slower. |
| `QR<T>(Matrix<T>, Matrix<T>)` | Finds the model parameters β such that X*β with predictor matrix X comes as close as possible to the response Y, in the least-squares sense. Uses an orthogonal (QR) decomposition; numerically more stable than the normal equations, but slower. |
| `QR<T>(Matrix<T>, Vector<T>)` | Finds the model parameters β such that X*β with predictor matrix X comes as close as possible to the response Y, in the least-squares sense. Uses an orthogonal (QR) decomposition; numerically more stable than the normal equations, but slower. |
| `QR<T>(T[][], T[], Boolean)` | Finds the model parameters β such that their linear combination with the predictor arrays in X comes as close as possible to the corresponding responses in Y, in the least-squares sense. Uses an orthogonal (QR) decomposition; numerically more stable than the normal equations, but slower. |
| `Svd<T>(IEnumerable<Tuple<T[], T>>, Boolean)` | Finds the model parameters β such that their linear combination with the predictor arrays in X comes as close as possible to the corresponding responses in Y, in the least-squares sense. Uses a singular value decomposition; numerically more stable than the normal equations or QR (especially for ill-conditioned problems), but slower. |
| `Svd<T>(IEnumerable<ValueTuple<T[], T>>, Boolean)` | Finds the model parameters β such that their linear combination with the predictor arrays in X comes as close as possible to the corresponding responses in Y, in the least-squares sense. Uses a singular value decomposition; numerically more stable than the normal equations or QR (especially for ill-conditioned problems), but slower. |
| `Svd<T>(Matrix<T>, Matrix<T>)` | Finds the model parameters β such that X*β with predictor matrix X comes as close as possible to the response Y, in the least-squares sense. Uses a singular value decomposition; numerically more stable than the normal equations or QR (especially for ill-conditioned problems), but slower. |
| `Svd<T>(Matrix<T>, Vector<T>)` | Finds the model parameters β such that X*β with predictor matrix X comes as close as possible to the response Y, in the least-squares sense. Uses a singular value decomposition; numerically more stable than the normal equations or QR (especially for ill-conditioned problems), but slower. |
| `Svd<T>(T[][], T[], Boolean)` | Finds the model parameters β such that their linear combination with the predictor arrays in X comes as close as possible to the corresponding responses in Y, in the least-squares sense. Uses a singular value decomposition; numerically more stable than the normal equations or QR (especially for ill-conditioned problems), but slower. |
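
A minimal sketch of the matrix-based overloads, assuming Altaxo.Calc's linear-algebra types follow the Math.NET Numerics builder API (`Matrix<double>.Build`, `Vector<double>.Build`) and that `DirectRegressionMethod` exposes NormalEquations, QR and Svd values; the data values are made up for illustration.

```csharp
using Altaxo.Calc.LinearAlgebra;
using Altaxo.Calc.LinearRegression;

class MatrixRegressionSketch
{
    static void Main()
    {
        // Design matrix X: one row per observation, first column of ones for the intercept.
        var x = Matrix<double>.Build.DenseOfArray(new double[,]
        {
            { 1.0, 1.0 },
            { 1.0, 2.0 },
            { 1.0, 3.0 },
            { 1.0, 4.0 },
        });
        var y = Vector<double>.Build.Dense(new[] { 2.1, 3.9, 6.2, 7.8 });

        // Normal equations (Cholesky): fastest, but squares the condition number of X.
        Vector<double> betaNormal = MultipleRegression.NormalEquations(x, y);

        // QR: slower, numerically more stable.
        Vector<double> betaQr = MultipleRegression.QR(x, y);

        // SVD: slowest, most robust when X is ill-conditioned.
        Vector<double> betaSvd = MultipleRegression.Svd(x, y);

        // DirectMethod selects the decomposition through the enum argument (assumed value name).
        Vector<double> betaDirect = MultipleRegression.DirectMethod(x, y, DirectRegressionMethod.Svd);

        System.Console.WriteLine($"{betaNormal} {betaQr} {betaSvd} {betaDirect}");
    }
}
```

For a well-conditioned X all four calls should return essentially the same β; they differ only in speed and numerical robustness, as described in the table.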
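Likewise, a sketch for the sample-based overloads that take (predictor-array, response) pairs plus a Boolean intercept flag. The overload shape follows the table above; that the returned array carries the intercept in beta[0] when the flag is true is an assumption carried over from Math.NET Numerics.

```csharp
using System;
using Altaxo.Calc.LinearRegression;

class SampleRegressionSketch
{
    static void Main()
    {
        // One (predictor-array, response) pair per observation; no explicit intercept column,
        // the intercept term is requested via the Boolean flag instead.
        var samples = new (double[], double)[]
        {
            (new[] { 1.0, 4.0 }, 15.0),
            (new[] { 2.0, 5.0 }, 21.0),
            (new[] { 3.0, 2.0 }, 11.0),
            (new[] { 4.0, 8.0 }, 32.0),
        };

        // QR-based fit; true requests an intercept term (assumed to be returned as beta[0]).
        double[] beta = MultipleRegression.QR(samples, true);

        Console.WriteLine(string.Join(", ", beta));
    }
}
```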