Increase tolerance to ensure tests pass

It's possible that some particularly bad inputs cause
severe loss of significance in the triangular solves.
This is exacerbated by the fact that the residual we test
against is itself computed in floating point and thus also
prone to loss of significance, so the error measure itself
is problematic.
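
To illustrate (a hypothetical 2x2 system, not taken from the
test suite): with a tiny pivot, forward substitution produces
huge solution entries, and the absolute residual L * x - b is
then bounded only by roughly eps * |L| * |x|, which can dwarf
a fixed tolerance like 1e-6 even for a solve that is as
accurate as f64 allows.

    fn main() {
        // Hypothetical ill-conditioned 2x2 lower-triangular system
        // L * x = b (dense arrays just for illustration; the actual
        // tests use CSC matrices from nalgebra-sparse).
        let l = [[1.0e-12_f64, 0.0], [0.3, 0.7]];
        let b = [1.0_f64, 0.0];

        // Forward substitution in f64, as in the solve under test.
        let x0 = b[0] / l[0][0]; // ~1e12
        let x1 = (b[1] - l[1][0] * x0) / l[1][1]; // ~-4.3e11

        // Residuals of L * x - b. In the second row, two terms of
        // magnitude ~3e11 cancel, so the absolute residual is bounded
        // only by roughly eps * 3e11 ~ 7e-5: an absolute tolerance of
        // 1e-6 can fail even though the computed solution is about as
        // accurate as f64 permits.
        let r0 = l[0][0] * x0 - b[0];
        let r1 = l[1][0] * x0 + l[1][1] * x1 - b[1];
        println!("x = [{:e}, {:e}]", x0, x1);
        println!("residual = [{:e}, {:e}]", r0, r1);
    }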

We could maybe improve this in the future by using arbitrary-
precision arithmetic to remove some sources of error and testing
against appropriate bounds.
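
A rough sketch of that idea, assuming exact rational
arithmetic via the num-rational and num-traits crates (an
assumption; they are not current dependencies), applied to
the same hypothetical 2x2 system as above:

    // Assumed dev-dependencies: num-rational (with bigint support)
    // and num-traits; not part of nalgebra-sparse today.
    use num_rational::BigRational;
    use num_traits::ToPrimitive;

    fn to_rational(x: f64) -> BigRational {
        BigRational::from_float(x).expect("finite input")
    }

    fn main() {
        // Hypothetical ill-conditioned lower-triangular system.
        let l = [[1.0e-12_f64, 0.0], [0.3, 0.7]];
        let b = [1.0_f64, 0.0];

        // Forward substitution in exact rational arithmetic: no
        // rounding, so this is the exact solution of the system as
        // stored in f64.
        let x0_exact = to_rational(b[0]) / to_rational(l[0][0]);
        let x1_exact = (to_rational(b[1])
            - to_rational(l[1][0]) * x0_exact.clone())
            / to_rational(l[1][1]);

        // The same solve in plain f64.
        let x0 = b[0] / l[0][0];
        let x1 = (b[1] - l[1][0] * x0) / l[1][1];

        // Forward error of the f64 solve, computed exactly and only
        // rounded for display. A test could compare this against a
        // bound derived from the conditioning of the factor instead
        // of a fixed absolute residual tolerance.
        let e0 = (to_rational(x0) - x0_exact).to_f64().unwrap().abs();
        let e1 = (to_rational(x1) - x1_exact).to_f64().unwrap().abs();
        println!("forward error = [{:e}, {:e}]", e0, e1);
    }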
Andreas Longva 2021-01-21 16:40:05 +01:00
parent 31c911d4fb
commit 7a083d50f7
1 changed file with 6 additions and 2 deletions

@@ -1154,7 +1154,9 @@ proptest! {
         spsolve_csc_lower_triangular(Op::NoOp(&a), &mut x).unwrap();
 
         let a_lower = a.lower_triangle();
-        prop_assert_matrix_eq!(&a_lower * &x, &b, comp = abs, tol = 1e-6);
+        // We're using a high tolerance here because there are some "bad" inputs that can give
+        // severe loss of precision.
+        prop_assert_matrix_eq!(&a_lower * &x, &b, comp = abs, tol = 1e-4);
     }
 
     #[test]
@@ -1171,7 +1173,9 @@ proptest! {
         spsolve_csc_lower_triangular(Op::Transpose(&a), &mut x).unwrap();
 
         let a_lower = a.lower_triangle();
-        prop_assert_matrix_eq!(&a_lower.transpose() * &x, &b, comp = abs, tol = 1e-6);
+        // We're using a high tolerance here because there are some "bad" inputs that can give
+        // severe loss of precision.
+        prop_assert_matrix_eq!(&a_lower.transpose() * &x, &b, comp = abs, tol = 1e-4);
     }
 
 }