
Improve pytest.approx error messages readability #8335

@tarcisiofischer

Description

Hey all. I think getting useful messages when tests FAIL is just as important as getting PASSED. That being said...

What's the problem this feature will solve?

While comparing numerical data with pytest.approx, I find the messages hard to read when comparing reasonably large NumPy arrays. Example:

import numpy as np
import pytest


def test_bla():
    a = np.linspace(0, 100, 20)
    b = np.linspace(0, 100, 20)
    a[10] += 0.5
    assert a == pytest.approx(b)

The error message:

>       assert a == pytest.approx(b)
E       assert array([  0.        ,   5.26315789,  10.52631579,  15.78947368,\n        21.05263158,  26.31578947,  31.57894737,  36.84210526,\n        42.10526316,  47.36842105,  53.13157895,  57.89473684,\n        63.15789474,  68.42105263,  73.68421053,  78.94736842,\n        84.21052632,  89.47368421,  94.73684211, 100.        ]) == approx([0.0 ± 1.0e-12, 5.2631578947368425 ± 5.3e-06, 10.526315789473685 ± 1.1e-05, 15.789473684210527 ± 1.6e-05, 21.05263157894737 ± 2.1e-05, 26.315789473684212 ± 2.6e-05, 31.578947368421055 ± 3.2e-05, 36.8421052631579 ± 3.7e-05, 42.10526315789474 ± 4.2e-05, 47.36842105263158 ± 4.7e-05, 52.631578947368425 ± 5.3e-05, 57.89473684210527 ± 5.8e-05, 63.15789473684211 ± 6.3e-05, 68.42105263157896 ± 6.8e-05, 73.6842105263158 ± 7.4e-05, 78.94736842105263 ± 7.9e-05, 84.21052631578948 ± 8.4e-05, 89.47368421052633 ± 8.9e-05, 94.73684210526316 ± 9.5e-05, 100.0 ± 1.0e-04])

This is the main case I want to solve, although my implementation also covers other cases.

Describe the solution you'd like

It would be nice to have something like this:

>       assert a == pytest.approx(b)
E       assert comparison failed for 1 values:
E         Index | Obtained           | Expected                    
E         (10,) | 53.131578947368425 | 52.631578947368425 ± 5.3e-05
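The row alignment itself is straightforward. Below is a minimal, self-contained sketch of such a table formatter (the function name and input shape are my own, for illustration only; they are not pytest internals or the gist's code):

```python
def format_mismatch_table(mismatches):
    """Render rows of (index, obtained, expected) as an aligned table.

    `mismatches` is a list of 3-tuples, e.g.
    [((10,), "53.131578947368425", "52.631578947368425 ± 5.3e-05")].
    """
    headers = ("Index", "Obtained", "Expected")
    rows = [tuple(str(col) for col in row) for row in mismatches]
    # Column width = widest cell (header included) in each column.
    widths = [
        max(len(h), max(len(r[i]) for r in rows))
        for i, h in enumerate(headers)
    ]
    lines = ["comparison failed for %d values:" % len(rows)]
    lines.append("  " + " | ".join(h.ljust(w) for h, w in zip(headers, widths)))
    for row in rows:
        lines.append("  " + " | ".join(c.ljust(w) for c, w in zip(row, widths)))
    return lines


table = format_mismatch_table(
    [((10,), "53.131578947368425", "52.631578947368425 ± 5.3e-05")]
)
print("\n".join(table))
```

Printing the result reproduces the table shown above (index, obtained value, expected value with tolerance), with columns padded to the widest cell.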

Alternative Solutions

I haven't tried any workaround, but I did implement a solution using the pytest_assertrepr_compare hook. It is in this gist: https://gist.github.com/tarcisiofischer/cab56a5eb65aaa03bdbb4d206f1d49f0

Examples:

>       assert [1, 2, 3, 4] == pytest.approx([1, 3, 3, 5])
E       assert comparison failed for 2 values:
E         Index | Obtained | Expected   
E         1     | 2        | 3 ± 3.0e-06
E         3     | 4        | 5 ± 5.0e-06
>       assert 1.0 == pytest.approx(2.0)
E       assert comparison failed
E         Obtained: 1.0
E         Expected: 2.0 ± 2.0e-06
>       assert {'a': 1, 'b': 3} == pytest.approx({'a': 1.7, 'b': 3})
E       assert comparison failed for 1 values:
E         Index | Obtained | Expected     
E         a     | 1        | 1.7 ± 1.7e-06
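For context on the ± values in these tables: pytest.approx's default tolerance is a relative 1e-6 with an absolute floor of 1e-12, so the tolerance shown next to each expected value is essentially 1e-6 times its magnitude. A quick sanity check (helper name is mine, not pytest API):

```python
def default_tolerance(expected, rel=1e-6, abs_floor=1e-12):
    # pytest.approx uses max(rel * abs(expected), abs_floor) by default.
    return max(rel * abs(expected), abs_floor)


print(default_tolerance(2.0))  # matches the "2.0 ± 2.0e-06" shown above
print(default_tolerance(1.7))  # matches the "1.7 ± 1.7e-06" shown above
```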

A big one (sorry for the dump)

>       assert np.array([
            [
                [1.1987311, 12412342.3],
                [3.214143244, 1423412423415.677]
            ], [
                [1, 2],
                [3, 219371297321973]
            ]
        ]) == pytest.approx(
            np.array([
                [
                    [1.12313, 12412342.3],
                    [3.214143244, 534523542345.677]
                ], [
                    [1, 2],
                    [3, 7]
                ]
            ])
        )
E       assert comparison failed for 3 values:
E         Index     | Obtained          | Expected                  
E         (0, 0, 0) | 1.1987311         | 1.12313 ± 1.1e-06         
E         (0, 1, 1) | 1423412423415.677 | 534523542345.677 ± 5.3e+05
E         (1, 1, 1) | 219371297321973.0 | 7.0 ± 7.0e-06
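For the NumPy case, locating the mismatching indices shown in the Index column is cheap: np.isclose plus np.argwhere does it. A sketch using the arrays from the big example above (np.isclose's formula differs slightly from approx's, but it identifies the same mismatches here):

```python
import numpy as np

obtained = np.array([
    [[1.1987311, 12412342.3],
     [3.214143244, 1423412423415.677]],
    [[1, 2],
     [3, 219371297321973]],
])
expected = np.array([
    [[1.12313, 12412342.3],
     [3.214143244, 534523542345.677]],
    [[1, 2],
     [3, 7]],
])

# pytest.approx defaults: relative tolerance 1e-6, absolute tolerance 1e-12.
close = np.isclose(obtained, expected, rtol=1e-6, atol=1e-12)
bad_indices = [tuple(idx) for idx in np.argwhere(~close)]
print(bad_indices)  # the three mismatching positions
```

These are exactly the (0, 0, 0), (0, 1, 1), and (1, 1, 1) rows listed in the table.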

Additional context

Just to mention, I didn't find any similar issue, so I'm starting a new one. The following issue has a similar title, but it is completely different: #6985

Please let me know if you guys like my proposal, and feel free to suggest improvements - I'll be happy to hear ideas.
Also, feel free to disagree and close this proposal, in which case I'll just move my code to my project's conftest.py ;)

Labels: topic: approx (related to the pytest.approx function)