| author | Simon Glass <sjg@chromium.org> | 2024-10-09 18:28:59 -0600 |
|---|---|---|
| committer | Tom Rini <trini@konsulko.com> | 2024-10-15 10:24:27 -0600 |
| commit | 4018a08e423cf0e8eda78d986ff915112485147b (patch) | |
| tree | 131f59c77d87d5b7d3a016537b3be8062d6fb0a1 /test/py/u_boot_spawn.py | |
| parent | 189c4d9f5ce48a79bcf938f16061c7de11b75d34 (diff) | |
test: Avoid failing skipped tests

When a test returns -EAGAIN this should not be considered a failure.

Fix what seems to be a problem case, where the pytests see a failure
when a test has merely been skipped.

We cannot squash the -EAGAIN error in ut_run_test() since the failure
count is incremented by its caller, ut_run_test_live_flat().

The specific example here is on snow, where a test is compiled into the
image but cannot run, so returns -EAGAIN to skip:

    test/py/tests/test_ut.py sssnow # ut bdinfo bdinfo_test_eth
    Test: bdinfo_test_eth: bdinfo.c
    Skipping: Console recording disabled
    test/test-main.c:486, ut_run_test_live_flat(): 0 == ut_run_test(uts,
    test, test->name): Expected 0x0 (0), got 0xfffffff5 (-11)
    Test bdinfo_test_eth failed 1 times
    Skipped: 1, Failures: 1
    snow # F+u-boot-test-reset snow snow

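For illustration, here is a stand-alone sketch of the skip convention referred to above, under the assumption that a test whose precondition is not met prints why and returns -EAGAIN so it can be counted as skipped rather than failed. None of the names below (console_recording_enabled(), example_eth_info_test()) come from the U-Boot tree; this is not the actual bdinfo test.

```c
/*
 * Illustrative sketch only; all names are invented and this is not
 * U-Boot code. A test that cannot run in the current configuration
 * prints the reason and returns -EAGAIN to request a skip.
 */
#include <errno.h>
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical stand-in for the real configuration check */
static bool console_recording_enabled(void)
{
	return false;	/* pretend recording is disabled, as on snow */
}

static int example_eth_info_test(void)
{
	if (!console_recording_enabled()) {
		printf("Skipping: Console recording disabled\n");
		return -EAGAIN;		/* signal a skip, not a failure */
	}

	/* ... the real assertions would go here ... */
	return 0;
}

int main(void)
{
	int ret = example_eth_info_test();

	printf("test returned %d (%s)\n", ret,
	       ret == -EAGAIN ? "skipped" : ret ? "failed" : "passed");

	return 0;
}
```
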
The fix is simply to respect the return code from ut_run_test(), so do
that.
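
To make the intent concrete, the following is a hedged, self-contained sketch of that idea rather than the actual change to test/test-main.c. The ut_run_test() / ut_run_test_live_flat() split is modelled with invented names (run_one_test(), run_and_count(), struct run_state): the caller that owns the counters treats -EAGAIN as a skip instead of a failure.

```c
/*
 * Illustrative sketch only; not the actual U-Boot patch and all names
 * below are invented. It models the caller/callee split described
 * above: the callee returns -EAGAIN for a skip, and the caller, which
 * owns the skip/failure counters, must not count that as a failure.
 */
#include <errno.h>
#include <stdio.h>

struct run_state {
	int skip_count;
	int fail_count;
};

/* Stand-in for ut_run_test(): 0 = pass, -EAGAIN = skip, other = failure */
static int run_one_test(const char *name)
{
	return (name[0] == 's') ? -EAGAIN : 0;	/* fake outcome for the demo */
}

/* Stand-in for ut_run_test_live_flat(): respects the skip return code */
static void run_and_count(struct run_state *st, const char *name)
{
	int ret = run_one_test(name);

	if (ret == -EAGAIN)
		st->skip_count++;	/* skipped, not failed */
	else if (ret)
		st->fail_count++;
}

int main(void)
{
	struct run_state st = { 0, 0 };

	run_and_count(&st, "skipped_test");
	run_and_count(&st, "passing_test");
	printf("Skipped: %d, Failures: %d\n", st.skip_count, st.fail_count);

	return 0;
}
```

With this arrangement a skipped test contributes to "Skipped" but leaves "Failures" at zero, which is what the pytest wrapper needs in order not to mark the run as failed.
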
Signed-off-by: Simon Glass <sjg@chromium.org>
Diffstat (limited to 'test/py/u_boot_spawn.py')
0 files changed, 0 insertions, 0 deletions