
test suite fails when hypothesis is in the environment (a test dependency of pytest) #54

Open
Apteryks opened this issue Aug 26, 2021 · 8 comments

@Apteryks

Hi,

I'm trying to update this package to 1.3.0 on GNU Guix, but I'm encountering the following test failures:

============================= test session starts ==============================
platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1 -- /gnu/store/f8s95qc6dfhl0r45m70hczw5zip0xjxq-python-wrapper-3.8.2/bin/python
cachedir: .pytest_cache
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source/.hypothesis/examples')
rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source, configfile: tox.ini
plugins: hypothesis-5.4.1, forked-1.1.3
collecting ... collected 10 items

testing/test_boxed.py::test_functional_boxed PASSED                      [ 10%]
testing/test_boxed.py::test_functional_boxed_per_test PASSED             [ 20%]
testing/test_boxed.py::test_functional_boxed_capturing[no] PASSED        [ 30%]
testing/test_boxed.py::test_functional_boxed_capturing[sys] XFAIL (c...) [ 40%]
testing/test_boxed.py::test_functional_boxed_capturing[fd] XFAIL (ca...) [ 50%]
testing/test_boxed.py::test_is_not_boxed_by_default PASSED               [ 60%]
testing/test_xfail_behavior.py::test_xfail[strict xfail] FAILED          [ 70%]
testing/test_xfail_behavior.py::test_xfail[strict xpass] FAILED          [ 80%]
testing/test_xfail_behavior.py::test_xfail[non-strict xfail] FAILED      [ 90%]
testing/test_xfail_behavior.py::test_xfail[non-strict xpass] FAILED      [100%]

=================================== FAILURES ===================================
___________________________ test_xfail[strict xfail] ___________________________

is_crashing = True, is_strict = True
testdir = <Testdir local('/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail0')>

    @pytest.mark.parametrize(
        ('is_crashing', 'is_strict'),
        (
            pytest.param(True, True, id='strict xfail'),
            pytest.param(False, True, id='strict xpass'),
            pytest.param(True, False, id='non-strict xfail'),
            pytest.param(False, False, id='non-strict xpass'),
        ),
    )
    def test_xfail(is_crashing, is_strict, testdir):
        """Test xfail/xpass/strict permutations."""
        # pylint: disable=possibly-unused-variable
        sig_num = signal.SIGTERM.numerator
    
        test_func_body = (
            'os.kill(os.getpid(), signal.SIGTERM)'
            if is_crashing
            else 'assert True'
        )
    
        if is_crashing:
            # marked xfailed and crashing, no matter strict or not
            expected_letter = 'x'  # XFAILED
            expected_lowercase = 'xfailed'
            expected_word = 'XFAIL'
        elif is_strict:
            # strict and not failing as expected should cause failure
            expected_letter = 'F'  # FAILED
            expected_lowercase = 'failed'
            expected_word = FAILED_WORD
        elif not is_strict:
            # non-strict and not failing as expected should cause xpass
            expected_letter = 'X'  # XPASS
            expected_lowercase = 'xpassed'
            expected_word = 'XPASS'
    
        session_start_title = '*==== test session starts ====*'
        loaded_pytest_plugins = 'plugins: forked*'
        collected_tests_num = 'collected 1 item'
        expected_progress = 'test_xfail.py {expected_letter!s}*'.format(**locals())
        failures_title = '*==== FAILURES ====*'
        failures_test_name = '*____ test_function ____*'
        failures_test_reason = '[XPASS(strict)] The process gets terminated'
        short_test_summary_title = '*==== short test summary info ====*'
        short_test_summary = (
            '{expected_word!s} test_xfail.py::test_function'.
            format(**locals())
        )
        if expected_lowercase == 'xpassed':
            # XPASS wouldn't have the crash message from
            # pytest-forked because the crash doesn't happen
            short_test_summary = ' '.join((
                short_test_summary, 'The process gets terminated',
            ))
        reason_string = (
            '  reason: The process gets terminated; '
            'pytest-forked reason: '
            '*:*: running the test CRASHED with signal {sig_num:d}'.
            format(**locals())
        )
        total_summary_line = (
            '*==== 1 {expected_lowercase!s} in 0.*s* ====*'.
            format(**locals())
        )
    
        expected_lines = (
            session_start_title,
            loaded_pytest_plugins,
            collected_tests_num,
            expected_progress,
        )
        if expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                failures_title,
                failures_test_name,
                failures_test_reason,
            )
        expected_lines += (
            short_test_summary_title,
            short_test_summary,
        )
        if expected_lowercase == 'xpassed' and expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                reason_string,
            )
        expected_lines += (
            total_summary_line,
        )
    
        test_module = testdir.makepyfile(
            """
            import os
            import signal
    
            import pytest
    
            # The current implementation emits RuntimeWarning.
            pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
    
            @pytest.mark.xfail(
                reason='The process gets terminated',
                strict={is_strict!s},
            )
            @pytest.mark.forked
            def test_function():
                {test_func_body!s}
            """.
            format(**locals())
        )
    
        pytest_run_result = testdir.runpytest(test_module, '-ra')
>       pytest_run_result.stdout.fnmatch_lines(expected_lines)
E       Failed: fnmatch: '*==== test session starts ====*'
E          with: '============================= test session starts =============================='
E       nomatch: 'plugins: forked*'
E           and: 'platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1'
E           and: 'rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail0'
E           and: 'plugins: hypothesis-5.4.1, forked-1.1.3'
E           and: 'collected 1 item'
E           and: ''
E           and: 'test_xfail.py x                                                          [100%]'
E           and: ''
E           and: '=========================== short test summary info ============================'
E           and: 'XFAIL test_xfail.py::test_function'
E           and: '  reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15'
E           and: '============================== 1 xfailed in 0.03s =============================='
E       remains unmatched: 'plugins: forked*'

/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source/testing/test_xfail_behavior.py:130: Failed
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail0
plugins: hypothesis-5.4.1, forked-1.1.3
collected 1 item

test_xfail.py x                                                          [100%]

=========================== short test summary info ============================
XFAIL test_xfail.py::test_function
  reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15
============================== 1 xfailed in 0.03s ==============================
___________________________ test_xfail[strict xpass] ___________________________

is_crashing = False, is_strict = True
testdir = <Testdir local('/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail1')>

    @pytest.mark.parametrize(
        ('is_crashing', 'is_strict'),
        (
            pytest.param(True, True, id='strict xfail'),
            pytest.param(False, True, id='strict xpass'),
            pytest.param(True, False, id='non-strict xfail'),
            pytest.param(False, False, id='non-strict xpass'),
        ),
    )
    def test_xfail(is_crashing, is_strict, testdir):
        """Test xfail/xpass/strict permutations."""
        # pylint: disable=possibly-unused-variable
        sig_num = signal.SIGTERM.numerator
    
        test_func_body = (
            'os.kill(os.getpid(), signal.SIGTERM)'
            if is_crashing
            else 'assert True'
        )
    
        if is_crashing:
            # marked xfailed and crashing, no matter strict or not
            expected_letter = 'x'  # XFAILED
            expected_lowercase = 'xfailed'
            expected_word = 'XFAIL'
        elif is_strict:
            # strict and not failing as expected should cause failure
            expected_letter = 'F'  # FAILED
            expected_lowercase = 'failed'
            expected_word = FAILED_WORD
        elif not is_strict:
            # non-strict and not failing as expected should cause xpass
            expected_letter = 'X'  # XPASS
            expected_lowercase = 'xpassed'
            expected_word = 'XPASS'
    
        session_start_title = '*==== test session starts ====*'
        loaded_pytest_plugins = 'plugins: forked*'
        collected_tests_num = 'collected 1 item'
        expected_progress = 'test_xfail.py {expected_letter!s}*'.format(**locals())
        failures_title = '*==== FAILURES ====*'
        failures_test_name = '*____ test_function ____*'
        failures_test_reason = '[XPASS(strict)] The process gets terminated'
        short_test_summary_title = '*==== short test summary info ====*'
        short_test_summary = (
            '{expected_word!s} test_xfail.py::test_function'.
            format(**locals())
        )
        if expected_lowercase == 'xpassed':
            # XPASS wouldn't have the crash message from
            # pytest-forked because the crash doesn't happen
            short_test_summary = ' '.join((
                short_test_summary, 'The process gets terminated',
            ))
        reason_string = (
            '  reason: The process gets terminated; '
            'pytest-forked reason: '
            '*:*: running the test CRASHED with signal {sig_num:d}'.
            format(**locals())
        )
        total_summary_line = (
            '*==== 1 {expected_lowercase!s} in 0.*s* ====*'.
            format(**locals())
        )
    
        expected_lines = (
            session_start_title,
            loaded_pytest_plugins,
            collected_tests_num,
            expected_progress,
        )
        if expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                failures_title,
                failures_test_name,
                failures_test_reason,
            )
        expected_lines += (
            short_test_summary_title,
            short_test_summary,
        )
        if expected_lowercase == 'xpassed' and expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                reason_string,
            )
        expected_lines += (
            total_summary_line,
        )
    
        test_module = testdir.makepyfile(
            """
            import os
            import signal
    
            import pytest
    
            # The current implementation emits RuntimeWarning.
            pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
    
            @pytest.mark.xfail(
                reason='The process gets terminated',
                strict={is_strict!s},
            )
            @pytest.mark.forked
            def test_function():
                {test_func_body!s}
            """.
            format(**locals())
        )
    
        pytest_run_result = testdir.runpytest(test_module, '-ra')
>       pytest_run_result.stdout.fnmatch_lines(expected_lines)
E       Failed: fnmatch: '*==== test session starts ====*'
E          with: '============================= test session starts =============================='
E       nomatch: 'plugins: forked*'
E           and: 'platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1'
E           and: 'rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail1'
E           and: 'plugins: hypothesis-5.4.1, forked-1.1.3'
E           and: 'collected 1 item'
E           and: ''
E           and: 'test_xfail.py F                                                          [100%]'
E           and: ''
E           and: '=================================== FAILURES ==================================='
E           and: '________________________________ test_function _________________________________'
E           and: '[XPASS(strict)] The process gets terminated'
E           and: '=========================== short test summary info ============================'
E           and: 'FAILED test_xfail.py::test_function'
E           and: '============================== 1 failed in 0.03s ==============================='
E       remains unmatched: 'plugins: forked*'

/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source/testing/test_xfail_behavior.py:130: Failed
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail1
plugins: hypothesis-5.4.1, forked-1.1.3
collected 1 item

test_xfail.py F                                                          [100%]

=================================== FAILURES ===================================
________________________________ test_function _________________________________
[XPASS(strict)] The process gets terminated
=========================== short test summary info ============================
FAILED test_xfail.py::test_function
============================== 1 failed in 0.03s ===============================
_________________________ test_xfail[non-strict xfail] _________________________

is_crashing = True, is_strict = False
testdir = <Testdir local('/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail2')>

    @pytest.mark.parametrize(
        ('is_crashing', 'is_strict'),
        (
            pytest.param(True, True, id='strict xfail'),
            pytest.param(False, True, id='strict xpass'),
            pytest.param(True, False, id='non-strict xfail'),
            pytest.param(False, False, id='non-strict xpass'),
        ),
    )
    def test_xfail(is_crashing, is_strict, testdir):
        """Test xfail/xpass/strict permutations."""
        # pylint: disable=possibly-unused-variable
        sig_num = signal.SIGTERM.numerator
    
        test_func_body = (
            'os.kill(os.getpid(), signal.SIGTERM)'
            if is_crashing
            else 'assert True'
        )
    
        if is_crashing:
            # marked xfailed and crashing, no matter strict or not
            expected_letter = 'x'  # XFAILED
            expected_lowercase = 'xfailed'
            expected_word = 'XFAIL'
        elif is_strict:
            # strict and not failing as expected should cause failure
            expected_letter = 'F'  # FAILED
            expected_lowercase = 'failed'
            expected_word = FAILED_WORD
        elif not is_strict:
            # non-strict and not failing as expected should cause xpass
            expected_letter = 'X'  # XPASS
            expected_lowercase = 'xpassed'
            expected_word = 'XPASS'
    
        session_start_title = '*==== test session starts ====*'
        loaded_pytest_plugins = 'plugins: forked*'
        collected_tests_num = 'collected 1 item'
        expected_progress = 'test_xfail.py {expected_letter!s}*'.format(**locals())
        failures_title = '*==== FAILURES ====*'
        failures_test_name = '*____ test_function ____*'
        failures_test_reason = '[XPASS(strict)] The process gets terminated'
        short_test_summary_title = '*==== short test summary info ====*'
        short_test_summary = (
            '{expected_word!s} test_xfail.py::test_function'.
            format(**locals())
        )
        if expected_lowercase == 'xpassed':
            # XPASS wouldn't have the crash message from
            # pytest-forked because the crash doesn't happen
            short_test_summary = ' '.join((
                short_test_summary, 'The process gets terminated',
            ))
        reason_string = (
            '  reason: The process gets terminated; '
            'pytest-forked reason: '
            '*:*: running the test CRASHED with signal {sig_num:d}'.
            format(**locals())
        )
        total_summary_line = (
            '*==== 1 {expected_lowercase!s} in 0.*s* ====*'.
            format(**locals())
        )
    
        expected_lines = (
            session_start_title,
            loaded_pytest_plugins,
            collected_tests_num,
            expected_progress,
        )
        if expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                failures_title,
                failures_test_name,
                failures_test_reason,
            )
        expected_lines += (
            short_test_summary_title,
            short_test_summary,
        )
        if expected_lowercase == 'xpassed' and expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                reason_string,
            )
        expected_lines += (
            total_summary_line,
        )
    
        test_module = testdir.makepyfile(
            """
            import os
            import signal
    
            import pytest
    
            # The current implementation emits RuntimeWarning.
            pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
    
            @pytest.mark.xfail(
                reason='The process gets terminated',
                strict={is_strict!s},
            )
            @pytest.mark.forked
            def test_function():
                {test_func_body!s}
            """.
            format(**locals())
        )
    
        pytest_run_result = testdir.runpytest(test_module, '-ra')
>       pytest_run_result.stdout.fnmatch_lines(expected_lines)
E       Failed: fnmatch: '*==== test session starts ====*'
E          with: '============================= test session starts =============================='
E       nomatch: 'plugins: forked*'
E           and: 'platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1'
E           and: 'rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail2'
E           and: 'plugins: hypothesis-5.4.1, forked-1.1.3'
E           and: 'collected 1 item'
E           and: ''
E           and: 'test_xfail.py x                                                          [100%]'
E           and: ''
E           and: '=========================== short test summary info ============================'
E           and: 'XFAIL test_xfail.py::test_function'
E           and: '  reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15'
E           and: '============================== 1 xfailed in 0.03s =============================='
E       remains unmatched: 'plugins: forked*'

/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source/testing/test_xfail_behavior.py:130: Failed
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail2
plugins: hypothesis-5.4.1, forked-1.1.3
collected 1 item

test_xfail.py x                                                          [100%]

=========================== short test summary info ============================
XFAIL test_xfail.py::test_function
  reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15
============================== 1 xfailed in 0.03s ==============================
_________________________ test_xfail[non-strict xpass] _________________________

is_crashing = False, is_strict = False
testdir = <Testdir local('/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail3')>

    @pytest.mark.parametrize(
        ('is_crashing', 'is_strict'),
        (
            pytest.param(True, True, id='strict xfail'),
            pytest.param(False, True, id='strict xpass'),
            pytest.param(True, False, id='non-strict xfail'),
            pytest.param(False, False, id='non-strict xpass'),
        ),
    )
    def test_xfail(is_crashing, is_strict, testdir):
        """Test xfail/xpass/strict permutations."""
        # pylint: disable=possibly-unused-variable
        sig_num = signal.SIGTERM.numerator
    
        test_func_body = (
            'os.kill(os.getpid(), signal.SIGTERM)'
            if is_crashing
            else 'assert True'
        )
    
        if is_crashing:
            # marked xfailed and crashing, no matter strict or not
            expected_letter = 'x'  # XFAILED
            expected_lowercase = 'xfailed'
            expected_word = 'XFAIL'
        elif is_strict:
            # strict and not failing as expected should cause failure
            expected_letter = 'F'  # FAILED
            expected_lowercase = 'failed'
            expected_word = FAILED_WORD
        elif not is_strict:
            # non-strict and not failing as expected should cause xpass
            expected_letter = 'X'  # XPASS
            expected_lowercase = 'xpassed'
            expected_word = 'XPASS'
    
        session_start_title = '*==== test session starts ====*'
        loaded_pytest_plugins = 'plugins: forked*'
        collected_tests_num = 'collected 1 item'
        expected_progress = 'test_xfail.py {expected_letter!s}*'.format(**locals())
        failures_title = '*==== FAILURES ====*'
        failures_test_name = '*____ test_function ____*'
        failures_test_reason = '[XPASS(strict)] The process gets terminated'
        short_test_summary_title = '*==== short test summary info ====*'
        short_test_summary = (
            '{expected_word!s} test_xfail.py::test_function'.
            format(**locals())
        )
        if expected_lowercase == 'xpassed':
            # XPASS wouldn't have the crash message from
            # pytest-forked because the crash doesn't happen
            short_test_summary = ' '.join((
                short_test_summary, 'The process gets terminated',
            ))
        reason_string = (
            '  reason: The process gets terminated; '
            'pytest-forked reason: '
            '*:*: running the test CRASHED with signal {sig_num:d}'.
            format(**locals())
        )
        total_summary_line = (
            '*==== 1 {expected_lowercase!s} in 0.*s* ====*'.
            format(**locals())
        )
    
        expected_lines = (
            session_start_title,
            loaded_pytest_plugins,
            collected_tests_num,
            expected_progress,
        )
        if expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                failures_title,
                failures_test_name,
                failures_test_reason,
            )
        expected_lines += (
            short_test_summary_title,
            short_test_summary,
        )
        if expected_lowercase == 'xpassed' and expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                reason_string,
            )
        expected_lines += (
            total_summary_line,
        )
    
        test_module = testdir.makepyfile(
            """
            import os
            import signal
    
            import pytest
    
            # The current implementation emits RuntimeWarning.
            pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
    
            @pytest.mark.xfail(
                reason='The process gets terminated',
                strict={is_strict!s},
            )
            @pytest.mark.forked
            def test_function():
                {test_func_body!s}
            """.
            format(**locals())
        )
    
        pytest_run_result = testdir.runpytest(test_module, '-ra')
>       pytest_run_result.stdout.fnmatch_lines(expected_lines)
E       Failed: fnmatch: '*==== test session starts ====*'
E          with: '============================= test session starts =============================='
E       nomatch: 'plugins: forked*'
E           and: 'platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1'
E           and: 'rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail3'
E           and: 'plugins: hypothesis-5.4.1, forked-1.1.3'
E           and: 'collected 1 item'
E           and: ''
E           and: 'test_xfail.py X                                                          [100%]'
E           and: ''
E           and: '=========================== short test summary info ============================'
E           and: 'XPASS test_xfail.py::test_function The process gets terminated'
E           and: '============================== 1 xpassed in 0.04s =============================='
E       remains unmatched: 'plugins: forked*'

/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source/testing/test_xfail_behavior.py:130: Failed
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail3
plugins: hypothesis-5.4.1, forked-1.1.3
collected 1 item

test_xfail.py X                                                          [100%]

=========================== short test summary info ============================
XPASS test_xfail.py::test_function The process gets terminated
============================== 1 xpassed in 0.04s ==============================
=========================== short test summary info ============================
FAILED testing/test_xfail_behavior.py::test_xfail[strict xfail] - Failed: fnm...
FAILED testing/test_xfail_behavior.py::test_xfail[strict xpass] - Failed: fnm...
FAILED testing/test_xfail_behavior.py::test_xfail[non-strict xfail] - Failed:...
FAILED testing/test_xfail_behavior.py::test_xfail[non-strict xpass] - Failed:...
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[sys]
  capture cleanup needed
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[fd]
  capture cleanup needed
==================== 4 failed, 4 passed, 2 xfailed in 1.62s ====================

The only direct dependency is pytest 6.2.4, but if you'd like to see the whole list of transitive dependencies used, here it is:

$ ./pre-inst-env guix refresh --list-transitive python-pytest-forked
python-pytest-forked@1.3.0 depends on the following 91 packages: […]
@Apteryks
Author

Note: the same test failures are observed when using Pytest 5.3.5.

@webknjaz
Member

Try checking if disabling the hypothesis plugin would fix this. We don't test with other plugins.
Here's the list of deps it didn't fail with: https://travis-ci.org/github/pytest-dev/pytest-forked/jobs/712323039#L244.

@webknjaz
Member

Also, verify you can reproduce this with the latest master. Note that currently there's no CI integrated (there was Travis but it got deprecated, disconnected, and never migrated).

@webknjaz added the bug label Aug 26, 2021
@Apteryks
Author

Apteryks commented Aug 27, 2021

Thanks for the answer!

Unfortunately, I don't have much control over the presence of hypothesis; it gets added to the environment because the pytest package in Guix depends on it for its own tests, and, due to a Guix bug, this test dependency ends up embedded in the PYTHONPATH set by the generated wrapper script for the pytest command.

However, I was able to test the build with a variant of the pytest package that comes with no dependencies, and it worked!

./pre-inst-env guix build python-pytest-forked --with-input=python-pytest@6=python-pytest-bootstrap@6
[...]
starting phase `check'
============================= test session starts ==============================
platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1 -- /gnu/store/f8s95qc6dfhl0r45m70hczw5zip0xjxq-python-wrapper-3.8.2/bin/python
cachedir: .pytest_cache
rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source, configfile: tox.ini
plugins: forked-1.3.0
collecting ... collected 10 items

testing/test_boxed.py::test_functional_boxed PASSED                      [ 10%]
testing/test_boxed.py::test_functional_boxed_per_test PASSED             [ 20%]
testing/test_boxed.py::test_functional_boxed_capturing[no] PASSED        [ 30%]
testing/test_boxed.py::test_functional_boxed_capturing[sys] XFAIL (c...) [ 40%]
testing/test_boxed.py::test_functional_boxed_capturing[fd] XFAIL (ca...) [ 50%]
testing/test_boxed.py::test_is_not_boxed_by_default PASSED               [ 60%]
testing/test_xfail_behavior.py::test_xfail[strict xfail] PASSED          [ 70%]
testing/test_xfail_behavior.py::test_xfail[strict xpass] PASSED          [ 80%]
testing/test_xfail_behavior.py::test_xfail[non-strict xfail] PASSED      [ 90%]
testing/test_xfail_behavior.py::test_xfail[non-strict xpass] PASSED      [100%]

=========================== short test summary info ============================
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[sys]
  capture cleanup needed
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[fd]
  capture cleanup needed
========================= 8 passed, 2 xfailed in 0.36s =========================
[...]

So the problem is indeed that the test suite fails in the presence of the hypothesis plugin.

@webknjaz
Member

Try adding -p no:hypothesis, then.
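For reference, a minimal sketch of that invocation driven from Python instead of the shell (assuming pytest is importable in the build environment):

import pytest

# '-p no:hypothesis' asks pytest not to register the externally
# installed hypothesis plugin for this run.
raise SystemExit(pytest.main(['-vv', '-p', 'no:hypothesis']))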

@webknjaz
Member

So from your log it seems that the expected pattern 'plugins: forked*' does not match the actual output line 'plugins: hypothesis-5.4.1, forked-1.1.3'. I think that disabling hypothesis or improving the match pattern should fix the tests.
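The mismatch is easy to reproduce in isolation; here is a minimal sketch using the same fnmatch-style globbing that fnmatch_lines() relies on:

from fnmatch import fnmatch

actual = 'plugins: hypothesis-5.4.1, forked-1.1.3'
# The pattern in the test anchors 'forked' immediately after
# 'plugins: ', so any plugin listed before it breaks the match.
print(fnmatch(actual, 'plugins: forked*'))   # False
# A leading wildcard tolerates other installed plugins.
print(fnmatch(actual, 'plugins: *forked*'))  # True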

@Apteryks
Author

Apteryks commented Aug 27, 2021

Hmm, -p no:hypothesis didn't work for me, perhaps because in Guix hypothesis is made available via PYTHONPATH; I'm not sure:

test_xfail.py X                                                          [100%]

=========================== short test summary info ============================
XPASS test_xfail.py::test_function The process gets terminated
============================== 1 xpassed in 0.01s ==============================
=========================== short test summary info ============================
FAILED testing/test_xfail_behavior.py::test_xfail[strict xfail] - Failed: fnm...
FAILED testing/test_xfail_behavior.py::test_xfail[strict xpass] - Failed: fnm...
FAILED testing/test_xfail_behavior.py::test_xfail[non-strict xfail] - Failed:...
FAILED testing/test_xfail_behavior.py::test_xfail[non-strict xpass] - Failed:...
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[sys]
  capture cleanup needed
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[fd]
  capture cleanup needed
==================== 4 failed, 4 passed, 2 xfailed in 0.43s ====================
command "pytest" "-vv" "-p" "no:hypothesis" failed with status 1

@webknjaz
Member

I think that's because the option needs to be applied to the underlying pytest invocation inside the tests (the child session started by testdir.runpytest()), not to the outer run.
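A hypothetical sketch of that (not an actual patch from this thread) would pass the option to the child session spawned inside the test, e.g. in testing/test_xfail_behavior.py:

# Disable the hypothesis plugin for the inner pytest run only.
pytest_run_result = testdir.runpytest(test_module, '-ra', '-p', 'no:hypothesis')

Relaxing the expected pattern from 'plugins: forked*' to 'plugins: *forked*' would be an alternative fix that tolerates other installed plugins.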
