CI Test Flake #190

Open

asmacdo opened this issue Sep 29, 2024 · 1 comment

Comments


asmacdo (Member) commented Sep 29, 2024

The problem appears as an uncommon test flake in CI, spotted on the PyPy 3.9 test job.

Initial impression: there could be a race condition around the Ctrl+C signal-exit test. Once that blows up, something fails to clean up correctly and causes:

FAILED test/test_formatter.py::test_execution_summary_formatted_wall_clock_time_rounded - pytest.PytestUnhandledThreadExceptionWarning: Exception in thread Thread-70
E                       stdout, stderr = process.communicate(input, timeout=timeout)
E                   ValueError: not enough values to unpack (expected 2, got 0)
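One contributing race is visible in the first failure below: the test issues a `ps aux | grep '[s]leep ...'` check a fixed 0.03 s after `thread.start()`, and `check_output` raises `CalledProcessError` whenever grep exits 1 because the child has not yet appeared in the process table. A minimal sketch of a retry wrapper that would make such a check robust (the helper name and parameters are illustrative, not part of duct's test suite):

```python
import subprocess
import time


def check_output_with_retry(cmd: str, attempts: int = 10, delay: float = 0.1) -> str:
    """Run a shell command, retrying while it exits non-zero.

    Useful when the check races with process startup: a
    ``ps aux | grep '[s]leep ...'`` probe issued immediately after
    starting the worker thread returns exit status 1 until the child
    actually shows up in the process table.
    """
    last_exc: subprocess.CalledProcessError | None = None
    for _ in range(attempts):
        try:
            return subprocess.check_output(cmd, shell=True).decode()
        except subprocess.CalledProcessError as exc:
            last_exc = exc  # no match yet; wait and try again
            time.sleep(delay)
    assert last_exc is not None
    raise last_exc
```

This trades the single fixed `sleep(0.03)` for a bounded polling loop, so the check only fails if the process never appears within `attempts * delay` seconds.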

CI Output

Run tox -e py -- -vv --cov-report=xml
py: install_deps> python -I -m pip install pytest pytest-cov
.pkg: install_requires> python -I -m pip install 'setuptools>=46.4.0'
.pkg: _optional_hooks> python /opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/site-packages/pyproject_api/_backend.py True setuptools.build_meta
.pkg: get_requires_for_build_sdist> python /opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/site-packages/pyproject_api/_backend.py True setuptools.build_meta
.pkg: get_requires_for_build_wheel> python /opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/site-packages/pyproject_api/_backend.py True setuptools.build_meta
.pkg: freeze> python -m pip freeze --all
.pkg: cffi==1.17.0.dev0,greenlet==0.4.13,hpy==0.9.0,pip==24.2,readline==6.2.4.1,setuptools==75.1.0,wheel==0.44.0
.pkg: prepare_metadata_for_build_wheel> python /opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/site-packages/pyproject_api/_backend.py True setuptools.build_meta
.pkg: build_sdist> python /opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/site-packages/pyproject_api/_backend.py True setuptools.build_meta
py: install_package> python -I -m pip install --force-reinstall --no-deps /home/runner/work/duct/duct/.tox/.tmp/package/1/con_duct-0.3.1.tar.gz
py: freeze> python -m pip freeze --all
py: cffi==1.17.0.dev0,con-duct @ file:///home/runner/work/duct/duct/.tox/.tmp/package/1/con_duct-0.3.1.tar.gz#sha256=2e7da39c15c4d0befe4ae6fe93abe7e6d241bc9bfe29987cd91dd983b4e5d7da,coverage==7.6.1,exceptiongroup==1.2.2,greenlet==0.4.13,hpy==0.9.0,iniconfig==2.0.0,packaging==24.1,pip==24.2,pluggy==1.5.0,pytest==8.3.3,pytest-cov==5.0.0,readline==6.2.4.1,setuptools==75.1.0,tomli==2.0.1,wheel==0.44.0
py: commands[0]> pytest -vv --cov-report=xml test
============================= test session starts ==============================
platform linux -- Python 3.9.19[pypy-7.3.16-final], pytest-8.3.3, pluggy-1.5.0 -- /home/runner/work/duct/duct/.tox/py/bin/python
cachedir: .tox/py/.pytest_cache
rootdir: /home/runner/work/duct/duct
configfile: tox.ini
plugins: cov-5.0.0
collecting ... collected 161 items

test/test_aggregation.py::test_aggregation_num_samples_increment PASSED  [  0%]
test/test_aggregation.py::test_aggregation_single_sample_sanity PASSED   [  1%]
test/test_aggregation.py::test_aggregation_single_stat_multiple_samples_sanity[stat0] PASSED [  1%]
test/test_aggregation.py::test_aggregation_single_stat_multiple_samples_sanity[stat1] PASSED [  2%]
test/test_aggregation.py::test_aggregation_single_stat_multiple_samples_sanity[stat2] PASSED [  3%]
test/test_aggregation.py::test_aggregation_single_stat_multiple_samples_sanity[stat3] PASSED [  3%]
test/test_aggregation.py::test_aggregation_averages PASSED               [  4%]
test/test_aggregation.py::test_aggregation_current_ave_diverges_from_total_ave PASSED [  4%]
test/test_aggregation.py::test_aggregation_many_samples[stat0] PASSED    [  5%]
test/test_aggregation.py::test_aggregation_many_samples[stat1] PASSED    [  6%]
test/test_aggregation.py::test_aggregation_many_samples[stat2] PASSED    [  6%]
test/test_aggregation.py::test_aggregation_many_samples[stat3] PASSED    [  7%]
test/test_aggregation.py::test_aggregation_sample_no_pids PASSED         [  8%]
test/test_aggregation.py::test_aggregation_no_false_peak PASSED          [  8%]
test/test_arg_parsing.py::test_duct_help PASSED                          [  9%]
test/test_arg_parsing.py::test_cmd_help PASSED                           [  9%]
test/test_arg_parsing.py::test_duct_unrecognized_arg[args0] PASSED       [ 10%]
test/test_arg_parsing.py::test_duct_unrecognized_arg[args1] PASSED       [ 11%]
test/test_arg_parsing.py::test_duct_missing_cmd PASSED                   [ 11%]
test/test_arg_parsing.py::test_abreviation_disabled PASSED               [ 12%]
test/test_execution.py::test_sanity_green PASSED                         [ 13%]
test/test_execution.py::test_execution_summary PASSED                    [ 13%]
test/test_execution.py::test_sanity_red[1] PASSED                        [ 14%]
test/test_execution.py::test_sanity_red[2] PASSED                        [ 14%]
test/test_execution.py::test_sanity_red[128] PASSED                      [ 15%]
test/test_execution.py::test_outputs_full PASSED                         [ 16%]
test/test_execution.py::test_outputs_passthrough PASSED                  [ 16%]
test/test_execution.py::test_outputs_capture PASSED                      [ 17%]
test/test_execution.py::test_outputs_none PASSED                         [ 18%]
test/test_execution.py::test_outputs_none_quiet PASSED                   [ 18%]
test/test_execution.py::test_exit_before_first_sample PASSED             [ 19%]
test/test_execution.py::test_run_less_than_report_interval PASSED        [ 19%]
test/test_execution.py::test_execute_unknown_command PASSED              [ 20%]
test/test_execution.py::test_signal_exit FAILED                          [ 21%]
test/test_formatter.py::test_execution_summary_formatted_wall_clock_time_nan PASSED [ 21%]
test/test_formatter.py::test_execution_summary_formatted_wall_clock_time_rounded FAILED [ 22%]
test/test_formatter.py::test_summary_formatter_no_vars PASSED            [ 22%]
test/test_formatter.py::test_summary_formatter_vars_provided_no_vars_in_format_string PASSED [ 23%]
test/test_formatter.py::test_summary_formatter_one_var PASSED            [ 24%]
test/test_formatter.py::test_summary_formatter_many_vars PASSED          [ 24%]
test/test_formatter.py::test_summary_formatter_missing_vars PASSED       [ 25%]
test/test_formatter.py::test_summary_formatter_none_replacement PASSED   [ 26%]
test/test_formatter.py::test_summary_formatter_S_e2e PASSED              [ 26%]
test/test_formatter.py::test_summary_formatter_S_sizes[1-1 Byte] PASSED  [ 27%]
test/test_formatter.py::test_summary_formatter_S_sizes[10-10 Bytes] PASSED [ 27%]
test/test_formatter.py::test_summary_formatter_S_sizes[100-100 Bytes] PASSED [ 28%]
test/test_formatter.py::test_summary_formatter_S_sizes[1000-1.0 kB] PASSED [ 29%]
test/test_formatter.py::test_summary_formatter_S_sizes[10000-10.0 kB] PASSED [ 29%]
test/test_formatter.py::test_summary_formatter_S_sizes[100000-100.0 kB] PASSED [ 30%]
test/test_formatter.py::test_summary_formatter_S_sizes[1000000-1.0 MB] PASSED [ 31%]
test/test_formatter.py::test_summary_formatter_S_sizes[10000000-10.0 MB] PASSED [ 31%]
test/test_formatter.py::test_summary_formatter_S_sizes[100000000-100.0 MB] PASSED [ 32%]
test/test_formatter.py::test_summary_formatter_S_sizes[1000000000-1.0 GB] PASSED [ 32%]
test/test_formatter.py::test_summary_formatter_S_sizes[10000000000-10.0 GB] PASSED [ 33%]
test/test_formatter.py::test_summary_formatter_S_sizes[100000000000-100.0 GB] PASSED [ 34%]
test/test_formatter.py::test_summary_formatter_S_sizes[1000000000000-1.0 TB] PASSED [ 34%]
test/test_formatter.py::test_summary_formatter_S_sizes[10000000000000-10.0 TB] PASSED [ 35%]
test/test_formatter.py::test_summary_formatter_S_sizes[100000000000000-100.0 TB] PASSED [ 36%]
test/test_formatter.py::test_summary_formatter_S_sizes[1000000000000000-1.0 PB] PASSED [ 36%]
test/test_formatter.py::test_summary_formatter_S_sizes[10000000000000000-10.0 PB] PASSED [ 37%]
test/test_formatter.py::test_summary_formatter_S_sizes[100000000000000000-100.0 PB] PASSED [ 37%]
test/test_formatter.py::test_summary_formatter_S_sizes[1000000000000000000-1.0 EB] PASSED [ 38%]
test/test_formatter.py::test_summary_formatter_S_sizes[10000000000000000000-10.0 EB] PASSED [ 39%]
test/test_formatter.py::test_summary_formatter_S_sizes[100000000000000000000-100.0 EB] PASSED [ 39%]
test/test_formatter.py::test_summary_formatter_S_sizes[1000000000000000000000-1.0 ZB] PASSED [ 40%]
test/test_formatter.py::test_summary_formatter_S_sizes[10000000000000000000000-10.0 ZB] PASSED [ 40%]
test/test_formatter.py::test_summary_formatter_S_sizes[100000000000000000000000-100.0 ZB] PASSED [ 41%]
test/test_formatter.py::test_summary_formatter_S_sizes[1000000000000900000000000-1.0 YB] PASSED [ 42%]
test/test_formatter.py::test_summary_formatter_S_sizes[10000000000000000000000000-10.0 YB] PASSED [ 42%]
test/test_formatter.py::test_summary_formatter_S_sizes[100000000000000000000000000-100.0 YB] PASSED [ 43%]
test/test_formatter.py::test_summary_formatter_S_sizes[1000000000000000000000000000-1000.0 YB] PASSED [ 44%]
test/test_formatter.py::test_summary_formatter_S_e2e_colors PASSED       [ 44%]
test/test_formatter.py::test_summary_formatter_E_e2e PASSED              [ 45%]
test/test_formatter.py::test_summary_formatter_E_e2e_colors PASSED       [ 45%]
test/test_formatter.py::test_summary_formatter_X_e2e PASSED              [ 46%]
test/test_formatter.py::test_summary_formatter_X_e2e_colors PASSED       [ 47%]
test/test_formatter.py::test_summary_formatter_N_e2e PASSED              [ 47%]
test/test_formatter.py::test_summary_formatter_N_e2e_colors PASSED       [ 48%]
test/test_formatter.py::test_execution_summary_formatted_wall_clock_time_invalid PASSED [ 49%]
test/test_log_paths.py::test_log_paths_filesafe_datetime_prefix PASSED   [ 49%]
test/test_log_paths.py::test_log_paths_pid_prefix PASSED                 [ 50%]
test/test_log_paths.py::test_prepare_dir_paths_available[directory/] PASSED [ 50%]
test/test_log_paths.py::test_prepare_dir_paths_available[nested/directory/] PASSED [ 51%]
test/test_log_paths.py::test_prepare_dir_paths_available[/abs/path/] PASSED [ 52%]
test/test_log_paths.py::test_prefix_with_filepart_and_directory_part[directory/pre_] PASSED [ 52%]
test/test_log_paths.py::test_prefix_with_filepart_and_directory_part[nested/directory/pre_] PASSED [ 53%]
test/test_log_paths.py::test_prefix_with_filepart_and_directory_part[/abs/path/pre_] PASSED [ 54%]
test/test_log_paths.py::test_prefix_with_filepart_only PASSED            [ 54%]
test/test_log_paths.py::test_prepare_file_paths_available_all PASSED     [ 55%]
test/test_log_paths.py::test_prepare_file_paths_available_stdout PASSED  [ 55%]
test/test_log_paths.py::test_prepare_file_paths_available_stderr PASSED  [ 56%]
test/test_log_paths.py::test_prepare_file_paths_available_no_streams PASSED [ 57%]
test/test_log_paths.py::test_prepare_paths_not_available_no_clobber PASSED [ 57%]
test/test_prepare_outputs.py::test_prepare_outputs_capture_none_output_none PASSED [ 58%]
test/test_prepare_outputs.py::test_prepare_outputs_capture_none_output_stdout PASSED [ 59%]
test/test_prepare_outputs.py::test_prepare_outputs_capture_none_output_stderr PASSED [ 59%]
test/test_prepare_outputs.py::test_prepare_outputs_capture_none_output_all PASSED [ 60%]
test/test_prepare_outputs.py::test_prepare_outputs_capture_stdout_output_none PASSED [ 60%]
test/test_prepare_outputs.py::test_prepare_outputs_capture_stdout_output_stdout PASSED [ 61%]
test/test_prepare_outputs.py::test_prepare_outputs_capture_stdout_output_stderr PASSED [ 62%]
test/test_prepare_outputs.py::test_prepare_outputs_capture_stdout_output_all PASSED [ 62%]
test/test_prepare_outputs.py::test_prepare_outputs_capture_stderr_output_none PASSED [ 63%]
test/test_prepare_outputs.py::test_prepare_outputs_capture_stderr_output_stdout PASSED [ 63%]
test/test_prepare_outputs.py::test_prepare_outputs_capture_stderr_output_stderr PASSED [ 64%]
test/test_prepare_outputs.py::test_prepare_outputs_capture_stderr_output_all PASSED [ 65%]
test/test_prepare_outputs.py::test_prepare_outputs_capture_all_output_none PASSED [ 65%]
test/test_prepare_outputs.py::test_prepare_outputs_capture_all_output_stdout PASSED [ 66%]
test/test_prepare_outputs.py::test_prepare_outputs_capture_all_output_stderr PASSED [ 67%]
test/test_prepare_outputs.py::test_prepare_outputs_capture_all_output_all PASSED [ 67%]
test/test_report.py::test_sample_max_initial_values_one_pid PASSED       [ 68%]
test/test_report.py::test_sample_max_one_pid PASSED                      [ 68%]
test/test_report.py::test_sample_max_initial_values_two_pids PASSED      [ 69%]
test/test_report.py::test_sample_maxtwo_pids PASSED                      [ 70%]
test/test_report.py::test_average_no_samples PASSED                      [ 70%]
test/test_report.py::test_averages_one_sample PASSED                     [ 71%]
test/test_report.py::test_averages_two_samples PASSED                    [ 72%]
test/test_report.py::test_averages_three_samples PASSED                  [ 72%]
test/test_report.py::test_sample_totals PASSED                           [ 73%]
test/test_report.py::test_process_stats_green[1.0-1.1-1024-1025-00:00-cmd] PASSED [ 73%]
test/test_report.py::test_process_stats_green[0.5-0.7-20.48-40.96-00:01-any] PASSED [ 74%]
test/test_report.py::test_process_stats_green[1-2-3-4-100:1000-string] PASSED [ 75%]
test/test_report.py::test_process_stats_green[0-0.0-0-0.0-999:999:999-can have spaces] PASSED [ 75%]
test/test_report.py::test_process_stats_green[2.5-3.5-8192-16384-any-for --this --kind of thing] PASSED [ 76%]
test/test_report.py::test_process_stats_green[100.0-99.9-65536-131072-string-cmd] PASSED [ 77%]
test/test_report.py::test_process_stats_red[only-1.1-1024-1025-etime-cmd] PASSED [ 77%]
test/test_report.py::test_process_stats_red[0.5-takes-20.48-40.96-some-str] PASSED [ 78%]
test/test_report.py::test_process_stats_red[1-2-one-4-anything-accepted] PASSED [ 78%]
test/test_report.py::test_process_stats_red[1-2-3-value-etime-cmd] PASSED [ 79%]
test/test_report.py::test_process_stats_red[2-fail-or-more-etime-cmd] PASSED [ 80%]
test/test_report.py::test_system_info_sanity PASSED                      [ 80%]
test/test_report.py::test_gpu_parsing_green PASSED                       [ 81%]
test/test_report.py::test_gpu_call_error PASSED                          [ 81%]
test/test_report.py::test_gpu_parse_error PASSED                         [ 82%]
test/test_tailpipe.py::test_high_throughput_stdout[ten_1] PASSED         [ 83%]
test/test_tailpipe.py::test_high_throughput_stderr[ten_1] PASSED         [ 83%]
test/test_tailpipe.py::test_high_throughput_stdout[ten_2] PASSED         [ 84%]
test/test_tailpipe.py::test_high_throughput_stderr[ten_2] PASSED         [ 85%]
test/test_tailpipe.py::test_high_throughput_stdout[ten_3] PASSED         [ 85%]
test/test_tailpipe.py::test_high_throughput_stderr[ten_3] PASSED         [ 86%]
test/test_tailpipe.py::test_high_throughput_stdout[ten_4] PASSED         [ 86%]
test/test_tailpipe.py::test_high_throughput_stderr[ten_4] PASSED         [ 87%]
test/test_tailpipe.py::test_high_throughput_stdout[ten_5] PASSED         [ 88%]
test/test_tailpipe.py::test_high_throughput_stderr[ten_5] PASSED         [ 88%]
test/test_tailpipe.py::test_high_throughput_stdout[ten_6] PASSED         [ 89%]
test/test_tailpipe.py::test_high_throughput_stderr[ten_6] PASSED         [ 90%]
test/test_tailpipe.py::test_high_throughput_stdout[ten_7] PASSED         [ 90%]
test/test_tailpipe.py::test_high_throughput_stderr[ten_7] PASSED         [ 91%]
test/test_tailpipe.py::test_close PASSED                                 [ 91%]
test/test_validation.py::test_sample_less_than_report_interval PASSED    [ 92%]
test/test_validation.py::test_sample_equal_to_report_interval PASSED     [ 93%]
test/test_validation.py::test_sample_equal_greater_than_report_interval PASSED [ 93%]
test/test_validation.py::test_assert_num_green[0] PASSED                 [ 94%]
test/test_validation.py::test_assert_num_green[1] PASSED                 [ 95%]
test/test_validation.py::test_assert_num_green[2] PASSED                 [ 95%]
test/test_validation.py::test_assert_num_green[-1] PASSED                [ 96%]
test/test_validation.py::test_assert_num_green[100] PASSED               [ 96%]
test/test_validation.py::test_assert_num_green[0.001] PASSED             [ 97%]
test/test_validation.py::test_assert_num_green[-1.68] PASSED             [ 98%]
test/test_validation.py::test_assert_num_red[hi] PASSED                  [ 98%]
test/test_validation.py::test_assert_num_red[0] PASSED                   [ 99%]
test/test_validation.py::test_assert_num_red[one] PASSED                 [100%]

=================================== FAILURES ===================================
_______________________________ test_signal_exit _______________________________

temp_output_dir = '/tmp/pytest-of-runner/pytest-0/test_signal_exit0/'

    def test_signal_exit(temp_output_dir: str) -> None:
    
        def runner() -> int:
            args = Arguments.from_argv(
                ["sleep", "60.74016230000801"],
                output_prefix=temp_output_dir,
            )
            return execute(args)
    
        thread = threading.Thread(target=runner)
        thread.start()
        sleep(0.03)  # make sure the process is started
        ps_command = "ps aux | grep '[s]leep 60.74016230000801'"  # brackets to not match grep process
>       ps_output = subprocess.check_output(ps_command, shell=True).decode()

test/test_execution.py:216: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/subprocess.py:424: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

input = None, capture_output = False, timeout = None, check = True
popenargs = ("ps aux | grep '[s]leep 60.74016230000801'",)
kwargs = {'shell': True, 'stdout': -1}
process = <Popen: returncode: 1 args: "ps aux | grep '[s]leep 60.74016230000801'">
stdout = b'', stderr = None, retcode = 1

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.
    
        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
    
        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return code
        in the returncode attribute, and output & stderr attributes if those streams
        were captured.
    
        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.
    
        There is an optional argument "input", allowing you to
        pass bytes or a string to the subprocess's stdin.  If you use this argument
        you may not also use the Popen constructor's "stdin" argument, as
        it will be used internally.
    
        By default, all communication is in bytes, and therefore any "input" should
        be bytes, and the stdout and stderr will be bytes. If in text mode, any
        "input" should be a string, and stdout and stderr will be strings decoded
        according to locale encoding, or by "encoding" if set. Text mode is
        triggered by setting any of text, encoding, errors or universal_newlines.
    
        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE
    
        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE
    
        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads.  communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command 'ps aux | grep '[s]leep 60.74016230000801'' returned non-zero exit status 1.

/opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/subprocess.py:528: CalledProcessError
___________ test_execution_summary_formatted_wall_clock_time_rounded ___________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_and_report.<locals>.<lambda> at 0x00007f717ea6afc0>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: Callable[[], TResult],
        when: Literal["collect", "setup", "call", "teardown"],
        reraise: type[BaseException] | tuple[type[BaseException], ...] | None = None,
    ) -> CallInfo[TResult]:
        """Call func, wrapping the result in a CallInfo.
    
        :param func:
            The function to call. Called without arguments.
        :type func: Callable[[], _pytest.runner.TResult]
        :param when:
            The phase in which the function is called.
        :param reraise:
            Exception or exceptions that shall propagate if raised by the
            function, instead of being wrapped in the CallInfo.
        """
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: TResult | None = func()

.tox/py/lib/pypy3.9/site-packages/_pytest/runner.py:341: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.tox/py/lib/pypy3.9/site-packages/_pytest/runner.py:242: in <lambda>
    lambda: runtest_hook(item=item, **kwds), when=when, reraise=reraise
.tox/py/lib/pypy3.9/site-packages/pluggy/_hooks.py:513: in __call__
    return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult)
.tox/py/lib/pypy3.9/site-packages/pluggy/_manager.py:120: in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
.tox/py/lib/pypy3.9/site-packages/_pytest/threadexception.py:92: in pytest_runtest_call
    yield from thread_exception_runtest_hook()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

    def thread_exception_runtest_hook() -> Generator[None]:
        with catch_threading_exception() as cm:
            try:
                yield
            finally:
                if cm.args:
                    thread_name = (
                        "<unknown>" if cm.args.thread is None else cm.args.thread.name
                    )
                    msg = f"Exception in thread {thread_name}\n\n"
                    msg += "".join(
                        traceback.format_exception(
                            cm.args.exc_type,
                            cm.args.exc_value,
                            cm.args.exc_traceback,
                        )
                    )
>                   warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
E                   pytest.PytestUnhandledThreadExceptionWarning: Exception in thread Thread-70
E                   
E                   Traceback (most recent call last):
E                     File "/opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/threading.py", line 980, in _bootstrap_inner
E                       self.run()
E                     File "/opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/threading.py", line 917, in run
E                       self._target(*self._args, **self._kwargs)
E                     File "/home/runner/work/duct/duct/.tox/py/lib/pypy3.9/site-packages/con_duct/__main__.py", line 808, in monitor_process
E                       sample = report.collect_sample()
E                     File "/home/runner/work/duct/duct/.tox/py/lib/pypy3.9/site-packages/con_duct/__main__.py", line 427, in collect_sample
E                       output = subprocess.check_output(
E                     File "/opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/subprocess.py", line 424, in check_output
E                       return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
E                     File "/opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/subprocess.py", line 507, in run
E                       stdout, stderr = process.communicate(input, timeout=timeout)
E                   ValueError: not enough values to unpack (expected 2, got 0)

.tox/py/lib/pypy3.9/site-packages/_pytest/threadexception.py:82: PytestUnhandledThreadExceptionWarning
=========================== short test summary info ============================
FAILED test/test_execution.py::test_signal_exit - subprocess.CalledProcessError: Command 'ps aux | grep '[s]leep 60.74016230000801'' returned non-zero exit status 1.
FAILED test/test_formatter.py::test_execution_summary_formatted_wall_clock_time_rounded - pytest.PytestUnhandledThreadExceptionWarning: Exception in thread Thread-70

Traceback (most recent call last):
  File "/opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/threading.py", line 980, in _bootstrap_inner
    self.run()
  File "/opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/threading.py", line 917, in run
    self._target(*self._args, **self._kwargs)
  File "/home/runner/work/duct/duct/.tox/py/lib/pypy3.9/site-packages/con_duct/__main__.py", line 808, in monitor_process
    sample = report.collect_sample()
  File "/home/runner/work/duct/duct/.tox/py/lib/pypy3.9/site-packages/con_duct/__main__.py", line 427, in collect_sample
    output = subprocess.check_output(
  File "/opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/subprocess.py", line 424, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/subprocess.py", line 507, in run
    stdout, stderr = process.communicate(input, timeout=timeout)
ValueError: not enough values to unpack (expected 2, got 0)
======================== 2 failed, 159 passed in 23.59s ========================
Exception in thread Thread-68:
Traceback (most recent call last):
  File "/opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/threading.py", line 980, in _bootstrap_inner
    self.run()
  File "/opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/threading.py", line 917, in run
    self._target(*self._args, **self._kwargs)
  File "/home/runner/work/duct/duct/.tox/py/lib/pypy3.9/site-packages/con_duct/__main__.py", line 874, in _tail
    self.buffer.flush()
ValueError: I/O operation on closed file
Exception in thread Thread-69:
Traceback (most recent call last):
  File "/opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/threading.py", line 980, in _bootstrap_inner
    self.run()
  File "/opt/hostedtoolcache/PyPy/3.9.19/x64/lib/pypy3.9/threading.py", line 917, in run
    self._target(*self._args, **self._kwargs)
  File "/home/runner/work/duct/duct/.tox/py/lib/pypy3.9/site-packages/con_duct/__main__.py", line 874, in _tail
    self.buffer.flush()
ValueError: I/O operation on closed file
py: exit 1 (78.16 seconds) /home/runner/work/duct/duct> pytest -vv --cov-report=xml test pid=1885
  py: FAIL code 1 (91.20=setup[13.04]+cmd[78.16] seconds)
  evaluation failed :( (91.42 seconds)
Error: Process completed with exit code 1.
yarikoptic (Member) commented

Interestingly, that function is not yet type annotated :-/

❯ git grep 'def communicate'
Lib/asyncio/subprocess.py:    async def communicate(self, input=None):
Lib/subprocess.py:    def communicate(self, input=None, timeout=None):

But looking at v3.9.19 and the current version as of v3.13.0a6-2252-g7e7223e18f5, I do not see how it could potentially return an empty tuple. Mystery...
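For context on why this surfaces as a warning against an unrelated test: pytest's `threadexception` plugin (visible in the traceback above) captures exceptions that escape background threads and reports them against whichever test happens to be running when the capture window closes. A minimal sketch of the same mechanism using `threading.excepthook` (available since Python 3.8; the names here are illustrative, not duct code):

```python
import threading

captured = []


def hook(args: threading.ExceptHookArgs) -> None:
    # Record exceptions escaping any thread instead of letting them
    # print to stderr; this mirrors what pytest's threadexception
    # plugin does before turning them into warnings.
    captured.append((args.thread.name if args.thread else "<unknown>",
                     args.exc_type, args.exc_value))


threading.excepthook = hook


def boom() -> None:
    raise ValueError("not enough values to unpack (expected 2, got 0)")


t = threading.Thread(target=boom, name="monitor")
t.start()
t.join()
```

Because the hook fires whenever the thread dies, not when the owning test runs, a failure in the signal-exit test's monitor thread can get attributed to the next test in the session, which matches the `test_execution_summary_formatted_wall_clock_time_rounded` failure here.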
