[Bug]: [SWE-bench] Failed to cd. ModuleNotFoundError #6431

Open · 1 task done
BIJOY-SUST opened this issue Jan 23, 2025 · 1 comment
Labels: bug (Something isn't working), evaluation (Related to running evaluations with OpenHands)

BIJOY-SUST commented Jan 23, 2025

Is there an existing issue for the same bug?

  • I have checked the existing issues.

Describe the bug and reproduction steps

I've encountered the following issue intermittently: sometimes the process works as expected, and sometimes it doesn't. A specific example occurred when running the following command:

./evaluation/benchmarks/swe_bench/scripts/run_infer.sh llm.qwen_model HEAD CodeActAgent 1 30 1 princeton-nlp/SWE-bench_Lite test

The issue is associated with django__django-12915.

evaluation.utils.shared.EvalException: Failed to cd to /workspace/django__django__3.2: **CmdOutputObservation (source=None, exit code=2)**
---------------------------------
---------------------------------
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 953, in _find_and_load_unlocked
ModuleNotFoundError: No module named '/workspace/django__django__3'
---------------------------------
---------------------------------
ERROR:root:<class 'RuntimeError'>: Maximum error retries reached for instance django__django-12915
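
For reference, the truncated module name in the ModuleNotFoundError above ('/workspace/django__django__3' instead of '...__3.2') looks like the workspace path being treated somewhere as a dotted module name: importlib splits the name on '.', so it fails on the "parent package" first. A standalone illustration of just that error message (this is not OpenHands code, and I'm not sure where in the pipeline the path ends up being imported):

import importlib

# The directory name contains a dot ('django__django__3.2'), so importlib
# treats '/workspace/django__django__3' as the parent package and fails on it,
# which is why the reported module name is missing the trailing '.2'.
importlib.import_module('/workspace/django__django__3.2')
# ModuleNotFoundError: No module named '/workspace/django__django__3'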

OpenHands Installation

Docker command in README

OpenHands Version

0.18.0

Operating System

Linux

Logs, Errors, Screenshots, and Additional Context

No response

BIJOY-SUST added the bug (Something isn't working) label on Jan 23, 2025
enyst changed the title from "[Bug]: Failed to cd. ModuleNotFoundError" to "[Bug]: [SWE-bench] Failed to cd. ModuleNotFoundError" on Jan 23, 2025
BIJOY-SUST (Author) commented

I'm facing this issue for other instances as well:

evaluation.utils.shared.EvalException: Failed to cd to /workspace/matplotlib__matplotlib__3.5: **CmdOutputObservation (source=None, exit code=2)**
0
Instance matplotlib__matplotlib-22711 - 2025-01-23 15:53:13,587 - ERROR - Failed to cd to /workspace/matplotlib__matplotlib__3.5: **CmdOutputObservation (source=None, exit code=2)**
0
Traceback (most recent call last):
  File "project_name/evaluation/utils/shared.py", line 330, in _process_instance_wrapper
    result = process_instance_func(instance, metadata, use_mp, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "project_name/evaluation/benchmarks/swe_bench/run_infer.py", line 419, in process_instance
    return_val = complete_runtime(runtime, instance)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "project_name/evaluation/benchmarks/swe_bench/run_infer.py", line 311, in complete_runtime
    assert_and_raise(
  File "project_name/evaluation/utils/shared.py", line 307, in assert_and_raise
    raise EvalException(msg)
evaluation.utils.shared.EvalException: Failed to cd to /workspace/matplotlib__matplotlib__3.5: **CmdOutputObservation (source=None, exit code=2)**
0
Instance matplotlib__matplotlib-23987 - 2025-01-23 15:53:13,589 - INFO - Starting evaluation for instance matplotlib__matplotlib-23987.
Hint: run "tail -f evaluation/evaluation_outputs/outputs/princeton-nlp__SWE-bench_Lite-test/CodeActAgent/_maxiter_30_N_v0.18.0-no-hint-swe_gym_7b_sft_d_1234_e_3_v4_merge-run_1/infer_logs/instance_matplotlib__matplotlib-23987.log" to see live logs in a separate shell
ERROR:root:  File "project_name/evaluation/benchmarks/swe_bench/run_infer.py", line 531, in <module>
    run_evaluation(
  File "project_name/evaluation/utils/shared.py", line 440, in run_evaluation
    for result in results:
                  ^^^^^^^
  File "/home/user_name/.conda/envs/openhands/lib/python3.12/multiprocessing/pool.py", line 873, in next
    raise value

ERROR:root:<class 'RuntimeError'>: Maximum error retries reached for instance matplotlib__matplotlib-22711
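
For context, the traceback points at the workspace check in complete_runtime. Below is a rough sketch of what that check appears to do, reconstructed only from the frames above; the real code lives in evaluation/benchmarks/swe_bench/run_infer.py and evaluation/utils/shared.py, and the names and signatures here are illustrative, not copied from the source.

class EvalException(Exception):
    pass

def assert_and_raise(condition: bool, msg: str) -> None:
    # Raise a catchable exception instead of asserting, so the evaluation loop
    # can log the failure and retry the instance (up to the retry limit).
    if not condition:
        raise EvalException(msg)

def ensure_workspace(run_cmd, workspace_dir: str) -> None:
    # run_cmd is assumed to execute a shell command inside the sandbox and
    # return an observation with an exit_code attribute (CmdOutputObservation-like).
    obs = run_cmd(f'cd {workspace_dir}')
    assert_and_raise(
        obs.exit_code == 0,
        f'Failed to cd to {workspace_dir}: {obs}',
    )

In other words, the EvalException is raised whenever the cd inside the sandbox returns a non-zero exit code, and after enough retries that becomes the "Maximum error retries reached" RuntimeError.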

mamoodi added the evaluation (Related to running evaluations with OpenHands) label on Jan 30, 2025