
Deprecated option optimize_cuda_cache warning on import of trl #1044

Closed
ChanderG opened this issue Nov 30, 2023 · 1 comment · Fixed by #1045

Comments

ChanderG (Contributor) commented Nov 30, 2023

Simply importing trl emits the following warning:

$ python3
Python 3.10.8 (main, Nov 24 2022, 14:13:03) [GCC 11.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import trl
/opt/conda/lib/python3.10/site-packages/trl/trainer/ppo_config.py:141: UserWarning: The `optimize_cuda_cache` arguement will be deprecated soon, please use `optimize_device_cache` instead.
  warnings.warn(

Unless this is intentional, I believe the line at https://github.com/huggingface/trl/blob/main/trl/trainer/ppo_config.py#L107 should be changed from:

optimize_cuda_cache: bool = False

to:

optimize_cuda_cache: Optional[bool] = None

so that the warning is not raised for the default value.
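For reference, here is a minimal sketch of the deprecation pattern involved (illustrative only, not the actual PPOConfig implementation; the class and check below are assumptions): a __post_init__ check fires whenever the field differs from its sentinel, so a default of False triggers the warning on every construction, while a default of None does not.

# Minimal sketch of the deprecation pattern, assuming a __post_init__ check
# roughly like the one in ppo_config.py (names are illustrative, not TRL's code).
import warnings
from dataclasses import dataclass
from typing import Optional

@dataclass
class Config:
    # Old option; None acts as the "not set by the user" sentinel.
    optimize_cuda_cache: Optional[bool] = None
    # Replacement option.
    optimize_device_cache: bool = False

    def __post_init__(self):
        if self.optimize_cuda_cache is not None:
            warnings.warn(
                "The `optimize_cuda_cache` argument will be deprecated soon, "
                "please use `optimize_device_cache` instead."
            )
            self.optimize_device_cache = self.optimize_cuda_cache

Config()                           # default is None -> no warning
Config(optimize_cuda_cache=True)   # warns and maps onto optimize_device_cache

With the current default of optimize_cuda_cache: bool = False, the is-not-None check passes even when the user never set the option, which would explain the warning appearing whenever a default config is constructed.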

I can open a PR if this warning is not needed on default config creation.

lvwerra (Member) commented Nov 30, 2023

Indeed, that's correct! A PR would be great, thank you!
