
Pydantic problem #23

Open
bacoco opened this issue Aug 7, 2023 · 6 comments

bacoco commented Aug 7, 2023

Hello,

If we update the pydantic lib to the latest major version (v2), we get a model_dump problem.

Thanks

@philschmid (Owner)

What is the error you are getting?

van51 commented Aug 12, 2023

Hi! I also ran into this issue, hope that the following traceback helps:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[60], line 9
      6 huggingface.prompt_builder = "llama2"
----> 9 response = huggingface.ChatCompletion.create(
     10     messages=[
     11         {"role": "system", "content": "\nYou are a helpful assistant speaking like a pirate. argh!"},
     12         {"role": "user", "content": "What is the sun?"},
     13     ],
     14     temperature=0.9,
     15     top_p=0.6,
     16     max_tokens=256,
     17 )
     19 print(response)

File python3.11/site-packages/easyllm/clients/huggingface.py:225, in ChatCompletion.create(messages, model, temperature, top_p, top_k, n, max_tokens, stop, stream, frequency_penalty, debug)
    222 prompt_tokens = int(len(prompt) / 4)
    223 total_tokens = prompt_tokens + generated_tokens
--> 225 return dump_object(
    226     ChatCompletionResponse(
    227         model=request.model,
    228         choices=choices,
    229         usage=Usage(
    230             prompt_tokens=prompt_tokens, completion_tokens=generated_tokens, total_tokens=total_tokens
    231         ),
    232     )
    233 )

File python3.11/site-packages/easyllm/schema/base.py:12, in dump_object(object)
     10     return object.dict()
     11 else:
---> 12     return object.model_dump(exclude_none=True)

AttributeError: 'ChatCompletionResponse' object has no attribute 'model_dump'
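The traceback points at easyllm's `dump_object` helper in `easyllm/schema/base.py`, which chooses between the Pydantic v1 `.dict()` and v2 `.model_dump()` APIs. A minimal sketch of a version-agnostic variant (an illustration, not easyllm's actual fix: it feature-detects the method on the model instead of checking the installed pydantic version string, which avoids this AttributeError when the version check and the installed package disagree):

```python
# Sketch of a version-agnostic dump helper. Pydantic v2 renamed
# BaseModel.dict() to model_dump(), so feature-detecting the method
# works under either major version.

def dump_object(obj):
    """Serialize a Pydantic model to a dict under Pydantic v1 or v2."""
    if hasattr(obj, "model_dump"):
        # Pydantic v2 API.
        return obj.model_dump(exclude_none=True)
    # Pydantic v1 API.
    return obj.dict(exclude_none=True)
```

Pydantic v2 still ships a deprecated `.dict()` shim, so probing for `model_dump` first keeps the non-deprecated path preferred on v2 while remaining compatible with v1.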

@philschmid (Owner)

Can you please share the versions you have installed?

@murdadesmaeeli

Hi @van51, are you still facing the issue?


van51 commented Nov 21, 2023

Oops, sorry, I had missed the message above. Unfortunately, I'm not sure which versions were installed at the time.
However, I did a fresh install now and everything works fine.


murdadesmaeeli commented Nov 26, 2023

@philschmid this issue can be closed 👍
