Can't utilize 4 GPU for lmdeploy #3075
Unanswered
ismailyenigul asked this question in Q&A
Replies: 2 comments 1 reply
- Please add
- Hi @lvhan028
- I have a g5.12xlarge AWS instance with 4 GPUs (24 GB of memory each) running on AWS EKS.
I enabled all 4 GPUs for this pod and verified that they are visible inside it.
When running lmdeploy, I always get the following error.
I also tried deepseek-ai/DeepSeek-V3.0.
Question 1: Is it possible to run these models with a single GPU (24 GB memory)?
Question 2: If I can't run them with a single GPU, how can I enable lmdeploy to utilize all 4 GPUs?
Thanks
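For question 2, lmdeploy supports tensor parallelism, which shards a model across multiple GPUs. A minimal sketch, assuming lmdeploy's `--tp` option (tensor-parallel degree) and the model path quoted in the question:

```shell
# Sketch: serve a model sharded across all 4 GPUs via tensor parallelism.
# --tp sets the tensor-parallel degree; it must divide the number of
# attention heads and should match the number of visible GPUs.
lmdeploy serve api_server deepseek-ai/DeepSeek-V3.0 --tp 4
```

Note that even with all 4 GPUs pooled (4 × 24 GB = 96 GB), DeepSeek-V3-class models have weights far larger than 96 GB, so a smaller model or a heavily quantized variant would be needed on this instance regardless of the parallelism setting.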