cannot work with lambdalabs gpu #1612
I am getting the same issue.
Any feedback on this?
Hi, I am an SWE working for Lambda, and I decided to look into this problem. I know next to nothing about cog, but following the directions linked in the original report I can confirm that I can reproduce the problem. I did find that the following steps on a freshly launched instance successfully generated a file `output.0.png`, though:
Is the version of CUDA provided by Lambda Stack not supported? I ask because the first line of output from that last command is the following (note that I don't know where the "CUDA 11.8" is coming from):
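As a side note, one thing worth checking here (my suggestion, not from the thread): the "CUDA Version" that `nvidia-smi` prints is the highest CUDA runtime the installed *driver* supports, not the toolkit version, so a cog image built against CUDA 11.8 only needs the driver's reported version to be at least 11.8. A minimal sketch of extracting that value, using an illustrative sample header line:

```shell
# Sample nvidia-smi header line (illustrative; your instance will differ).
smi_header='| NVIDIA-SMI 525.85.12    Driver Version: 525.85.12    CUDA Version: 12.0     |'

# Pull out the CUDA version the driver supports.
driver_cuda=$(printf '%s\n' "$smi_header" | sed -n 's/.*CUDA Version: \([0-9.]*\).*/\1/p')
echo "driver supports up to CUDA $driver_cuda"
```

On a real instance you would pipe `nvidia-smi` itself through the same `sed` instead of the sample line.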
If there is anything that I can do to help troubleshoot this, or if there's a change to our on-demand VM base image that might prevent this in the future, please let me know.
No news on this from the Replicate team?
Hey, I have the same issue here! Any news?
I am following this tutorial:
https://replicate.com/docs/guides/get-a-gpu-machine
I run:
sudo cog predict r8.im/stability-ai/stable-diffusion@sha256:ac732df83cea7fff18b8472768c88ad041fa750ff7682a21affe81863cbe77e4 -i prompt="a pot of gold"
and get the following error:
Any feedback?
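One quick sanity check before debugging cog itself (my suggestion, not from the thread): confirm that Docker can reach the GPU through the NVIDIA container runtime, since cog runs the model inside a container. A sketch, assuming a standard CUDA base image (the tag below is an example):

```shell
# Hypothetical GPU smoke test: if this fails, `cog predict` will also
# fail to see the GPU. Run it on the GPU instance itself.
gpu_check='docker run --rm --gpus all nvidia/cuda:11.8.0-base-ubuntu22.04 nvidia-smi'
echo "$gpu_check"
# On the instance: eval "$gpu_check"
```

If `nvidia-smi` inside the container fails, the problem is the Docker/driver setup rather than cog or the model image.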