
This software in its current form is useless #133

Open
gosforth opened this issue Feb 21, 2024 · 5 comments

Comments

@gosforth

docker compose up -d --build
Sometimes it ends with success, sometimes not. If you are lucky, it succeeds after several tries (just execute the same command several times... without any changes).
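The "execute the same command several times" workaround can be scripted as a small retry wrapper (a sketch; `retry` is a hypothetical helper, not part of genai-stack, and it assumes the failures are transient as described above):

```shell
# Retry a command up to N times with a short pause between attempts.
retry() {
  n=$1; shift
  i=1
  while [ "$i" -le "$n" ]; do
    "$@" && return 0          # succeed as soon as the command succeeds
    echo "attempt $i failed, retrying..." >&2
    i=$((i + 1))
    sleep 2
  done
  return 1                     # all attempts failed
}

# usage:
# retry 5 docker compose up -d --build
```

This only papers over the underlying flakiness, but it saves re-typing the command by hand.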

Starting the Docker containers in most cases ends with failure.
Deleting containers/images from the GUI is not possible.

Sorry guys, but the quality is below any standard.

@slimslenderslacks
Collaborator

Just looking at a few of the other issues you posted: you're running on WSL2, correct? Could you post the version of Docker Desktop that you're currently using? docker compose down should remove containers but not images.

It'd be great if you could send which services are failing to start or only starting intermittently. Are you seeing this for database, pull-model, loader, bot, pdf_bot, api, front-end?
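One way to answer that is to loop over those services and dump each one's status and recent logs (a sketch; it assumes it is run from the compose project directory, and the `|| true` keeps the loop going when a service does not exist):

```shell
# Services from the genai-stack compose file, as listed above.
SERVICES="database pull-model loader bot pdf_bot api front-end"

for svc in $SERVICES; do
  echo "== $svc =="
  # show the container state for this service, then its recent output
  docker compose ps "$svc" || true
  docker compose logs --tail 20 "$svc" || true
done
```

Pasting that output into the issue would show which services fail and why.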

@i-Am-GhOsT

i-Am-GhOsT commented Feb 22, 2024

I thought I was the only one who is not able to bring up the genai-stack from WSL, but I am glad it has been reported.

Here are the binary versions I am using. It breaks often.

I have two WSL environments, one Ubuntu 22.04 and one Kali. I am trying to run the stack from WSL2 Ubuntu 22.04.

> uname -a
Linux TuxTechLab-ControlPlane 5.15.133.1-microsoft-standard-WSL2 #1 SMP Thu Oct 5 21:02:42 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux

> docker --version
Docker version 25.0.2, build 29cf629

> docker-compose --version
Docker Compose version v2.24.3-desktop.1

> docker image ls
REPOSITORY               TAG       IMAGE ID       CREATED          SIZE
genai-stack/pull-model   latest    b89180db2b0b   10 minutes ago   455MB
ollama/ollama            latest    7e0b7758e65f   18 hours ago     419MB
neo4j                    5.11      e03a2a7d1b29   5 months ago     495MB

> ollama --version
ollama version is 0.1.26

>ollama list     
NAME            ID              SIZE    MODIFIED    
llama2:7b       78e26419b446    3.8 GB  6 hours ago

I have only set LLM, EMBEDDING_MODEL, and the NEO4J_* variables, and I am running the command below:

docker-compose up -d --build --remove-orphans --force-recreate

Am I doing anything wrong here? I would be happy to contribute a fix for WSL support in this project, if contributions are accepted.


Compose Build Failed Log

[+] Building 338.8s (32/52)                                                                                           docker:default 
 => => transferring dockerfile: 573B                                                                                            0.1s 
 => [pdf_bot internal] load metadata for docker.io/langchain/langchain:latest                                                   2.0s 
 => [pdf_bot internal] load build definition from pdf_bot.Dockerfile                                                            0.1s 
 => => transferring dockerfile: 554B                                                                                            0.0s 
 => [bot auth] langchain/langchain:pull token for registry-1.docker.io                                                          0.0s 
 => [bot internal] load .dockerignore                                                                                           0.1s 
 => => transferring context: 120B                                                                                               0.1s 
 => [api internal] load .dockerignore                                                                                           0.1s 
 => => transferring context: 120B                                                                                               0.0s 
 => [pdf_bot internal] load .dockerignore                                                                                       0.1s 
 => => transferring context: 120B                                                                                               0.0s 
 => => transferring context: 120B                                                                                               0.0s 
 => [api 1/8] FROM docker.io/langchain/langchain:latest@sha256:72f8f54e130c711b17dd025d19c96204a2faf044d85caa0a31d7f1f5bff3b58  0.0s 
 => [pdf_bot internal] load build context                                                                                       0.1s 
 => => transferring context: 2.86kB                                                                                             0.1s 
 => [bot internal] load build context                                                                                           0.1s 
 => => transferring context: 121B                                                                                               0.1s 
 => [api internal] load build context                                                                                           0.1s 
 => => transferring context: 4.29kB                                                                                             0.1s 
 => [loader internal] load build context                                                                                        0.1s 
 => => transferring context: 191B                                                                                               0.1s 
 => => exporting layers                                                                                                        35.1s 
 => => writing image sha256:35c906ef762abc9de65f43c91b3bdadd551c416682e838dc04aee50d12db22ab                                    0.0s 
 => => naming to docker.io/library/genai-stack-pdf_bot                                                                          0.0s 
 => [bot] exporting to image                                                                                                   35.2s 
 => => exporting layers                                                                                                        35.1s 
 => => writing image sha256:e8648a1e7c6a680a4028b7cd8ed5e3412ef2ae3f587b85179e12db53fc843a29                                    0.0s 
 => => naming to docker.io/library/genai-stack-bot                                                                              0.0s 
 => [api] exporting to image                                                                                                   35.2s 
 => => exporting layers                                                                                                        35.1s 
 => => writing image sha256:732316c55ea41e2e837177d375fa533fc41f56b7c27b15a2008907ba560bcc34                                    0.0s 
 => => naming to docker.io/library/genai-stack-api                                                                              0.0s 
 => [loader] exporting to image                                                                                                35.1s 
 => => exporting layers                                                                                                        35.1s 
 => => writing image sha256:9a88b4fba4dae083b04eb9d8bb386e35fa318bbd4aa5f2ed79ebabc799596280                                    0.0s 
 => => naming to docker.io/library/genai-stack-loader                                                                           0.0s 
 => [front-end internal] load build definition from front-end.Dockerfile                                                        0.0s 
 => => transferring dockerfile: 174B                                                                                            0.0s 
 => ERROR [front-end internal] load metadata for docker.io/library/node:alpine                                                 10.9s 
------
 > [front-end internal] load metadata for docker.io/library/node:alpine:
------
failed to solve: node:alpine: error getting credentials - err: exit status 1, out: ``
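That final error (`error getting credentials - err: exit status 1`) is, in my understanding, commonly caused by a `credsStore` entry in `~/.docker/config.json` that points at a credential helper (often `desktop.exe`) which is not reachable from inside WSL2. A hedged sketch of the fix, demonstrated on a throwaway copy of the config (to apply it for real, make the same edit to `~/.docker/config.json`, keeping a backup first):

```shell
# Make a throwaway config that mimics the problematic one.
demo=$(mktemp -d)
printf '{\n  "credsStore": "desktop.exe",\n  "auths": {}\n}\n' > "$demo/config.json"

# Remove the credsStore key; the docker client then falls back to storing
# auth in the config file itself instead of calling the missing helper.
sed -i 's/"credsStore"[^,}]*,\{0,1\}//' "$demo/config.json"

cat "$demo/config.json"
```

If removing `credsStore` fixes the `node:alpine` pull, that would confirm the credential-helper theory.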

@zooninja
Contributor

Hello,

I can confirm that the stack works for me both on Linux and on Windows with WSL. I can't comment on Mac because I don't have one at the moment.
It would be useful if you post your .env configuration so we can check whether something needs to be corrected.
Also, the statement that "this software in its current form is useless" is not valid. This is a sample stack that shows how you could apply the given combo; it is not intended to be an out-of-the-box product.
You will need to put in some effort and adapt the stack to your use case.
Let's start with a review of the .env and we can then try to troubleshoot your issue.

@i-Am-GhOsT

Hi @zooninja,
Thanks for addressing this. I also agree that we need to troubleshoot our issues.

  • For Mac, I can confirm it works. I tried the stack on my Mac M1. It works fine.

  • I have used the below .env file to bring up the stack. I have been trying to troubleshoot the local issues since this morning, and they are not the same every time, so I am looking for some help here in case anyone else is also using WSL2 for this.

    #*****************************************************************
    # LLM and Embedding Model
    #*****************************************************************
    LLM=llama2:13b #or any Ollama model tag, gpt-4, gpt-3.5, or claudev2
    EMBEDDING_MODEL=sentence_transformer #or google-genai-embedding-001 openai, ollama, or aws
    
    #*****************************************************************
    # Neo4j
    #*****************************************************************
    NEO4J_URI=neo4j://database:7687
    NEO4J_USERNAME=neo4j
    NEO4J_PASSWORD=fossisawsome
    
    #*****************************************************************
    # Langchain
    #*****************************************************************
    # Optional for enabling Langchain Smith API
    
    #LANGCHAIN_TRACING_V2=true # false
    #LANGCHAIN_ENDPOINT="https://api.smith.langchain.com"
    #LANGCHAIN_PROJECT=#your-project-name
    #LANGCHAIN_API_KEY=#your-api-key ls_...
    
    #*****************************************************************
    # Ollama
    #*****************************************************************
    OLLAMA_BASE_URL=http://host.docker.internal:11434
    
    #*****************************************************************
    # OpenAI
    #*****************************************************************
    # Only required when using OpenAI LLM or embedding model
    
    #OPENAI_API_KEY=sk-...
    
    #*****************************************************************
    # AWS
    #*****************************************************************
    # Only required when using AWS Bedrock LLM or embedding model
    
    #AWS_ACCESS_KEY_ID=
    #AWS_SECRET_ACCESS_KEY=
    #AWS_DEFAULT_REGION=us-east-1
    
    #*****************************************************************
    # GOOGLE
    #*****************************************************************
    # Only required when using GoogleGenai LLM or embedding model
    # GOOGLE_API_KEY=

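One detail worth double-checking in the .env above: it sets `LLM=llama2:13b`, but the earlier `ollama list` output shows only `llama2:7b` installed. If the pull-model step cannot fetch the 13b tag, pointing LLM at the tag that is actually present is worth trying (a guess from the outputs in this thread, not a confirmed fix):

```shell
# .env: match the model tag that `ollama list` actually shows
LLM=llama2:7b
```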
@gosforth
Author

gosforth commented Mar 1, 2024

I deleted this from my PC and I am not going back; I lost my time.
As I wrote: sometimes the build was working and the software was up and running, but after restarting the PC it was not.
When I tried to stop the containers, the GUI did not respond. It was not even possible to delete images and containers from the GUI; I had to use the command prompt (Windows).
So Docker itself is far from being solid software.
I decided to build RAG with pure Python (genai-stack is just way too heavy)... and it was not working.

Anyway, the user: neo4j:neo4j line has to be deleted to build it (other changes as well... I do not remember what).
