
Error: Unknown model for all Groq models in ChatGroq #5364

Open
abishekdevendran opened this issue May 14, 2024 · 4 comments
Assignees
Labels
auto:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments


abishekdevendran commented May 14, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

// llms.ts (excerpt; file name inferred from the "./llms" import below)
import { ChatGroq } from "@langchain/groq";
const large_model = new ChatGroq({
    temperature: 0.0
});

// Custom chat message history (separate file)
import { ConversationSummaryBufferMemory } from "langchain/memory";
import db from "./db/sqlite";
import { small_model } from "./llms";
import {
    BaseMessage,
    StoredMessage,
    mapChatMessagesToStoredMessages,
    mapStoredMessagesToChatMessages,
} from "@langchain/core/messages";
import { BaseListChatMessageHistory } from "@langchain/core/chat_history";

// Not required, but usually chat message histories will handle multiple sessions
// for different users, and should take some kind of sessionId as input.
export interface CustomChatMessageHistoryInput {
    sessionId: string;
}

export class CustomChatMessageHistory extends BaseListChatMessageHistory {
    lc_namespace = ["langchain", "stores", "message"];

    sessionId: string;

    constructor(fields: CustomChatMessageHistoryInput) {
        super(fields);
        this.sessionId = fields.sessionId;
    }

    async getMessages(): Promise<BaseMessage[]> {
        const messagesStr = (db.prepare("SELECT data FROM chat WHERE sessionId = ?").get(this.sessionId) as {
            data: string;
        })?.data ?? "[]";
        let messages: StoredMessage[] = JSON.parse(messagesStr);
        return mapStoredMessagesToChatMessages(messages);
    }

    async addMessage(message: BaseMessage): Promise<void> {
        const retrievedMessagesStr = (db.prepare("SELECT data FROM chat WHERE sessionId = ?").get(this.sessionId) as {
            data: string;
        })?.data ?? "[]";
        let retrievedMessages: StoredMessage[] = JSON.parse(retrievedMessagesStr);
        const serializedMessages = mapChatMessagesToStoredMessages([message]);
        retrievedMessages.push(serializedMessages[0]);
        // console.log("retrievedMessage: ", retrievedMessages);
        db.prepare("INSERT OR REPLACE INTO chat (sessionId, data) VALUES (?, ?)").run(this.sessionId, JSON.stringify(retrievedMessages));
    }

    async addMessages(messages: BaseMessage[]): Promise<void> {
        const retrievedMessagesStr = (db.prepare("SELECT data FROM chat WHERE sessionId = ?").get(this.sessionId) as {
            data: string;
        })?.data ?? "[]";
        let retrievedMessages: StoredMessage[] = JSON.parse(retrievedMessagesStr);
        const serializedMessages = mapChatMessagesToStoredMessages(messages);
        retrievedMessages.push(...serializedMessages);
        // console.log("retrievedMessagessss: ", retrievedMessages);
        db.prepare("INSERT OR REPLACE INTO chat (sessionId, data) VALUES (?, ?)").run(this.sessionId, JSON.stringify(retrievedMessages));
    }

    async clear(): Promise<void> {
        db.prepare("DELETE FROM chat WHERE sessionId = ?").run(this.sessionId);
    }
}

export function getMemory(sessionId: string): ConversationSummaryBufferMemory {
    const history = new CustomChatMessageHistory({ sessionId: sessionId });
    return new ConversationSummaryBufferMemory({
        chatHistory: history,
        memoryKey: "chat_history",
        llm: small_model,
    });
}

// Later, when saving a conversation turn (query and resp defined elsewhere):
memory.saveContext({ input: query }, { output: resp.response });

Error Message and Stack Trace (if applicable)

Failed to calculate number of tokens, falling back to approximate count Error: Unknown model
    at getEncodingNameForModel (file:///C:/Github/fc-monolith/hono/node_modules/.pnpm/js-tiktoken@1.0.12/node_modules/js-tiktoken/dist/chunk-PEBACC3C.js:230:13)
    at encodingForModel (file:///C:/Github/fc-monolith/hono/node_modules/.pnpm/@langchain+core@0.1.63_openai@4.46.1/node_modules/@langchain/core/dist/utils/tiktoken.js:19:24)
    at ChatGroq.getNumTokens (file:///C:/Github/fc-monolith/hono/node_modules/.pnpm/@langchain+core@0.1.63_openai@4.46.1/node_modules/@langchain/core/dist/language_models/base.js:177:40)
    at ConversationSummaryBufferMemory.prune (file:///C:/Github/fc-monolith/hono/node_modules/.pnpm/langchain@0.1.37_@xenova+transformers@2.17.1_@zilliz+milvus2-sdk-node@2.4.2_better-sqlite3@9._byv6wm32l75bjvaef4ttqhmd3e/node_modules/langchain/dist/memory/summary_buffer.js:114:47)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async ConversationSummaryBufferMemory.saveContext (file:///C:/Github/fc-monolith/hono/node_modules/.pnpm/langchain@0.1.37_@xenova+transformers@2.17.1_@zilliz+milvus2-sdk-node@2.4.2_better-sqlite3@9._byv6wm32l75bjvaef4ttqhmd3e/node_modules/langchain/dist/memory/summary_buffer.js:96:9)

Description

I'm trying to summarize chat history and save it using ConversationSummaryBufferMemory, and when I switch the LLM from ChatOpenAI or ChatOllama to ChatGroq, I get the error above.

System Info

langchain@0.1.37 | MIT | deps: 18 | versions: 270
Typescript bindings for langchain
https://github.com/langchain-ai/langchainjs/tree/main/langchain/

keywords: llm, ai, gpt3, chain, prompt, prompt engineering, chatgpt, machine learning, ml, openai, embeddings, vectorstores

dist
.tarball: https://registry.npmjs.org/langchain/-/langchain-0.1.37.tgz
.shasum: 15db8ca5c24afc39e61773cab69e216dfb38e1bb
.integrity: sha512-rpaLEJtRrLYhAViEp7/aHfSkxbgSqHJ5n10tXv3o4kHP/wOin85RpTgewwvGjEaKc3797jOg+sLSk6a7e0UlMg==
.unpackedSize: 4.0 MB

dependencies:
@anthropic-ai/sdk: ^0.9.1
@langchain/community: ~0.0.47
@langchain/core: ~0.1.60
@langchain/openai: ~0.0.28
@langchain/textsplitters: ~0.0.0
binary-extensions: ^2.2.0
js-tiktoken: ^1.0.7
js-yaml: ^4.1.0
jsonpointer: ^5.0.1
langchainhub: ~0.0.8
langsmith: ~0.1.7
ml-distance: ^4.0.0
openapi-types: ^12.1.3
p-retry: 4
uuid: ^9.0.0
yaml: ^2.2.1
zod-to-json-schema: ^3.22.3
zod: ^3.22.4

maintainers:

dist-tags:
latest: 0.1.37 next: 0.2.0-rc.1

published 3 days ago by basproul braceasproul@gmail.com

@langchain/groq@0.0.9 | MIT | deps: 5 | versions: 9
Groq integration for LangChain.js
https://github.com/langchain-ai/langchainjs/tree/main/libs/langchain-groq/

dist
.tarball: https://registry.npmjs.org/@langchain/groq/-/groq-0.0.9.tgz
.shasum: a44b19af3b784f324057bfb0217ff0613d148c2f
.integrity: sha512-/QGGgazYdxlN8FCmPfEVDO9Hg55POvQdnoou+b3lsugmwP1TYPRtqLW6JY7Atb36X4vjEJwiMCnntDXdT7vgaw==
.unpackedSize: 48.8 kB

dependencies:
@langchain/core: ~0.1.56
@langchain/openai: ~0.0.28
groq-sdk: ^0.3.2
zod: ^3.22.4
zod-to-json-schema: ^3.22.5

maintainers:

dist-tags:
latest: 0.0.9

published 2 weeks ago by basproul braceasproul@gmail.com

node --version: v20.11.0
Platform: Windows 11 Version 10.0.22631 Build 22631 x64

dosubot bot commented May 14, 2024

The error you're encountering with ChatGroq seems to be due to the getEncodingNameForModel function not recognizing the model. This typically happens when there's an issue with model identification, which could be due to several reasons such as an incorrect model name, a missing API key, or an improper configuration specific to ChatGroq. Here's how you can address this issue:

  1. Ensure Correct Import and Instantiation: Verify that you're importing ChatGroq from @langchain/groq and instantiating it with the necessary parameters, including the apiKey. The API key is essential for the model to authenticate and function properly.

  2. API Key Configuration: Double-check that the GROQ_API_KEY environment variable is correctly set. An incorrect or missing API key could lead to authentication issues, resulting in the model not being recognized.

  3. Model Name and Configuration: The instantiation code you provided doesn't specify a modelName, which suggests that either the default model is being used or specifying a model name isn't required. However, ensure that your specific use case doesn't necessitate a different model name. If unsure, consulting the official ChatGroq documentation or support might be necessary.

  4. Integration with ConversationSummaryBufferMemory: After ensuring the ChatGroq model is correctly instantiated, integrate it with ConversationSummaryBufferMemory as intended. Pay attention to any ChatGroq-specific configurations that might be required.

  5. Debugging: If the problem persists after checking the API key and instantiation, consider debugging by verifying all parameters and configurations again. Also, keep an eye out for any updates from the Groq API that might affect model availability or parameter requirements.

By following these steps, you should be able to resolve the "Unknown model" error. If you continue to face issues, providing more details about your ChatGroq configuration and its usage context could help in further diagnosing the problem.
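The instantiation advice in point 3 can be sketched concretely. Note the model id below is an illustrative assumption (a commonly cited Groq model name), not a guaranteed current offering; check Groq's model list before using it:

```typescript
// Sketch only: explicitly pin a model name instead of relying on the default.
// "llama3-8b-8192" is an assumed example id, not verified against Groq's
// current catalog.
const groqOptions = {
  model: "llama3-8b-8192",
  temperature: 0.0,
};

// With @langchain/groq installed and GROQ_API_KEY set, construction would
// look roughly like this (commented out so the sketch stays self-contained):
// import { ChatGroq } from "@langchain/groq";
// const large_model = new ChatGroq({ ...groqOptions, apiKey: process.env.GROQ_API_KEY });
console.log(groqOptions.model);
```

Even with an explicit model name, the tiktoken warning may persist, since the token counter rather than the Groq API is what fails to recognize the model.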


@dosubot dosubot bot added the auto:bug Related to a bug, vulnerability, unexpected error with an existing feature label May 14, 2024
@jacoblee93
Collaborator

Hey, yes will have a look at this!

@jacoblee93 jacoblee93 self-assigned this May 14, 2024
@jacoblee93
Collaborator

Hey, apologies for the delay!

The reason this occurs is because that type of memory is dependent on token counting:

https://github.com/langchain-ai/langchainjs/blob/main/langchain/src/memory/buffer_token_memory.ts#L115

We don't have a built-in accurate tokenizer at the moment - best we could do is add an option to suppress the warning?
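For reference, the "approximate count" the warning falls back to is a simple character-based heuristic. This sketch assumes the common chars-per-token rule of thumb; it is not a copy of the library's exact code:

```typescript
// Rough fallback token counter: assumes ~4 characters per token for English
// text, the usual rule of thumb when no model-specific tokenizer is available.
function approximateTokenCount(text: string): number {
  return Math.ceil(text.length / 4);
}

console.log(approximateTokenCount("Hello, Groq!")); // 12 chars -> 3
```

So even with the warning suppressed, ConversationSummaryBufferMemory would still prune on an estimate like this rather than on exact token counts.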

@abishekdevendran
Author

If we just suppress the warning, will the buffer memory still continue to work to some extent? By approximating the token count somehow?
