Curly brace ({}) in prompt #300
Hi @wernerulbts. What you're trying to do can be described as "constrained generation" or "function calling". We don't currently support these features with our official deployments of llama 3, but this model by @hamelsmu demonstrates how you can get there with in-context prompting. As for your specific error, could you please share a URL of a failed prediction and the exact code that you're running?
Hi @mattt, thank you very much, I will take a look at it. Regarding the URLs, here are two examples of how I tried to add my prompt into the system prompt: https://replicate.com/p/ea92cnckndrgj0cf73892nqnkr

When I look at the Llama 3 example:

```
You are a helpful assistant<|eot_id|><|start_header_id|>user<|end_header_id|>

{prompt}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
```

there is only prompt_template and prompt. My goal is basically to pass the system prompt ("You are a helpful assistant") in a correct way when my system prompt contains "{". Basically, I don't know how to escape characters like "{" in the system_prompt / prompt_template.
@wernerulbts Thanks for sharing that context. Something you might try to solve the immediate problem is […]. Another option, if you're having trouble with that, is […]. Finally, take a look at this blog post I wrote a while back about llama 2 with grammar support, which looks to be similar to what you're trying to do.
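One brace-free alternative that comes up in this thread is asking the model for XML instead of JSON, so the instructions never contain `{` at all. A minimal sketch of that idea (the field names are borrowed from the example in this issue, and the `reply` string stands in for a hypothetical model response):

```python
import xml.etree.ElementTree as ET

# Asking for XML instead of JSON keeps curly braces out of the
# prompt_template entirely, so they can't collide with {prompt}.
fields = ["Name", "Company", "Street", "City", "State", "PostalCode"]
instructions = (
    "Return your answer as XML in this format:\n<result>\n"
    + "".join(f"  <{f}></{f}>\n" for f in fields)
    + "</result>"
)

# Parsing a reply shaped like the requested format (hypothetical reply):
reply = "<result><Name>John Doe</Name><Company>Acme Ltd.</Company></result>"
root = ET.fromstring(reply)
name = root.findtext("Name")
company = root.findtext("Company")
```

The trade-off is that you then parse XML out of the model's reply instead of JSON, but the instructions themselves are safe to embed anywhere in the template.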
@mattt Thank you very much, I also thought about the XML format. Your link to the blog post looks great, I will definitely test it. A quick workaround which I found is to remove the special characters from the system prompt and add them to the prompt.
@wernerulbts So glad to see you got that working! I think what you have there is better than what's described by that blog post, so I'd recommend rolling with that. Anything else to be done in this issue? Or do you think we're good to mark this as resolved?
@mattt Thank you very much, we can mark it as resolved.
How can I execute a prompt where I tell the model to give me back a specific JSON format?
I know that the `{` `}` are used for substitution in the template:
`prompt_template` contains `{prompt}`
But how can I submit a prompt where I give instructions which contain the `{`?
"John Doe's company, Acme Ltd., is located at 1234 Main Street, Springfield, IL 62704."

Extract all infos in the following format:

```json
{
  "Name": "",
  "Company": "",
  "Street": "",
  "City": "",
  "State": "",
  "PostalCode": ""
}
```

Only return your answer in json
```
<|begin_of_text|><|start_header_id|>system<|end_header_id|>

Extract all infos in the following format:\n{\n"Name": "",\n"Company": "",\n"Street": "",\n"City": "",\n"State": "",\n"PostalCode": ""\n}\n\nOnly return your answer in json<|eot_id|><|start_header_id|>user<|end_header_id|>

{prompt}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
```
I get an error with:

```
Prediction failed.
'\n"Name"'
```
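The shape of that error is worth noting: `'\n"Name"'` is exactly what a Python `KeyError` from `str.format` looks like when a template contains a literal `{`. Assuming the server fills `prompt_template` with Python-style formatting (an assumption based on the error message, not confirmed documentation), every `{...}` in the template is treated as a placeholder, and literal braces can be escaped by doubling them:

```python
# Assumption: the template is filled with Python's str.format, which treats
# every "{...}" as a replacement field. A literal "{" then raises KeyError
# with the text up to the first ":" as the missing key.
template = 'Extract all infos in the following format:\n{\n"Name": ""\n}\n\n{prompt}'

failing_key = None
try:
    template.format(prompt="John Doe's company ...")
except KeyError as e:
    failing_key = e.args[0]  # the same key as in the failed prediction

# Doubling braces escapes them, so only {prompt} is substituted:
escaped = (template
           .replace("{", "{{")
           .replace("}", "}}")
           .replace("{{prompt}}", "{prompt}"))
filled = escaped.format(prompt="John Doe's company ...")
```

If this assumption holds, writing `{{` and `}}` around the JSON schema in `prompt_template` (while leaving `{prompt}` single-braced) would avoid the failure without restructuring the prompt.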