Hello, it seems that the model library doesn't support the Arabic language. I'm not sure whether this is due to the model-qa script or because the inference runtime doesn't support it.
It does actually output Arabic predictions, but it doesn't respond if there is an Arabic question mark "؟" at the end of the prompt in a Q&A test.
We have a streaming detokenization issue for which a fix is currently in progress.
Can you please try running your input with the model-generate.py script also in the examples/python folder? This will output the generated text in one operation rather than streaming one token at a time.
python model-generate.py -m <path to your model> -pr <your prompt>
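For context, streaming detokenization can break non-Latin scripts because a multi-byte UTF-8 character (such as the Arabic question mark "؟") may be split across token boundaries; decoding each token independently then yields replacement characters, while decoding the full output at once does not. A minimal sketch of the failure mode (hypothetical byte split, not the actual onnxruntime-genai tokenizer):

```python
# "؟" (U+061F) is two bytes in UTF-8: 0xD8 0x9F.
text = "؟"
data = text.encode("utf-8")       # b'\xd8\x9f'
chunks = [data[:1], data[1:]]     # the two bytes split across two "tokens"

# Streaming: decoding each chunk on its own garbles the character.
streamed = "".join(c.decode("utf-8", errors="replace") for c in chunks)

# One-shot: decoding all bytes together recovers it correctly.
one_shot = b"".join(chunks).decode("utf-8")

print(streamed)   # two U+FFFD replacement characters
print(one_shot)   # ؟
```

This is why model-generate.py, which emits the text in one operation, is a useful way to check whether the model itself handles Arabic correctly.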