How to include confidence information in the annotation results. #399
Comments
Hey there! Right now, there isn't a built-in way to add output fields to label files, so you'll need to tweak the code a bit to make that happen. I suggest taking a closer look at the source code to figure out where you need to make those adjustments. Just make sure to give it a good test run after you've made your changes. If you run into any snags, feel free to reach out. Take care!
Hello, modifying the source code involves changes in several places.
More specifically, I previously added a `difficult` parameter; you can search the commit history for that change to see which places are touched when a new field is added.
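As a rough illustration of the kind of change described above, here is a minimal sketch of writing each predicted box to a label JSON file with an extra per-box `confidence` field. The function name `write_label_json` and the exact JSON layout are assumptions for illustration only, not this project's actual API; the real edit would go wherever the project serializes its annotation results.

```python
import json

def write_label_json(path, boxes, scores, labels):
    """Write bounding boxes, labels, and predicted confidence to a JSON file.

    Hypothetical example: field names ("shapes", "points", "confidence")
    are illustrative, not the project's real schema.
    """
    shapes = []
    for (x1, y1, x2, y2), score, label in zip(boxes, scores, labels):
        shapes.append({
            "label": label,
            "points": [[x1, y1], [x2, y2]],
            # The newly added field: round for readability in the file.
            "confidence": round(float(score), 4),
        })
    with open(path, "w", encoding="utf-8") as f:
        json.dump({"shapes": shapes}, f, indent=2, ensure_ascii=False)

# Example usage with made-up values:
write_label_json(
    "demo.json",
    boxes=[(10, 20, 110, 220)],
    scores=[0.9731],
    labels=["person"],
)
```

The same idea applies regardless of schema: carry the model's score alongside each box through the code path that builds the output dictionary, then include it at serialization time.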
Hi @dejun219: Just wanted to let you know that the feature request for displaying the confidence score has been implemented and pushed to the remote repository. You can now update your local source code to try out this new addition. Happy coding! Best regards,
Thank you, I'll update my copy.
Dear author,
In the annotation results of this model, each bounding box does not include the corresponding predicted confidence information. Could you guide me on how to modify the relevant script so that the output results in the JSON file include confidence information?
Best regards,