
Mute warnings when computing metrics for non-NER labels? #74

Open
kelseyball opened this issue Apr 7, 2021 · 2 comments

@kelseyball
Is it possible to mute warnings when computing metrics for non-NER tasks, e.g. POS tagging? (Or, alternatively, to make label validation for NER optional?) I believe the implementations of precision_score, recall_score, etc. should still work fine for other sequence labeling tasks.

If not, it might be a nice feature to add. Since a warning is raised for each non-NER label, they can add up for larger label sets.
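In the meantime, a generic stopgap is to silence the warnings around the metric call with Python's standard warnings machinery, assuming the library emits them via `warnings.warn` (a sketch; `compute_metrics` below is a hypothetical stand-in for the library's scorer, not its actual code):

```python
import warnings

def compute_metrics(y_true, y_pred):
    # Hypothetical stand-in for the library's metric call: emit one
    # UserWarning per tag that does not look like an IOB NER label,
    # mimicking the behavior described in this issue.
    for tag in sorted({t for seq in y_true for t in seq}):
        if "-" not in tag and tag != "O":
            warnings.warn(f"{tag} seems not to be NE tag.", UserWarning)
    # ...real precision/recall computation would go here...
    return 1.0

y_true = [["NOUN", "VERB"]]
y_pred = [["NOUN", "VERB"]]

# Suppress the per-label warnings for this call only, without
# changing the process-wide warning filters.
with warnings.catch_warnings():
    warnings.simplefilter("ignore", category=UserWarning)
    score = compute_metrics(y_true, y_pred)
```

This keeps the suppression scoped to the `with` block, so other warnings elsewhere in the program are unaffected.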

@mirfan899

Exactly, it's misleading when used for non-NER tags. I looked into the code, and it seems mostly hardcoded, which makes it difficult to reuse the existing code for this purpose.

@BramVanroy

+1 for this. It seems odd that this script is advertised as usable for other tasks such as POS tagging, yet doing so produces warnings. Unless it is intended for IOB-style POS tagging? I was not aware that was a thing. In any case, it would be useful if these warnings could be disabled.
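A related workaround, assuming the scorer parses `B-`/`I-` prefixes the way IOB NER labels are parsed: wrap each flat POS tag as a one-token `B-` chunk before scoring, so every label looks like a valid entity tag (illustrative only; the tag set and helper are hypothetical):

```python
def to_iob(seq):
    # Wrap each flat tag as a single-token "B-" chunk so an IOB-based
    # scorer treats it as a well-formed one-token entity instead of
    # warning that it "seems not to be NE tag".
    return [f"B-{tag}" for tag in seq]

pos_tags = [["NOUN", "VERB", "DET"]]
iob_tags = [to_iob(seq) for seq in pos_tags]
```

With single-token chunks like these, entity-level precision/recall should reduce to per-token scores, though that equivalence is worth verifying against the specific scorer before relying on it.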
