
Log spam: indexprovider/wrapper.go:159 -> "seal status Removed" #1738

Open

RobQuistNL opened this issue Oct 9, 2023 · 3 comments

@RobQuistNL (Contributor)

Checklist

  • This is not a question or a support request. If you have any boost related questions, please ask in the discussion forum.
  • This is not a new feature request. If it is, please file a feature request instead.
  • This is not an enhancement request. If it is, please file an improvement suggestion instead.
  • I have searched on the issue tracker and the discussion forum, and there is no existing related issue or discussion.
  • I am running the latest release, or the most recent RC (release candidate) for the upcoming release, or the dev branch (master), or have an issue updating to any of these.
  • I did not make any code changes to boost.

Boost component

  • boost daemon - storage providers
  • boost client
  • boost UI
  • boost data-transfer
  • boost index-provider
  • Other

Boost Version

2.0.0

Describe the Bug

While running the daemon, I keep getting a lot of logs from deals / sectors that have been removed.

This also happened in 1.7.3 and I think it also happened before that.

Logging Information

{"level":"debug","ts":"2023-10-09T12:01:25.117Z","logger":"index-provider-wrapper","caller":"indexprovider/wrapper.go:159","msg":"sector {<SPID> 10262} has 0 deals, seal status Removed"}
{"level":"debug","ts":"2023-10-09T12:01:25.658Z","logger":"index-provider-wrapper","caller":"indexprovider/wrapper.go:159","msg":"sector {<SPID> 24522} has 0 deals, seal status Removed"}
{"level":"debug","ts":"2023-10-09T12:01:26.182Z","logger":"index-provider-wrapper","caller":"indexprovider/wrapper.go:159","msg":"sector {<SPID> 6824} has 0 deals, seal status Removed"}
{"level":"debug","ts":"2023-10-09T12:01:26.723Z","logger":"index-provider-wrapper","caller":"indexprovider/wrapper.go:159","msg":"sector {<SPID> 4378} has 0 deals, seal status Removed"}
{"level":"debug","ts":"2023-10-09T12:01:27.236Z","logger":"index-provider-wrapper","caller":"indexprovider/wrapper.go:159","msg":"sector {<SPID> 10564} has 0 deals, seal status Removed"}
{"level":"debug","ts":"2023-10-09T12:01:27.749Z","logger":"index-provider-wrapper","caller":"indexprovider/wrapper.go:159","msg":"sector {<SPID> 4523} has 0 deals, seal status Removed"}

Repo Steps

  1. Run boost
  2. Check logs
@LexLuthr (Collaborator) commented Oct 9, 2023

Those are debug logs; you can just turn the logging down to INFO. We get a list of deals from the legacy provider and check whether each deal's sector has expired or not. If it has expired, we log a line saying so. This happens when the index-provider starts.
This is not really an issue.
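
For illustration, here is a minimal Go sketch of the check described above. Every type and function name below is made up for the example; the real logic lives in boost's indexprovider and sectorstatemgr packages and differs in detail.

package main

import "log"

// Illustrative stand-ins only; boost defines its own sector and seal-status types.
type SectorID struct {
	Miner  uint64
	Number uint64
}

const sealStatusRemoved = "Removed"

// logSectorDealCounts mimics the described loop: for every sector reported by
// the state manager, look up how many deals still reference it and log the
// count together with the seal status. With debug logging enabled, every
// removed sector with zero deals produces one such line per pass.
func logSectorDealCounts(sealStatus map[SectorID]string, dealsBySector map[SectorID]int) {
	for id, status := range sealStatus {
		deals := dealsBySector[id] // zero when no deal references the sector any more
		// This is the shape of the wrapper.go:159 debug line quoted in the report.
		log.Printf("sector {%d %d} has %d deals, seal status %s", id.Miner, id.Number, deals, status)
	}
}

func main() {
	sealStatus := map[SectorID]string{
		{Miner: 1000, Number: 10262}: sealStatusRemoved,
		{Miner: 1000, Number: 24522}: sealStatusRemoved,
	}
	logSectorDealCounts(sealStatus, map[SectorID]int{})
}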

@RobQuistNL (Contributor, Author) commented Oct 9, 2023

I think they might be a bit verbose (even for debug), but other than that, I wonder why it's being called that often.

Looking at the code, it's walking through all the sectors it's receiving, but I don't see why it would receive "Removed" sectors at all when considering updates. Maybe we can bail out of the loop if the sector is removed?

right here: https://github.com/filecoin-project/boost/blob/main/indexprovider/wrapper.go#L155-L161
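
For illustration, a rough sketch of what that early bail-out could look like. The names are inferred from the log output, not copied from wrapper.go, so treat this as a hypothetical shape rather than a patch.

package main

import "log"

// Hypothetical stand-ins for the types used around wrapper.go:155-161.
type SectorID struct {
	Miner  uint64
	Number uint64
}

const sealStatusRemoved = "Removed"

// announceSectors sketches the suggested change: skip sectors whose seal
// status is Removed before doing any per-sector work or logging.
func announceSectors(sealStatus map[SectorID]string, dealsBySector map[SectorID]int) {
	for id, status := range sealStatus {
		if status == sealStatusRemoved {
			// Nothing can be announced for a removed sector, so move on
			// without emitting the "has 0 deals" debug line.
			continue
		}
		log.Printf("sector {%d %d} has %d deals, seal status %s",
			id.Miner, id.Number, dealsBySector[id], status)
		// ... announce / index the sector here ...
	}
}

func main() {
	sealStatus := map[SectorID]string{
		{Miner: 1000, Number: 6824}: sealStatusRemoved, // skipped
		{Miner: 1000, Number: 4378}: "Proving",         // still processed
	}
	announceSectors(sealStatus, map[SectorID]int{{Miner: 1000, Number: 4378}: 2})
}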

EDIT:
To answer this part:

"This happens when the index-provider starts."

It's happening 24/7 on my node, and I only seem to see the "Removed" sectors with 0 deals. Maybe it shouldn't be selecting these in the first place? Could they be stuck in an infinite loop (e.g. selected because not indexed, can't be indexed because removed, repeat)?
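
If that suspicion is right, one way to break the cycle would be to stop carrying removed sectors into the next update round once they have been seen. A purely hypothetical sketch (none of these names come from boost):

package main

import "fmt"

// Hypothetical sector identifier; boost uses its own types.
type SectorID struct {
	Miner  uint64
	Number uint64
}

// pruneRemoved treats a removed sector as fully handled: there is nothing
// left to index for it, so it is not carried into the next update round.
func pruneRemoved(updates map[SectorID]string) map[SectorID]string {
	remaining := make(map[SectorID]string, len(updates))
	for id, status := range updates {
		if status == "Removed" {
			continue
		}
		remaining[id] = status
	}
	return remaining
}

func main() {
	updates := map[SectorID]string{
		{Miner: 1000, Number: 10564}: "Removed",
		{Miner: 1000, Number: 4523}:  "Proving",
	}
	updates = pruneRemoved(updates)
	fmt.Println("updates left to process:", len(updates)) // 1; the removed sector is gone
}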

@RobQuistNL (Contributor, Author) commented Oct 9, 2023

I can't find any of the older log lines, but it looks like it's always getting stuck at 12527 updates (which might confirm my suspicion):

{"level":"debug","ts":"2023-10-09T10:53:53.798Z","logger":"sectorstatemgr","caller":"sectorstatemgr/sectorstatemgr.go:96","msg":"got state updates from SectorStateMgr","len(u.updates)":12527,"len(u.active)":226899,"u.updatedAt":"2023-10-09T10:53:42.344Z"}
{"level":"debug","ts":"2023-10-09T10:53:53.798Z","logger":"index-provider-wrapper","caller":"indexprovider/wrapper.go:133","msg":"got state updates from SectorStateMgr","u":12527}
{"level":"debug","ts":"2023-10-09T11:00:48.272Z","logger":"index-provider-wrapper","caller":"indexprovider/wrapper.go:133","msg":"got state updates from SectorStateMgr","u":12527}
{"level":"debug","ts":"2023-10-09T11:00:48.275Z","logger":"sectorstatemgr","caller":"sectorstatemgr/sectorstatemgr.go:96","msg":"got state updates from SectorStateMgr","len(u.updates)":12527,"len(u.active)":226899,"u.updatedAt":"2023-10-09T11:00:37.523Z"}
{"level":"debug","ts":"2023-10-09T11:36:44.296Z","logger":"sectorstatemgr","caller":"sectorstatemgr/sectorstatemgr.go:96","msg":"got state updates from SectorStateMgr","len(u.updates)":12527,"len(u.active)":226899,"u.updatedAt":"2023-10-09T11:36:35.069Z"}
{"level":"debug","ts":"2023-10-09T11:36:44.296Z","logger":"index-provider-wrapper","caller":"indexprovider/wrapper.go:133","msg":"got state updates from SectorStateMgr","u":12527}
{"level":"debug","ts":"2023-10-09T11:43:28.177Z","logger":"sectorstatemgr","caller":"sectorstatemgr/sectorstatemgr.go:96","msg":"got state updates from SectorStateMgr","len(u.updates)":12527,"len(u.active)":226899,"u.updatedAt":"2023-10-09T11:43:24.319Z"}
{"level":"debug","ts":"2023-10-09T11:43:28.177Z","logger":"index-provider-wrapper","caller":"indexprovider/wrapper.go:133","msg":"got state updates from SectorStateMgr","u":12527}
{"level":"debug","ts":"2023-10-09T14:18:26.338Z","logger":"sectorstatemgr","caller":"sectorstatemgr/sectorstatemgr.go:96","msg":"got state updates from SectorStateMgr","len(u.updates)":12538,"len(u.active)":226899,"u.updatedAt":"2023-10-09T12:45:11.039Z"}
{"level":"debug","ts":"2023-10-09T14:18:26.338Z","logger":"index-provider-wrapper","caller":"indexprovider/wrapper.go:133","msg":"got state updates from SectorStateMgr","u":12538}
{"level":"debug","ts":"2023-10-09T15:04:16.925Z","logger":"sectorstatemgr","caller":"sectorstatemgr/sectorstatemgr.go:96","msg":"got state updates from SectorStateMgr","len(u.updates)":12577,"len(u.active)":226899,"u.updatedAt":"2023-10-09T14:19:53.457Z"}
{"level":"debug","ts":"2023-10-09T15:37:50.136Z","logger":"sectorstatemgr","caller":"sectorstatemgr/sectorstatemgr.go:96","msg":"got state updates from SectorStateMgr","len(u.updates)":12527,"len(u.active)":226899,"u.updatedAt":"2023-10-09T15:05:42.244Z"}
