[ML] Add option to disable inference process cache by default #108784
Conversation
Pinging @elastic/ml-core (Team:ML)
Failure due to an ES|QL integration test.
Looks good. How about we add a test just to confirm? Maybe something like this:
@elasticmachine merge upstream
private static CreateTrainedModelAssignmentAction.Response createResponse() {
    return new CreateTrainedModelAssignmentAction.Response(mock(TrainedModelAssignment.class));
}
Instead of mocking it and needing to remove final, can you use the test helper?
Line 20 in ad63465
protected Response createTestInstance() {
Or is it not accessible?
Actually, it's protected... so instead you could create a public method in that class that exposes it, or you can just do what it's doing:
return new Response(TrainedModelAssignmentTests.randomInstance());
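To make the pattern under discussion concrete, here is a hedged, self-contained sketch using stand-in classes (the `TrainedModelAssignment`, `Response`, and `TrainedModelAssignmentTests` types below are minimal illustrations, not the actual Elasticsearch implementations). The point is that a public static random-instance factory on the test class lets other tests build real `Response` objects without mocking a final class:

```java
import java.util.Random;

// Stand-in for a final production class that Mockito cannot mock without
// removing the final modifier.
final class TrainedModelAssignment {
    final int shardCount;
    TrainedModelAssignment(int shardCount) { this.shardCount = shardCount; }
}

// Stand-in for the action's Response wrapper.
final class Response {
    final TrainedModelAssignment assignment;
    Response(TrainedModelAssignment assignment) { this.assignment = assignment; }
}

// Test helper exposing a public factory, mirroring the suggested
// TrainedModelAssignmentTests.randomInstance() pattern.
class TrainedModelAssignmentTests {
    private static final Random RANDOM = new Random();

    public static TrainedModelAssignment randomInstance() {
        // Random but valid instance: 1 to 8 shards.
        return new TrainedModelAssignment(1 + RANDOM.nextInt(8));
    }
}

public class CreateResponseExample {
    // The reviewed pattern: build a real instance via the test helper
    // instead of mock(TrainedModelAssignment.class).
    static Response createResponse() {
        return new Response(TrainedModelAssignmentTests.randomInstance());
    }

    public static void main(String[] args) {
        Response r = createResponse();
        assert r.assignment != null;
        assert r.assignment.shardCount >= 1 && r.assignment.shardCount <= 8;
        System.out.println("ok");
    }
}
```

Building real randomized instances this way also exercises the actual constructors, which a mock would bypass.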
I hadn't even considered that the object's tests would have their own instantiation function. Thanks!
@elasticmachine merge upstream
Due to recent issues with the inference process cache in serverless, we are adding the option to disable the inference process cache by default. On its own, this PR should have no functional effect. Once the corresponding elasticsearch-serverless PR is merged, the cache will be disabled on serverless.
This change still allows the user to explicitly configure the inference process cache setting.
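The mechanism can be sketched as follows. This is a minimal, self-contained illustration of gating a cache behind a boolean setting that defaults to disabled while remaining user-configurable; the setting key and all class names here are hypothetical stand-ins, not the actual Elasticsearch setting or implementation:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Function;

class InferenceCacheConfig {
    // Hypothetical setting key for illustration only; the real PR wires the
    // default through Elasticsearch's Setting infrastructure.
    static final String CACHE_ENABLED_SETTING = "inference_process.cache_enabled";

    static boolean cacheEnabled(Map<String, String> settings) {
        // Disabled by default; the user can still enable it explicitly.
        return Boolean.parseBoolean(settings.getOrDefault(CACHE_ENABLED_SETTING, "false"));
    }
}

class InferenceService {
    private final Map<String, String> cache;

    InferenceService(Map<String, String> settings) {
        // When disabled, skip the cache entirely; otherwise use a small
        // access-ordered map as a simple cache.
        this.cache = InferenceCacheConfig.cacheEnabled(settings)
            ? new LinkedHashMap<>(16, 0.75f, true)
            : null;
    }

    String infer(String input, Function<String, String> model) {
        if (cache == null) {
            return model.apply(input);          // cache disabled: always recompute
        }
        return cache.computeIfAbsent(input, model);
    }
}

public class InferenceCacheSketch {
    public static void main(String[] args) {
        // Default (no setting): cache disabled, inference still works.
        InferenceService off = new InferenceService(Map.of());
        assert off.infer("a", s -> s + "!").equals("a!");

        // Explicitly enabled by the user: results are cached.
        InferenceService on = new InferenceService(
            Map.of(InferenceCacheConfig.CACHE_ENABLED_SETTING, "true"));
        assert on.infer("a", s -> s + "!").equals("a!");
        System.out.println("ok");
    }
}
```

Keeping the flag as an ordinary setting means the serverless distribution can flip the default without any behavior change for users who set it explicitly.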