
Overriding Layer.forward unexpectedly changes the signature of Layer.__call__ under torch backend #19730

Open
LarsKue opened this issue May 19, 2024 · 5 comments

@LarsKue
Contributor

LarsKue commented May 19, 2024

In torch, one typically implements a layer's __call__ behavior by overriding the forward method. In Keras, we instead use the call method.

I would not expect overriding forward to have any effect on how a keras.Layer is called, even under the torch backend, since that is precisely the purpose of call. However, it seems that when the forward method is overridden, it takes priority over Layer.call.

Minimal Example:

import os
os.environ["KERAS_BACKEND"] = "torch"

import keras


class MyLayer(keras.Layer):
    def call(self, xz, inverse: bool = False):
        if inverse:
            return self.inverse(xz)
        return self.forward(xz)

    def forward(self, x):
        pass

    def inverse(self, z):
        pass


layer = MyLayer()

x = keras.ops.zeros((128, 2))

# TypeError: MyLayer.forward() got an unexpected keyword argument 'inverse'
layer(x, inverse=True)

My keras version: 3.3.3

@SuryanarayanaY
Collaborator

SuryanarayanaY commented May 20, 2024

Hi @LarsKue,

Thanks for reporting the issue. I have replicated it and observed that with PyTorch as the backend, the Layer.forward method takes precedence over the base keras.Layer.call method. Attaching a gist for reference.

We need to check with the Keras dev team whether this is intended or an oversight.

Thanks!

@SuryanarayanaY added the To investigate (Looks like a bug. It needs someone to investigate.) and keras-team-review-pending (Pending review by a Keras team member.) labels on May 20, 2024
@mattdangerw
Member

@haifeng-jin any thoughts on this? I'm not familiar enough with torch to know what our intended behavior is.

@haifeng-jin
Contributor

To make a Keras layer work with torch Modules and torch training loops, which call the forward() function for the forward pass, we let the forward() function call the call() function in the base layer class.
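
A simplified, standalone sketch of that dispatch (not the actual Keras source; the class names below are made up for illustration): torch.nn.Module.__call__ dispatches to forward(), and the base layer's forward() delegates to call(), so overriding forward() replaces that bridge and receives every call argument directly.

import torch


class SketchLayer(torch.nn.Module):
    """Stand-in for the Keras base layer under the torch backend."""

    def forward(self, *args, **kwargs):
        # Bridge installed by the base class:
        # torch's __call__ -> forward() -> Keras-style call()
        return self.call(*args, **kwargs)

    def call(self, *args, **kwargs):
        raise NotImplementedError


class Working(SketchLayer):
    def call(self, xz, inverse=False):
        return -xz if inverse else xz


class Broken(SketchLayer):
    def call(self, xz, inverse=False):
        return -xz if inverse else xz

    # Overriding forward() shadows the bridge, so torch's __call__ now
    # passes all arguments here instead of routing through call().
    def forward(self, x):
        return x


x = torch.zeros(4)
Working()(x, inverse=True)  # dispatch goes through call(), works
Broken()(x, inverse=True)   # TypeError: got an unexpected keyword argument 'inverse'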

@mattdangerw removed the keras-team-review-pending (Pending review by a Keras team member.) label on May 23, 2024
@haifeng-jin
Contributor

@LarsKue ,

Please let us know your specific use case and whether it can be solved by adding a super().forward() call to your forward() implementation.

Thanks

@LarsKue
Contributor Author

LarsKue commented May 28, 2024

@haifeng-jin The use case is invertible networks. My current workaround is to not expose forward and inverse directly, but to use protected methods instead:

import os
os.environ["KERAS_BACKEND"] = "torch"

import keras


class MyLayer(keras.Layer):
    def call(self, xz, inverse: bool = False):
        if inverse:
            return self._inverse(xz)
        return self._forward(xz)

    def _forward(self, x):
        pass

    def _inverse(self, z):
        pass


layer = MyLayer()

x = keras.ops.zeros((128, 2))

# works now
layer(x, inverse=True)

Adding a super().forward() might work, but I do not think this is a sound solution.
