[Relax][PyTorch] Add PReLU Op Support for Exported Program and FX graph #17816
Conversation
LGTM

Please address the conflict, as the previous PR has just been merged :)
Force-pushed from 5815b66 to 963e338
@tvm-bot rerun

Failed to re-run CI in https://github.com/apache/tvm/actions/runs/14373842802 with response
@Deivanayaki-S Please resolve the conflict so we can prioritize merging it. It seems a recently merged PR introduced the conflict.
Force-pushed from fd24387 to cd8f9cd
@yongwww Thanks for pointing it out; the conflict has been resolved.
…ph (apache#17816)

* prelu op support and test script added
* end-of-file issue fixed
* trailing whitespace issue fixed
* fixing lint issues
* fix assertion error in test_op_nn.py file
* add test script in test_frontend_nn_op.py
* include wrapper function for prelu in op.py
* fixing unity check issue by modifying test func
* conflicts resolved
* add doc for prelu op axis arg
* fixed failing checks issue

Co-authored-by: deivanayakisankaralingam <deiva@Deivanayaki>
This PR adds support for the PReLU operator in both the Exported Program and FX graph Relax frontend pipelines.
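For readers unfamiliar with the op, PReLU behaves like ReLU for positive inputs but scales negative inputs by a learnable per-channel slope `alpha`, broadcast along a channel axis (the commit history above mentions documenting this `axis` argument). The sketch below is a minimal NumPy reference of those semantics, not TVM's implementation; the function name and signature are hypothetical, chosen for illustration.

```python
import numpy as np

def prelu_reference(x: np.ndarray, alpha: np.ndarray, axis: int = 1) -> np.ndarray:
    """Reference PReLU: y = x where x > 0, alpha * x elsewhere.

    `alpha` holds one learnable slope per channel along `axis`.
    (Hypothetical helper for illustration; not the TVM API.)
    """
    # Reshape alpha so it broadcasts against x along the channel axis.
    shape = [1] * x.ndim
    shape[axis] = alpha.size
    a = alpha.reshape(shape)
    return np.where(x > 0, x, a * x)

x = np.array([[-1.0, 2.0], [3.0, -4.0]])   # shape (2, 2)
alpha = np.array([0.25, 0.5])              # one slope per channel (axis=1)
print(prelu_reference(x, alpha, axis=1))   # negatives scaled per channel
```

With `axis=1`, the slope `0.25` applies to the first column and `0.5` to the second, so `-1.0` becomes `-0.25` and `-4.0` becomes `-2.0`, while positive values pass through unchanged.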