
torch.unique not support for torch.jit.script #37986

Closed
lgray opened this issue May 7, 2020 · 2 comments
Assignees
Labels
oncall: jit Add this issue/PR to JIT oncall triage queue · small We think this is a small issue to fix. Consider knocking off high priority small issues · triage review · triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments

@lgray

lgray commented May 7, 2020

🚀 Feature

Add torch.unique support to torch.jit.script so that it is possible to jit-script graph pooling using pytorch_geometric.

This was mentioned back in #12206.

@suo

Motivation

I am writing models using pytorch_geometric that I would like to jit-script. I have started with automatic GNN model synthesis that produces jittable convolution operations, but we also need graph pooling for the full range of functionality. That latter bit requires torch.unique in Python to be jit-scriptable.

Pitch

I would like:

@torch.jit.script
def f(x):
    return torch.unique(x)

to not result in:

Traceback (most recent call last):
  File "quicktest.py", line 3, in <module>
    @torch.jit.script
  File "/data/gaoxiang/anaconda3/lib/python3.6/site-packages/torch/jit/__init__.py", line 639, in script
    graph = _jit_script_compile(ast, rcb)
RuntimeError: 
unknown builtin op:
@torch.jit.script
def f(x):
    return torch.unique(x)
           ~~~~~~~~~~~~ <--- HERE

but rather a working jit function!
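On PyTorch builds that include the eventual fix (it landed in time for the 1.6 release), the function above compiles. A minimal sketch of the desired behavior, including a flag variant whose return type differs (the detail that made this tricky to script):

```python
import torch

@torch.jit.script
def f(x):
    # on builds where torch.unique is scriptable, this compiles cleanly
    # and returns the sorted unique values as a single Tensor
    return torch.unique(x)

@torch.jit.script
def g(x):
    # return_counts=True changes the return type to a Tuple[Tensor, Tensor],
    # which is why scripting torch.unique needed special dispatch handling
    return torch.unique(x, return_counts=True)
```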

Alternatives

Writing everything in C++ for torch_geometric, which would be a significant time burden for developers, whereas this could be fixed easily and centrally in PyTorch.

Additional context

This is supporting an effort here:
pyg-team/pytorch_geometric#1191

to make pytorch_geometric easily jittable without major rewrites or changes of user code.

cc @suo

@facebook-github-bot facebook-github-bot added the oncall: jit Add this issue/PR to JIT oncall triage queue label May 7, 2020
@zdevito zdevito added triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module triage review labels May 7, 2020
@zdevito
Contributor

zdevito commented May 7, 2020

torch.unique is similar to torch.norm in that it includes a Python function that figures out dispatch to the actual builtin op. Trying to script that function directly doesn't work at the moment. @eellison I think you looked at norm, do you know what it would take to make unique work too?

@eellison
Contributor

eellison commented May 7, 2020

It would be similar to #33737.

torch.unique is a little trickier because you'd have to add a boolean_dispatch for both the return_inverse arg and the return_counts arg.
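For context, here is a minimal pure-Python sketch of the boolean_dispatch idea. This mimics the pattern, not PyTorch's actual internals: TorchScript needs a fixed return type per function, so a boolean flag that changes the return type is resolved by dispatching to one of several statically-typed variants. With two such flags (return_inverse and return_counts), the dispatch has to cover each combination.

```python
from typing import List, Tuple

def _unique_plain(xs: List[int]) -> List[int]:
    # variant for return_counts=False: single list of sorted unique values
    seen: List[int] = []
    for x in xs:
        if x not in seen:
            seen.append(x)
    return sorted(seen)

def _unique_with_counts(xs: List[int]) -> Tuple[List[int], List[int]]:
    # variant for return_counts=True: second output is the count per value
    uniq = _unique_plain(xs)
    counts = [sum(1 for x in xs if x == u) for u in uniq]
    return uniq, counts

def unique(xs: List[int], return_counts: bool = False):
    # the Python-level dispatcher; a scripted caller would be routed to one
    # of the typed variants at compile time rather than branching at runtime
    if return_counts:
        return _unique_with_counts(xs)
    return _unique_plain(xs)
```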

@eellison eellison added the small We think this is a small issue to fix. Consider knocking off high priority small issues label May 7, 2020
facebook-github-bot pushed a commit that referenced this issue May 13, 2020
Summary:
Fix for #37986

Follows the stack in #33783 to make functions in `torch/functional.py` resolve to their python implementations. Because the return type of `torch.unique` depends on `return_inverse` and `return_counts` I had to refactor the implementation to use our boolean_dispatch mechanism.
Pull Request resolved: #38156

Differential Revision: D21504449

Pulled By: eellison

fbshipit-source-id: 7efb1dff3b5c00655da10168403ac4817286ff59
gchanan pushed a commit to gchanan/pytorch that referenced this issue Jun 1, 2020
gchanan pushed a commit that referenced this issue Jun 2, 2020