
Switch to upstream StablehloToLinalg code. #19792

Merged
merged 13 commits into iree-org:main from stablehlo-upstream-passes on Jan 27, 2025

Conversation

ScottTodd
Member

@ScottTodd ScottTodd commented Jan 23, 2025

While looking at compiler warnings in build logs, I noticed paths in StableHLO that looked out of place. As it turns out, much of IREE's StableHLO to Linalg conversion code was forked into upstream StableHLO in openxla/stablehlo#1817, though there have been some local changes to the code here since it was forked: https://github.com/iree-org/iree/commits/main/compiler/plugins/input/StableHLO/Conversion.

Switching to the upstream code will decrease the surface area we directly support and limit the number of files we need to build from source, but it will also require more coordination with upstream during maintenance (such as LLVM integrates). We still point to a fork at https://github.com/iree-org/stablehlo, so if things get tricky we can set up a branch with patches as needed.

Some notes:

  • More code, particularly includes and build dependencies, could be pruned.
  • We can probably delete more code by reviving "Remove all usage of the CHLO dialect (from StableHLO)" (#18681) too.
  • I deleted lit tests for the patterns that were moved upstream. The tests still exist at https://github.com/openxla/stablehlo/tree/main/stablehlo/conversions/linalg/tests, but I don't see much value in having our own versions of the lit tests. We do still have e2e tests that compile and run.
  • I did not plumb through the `enablePrimitiveOps` or `enableSparseOps` options, which may be useful for some programs.
  • I'm keeping our custom `stablehlo.concatenate` lowering since the alternate lowering (from IREE, or now from StableHLO) has correctness issues. Also keeping the FFT lowering since it does not exist upstream and handles cases that our LinalgExt lowering does not. (A rough sketch of the combined setup follows these notes.)
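
For readers less familiar with the mechanics, here is a rough sketch of what the combined setup could look like after this PR. It is a sketch only: the upstream populate entry point, its signature, the conversions header path, and the pattern class names `ConcatenateOpConversion` / `FftOpConversion` are assumptions for illustration; only the generic MLIR dialect-conversion API (`RewritePatternSet`, `ConversionTarget`, `applyPartialConversion`) is standard.

```cpp
// Sketch under assumptions: the populate function name/signature, the
// stablehlo conversions header path, and the IREE pattern class names are
// illustrative, not verified against the actual sources.
#include "mlir/Dialect/Arith/IR/Arith.h"
#include "mlir/Dialect/Linalg/IR/Linalg.h"
#include "mlir/Dialect/Tensor/IR/Tensor.h"
#include "mlir/Transforms/DialectConversion.h"
#include "stablehlo/conversions/linalg/transforms/Rewriters.h"  // assumed path
#include "stablehlo/dialect/StablehloOps.h"

using namespace mlir;

LogicalResult lowerStablehloToLinalg(Operation *root,
                                     TypeConverter &typeConverter) {
  MLIRContext *ctx = root->getContext();
  RewritePatternSet patterns(ctx);

  // Upstream StableHLO -> Linalg patterns (assumed entry point; the
  // enablePrimitiveOps / enableSparseOps knobs are not plumbed through here).
  stablehlo::populateStablehloToLinalgConversionPatterns(
      ctx, typeConverter, &patterns, /*enablePrimitiveOps=*/false);

  // IREE-local lowerings kept by this PR: the custom stablehlo.concatenate
  // pattern and the FFT pattern (class names are placeholders). A higher
  // benefit lets them win over any overlapping upstream pattern.
  patterns.add<ConcatenateOpConversion, FftOpConversion>(
      typeConverter, ctx, /*benefit=*/2);

  // Run everything as one partial conversion rather than a standalone pass.
  ConversionTarget target(*ctx);
  target.addLegalDialect<arith::ArithDialect, linalg::LinalgDialect,
                         tensor::TensorDialect>();
  target.addIllegalDialect<stablehlo::StablehloDialect>();
  return applyPartialConversion(root, target, std::move(patterns));
}
```

Compared with adding `createStablehloLegalizeToLinalgPass()` to the pipeline, folding the patterns into one set lets the kept custom lowerings and the upstream ones run in a single conversion, which is what the upstream commit referenced below enables.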

@ScottTodd ScottTodd added the cleanup 🧹 and integrations/stablehlo (StableHLO (JAX/TensorFlow/etc) import and conversion) labels Jan 23, 2025
@benvanik
Collaborator

-12k O_O nice!

abhigunj pushed a commit to openxla/stablehlo that referenced this pull request Jan 23, 2025
This function declaration has existed since the code was forked from
IREE in #1817, but the
implementation was kept private (static function within an anonymous
namespace).

I'm now trying to switch IREE from having its own implementation to
using the upstream implementation from this project in
iree-org/iree#19792, and I would like to access
these patterns directly, instead of through the
`StablehloLegalizeToLinalgPass`. With the patterns exposed, I can run a
conversion that includes my own additional patterns, whereas a pass runs in
isolation.

I'm also deleting the `populateLegalizeChloPatterns`,
`populateLegalizeControlFlowPatterns`, and
`populateLegalizeShapeComputationPatterns` declarations, which were not
migrated from IREE and are also dangling without implementations.
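
To make the commit message above concrete, here is a minimal before/after sketch of that kind of change. The function and parameter names are placeholders; the real declaration and its exact signature live in the upstream stablehlo headers.

```cpp
#include "mlir/IR/PatternMatch.h"
#include "mlir/Transforms/DialectConversion.h"

// Before: the public header declares a populate function, but the .cpp only
// defines a file-local helper, so out-of-tree callers such as IREE cannot
// reach the patterns except through StablehloLegalizeToLinalgPass.
namespace {
void populatePatternsImpl(mlir::MLIRContext *ctx,
                          mlir::TypeConverter &converter,
                          mlir::RewritePatternSet *patterns) {
  // ...pattern registration elided...
}
}  // namespace

// After: the definition matches the declaration already in the header (the
// name shown here is illustrative), so callers can mix these patterns into
// their own RewritePatternSet and run one conversion together with any
// additional patterns.
namespace mlir::stablehlo {
void populateStablehloToLinalgConversionPatterns(
    MLIRContext *ctx, TypeConverter &converter, RewritePatternSet *patterns) {
  // ...same pattern registration, previously the body of the static helper...
}
}  // namespace mlir::stablehlo
```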
@ScottTodd ScottTodd changed the title from "Switch to upstream createStablehloLegalizeToLinalgPass." to "Switch to upstream StablehloToLinalg code." Jan 23, 2025
@ScottTodd ScottTodd marked this pull request as ready for review January 24, 2025 20:13
@ScottTodd
Member Author

Got all in-tree presubmit tests passing. Ready for review.

@ScottTodd ScottTodd requested a review from kuhar January 24, 2025 20:53
Member

@kuhar kuhar left a comment

We've come a full circle on this :D LGTM as long as the tests pass.

@ScottTodd ScottTodd merged commit 7e6c5ec into iree-org:main Jan 27, 2025
41 checks passed
@ScottTodd ScottTodd deleted the stablehlo-upstream-passes branch January 27, 2025 22:06
ita9naiwa pushed a commit to ita9naiwa/iree that referenced this pull request Feb 4, 2025