feat: add a macro to directly visualize the generated mlir #1246

Open · wants to merge 1 commit into main from ap/model-explorer

Conversation

avik-pal (Collaborator) commented May 4, 2025

julia> @mlir_visualize fn(x_ra)
Loading extensions...
2025-05-04 18:33:43.039914: I external/org_tensorflow/tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
Loaded 8 extensions:
 - TFLite adapter (Flatbuffer)
 - TFLite adapter (MLIR)
 - TF adapter (MLIR)
 - TF adapter (direct)
 - GraphDef adapter
 - Pytorch adapter (exported program)
 - MLIR adapter
 - JSON adapter

Starting Model Explorer server at:
http://localhost:8080/?data=%7B%22models%22%3A%20%5B%7B%22url%22%3A%20%22/tmp/jl_jSFMXkryRI.mlir%22%7D%5D%7D

Press Ctrl+C to stop.
Opening in existing browser session.

Same as `@code_hlo`, but opens a browser session with the visualization.
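
Under the hood this amounts to roughly the following (a minimal sketch only; the exact file handling and `model-explorer` invocation in the PR may differ):

# Sketch: assumes the `model-explorer` executable is installed and on PATH,
# as the macro's docstring below requires.
mod = @code_hlo fn(x_ra)          # lower the traced call to an MLIR module
path = tempname() * ".mlir"       # e.g. /tmp/jl_jSFMXkryRI.mlir
write(path, string(mod))          # dump the textual MLIR
run(`model-explorer $path`)       # start the local server and open the browser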

@avik-pal avik-pal requested a review from wsmoses May 4, 2025 22:36
@avik-pal avik-pal force-pushed the ap/model-explorer branch from d47cca4 to 4279851 on May 5, 2025 04:35
@mlir_visualize [optimize = ...] [no_nan = <true/false>] f(args...)

Runs `@code_hlo` and visualizes the MLIR module using `model-explorer`. This expects the
`model-explorer` executable to be in your `PATH`. Installation instructions can be found
Member

This feels worthwhile just making into a (separate) JLL of our own, IMO, rather than relying on PATH.

Collaborator Author

I agree, though it seems to be a JS framework, so I'm not sure how to ship the binary via Yggdrasil. @giordano might know?

Member

Something like https://github.com/JuliaPackaging/Yggdrasil/blob/179e92aaf902a50c7876b0d088b3de88091be228/D/D3/build_tarballs.jl? What's the executable? A script which requires an interpreter?

Collaborator Author

OK, I was mistaken: only the UI components are present in the npm package. https://github.com/google-ai-edge/model-explorer/blob/main/src/builtin-adapter/python/pip_package/build_pip_package.sh has the Bazel scripts for building the Python wheel.

default_options = Dict{Symbol,Any}(
:optimize => true,
:no_nan => false,
:client => nothing,
Member

Tangentially to this PR, we should consider having these option defaults in one place (rather than copied in many places).
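
One way to do that (an illustrative sketch, not part of this PR; the constant and helper names are hypothetical) is a single shared table of defaults that each macro copies from:

# Hypothetical shared defaults; the entries mirror the snippet above.
const COMPILE_MACRO_DEFAULTS = Dict{Symbol,Any}(
    :optimize => true,
    :no_nan => false,
    :client => nothing,
)

# Each macro starts from a copy instead of re-declaring its own Dict.
default_options() = copy(COMPILE_MACRO_DEFAULTS)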
