Add gallery notebook on vector jacobian products #1544
Conversation
Pull Request Overview
This PR adds vectorization support for the cumulative sum operation in PyTensor and includes a new tutorial notebook demonstrating optimized vector-Jacobian products.

- Implement a `_vectorize_node` rule for `CumOp` to enable batched inputs.
- Add a comprehensive `vector_jacobian_product.ipynb` showing VJP optimizations across several tensor operations.
Reviewed Changes
Copilot reviewed 2 out of 3 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| `pytensor/tensor/extra_ops.py` | Register and implement `vectorize_cum_op` for `CumOp` |
| `doc/gallery/gradient/vector_jacobian_product.ipynb` | New tutorial notebook illustrating vector-Jacobian products |
Comments suppressed due to low confidence (4)
pytensor/tensor/extra_ops.py:477
- [nitpick] There aren't any unit tests covering the new `vectorize_cum_op` rule. Consider adding tests for both the `axis=None` and explicit `axis` cases to ensure correct batched behavior.

    @_vectorize_node.register(CumOp)
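For reference, the batched behavior such tests would pin down can be sketched in plain NumPy (hypothetical shapes, not code from PyTensor's test suite):

```python
import numpy as np

# Hypothetical input: one leading batch dimension over core shape (3, 4).
batch_x = np.arange(24).reshape(2, 3, 4)

# Explicit axis: the core axis shifts right by the number of batch dims.
batched = np.cumsum(batch_x, axis=0 + 1)  # core axis 0 -> batched axis 1
expected = np.stack([np.cumsum(x, axis=0) for x in batch_x])
assert np.array_equal(batched, expected)

# axis=None: each core input is raveled before the cumsum, so the batched
# op should ravel only the non-batch dimensions, not the whole tensor.
batched_none = np.cumsum(batch_x.reshape(2, -1), axis=1)
expected_none = np.stack([np.cumsum(x) for x in batch_x])
assert np.array_equal(batched_none, expected_none)
```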
pytensor/tensor/extra_ops.py:486
- The function `normalize_axis_index` is used here but not imported, which will cause a `NameError`. Please add an import, for example:

    from pytensor.tensor.utils import normalize_axis_index

    axis = normalize_axis_index(op.axis, original_x.ndim)
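For context, the helper's contract can be sketched in plain Python (this mirrors the semantics of NumPy's `normalize_axis_index`; it is an illustration, not the actual import):

```python
def normalize_axis_index(axis: int, ndim: int) -> int:
    # Negative axes count from the end; out-of-range axes are rejected.
    if not -ndim <= axis < ndim:
        raise ValueError(f"axis {axis} is out of bounds for an array of dimension {ndim}")
    return axis % ndim

assert normalize_axis_index(0, 3) == 0
assert normalize_axis_index(-1, 3) == 2
```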
pytensor/tensor/extra_ops.py:490
- Inside the list comprehension you iterate over `batch_x` but still call `batch_x.flatten(...)`. You likely intended to call `x.flatten(...)` for each element, or otherwise flatten the entire batch tensor directly. Consider:

    batch_x_raveled = [x.flatten(ndim=batch_ndim + 1) for x in batch_x]

instead of:

    batch_x_raveled = [batch_x.flatten(ndim=batch_ndim + 1) for x in batch_x]
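For readers unfamiliar with it, PyTensor's `flatten(ndim=k)` keeps the first `k - 1` dimensions and collapses the rest, so `flatten(ndim=batch_ndim + 1)` preserves the batch dimensions while raveling the core shape. A NumPy analogue (the helper name is illustrative):

```python
import numpy as np

def flatten_keep(x, ndim):
    # Analogue of PyTensor's x.flatten(ndim=k): keep the first k - 1
    # axes and collapse all remaining axes into one trailing axis.
    return x.reshape(*x.shape[:ndim - 1], -1)

batch_ndim = 1
x = np.arange(24).reshape(2, 3, 4)
assert flatten_keep(x, ndim=batch_ndim + 1).shape == (2, 12)
```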
doc/gallery/gradient/vector_jacobian_product.ipynb:62
- Spelling mistake: 'Elemtwise' should be 'Elementwise'.
"## Elemtwise operations"
Codecov Report
❌ Patch coverage is
Additional details and impacted files:

    @@            Coverage Diff             @@
    ##             main    #1544      +/-   ##
    ==========================================
    + Coverage   81.49%   81.53%   +0.04%
    ==========================================
      Files         232      230       -2
      Lines       53122    53066      -56
      Branches     9444     9423      -21
    ==========================================
    - Hits        43292    43269      -23
    + Misses       7382     7364      -18
    + Partials     2448     2433      -15
Force-pushed eda6642 to 0ce5359
Force-pushed 0ce5359 to 7da29a8
Force-pushed 7da29a8 to 680e47d
Link to the docs preview: https://pytensor--1544.org.readthedocs.build/en/1544/gallery/autodiff/vector_jacobian_product.html
And yes, the auto snapshot seems to be erasing the custom pics.
jessegrabowski left a comment:
looks great!
Did you want to adjust the generate_gallery script to use custom images? Or open an issue for later?
Force-pushed 680e47d to d049dad
Opened issue #1554
When we have more powerful PyTensor rewrites, I would like to make a companion piece showing PyTensor deriving the optimized `Lop` from the naive formulation (maybe with the exception of the transpose).
Incidentally, this PR implements vectorize for CumSum, so it shows up nicely in the jacobian :)
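As a worked example of the kind of optimization the notebook covers: for `y = cumsum(x)` the Jacobian is a lower-triangular matrix of ones, so the VJP `v @ J` collapses to a reversed cumulative sum of `v`, never materializing `J` (a NumPy sketch, not code from the notebook):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
v = rng.normal(size=n)

# Jacobian of cumsum: J[i, j] = 1 if j <= i else 0 (lower-triangular ones).
J = np.tril(np.ones((n, n)))

naive = v @ J                          # O(n^2): materializes the Jacobian
optimized = np.cumsum(v[::-1])[::-1]   # O(n): reversed cumulative sum
assert np.allclose(naive, optimized)
```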