Issues: pytorch/xla
- #7253 [RFC] PyTorch/XLA eager mode as default [usability] (opened Jun 12, 2024 by JackCaoG)
- #7247 [torchbench] drq training fails to run on non-dynamo [xla:gpu] (opened Jun 11, 2024 by ysiraichi)
- #7240 The combination of in-place ops and a custom op results in incorrect results (opened Jun 11, 2024 by yitongh)
- #7215 Incomplete Checkpoints for Non-Sharded Parameters During SPMD Training in PyTorch XLA (opened Jun 7, 2024 by huzama)
- #7198 In-place operations on a DLPack-aliased XLA tensor do not propagate [xla:gpu] (opened Jun 5, 2024 by ysiraichi)
- #7191 How do I know which PyTorch parameter corresponds to which parameter in the HLO IR? (opened Jun 4, 2024 by yao-jz)
- #7190 Select a model to train and run on TPUs [advanced, docathon-h1-2024] (opened Jun 4, 2024 by duncantech)
- #7185 Try running inference on an ARM CPU [advanced, docathon-h1-2024] (opened Jun 4, 2024 by duncantech)
- #7183 Create a distributed and single device example [advanced, docathon-h1-2024] (opened Jun 4, 2024 by duncantech)
- #7178 Run and suggest improvements for GPU setup [docathon-h1-2024, medium] (opened Jun 4, 2024 by duncantech)
- #7177 Why not register low-precision autocast for scaled dot product attention? (opened Jun 4, 2024 by lingzhi98)
- #7169 Persistent Cache will not recompile when XLA_IR_DEBUG and XLA_HLO_DEBUG are changed (opened Jun 3, 2024 by JackCaoG)
- #7161 A large number of Tensors (>8000) in the graph will trigger an SPMD sharding error (opened May 31, 2024 by mars1248)