This example is from the perceiver_io model.
In this model, the aten graph has this code:
The `arange` tensor is 1D with shape (2048). The size of the output should be:

This is lowered to the equivalent code below:
However, the shape becomes:
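The shape change can be illustrated with a small sketch of the tile-padding arithmetic. This is not the ttnn API; `padded_tile_shape` is a hypothetical helper that mimics what TILE_LAYOUT does to a tensor's shape, assuming the standard 32x32 tile size used by tt-metal:

```python
TILE = 32  # tt-metal tile height/width

def padded_tile_shape(shape):
    """Hypothetical sketch: the padded shape a tensor gets under TILE_LAYOUT.

    A rank-1 tensor is first promoted to rank 2, then the last two
    dimensions are each rounded up to the nearest multiple of 32.
    """
    dims = list(shape)
    if len(dims) == 1:
        dims = [1] + dims  # rank-1 tensors gain a leading dim of 1
    dims[-2] = -(-dims[-2] // TILE) * TILE  # ceil to multiple of 32
    dims[-1] = -(-dims[-1] // TILE) * TILE
    return tuple(dims)

print(padded_tile_shape((2048,)))  # (32, 2048)
```

Under this model, the logical shape is (1, 2048) but the padded shape is (32, 2048), which matches the (1[32], 2048) notation above: a logical dimension of 1 padded out to 32.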
The issue comes from the first `ttnn.from_torch`, which changes the shape of the arange tensor from (2048) to (1[32], 2048) due to tilization. Changing the tensor back to ROW_MAJOR does not remove the extra dimension. The most straightforward fix is to set `layout` to `ROW_MAJOR` in the first `ttnn.from_torch` call and remove the extra device/layout conversions. As a workaround, we could add a call to reshape or squeeze when `arg[0]` of `ttnn.embedding` has a rank of only 1. We can remove this workaround once we have the better fix.