Describe the bug
When loading an ONNX model that contains a constant tensor, the tensor doesn't seem to get populated with its values. I'm not 100% sure whether this is a bug or whether I'm doing something wrong. I've been trying it with models generated specifically for testing.
To Reproduce
I have an ONNX test in a branch on my fork - constant_tensor_f32. It uses an ONNX model generated from this PyTorch script - constant_tensor_f32.onnx - and tries to add a constant tensor [[2,2],[2,2]] to the input [[0,0],[0,0]].
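For reference, a minimal PyTorch script along these lines can produce that kind of model (this is a sketch, not necessarily the exact script used to generate constant_tensor_f32.onnx; the module name, tensor names, and opset are assumptions):

```python
import torch
import torch.nn as nn


class ConstantAdd(nn.Module):
    """Adds a fixed 2x2 float32 constant to the input."""

    def __init__(self):
        super().__init__()
        # Plain attribute tensor: the tracer bakes it into the ONNX graph
        # as a constant rather than treating it as a model input.
        self.const = torch.full((2, 2), 2.0)

    def forward(self, x):
        return x + self.const


model = ConstantAdd()
dummy = torch.zeros(2, 2)
torch.onnx.export(
    model,
    (dummy,),
    "constant_tensor_f32.onnx",
    input_names=["input"],
    output_names=["output"],
    opset_version=16,
)
```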
Running the ONNX tests (`cargo nextest run --manifest-path crates/burn-import/onnx-tests/Cargo.toml`) gives this error:

Expected behavior
The test should pass with the expected output [[2,2],[2,2]].
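As a sanity check that the model itself behaves this way (independent of burn-import), running it through onnxruntime should already give that result. This assumes the graph input is named "input", which may differ in the actual model:

```python
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "constant_tensor_f32.onnx", providers=["CPUExecutionProvider"]
)

# All-zeros input from the test case; the output should equal the constant.
x = np.zeros((2, 2), dtype=np.float32)
(output,) = session.run(None, {"input": x})
print(output)  # expected: [[2. 2.] [2. 2.]]
```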
Screenshots
In Netron the model looks like it contains the [[2,2],[2,2]] constant tensor.
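To double-check what Netron shows, the constant's values can also be read straight out of the file with the onnx Python package (a rough sketch; depending on how the exporter stored it, the constant may live in a graph initializer or in a Constant node attribute):

```python
import onnx
from onnx import numpy_helper

model = onnx.load("constant_tensor_f32.onnx")

# Case 1: the constant is stored as a graph initializer.
for init in model.graph.initializer:
    print("initializer", init.name, numpy_helper.to_array(init))

# Case 2: the constant is stored as a Constant node with a tensor attribute.
for node in model.graph.node:
    if node.op_type == "Constant":
        for attr in node.attribute:
            if attr.name == "value":
                print("Constant node", node.name, numpy_helper.to_array(attr.t))
```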
Desktop (please complete the following information):

The `ConstantNode` functionality in the linked test branch shouldn't be different from that commit.

Additional context
Running `cargo run -p burn-import` over that model, the generated Rust code looks like it initializes the constant tensor to all zeros.