Long Inference Time on First Run After Changing Input Shape in Dynamic Shape TensorRT Engine #6607

Triggered via issue comment on December 19, 2024 at 22:57
Status: Skipped
Total duration: 5s
Artifacts

Workflow: blossom-ci.yml
on: issue_comment
Jobs:
- Authorization (0s)
- Upload log (0s)
- Vulnerability scan (0s)
- Start ci job (0s)