ERROR:root:Exporting to ONNX failed #338
dezorianguy asked this question in Q&A (unanswered)
I am trying to use NVIDIA TensorRT within my Stable Diffusion Forge environment.
I use Stability Matrix to manage my Stable Diffusion packages and model installations. In Forge, I installed the TensorRT extension and enabled SD Unet in the interface, but when I try to export an engine for a model, I get the following errors in the terminal:
ERROR:root:Exporting to ONNX failed. module 'torch.nn.functional' has no attribute 'scaled_dot_product_attention'
Building TensorRT engine... This can take a while, please check the progress in the terminal.
Building TensorRT engine for C:\StabilityMatrix\Data\Packages\Stable Diffusion WebUI Forge\models\Unet-onnx\autismmixSDXL_autismmixConfetti.onnx: C:\StabilityMatrix\Data\Packages\Stable Diffusion WebUI Forge\models\Unet-trt\autismmixSDXL_autismmixConfetti_10047b0e_cc86_sample=2x4x128x128+2x4x128x128+2x4x128x128-timesteps=2+2+2-encoder_hidden_states=2x77x2048+2x77x2048+2x77x2048-y=2x2816+2x2816+2x2816.trt
Could not open file C:\StabilityMatrix\Data\Packages\Stable Diffusion WebUI Forge\models\Unet-onnx\autismmixSDXL_autismmixConfetti.onnx
Could not open file C:\StabilityMatrix\Data\Packages\Stable Diffusion WebUI Forge\models\Unet-onnx\autismmixSDXL_autismmixConfetti.onnx
[W] 'colored' module is not installed, will not use colors when logging. To enable colors, please install the 'colored' module: python3 -m pip install colored
[E] ModelImporter.cpp:773: Failed to parse ONNX model from file: C:\StabilityMatrix\Data\Packages\Stable Diffusion WebUI Forge\models\Unet-onnx\autismmixSDXL_autismmixConfetti.onnx
[!] Failed to parse ONNX model. Does the model file exist and contain a valid ONNX model?
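For reference, a minimal check like the one below, run inside the same Python environment that Forge uses, should show whether torch.nn.functional actually exposes scaled_dot_product_attention there. This is just a diagnostic sketch, not part of the extension or the original log:

```python
# Diagnostic sketch: run inside the Python environment that Forge uses.
# If this prints False, the export is running against an older or different
# torch than the torch-2.3.1+cu121 listed under Environment below.
import torch
import torch.nn.functional as F

print("torch version:", torch.__version__)
print("has scaled_dot_product_attention:",
      hasattr(F, "scaled_dot_product_attention"))
```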
Environment
NVIDIA GPU: GeForce RTX 3060 12GB
NVIDIA Driver Version: latest
CUDA Version: 12.1
Operating System: Windows 11
Python Version: 3.10
PyTorch Version: torch-2.3.1+cu121, torchaudio-2.3.1+cu121, torchvision-0.18.1+cu121, xformers-0.0.26.post1
Baremetal or Container: Baremetal
Relevant Files
Model link: AutismMixXL Confetti
Steps To Reproduce
Commands or scripts:
Install the TensorRT extension in Stable Diffusion Forge.
Enable SD Unet in the interface.
Try to export an engine for the model.
Have you tried the latest release?: Yes
Can this model run on other frameworks?: I have not tried running the ONNX model with ONNXRuntime (polygraphy run <model.onnx> --onnxrt); a minimal file check is sketched below.
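Since the parser reports that it cannot open the .onnx file, the sketch below (an assumption-laden suggestion, not part of the extension) would confirm whether the exported file exists and parses as a valid ONNX model. It uses the path from the log above and requires the onnx package:

```python
# Sketch only: verify the exported ONNX file exists and is structurally valid.
# Requires the `onnx` package; the path is the one reported in the log above.
import os
import onnx

onnx_path = (r"C:\StabilityMatrix\Data\Packages\Stable Diffusion WebUI Forge"
             r"\models\Unet-onnx\autismmixSDXL_autismmixConfetti.onnx")

print("file exists:", os.path.isfile(onnx_path))
if os.path.isfile(onnx_path):
    # Passing the path (rather than a loaded ModelProto) also handles
    # models larger than 2 GB that use external weight files.
    onnx.checker.check_model(onnx_path)
    print("ONNX model parsed and passed the checker")
```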
I am looking for assistance in resolving these errors to successfully export and run the TensorRT engine.