I haven't yet managed to export my custom ViT model, but I've had no trouble accessing the export APIs in torch 2.3 inside the nvcr.io/nvidia/pytorch:24.02-py3 container.
I may have some more time to debug my trace tonight (i.e. remove conditionals from model + make sure everything is on CPU) and will update if I have any new insights.
```
from torch.export import export

...

# Note the trailing comma: args must be a tuple of example inputs,
# so (dummy_input) (a bare parenthesized expression) won't work.
example_args = (dummy_input,)
exported_program = export(model, args=example_args)
```
Links:
- torch.export docs: https://pytorch.org/docs/stable/export.html#serialization
- Using the 24.02 container: https://docs.nvidia.com/deeplearning/frameworks/pytorch-rele...