February 14th, 2025
GPU Transcription Inference Container
GPU Inference Containers are released in sync with the Real-time/Batch containers they support. An Inference Container is only guaranteed to work with a Real-time/Batch container that has the same version number.
For full details and a guide to implementation, see GPU Transcription Inference Container.
Compatible with version 12.0.0 of the Batch and Real-time Containers
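Because compatibility is only guaranteed between containers with matching version numbers, it can help to verify the versions agree before deploying. The sketch below is a minimal, hypothetical check: the tag variables and image names are placeholders, not the product's actual registry paths.

```shell
#!/bin/sh
# Hypothetical tags -- substitute the actual image tags from your registry.
INFERENCE_TAG="12.0.0"
BATCH_TAG="12.0.0"

# Deploy only when the Inference Container version matches the
# Real-time/Batch container version it will serve.
if [ "$INFERENCE_TAG" = "$BATCH_TAG" ]; then
  echo "versions match ($INFERENCE_TAG): safe to deploy"
else
  echo "version mismatch: inference=$INFERENCE_TAG batch=$BATCH_TAG" >&2
  exit 1
fi
```

A check like this is easy to drop into a CI pipeline step so that mismatched container pairs fail fast rather than surfacing as runtime errors.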
Security fixes. A Software Bill of Materials (SBOM) is available for download from the corresponding release page in our Support Portal.