October 11th, 2024

GPU Transcription Inference Container

11.0.1 - GPU Transcription Inference Container

GPU Inference Containers are released in sync with the Real-time/Batch containers they support. You should only rely on an Inference Container working with a Real-time or Batch container when the two share the same version number.
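The version-matching rule above can be sketched as a pre-deployment check. This is a minimal illustration only: the image names and registry path below are hypothetical placeholders, not the real Speechmatics registry paths, which come from your deployment documentation.

```shell
#!/bin/sh
# Hypothetical image references; substitute the registry paths from your deployment docs.
BATCH_IMAGE="example.registry/batch-asr:11.0.1"
INFERENCE_IMAGE="example.registry/gpu-inference:11.0.1"

# Extract the tag (everything after the last colon) from each image reference.
batch_ver="${BATCH_IMAGE##*:}"
inference_ver="${INFERENCE_IMAGE##*:}"

# Only deploy the pair when the version tags match exactly.
if [ "$batch_ver" = "$inference_ver" ]; then
  echo "versions match: $batch_ver"
else
  echo "version mismatch: batch=$batch_ver inference=$inference_ver" >&2
  exit 1
fi
```

Running a check like this in a CI or deployment script catches a mismatched Inference/Batch pairing before the containers are started.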

For full details and an implementation guide, see GPU Transcription Inference Container.

  • Compatible with version 11.0.1 of the Batch and Real-time Containers

  • Security fixes. A Software Bill of Materials (SBOM) is available for download from the knowledge base section of the Support Portal.

  • Fix for the previously reported known limitation [DEL-18942], where the inference server could occasionally receive a signal 11 (segmentation fault), emit a series of error logs, and begin to shut down.