Get WhisperX running on the cluster

  • Build a container with the WhisperX package; use the Whisper turbo model and provide the model weights via mount points (see the transcription sketch below)
  • Run the container on the cluster
  • Run transcription inference as a batch job
  • Run the container with an API interface, e.g. via Cog (see the predictor sketch below)
  • Push the container image to the GitLab registry
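
A minimal sketch of the transcription step, assuming the weights are pre-downloaded into a directory mounted at /models and the audio lives under /data (both hypothetical paths), and that the turbo checkpoint is exposed to the faster-whisper backend as "large-v3-turbo". The same script could serve as the entrypoint of the batch job:

```python
# Transcription sketch: loads WhisperX from mounted weights and prints segments.
# Paths, model name, and batch size are assumptions, not fixed decisions.
import whisperx

DEVICE = "cuda"                     # assumes a GPU node on the cluster
MODEL_NAME = "large-v3-turbo"       # Whisper turbo checkpoint (assumed name)
MODEL_DIR = "/models"               # hypothetical mount point for model weights
AUDIO_PATH = "/data/example.wav"    # hypothetical input file

# Point download_root at the mounted weights so nothing is fetched at runtime.
model = whisperx.load_model(
    MODEL_NAME,
    DEVICE,
    compute_type="float16",
    download_root=MODEL_DIR,
)

audio = whisperx.load_audio(AUDIO_PATH)
result = model.transcribe(audio, batch_size=16)

# Print timestamped segments as a quick sanity check of the job output.
for segment in result["segments"]:
    print(f'[{segment["start"]:.1f}s -> {segment["end"]:.1f}s] {segment["text"]}')
```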

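For the API variant, Cog expects a predict.py with a BasePredictor subclass plus a cog.yaml describing the Python/CUDA environment. Below is a sketch of such a predictor wrapping the model above; the model name, mount point, and default batch size are the same assumptions as before, not settled choices:

```python
# predict.py -- Cog predictor sketch wrapping WhisperX.
from cog import BasePredictor, Input, Path
import whisperx


class Predictor(BasePredictor):
    def setup(self):
        # Load the model once at container start, reading weights from the
        # mounted /models directory (hypothetical mount point).
        self.model = whisperx.load_model(
            "large-v3-turbo",
            "cuda",
            compute_type="float16",
            download_root="/models",
        )

    def predict(
        self,
        audio: Path = Input(description="Audio file to transcribe"),
        batch_size: int = Input(description="Inference batch size", default=16),
    ) -> str:
        # Run batched transcription and return the concatenated text.
        waveform = whisperx.load_audio(str(audio))
        result = self.model.transcribe(waveform, batch_size=batch_size)
        return " ".join(segment["text"].strip() for segment in result["segments"])
```

Once built with cog build, the endpoint can be exercised locally with something like `cog predict -i audio=@example.wav` before pushing the image to the GitLab registry.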