yjyoon / whisper_server_speaches
whisper_server_speaches / compose.yaml
File name                  Commit message                                       Commit date
.github/workflows          deps: move ui packages to a separate group           2024-10-01
docs                       docs: initialize mkdocs                              2024-10-03
examples                   Update script.sh                                     2024-09-03
scripts                    chore: misc changes                                  2024-10-03
src/speaches               chore: misc changes                                  2024-10-03
tests                      feat: model unloading                                2024-10-01
.dockerignore              chore: ignore .env                                   2024-05-27
.envrc                     init                                                 2024-05-20
.gitattributes             docs: add live-transcription demo                    2024-05-28
.gitignore                 chore: update .gitignore                             2024-07-03
.pre-commit-config.yaml    fix: pre-commit basedpyright not checking all files  2024-10-01
Dockerfile.cpu             deps: move ui packages to a separate group           2024-10-01
Dockerfile.cuda            deps: move ui packages to a separate group           2024-10-01
LICENSE                    init                                                 2024-05-20
README.md                  Update README.md                                     2024-10-04
Taskfile.yaml              feat: dependency injection                           2024-09-22
audio.wav                  docs: update README.md                               2024-05-27
compose.yaml               chore: format compose                                2024-09-11
flake.lock                 deps: update flake                                   2024-10-01
flake.nix                  deps: update flake                                   2024-09-08
mkdocs.yml                 docs: initialize mkdocs                              2024-10-03
pyproject.toml             docs: initialize mkdocs                              2024-10-03
uv.lock                    docs: initialize mkdocs                              2024-10-03
Last commit: ae64a1a · chore: format compose · Fedir Zadniprovskyi · 2024-09-11
# TODO: https://docs.astral.sh/uv/guides/integration/docker/#configuring-watch-with-docker-compose
services:
  faster-whisper-server-cuda:
    image: fedirz/faster-whisper-server:latest-cuda
    build:
      dockerfile: Dockerfile.cuda
      context: .
      platforms:
        - linux/amd64
        - linux/arm64
    restart: unless-stopped
    ports:
      - 8000:8000
    volumes:
      - hugging_face_cache:/root/.cache/huggingface
    develop:
      watch:
        - path: faster_whisper_server
          action: rebuild
    deploy:
      resources:
        reservations:
          devices:
            - capabilities: ["gpu"]
            # If you have CDI feature enabled use the following instead
            # https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/cdi-support.html
            # https://docs.docker.com/reference/cli/dockerd/#enable-cdi-devices
            # - driver: cdi
            #   device_ids:
            #     - nvidia.com/gpu=all
  faster-whisper-server-cpu:
    image: fedirz/faster-whisper-server:latest-cpu
    build:
      dockerfile: Dockerfile.cpu
      context: .
      platforms:
        - linux/amd64
        - linux/arm64
    restart: unless-stopped
    ports:
      - 8000:8000
    volumes:
      - hugging_face_cache:/root/.cache/huggingface
    develop:
      watch:
        - path: faster_whisper_server
          action: rebuild

volumes:
  hugging_face_cache:
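The compose file defines two alternative services, one CUDA build and one CPU build, both publishing port 8000. A minimal usage sketch for bringing up a single service, assuming Docker Compose v2 is installed; the `/v1/audio/transcriptions` path and the model name follow the OpenAI-compatible API this project is generally described as exposing, and are assumptions here rather than something stated on this page:

```shell
# Start only the CPU service; on a host with NVIDIA GPUs and the
# Container Toolkit installed, use faster-whisper-server-cuda instead.
docker compose up --detach faster-whisper-server-cpu

# Transcribe the sample file from the repository root. The endpoint path,
# form field names, and model identifier are assumptions for illustration.
curl http://localhost:8000/v1/audio/transcriptions \
  -F "file=@audio.wav" \
  -F "model=Systran/faster-whisper-small"
```

Starting a named service (rather than plain `docker compose up`) matters here because the two services bind the same host port and cannot run simultaneously.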
Copyright Yona authors & © NAVER Corp. & NAVER LABS. Supported by NAVER CLOUD PLATFORM.