yjyoon / whisper_server_speaches
whisper_server_speaches / src / faster_whisper_server / routers / misc.py
File name | Commit message | Commit date
.github/workflows | deps: move ui packages to a separate group | 2024-10-01
examples | Update script.sh | 2024-09-03
scripts | chore: fix some ruff errors | 2024-10-01
src/speaches | feat: model unloading | 2024-10-01
tests | feat: model unloading | 2024-10-01
.dockerignore | chore: ignore .env | 2024-05-27
.envrc | init | 2024-05-20
.gitattributes | docs: add live-transcription demo | 2024-05-28
.gitignore | chore: update .gitignore | 2024-07-03
.pre-commit-config.yaml | fix: pre-commit basedpyright not checking all files | 2024-10-01
Dockerfile.cpu | deps: move ui packages to a separate group | 2024-10-01
Dockerfile.cuda | deps: move ui packages to a separate group | 2024-10-01
LICENSE | init | 2024-05-20
README.md | Update README.md | 2024-10-01
Taskfile.yaml | feat: dependency injection | 2024-09-22
audio.wav | docs: update README.md | 2024-05-27
compose.yaml | chore: format compose | 2024-09-11
flake.lock | deps: update flake | 2024-10-01
flake.nix | deps: update flake | 2024-09-08
pyproject.toml | feat: model unloading | 2024-10-01
uv.lock | feat: model unloading | 2024-10-01
File name | Commit message | Commit date
routers | feat: model unloading | 2024-10-01
__init__.py | feat: use `uv` package manager, pin dependencies | 2024-09-08
api_models.py | refactor: update response model names and module name | 2024-10-01
asr.py | refactor: update response model names and module name | 2024-10-01
audio.py | feat: dependency injection | 2024-09-22
config.py | feat: model unloading | 2024-10-01
dependencies.py | feat: model unloading | 2024-10-01
gradio_app.py | chore: fix some ruff errors | 2024-10-01
hf_utils.py | feat: dependency injection | 2024-09-22
logger.py | feat: dependency injection | 2024-09-22
main.py | feat: dependency injection | 2024-09-22
model_manager.py | feat: model unloading | 2024-10-01
text_utils.py | refactor: update response model names and module name | 2024-10-01
text_utils_test.py | refactor: update response model names and module name | 2024-10-01
transcriber.py | refactor: update response model names and module name | 2024-10-01
File name | Commit message | Commit date
__init__.py | refactor: split out app into multiple router modules | 2024-09-22
list_models.py | chore: fix some ruff errors | 2024-10-01
misc.py | feat: model unloading | 2024-10-01
stt.py | feat: model unloading | 2024-10-01
Last commit: 23e0347 by Fedir Zadniprovskyi, 2024-10-01 (feat: model unloading)
from __future__ import annotations

from fastapi import (
    APIRouter,
    Response,
)
import huggingface_hub
from huggingface_hub.hf_api import RepositoryNotFoundError

from faster_whisper_server import hf_utils
from faster_whisper_server.dependencies import ModelManagerDependency  # noqa: TCH001

router = APIRouter()


@router.get("/health")
def health() -> Response:
    return Response(status_code=200, content="OK")


@router.post("/api/pull/{model_name:path}", tags=["experimental"], summary="Download a model from Hugging Face.")
def pull_model(model_name: str) -> Response:
    # Skip the download if a snapshot of the model already exists locally.
    if hf_utils.does_local_model_exist(model_name):
        return Response(status_code=200, content="Model already exists")
    try:
        huggingface_hub.snapshot_download(model_name, repo_type="model")
    except RepositoryNotFoundError as e:
        return Response(status_code=404, content=str(e))
    return Response(status_code=201, content="Model downloaded")


@router.get("/api/ps", tags=["experimental"], summary="Get a list of loaded models.")
def get_running_models(
    model_manager: ModelManagerDependency,
) -> dict[str, list[str]]:
    return {"models": list(model_manager.loaded_models.keys())}


@router.post("/api/ps/{model_name:path}", tags=["experimental"], summary="Load a model into memory.")
def load_model_route(model_manager: ModelManagerDependency, model_name: str) -> Response:
    if model_name in model_manager.loaded_models:
        return Response(status_code=409, content="Model already loaded")
    # Entering the context manager triggers the load; the handler itself does not use the model.
    with model_manager.load_model(model_name):
        pass
    return Response(status_code=201)


@router.delete("/api/ps/{model_name:path}", tags=["experimental"], summary="Unload a model from memory.")
def stop_running_model(model_manager: ModelManagerDependency, model_name: str) -> Response:
    try:
        model_manager.unload_model(model_name)
        return Response(status_code=204)
    except (KeyError, ValueError) as e:
        # Map manager errors to HTTP codes: unknown model -> 404, invalid unload -> 409.
        match e:
            case KeyError():
                return Response(status_code=404, content="Model not found")
            case ValueError():
                return Response(status_code=409, content=str(e))
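For illustration, below is a minimal client sketch that exercises these endpoints with `httpx`. The base URL, port, and model name are assumptions made for the example, not values taken from this repository.

import httpx

BASE_URL = "http://localhost:8000"  # assumed address of a running server instance
MODEL = "Systran/faster-whisper-tiny.en"  # hypothetical model name for illustration

with httpx.Client(base_url=BASE_URL) as client:
    # Liveness check: GET /health returns 200 "OK".
    print(client.get("/health").text)

    # Download the model from Hugging Face (201 on success, 200 if already cached, 404 if unknown).
    print(client.post(f"/api/pull/{MODEL}").status_code)

    # Load the model into memory (201 on success, 409 if already loaded).
    print(client.post(f"/api/ps/{MODEL}").status_code)

    # List currently loaded models.
    print(client.get("/api/ps").json())

    # Unload the model (204 on success, 404 if it was never loaded).
    print(client.delete(f"/api/ps/{MODEL}").status_code)

The router itself is presumably mounted on the FastAPI application in main.py (for example via app.include_router(router)); that wiring is outside this file.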

          
        
    
    