Parakeet‑TDT 0.6B v3 (MLX) — Model Files

This repository hosts the MLX model files (config + weights + tokenizer) for Parakeet‑TDT v3. It is intentionally files‑only (no widget, no runnable code). Use these files with your own codebase or with the parakeetv3_mlx package in your project.

Contents

  • parakeet-tdt-v3-mlx/config.json
  • parakeet-tdt-v3-mlx/model.safetensors
  • parakeet-tdt-v3-mlx/tokenizer.model
  • parakeet-tdt-v3-mlx/tokenizer.vocab
  • parakeet-tdt-v3-mlx/vocab.txt

How to use

Option A: Download programmatically

from pathlib import Path

from huggingface_hub import snapshot_download
from parakeetv3_mlx.utils import from_pretrained

# Download only the model folder from this repo
local_dir = snapshot_download(
    "Jimmi42/parakeetv3-mlx-files",
    allow_patterns=["parakeet-tdt-v3-mlx/*"],
)
model_dir = Path(local_dir) / "parakeet-tdt-v3-mlx"

# Load the model and transcribe a local audio file
model = from_pretrained(str(model_dir))
res = model.transcribe("audio.wav")
print("".join(t.text for t in res.tokens))

Option B: Download manually

  1. In “Files and versions”, download the files under parakeet-tdt-v3-mlx/ into a local folder.
  2. Load with:
from parakeetv3_mlx.utils import from_pretrained
model = from_pretrained("/path/to/parakeet-tdt-v3-mlx")

Notes

  • Requires the parakeetv3_mlx Python package (your app or local project) and Apple’s MLX.
  • Audio: mono 16 kHz WAV recommended (librosa can resample automatically).
  • Long audio: enable local attention + chunking in your code for the best memory/performance trade‑off.

Benchmarks (Apple Silicon)

  • Settings: chunk=120s, overlap=15s, local attention (256,256), bf16
  • Device: M4 Pro
  Audio (1 h)    Wall time   RTF
  English        43.6 s      82.6×
  German         59.6 s      60.4×

On M4 Max, throughput is typically ~2× higher under the same settings.
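RTF here is the real-time factor: seconds of audio divided by seconds of wall-clock processing. A quick check of the table values (assuming exactly one hour of audio per run):

```python
# Real-time factor = audio duration / wall-clock time.
audio_seconds = 3600  # 1 hour of audio

for lang, wall in [("English", 43.6), ("German", 59.6)]:
    rtf = audio_seconds / wall
    print(f"{lang}: {rtf:.1f}x")  # matches the RTF column above
```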

About the author

Profile: https://huggingface.co/Jimmi42
