We're thrilled to announce the release of two lightweight Bambara ASR models from our early fine-tuning experiments! These models achieve near-SOTA performance for open-source Bambara ASR (as of Feb 2025) while remaining efficient enough for real-world deployment.
🧐 Why Does This Matter?
Bambara is a severely under-resourced language. While ASR research has made rapid progress in major languages, open-source models for Bambara remain rare. Our goal is to share our fine-tuning experience to help push research forward and support the development of ASR for African languages.
💡 Key Takeaway: These lightweight models are optimized for real-time ASR in low-resource environments.
🛠️ Open-Sourcing for Research & Community Feedback
These models are the result of early fine-tuning experiments, not finished products. We're making them available for research and community contributions.