
Version 0.3.1

Pre-release
@WilliamKarolDiCioccio WilliamKarolDiCioccio released this 06 Jul 15:07
· 2 commits to main since this release

✨ Customization, Customization, Customization! ✨

Note: With this release, automatic updates are broken on previous versions due to changes in how app files are distributed.

🔧 Model-Specific Settings

With this release of OpenLocalUI, we're taking customization to a whole new level. You can now fine-tune every available Ollama client setting on a per-model basis, giving you unparalleled control over your models.

🐧 Landing on Linux

As promised in our last release, we're excited to announce that OpenLocalUI is now available on Linux! We're thrilled to welcome the entire Tux community.

😥 Note: The TTS (text-to-speech) feature is currently unavailable on Linux because the application relies on the Windows-specific pyttsx3 Python package.

📦 Note: Automatic updates are also unavailable on Linux; starting with the next release, we will distribute updates through apt and dnf.

🚨 Note: The RPM package may present issues during installation. We're investigating the cause and will update the release asset if we find a fix.

🙏 A Big Thank You!

We want to express our heartfelt gratitude to:

  • My Collaborators: Your support and contributions have been invaluable.
  • Developers of Dependencies: Your incredible work has made this release possible.
  • Our Users: Thank you for your feedback and for being part of our journey. Your insights help us improve with every update.

Thank you all for your continued support!