Allow for local LLMs

I’d love to see t3.chat connect to a local Ollama endpoint so local LLMs can run and display on the website. Why? Surely we’ve got Open WebUI for such efforts! Well, not for me: somehow it’s broken on my machine, with no way to install and run it locally seamlessly. Also, t3.chat’s UI feels faster and superior to anything I’ve seen around the web: zero complication, fast af, and we’d get the perk of local models sitting alongside the web LLMs, all integrating seamlessly in one place.
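For context, here’s a rough sketch of what this could look like from the browser side. This is just an illustration, not how t3.chat would actually implement it: it assumes Ollama is running on its default port (11434) and that OLLAMA_ORIGINS is set to allow requests from the t3.chat origin; the helper function name and model name are made up.

```ts
// Minimal sketch: calling a local Ollama instance from the browser.
// Assumes Ollama is on the default port and CORS is allowed via OLLAMA_ORIGINS.

interface OllamaChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Hypothetical helper: sends a non-streaming chat request to the local
// Ollama /api/chat endpoint and returns the assistant's reply text.
async function chatWithLocalOllama(
  model: string,
  messages: OllamaChatMessage[],
): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages, stream: false }),
  });
  if (!res.ok) {
    throw new Error(`Ollama request failed: ${res.status}`);
  }
  const data = await res.json();
  return data.message.content;
}

// Example usage:
// const reply = await chatWithLocalOllama("llama3", [
//   { role: "user", content: "Hello from t3.chat!" },
// ]);
```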

Status: Closed

Board: 💡 Feature Request
