Chat UI as a Service

As a chat service provider, I’d like to minimize my UI work and focus on the model itself.

In other words, I’d like to be able to provide my LLM through t3.chat, like a Custom GPT but monetized. t3.chat takes a cut and sends me the remainder.

Either approach looks good to me:

  1. I provide a service API conforming to a standard expected by t3.chat, or

  2. I build the model inside t3.chat with a creator view where I can prompt-engineer, upload docs for RAG, etc., similar to the Custom GPT experience from OpenAI.
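For option 1, the de facto standard today is the OpenAI chat-completions wire format, so a provider-side endpoint might return responses shaped like this. A minimal sketch, assuming that format (t3.chat has not specified its expected API, and the model call is stubbed out):

```python
import time
import uuid

def handle_chat_completion(request: dict) -> dict:
    """Build an OpenAI-style chat.completion response for a request body
    like {"model": "...", "messages": [{"role": "user", "content": "..."}]}.
    The actual model inference is a placeholder."""
    # Find the most recent user message to respond to.
    last_user = next(
        (m["content"] for m in reversed(request["messages"]) if m["role"] == "user"),
        "",
    )
    reply = f"Echo: {last_user}"  # placeholder for a real model call
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex[:12]}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": request.get("model", "my-custom-model"),
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": reply},
            "finish_reason": "stop",
        }],
    }
```

A hosting side like t3.chat could then treat any conforming endpoint as a drop-in backend and handle billing and UI on top.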


Status: Closed

Board: 💡 Feature Request
