Show model selection with prompt

I love the ability to ask different models the same question (it makes it easy to compare models directly), as well as to swap models mid-chat. However, there is currently no way to see which model I asked which prompt, or where in the chat I switched models (the chat prompt text window defaults to whatever model was selected last, regardless of the chat).

It would be great to have some way of knowing which model (who? what? I'm not sure how best to describe chatting with an LLM) I asked each prompt, and whether I switched from a reasoning model to a regular LLM.

An example workflow is to ask DeepSeek R1 or o3-mini to plan out a feature for me, but then get Claude to actually generate some of the code for that feature.

Status: Completed
Board: 💡 Feature Request
