Queue messages

I would like to be able to add follow-ups to my message before the model finishes writing out its response. Faster models might partly solve this (they have been slow for me today, running on a Raspberry Pi), but I would still like to press Enter and send the next thing for it to respond to, instead of having to wait for it to finish before I can send another message.

This would be kind of like premoving on chess.com. I can see part of the answer and already have a follow-up; I don't want to have to wait for the model to finish typing before I can ask it.

An even better version of this would address one of my biggest problems with AI chatbots.

When I type, I often send several short messages instead of one long message.

I haven't seen a chat app that supports that yet. Especially with a thinking model like o3-mini-high or Claude 3.7 with reasoning, it would be nice if it took in my next message as well, as long as it has not started showing tokens yet.

Sometimes it is only after I hit send that I realise I need to add something else. I would prefer to quickly type it and press send, rather than having to stop the generation, go up, manually copy the previous prompt, and then add my additional context or question to it.
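The behaviour described above could be sketched client-side. This is a hypothetical illustration (not any real chat app's API, and `ChatSession` and its methods are invented names): messages sent before the model emits its first token get folded into the in-flight prompt, and messages sent after streaming starts are queued as the next turn.

```python
from collections import deque


class ChatSession:
    """Sketch of queued messages: merge follow-ups typed before the
    model starts streaming; queue anything typed after for next turn."""

    def __init__(self):
        self.current_prompt = None   # prompt the model is working on
        self.tokens_emitted = False  # has streaming started?
        self.queue = deque()         # follow-ups held for the next turn

    def send(self, text):
        if self.current_prompt is None:
            # Nothing in flight: this starts a new turn.
            self.current_prompt = text
        elif not self.tokens_emitted:
            # "Premove": generation hasn't started, fold it in.
            self.current_prompt += "\n" + text
        else:
            # Model is already typing: hold it for the next turn.
            self.queue.append(text)

    def on_first_token(self):
        self.tokens_emitted = True

    def on_turn_complete(self):
        self.current_prompt = None
        self.tokens_emitted = False
        if self.queue:
            # Everything queued mid-generation becomes the next prompt.
            self.send("\n".join(self.queue))
            self.queue.clear()
```

So several short messages typed in a row before the first token arrives become one prompt, while anything typed mid-generation is sent automatically as soon as the turn finishes, with no need to stop and copy anything.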


Status: Gathering Interest

Board: 💡 Feature Request
