I usually only need to ask an AI some basic questions, so I default to the smallest, fastest models.
I know that with my usage, the cost of the messages is extremely low. Some services grant a fixed allowance of credits, with heavier models costing more credits per message, which lets me use lighter models frequently while staying very cheap.
It would be nice if very light models cost fewer credits than the heavier "standard" models under the standard credit allowance. For example, GPT-5 Nano or Gemini 2.5 Flash Lite could cost 0.25 credits instead of a full credit. Otherwise, I'd rather use a larger model to "get my money's worth".
Closed
Feature Request