OpenAI just announced their latest Codex models geared toward coding. Although it was designed for Codex, it would probably do great things within Zed's Agentic Editing:
Today, we’re also releasing a smaller version of codex-1, a version of o4-mini designed specifically for use in Codex CLI. This new model supports faster workflows in the CLI and is optimized for low-latency code Q&A and editing, while retaining the same strengths in instruction following and style. It’s available now as the default model in Codex CLI and in the API as codex-mini-latest. The underlying snapshot will be regularly updated as we continue to improve the Codex-mini model.
codex-mini-latest is available in the API as of today and usable within the Codex CLI, although it appears to be supported only by the Responses API (rather than the typical Chat Completions API):
For developers building with codex-mini-latest, the model is available on the Responses API and priced at $1.50 per 1M input tokens and $6 per 1M output tokens, with a 75% prompt caching discount.
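Given the pricing quoted above, a quick back-of-the-envelope cost estimate is easy to sketch. This is a minimal example, assuming the 75% prompt caching discount means cached input tokens are billed at 25% of the normal input rate (the quote doesn't spell that out):

```python
# Rough per-request cost estimate for codex-mini-latest, based on the
# quoted Responses API pricing: $1.50 / 1M input tokens, $6.00 / 1M
# output tokens, with a 75% prompt caching discount on input.

INPUT_PER_M = 1.50
OUTPUT_PER_M = 6.00
CACHE_DISCOUNT = 0.75  # assumption: cached input billed at 25% of input rate


def estimate_cost(input_tokens: int, output_tokens: int,
                  cached_input_tokens: int = 0) -> float:
    """Return the estimated USD cost of one request."""
    uncached = input_tokens - cached_input_tokens
    cost = (
        uncached / 1_000_000 * INPUT_PER_M
        + cached_input_tokens / 1_000_000 * INPUT_PER_M * (1 - CACHE_DISCOUNT)
        + output_tokens / 1_000_000 * OUTPUT_PER_M
    )
    return round(cost, 6)


# e.g. a 50k-token prompt (40k of it cached) producing 5k output tokens:
print(estimate_cost(50_000, 5_000, cached_input_tokens=40_000))  # → 0.06
```

At these rates, even agentic sessions with large repeated context stay cheap as long as prompt caching kicks in.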
See the official Codex docs for more on typical usage.
Gathering Interest
Feature Request