0.89.0

Jan 22, 2026
openai/codex CLI · rust-v0.89.0

New Features

  • Added a /permissions command with a shorter approval set while keeping /approvals for compatibility. (#9561)
  • Added a /skill UI to enable or disable individual skills. (#9627)
  • Improved slash-command selection by prioritizing exact and prefix matches over fuzzy matches. (#9629)
  • App server now supports thread/read and can filter archived threads in thread/list. (#9569, #9571)
  • App server clients now support layered config.toml resolution, and config/read can compute the effective config for a given cwd. (#9510)
  • Release artifacts now include a stable URL for the published config schema. (#9572)
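The improved slash-command selection (#9629) ranks exact matches first, then prefix matches, then fuzzy matches. A minimal sketch of that three-tier ranking, with illustrative function names (`rank`, `is_fuzzy_match`) that are assumptions, not Codex's actual internals:

```rust
/// Crude fuzzy check: every query character appears in the
/// candidate, in order (a subsequence match).
fn is_fuzzy_match(query: &str, candidate: &str) -> bool {
    let mut chars = candidate.chars();
    query.chars().all(|q| chars.any(|c| c == q))
}

/// Lower rank sorts first; `None` filters the candidate out.
fn rank(query: &str, candidate: &str) -> Option<u8> {
    if candidate == query {
        Some(0) // exact match wins outright
    } else if candidate.starts_with(query) {
        Some(1) // prefix match beats any fuzzy match
    } else if is_fuzzy_match(query, candidate) {
        Some(2) // fuzzy match only when nothing better applies
    } else {
        None
    }
}

/// Filter and order a command list for a given query. A stable
/// sort keeps the original order within each tier.
fn select<'a>(query: &str, commands: &[&'a str]) -> Vec<&'a str> {
    let mut matched: Vec<&str> = commands
        .iter()
        .copied()
        .filter(|c| rank(query, c).is_some())
        .collect();
    matched.sort_by_key(|c| rank(query, c));
    matched
}
```

For example, with commands `["/permissions", "/approvals", "/skill", "/settings"]` and query `"/s"`, the prefix matches `/skill` and `/settings` now sort ahead of the fuzzy matches `/permissions` and `/approvals`.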

Bug Fixes

  • Prevented tilde expansion from escaping HOME on paths like ~//.... (#9621)
  • TUI turn timing now resets between assistant messages so elapsed time reflects the latest response. (#9599)
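The tilde-expansion fix (#9621) guards against inputs like `~//…` resolving outside HOME: joining a path that still has a leading slash onto the home directory would replace it entirely, since `PathBuf::join` discards the base when given an absolute path. A minimal sketch of a safe expansion, assuming a hypothetical `expand_tilde` helper (not Codex's actual code):

```rust
use std::path::{Path, PathBuf};

/// Expand a leading `~` to `home`, trimming extra leading slashes
/// from the remainder so the result always stays under `home`.
fn expand_tilde(input: &str, home: &Path) -> PathBuf {
    match input.strip_prefix('~') {
        // Bare "~" maps to the home directory itself.
        Some(rest) if rest.is_empty() => home.to_path_buf(),
        // "~/..." or "~//...": strip ALL leading slashes before
        // joining, because home.join("/etc") would yield "/etc".
        Some(rest) if rest.starts_with('/') => {
            home.join(rest.trim_start_matches('/'))
        }
        // "~user" and non-tilde paths pass through unchanged here.
        _ => PathBuf::from(input),
    }
}
```

With this guard, `expand_tilde("~//etc", Path::new("/home/u"))` yields `/home/u/etc` rather than escaping to `/etc`.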

Documentation

  • Updated MCP subcommand docs to match current CLI behavior. (#9622)
  • Refreshed the skills/list protocol README example to match the latest response shape. (#9623)

Chores

  • Removed the TUI2 experiment and its related config/docs, keeping Codex on the terminal-native UI. (#9640)

Changelog

Full Changelog: https://github.com/openai/codex/compare/rust-v0.88.0...rust-v0.89.0

  • #9576 [bazel] Upgrade to bazel9 @zbarsky-openai
  • #9606 nit: ui on interruption @jif-oai
  • #9609 chore: defensive shell snapshot @jif-oai
  • #9621 fix: Fix tilde expansion to avoid absolute-path escape @tiffanycitra
  • #9573 define/emit some metrics for windows sandbox setup @iceweasel-oai
  • #9622 docs: fix outdated MCP subcommands documentation @htiennv
  • #9623 Update skills/list protocol readme @gverma-openai
  • #9616 [bazel] Upgrade llvm toolchain and enable remote repo cache @zbarsky-openai
  • #9624 forgot to add some windows sandbox nux events. @iceweasel-oai
  • #9633 Add websockets logging @pakrym-oai
  • #9592 Chore: update plan mode output in prompt @shijie-oai
  • #9583 Add collaboration_mode to TurnContextItem @charley-oai
  • #9510 Add layered config.toml support to app server @etraut-openai
  • #9629 feat: better sorting of shell commands @jif-oai
  • #9599 fix(tui) turn timing incremental @dylan-hurd-oai
  • #9572 feat: publish config schema on release @sayan-oai
  • #9549 Reduce burst testing flake @charley-oai
  • #9640 feat(tui): retire the tui2 experiment @joshka-oai
  • #9597 feat(core) ModelInfo.model_instructions_template @dylan-hurd-oai
  • #9627 Add UI for skill enable/disable. @xl-openai
  • #9650 chore: tweak AGENTS.md @dylan-hurd-oai
  • #9656 Add tui.experimental_mode setting @pakrym-oai
  • #9561 feat(tui) /permissions flow @dylan-hurd-oai
  • #9653 Fix: Lower log level for closed-channel send @Kbediako
  • #9659 Chore: add cmd related info to exec approval request @shijie-oai
  • #9693 Revert "feat: support proxy for ws connection" @pakrym-oai
  • #9698 Support end_turn flag @pakrym-oai
  • #9645 Modes label below textarea @charley-oai
  • #9644 feat(core) update Personality on turn @dylan-hurd-oai
  • #9569 feat(app-server): thread/read API @owenlin0
  • #9571 feat(app-server): support archived threads in thread/list @owenlin0