
Run OpenAI Codex CLI on Your Phone (iPhone + Android)

By Cosyra Editorial Team

9 min read

You can run OpenAI Codex CLI on your phone today, iPhone or Android. The fastest path: install Cosyra for iOS or Cosyra for Android, sign in, and type codex in the terminal. We pre-install Codex CLI in an Ubuntu 24.04 container in the cloud, so there is no Node install, no Termux ARM headaches, no SSH tunnel to babysit. Add your OpenAI API key, describe what you want, review the diff. Free trial, 10 hours or 7 days, no credit card.

This guide was written by the Cosyra team. We tested Codex CLI on iPhone and Android via Cosyra, plus the two main alternatives, in April 2026. Where we make competitor claims, they are dated 2026-04-16.

What is OpenAI Codex CLI?

OpenAI Codex CLI is OpenAI's official command-line coding agent, a Node.js binary you install on Linux or macOS that reads your codebase, proposes patches, runs commands in a sandboxed working directory, and asks for approval on outside-world actions. It is not the ChatGPT Codex feature and it is not a web product. It is a real shell tool, which is why getting it onto a phone takes a bit of work.

The source lives at github.com/openai/codex. You install it with npm install -g @openai/codex on Linux or macOS, authenticate with an OpenAI API key or your ChatGPT plan, and run codex inside a project directory.
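On a Linux or macOS machine, the install-and-run loop looks like this sketch; the key value is a placeholder and the project path is a hypothetical example.

```shell
# Install Codex CLI globally (requires Node.js 20+)
npm install -g @openai/codex

# Authenticate this shell session with an API key
export OPENAI_API_KEY="sk-proj-your-key-here"

# Run it from inside the project you want it to work on
cd ~/projects/my-app    # hypothetical project path
codex
```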

How can you run Codex CLI on a phone?

You can run Codex CLI on a phone three ways: a cloud Ubuntu container with Codex pre-installed reached from a native mobile app (Cosyra), Termux on Android plus Tailscale plus SSH into a laptop or VPS, or Blink Shell on iOS into a VPS you own. Codex CLI needs a real Linux or macOS shell, so every approach puts the actual Codex process somewhere else and gives your phone a good way to talk to it. The three options below all work as of 2026-04-16.

1. Cosyra: cloud Ubuntu container, Codex pre-installed

This is what we built. A native iOS and Android terminal that connects to a persistent Ubuntu 24.04 container in the cloud. Codex CLI, Claude Code, Gemini CLI, and OpenCode are already installed. You add your OpenAI API key, type codex, and go.

2. Termux + Tailscale + SSH to a laptop or VPS (Android)

Install Termux from F-Droid, install Tailscale on both your phone and a machine that runs Codex, SSH into that machine inside a tmux session, and run codex there. The phone is a dumb terminal, Codex actually runs on the laptop or VPS.
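A minimal sketch of that path, assuming Tailscale is already installed and logged in on both devices; the hostname and tmux session name are examples.

```shell
# On the phone, inside Termux: install an SSH client
pkg install openssh

# SSH to the laptop or VPS over its Tailscale hostname (example name)
ssh you@my-laptop

# On the remote machine: run Codex inside tmux so the session survives
# the phone locking the screen or Termux being backgrounded
tmux new -A -s codex
codex
```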

3. SSH from Blink Shell into your own VPS (iOS only)

On iPhone or iPad, Blink Shell is the gold standard SSH client. Spin up a VPS (Hetzner, Scaleway, DigitalOcean), npm install -g @openai/codex on it, SSH in from Blink inside a tmux session, run codex.
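The one-time VPS setup is roughly this sketch. Ubuntu packaging is shown as an assumption; your distro's packaged Node may be older than Codex requires, in which case install Node 20+ another way.

```shell
# One-time setup on the VPS (Ubuntu shown; adjust for your distro)
sudo apt-get update
sudo apt-get install -y nodejs npm tmux
sudo npm install -g @openai/codex

# Then, from Blink Shell on the iPhone or iPad:
#   ssh you@your-vps
#   tmux new -A -s codex
#   codex
```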

How do you set up Codex CLI on iPhone or Android?

You set up Codex CLI on iPhone or Android in about three minutes with Cosyra: install the app, sign in, confirm codex is on the PATH, paste your OpenAI API key into .bashrc, clone a repo, and run codex. No Node install, no SSH, no Termux ARM build issues. We measured 2 minutes 47 seconds on average across five cold starts on an iPhone 15 Pro and a Pixel 8 on 2026-04-16.


Step 1: Install Cosyra and sign in

Download from the App Store or Google Play. Sign in with Apple, Google, or email. On first launch, we provision a fresh Ubuntu 24.04 container.

cosyra, fresh container first-launch banner

Welcome to Cosyra.
Ubuntu 24.04.1 LTS (x86_64)
Pre-installed: codex, claude, gemini, opencode
Storage: 30 GB persistent (survives hibernation)

$ whoami
cosyra

Step 2: Confirm Codex CLI is there

No install step, Codex CLI is baked into the image. Verify it in one command.

cosyra, verifying codex is installed

$ codex --version
codex 0.45.0
$ which codex
/usr/local/bin/codex

Step 3: Add your OpenAI API key

Get a key from platform.openai.com/api-keys. In the Cosyra terminal, set it as an environment variable and persist it in .bashrc so future sessions pick it up.

cosyra, adding OpenAI API key

$ # Persist the key for future sessions
$ echo 'export OPENAI_API_KEY="sk-proj-your-key-here"' >> ~/.bashrc
$ source ~/.bashrc

$ # Confirm it is readable
$ echo ${OPENAI_API_KEY:0:7}...
sk-proj...

You pay OpenAI directly for tokens at their published rates. We do not proxy, meter, or see the responses.

Step 4: Clone a repo

cosyra, cloning a project

$ git clone https://github.com/your-org/your-project.git
Cloning into 'your-project'...
Receiving objects: 100% (1847/1847), done.
$ cd your-project

Step 5: Run your first Codex prompt

cosyra, running codex on a project

$ codex
codex 0.45.0
workspace: /home/cosyra/your-project
model: gpt-5-codex

> The /api/users endpoint returns 500 on empty body.
  Find the cause, fix it, add a regression test.

Codex reads the relevant files, proposes a patch, and waits for you to approve. You see the diff on your phone screen before anything is written to disk.

Try it free. 10 hours or 7 days of full access. No credit card. App Store / Google Play / Pricing details

What can you actually do with Codex on your phone?

You can refactor a module on the commute, review a teammate's PR from the couch, ship a 2 AM hotfix when you are on call, upgrade a dependency in a waiting room, and pick up the same session on an iPad when you switch devices. The container persists, so each of these fits into a 10 to 25-minute gap without any "get my laptop out" friction.

Abstract claims about "mobile coding" are useless without concrete scenarios, so here are five micro-workflows we actually run from a phone.

Refactor a module on the commute

Subway, 14 minutes to your stop, one seat. Open Cosyra, cd into the service, run codex, type "rename the UserService.fetch method to getUser and update all callers, including tests." Codex lists the 15 files it plans to touch, you approve, it applies the patch, runs the tests, shows the green output. Commit and push before you surface.

Review a teammate's PR from the couch

Saturday morning, coffee, couch. A PR notification lands. You do not want to open a laptop. Fetch the branch in Cosyra, point Codex at it: "compare this branch against main, focus on security and missing error handling, write a short review." Paste the output into the GitHub PR comment from the mobile browser.
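The couch review reduces to a few commands. The branch name is made up; recent Codex CLI releases ship a non-interactive `codex exec` mode, and plain interactive `codex` works if yours does not.

```shell
# Fetch and check out the PR branch (branch name is an example)
git fetch origin feature/auth-refactor
git switch feature/auth-refactor

# Ask for a focused, one-shot review; paste the output into GitHub
codex exec "Compare this branch against main. Focus on security and \
missing error handling. Write a short review for a GitHub comment."
```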

Ship a hotfix when you are on call

Pager goes off at 2 AM. You are in bed. Open Cosyra, pull, run codex with the stack trace, tell it to reproduce and fix. Review the diff on your phone, approve, push, watch the deploy pipeline go green. Go back to sleep. We did this twice in March 2026 and logged under 9 minutes from pager to green.

Upgrade a dependency in a waiting room

Dentist is running 25 minutes late. You have been putting off bumping an annoying dependency. Open Cosyra: "upgrade axios from 0.27 to 1.x, update any broken call sites, make sure tests still pass." Codex handles the migration one file at a time, shows you each diff, and you approve or reject as you go.

Test from an iPad without installing anything

An iPad with a keyboard case is a surprisingly good Codex machine via Cosyra. Same container state as your phone, so you pick up mid-session when you switch devices. The state persists because the container persists, not because we sync some client cache.

What are the real limits of running Codex CLI on a phone?

The real limits are no offline mode, OpenAI token spend adds up fast, phone keyboards slow long prompts, Codex defaults to looser permissions than Claude Code, and Docker-in-Docker is not guaranteed inside the container. We do not think Codex on a phone replaces a laptop for everything, but knowing where it stops helps you match the tool to the task instead of fighting the tool.


How does Cosyra compare to SSH and Termux for Codex CLI?

Cosyra wins for zero-setup and dual-platform (iOS and Android) phone use, Termux plus SSH wins if you already run a home server and are an Android user, and Blink Shell plus a VPS wins if you are an iOS user who wants a box you own. None of them are strictly best; each maps to a specific trade-off. The table below lines them up on seven attributes as of 2026-04-16.

Comparison is as of 2026-04-16.

Feature | Cosyra | Termux + SSH | Blink + VPS
Codex pre-installed | Yes | You install on remote host | You install on VPS
Platforms | iOS + Android | Android only (Termux) | iOS only (Blink)
Requires always-on machine | No | Yes | Yes (your VPS)
Phantom process killer risk | No | Yes (Android 12+) | No
Persistent storage | 30 GB | Depends on host | Whatever you provision
Setup time (cold) | ~3 min | 30 to 60 min | 30 to 60 min
Price (not counting OpenAI tokens) | $29.99/mo after trial | Free + hardware/VPS | $19.99/yr + VPS (~$5-40/mo)

For the broader picture of all four AI agents on mobile (Claude Code, Codex CLI, OpenCode, Gemini CLI), see our pillar guide on AI coding agents on mobile. If you want the same walkthrough for Anthropic's tool, see Claude Code on phone.

Frequently asked questions

Can you run OpenAI Codex CLI on a phone?

Yes, indirectly. Codex CLI wants a real Linux or macOS shell, so the three working paths are a cloud Ubuntu container you reach from a native mobile app (Cosyra), Termux plus SSH to a laptop or VPS that actually runs Codex, or Blink Shell on iOS into a VPS. The Codex CLI project page on GitHub lists Linux and macOS as supported; there is no iOS or Android build.

How do I use Codex CLI remotely from my mobile device?

The OpenAI developer community thread on this recommends spinning up a cloud VPS, installing Codex, and SSHing in from an iPad or large Android phone. A simpler path is a cloud container with Codex already installed, which is what we pre-configure in Cosyra. Either way, the Codex process runs somewhere with a real shell, not on the phone itself.

Does Codex CLI work in Termux on Android?

Yes, users on the openai/codex GitHub discussions have reported running it "perfectly" in Termux as early as the 0.x releases. Install Node 20+ with pkg install nodejs, then npm install -g @openai/codex, then export your API key. Two real-world caveats: Android 12 and later can kill long-running background Codex sessions via the phantom process killer, and Termux is ARM userland, so any Codex-native dependency needs an arm64 build. For long refactors we prefer a cloud x86_64 container.
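Collected in one place, the Termux path from the answer above looks like this; the key value is a placeholder.

```shell
# Inside Termux on Android (Termux repo ships Node 20+)
pkg install nodejs
npm install -g @openai/codex
export OPENAI_API_KEY="sk-proj-your-key-here"
codex --version
```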

Is there an official OpenAI Codex mobile app?

Not for the CLI. OpenAI ships a Codex feature inside the ChatGPT iOS app for cloud tasks, and a separate Codex CLI on GitHub for terminal use. There is a GitHub discussion with an active feature request to remote control the Codex CLI from the ChatGPT app, but as of 2026-04-16 it is not shipped. Community tools like codex-terminal-phone exist but require self-hosting.

How much does it cost to run Codex CLI from a phone?

Two bills. OpenAI tokens at their published API rates (or your ChatGPT plan's Codex quota), plus the environment. We have a free trial (10 hours or 7 days, no credit card), Pro is $29.99/month. A VPS plus Blink Shell is cheaper on paper but you maintain the VPS. Termux is free if you already have the phone, but you still need a remote Linux box for Codex itself.

tl;dr

Codex CLI runs on Linux or macOS, not on phones directly. Three bridges: a cloud Ubuntu container with Codex pre-installed (Cosyra), Termux plus SSH to a laptop or VPS, or Blink Shell on iOS into your own VPS. We recommend the cloud container for most people because no laptop needs to stay awake and the Android phantom process killer is not a factor.

App Store / Google Play. Free trial, 10 hours or 7 days. No credit card.

Run Codex CLI from your phone in three minutes. Install Cosyra, add your OpenAI API key, type codex.

See pricing