
Conversation

mvccn

@mvccn mvccn commented Oct 14, 2025

Why

After long desktop sessions we'd like to walk away, open an iPhone, and keep editing: dictating or finishing tasks without losing momentum. As the models mature we rely less and less on full-featured code editors on the desktop, and I believe that working entirely from a mobile device is finally realistic.

What

This PR delivers part one of that effort: a WebSocket bridge that lets any remote client speak directly to the Codex JSON-RPC API. The AppServer is shown in the screenshot below:
Screenshot 2025-10-15 at 00 44 23

Part two is an iOS app I have already developed that mirrors most of the TUI functionality:
Simulator Screenshot - iPhone 16 Pro - 2025-10-15 at 01 04 14
Simulator Screenshot - iPhone 16 Pro - 2025-10-15 at 01 13 44

Part three is to connect the TUI through the same bridge so all clients share the exact same live session. It would be as simple as adding -r (--remote-access) to the CLI command.

How

Adds the codex-app-server-ws binary, which exposes the existing JSON-RPC server over WebSocket, along with the supporting architecture and documentation (REMOTE_ACCESS.md). Remote clients reuse the in-process AppServerEngine, so sessions and auth state stay unified across desktop and mobile.
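As a rough sketch of what travels over the bridge, a remote client frames standard JSON-RPC 2.0 envelopes as WebSocket text messages. The method name below is illustrative only, not taken from this PR:

```rust
// Hand-rolled JSON-RPC 2.0 request envelope, as a remote WebSocket client
// might send it to the bridge. The method name "newConversation" is an
// assumption for illustration; consult the app-server protocol docs for
// the real method names and parameter shapes.
fn jsonrpc_request(id: u64, method: &str, params: &str) -> String {
    format!(r#"{{"jsonrpc":"2.0","id":{id},"method":"{method}","params":{params}}}"#)
}

fn main() {
    let msg = jsonrpc_request(1, "newConversation", "{}");
    println!("{msg}");
}
```

A production client would build this with a JSON serializer rather than string formatting, but the wire shape is the same.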

Test

  • cargo test -p codex-app-server-ws
  • cargo test -p codex-app-server
  • cargo test --all-features

mvccn added 2 commits October 14, 2025 17:57
- New crate  exposing a WebSocket JSON-RPC bridge for the in-process App Server engine
- Add  in  with  for embedding
- Support shared Auth/Conversation managers and emit  as a top-level server notification
- Wire workspace + lockfile; add Just targets for focused tests

Tests:  and  pass locally

App-server: tag WS sessions and bound outgoing queues
update document

github-actions bot commented Oct 14, 2025

All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.

@mvccn mvccn force-pushed the feat/app-server-ws branch from c23d73f to 05dbec2 on October 14, 2025 at 17:22
Contributor

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

async fn add_conversation_listener(
    &mut self,
    request_id: RequestId,
    params: AddConversationListenerParams,
) {
    let AddConversationListenerParams { conversation_id } = params;
    let Ok(conversation) = self
        .conversation_manager
        .get_conversation(conversation_id)
        .await
    else {
        let error = JSONRPCErrorError {
            code: INVALID_REQUEST_ERROR_CODE,
            message: format!("conversation not found: {conversation_id}"),
            data: None,
        };
        self.outgoing.send_error(request_id, error).await;
        return;
    };
    let subscription_id = Uuid::new_v4();
    let (cancel_tx, mut cancel_rx) = oneshot::channel();
    self.conversation_listeners
        .insert(subscription_id, cancel_tx);
    let outgoing_for_task = self.outgoing.clone();
    let pending_interrupts = self.pending_interrupts.clone();
    tokio::spawn(async move {
        loop {
            tokio::select! {
                _ = &mut cancel_rx => {
                    // User has unsubscribed, so exit this task.
                    break;
                }
                event = conversation.next_event() => {
                    let event = match event {
                        Ok(event) => event,
                        Err(err) => {
                            tracing::warn!("conversation.next_event() failed with: {err}");
                            break;

P1: Fan-out to multiple conversation listeners drops events

The new WebSocket bridge intends to let several clients observe the same session, but each addConversationListener spawns a task that reads directly from conversation.next_event(). CodexConversation::next_event is backed by a single mpsc::Receiver, so every spawned listener competes for the same stream rather than receiving a copy. When two clients subscribe to the same conversation, whichever task wins the next_event race consumes the event and the other listeners block forever, leaving those clients with missing or stalled updates. The server therefore cannot actually mirror a live session to multiple WS connections. To support concurrent viewers, events need to be fanned out from a single reader (e.g., via broadcast or multiplexing) instead of each listener calling next_event independently.

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

@mvccn
Author

mvccn commented Oct 14, 2025

I have read the CLA Document and I hereby sign the CLA

github-actions bot added a commit that referenced this pull request Oct 14, 2025
