
add batch processing to MessageProcessor and DataModel #789

Open
Ezreal09 wants to merge 4 commits into google:main from Ezreal09:feat/batch-processing-optimization

Conversation


Ezreal09 commented Mar 6, 2026

Problem

When processMessages() receives multiple updateDataModel messages,
each message triggers notifications immediately, causing redundant
re-renders (N messages = N render cycles).

Solution

Batch notifications within a single processMessages() call:

  • DataModel: new beginBatch/endBatch/clearPending APIs
  • MessageProcessor: enters batch mode for multi-message processing
  • Notifications are deferred and deduplicated, then flushed once

Changes

  • DataModel.beginBatch() / endBatch() / clearPending()
  • MessageProcessor.processMessages() uses batch mode for 2+ messages
  • Supports nested batch calls via depth counter
  • Single-message calls use fast path (no batch overhead)
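
A minimal sketch of how the fast-path selection in the list above might look. processMessages is the PR's method name; the BatchModel interface and the try/finally flush are assumptions for illustration:

```typescript
// Hedged sketch: batch mode only for 2+ messages, per the PR description.
// The BatchModel interface is an assumption, not the project's actual API.
interface BatchModel {
  beginBatch(): void;
  endBatch(): void;
  apply(msg: string): void;
}

function processMessages(model: BatchModel, messages: string[]): void {
  if (messages.length < 2) {
    // Single-message fast path: apply directly, no batch bookkeeping.
    for (const m of messages) model.apply(m);
    return;
  }
  model.beginBatch();
  try {
    for (const m of messages) model.apply(m);
  } finally {
    model.endBatch(); // flush even if a message handler throws
  }
}
```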

Testing

  • Added 6 unit tests for DataModel batch processing
  • Added 4 integration tests for MessageProcessor batch behavior
  • All 221 existing tests pass


google-cla bot commented Mar 6, 2026

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up to date status, view the checks section at the bottom of the pull request.


gemini-code-assist bot left a comment


Code Review

This pull request introduces batch processing for DataModel and MessageProcessor to optimize performance by reducing redundant notifications and re-renders. However, a few issues were identified:

  • High severity: a prototype pollution vulnerability in the DataModel.set method allows an attacker to manipulate the prototype of the internal data object, or Object.prototype itself, via specially crafted paths such as __proto__. This could lead to Cross-Site Scripting (XSS) or denial of service.
  • Critical: a bug in MessageProcessor causes batching to fail for surfaces created within the same message batch.
  • A potential performance issue in the notification logic of DataModel.endBatch.
  • One test case should be strengthened to fully verify error-handling behavior.

Addressing these points will make the batching mechanism more robust, reliable, and secure.
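
One way to address the prototype-pollution finding flagged in the review is to reject dangerous path segments before writing. The following is a hedged sketch; the safeSet helper, the blocked-key list, and the dot-separated path format are assumptions for illustration, not the PR's actual code:

```typescript
// Hypothetical guard against prototype pollution via paths like "__proto__".
// Blocking these segments (and using null-prototype intermediate objects)
// prevents writes from reaching Object.prototype.
const FORBIDDEN_KEYS = new Set(["__proto__", "prototype", "constructor"]);

function safeSet(
  target: Record<string, unknown>,
  path: string,
  value: unknown
): void {
  const parts = path.split(".");
  if (parts.some((p) => FORBIDDEN_KEYS.has(p))) {
    throw new Error(`Refusing to set unsafe path: ${path}`);
  }
  let obj = target;
  for (const part of parts.slice(0, -1)) {
    if (typeof obj[part] !== "object" || obj[part] === null) {
      obj[part] = Object.create(null); // null prototype: nothing to pollute
    }
    obj = obj[part] as Record<string, unknown>;
  }
  obj[parts[parts.length - 1]] = value;
}
```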

@Ezreal09 Ezreal09 force-pushed the feat/batch-processing-optimization branch from 76b3aa8 to 2abeeb7 Compare March 6, 2026 09:16