add batch processing to MessageProcessor and DataModel (#789)
Ezreal09 wants to merge 4 commits into google:main
Conversation
Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA). View this failed invocation of the CLA check for more information. For the most up to date status, view the checks section at the bottom of the pull request.
Code Review
This pull request introduces batch processing for DataModel and MessageProcessor to optimize performance by reducing redundant notifications and re-renders. However, the review identified several issues:

- A high-severity prototype pollution vulnerability in the DataModel.set method: specially crafted paths such as __proto__ allow an attacker to manipulate the prototype of the internal data object or Object.prototype, which could lead to cross-site scripting (XSS) or denial of service.
- A critical bug in MessageProcessor: batching fails for surfaces created within the same message batch.
- A potential performance issue in the notification logic of DataModel.endBatch.
- A test case that should be strengthened to fully verify error-handling behavior.

Addressing these points will make the batching mechanism more robust, reliable, and secure.
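The prototype-pollution finding above can be mitigated by rejecting dangerous path segments before walking the data object. The following is a hypothetical sketch of such a guard; `setByPath` and `FORBIDDEN_KEYS` are illustrative names, not the PR's actual code.

```typescript
// Illustrative guard against prototype pollution in a path-based setter.
const FORBIDDEN_KEYS = new Set(["__proto__", "prototype", "constructor"]);

function setByPath(
  root: Record<string, unknown>,
  path: string,
  value: unknown,
): void {
  const parts = path.split(".");
  // Refuse any segment that could reach Object.prototype.
  if (parts.some((p) => FORBIDDEN_KEYS.has(p))) {
    throw new Error(`Refusing to set unsafe path: ${path}`);
  }
  let node = root;
  for (const part of parts.slice(0, -1)) {
    const next = node[part];
    if (typeof next !== "object" || next === null) {
      // Null-prototype objects have no __proto__ chain to pollute.
      const created = Object.create(null) as Record<string, unknown>;
      node[part] = created;
      node = created;
    } else {
      node = next as Record<string, unknown>;
    }
  }
  node[parts[parts.length - 1]] = value;
}
```

Creating intermediate nodes with `Object.create(null)` is a common belt-and-braces complement to the key blocklist, since such objects have no prototype chain at all.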
Force-pushed from 76b3aa8 to 2abeeb7
Problem
When processMessages() receives multiple updateDataModel messages,
each message triggers notifications immediately, causing redundant
re-renders (N messages = N render cycles).
Solution
Batch notifications within a single processMessages() call:
Changes
DataModel: beginBatch() / endBatch() / clearPending()
MessageProcessor: processMessages() uses batch mode for 2+ messages
Testing