As an engineer running the simulator as a producer
I want the simulator to run a recording by reading data from a cloud bucket
So that I do not have to store large recordings on my local filesystem
In a conversation with the MN team, they asked how they might use the simulator in their k8s cluster to send data through the BN to the MN (listening as a consumer). The MN team will soon be reading block streams uploaded by the CN to cloud buckets. Giving the simulator, as a producer, the ability to read the same data from the same bucket would help them with their testing.
Another helpful scenario: imagine we are troubleshooting an issue in mainnet or testnet where the CN is sending problematic data. We could use the simulator to read the same problematic block stream from a bucket and replay it locally to pinpoint where the items break the Block Node.
Tech Notes
Since we will already be persisting/writing to cloud buckets elsewhere, we might wait until that work is done and then reuse some of it in the simulator to read from a cloud bucket. See EPIC: Upload Blocks to Cloud Buckets (New Ph-2) #357.
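One way to keep the local-filesystem and bucket-backed paths interchangeable is to put a small "recording source" abstraction in front of the replay loop. The sketch below (in Python, purely illustrative; the names `BlockSource` and `LocalDirBlockSource` are hypothetical and not part of the project) shows the idea: the simulator iterates over raw blocks without caring where they live, so a cloud implementation reusing the upload epic's bucket code could slot in behind the same interface.

```python
from pathlib import Path
from typing import Iterator, Protocol


class BlockSource(Protocol):
    """Yields raw block-stream items in order, regardless of where they live."""

    def blocks(self) -> Iterator[bytes]: ...


class LocalDirBlockSource:
    """Reads a recording from a local directory: one file per block,
    replayed in sorted filename order."""

    def __init__(self, directory: Path) -> None:
        self.directory = directory

    def blocks(self) -> Iterator[bytes]:
        for path in sorted(self.directory.iterdir()):
            yield path.read_bytes()
```

A bucket-backed source would implement the same `blocks()` iterator using the cloud provider's SDK (listing objects under a recording prefix and streaming each one down), so the simulator's producer loop would not change at all between local and cloud recordings.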