This repository has been archived by the owner on Nov 16, 2023. It is now read-only.

Support 'serverless' scenarios similar to ACI #32

Open
damienpontifex opened this issue Mar 14, 2018 · 1 comment

Comments

@damienpontifex

Azure Container Instances allows you to spin up a container workload by just defining memory and CPU requirements. It would be great if this were possible with Batch AI, removing the idea of having a cluster at all.

You would deploy a job and, within it, define memory, CPU and GPU (or more general machine) requirements, and have them managed for you. That would let the data scientist/developer focus on the job itself.

Looking into it a bit, this seems similar to how Google runs their ML Engine jobs by defining a scale tier, although I much prefer Batch AI's method of using custom containers vs ML Engine's runtime versions to actually run the jobs 😄
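
For concreteness, this is roughly what the per-job resource model already looks like on the ACI side, sketched with the azure-mgmt-containerinstance Python SDK. The resource group, image, and region names are placeholders, and the exact SDK surface may vary by version; the point is just that you declare CPU/memory for a run-once container group and nothing else, with no cluster to provision first.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.containerinstance import ContainerInstanceManagementClient
from azure.mgmt.containerinstance.models import (
    Container, ContainerGroup, ContainerGroupRestartPolicy,
    OperatingSystemTypes, ResourceRequests, ResourceRequirements,
)

# Placeholder subscription; authentication details depend on your environment.
client = ContainerInstanceManagementClient(
    DefaultAzureCredential(), "<subscription-id>"
)

# A single container whose only "sizing" input is CPU and memory.
container = Container(
    name="training-job",
    image="myregistry.azurecr.io/train:latest",  # hypothetical image
    resources=ResourceRequirements(
        requests=ResourceRequests(cpu=2.0, memory_in_gb=8.0)
    ),
)

# Run-once semantics (restart_policy=Never) make this behave like a job.
group = ContainerGroup(
    location="westus2",
    containers=[container],
    os_type=OperatingSystemTypes.LINUX,
    restart_policy=ContainerGroupRestartPolicy.NEVER,
)

client.container_groups.begin_create_or_update("my-rg", "training-job", group)
```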

@SenthuranSivananthan

There are some scenarios that AWS brought up at re:Invent that we can unlock through ACI.

Reference: https://fr.slideshare.net/AmazonWebServices/srv317unlocking-high-performance-computing-for-financial-services-with-serverless-compute
