Commit

Fix gpuCount status update logic
Scott Davidson committed Oct 2, 2023
1 parent 932180b commit 308718e
Showing 1 changed file with 4 additions and 2 deletions: python/perftest/models/v1alpha1/pytorch.py
@@ -221,9 +221,11 @@ async def pod_modified(
         fetch_pod_log: t.Callable[[], t.Awaitable[str]]
     ):
         # Set default GPU count if none given in spec
-        gpu_count = pod.get("status", {}).get("gpuCount")
-        if gpu_count is None:
+        # (have to do this in status since spec is immutable)
+        if self.spec.gpu_count is None:
             self.status.gpu_count = (0 if self.spec.device == "cpu" else 1)
+        else:
+            self.status.gpu_count = self.spec.gpu_count
 
         pod_phase = pod.get("status", {}).get("phase", "Unknown")
         if pod_phase == "Running":
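For context, the defaulting behaviour introduced by this commit can be illustrated in isolation. The sketch below is a minimal stand-in, not the real resource classes from pytorch.py: the `Spec` and `Status` dataclasses are hypothetical scaffolding, and only the field names (`gpu_count`, `device`) and the branching logic are taken from the diff.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Spec:
    # Hypothetical stand-in for the immutable, user-supplied spec
    device: str = "gpu"
    gpu_count: Optional[int] = None


@dataclass
class Status:
    # Hypothetical stand-in for the mutable status block
    gpu_count: Optional[int] = None


def apply_gpu_count_default(spec: Spec, status: Status) -> None:
    """Mirror the commit's logic: record the effective GPU count in status."""
    if spec.gpu_count is None:
        # CPU runs get 0 GPUs; any other device defaults to a single GPU
        status.gpu_count = 0 if spec.device == "cpu" else 1
    else:
        # An explicit spec value is copied through unchanged
        status.gpu_count = spec.gpu_count


# Example: a GPU spec with no explicit count defaults to 1
spec, status = Spec(device="cuda"), Status()
apply_gpu_count_default(spec, status)
assert status.gpu_count == 1
```

The effective value lives in status rather than spec because, as the added comment in the diff notes, the spec is immutable once the resource is created.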
