running inference on single image overloads gpu memory #5

Description

@shahyaan

Hi,

When trying to run inference on a test image using your script, I get a "CUDA out of memory" error. My image is 640x480, and my GPU has 24 GB of memory. I'd appreciate it if you could help me resolve this. Thanks!
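For context, a common cause of out-of-memory errors at inference time in PyTorch is running the forward pass with autograd enabled, which keeps every intermediate activation alive for a backward pass that never happens. A minimal sketch of memory-safe inference is below; the tiny `Conv2d` stand-in model and the `run_inference` helper are hypothetical placeholders, not this repository's actual model or script.

```python
import torch

def run_inference(model, image_tensor):
    """Run a forward pass without building the autograd graph.

    torch.no_grad() stops PyTorch from retaining intermediate
    activations for backprop, which is a frequent cause of CUDA
    out-of-memory errors when a training-style forward pass is
    reused for inference.
    """
    model.eval()  # put layers like dropout/batchnorm in eval mode
    with torch.no_grad():
        return model(image_tensor)

# Tiny stand-in model and a single 640x480 RGB image (batch of one).
# Substitute the repository's real model and preprocessing here.
model = torch.nn.Conv2d(3, 8, kernel_size=3, padding=1)
image = torch.randn(1, 3, 480, 640)
out = run_inference(model, image)
print(out.shape)  # torch.Size([1, 8, 480, 640])
```

If the error persists even under `torch.no_grad()`, the usual next steps are checking that only a single image (not the whole dataset) is moved to the GPU at once, and trying half precision via `model.half()` on the inputs and weights.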
