
Fix signature of torch allocator callbacks #1408

Closed
wants to merge 1 commit

Commits on Dec 12, 2023

  1. Fix signature of torch allocator callbacks

    The deallocation function now also takes the device id, so its
    signature matches the allocation function's (a sketch of the
    resulting pair follows the commit details).
    
    Since both halves of the pair now receive the device on which to
    perform the (de)allocation, we switch from get_current_device_resource
    to the (more correct) get_per_device_resource. This necessitates a
    workaround in Cython: rmm::cuda_device_id has no nullary constructor,
    so it cannot be stack-allocated in the C++ that Cython transpiles to,
    which default-constructs local variables. Instead, heap-allocate the
    id and delete it after use.
    
    - Closes rapidsai#1405
    wence- committed Dec 12, 2023 · commit 7fd1dd1
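Below is a minimal sketch of what the fixed callback pair could look like, assuming hand-abbreviated extern declarations for the rmm headers (the declarations and except clauses here are illustrative, not the actual rmm source or .pxd files). It shows both hooks receiving the device index, and the heap-allocate-then-delete workaround for rmm::cuda_device_id's missing nullary constructor:

```cython
# distutils: language = c++
# cython: language_level=3
# Sketch only: extern declarations abbreviated by hand from the rmm
# headers; they are illustrative, not the real .pxd files.
from cython.operator cimport dereference as deref

cdef extern from "cuda_runtime_api.h" nogil:
    ctypedef void* cudaStream_t

cdef extern from "rmm/cuda_stream_view.hpp" namespace "rmm" nogil:
    cdef cppclass cuda_stream_view:
        cuda_stream_view(cudaStream_t stream)

cdef extern from "rmm/cuda_device.hpp" namespace "rmm" nogil:
    cdef cppclass cuda_device_id:
        cuda_device_id(int dev_id)  # the only constructor; no nullary form

cdef extern from "rmm/mr/device/device_memory_resource.hpp" namespace "rmm::mr" nogil:
    cdef cppclass device_memory_resource:
        void* allocate(size_t bytes, cuda_stream_view stream) except +
        void deallocate(void* ptr, size_t bytes, cuda_stream_view stream)

cdef extern from "rmm/mr/device/per_device_resource.hpp" namespace "rmm::mr" nogil:
    device_memory_resource* get_per_device_resource(cuda_device_id id)

cdef device_memory_resource* resource_for(int device):
    # A stack declaration (`cdef cuda_device_id id_`) would transpile to
    # a default-constructed `rmm::cuda_device_id id_;`, which fails to
    # compile because the type has no nullary constructor. Heap-allocate
    # with the int constructor instead, then delete after use.
    cdef cuda_device_id* device_id = new cuda_device_id(device)
    cdef device_memory_resource* mr = get_per_device_resource(deref(device_id))
    del device_id
    return mr

# Both halves of the pair now take `device`; before the fix the
# deallocation hook was missing it.
cdef public void* allocate(ssize_t size, int device, void* stream) except * with gil:
    cdef device_memory_resource* mr = resource_for(device)
    return mr.allocate(size, cuda_stream_view(<cudaStream_t>stream))

cdef public void deallocate(void* ptr, ssize_t size, int device, void* stream) except * with gil:
    cdef device_memory_resource* mr = resource_for(device)
    mr.deallocate(ptr, size, cuda_stream_view(<cudaStream_t>stream))
```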
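For context, a usage sketch of the Python-side wiring these callbacks support, following the pattern in the rmm documentation (assumes a PyTorch build with the pluggable-allocator API, i.e. torch >= 2.0):

```python
import rmm
import torch
from rmm.allocators.torch import rmm_torch_allocator

# Must run before the first CUDA allocation in the process.
rmm.reinitialize(pool_allocator=True)
torch.cuda.memory.change_current_allocator(rmm_torch_allocator)

x = torch.zeros(1024, device="cuda")  # allocated through RMM's pool
```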