[IR] Support safetensors in tensor adapters #1933

Open
justinchuby wants to merge 9 commits into main
Conversation

justinchuby (Collaborator) commented on Nov 5, 2024

Load safetensors as numpy arrays. This turns out to be very easy, and we can reuse the ir.Tensor implementation directly: the numpy array produced by safetensors is memory mapped, so we can use it like a normal numpy array without pulling the whole file into memory.
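
For reference, a minimal sketch of the idea (not the exact adapter code in this PR); the file name "model.safetensors" and tensor name "weight" are placeholders:

import safetensors
from onnxscript import ir

# safe_open with framework="numpy" returns arrays backed by a memory map.
with safetensors.safe_open("model.safetensors", framework="numpy") as f:
    array = f.get_tensor("weight")

# Wrap the array in the existing ir.Tensor implementation; no copy is needed.
tensor = ir.Tensor(array, name="weight")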

TODO

  • Add tests
  • Verify bfloat16/fp8 support

justinchuby added the topic: IR (Intermediate representation) label on Nov 5, 2024

codecov bot commented Nov 5, 2024

❌ 93 Tests Failed:

Tests completed: 12126 | Failed: 93 | Passed: 12033 | Skipped: 2447
View the top 3 failed tests by shortest run time
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_0577_test_logsoftmax_example_1
Stack Traces | 0.003s run time
onnxscript\backend\onnx_export_test.py:137: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.10.11\x64\lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_logsoftmax_example_1'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:271: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:139: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_logsoftmax_example_1' (e=No module named 'tests.onnx_backend_test_code.test_logsoftmax_example_1') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_logsoftmax_example_1.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_logsoftmax_example_1.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_logsoftmax_example_1(x: FLOAT[1,3]) -> (FLOAT[1,3]):
E       y = opset13.LogSoftmax(x)
E       return y
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_0693_test_neg_example
Stack Traces | 0.003s run time
onnxscript\backend\onnx_export_test.py:137: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_neg_example'

The above exception was the direct cause of the following exception:
.nox\test_torch_nightly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:271: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:139: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_neg_example' (e=No module named 'tests.onnx_backend_test_code.test_neg_example') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_neg_example.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_neg_example.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_neg_example(x: FLOAT[2]) -> (FLOAT[2]):
E       y = opset13.Neg(x)
E       return y
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_1109_test_shrink_soft
Stack Traces | 0.003s run time
onnxscript\backend\onnx_export_test.py:137: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.10.11\x64\lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_shrink_soft'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:271: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:139: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_shrink_soft' (e=No module named 'tests.onnx_backend_test_code.test_shrink_soft') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_shrink_soft.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_shrink_soft.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset9
E   
E   @script()
E   def bck_test_shrink_soft(x: FLOAT[5]) -> (FLOAT[5]):
E       y = opset9.Shrink(x, bias=1.5, lambd=1.5)
E       return y


self._path = path
self._tensor_name = tensor_name

with safetensors.safe_open(path, framework="numpy") as f:

Check failure (Code scanning / lintrunner): MYPY/attr-defined Error
"safe_open" has no attribute "__enter__". To disable, use # type: ignore[attr-defined]

self._path = path
self._tensor_name = tensor_name

with safetensors.safe_open(path, framework="numpy") as f:

Check failure (Code scanning / lintrunner): MYPY/attr-defined Error
"safe_open" has no attribute "__exit__". To disable, use # type: ignore[attr-defined]
Labels
topic: IR Intermediate representation
1 participant