README.md (+6 -6)
@@ -53,9 +53,9 @@ limitations under the License.
 ## Overview
 
 SparseZoo is a constantly-growing repository of highly sparse and sparse-quantized models with matching sparsification recipes for neural networks.
-It simplifies and accelerates your time-to-value in building performant deep learning models with a collection of inference-optimized models and recipes to prototype from.
+It simplifies and accelerates your time-to-value in building performant deep learning models with a collection of inference-sparsified models and recipes to prototype from.
 
-Available via API and hosted in the cloud, the SparseZoo contains both baseline and models optimized to different degrees of inference performance vs. baseline loss recovery.
+Available via API and hosted in the cloud, the SparseZoo contains both baseline and models sparsified to different degrees of inference performance vs. baseline loss recovery.
 Recipe-driven approaches built around sparsification algorithms allow you to take the models as given, transfer-learn from the models onto private datasets, or transfer the recipes to your architectures.
 
 This repository contains the Python API code to handle the connection and authentication to the cloud.
@@ -148,9 +148,9 @@ from sparsezoo import Zoo
 from sparsezoo.models.classification import resnet_50
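For context on the API this README hunk references: the hunk header shows `from sparsezoo import Zoo`, and the context line imports `resnet_50`. Below is a minimal, hypothetical usage sketch built on those two imports; the helper names and keyword arguments (`download()`, `search_models`, `domain`, `sub_domain`) are assumptions, not taken from this diff.

```python
# Hypothetical sketch of the SparseZoo Python API, assuming the imports
# shown in the hunk header above; method and argument names are assumptions.
from sparsezoo import Zoo
from sparsezoo.models.classification import resnet_50

# Construct a ResNet-50 model stub and fetch its files (weights, recipes)
# from the cloud-hosted zoo; the API handles connection and authentication.
model = resnet_50()
model.download()  # assumed helper that pulls the model files locally

# Search the zoo for classification models (argument names assumed).
for result in Zoo.search_models(domain="cv", sub_domain="classification"):
    print(result)
```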
notebooks/model_download.ipynb (+1 -1)
@@ -23,7 +23,7 @@
 "- 15 minutes\n",
 "\n",
 "# Background\n",
-"Neural networks can take a long time to train. Model optimization techniques such as model pruning may be necessary to achieve both performance and optimization goals. However, these model optimizations can involve many trials and errors due to a large number of hyperparameters. Fortunately, in the computer vision and natural language space, pruned (sparsified) neural networks transfer learn.\n",
+"Neural networks can take a long time to train. Model sparsification techniques such as model pruning may be necessary to achieve both performance and sparsification goals. However, the sparsification of models can involve many trials and errors due to a large number of hyperparameters. Fortunately, in the computer vision and natural language space, pruned (sparsified) neural networks transfer learn.\n",
 "\n",
 "To make it easier to use pruned models, Neural Magic is actively:\n",
 "- Creating pruned versions of popular models and datasets\n",