Found 16 repositories (showing 16)
neuralmagic
Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models
By the end of this post, you will learn how to: train a SOTA YOLOv5 model on your own data; sparsify the model using SparseML quantization-aware training, sparse transfer learning, and one-shot quantization; and export the sparsified model and run it with the DeepSparse engine at high speed. P.S. The end result: YOLOv5 on CPU at 180+ FPS.
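The one-shot sparsification mentioned above is typically magnitude pruning: weights with the smallest absolute values are zeroed until a target sparsity is reached. A minimal pure-Python sketch of that idea (this is an illustrative stand-in, not SparseML's actual API, which operates on framework tensors via recipes):

```python
def magnitude_prune(weights, sparsity):
    """Return a copy of `weights` with the smallest-magnitude entries
    set to 0.0 until roughly `sparsity` fraction of entries are zero."""
    n_prune = int(len(weights) * sparsity)
    if n_prune == 0:
        return list(weights)
    # Threshold = magnitude of the n_prune-th smallest weight.
    # (Ties at the threshold may prune slightly more than n_prune.)
    threshold = sorted(abs(w) for w in weights)[n_prune - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

weights = [0.9, -0.05, 0.4, 0.01, -0.7, 0.03]
print(magnitude_prune(weights, 0.5))
# -> [0.9, 0.0, 0.4, 0.0, -0.7, 0.0]
```

In real recipes the same rule is applied per layer or globally across the network, and inference engines like DeepSparse then exploit the resulting zeros to skip computation.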
intel-spark
Spark MLlib code optimized to efficiently support sparse data
tchaton
No description available
aagmata
Training a sparsified and pruned version of YOLOv5 through neuralmagic
zewemli
Machine Learning algorithms for sparse data in Julia
100latent
Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models
asu-gkg
No description available
ajkdrag
No description available
Godofnothing
Integration of SparseML + PyTorch for model pruning
guqiqi
No description available
VikasOjha666
No description available
vishnushukl
PyTorch + SparseML experiments exploring magnitude pruning and sparsity/accuracy trade-offs across MNIST, CIFAR-100, and text datasets, with recipes, logs, plots, and ONNX export artifacts.
jazib-sudo
No description available
robertgshaw
Examples using deepsparse engine and sparseml libraries
clementpoiret
No description available