A curated list of neural network pruning papers and related resources. Inspired by awesome-deep-vision, awesome-adversarial-machine-learning, awesome-deep-learning-papers and Awesome-NAS.
Table of Contents

- Type of Pruning
- Papers: 2021, 2020, 2019, 2018, 2017, 2016, 2015
- Related Repo
Type of Pruning

| Type | F | W | Other |
|:----:|:--------------:|:--------------:|:-----------:|
| Explanation | Filter pruning | Weight pruning | Other types |
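The F/W distinction in the legend above can be made concrete with a minimal NumPy sketch (an illustrative assumption, not code from any listed paper): magnitude-based weight pruning (W) zeroes individual small-magnitude weights but keeps the tensor shape, while L1-norm filter pruning (F, in the spirit of "Pruning Filters for Efficient ConvNets") removes whole output filters and shrinks the layer.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy conv weight tensor: (out_channels, in_channels, kH, kW)
W = rng.normal(size=(8, 4, 3, 3))

# Weight pruning ("W"): zero the 50% of individual weights with the
# smallest magnitude; the tensor stays the same shape but becomes sparse.
threshold = np.quantile(np.abs(W), 0.5)
weight_mask = np.abs(W) >= threshold
W_weight_pruned = W * weight_mask

# Filter pruning ("F"): rank output filters by L1 norm and drop the
# weakest half; the tensor becomes physically smaller (structured sparsity).
filter_norms = np.abs(W).reshape(W.shape[0], -1).sum(axis=1)
keep = np.sort(np.argsort(filter_norms)[W.shape[0] // 2:])
W_filter_pruned = W[keep]

print(W_weight_pruned.shape)  # (8, 4, 3, 3) -- same shape, half the weights zero
print(W_filter_pruned.shape)  # (4, 4, 3, 3) -- half the output channels removed
```

This is why filter pruning yields speedups on stock hardware (the pruned layer is a genuinely smaller dense layer), whereas weight pruning needs sparse kernels or dedicated hardware to realize its compression.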
2021

| Title | Venue | Type | Code |
|:------|:-----:|:----:|:----:|
| A Probabilistic Approach to Neural Network Pruning | ICML | F | - |
| Accelerate CNNs from Three Dimensions: A Comprehensive Pruning Framework | ICML | F | - |
| Group Fisher Pruning for Practical Network Compression | ICML | F | PyTorch(Author) |
| On the Predictability of Pruning Across Scales | ICML | W | - |
| Towards Compact CNNs via Collaborative Compression | CVPR | F | PyTorch(Author) |
| Content-Aware GAN Compression | CVPR | F | PyTorch(Author) |
| Permute, Quantize, and Fine-tune: Efficient Compression of Neural Networks | CVPR | F | PyTorch(Author) |
| Network Pruning via Performance Maximization | CVPR | F | - |
| Convolutional Neural Network Pruning with Structural Redundancy Reduction | CVPR | F | - |
| Manifold Regularized Dynamic Network Pruning | CVPR | F | - |
| Joint-DetNAS: Upgrade Your Detector with NAS, Pruning and Dynamic Distillation | CVPR | FO | - |
| A Gradient Flow Framework For Analyzing Network Pruning | ICLR | F | PyTorch(Author) |
| Neural Pruning via Growing Regularization | ICLR | F | PyTorch(Author) |
| ChipNet: Budget-Aware Pruning with Heaviside Continuous Approximations | ICLR | F | PyTorch(Author) |
| Network Pruning That Matters: A Case Study on Retraining Variants | ICLR | F | PyTorch(Author) |
| Multi-Prize Lottery Ticket Hypothesis: Finding Accurate Binary Neural Networks by Pruning A Randomly Weighted Network | ICLR | W | PyTorch(Author) |
| Layer-adaptive Sparsity for the Magnitude-based Pruning | ICLR | W | PyTorch(Author) |
| Pruning Neural Networks at Initialization: Why Are We Missing the Mark? | ICLR | W | - |
| Robust Pruning at Initialization | ICLR | W | - |
2020

| Title | Venue | Type | Code |
|:------|:-----:|:----:|:----:|
| HYDRA: Pruning Adversarially Robust Neural Networks | NeurIPS | W | PyTorch(Author) |
| Logarithmic Pruning is All You Need | NeurIPS | W | - |
| Directional Pruning of Deep Neural Networks | NeurIPS | W | - |
| Movement Pruning: Adaptive Sparsity by Fine-Tuning | NeurIPS | W | PyTorch(Author) |
| Sanity-Checking Pruning Methods: Random Tickets can Win the Jackpot | NeurIPS | W | PyTorch(Author) |
| Neuron Merging: Compensating for Pruned Neurons | NeurIPS | F | PyTorch(Author) |
| Neuron-level Structured Pruning using Polarization Regularizer | NeurIPS | F | PyTorch(Author) |
| SCOP: Scientific Control for Reliable Neural Network Pruning | NeurIPS | F | PyTorch(Author) |
| Storage Efficient and Dynamic Flexible Runtime Channel Pruning via Deep Reinforcement Learning | NeurIPS | F | - |
| The Generalization-Stability Tradeoff In Neural Network Pruning | NeurIPS | F | PyTorch(Author) |
| Pruning Filter in Filter | NeurIPS | Other | PyTorch(Author) |
| Position-based Scaled Gradient for Model Quantization and Pruning | NeurIPS | Other | PyTorch(Author) |
| Bayesian Bits: Unifying Quantization and Pruning | NeurIPS | Other | - |
| Pruning neural networks without any data by iteratively conserving synaptic flow | NeurIPS | Other | PyTorch(Author) |
| EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning | ECCV (Oral) | F | PyTorch(Author) |
| DSA: More Efficient Budgeted Pruning via Differentiable Sparsity Allocation | ECCV | F | - |
| DHP: Differentiable Meta Pruning via HyperNetworks | ECCV | F | PyTorch(Author) |
| Meta-Learning with Network Pruning | ECCV | W | - |
| Accelerating CNN Training by Pruning Activation Gradients | ECCV | W | - |
| DA-NAS: Data Adapted Pruning for Efficient Neural Architecture Search | ECCV | Other | - |
| Differentiable Joint Pruning and Quantization for Hardware Efficiency | ECCV | Other | - |
| Channel Pruning via Automatic Structure Search | IJCAI | F | PyTorch(Author) |
| Adversarial Neural Pruning with Latent Vulnerability Suppression | ICML | W | - |
| Proving the Lottery Ticket Hypothesis: Pruning is All You Need | ICML | W | - |
| Soft Threshold Weight Reparameterization for Learnable Sparsity | ICML | WF | PyTorch(Author) |
| Network Pruning by Greedy Subnetwork Selection | ICML | F | - |
| Operation-Aware Soft Channel Pruning using Differentiable Masks | ICML | F | - |
| DropNet: Reducing Neural Network Complexity via Iterative Pruning | ICML | F | - |
| Towards Efficient Model Compression via Learned Global Ranking | CVPR (Oral) | F | PyTorch(Author) |
| HRank: Filter Pruning using High-Rank Feature Map | CVPR (Oral) | F | PyTorch(Author) |
| Neural Network Pruning with Residual-Connections and Limited-Data | CVPR (Oral) | F | - |
| Multi-Dimensional Pruning: A Unified Framework for Model Compression | CVPR (Oral) | WF | - |
| DMCP: Differentiable Markov Channel Pruning for Neural Networks | CVPR (Oral) | F | TensorFlow(Author) |
| Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression | CVPR | F | PyTorch(Author) |
| Few Sample Knowledge Distillation for Efficient Network Compression | CVPR | F | - |
| Discrete Model Compression With Resource Constraint for Deep Neural Networks | CVPR | F | - |
| Structured Compression by Weight Encryption for Unstructured Pruning and Quantization | CVPR | W | - |
| Learning Filter Pruning Criteria for Deep Convolutional Neural Networks Acceleration | CVPR | F | - |
| APQ: Joint Search for Network Architecture, Pruning and Quantization Policy | CVPR | F | - |
| Comparing Rewinding and Fine-tuning in Neural Network Pruning | ICLR (Oral) | WF | TensorFlow(Author) |
| A Signal Propagation Perspective for Pruning Neural Networks at Initialization | ICLR (Spotlight) | W | - |
| ProxSGD: Training Structured Neural Networks under Regularization and Constraints | ICLR | W | TF+PT(Author) |
| One-Shot Pruning of Recurrent Neural Networks by Jacobian Spectrum Evaluation | ICLR | W | - |
| Lookahead: A Far-sighted Alternative of Magnitude-based Pruning | ICLR | W | PyTorch(Author) |
| Dynamic Model Pruning with Feedback | ICLR | WF | - |
| Provable Filter Pruning for Efficient Neural Networks | ICLR | F | - |
| Data-Independent Neural Pruning via Coresets | ICLR | W | - |
| AutoCompress: An Automatic DNN Structured Pruning Framework for Ultra-High Compression Rates | AAAI | F | - |
| DARB: A Density-Aware Regular-Block Pruning for Deep Neural Networks | AAAI | Other | - |
| Pruning from Scratch | AAAI | Other | - |
| Reborn filters: Pruning convolutional neural networks with limited data | AAAI | F | - |
2019

| Title | Venue | Type | Code |
|:------|:-----:|:----:|:----:|
| Network Pruning via Transformable Architecture Search | NeurIPS | F | PyTorch(Author) |
| Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks | NeurIPS | F | PyTorch(Author) |
| Deconstructing Lottery Tickets: Zeros, Signs, and the Supermask | NeurIPS | W | TensorFlow(Author) |
| Global Sparse Momentum SGD for Pruning Very Deep Neural Networks | NeurIPS | W | PyTorch(Author) |
| AutoPrune: Automatic Network Pruning by Regularizing Auxiliary Parameters | NeurIPS | W | - |
| Model Compression with Adversarial Robustness: A Unified Optimization Framework | NeurIPS | Other | PyTorch(Author) |
| MetaPruning: Meta Learning for Automatic Neural Network Channel Pruning | ICCV | F | PyTorch(Author) |
| Accelerate CNN via Recursive Bayesian Pruning | ICCV | F | - |
| Adversarial Robustness vs Model Compression, or Both? | ICCV | W | PyTorch(Author) |
| Learning Filter Basis for Convolutional Neural Network Compression | ICCV | Other | - |
| Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration | CVPR (Oral) | F | PyTorch(Author) |
| Towards Optimal Structured CNN Pruning via Generative Adversarial Learning | CVPR | F | PyTorch(Author) |
| Centripetal SGD for Pruning Very Deep Convolutional Networks with Complicated Structure | CVPR | F | PyTorch(Author) |
| On Implicit Filter Level Sparsity in Convolutional Neural Networks (Extension1, Extension2) | CVPR | F | PyTorch(Author) |
| Structured Pruning of Neural Networks with Budget-Aware Regularization | CVPR | F | - |
| Importance Estimation for Neural Network Pruning | CVPR | F | PyTorch(Author) |
| OICSR: Out-In-Channel Sparsity Regularization for Compact Deep Neural Networks | CVPR | F | - |
| Partial Order Pruning: for Best Speed/Accuracy Trade-off in Neural Architecture Search | CVPR | Other | TensorFlow(Author) |
| Variational Convolutional Neural Network Pruning | CVPR | - | - |
| The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks | ICLR (Best) | W | TensorFlow(Author) |
| Rethinking the Value of Network Pruning | ICLR | F | PyTorch(Author) |
| Dynamic Channel Pruning: Feature Boosting and Suppression | ICLR | F | TensorFlow(Author) |
| SNIP: Single-shot Network Pruning based on Connection Sensitivity | ICLR | W | TensorFlow(Author) |
| Dynamic Sparse Graph for Efficient Deep Learning | ICLR | F | CUDA(3rd) |
| Collaborative Channel Pruning for Deep Networks | ICML | F | - |
| Approximated Oracle Filter Pruning for Destructive CNN Width Optimization | ICML | F | - |
| EigenDamage: Structured Pruning in the Kronecker-Factored Eigenbasis | ICML | W | PyTorch(Author) |
| COP: Customized Deep Model Compression via Regularized Correlation-Based Filter-Level Pruning | IJCAI | F | TensorFlow(Author) |
2018

| Title | Venue | Type | Code |
|:------|:-----:|:----:|:----:|
| Rethinking the Smaller-Norm-Less-Informative Assumption in Channel Pruning of Convolution Layers | ICLR | F | TensorFlow(Author), PyTorch(3rd) |
| To prune, or not to prune: exploring the efficacy of pruning for model compression | ICLR | W | - |
| Discrimination-aware Channel Pruning for Deep Neural Networks | NeurIPS | F | TensorFlow(Author) |
| Frequency-Domain Dynamic Pruning for Convolutional Neural Networks | NeurIPS | W | - |
| Learning Sparse Neural Networks via Sensitivity-Driven Regularization | NeurIPS | WF | - |
| AMC: AutoML for Model Compression and Acceleration on Mobile Devices | ECCV | F | TensorFlow(3rd) |
| Data-Driven Sparse Structure Selection for Deep Neural Networks | ECCV | F | MXNet(Author) |
| Coreset-Based Neural Network Compression | ECCV | F | PyTorch(Author) |
| Constraint-Aware Deep Neural Network Compression | ECCV | W | SkimCaffe(Author) |
| A Systematic DNN Weight Pruning Framework using Alternating Direction Method of Multipliers | ECCV | W | Caffe(Author) |
| PackNet: Adding Multiple Tasks to a Single Network by Iterative Pruning | CVPR | F | PyTorch(Author) |
| NISP: Pruning Networks using Neuron Importance Score Propagation | CVPR | F | - |
| CLIP-Q: Deep Network Compression Learning by In-Parallel Pruning-Quantization | CVPR | W | - |
| "Learning-Compression" Algorithms for Neural Net Pruning | CVPR | W | - |
| Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks | IJCAI | F | PyTorch(Author) |
| Accelerating Convolutional Networks via Global & Dynamic Filter Pruning | IJCAI | F | - |
2017

| Title | Venue | Type | Code |
|:------|:-----:|:----:|:----:|
| Pruning Filters for Efficient ConvNets | ICLR | F | PyTorch(3rd) |
| Pruning Convolutional Neural Networks for Resource Efficient Inference | ICLR | F | TensorFlow(3rd) |
| Net-Trim: Convex Pruning of Deep Neural Networks with Performance Guarantee | NeurIPS | W | TensorFlow(Author) |
| Learning to Prune Deep Neural Networks via Layer-wise Optimal Brain Surgeon | NeurIPS | W | PyTorch(Author) |
| Runtime Neural Pruning | NeurIPS | F | - |
| Designing Energy-Efficient Convolutional Neural Networks using Energy-Aware Pruning | CVPR | F | - |
| ThiNet: A Filter Level Pruning Method for Deep Neural Network Compression | ICCV | F | Caffe(Author), PyTorch(3rd) |
| Channel pruning for accelerating very deep neural networks | ICCV | F | Caffe(Author) |
| Learning Efficient Convolutional Networks Through Network Slimming | ICCV | F | PyTorch(Author) |
2016

| Title | Venue | Type | Code |
|:------|:-----:|:----:|:----:|
| Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding | ICLR (Best) | W | Caffe(Author) |
| Dynamic Network Surgery for Efficient DNNs | NeurIPS | W | Caffe(Author) |
2015

| Title | Venue | Type | Code |
|:------|:-----:|:----:|:----:|
| Learning both Weights and Connections for Efficient Neural Networks | NeurIPS | W | PyTorch(3rd) |
Related Repo

- Awesome-model-compression-and-acceleration
- awesome-AutoML-and-Lightweight-Models
Source: https://github.com/he-y/Awesome-Pruning