In this work, we study this question in the context of the "long tail" phenomenon observed by Feldman et al. in computer vision datasets. …
We study the question: are mismatches between the full and compressed models correlated with the memorized training data?
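As an illustration of how such a correlation can be measured, the sketch below flags training examples where the full and compressed models disagree and correlates that indicator with per-example memorization scores. The names `full_model`, `compressed_model`, `train_loader`, and `mem_scores` are placeholders, not identifiers from the source.

```python
# Sketch: correlate full/compressed prediction mismatches with memorization scores.
import numpy as np
import torch
from scipy.stats import pointbiserialr

@torch.no_grad()
def prediction_mismatch(full_model, compressed_model, loader, device="cpu"):
    """Return a 0/1 array marking examples where the two models disagree."""
    full_model.eval()
    compressed_model.eval()
    mismatches = []
    for x, _ in loader:
        x = x.to(device)
        pred_full = full_model(x).argmax(dim=1)
        pred_comp = compressed_model(x).argmax(dim=1)
        mismatches.append((pred_full != pred_comp).cpu().numpy())
    return np.concatenate(mismatches).astype(int)

# mismatch = prediction_mismatch(full_model, compressed_model, train_loader)
# r, p = pointbiserialr(mismatch, mem_scores)  # point-biserial correlation
```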
Model compression, such as pruning and quantization, has been widely applied to optimize neural networks on resource-limited classical devices.
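For concreteness, here is a minimal sketch of these two compression techniques using standard PyTorch utilities; the source does not specify which implementation it uses, and the toy model below is illustrative only.

```python
# Sketch: magnitude pruning and post-training dynamic quantization in PyTorch.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# Unstructured magnitude pruning: zero out the 50% smallest weights per layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")  # make the pruning permanent

# Dynamic quantization: Linear layers run with int8 weights at inference time.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)
```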
For a more detailed comparison of the long-tail theory with standard approaches to understanding generalization, and with work on interpolating methods, we …
First, natural image and data distributions are (informally) known to be long-tailed, that is, they have a significant fraction of rare and atypical …
For the l-th layer, the accuracy model is obtained by changing r_l while keeping the rest of the network unchanged. Empirical models are developed for each …
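A minimal sketch of this per-layer sweep is given below, assuming hypothetical `compress_layer` and `evaluate` helpers; the form of the empirical accuracy model fitted to the resulting points is not specified here.

```python
# Sketch: vary a compression parameter r_l for one layer at a time and record accuracy.
import copy

def layer_accuracy_curve(model, layer_name, ratios, compress_layer, evaluate):
    """Return [(r_l, accuracy)] pairs for one layer; all other layers untouched."""
    curve = []
    for r in ratios:
        trial = copy.deepcopy(model)
        compress_layer(trial, layer_name, r)   # compress only this layer
        curve.append((r, evaluate(trial)))     # one empirical data point
    return curve

# curves = {name: layer_accuracy_curve(model, name, [0.1 * i for i in range(1, 10)],
#                                      compress_layer, evaluate)
#           for name, _ in model.named_modules()}
```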
We can see that restoring weights in the first layers has a much greater impact on accuracy than restoring them in the final classifier layers, as would be expected. We …
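One way such a restoration experiment could be run is sketched below: the original weights are copied back into the compressed model one parameter tensor at a time and the resulting accuracy is measured. The `evaluate` helper is a placeholder, and this is not necessarily the source's exact procedure.

```python
# Sketch: restore each layer's original weights individually and score the result.
import copy
import torch

@torch.no_grad()
def restore_and_score(full_model, compressed_model, evaluate):
    """Map each parameter name to accuracy after restoring only that tensor."""
    full_state = full_model.state_dict()
    results = {}
    for name in full_state:
        trial = copy.deepcopy(compressed_model)
        merged = trial.state_dict()
        merged[name] = full_state[name].clone()  # put back the original tensor
        trial.load_state_dict(merged)
        results[name] = evaluate(trial)
    return results
```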
These facts again verify the negative effect of long-tailed data, especially for shallow networks with small training sets.
The change in the distance from the training data to the decision boundary, measured along a random direction, captures changes in robustness. …
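A rough sketch of how such a distance could be measured is given below, assuming a simple line search along a fixed random unit direction; the step sizes and function names are illustrative, not the source's exact procedure.

```python
# Sketch: distance from a training point to the decision boundary along one direction.
import torch

@torch.no_grad()
def boundary_distance(model, x, direction, max_dist=10.0, steps=200):
    """Smallest t with argmax f(x + t*d) != argmax f(x), or max_dist if none found."""
    model.eval()
    d = direction / direction.norm()
    base_label = model(x.unsqueeze(0)).argmax(dim=1).item()
    for t in torch.linspace(0.0, max_dist, steps):
        label = model((x + t * d).unsqueeze(0)).argmax(dim=1).item()
        if label != base_label:
            return float(t)
    return max_dist

# d = torch.randn_like(x)  # one random direction, reused for both models
# change = boundary_distance(compressed_model, x, d) - boundary_distance(full_model, x, d)
```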