HyperAI



LumiNet: The Bright Side of Perceptual Knowledge Distillation

Md. Ismail Hossain, M M Lutfe Elahi, Sameera Ramasinghe, Ali Cheraghian, Fuad Rahman, Nabeel Mohammed, Shafin Rahman


Abstract

In the knowledge distillation literature, feature-based methods have dominated due to their ability to effectively tap into extensive teacher models. In contrast, logit-based approaches, which aim to distill 'dark knowledge' from teachers, typically exhibit inferior performance compared to feature-based methods. To bridge this gap, we present LumiNet, a novel knowledge distillation algorithm designed to enhance logit-based distillation. We introduce the concept of 'perception', which calibrates logits based on the model's representation capability. This concept addresses overconfidence issues in logit-based distillation methods while also introducing a novel way to distill knowledge from the teacher: it reconstructs the logits of a sample by considering its relationships with other samples in the batch. LumiNet excels on benchmarks such as CIFAR-100, ImageNet, and MSCOCO, outperforming leading feature-based methods; e.g., compared to KD with ResNet18 and MobileNetV2 on ImageNet, it shows improvements of 1.5% and 2.05%, respectively.
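The abstract describes calibrating each sample's logits against the other samples in the batch before distilling. The exact formulation is in the paper and the linked repository; below is only a minimal PyTorch sketch of the general idea, assuming the calibration standardizes each class's logit across the batch (the function names `perception` and `luminet_style_kd_loss`, and the temperature value, are illustrative, not the authors' definitions):

```python
import torch
import torch.nn.functional as F

def perception(logits, eps=1e-6):
    # Hypothetical batch-level recalibration: express each instance's
    # logit for class c relative to the batch's mean and spread for
    # class c, so the distilled signal reflects cross-sample relations.
    mean = logits.mean(dim=0, keepdim=True)
    std = logits.std(dim=0, keepdim=True)
    return (logits - mean) / (std + eps)

def luminet_style_kd_loss(student_logits, teacher_logits, T=4.0):
    # Standard softened KL distillation loss, but computed on the
    # batch-calibrated logits rather than the raw logits.
    p_teacher = F.softmax(perception(teacher_logits) / T, dim=1)
    log_p_student = F.log_softmax(perception(student_logits) / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T
```

In this sketch the standardization caps the influence of any single overconfident logit, which is one plausible reading of how 'perception' mitigates overconfidence; consult the official repository for the actual method.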

Code Repositories

ismail31416/luminet (official, PyTorch)

Benchmarks

Benchmark: classification-on-cifar-100
Methodology: ResNet8×4
Metrics: Accuracy: 77.50

