Lightweight Detection of Dangerous Driving Features via Knowledge Distillation

Authors

  • Yiming Yang

DOI:

https://doi.org/10.61173/j9ee3k30

Keywords:

Dangerous driving detection, Knowledge distillation, Lightweight model, EfficientNet-V2, MobileNet-V3

Abstract

To address the trade-off between accuracy and efficiency in dangerous driving detection models, this study proposes a lightweight feature learning framework based on knowledge distillation. A collaborative teacher-student architecture is constructed, with EfficientNet-V2 as the knowledge source (teacher) and MobileNet-V3 as the lightweight carrier (student), combined with an attention feature transfer strategy to achieve efficient extraction of key driving-behavior features. Experiments on the Kaggle Driver Inattention Detection Dataset verify that the method reduces computational cost while improving inference speed and accuracy. The results can provide a low-latency, high-robustness behavior monitoring solution for in-vehicle embedded systems.
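The abstract describes two components: a soft-target distillation loss between the EfficientNet-V2 teacher and the MobileNet-V3 student, and an attention feature transfer term. The paper's exact loss formulation and hyperparameters are not given here, so the following is a minimal NumPy sketch of the standard versions of these two losses (Hinton-style soft-target distillation and Zagoruyko-Komodakis-style attention transfer); the temperature `T` and weight `alpha` are illustrative assumptions, not values from the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stabilized).
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Soft-target distillation: alpha * T^2 * KL(teacher || student)
    plus (1 - alpha) * cross-entropy with the hard labels.
    T and alpha are assumed hyperparameters for illustration."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    hard = -np.log(softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * hard))

def attention_transfer_loss(f_s, f_t):
    """Attention transfer: L2 distance between normalized spatial attention
    maps of student and teacher feature tensors of shape (batch, C, H, W).
    Each attention map is the channel-wise sum of squared activations."""
    def att(f):
        a = (f ** 2).sum(axis=1).reshape(f.shape[0], -1)
        return a / (np.linalg.norm(a, axis=1, keepdims=True) + 1e-12)
    return float(np.mean((att(f_s) - att(f_t)) ** 2))
```

In a full training loop, the student would minimize the sum of these two terms (with a weighting coefficient on the attention term), which is what lets the small MobileNet-V3 inherit both the teacher's output distribution and its spatial focus on behavior-relevant image regions.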

Published

2025-08-26

Section

Articles