SMaRT: Stick via Motion and Recognition Tracker
| dc.contributor.author | Simsek, Fatih Emre | |
| dc.contributor.author | Cigla, Cevahir | |
| dc.contributor.author | Kayabol, Koray | |
| dc.date.accessioned | 2025-10-29T11:15:55Z | |
| dc.date.issued | 2025 | |
| dc.department | Faculties, Faculty of Engineering, Department of Electronics Engineering | |
| dc.description.abstract | This paper presents SMaRT (Stick via Motion and Recognition Tracker), a novel multi-object tracking (MOT) approach that integrates motion estimation and re-identification within a unified, efficient framework. Inspired by leading MOT methods such as CenterTrack and FairMOT, SMaRT enhances tracking robustness by fusing re-identification features from an advanced teacher-student model. This integration enables the simultaneous regression of object locations and extraction of re-identification vectors within a single neural network. Evaluations on the DIVOTrack, MOT17, and SOMPT22 datasets demonstrate significant improvements over previous state-of-the-art methods in terms of Higher Order Tracking Accuracy (HOTA), Multi-Object Tracking Accuracy (MOTA), and Association Accuracy (AssA). Additionally, SMaRT's efficiency and accuracy are validated through comprehensive synthetic video experiments, highlighting its adaptability to varied motion patterns and occlusions. The proposed approach offers a robust, accurate, and efficient solution for real-world applications such as surveillance, autonomous driving, and robotics. The tracker is available at: github.com/sompt22/SMaRT. | |
| dc.description.sponsorship | Aselsan Inc. | |
| dc.description.sponsorship | This work was supported in part by Aselsan Inc. | |
| dc.identifier.doi | 10.1109/ACCESS.2025.3569732 | |
| dc.identifier.endpage | 85744 | |
| dc.identifier.issn | 2169-3536 | |
| dc.identifier.scopus | 2-s2.0-105005326925 | |
| dc.identifier.scopusquality | Q1 | |
| dc.identifier.startpage | 85728 | |
| dc.identifier.uri | https://doi.org/10.1109/ACCESS.2025.3569732 | |
| dc.identifier.uri | https://hdl.handle.net/20.500.14854/7339 | |
| dc.identifier.volume | 13 | |
| dc.identifier.wos | WOS:001492121500015 | |
| dc.identifier.wosquality | Q2 | |
| dc.indekslendigikaynak | Web of Science | |
| dc.indekslendigikaynak | Scopus | |
| dc.language.iso | en | |
| dc.publisher | IEEE-Inst Electrical Electronics Engineers Inc | |
| dc.relation.ispartof | IEEE Access | |
| dc.relation.publicationcategory | Article - International Refereed Journal - Institutional Faculty Member | |
| dc.rights | info:eu-repo/semantics/openAccess | |
| dc.snmz | KA_WOS_20251020 | |
| dc.subject | Tracking | |
| dc.subject | Accuracy | |
| dc.subject | Vectors | |
| dc.subject | Feature extraction | |
| dc.subject | Computational modeling | |
| dc.subject | Object tracking | |
| dc.subject | Multitasking | |
| dc.subject | Robustness | |
| dc.subject | Neural networks | |
| dc.subject | Estimation | |
| dc.subject | Knowledge distillation | |
| dc.subject | multiple object tracking | |
| dc.subject | multi-task learning | |
| dc.subject | pedestrian detection | |
| dc.subject | video surveillance | |
| dc.title | SMaRT: Stick via Motion and Recognition Tracker | |
| dc.type | Article | |