Xu Qing
Senior Research Engineer, Institute for Infocomm Research (I2R) - A*STAR
Verified email at i2r.a-star.edu.sg
Title · Cited by · Year
KDnet-RUL: A knowledge distillation framework to compress deep neural networks for machine remaining useful life prediction
Q Xu, Z Chen, K Wu, C Wang, M Wu, X Li
IEEE Transactions on Industrial Electronics 69 (2), 2022-2032, 2021
Cited by 59 · 2021
Contrastive adversarial knowledge distillation for deep model compression in time-series regression tasks
Q Xu, Z Chen, M Ragab, C Wang, M Wu, X Li
Neurocomputing 485, 242-251, 2022
Cited by 26 · 2022
A hybrid ensemble deep learning approach for early prediction of battery remaining useful life
Q Xu, M Wu, E Khoo, Z Chen, X Li
IEEE/CAA Journal of Automatica Sinica 10 (1), 177-187, 2023
Cited by 15 · 2023
Automatic detection of retinopathy with optical coherence tomography images via a semi-supervised deep learning method
Y Luo, Q Xu, R Jin, M Wu, L Liu
Biomedical Optics Express 12 (5), 2684-2702, 2021
Cited by 11 · 2021
Cross‐domain retinopathy classification with optical coherence tomography images via a novel deep domain adaptation method
Y Luo, Q Xu, Y Hou, L Liu, M Wu
Journal of Biophotonics 14 (8), e202100096, 2021
Cited by 3 · 2021
Reinforced Knowledge Distillation for Time Series Regression
Q Xu, K Wu, M Wu, K Mao, X Li, Z Chen
IEEE Transactions on Artificial Intelligence, 2023
Cited by 1 · 2023
Contrastive Distillation with Regularized Knowledge for Deep Model Compression on Sensor-based Human Activity Recognition
Q Xu, M Wu, X Li, K Mao, Z Chen
IEEE Transactions on Industrial Cyber-Physical Systems, 2023
Cited by 1 · 2023
Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data
Q Xu, M Wu, X Li, K Mao, Z Chen
arXiv preprint arXiv:2307.03347, 2023
Cited by 1 · 2023
Improve Knowledge Distillation via Label Revision and Data Selection
W Lan, Y Cheung, Q Xu, B Liu, Z Hu, M Li, Z Chen
arXiv preprint arXiv:2404.03693, 2024
2024