Benjin ZHU

Computer Vision, Robotics. Here is my CV.

SHB 304, CUHK, Hong Kong SAR, China

Currently, I'm a 3rd-year Ph.D. candidate at MMLab @ The Chinese University of Hong Kong, supervised by Prof. Hongsheng Li and Prof. Xiaogang Wang. I received my B.Eng. degree from the Software Engineering College (Excellent Engineer Class) of the South China University of Technology (a "Project 985" university in China) in July 2018.

My research interests lie mainly in computer vision and robotics, especially 3D scene understanding, object detection, and self-supervised learning. At present, my main focus is building a general and efficient framework for 3D object detection and tracking. To facilitate research, I have built and maintain several codebases: Det3D, a general 3D object detection codebase, and cvpods, a versatile and efficient codebase for many computer vision tasks.


Jul 28, 2023 One paper is accepted by ICCV 2023.
Mar 21, 2023 Released EFG, an Efficient, Flexible, and General deep learning framework for research that stays minimal. :sparkles:
Feb 28, 2023 ConQueR is accepted by CVPR 2023, and selected as a Highlight (Top 2.5%). :sparkles:
Sep 4, 2022 MPPNet ranks 1st on WOD 3D Object Detection, and is accepted by ECCV 2022.
Dec 8, 2020 A collection of self-supervised methods has been released at SelfSup.
Oct 8, 2020 EqCo is available on arXiv. Code is released.

selected publications

  1. CVPR
    ConQueR: Query Contrast Voxel-DETR for 3D Object Detection
    Benjin Zhu, Zhe Wang, Shaoshuai Shi, Hang Xu, Lanqing Hong, and Hongsheng Li
    Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2023
  2. arXiv
    EqCo: Equivalent Rules for Self-supervised Contrastive Learning
    Benjin Zhu, Junqiang Huang, Zeming Li, Xiangyu Zhang, and Jian Sun
arXiv preprint arXiv:2010.01929, 2020
  3. arXiv
    AutoAssign: Differentiable Label Assignment for Dense Object Detection
    Benjin Zhu, Jianfeng Wang, Zhengkai Jiang, Fuhang Zong, Songtao Liu, Zeming Li, and Jian Sun
arXiv preprint arXiv:2007.03496, 2020