Kewei Hu

M.S. Student

Institute of Cyber-Systems and Control, Zhejiang University, China

Address

Room 105, Institute of Cyber-Systems and Control, Yuquan Campus, Zhejiang University, Hangzhou, Zhejiang, China

Contact Information

Email: 21932133@zju.edu.cn

Biography

I am pursuing an M.S. degree at Zhejiang University, Hangzhou, China. My main research interests are sensor fusion and SLAM.

Research and Interests

  • Sensor Fusion

Publications

  • Han Li, Yukai Ma, Yaqing Gu, Kewei Hu, Yong Liu, and Xingxing Zuo. RadarCam-Depth: Radar-Camera Fusion for Depth Estimation with Learned Metric Scale. In 2024 IEEE International Conference on Robotics and Automation (ICRA), pages 10665-10672, 2024.
    [BibTeX] [Abstract] [DOI] [PDF]
    We present a novel approach for metric dense depth estimation based on the fusion of a single-view image and a sparse, noisy Radar point cloud. The direct fusion of heterogeneous Radar and image data, or their encodings, tends to yield dense depth maps with significant artifacts, blurred boundaries, and suboptimal accuracy. To circumvent this issue, we learn to augment versatile and robust monocular depth prediction with the dense metric scale induced from sparse and noisy Radar data. We propose a Radar-Camera framework for highly accurate and fine-detailed dense depth estimation with four stages, including monocular depth prediction, global scale alignment of monocular depth with sparse Radar points, quasi-dense scale estimation through learning the association between Radar points and image patches, and local scale refinement of dense depth using a scale map learner. Our proposed method significantly outperforms the state-of-the-art Radar-Camera depth estimation methods by reducing the mean absolute error (MAE) of depth estimation by 25.6% and 40.2% on the challenging nuScenes dataset and our self-collected ZJU-4DRadarCam dataset, respectively. Our code and dataset will be released at https://github.com/MMOCKING/RadarCam-Depth.
    @inproceedings{li2024rcd,
    title = {RadarCam-Depth: Radar-Camera Fusion for Depth Estimation with Learned Metric Scale},
    author = {Han Li and Yukai Ma and Yaqing Gu and Kewei Hu and Yong Liu and Xingxing Zuo},
    year = 2024,
    booktitle = {2024 IEEE International Conference on Robotics and Automation (ICRA)},
    pages = {10665-10672},
    doi = {10.1109/ICRA57147.2024.10610929},
    abstract = {We present a novel approach for metric dense depth estimation based on the fusion of a single-view image and a sparse, noisy Radar point cloud. The direct fusion of heterogeneous Radar and image data, or their encodings, tends to yield dense depth maps with significant artifacts, blurred boundaries, and suboptimal accuracy. To circumvent this issue, we learn to augment versatile and robust monocular depth prediction with the dense metric scale induced from sparse and noisy Radar data. We propose a Radar-Camera framework for highly accurate and fine-detailed dense depth estimation with four stages, including monocular depth prediction, global scale alignment of monocular depth with sparse Radar points, quasi-dense scale estimation through learning the association between Radar points and image patches, and local scale refinement of dense depth using a scale map learner. Our proposed method significantly outperforms the state-of-the-art Radar-Camera depth estimation methods by reducing the mean absolute error (MAE) of depth estimation by 25.6% and 40.2% on the challenging nuScenes dataset and our self-collected ZJU-4DRadarCam dataset, respectively. Our code and dataset will be released at https://github.com/MMOCKING/RadarCam-Depth.}
    }
  • Jiajun Lv, Xingxing Zuo, Kewei Hu, Jinhong Xu, Guoquan Huang, and Yong Liu. Observability-Aware Intrinsic and Extrinsic Calibration of LiDAR-IMU System. IEEE Transactions on Robotics, 38(6):3734-3753, 2022.
    [BibTeX] [Abstract] [DOI] [PDF]
    Accurate and reliable sensor calibration is essential to fuse LiDAR and inertial measurements, which are usually available in robotic applications. In this article, we propose a novel LiDAR-IMU calibration method within the continuous-time batch-optimization framework, where the intrinsics of both sensors and the spatial-temporal extrinsics between sensors are calibrated without using calibration infrastructure, such as fiducial tags. Compared to discrete-time approaches, the continuous-time formulation has natural advantages for fusing high-rate measurements from LiDAR and IMU sensors. To improve efficiency and address degenerate motions, the following two observability-aware modules are leveraged: first, the information-theoretic data selection policy selects only the most informative segments for calibration during data collection, which significantly improves the calibration efficiency by processing only the selected informative segments. Second, the observability-aware state update mechanism in nonlinear least-squares optimization updates only the identifiable directions in the state space with truncated singular value decomposition, which enables accurate calibration results even under degenerate cases where informative data segments are not available. The proposed LiDAR-IMU calibration approach has been validated extensively in both simulated and real-world experiments with different robot platforms, demonstrating its high accuracy and repeatability in commonly seen human-made environments.
    @article{lv2022oai,
    title = {Observability-Aware Intrinsic and Extrinsic Calibration of LiDAR-IMU System},
    author = {Jiajun Lv and Xingxing Zuo and Kewei Hu and Jinhong Xu and Guoquan Huang and Yong Liu},
    year = 2022,
    journal = {IEEE Transactions on Robotics},
    volume = {38},
    number = {6},
    pages = {3734-3753},
    doi = {10.1109/TRO.2022.3174476},
    abstract = {Accurate and reliable sensor calibration is essential to fuse LiDAR and inertial measurements, which are usually available in robotic applications. In this article, we propose a novel LiDAR-IMU calibration method within the continuous-time batch-optimization framework, where the intrinsics of both sensors and the spatial-temporal extrinsics between sensors are calibrated without using calibration infrastructure, such as fiducial tags. Compared to discrete-time approaches, the continuous-time formulation has natural advantages for fusing high-rate measurements from LiDAR and IMU sensors. To improve efficiency and address degenerate motions, the following two observability-aware modules are leveraged: first, the information-theoretic data selection policy selects only the most informative segments for calibration during data collection, which significantly improves the calibration efficiency by processing only the selected informative segments. Second, the observability-aware state update mechanism in nonlinear least-squares optimization updates only the identifiable directions in the state space with truncated singular value decomposition, which enables accurate calibration results even under degenerate cases where informative data segments are not available. The proposed LiDAR-IMU calibration approach has been validated extensively in both simulated and real-world experiments with different robot platforms, demonstrating its high accuracy and repeatability in commonly seen human-made environments.}
    }
  • Jiajun Lv, Kewei Hu, Jinhong Xu, Yong Liu, and Xingxing Zuo. CLINS: Continuous-Time Trajectory Estimation for LiDAR Inertial System. In 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems, pages 6657-6663, 2021.
    [BibTeX] [Abstract] [DOI] [PDF]
    In this paper, we propose a highly accurate continuous-time trajectory estimation framework dedicated to SLAM (Simultaneous Localization and Mapping) applications, which enables fusing high-frequency and asynchronous sensor data effectively. We apply the proposed framework to a 3D LiDAR-inertial system for evaluation. The proposed method adopts a non-rigid registration method for continuous-time trajectory estimation while simultaneously removing the motion distortion in LiDAR scans. Additionally, we propose a two-state continuous-time trajectory correction method to effectively and efficiently tackle the computationally intractable global optimization problem when loop closure happens. We examine the accuracy of the proposed approach on several publicly available datasets and the data we collected. The experimental results indicate that the proposed method outperforms discrete-time methods in terms of accuracy, especially when aggressive motion occurs. Furthermore, we open-source our code at https://github.com/APRIL-ZJU/clins to benefit the research community.
    @inproceedings{lv2021clins,
    title = {CLINS: Continuous-Time Trajectory Estimation for LiDAR Inertial System},
    author = {Jiajun Lv and Kewei Hu and Jinhong Xu and Yong Liu and Xingxing Zuo},
    year = 2021,
    booktitle = {2021 IEEE/RSJ International Conference on Intelligent Robots and Systems},
    pages = {6657-6663},
    doi = {10.1109/IROS51168.2021.9636676},
    abstract = {In this paper, we propose a highly accurate continuous-time trajectory estimation framework dedicated to SLAM (Simultaneous Localization and Mapping) applications, which enables fusing high-frequency and asynchronous sensor data effectively. We apply the proposed framework to a 3D LiDAR-inertial system for evaluation. The proposed method adopts a non-rigid registration method for continuous-time trajectory estimation while simultaneously removing the motion distortion in LiDAR scans. Additionally, we propose a two-state continuous-time trajectory correction method to effectively and efficiently tackle the computationally intractable global optimization problem when loop closure happens. We examine the accuracy of the proposed approach on several publicly available datasets and the data we collected. The experimental results indicate that the proposed method outperforms discrete-time methods in terms of accuracy, especially when aggressive motion occurs. Furthermore, we open-source our code at https://github.com/APRIL-ZJU/clins to benefit the research community.}
    }
  • Jiajun Lv, Jinhong Xu, Kewei Hu, Yong Liu, and Xingxing Zuo. Targetless Calibration of LiDAR-IMU System Based on Continuous-time Batch Estimation. In 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 9968-9975, 2020.
    [BibTeX] [Abstract] [DOI] [arXiv] [PDF]
    Sensor calibration is the fundamental block for a multi-sensor fusion system. This paper presents an accurate and repeatable LiDAR-IMU calibration method (termed LI-Calib), to calibrate the 6-DOF extrinsic transformation between the 3D LiDAR and the Inertial Measurement Unit (IMU). Regarding the high data capture rate for LiDAR and IMU sensors, LI-Calib adopts a continuous-time trajectory formulation based on B-Spline, which is more suitable for fusing high-rate or asynchronous measurements than discrete-time based approaches. Additionally, LI-Calib decomposes the space into cells and identifies the planar segments for data association, which renders the calibration problem well-constrained in usual scenarios without any artificial targets. We validate the proposed calibration approach on both simulated and real-world experiments. The results demonstrate the high accuracy and good repeatability of the proposed method in common human-made scenarios. To benefit the research community, we open-source our code at https://github.com/APRIL-ZJU/lidar_IMU_calib.
    @inproceedings{lv2020targetlessco,
    title = {Targetless Calibration of LiDAR-IMU System Based on Continuous-time Batch Estimation},
    author = {Jiajun Lv and Jinhong Xu and Kewei Hu and Yong Liu and Xingxing Zuo},
    year = 2020,
    booktitle = {2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
    pages = {9968-9975},
    doi = {10.1109/IROS45743.2020.9341405},
    abstract = {Sensor calibration is the fundamental block for a multi-sensor fusion system. This paper presents an accurate and repeatable LiDAR-IMU calibration method (termed LI-Calib), to calibrate the 6-DOF extrinsic transformation between the 3D LiDAR and the Inertial Measurement Unit (IMU). Regarding the high data capture rate for LiDAR and IMU sensors, LI-Calib adopts a continuous-time trajectory formulation based on B-Spline, which is more suitable for fusing high-rate or asynchronous measurements than discrete-time based approaches. Additionally, LI-Calib decomposes the space into cells and identifies the planar segments for data association, which renders the calibration problem well-constrained in usual scenarios without any artificial targets. We validate the proposed calibration approach on both simulated and real-world experiments. The results demonstrate the high accuracy and good repeatability of the proposed method in common human-made scenarios. To benefit the research community, we open-source our code at https://github.com/APRIL-ZJU/lidar_IMU_calib.},
    arxiv = {https://arxiv.org/pdf/2007.14759.pdf}
    }
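
Code Sketches

The minimal sketches below illustrate one core idea from each publication above. They are illustrative reconstructions, not the papers' released implementations: all function names, thresholds, and interfaces are assumptions made for exposition.

RadarCam-Depth (ICRA 2024) first aligns a scale-ambiguous monocular depth prediction to metric scale using sparse radar returns. A sketch of that global scale-alignment stage, assuming the radar points have already been projected into the image plane:

    import numpy as np

    def global_scale_alignment(mono_depth, radar_uv, radar_depth):
        """Recover one global metric scale for a monocular depth map
        from sparse radar returns projected into the image.

        mono_depth  : (H, W) relative depth from a monocular network
        radar_uv    : (N, 2) integer pixel coordinates of radar points
        radar_depth : (N,)   metric ranges measured by the radar

        The paper goes on to learn quasi-dense and local per-pixel
        scales; only the global stage is sketched here.
        """
        u, v = radar_uv[:, 0], radar_uv[:, 1]
        mono_at_radar = mono_depth[v, u]
        # Keep correspondences with positive depth on both sides.
        valid = (mono_at_radar > 0) & (radar_depth > 0)
        # The median of per-point ratios is robust to noisy radar outliers.
        scale = np.median(radar_depth[valid] / mono_at_radar[valid])
        return scale * mono_depth, scale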
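
The T-RO 2022 calibration paper updates only the identifiable directions of the state space with a truncated singular value decomposition, so degenerate motions do not corrupt the estimate. A sketch of one such Gauss-Newton step; the function name and the truncation ratio are assumptions:

    import numpy as np

    def observability_aware_step(J, r, sv_ratio=1e-6):
        """One Gauss-Newton step restricted to identifiable directions.

        J : (m, n) stacked residual Jacobian; r : (m,) residual vector.
        Directions whose singular values fall below sv_ratio * s_max
        are treated as unobservable (e.g. under degenerate motion) and
        receive no update.
        """
        U, s, Vt = np.linalg.svd(J, full_matrices=False)
        # Invert only well-conditioned singular values; truncate the rest.
        s_inv = np.where(s > sv_ratio * s.max(), 1.0 / s, 0.0)
        # Truncated pseudo-inverse step: dx = -V diag(s_inv) U^T r.
        return -(Vt.T * s_inv) @ (U.T @ r)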
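
CLINS (IROS 2021) represents the trajectory as a continuous-time B-spline, so every LiDAR point and IMU sample can be evaluated at its own timestamp. A sketch of uniform cubic B-spline interpolation for the translation component (the paper uses a cumulative spline on SO(3) for rotation); the uniform knot spacing starting at t = 0 is an assumption:

    import numpy as np

    # Blending matrix of a uniform cubic B-spline in matrix form.
    B = np.array([[ 1,  4,  1, 0],
                  [-3,  0,  3, 0],
                  [ 3, -6,  3, 0],
                  [-1,  3, -3, 1]]) / 6.0

    def spline_position(ctrl_pts, t, dt):
        """Evaluate the translation trajectory at an arbitrary time t.

        ctrl_pts : (K, 3) control points spaced dt seconds apart, first
        knot at t = 0; queries must satisfy 0 <= t < (K - 3) * dt. The
        cubic spline is C2-continuous, which is what lets asynchronous
        high-rate measurements be fused at their exact timestamps.
        """
        i = int(t / dt)                     # index of the active segment
        u = t / dt - i                      # normalized time in [0, 1)
        U = np.array([1.0, u, u**2, u**3])  # monomial basis
        return U @ B @ ctrl_pts[i:i + 4]    # blend 4 control points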
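
LI-Calib (IROS 2020) decomposes space into cells and keeps only planar segments for data association, which constrains the calibration without artificial targets. A sketch of a per-cell plane fit and the point-to-plane residual it feeds into the batch problem; the planarity acceptance ratio is an assumption:

    import numpy as np

    def fit_plane(points):
        """Fit a plane to one cell of LiDAR points via eigen-decomposition.

        points : (N, 3) array. Returns (unit normal n, offset d,
        is_planar) with the plane satisfying n . x + d = 0. A cell is
        accepted as a planar segment only if its smallest eigenvalue is
        much smaller than the next one.
        """
        centroid = points.mean(axis=0)
        cov = np.cov(points.T)                  # np.cov centers internally
        eigvals, eigvecs = np.linalg.eigh(cov)  # ascending eigenvalues
        normal = eigvecs[:, 0]                  # smallest-variance direction
        d = -normal @ centroid
        is_planar = eigvals[0] < 0.01 * eigvals[1]
        return normal, d, is_planar

    def point_to_plane_residual(point, normal, d):
        """Signed distance used as a LiDAR residual in the optimization."""
        return normal @ point + d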