
Radar Imaging and Data Fusion Techniques for Integrated Sensing and Communication (ISAC) Systems

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Radar Sensors".

Deadline for manuscript submissions: 10 July 2025 | Viewed by 210

Special Issue Editor


Dr. Zenghui Zhang
Guest Editor
School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
Interests: radar signal processing; synthetic aperture radar (SAR) image interpretation; target detection and recognition; joint radar–communication systems

Special Issue Information

Dear Colleagues,

Integrated Sensing and Communication (ISAC) systems can utilize hardware and frequency spectrum resources more efficiently, leading to cost savings, improved information sharing, and enhanced sensing and communication performance. ISAC is becoming a promising technology in both the radar and communication fields, with many potential applications in autonomous vehicles, 6G networks, healthcare, and smart cities. Among the various technical aspects of ISAC systems, radar imaging techniques are crucial as they provide high-resolution 2D/3D images, enhancing environmental awareness. The fusion of radar images with other source data can further improve the ISAC system’s overall capability, reliability, and efficiency. Addressing radar imaging and data fusion techniques for ISAC systems, this Special Issue focuses on the following aspects:

  1. New ISAC system architecture centered on radar imaging;
  2. Waveform design and signal processing for ISAC systems;
  3. 2D/3D/4D mmWave radar imaging methods;
  4. MIMO mmWave radar signal processing;
  5. Data fusion methods for ISAC systems.

Dr. Zenghui Zhang
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form to submit your manuscript. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • radar imaging
  • Integrated Sensing and Communication (ISAC) systems
  • 2D/3D/4D mmWave radar imaging methods
  • MIMO mmWave radar
  • data fusion

Published Papers (1 paper)


Research

18 pages, 20456 KiB  
Article
RCRFNet: Enhancing Object Detection with Self-Supervised Radar–Camera Fusion and Open-Set Recognition
by Minwei Chen, Yajun Liu, Zenghui Zhang and Weiwei Guo
Sensors 2024, 24(15), 4803; https://doi.org/10.3390/s24154803 - 24 Jul 2024
Viewed by 82
Abstract
Robust object detection in complex environments, poor visual conditions, and open scenarios presents significant technical challenges in autonomous driving. These challenges necessitate the development of advanced fusion methods for millimeter-wave (mmWave) radar point cloud data and visual images. To address these issues, this paper proposes a radar–camera robust fusion network (RCRFNet), which leverages self-supervised learning and open-set recognition to effectively utilise the complementary information from both sensors. Specifically, the network uses matched radar–camera data through a frustum association approach to generate self-supervised signals, enhancing network training. The integration of global and local depth consistencies between radar point clouds and visual images, along with image features, helps construct object class confidence levels for detecting unknown targets. Additionally, these techniques are combined with a multi-layer feature extraction backbone and a multimodal feature detection head to achieve robust object detection. Experiments on the nuScenes public dataset demonstrate that RCRFNet outperforms state-of-the-art (SOTA) methods, particularly in conditions of low visual visibility and when detecting unknown class objects.
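The frustum association mentioned in the abstract can be illustrated with a minimal sketch: radar points (here assumed to already be in the camera coordinate frame) are projected through a pinhole camera model, and each point is matched to any 2D detection box whose viewing frustum contains it. This is a generic illustration of the idea, not RCRFNet's implementation; the function names, the intrinsic matrix `K`, and the toy data are all hypothetical.

```python
import numpy as np

def project_points(points_xyz, K):
    """Project 3D points (N, 3) in camera coordinates to pixel coordinates (N, 2)."""
    uv = (K @ points_xyz.T).T        # homogeneous image coordinates (N, 3)
    return uv[:, :2] / uv[:, 2:3]    # perspective divide by depth

def frustum_associate(radar_xyz, boxes, K):
    """Map each 2D box index to the radar point indices falling inside its frustum.

    A point lies in a box's frustum when it is in front of the camera (z > 0)
    and its projection lands inside the box (x1, y1, x2, y2).
    """
    in_front = radar_xyz[:, 2] > 0
    uv = project_points(radar_xyz, K)
    assoc = {}
    for b, (x1, y1, x2, y2) in enumerate(boxes):
        hits = np.where(in_front
                        & (uv[:, 0] >= x1) & (uv[:, 0] <= x2)
                        & (uv[:, 1] >= y1) & (uv[:, 1] <= y2))[0]
        assoc[b] = hits.tolist()
    return assoc

# Toy example: pinhole camera, two detection boxes, three radar points.
K = np.array([[500.,   0., 320.],
              [  0., 500., 240.],
              [  0.,   0.,   1.]])
radar = np.array([[0.0, 0.0, 10.0],    # projects to the image centre (320, 240)
                  [2.0, 0.0, 10.0],    # projects right of centre (420, 240)
                  [0.0, 0.0, -5.0]])   # behind the camera, ignored
boxes = [(300, 220, 340, 260), (400, 220, 440, 260)]
print(frustum_associate(radar, boxes, K))  # → {0: [0], 1: [1]}
```

In a full pipeline these point-to-box matches would supply the matched radar–camera pairs used as self-supervised training signals; in practice the radar points must first be transformed from the radar frame to the camera frame with the sensors' extrinsic calibration.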