Technical Note

Instantaneous Material Classification Using a Polarization-Diverse RMCW LIDAR

Baraja Pty Ltd., Suite 303, Building 1, 3 Richardson Pl., North Ryde, NSW 2113, Australia
* Author to whom correspondence should be addressed.
Sensors 2024, 24(17), 5761; https://doi.org/10.3390/s24175761
Submission received: 1 August 2024 / Revised: 23 August 2024 / Accepted: 31 August 2024 / Published: 4 September 2024
(This article belongs to the Section Radar Sensors)

Abstract

Light detection and ranging (LIDAR) sensors using a polarization-diverse receiver are able to capture polarimetric information about the target under measurement. We demonstrate this capability using a silicon photonic receiver architecture that operates on a shot-by-shot basis, making polarization analysis available nearly instantaneously in the point cloud, and we then use these data to train a material classification neural network. Using this classifier, we show an accuracy of 85.4% for classifying plastic, wood, concrete, and coated aluminum.

1. Introduction

Light detection and ranging (LIDAR) is a critical sensor for autonomous vehicles, as it provides a dense pointcloud with exceptional angular resolution, enabling mapping as well as detection and classification of moving objects in the environment [1]. Each point in the pointcloud is a detection event: the LIDAR sensor emits energy, receives some portion of the reflected energy, and uses the time delay between these two events to calculate an accurate estimate of distance.
Next-generation LIDAR will use homodyne or coherent detection in the receiver hardware; this approach has several advantages over direct detection systems, such as the ability to instantaneously measure the Doppler velocity of moving targets [1]. This is possible because a homodyne detection system measures both the amplitude and phase of the reflected light, which provides the LIDAR sensor with additional information about the objects in the environment. In contrast, direct detection systems are sensitive only to the intensity of the received signal, which is sufficient for ranging but cannot measure the Doppler shift from a moving object.
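As a worked example with assumed numbers (not taken from the paper): a target receding at $v = 30\ \mathrm{m/s}$, illuminated at $\lambda = 1550\ \mathrm{nm}$, produces a Doppler shift of

$$ f_D = \frac{2v}{\lambda} = \frac{2 \times 30}{1.55 \times 10^{-6}} \approx 38.7\ \mathrm{MHz}, $$

which appears directly as a frequency offset in the homodyne beat signal but is invisible to an intensity-only receiver.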
Autonomous vehicles use sensors to understand the world around them, and in many applications, understanding the physical properties of the environment can greatly improve their functionality, such as when a sensor can classify a detection by the material type or structure; we call this ability “material classification”. This capability has been demonstrated in several autonomous applications, such as using feedback from force sensors in robotic excavators [2], using robotic arms and optical sensors in recycling plants [3], or capturing infra-red (IR) spectra of biomass on a production line to understand the composition of the feedstock [4]. Other methods of active sensing for material classification have been demonstrated with thermal sensors [5] as well as millimeter-wave vibrometry [6].
Material classification using laser sensors has shown tremendous potential; compared with camera-based methods, which are lighting-dependent and rely on visible color [7], lasers provide a stimulus to the material, and the sensor receiver then records the response. Typically, the reflection from an object is treated as that of an ideal Lambertian surface, which is a diffuse reflector, but real-world objects have complicated behaviour that can be characterized and used to identify materials [8]. Kirchner et al. demonstrated the ability to classify five materials using the depth error over angle and intensity from a commercial laser rangefinder [9]. Similarly, intensity histograms have been used in aerial LIDAR to classify different types of forest, as well as surfaces such as water, gravel, and low vegetation, using a simple decision tree classifier [10].
Several authors have looked at using off-the-shelf time-of-flight (ToF) cameras to exploit depth errors for classifying materials in an image, independent of the material color [7,11]. Tanaka et al. also demonstrated that the accuracy could be greatly improved from 55.0% to 89.9% by sweeping the modulation frequency as well [7].
The use of spectral methods to classify materials is well covered in the literature and has been demonstrated with diverse methods, such as hyperspectral cameras for material identification [12,13], optical absorbance sensors for detecting heavy metals in water [7], and many others. For the purpose of this article, we focus on single-wavelength LIDAR systems, as this reduces the complexity and cost of the system by avoiding an array of lasers or a swept-wavelength source.
Polarization is an additional property of light that describes the orientation of the oscillation of an electromagnetic wave; when light is reflected back from an object, the polarization state may change in a manner that is related to the physical structure of the surface of that object [14]. This insight led to investigations into how to leverage polarization LIDAR to measure the depolarization of returns from different particles. Simply stated, Mie scattering from spherical particles results in the reflected light maintaining the same polarization as the transmit beam; when the particles are non-spherical, some proportion of the reflected light is depolarized [14]. In a specific example, Sassen et al. demonstrated the use of polarization LIDAR to measure the ash size distribution from a volcanic eruption off the coast of Alaska [15]. Alternatively, simply augmenting LIDAR with a passive polarimetric sensor was shown to provide over 90% accuracy in classifying materials, even in low signal-to-noise ratio (SNR) conditions [16]; in that work, the authors demonstrated the large improvement in classification accuracy gained by combining polarization with the LIDAR information.
Using polarization-coded LIDAR, Nunes-Pereira et al. demonstrated that polarization could be effectively used for the classification of common materials observed in operational domains for autonomous vehicles. To understand the effect, they conducted extensive examinations of the polarization-dependent reflectance of materials, then used optical coherence tomography (OCT) to determine the material cross-section of automotive car paints [17]. In order to reconstruct the degree of polarization, the authors used a pulsed ToF LIDAR and placed a linear polarizer in front of the optics. To capture the orthogonal polarization, they rotated the polarizer and repeated the capture, synthesizing a material-coded pointcloud by processing both polarizations.
In this article, we demonstrate a method of classifying materials using a polarization-diverse LIDAR with random modulated continuous wave (RMCW) ranging. This method enables material classification on an instantaneous, shot-by-shot basis, using only the data acquired by the LIDAR sensor during the acquisition time. For the purpose of this article, we demonstrate the technique on a set of materials and describe a specific implementation using an integrated photonic chip to produce the received signals for ranging and for calculating the polarization parameters required by our machine learning model; however, this method can be applied to LIDAR systems for various remote sensing applications. To the best of our knowledge, this is the first demonstration of a classification method using a polarization-diverse RMCW LIDAR system that can perform instantaneous material classification.

2. Theory

2.1. Random Modulated Continuous Wave (RMCW) Ranging

RMCW ranging is a technique that avoids using narrow, high peak power pulses by spreading the same energy into a low peak power series of pulses, coded by a pseudorandom sequence; it was first described by Takeuchi et al. in 1983 [18]. When the received signal is digitized, it is simply correlated with the reference sequence, resulting in a correlation peak corresponding to the delay of the signal, which can be used to calculate the distance to the target.
The polarization-diverse homodyne receiver is a much more complex system than the direct-detection scheme shown by Takeuchi et al., as we have four differential signals to digitize and combine, corresponding to the X- and Y-polarizations as well as the in-phase (I) and quadrature (Q) components. A thorough discussion of these devices and how polarization is recovered is given by Roudas et al. [19].
Additionally, recovering our RMCW signal in a homodyne receiver is challenging due to the phase fluctuations of the laser source—while this can be ameliorated by using a narrow-linewidth laser, it is useful to have a system that is insensitive to laser linewidth, as this increases the types of lasers available for RMCW ranging. We provide a detailed discussion of detecting homodyne RMCW LIDAR signals in [20].
As an example, we show a numerically generated RMCW time-domain signal converted to the correlation domain; we generated a Barker-13 code, delayed it by 0.5 μs in an acquisition window of 2.0 μs, and added white Gaussian noise (AWGN) to the received signal, as shown in Figure 1a. After correlating with the ideal reference Barker-13 sequence, we obtain the correlation signal in Figure 1b, where the correlation peak corresponds to the time delay of the return signal, $T_d$. The distance to the target is then simply
$$ d_T = \frac{T_d \cdot c}{2}, \qquad (1) $$
where $c$ is the speed of light. The SNR is calculated as

$$ \mathrm{SNR}\,(\mathrm{dB}) = 10 \log_{10} \frac{X_P}{3\sigma}, \qquad (2) $$
where $X_P$ is the correlation peak height and $\sigma$ is the standard deviation of the noise fluctuations in the correlation domain. As shown in Figure 1b, the correlation signal has considerable structure outside of the main peak; this is due to the length of the Barker sequence relative to the overall acquisition time. Thus, when calculating the noise deviation $\sigma$, it is important to exclude any samples that contain residual correlation energy; for example, we use the last 200 samples to calculate $\sigma$.
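The following NumPy sketch reproduces this numerical example; the sample rate, chip duration, and noise level are illustrative assumptions rather than values from the paper.

```python
# Minimal sketch of the Figure 1 example: a Barker-13 code delayed by 0.5 us
# in a 2.0 us window, corrupted by AWGN, then correlated with the ideal
# reference and scored using Equations (1) and (2).
import numpy as np

fs = 500e6                   # sample rate (Hz); assumed to match the 500 MSps ADCs
window = int(2.0e-6 * fs)    # 2.0 us acquisition window -> 1000 samples
delay = int(0.5e-6 * fs)     # 0.5 us delay -> 250 samples

barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)
code = np.repeat(barker13, 10)               # 10 samples per chip (illustrative)

rx = np.zeros(window)
rx[delay:delay + len(code)] = code           # delayed return signal
rx += np.random.normal(0.0, 0.5, window)     # additive white Gaussian noise

ref = np.zeros(window)
ref[:len(code)] = code
corr = np.correlate(rx, ref, mode="full")[window - 1:]   # non-negative lags only

peak_idx = int(np.argmax(corr))
T_d = peak_idx / fs                          # delay of the correlation peak
d_T = T_d * 3e8 / 2                          # Equation (1)
sigma = np.std(corr[-200:])                  # last 200 samples, as in the text
snr_db = 10 * np.log10(corr[peak_idx] / (3 * sigma))     # Equation (2)
print(f"delay {T_d * 1e6:.2f} us, distance {d_T:.1f} m, SNR {snr_db:.1f} dB")
```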

2.2. Stokes Parameters

The in-phase and quadrature voltage signals are proportional to the electric field vector of the received optical signal, and we can therefore treat it as a Jones vector $\mathbf{j} = (j_x, j_y)$ denoting the polarization of the transverse electric field. The absolute phase of the received $\mathbf{j}$ is neglected, as we do not have the means to reliably measure a phase difference between transmitted and received light in a way that can isolate the dominant contributions from macroscopic propagation. This is the level of information described by the Stokes parameters [21], a four-element basis for defining polarization states in a way that can be measured from optical intensity alone: $S_0$ is the intensity of the field, and $S_1$, $S_2$, $S_3$ are the differences in intensity of the field projected onto common polarization bases: linear polarizations at 0° and 90°, linear polarizations at ±45°, and left- and right-circular polarizations.
$$ \begin{aligned} S_0 &\equiv \overline{|\mathbf{j}|^2} \\ S_1 &\equiv \overline{|\mathbf{j} \cdot (1, 0)|^2} - \overline{|\mathbf{j} \cdot (0, 1)|^2} \\ S_2 &\equiv \overline{|\mathbf{j} \cdot (1, 1)/\sqrt{2}|^2} - \overline{|\mathbf{j} \cdot (1, -1)/\sqrt{2}|^2} \\ S_3 &\equiv \overline{|\mathbf{j} \cdot (1, i)/\sqrt{2}|^2} - \overline{|\mathbf{j} \cdot (1, -i)/\sqrt{2}|^2} \end{aligned} \qquad (3) $$
Here, the overline indicates averaging over a measurement interval, which permits a fraction of the power to be depolarized and thus not contained in $S_1$, $S_2$, $S_3$:

$$ p \equiv \frac{\sqrt{S_1^2 + S_2^2 + S_3^2}}{S_0} \le 1 \qquad (4) $$
In an RMCW LIDAR system, the polarization variation between samples is negligible at a sufficiently high sampling rate; however, the polarization variation over a codeword represents a temporal depolarization. The overline in (3) indicates averaging over a codeword.
Additionally, we assume that the launch polarization state is constant over the propagation to the target, which is a reasonable assumption given that the range to target was no more than 10 m. If this work is extended to long-range operation, the atmospheric effects on the launch polarization must be considered.
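A minimal sketch of this computation is shown below, forming the Jones vector from the four digitized receiver voltages (XI, XQ, YI, YQ) and averaging over one codeword; the sign convention for $S_3$ follows the dot-product form of Equation (3) and may differ from other texts.

```python
import numpy as np

def stokes_from_iq(xi, xq, yi, yq):
    """Return (S0, S1, S2, S3, p) averaged over one codeword of samples."""
    jx = xi + 1j * xq            # X-polarization field component
    jy = yi + 1j * yq            # Y-polarization field component
    s0 = np.mean(np.abs(jx) ** 2 + np.abs(jy) ** 2)
    s1 = np.mean(np.abs(jx) ** 2 - np.abs(jy) ** 2)
    s2 = np.mean(2 * np.real(jx * np.conj(jy)))    # +/-45-deg linear basis
    s3 = np.mean(2 * np.imag(jx * np.conj(jy)))    # circular basis
    p = np.sqrt(s1**2 + s2**2 + s3**2) / s0        # degree of polarization, Eq. (4)
    return s0, s1, s2, s3, p

# Hypothetical check: equal X and Y fields give 45-deg linear polarization,
# so S1 = 0, S2 = S0, S3 = 0, and p = 1.
t = np.linspace(0, 1, 1000)
print(stokes_from_iq(np.cos(t), np.sin(t), np.cos(t), np.sin(t)))
```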

2.3. Classification Strategy

Material classification using LIDAR and polarimetric sensing has been demonstrated with a variety of classifiers, such as SVMs, decision trees, and neural networks [16], showing good accuracy. In that work, the best-performing method was the SVM (accuracy = 94.4%), compared with k-nearest neighbors (accuracy = 92.0%), a neural network (accuracy = 89.0%), and a decision tree (accuracy = 83.8%). From this, we conclude that the gains from selecting the optimal framework and training strategy are not the focus of this paper; we are investigating the applicability of this method to real-world materials and configurations that would be observed by autonomous vehicles.
Instead, we select a simple feed-forward perceptron model, trained and validated using the predictive modelling tools in JMP 16 [22]. Once the entire dataset has been collected, we use k-fold cross-validation with five folds: the neural network is trained on one portion of the dataset and then validated on the portion that has not been used for training.
As we will show in the Results section, increasing the number of nodes in the hidden layer can improve the classification accuracy; however, we would like to assess the relative performance of this classifier against different sets of materials. With this in mind, we fix our classifier to a feed-forward neural network with a single hidden layer of 64 nodes and describe the results in terms of relative performance.
The presented neural network is a perceptron with a non-binary output classification. Six distinct input nodes are employed in conjunction with 64 hidden nodes. The input nodes consist of the calculated distance to the target, as in (1); the SNR of the correlation peak, as in (2); and the four Stokes parameters calculated from the polarization-diverse receiver, as in (3). A hyperbolic tangent activation function is used to facilitate the required non-binary classification: the activation function categorizes materials into output value ranges within the possible -1 to 1 overall output, depending on the number of materials for classification.
The value of the perceptron's output node is computed by summing the inputs from the hidden nodes, each multiplied by its corresponding synaptic weight. The values of the hidden nodes are similarly calculated by summing the values of the input nodes multiplied by their respective synaptic weights, as shown in [23]. This process is handled by an automated optimizer during training and is documented on the JMP website [22].
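The sketch below builds an equivalent classifier with scikit-learn in place of JMP's predictive modelling tools (a substitution on our part, not the paper's toolchain); scikit-learn uses a softmax output layer rather than JMP's single tanh output with value ranges, so it approximates rather than replicates the model. X and y are placeholders for the measured dataset.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: one row per shot -> [distance, SNR, S0, S1, S2, S3]; y: material label.
X = np.random.rand(1000, 6)              # placeholder features
y = np.random.randint(0, 4, 1000)        # placeholder labels (4 materials)

clf = make_pipeline(
    StandardScaler(),                    # the six inputs span very different ranges
    MLPClassifier(hidden_layer_sizes=(64,), activation="tanh", max_iter=500),
)
scores = cross_val_score(clf, X, y, cv=5)        # 5-fold cross-validation
print(f"accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```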

3. Bi-Directional Optical Sub-Assembly (BOSA)

Recovering the polarization from the received LIDAR signals can be accomplished with several methods [14,17,24]; in this work, however, we use a polarization-diverse homodyne receiver, similar to the digital self-homodyne receiver shown by Puttnam et al. [25], but with the transmit circuit combined on the same chip as the receive circuit, which we call a bi-directional optical sub-assembly (BOSA). The purpose of the BOSA is to generate the transmit signal into an optical fiber and to receive the reflected light from an optical fiber. The actual collimation into free space towards the target requires an optical telescope, as well as an optical circulator to separate the transmit and receive optical signals in a coaxial LIDAR; this is detailed in Section 4.
Within the LIDAR engine, the BOSA comprises two primary elements: a receiver (RX) and a transmitter (TX). The transmitter segment encompasses a photonic integrated circuit (PIC) that produces a local oscillator (LO) signal and an RMCW-modulated signal to be sent into the environment. Conversely, the receiver segment incorporates a photonic-integrated polarization-diverse in-phase and quadrature (IQ) receiver. Both of these elements are integrated within a single PIC, ensuring a compact and efficient design.

3.1. Transmit Circuit (TX)

The TX circuit is shown schematically in Figure 2d. An external butterfly laser source, called a transmit optical subassembly (TOSA), is coupled to the photonic chip, and the light is then split into local oscillator and signal paths. The local oscillator is used in the receive circuitry for homodyne detection, while the signal path is encoded with our RMCW modulation before being emitted into the environment. We use a heater-controlled Mach–Zehnder interferometer (MZI) to provide a tunable split between the two paths.
The Mach–Zehnder modulator (MZM) plays a pivotal role in transforming the electrical modulation code into optical modulation. This transformation is accomplished by modulating the depletion regions in the PN junctions of the MZM arms, as depicted in Figure 2. To enhance efficiency, a push–pull configuration is employed, effectively doubling the applied drive voltage. This method achieves the essential $2 \cdot V_\pi$ voltage swing for phase modulation with the least possible power consumption. To ensure that the MZM consistently operates at the required operating point, as depicted in Figure 2a, thermal heaters are integrated into each arm of the MZM. These heaters enable fine-tuning of the MZM output, thereby maintaining continuous and stable phase modulation.
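The sketch below illustrates this operating point numerically with an idealized, lossless MZM model ($V_\pi$ normalized to 1; not the device's measured transfer function): biased at the transmission null, a $2 \cdot V_\pi$ peak-to-peak drive flips the output field between $+E$ and $-E$, i.e., 0/π phase modulation at constant intensity.

```python
import numpy as np

V_PI = 1.0   # normalized half-wave voltage

def mzm_field(v_drive, v_bias=V_PI):
    # Push-pull drive: the arms see +/- v/2, giving a differential phase of
    # pi * v / V_pi; the output field is the cosine of half that phase.
    return np.cos(np.pi * (v_drive + v_bias) / (2 * V_PI))

bits = np.array([1, 0, 1, 1, 0])
drive = (2 * bits - 1) * V_PI        # +/- V_pi swing = 2 * V_pi peak-to-peak
fields = mzm_field(drive)
print(np.round(fields, 3))           # [-1.  1. -1. -1.  1.]: 0/pi phases, |E| = 1
```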

3.2. Receive Circuit (RX)

Our ability to detect and classify materials is based on accurate measurement of the polarization state of the received signal, and the RX circuit in the BOSA is the core component that achieves this functionality. This receiver comprises three principal components: a polarization splitter–rotator (PSR), 90° hybrids, and photodetectors (PDs), all interconnected via low-loss SiN waveguides within the PIC, as illustrated in Figure 3a.
The PSR plays a pivotal role in processing optical inputs with indeterminate TE/TM polarization. It adiabatically transforms the TM component into the fundamental TE mode of the SiN waveguides, while simultaneously transmitting the TE component without alteration. The unaltered TE fraction is hereafter referred to as the X-polarization, and the adiabatically converted TM fraction is referred to as the Y-polarization. Following polarization separation, each polarization state is directed to its corresponding 90° hybrid, each of which consists of four multi-mode interference couplers (MMIs). A 2 × 2 MMI with an unused input port is employed on the LO side of the hybrids, whereas the signal input is managed by a 1 × 2 MMI on the opposite side.
This configuration results in a 90° phase difference between the two inputs, labelled in the diagram as p and n, which is essential for the receiver's capability to process both amplitude and phase information. To convert this optical information into processable electrical information, pairs of vertically stacked germanium photodiodes are implemented, with their photocurrents subtracted to remove any common-mode noise, as shown by Jin et al. [26]. The resulting photocurrents are then converted to voltage and amplified through the use of trans-impedance amplifiers (TIAs). Figure 3b depicts how the p and n photocurrents are 90° out of phase but together construct an in-phase and quadrature measurement, which is also the case for the Y-polarization. Thus, eight photocurrents result in the digitization of four voltages at the ADC.
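A simplified numerical model of this balanced IQ detection is sketched below (ideal components, normalized amplitudes, hybrid splitting losses omitted): mixing the signal with the LO at relative phases of 0° and 90° and subtracting the paired photocurrents cancels the common-mode intensity terms, leaving quantities proportional to the in-phase and quadrature components of the signal field.

```python
import numpy as np

def balanced_iq(e_sig, e_lo):
    """Recover I and Q of e_sig from four ideal photodiode pair outputs."""
    i_p = np.abs(e_sig + e_lo) ** 2        # in-phase pair, positive port
    i_n = np.abs(e_sig - e_lo) ** 2        # in-phase pair, negative port
    q_p = np.abs(e_sig + 1j * e_lo) ** 2   # quadrature pair (LO shifted 90 deg)
    q_n = np.abs(e_sig - 1j * e_lo) ** 2
    # Subtraction removes the |e_sig|^2 and |e_lo|^2 common-mode terms:
    return i_p - i_n, q_p - q_n            # ~ 4*Re, 4*Im of e_sig * conj(e_lo)

e_sig = 0.01 * np.exp(1j * 0.7)            # weak return with arbitrary phase
I, Q = balanced_iq(e_sig, e_lo=1.0)
print(np.angle(I + 1j * Q))                # ~0.7 rad: the optical phase survives
```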

4. Experimental Setup and Method

Figure 4 displays the experimental setup used to validate our material classification approach using a single-point RMCW LIDAR system. We use a wavelength-tunable DBR laser (Oclaro TL3000), set to 1545.3 nm with +12 dBm output power. The laser is connected to the input of our custom silicon photonic BOSA, as described in Section 3.1, which modulates the laser signal with a phase-modulated 512-bit Gold code, with the MZM biased as shown in Figure 2a. In this setup, the Gold code is 2 μs in duration, and the acquisition system then waits another 2 μs for the return signal; thus, each pixel takes a total of 4 μs. To boost the optical power, we use an erbium-doped fiber amplifier (EDFA), which raises the transmitted power to +27 dBm.
To control the polarization of the transmitted beam, we connect a fiber polarization controller to the output of the EDFA and align the polarization to be 45° linear relative to the input of the polarization beam splitter (PBS). Thus, we have equal power in both X- and Y-polarizations in separate fibers, which are assembled together into a fiber array. As part of our design of experiments, we would like to vary the transmitted polarization; by connecting or disconnecting the inputs of the PBS, we are able to create roughly 0°, 45°, and 90° linear polarizations.
The optical fiber array is used to closely position both Tx fibers and a single Rx fiber in parallel behind a birefringent and magneto-optic crystal stack, forming a system that resembles a free-space optical circulator. These types of non-reciprocal devices have been used in optics for decades to enable a polarization-independent separation of forward- and backwards-travelling waves; we direct the reader to the literature for more detail on these devices [27,28]. A collimation system that projects both Tx and Rx fibers onto the same optical axis after collimation is called a coaxial LIDAR system and has several benefits, such as ensuring alignment of the Tx and Rx paths at all times. In this manner, the system resembles a free-space optical circulator with a single collimating lens that focuses on both the Tx and Rx fibers. The resulting collimated beam has a 1/e² intensity diameter of 19.1 mm × 5.2 mm.
This beam is directed to the target of interest and is reflected along the same path to the collimation optics. The free-space optical circulator collects the received signal and couples it to a single-mode fiber, which is edge-coupled to the RX BOSA, as described in Section 3.2. The photonic integrated circuit mixes the local oscillator with the weak reflected LIDAR signal from the environment and creates four photocurrents, covering two polarizations and two quadratures. These photocurrents are converted to voltages and then sampled by 4 × 500 MSps ADCs, producing a polarization-diverse field reconstruction of the received LIDAR signal, labelled as XI, XQ, YI, and YQ, where X and Y refer to the horizontal and vertical polarizations, and I and Q are the in-phase and quadrature components.
These digitized signals are processed in two steps. First, they are combined into a single measurement and correlated digitally with the ideal RMCW code, producing a correlation signal similar to Figure 1b; from this, we calculate the distance to the target and the SNR. This signal processing chain is executed entirely in a Xilinx UltraScale (ZU19) FPGA, which has been designed to accommodate the time budget of calculating the relevant parameters every 4 μs.
Second, we process the polarization measurements to calculate the Stokes time series for the received signal, as in Section 2.2. We then take the mean Stokes values over the duration of the codeword and use the scalar values of $S_0$, $S_1$, $S_2$, and $S_3$ as inputs to our machine learning model, which, combined with the distance and SNR, provides six inputs to our neural network model. In this experiment, the neural network is processed offline using Python; however, since the network is a simple multi-layer perceptron with a single hidden layer, we believe that the material class calculation could be implemented in an FPGA and completed within the 4 μs acquisition window.

4.1. Data Collection Methodology

The set of materials used in our experiment comprises four that would occur in an urban environment: black powder-coated aluminum, concrete, engineered wood, and black plastic; photos of the samples are shown in Figure 5. The samples are large enough for our laser beam to fit entirely without clipping. In order to assess the classification performance of the polarization-diverse RMCW LIDAR, we sought to collect data in controlled experimental conditions, but with predetermined variations that would mimic a real environment. To this end, we collected data for every material sample while varying the target distance, the angle of incidence to the LIDAR (which we call yaw), the rotation of the material in the plane perpendicular to the LIDAR optical path (which we call roll), and the polarization of the transmit signal from the RMCW LIDAR.
These deliberate variations are used to explore the range of polarizations that we receive in the BOSA, and to validate the classifier with these variations.
The variations are explored using a full-factorial design-of-experiments (DOE), and the values used for the input factors are shown in Table 1. For each of the 108 combinations, we collect roughly 830 measurements, for a total of more than 89,640 measurements per material. Using our RMCW reference code, we correlate the return signal and use a constant false alarm rate (CFAR) algorithm to determine whether we have a valid peak, using a false alarm rate of 1 × 10⁻⁴ [29]. If no peaks pass the CFAR test, we reject the measurement completely.
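The paper does not specify which CFAR variant is used, so the sketch below shows a generic cell-averaging CFAR over the correlation-domain power as one plausible realization; the training and guard window sizes are illustrative, and the threshold scaling assumes exponentially distributed noise power.

```python
import numpy as np

def ca_cfar(power, n_train=32, n_guard=4, pfa=1e-4):
    """Return indices of cells exceeding a cell-averaging CFAR threshold."""
    half = n_train // 2
    alpha = n_train * (pfa ** (-1.0 / n_train) - 1)    # scaling for target Pfa
    hits = []
    for i in range(half + n_guard, len(power) - half - n_guard):
        lead = power[i - n_guard - half : i - n_guard]           # left training cells
        lag = power[i + n_guard + 1 : i + n_guard + 1 + half]    # right training cells
        noise = np.mean(np.concatenate([lead, lag]))
        if power[i] > alpha * noise:
            hits.append(i)
    return hits

# Applied to the squared correlation signal from the ranging step; a shot with
# no surviving cells would be rejected, as described above.
# hits = ca_cfar(corr ** 2)
```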

5. Results

As described in Section 4.1, each material was tested according to the DOE plan, with the polarization-diverse quadrature signals digitized and recorded for every configuration for each material. Thus, we stored more than 89,000 measurements for each material, and each measurement was processed to produce a distance to the target, the SNR of the cross-correlation peak, and the four Stokes parameters. Therefore, in total, we collected 356,000 measurements to train and validate our classifier, with the DOE ensuring we had significant real-world variation in the dataset.
These six measurement outputs are the six inputs of our neural network. We then trained a multi-layer perceptron neural network with a single hidden layer of 64 nodes, using a tanh activation function, as described in Section 2.3, and then ran the resulting model on the validation data set. We expressed the performance of the classifier with a confusion matrix, as shown in Figure 6. The overall classification accuracy was 85.4%; however, from the confusion matrix, it is clear that the majority of the misclassifications came from plastic, which had a classification accuracy of just 72.6%.
We believe that the cause of this lower accuracy for black plastic is not related to the characteristics of the material, but rather to the amount of measurement data available for building the classifier. From the confusion matrix in Figure 6, we can see that the number of samples available for each of aluminum, concrete, and engineered wood (n = 17,864, 12,931, and 17,585, respectively) was much higher than the number of samples for black plastic (n = 9033); with fewer available measurements, the classifier was biased against black plastic. The smaller sample count is due to the low reflectivity of the material, and in the future, the classifier should be trained on an equal number of points for every class. The difference in SNR for each material class is shown in Figure 7.
In comparison with previous work, Lee et al. used the estimation of surface reflectance from a time-of-flight camera to achieve an accuracy of 76.5% on seven materials [11]. Using a polarimetric multispectral LIDAR, Han et al. demonstrated 100% material classification, demonstrating no false detections by combining polarization information as well as the information from 33 lasing wavelengths between 580 nm and 900 nm [30]. This approach, while impressive, needs a large laboratory setup to create the multispectral LIDAR, and would not be amenable to an integrated LIDAR system.
Capturing the variation over the DOE is an important method for assessing the real-world performance of the classifier. For example, we measured the materials at two distances, as we expected the laser return SNR to change over distance for each material. If we trained the classifier using only the measurements at a single distance of 10 m, the classification accuracy was 99.7%; similarly, training only on the measurements at 3 m yielded a classification accuracy of 99.6%. In these two cases, the network was overfitted to the measurements of these materials at that specific distance, and the classifier would therefore perform poorly at any other distance.
To explore the performance of the classifier as a function of the number of hidden-layer neurons, we repeated the training and validation for a selection of hidden layer sizes and measured the overall classification accuracy, as shown in Figure 8.
We use a 3D scatter plot to visualize the Stokes parameters S1, S2, and S3, and the measurements demonstrate that the materials produce differing polarization results; we remind the reader that the launch conditions were 0°, 45°, and 90° linear polarization relative to the output window of the LIDAR sensor. As shown in Figure 9a, all materials produced Stokes parameters that were well clustered; the exceptions were concrete (in purple), which was clustered but not clearly visible, and coated metal, which was clustered into many smaller groupings. This would suggest that classification of concrete would have poor accuracy and that most of the false classifications would be for coated aluminum. In contrast, the confusion matrix in Figure 6 indicates a different conclusion: the accuracy for concrete was 88.3%, and the false classification rate was highest for plastic.
Additional insight is available by plotting the relationship between SNR and the Stokes parameters. In Figure 7, there is an observable difference in the mean SNR of each material, though the overall distributions are not separated; in Figure 9b, we see a clear distinction between concrete and the other materials, as concrete is strongly confined in S2 but greatly dispersed in SNR. Similarly, we note that coated metal only has measurement points above an SNR of 8 dB.

6. Conclusions

We have demonstrated instantaneous material classification using an RMCW LIDAR system with an integrated transmit/receive photonic chip to enable polarization-diverse homodyne detection. This technique enables the creation of LIDAR point clouds with a point-by-point estimate of materials in the environment, which could greatly aid perception tasks for autonomous vehicles and robotics. In our field test at 3 m and 10 m, we varied the angle of incidence and rotation normal to the LIDAR, as well as the launch polarization state, collecting over 350,000 measurements to train and validate our machine learning model for material classification. The field test demonstrated that plastic, engineered wood, concrete, and coated aluminum could be correctly classified with an accuracy of 85.4%. We believe that this is a strong demonstration of material classification as a novel LIDAR data product for autonomy and robotics.

Author Contributions

Conceptualization, C.P., Y.K.L. and F.C.; methodology, Y.K.L.; software, C.P.; experimental validation, D.R.; writing—original draft preparation, C.P. and A.T.; writing—review, Y.K.L. and F.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

The authors would like to acknowledge the contributions of Ben Hopkins, who assisted with the details on polarization and the optical system.

Conflicts of Interest

The authors disclose that they are full-time employees of Baraja Pty Ltd. and/or Baraja Inc., where they are actively commercializing LIDAR technology.

References

  1. Hecht, J. Lidar for self-driving cars. Opt. Photonics News 2018, 29, 26–33. [Google Scholar] [CrossRef]
  2. Fernando, H.; Marshall, J. What lies beneath: Material classification for autonomous excavators using proprioceptive force sensing and machine learning. Autom. Constr. 2020, 119, 103374. [Google Scholar] [CrossRef]
  3. Kiyokawa, T.; Takamatsu, J.; Koyanaka, S. Challenges for future robotic sorters of mixed industrial waste: A survey. IEEE Trans. Autom. Sci. Eng. 2022, 21, 1023–1040. [Google Scholar] [CrossRef]
  4. Tao, J.; Liang, R.; Li, J.; Yan, B.; Chen, G.; Cheng, Z.; Li, W.; Lin, F.; Hou, L. Fast characterization of biomass and waste by infrared spectra and machine learning models. J. Hazard. Mater. 2020, 387, 121723. [Google Scholar] [CrossRef] [PubMed]
  5. Dashpute, A.; Saragadam, V.; Alexander, E.; Willomitzer, F.; Katsaggelos, A.; Veeraraghavan, A.; Cossairt, O. Thermal Spread Functions (TSF): Physics-guided Material Classification. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Vancouver, BC, Canada, 17–24 June 2023; pp. 1641–1650. [Google Scholar]
  6. Shanbhag, H.; Madani, S.; Isanaka, A.; Nair, D.; Gupta, S.; Hassanieh, H. Contactless Material Identification with Millimeter Wave Vibrometry. In Proceedings of the 21st Annual International Conference on Mobile Systems, Applications and Services, Helsinki, Finland, 18–22 June 2023; pp. 475–488. [Google Scholar]
  7. Tanaka, K.; Mukaigawa, Y.; Funatomi, T.; Kubo, H.; Matsushita, Y.; Yagi, Y. Material classification from time-of-flight distortions. IEEE Trans. Pattern Anal. Mach. Intell. 2018, 41, 2906–2918. [Google Scholar] [CrossRef]
  8. Muckenhuber, S.; Holzer, H.; Bockaj, Z. Automotive lidar modelling approach based on material properties and lidar capabilities. Sensors 2020, 20, 3309. [Google Scholar] [CrossRef]
  9. Kirchner, N.; Taha, T.; Liu, D.; Paul, G. Simultaneous material type classification and mapping data acquisition using a laser range finder. In Proceedings of the International Conference on Intelligent Technologies, Sydney, Australia, 12–14 December 2007; University of Technology: Sydney, Australia, 2007. [Google Scholar]
  10. Antonarakis, A.; Richards, K.S.; Brasington, J. Object-based land cover classification using airborne LiDAR. Remote Sens. Environ. 2008, 112, 2988–2998. [Google Scholar] [CrossRef]
  11. Lee, S.; Lee, D.; Kim, H.C.; Lee, S. Material Type Recognition of Indoor Scenes via Surface Reflectance Estimation. IEEE Access 2021, 10, 134–143. [Google Scholar] [CrossRef]
  12. Bonifazi, G.; Capobianco, G.; Palmieri, R.; Serranti, S. Hyperspectral imaging applied to the waste recycling sector. Spectrosc. Eur. 2019, 31, 8–11. [Google Scholar] [CrossRef]
  13. Peyghambari, S.; Zhang, Y. Hyperspectral remote sensing in lithological mapping, mineral exploration, and environmental geology: An updated review. J. Appl. Remote Sens. 2021, 15, 031501. [Google Scholar] [CrossRef]
  14. Liu, X.; Zhang, L.; Zhai, X.; Li, L.; Zhou, Q.; Chen, X.; Li, X. Polarization Lidar: Principles and Applications. Photonics 2023, 10, 1118. [Google Scholar] [CrossRef]
  15. Sassen, K.; Zhu, J.; Webley, P.; Dean, K.; Cobb, P. Volcanic ash plume identification using polarization lidar: Augustine eruption, Alaska. Geophys. Res. Lett. 2007, 34, L08803. [Google Scholar] [CrossRef]
  16. Brown, J.P.; Roberts, R.G.; Card, D.C.; Saludez, C.L.; Keyser, C.K. Hybrid passive polarimetric imager and lidar combination for material classification. Opt. Eng. 2020, 59, 073106. [Google Scholar] [CrossRef]
  17. Nunes-Pereira, E.; Peixoto, H.; Teixeira, J.; Santos, J. Polarization-coded material classification in automotive LIDAR aiming at safer autonomous driving implementations. Appl. Opt. 2020, 59, 2530–2540. [Google Scholar] [CrossRef]
  18. Takeuchi, N.; Sugimoto, N.; Baba, H.; Sakurai, K. Random modulation cw lidar. Appl. Opt. 1983, 22, 1382–1386. [Google Scholar] [CrossRef] [PubMed]
  19. Roudas, I.; Vgenis, A.; Petrou, C.S.; Toumpakaris, D.; Hurley, J.; Sauer, M.; Downie, J.; Mauro, Y.; Raghavan, S. Optimal polarization demultiplexing for coherent optical communications systems. J. Light. Technol. 2010, 28, 1121–1134. [Google Scholar] [CrossRef]
  20. Pulikkaseril, C. Simulating correlation waveforms of random modulated continuous wave LIDAR. Opt. Eng. 2022, 62, 031205. [Google Scholar] [CrossRef]
  21. Hecht, E. Optics; Pearson Education, Incorporated: London, UK, 2017. [Google Scholar]
  22. JMP Statistical Discovery LLC. Neural Networks. Available online: https://www.jmp.com/support/help/en/17.2/index.shtml#page/jmp/neural-networks.shtml#103373 (accessed on 18 January 2024).
  23. Castaño, F.; Beruvides, G.; Haber, R.E.; Artuñedo, A. Obstacle recognition based on machine learning for on-chip LiDAR sensors in a cyber-physical system. Sensors 2017, 17, 2109. [Google Scholar] [CrossRef]
  24. Han, Y.; Salido-Monzú, D.; Butt, J.A.; Wieser, A. Polarimetric femtosecond-laser LiDAR for multispectral material probing. In Proceedings of the Optics and Photonics for Advanced Dimensional Metrology II, Strasbourg, France, 3 April–23 May 2022; Volume 12137, pp. 70–77. [Google Scholar]
  25. Puttnam, B.J.; Luís, R.S.; Delgado Mendinueta, J.M.; Sakaguchi, J.; Klaus, W.; Kamio, Y.; Nakamura, M.; Wada, N.; Awaji, Y.; Kanno, A.; et al. Self-homodyne detection in optical communication systems. Photonics 2014, 1, 110–130. [Google Scholar] [CrossRef]
  26. Jin, X.; Su, J.; Zheng, Y.; Chen, C.; Wang, W.; Peng, K. Balanced homodyne detection with high common mode rejection ratio based on parameter compensation of two arbitrary photodiodes. Opt. Express 2015, 23, 23859–23866. [Google Scholar] [CrossRef]
  27. Fujii, Y. High-isolation polarization-independent optical circulator. J. Light. Technol. 1991, 9, 1238–1243. [Google Scholar] [CrossRef]
  28. Matsumoto, T.; Sato, K.i. Polarization-independent optical circulator: An experiment. Appl. Opt. 1980, 19, 108–112. [Google Scholar] [CrossRef] [PubMed]
  29. Wikipedia Contributors. Constant False Alarm Rate—Wikipedia, The Free Encyclopedia. 2022. Available online: https://en.wikipedia.org/w/index.php?title=Constant_false_alarm_rate&oldid=1104952768 (accessed on 18 January 2024).
  30. Han, Y.; Salido-Monzú, D.; Wieser, A. Classification of material and surface roughness using polarimetric multispectral LiDAR. In Proceedings of the Multimodal Sensing and Artificial Intelligence: Technologies and Applications III, Munich, Germany, 26–30 June 2023; Volume 12621, pp. 97–106. [Google Scholar]
Figure 1. (a) Example of a received RMCW signal corrupted by white noise, and (b) the resulting correlation signal showing a peak at the delayed time of the received waveform.
Figure 2. (a) Mach–Zehnder modulator (MZM) transfer function; operating point shown in red denotes the desired bias point to operate in phase-modulation ( V π is the half-wave voltage). (b) Input electrical modulation in the form of high and low voltages. (c) Output optical modulation in the form of intensity and phase information. (d) The transmit (TX) portion of the photonic chip receives its input light from an external laser, which is then distributed between the MZM and the path leading to the local oscillator using a Mach–Zehnder interferometer (MZI).
Figure 3. (a) Single channel polarization-diverse in-phase/quadrature (IQ) receiver (XIp, XIn = x-polarization in-phase pair; YIp, YIn = y-polarization in-phase pair; XQp, XQn = x-polarization quadrature pair; YQp, YQn = y-polarization quadrature pair; PSR = polarization splitter/rotator; PD = photodiode; LO = local oscillator). (b) X/Y polarization constellation diagrams. Two example measurements are shown with their component breakdowns, which are labelled on the receiver.
Figure 4. Experimental setup: we use a Tx BOSA for modulating the RMCW code on the transmit path, and an RX BOSA for performing the polarization-diverse IQ demodulation, using unmodulated light as the local oscillator (DBR: distributed Bragg reflector, EDFA: erbium-doped fiber amplifier, PC: polarization controller, PBS: polarization beamsplitter, Rx: received optical path, XI, XQ, YI, YQ: the in-phase and quadrature portions of the x- and y-polarization.).
Figure 5. Materials used for experimental validation: (a) coated aluminum, (b) concrete, (c) black plastic, (d) engineered wood.
Figure 6. Confusion matrix for the 64-node classifier on four different materials.
Figure 7. Distribution of SNR values for all materials in the dataset.
Figure 8. Classification accuracy as the number of nodes in the hidden layer increases.
Figure 9. Scatter plots to demonstrate the clustering of measurements in polarization space, and with SNR. Points are colored by material, with concrete (red), coated metal (green), plastic (orange), and engineered wood (blue). (a) 3D visualization of S1, S2, and S3. (b) 2D scatter plot of SNR and S2.
Table 1. Experimental factors used in the full-factorial design-of-experiments (DOE) for collecting the training and validation data.
Input Factor           Values
Distance (m)           3, 10
Tx polarization (°)    0, 45, 90
Yaw (°)                0, 7, 15
Roll (°)               0, 45, 135, 180, 225, 315