Article

Performance Analysis of a Color-Code-Based Optical Camera Communication System

Hasan Ziya Dinc and Yavuz Erol
Department of Electrical and Electronics Engineering, Firat University, 23119 Elazig, Turkey
Appl. Sci. 2024, 14(19), 9102; https://doi.org/10.3390/app14199102
Submission received: 30 August 2024 / Revised: 1 October 2024 / Accepted: 4 October 2024 / Published: 8 October 2024
(This article belongs to the Section Electrical, Electronics and Communications Engineering)

Abstract

In this study, we analyze the performance of an optical camera communication (OCC) system, a form of visible light communication (VLC), that uses a mobile phone camera as the receiver and a computer monitor as the transmitter. By creating color channels in the form of a 4 × 4 matrix within a frame, we determine the parameters that affect the successful transmission of data packets. To ensure accurate data transmission, we considered factors such as the brightness or darkness of the test room, the light color of the lamp in the illuminated environment, the effects of daylight when the monitor is positioned in front of a window, and dead pixels and light bleed originating from the monitor's manufacturing process. The PyCharm, Pydroid, Python, Tkinter, and OpenCV platforms were used to program the transmitter and receiver units. Through image processing techniques, we mitigated the effects of daylight on communication performance, yielding a system that outperforms standard photodiode-based VLC systems. In addition, the sizes of the channels, the distances between the channels, and the number of channels were adjusted with the objectives of maximizing the number of channels and the communication distance. The NumPy library, compatible with Python–Tkinter, was employed to set the color levels and dimensions of the channels. We investigate the effects of the RGB and HSV color spaces on the data transmission rate and communication distance, and we discuss in detail the impact of the distance between color channels on color detection performance.

1. Introduction

Visible light communication (VLC) is an alternative to radio frequency (RF) communication and covers the visible light region of the electromagnetic spectrum in the 400–800 THz band. Since the operating frequency is much higher than that of RF, the communication distance is shorter, but VLC offers much higher bandwidth and is insensitive to electromagnetic noise in the RF band. Owing to these advantages, it is widely used in security systems, the health sector, avionics systems, underwater communication, and vehicle-to-vehicle communication [1,2].
VLC has been recognized by both industry and academia as a feasible solution for realizing the wireless communication networks of the future. VLC-compatible light-emitting diode (LED) luminaires turn an ordinary luminaire used for lighting into a high-speed modem. Because the light flickers at a high frequency, people in the environment cannot perceive the data transfer. Today, VLC-compatible LED luminaires are offered by various companies [3].
Free-space optical (FSO) communication, similar to VLC, uses light to transmit data. However, the light used is not limited to visible light. Ultraviolet (UV) and infrared (IR) are also included in this category. Optical wireless communication (OWC) is a general term used to refer to all forms of optical communication and includes visible light communication, light-fidelity (Li-Fi), FSO communication, and infrared remote-control systems [4].
Most VLC systems use LEDs as optical emitters and photodiode arrays as optical receivers. The light emission mechanism of LEDs is based on the laws of quantum physics: when current flows through the P-N junction, light is emitted at a frequency corresponding to the visible region of the spectrum. The range of 380–780 nm is recognized by the International Commission on Illumination (Commission Internationale de l’Eclairage, CIE) as the visible light band [5]. LEDs used in VLC systems that communicate at high bit rates have high luminous fluxes and low junction capacitances [6]. Ordinary LED chips produced for general lighting can be used in VLC systems with some compromise in data rate. The photodiode used as an optical sensor is very sensitive to light, and when reverse biased, its leakage current changes with the incident light. Since a photodiode, unlike a light-dependent resistor (LDR), reacts to changes in light very quickly (within a few ns), silicon photodiodes, positive-intrinsic-negative (PIN) diodes, and avalanche photodiodes are preferred as optical receivers in visible light communication systems [7]. Other alternatives used as optical receivers in VLC systems are charge-coupled device (CCD) and complementary metal oxide semiconductor (CMOS) image sensors [8,9,10].
The visible light communication technology called “Li-Fi” was announced in a TED Talk in 2011 by Professor Harald Haas of the University of Edinburgh and attracted worldwide attention [11]. Many technical developments have emerged since then. There are many academic studies on VLC in the literature, and several technical terms are used in addition to VLC, for example, optical camera communication (OCC), optical spatial modulation (OSM), camera–screen communication (CSC), light-to-camera communication (LCC), color intensity modulation (CIM), and high-density modulation (HDM).
Several studies on VLC systems using a cell phone camera as the receiver are mentioned below.
In [12], Chavez-Burbano et al. proposed a new flicker-free, distance-independent modulation method for OCC that can operate at short, medium, and long distances in both indoor and outdoor scenarios. For a VLC system [13] with a CMOS-based mobile phone camera and a quadrichromatic light-emitting diode (QLED), color ratio modulation (CRM), unlike color intensity modulation (CIM), was proposed to increase the data rate and improve the illumination effect at the same time. According to the experimental results, a downlink data rate of 13.2 kbit/s at 60 fps was achieved with a CMOS image sensor with a resolution of 1920 × 1080. In [14], Cahyadi et al. provide a comprehensive review of OCC systems, discussing the current status of OCC as well as potential developments and the challenges it faces, and offering a broad perspective on future research directions along with recent standardization activities. In [15], an Android camera-based application and a transmitter system using frequency shift on–off keying (FSOOK) are proposed to analyze the requirements and challenges of an OCC system for smartphone cameras. This system has a maximum communication range of 7.5 m; in particular, the flicker rate, omnidirectional communication, communication distance, and data rate are analyzed. Jung et al. designed a complementary color barcode-based optical camera communication (CCB-OCC) system [16]. In the proposed method, information is encoded into specially designed color barcodes and transmitted in a format that cannot be detected by humans but can be detected by camera-equipped devices. The experimental results verify the feasibility of the CCB-OCC scheme for short-range communication and present a new option for designing a display-to-camera (D2C) communication system that is robust to environmental change, easy to use, and simple to implement. In [17], a real-time color intensity modulation multi-input–multi-output (CIM-MIMO) OCC system was designed and implemented. A total of 192 data-carrying LEDs in a 16 × 16 LED array achieved a data rate of 126.72 kbps at a distance of 1.4 m with a refresh rate of 82.5 Hz and a frame rate of 330 fps, without external optical hardware. In [18], it is shown that by using the rolling shutter effect (RSE) of the camera image sensor, data can be properly demodulated from light. A camera-based visible light communication system was realized using an LED as the transmitter and a mobile phone camera as the receiver, and a demodulation method based on the HSB color space is proposed to improve the transmission performance and reduce the implementation complexity. The designed VLC system can achieve a data capacity of up to 1.28 kbits per image at a bit error rate (BER) of 3.8 × 10−3. In [19], a combination of color-shift keying and variable pulse position modulation (VPPM), available in the IEEE VLC standard, is presented. Although the use of VPPM reduces spectral efficiency, it is shown to simplify synchronization and reduce power consumption. Furthermore, CMOS sensors are shown to significantly increase the equivalent signal-to-noise ratio (SNR) after a simple signal processing technique.
Our work is based on optical camera communication. In this paper, we present a VLC system in which a cell phone camera is used as the receiver and a computer monitor is used as the transmitter. By creating different color channels within a frame, the data packet is transmitted from the transmitter to the receiver simultaneously over all channels. The PyCharm, Pydroid, Python, and Tkinter platforms were used to program the transmitter and receiver units. With image processing techniques, the effects of daylight on communication performance were eliminated. Unlike similar publications in the literature, the effects of the proximity of color channels on communication speed and accuracy are discussed in detail and supported experimentally. To identify the parameters that cause the camera to detect colors incorrectly, the amount of color shift is shown on the CIE xy chromaticity diagram.

2. Proposed OCC System Architecture

We designed the VLC system using a Redmi Note 9 Pro mobile phone as the receiver, an Asus X540U laptop computer as the transmitter, and a 21.5-inch Asus VP228HE monitor connected to this computer via a high-definition multimedia interface (HDMI) cable. Cross-platform application development software was used on the laptop serving as the transmitter. The phone, running the Android 10 operating system, has four rear cameras and one front camera; the main camera, with a 64 MP, f/1.9 Sony IMX686 sensor, was used in the tests. The phone has an 8-core CPU (2 × 2.3 GHz Kryo 465 Gold + 6 × 1.8 GHz Kryo 465 Silver), an Adreno 618 GPU, and a Qualcomm Snapdragon 720G chipset. The Asus X540U laptop has a 2.50 GHz Intel Core i5-7200U dual-core processor, and the resolution of its 15.6-inch screen is 1366 × 768 pixels. The 21.5-inch Asus VP228HE monitor has Full HD (1920 × 1080 pixels) image quality and a TN (twisted nematic) panel, with a brightness of 250 cd/m2 and a response time of 1 ms.
The general view of the designed communication system is shown in Figure 1.
The cross-platform application was written on an application development platform that allows a single software application to run on operating systems such as iOS, Android, Windows, Linux, macOS, Raspberry Pi OS, etc. Although many programming languages support cross-platform use, the Tkinter graphical interface library, which comes with the Python language, was preferred for this study.
The Tkinter graphical interface library can work with the Pydroid IDE on mobile devices and is compatible with the open-source OpenCV library. OpenCV provides flexible usage in image processing applications and is also frequently used in deep learning applications. Therefore, the Pydroid IDE was installed on the cell phone used as the receiver. With Pydroid, the data sent from the transmitter are captured by the phone’s camera and converted into useful data with image processing techniques on the receiving phone itself. Thus, there was no need to connect a computer to the phone.
The panel type of the ASUS monitor is a liquid crystal display (LCD). Adjustments such as brightness and saturation can be made on the transmitter side. However, these parameters can also be adjusted at the receiver side with image processing. A serial monitor was created with Canvas to monitor the data packets. Figure 2 shows the block diagram of the system.

2.1. Transmitter Design

In the transmitter part, Python–Tkinter was chosen as the most compatible application development tool, and PyCharm was used as the development environment on the transmitter side.
The application software written for the transmitter unit aims to send data such as audio, images, and video to the receiver. The colors used in the designed color-coded OCC system are red, green, blue, white, black, cyan, purple, and yellow.
In the transmitter part, 8 color values were assigned to an empty array with NumPy. R, G, B color values and 3-bit coded values are [255, 0, 0] = “100” for red, [0, 255, 0] = “010” for green, [0, 0, 255] = “001” for blue, [255, 255, 255] = “111” for white, [0, 0, 0] = “000” for black, [0, 255, 255] = “011” for cyan, [255, 0, 255] = “101” for purple, [255, 255, 0] = “110” for yellow.
As shown in Figure 3, the 12-character sample text “Li-Fi:#Test-” was first converted into an 8-bit-per-character ASCII sequence. This 96-bit sequence was then divided into 32 groups of 3 bits, and a color channel was defined for each 3-bit group. Thus, 32 color channels were obtained for the 12-character sample text. These color channels were arranged into 2 separate frames in 4 × 4 matrix format: in the first stage, Frame-1 containing the text “Li-Fi:”, and in the second stage, Frame-2 containing the text “#Test-”, are color-coded and transferred from transmitter to receiver. If the text consists of 14 characters instead of 12, the data packet is 112 bits instead of 96 bits, so 3 separate packets of 48 + 48 + 16 bits must be transmitted. When the 16 bits of the last packet are divided into groups of 3 bits, 1 bit is left over; for error-free transmission, the last packet is therefore padded with zeros to 48 bits, so that the original data can be recovered at the receiving end.
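A minimal Python sketch of this encoding (not the authors’ exact transmitter code) is given below; the 3-bit codes follow the mapping defined above, and the function name text_to_frames is illustrative.

```python
# 3-bit code -> RGB color, as defined in Section 2.1
CODE_TO_RGB = {
    "100": (255, 0, 0),      # red
    "010": (0, 255, 0),      # green
    "001": (0, 0, 255),      # blue
    "111": (255, 255, 255),  # white
    "000": (0, 0, 0),        # black
    "011": (0, 255, 255),    # cyan
    "101": (255, 0, 255),    # purple
    "110": (255, 255, 0),    # yellow
}

def text_to_frames(text, channels_per_frame=16):
    """Convert text to a list of frames; each frame is a list of 16 (R, G, B) tuples."""
    bits = "".join(f"{ord(c):08b}" for c in text)      # 8 bits per ASCII character
    frame_bits = channels_per_frame * 3                # 48 bits per frame
    bits += "0" * (-len(bits) % frame_bits)            # zero-pad the last frame, as described above
    groups = [bits[i:i + 3] for i in range(0, len(bits), 3)]
    colors = [CODE_TO_RGB[g] for g in groups]
    return [colors[i:i + channels_per_frame]
            for i in range(0, len(colors), channels_per_frame)]

frames = text_to_frames("Li-Fi:#Test-")
print(len(frames), "frames,", len(frames[0]), "channels each")   # 2 frames, 16 channels each
```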
The coordinates of the color channels in the Python (v3.9.7) code are chosen as shown in Figure 4. Each channel is 40 × 40 pixels in size. In the literature, there are several studies on the effects known as inter-pixel interference (IPI) and inter-symbol interference (ISI) in OCC systems. Since the interference effects decrease as the distance between pixels increases, the channels are spaced 40 pixels apart horizontally and vertically [20,21,22]. With the upper left corner as the origin, Channel 1 covers the region between coordinates (600, 250) and (640, 290).
The black background color of the monitor does not have a negative impact on information transfer. Since the image processing algorithm at the receiver takes into account the center coordinate of each color channel, the black color channel on a black background can be successfully detected.

2.2. Receiver Design

The cell phone used as the receiver was connected to a USB keyboard for easier code writing, and the Pydroid IDE was installed on it. Python and Tkinter were installed within Pydroid. Although Python offers many libraries for image processing, the OpenCV library was used in this study. Since OpenCV alone could not be used directly for the application’s display in this setup, Tkinter was needed. OpenCV can access the phone’s main camera (index 1) and the front camera; since access to the other three cameras is not authorized, only the main camera was used in this study.
After the application is run, the camera starts taking images at the specified times. The image is zoomed in depending on the distance to the transmitter, and the frequency of image acquisition is synchronized to the transmission rate of the transmitter. After the start data are received, each image is recorded. Once image acquisition is completed, the decoding stage begins with image processing. At this stage, various algorithms and modules are applied to correct distortions, and the data are converted into messages after the desired improvements. The entire camera image is not processed: in Canvas, only the section containing the channels is cropped and resized, and the pixel color values are written to Canvas. Since Pydroid does not provide a terminal for applications, a terminal was created on the Canvas in Python for this study. The flow chart summarizing the working principle of the program running on the mobile phone is shown in Figure 5.
The OpenCV code block needed for the camera to capture images is shown in Figure 6. The time.sleep(seconds) and cnvs.after(milliseconds, capture) calls written in the Pydroid IDE set the delay time of the receiver.
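Since Figure 6 is not reproduced here, the following is a minimal sketch, assuming a standard cv2.VideoCapture source and a Tkinter Canvas, of how cnvs.after() can schedule repeated captures; the camera index and frame period are placeholder values, not the authors’ exact settings.

```python
import cv2
import tkinter as tk

CAMERA_INDEX = 0          # placeholder; the paper uses the phone's main camera via Pydroid
FRAME_PERIOD_MS = 1000    # placeholder; matches the 1 s frame interval used in some tests

root = tk.Tk()
cnvs = tk.Canvas(root, width=640, height=480)
cnvs.pack()

cap = cv2.VideoCapture(CAMERA_INDEX)
frames = []               # captured frames, stored for later decoding

def capture():
    ok, img = cap.read()
    if ok:
        frames.append(img)
    else:
        cnvs.create_text(320, 240, text="camera can't get a picture")
    cnvs.after(FRAME_PERIOD_MS, capture)   # schedule the next capture

cnvs.after(5, capture)    # first capture shortly after the GUI starts (cf. Section 2.3)
root.mainloop()
```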
Canvas and Label tags were used to display the image on the screen. Figure 7 shows the code sections of the software used for Canvas. Since Canvas was chosen as the serial monitor, Canvas was also chosen for displaying the image on the screen. In the communication system, the character string “Li-Fi:#Test-” was selected as sample data. The monitor at the transmitter displays this character string at the specified time interval. The camera at the receiver is continuously searching for the start data.
When the start data are received, images are taken at the specified times and saved in a list named “frame_x”. Problems that occur on the receiver side during communication are logged, via Python’s exception handling, in a list named “exc”. The message written to Canvas when the camera fails to capture is “camera can’t get a picture”. The serial monitor shows whether the data are received correctly as a function of parameters such as contrast, color brightness, ambient light, channel size, color depth, and distance.

2.3. Image Processing Algorithm

When the application is started, the definitions are saved first; Python’s global mechanism is used so that these definitions can be accessed at any time. Then the index of the camera used in the application is defined; otherwise, the application cannot select the camera itself. The capture function runs 5 ms after the application starts. In the capture function, OpenCV begins taking images and first looks for the start pattern. In this study, the start data were chosen as the color green at a dedicated coordinate outside the 16 color channels of the matrix. The bit sequence of the green color is the same in the red–green–blue (RGB) and blue–green–red (BGR) color spaces. If the start data are not found, images are taken again at the specified time intervals. For each image, a 44 × 44-pixel square is drawn with image processing, added to the Canvas, and displayed on the screen; in this way, the coded color channels sent by the transmitter can be included in the matrix. Since the coordinates of the 4 × 4 color matrix are known in advance, only the colors belonging to the matrix are saved after the start color (green) is detected. If the camera fails to take an image, an error message is written to Canvas. After recording is completed, the decoding function is called, in which all recorded matrices are processed one by one. Here, the color space is selected first, and all image processing techniques for the test are applied; dimming, brightening, masking, noise reduction, blurring, resizing, etc. can be added to the algorithm if needed.
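A minimal sketch of the start-color check described above; the coordinate and tolerance are assumptions, not values taken from the paper.

```python
import numpy as np

START_XY = (50, 50)   # assumed (x, y) pixel position of the dedicated start channel

def start_detected(frame_bgr, tol=60):
    """Return True if the pixel at START_XY is close to pure green (OpenCV images are BGR)."""
    b, g, r = frame_bgr[START_XY[1], START_XY[0]].astype(int)
    return g > 255 - tol and b < tol and r < tol

# quick self-test with a synthetic frame containing a green start pixel
test = np.zeros((480, 640, 3), dtype=np.uint8)
test[START_XY[1], START_XY[0]] = (0, 255, 0)       # BGR green
print(start_detected(test))                         # True
```

In use, frames would be captured until start_detected() returns True, after which the 4 × 4 matrices are recorded.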
For example, the basic principle of masking is to show the desired color values as white and the remaining colors as black. Figure 8 shows how the masking process works for the red, green, and blue colors. Here, by specifying threshold values, masking was applied to the camera image in daylight; after the pixels below the threshold were set to black, the red, green, and blue colors in the 4 × 4 color channel matrix were detected.
In the masking technique, the image is first converted to the hue–saturation–value (HSV) color space, and lower and upper limits are set separately for the hue, saturation, and value parameters, which represent the color’s hue, saturation, and brightness, respectively. In the code given in Figure 9, to mask the green color, the lower and upper bounds are set as 60 and 75 for hue, 0 and 255 for saturation, and 0 and 255 for value. The code detects the colors within these limits, treats the rest as 0, and outputs black for them. After masking, the original masked color is recovered with the bitwise_and operation.
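A minimal sketch of this masking step, using the bounds quoted above; the synthetic input frame is our own stand-in, and this is not the exact code of Figure 9.

```python
import cv2
import numpy as np

# synthetic test frame: a green 40 x 40 square on a black background (stand-in for a camera frame)
img = np.zeros((200, 200, 3), dtype=np.uint8)
img[80:120, 80:120] = (0, 255, 0)                  # BGR green

hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
lower = np.array([60, 0, 0])                       # hue, saturation, value lower bounds (green)
upper = np.array([75, 255, 255])                   # hue, saturation, value upper bounds
mask = cv2.inRange(hsv, lower, upper)              # pixels inside the bounds become white (255)
green_only = cv2.bitwise_and(img, img, mask=mask)  # keep the original color only where masked
```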
After the image processing techniques are applied, the image is resized to 308 × 308 pixels, and data are read from the midpoint of each color channel in the resized image. In OpenCV, with the upper left corner of the image as the origin, the first color channel is located at coordinate (22, 22) and the second at (110, 22). According to the values read sequentially from the channels in the image, bits are stored in a list named data, and each 8-bit group is converted into meaningful data according to the ASCII table. If a value cannot be read, its x and y coordinates are stored in a list named error; thus, it can be determined in which image and at which coordinate the error occurred. In this study, we investigated the effect of the HSV and RGB color spaces in the decoding phase. For this reason, the daytime values of the HSV and RGB color spaces were recorded in the test phase, two separate functions were written for the RGB and HSV color spaces, and both decoding processes were applied to the received data.
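A minimal sketch of an RGB-based readout; the channel centres follow from the 308 × 308 resize and the (22, 22) / (110, 22) coordinates given above, while the nearest-palette classification rule is our assumption (the paper does not state the exact decision rule).

```python
import cv2
import numpy as np

# 3-bit code for each palette color, keyed by (R, G, B), as defined in Section 2.1
CODE_BY_RGB = {
    (255, 0, 0): "100", (0, 255, 0): "010", (0, 0, 255): "001", (255, 255, 255): "111",
    (0, 0, 0): "000", (0, 255, 255): "011", (255, 0, 255): "101", (255, 255, 0): "110",
}
PALETTE = np.array(list(CODE_BY_RGB.keys()), dtype=int)
CENTERS = [22, 110, 198, 286]            # channel-centre coordinates after resizing to 308 x 308

def decode_frame(img_bgr):
    img = cv2.resize(img_bgr, (308, 308))
    bits = ""
    for y in CENTERS:                    # row by row, left to right
        for x in CENTERS:
            b, g, r = img[y, x].astype(int)
            idx = int(np.argmin(np.sum((PALETTE - (r, g, b)) ** 2, axis=1)))  # nearest palette color
            bits += CODE_BY_RGB[tuple(int(v) for v in PALETTE[idx])]
    return bits                          # 48 bits per frame

# joining the bits of consecutive frames and slicing them into 8-bit groups recovers the ASCII text
```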

2.4. Optimization of Hardware Parameters

Optimization is required so that the monitor at the transmitter sends the coded colors correctly. The aim was to transmit the data accurately by taking into account whether the test room was bright or dark, the light color of the lamp in the illuminated area, and the effects of daylight when the monitor was placed in front of a window. The 30-square-meter test room was illuminated by an LED luminaire with a luminous efficacy of 120 lm/W, producing a luminous flux of 3000 lumens at a color temperature of 6500 K. Dead pixels and light bleed caused by the manufacturing process of the monitor were also taken into account. During the tests, the monitor was inspected for dead pixels and none were found; however, in the light bleed test, light bleed was detected from the bottom to the top of the panel. As shown in Figure 10, its negative effect on communication was reduced by tilting the screen 5 degrees about the z-axis. In addition, the brightness of the monitor was reduced by three-quarters, and the contrast and color saturation were set to their highest values. The blue-light filter setting of the monitor was maximized to reduce hardware-related problems.
The native resolution of the monitor used during the tests was 1920 × 1080 pixels. However, due to light bleed, the monitor response time, and other hardware limitations of the LCD panel, the use of the screen was optimized. The size of the channels, the distance between the channels, and the number of channels were adjusted to achieve the maximum number of channels and the maximum distance, which were the objectives of this study. Here, the distance between channels is needed to mitigate the blooming effect, which distorts the image at the receiver.
There are many options for setting the color level and size of the channels. The NumPy library was used because the channel dimensions and positions can be adjusted easily, it is easy to call, and it is compatible with Python–Tkinter; with it, the desired efficiency was achieved. The channels created with the NumPy library were placed on the Canvas. The sizes and color values of the transmitter channels created with NumPy are shown in Figure 11. The channel size was optimized as 40 × 40 pixels, and color values were chosen between 0 and 255. Due to the angle of the monitor about the z-axis, the value for the green color was chosen as 200. After optimization, the data were distributed to the channels as packets of three bits each: the first 3 bits were assigned to the first channel and the last 3 bits to the sixteenth channel.
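A minimal sketch of building one 4 × 4 frame as a NumPy image; the 40 × 40 channel size, the 40-pixel spacing, the origin from Figure 4, and the green level of 200 come from the text, while the screen dimensions and the helper name draw_frame are illustrative.

```python
import numpy as np

CH, GAP = 40, 40           # channel size and spacing in pixels (Section 2.1)
ORIGIN = (250, 600)        # (row, col) of channel 1, as in Figure 4

def draw_frame(colors, height=768, width=1366):
    """colors: 16 (R, G, B) tuples in row-major order. Returns an RGB image array."""
    img = np.zeros((height, width, 3), dtype=np.uint8)          # black background
    for i, rgb in enumerate(colors):
        row, col = divmod(i, 4)
        y = ORIGIN[0] + row * (CH + GAP)
        x = ORIGIN[1] + col * (CH + GAP)
        if rgb == (0, 255, 0):
            rgb = (0, 200, 0)                                   # green level reduced to 200 (see text)
        img[y:y + CH, x:x + CH] = rgb
    return img

# the resulting array can be converted to a Tkinter PhotoImage and placed on the Canvas
frame_img = draw_frame([(255, 0, 0)] * 16)                      # e.g., an all-red test frame
```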

3. Test and Results

In this study, various tests were conducted to analyze the performance of the color-coded VLC system. The effects of RGB and HSV color spaces on data transmission rate and communication distance are examined. In addition, the effects of the distance between the color channels on the color detection performance were also investigated. Figure 12 shows an overview of the test setup.

3.1. Bit Error Ratio Test

The color matching and color sensitivity of the CMOS camera used in the receiver vary according to the manufacturer. During the tests, the RGB and HSV color spaces were used with the receiver camera to compare their decoding performance.
To determine the bit error ratio of the designed color-coded VLC system, a data packet consisting of 2400 bits was generated, as shown in Figure 13. For this purpose, first the six-character text “Li-Fi:” and then the six-character text “#Test-” were sent 25 times each, consecutively. Since each color channel represents 3 bits, the 16 color channels carry 48 bits in total, and when 50 frames are sent, 2400 bits, i.e., 300 bytes, of data are transmitted. There is a delay time of Δt between frames, and this time can be changed in the software on the transmitter side.
In order to detect the number of erroneous bits depending on the communication rate, the delay time between frames was chosen as 1 s, 0.2 s, 0.1 s, and 0.05 s. These delay times correspond to data transmission rates of 48 bps, 240 bps, 480 bps, and 960 bps, respectively. At these communication rates, the number of erroneous bits was determined for a distance of 120 cm, 220 cm, and 320 cm. Table 1 and Table 2 show the variations in the number of erroneous bits for RGB and HSV color spaces depending on the distance and data rate. Table 1 shows how many bits in a 2400-bit data packet are received incorrectly when decoding with RGB color space at the receiving unit. When frames were sent at 1 s intervals, it took 50 s to send the 2400-bit data packet. At a distance of 120 cm, the 2400-bit data packet was received without error. When the distance between the monitor and the camera was 220 cm, the number of erroneous bits was 2. For a distance of 320 cm, 13 bits were detected incorrectly. When the data transmission rate was increased, the number of erroneous bits increased. At 960 bps, the transmission time for a 2400-bit data packet is 2.5 s. The number of erroneous bits was 196 bits for 120 cm, 707 bits for 220 cm, and 910 bits for 320 cm.
Table 2 shows how many bits in a 2400-bit data packet are received incorrectly when decoding with the HSV color space at the receiving unit. At 48 bps, error-free data transmission was achieved up to a distance of 320 cm. At 960 bps, the number of erroneous bits was 154 bits for 120 cm, 643 bits for 220 cm, and 832 bits for 320 cm. It was observed that the error rate was lower when using the HSV color space compared to the RGB color space.
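The rate and timing figures in Tables 1 and 2 follow directly from the 48 bits carried per frame; the following small check (ours, not part of the original software) reproduces them.

```python
BITS_PER_FRAME = 16 * 3                   # 16 channels x 3 bits = 48 bits per frame
for delay_s in (1.0, 0.2, 0.1, 0.05):     # frame intervals used in the tests
    rate_bps = BITS_PER_FRAME / delay_s
    t_packet = 2400 / rate_bps            # time to send the 2400-bit test packet
    print(f"{delay_s:>4} s delay -> {rate_bps:4.0f} bps, 2400 bits in {t_packet:4.1f} s")
# 1.0 s -> 48 bps (50 s), 0.2 s -> 240 bps (10 s), 0.1 s -> 480 bps (5 s), 0.05 s -> 960 bps (2.5 s)
```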
Figure 14 plots the number of erroneous bits for RGB and HSV color spaces using the values in Table 1 and Table 2.
The environment where the tests were carried out was illuminated with a room lamp and no external lighting was used. The lamp was not switched on and off during the tests.
In the RGB color space, as the distance between the transmitter and receiver increases, the contrast values of the colors get closer to each other and the error rate increases during the decoding phase. As can be seen in Figure 14, when the communication speed increased, the error rate increased more in the RGB color space than in the HSV color space. This is because the negative effects of the camera exposure exceed the threshold value in the RGB color space. Although the phone used is capable of 120 frames/s, the system speed dropped to 40 frames/s due to the distorting effect of exposure, and the tests were conducted at up to 20 frames/s. In the HSV color space, decoding is not based on the brightness of the color; for example, it is easier to distinguish a very bright red from a less bright red in the HSV color space. As the brightness of a color increases, all three components in the RGB color space increase, whereas in the HSV color space only the value (brightness) component increases. This means that the HSV color space is less susceptible to the adverse effects caused by the environment or hardware. Additionally, the CMOS camera used in the test phase loses color accuracy as it heats up.

3.2. Effect of Distance between Channels

As can be seen in Figure 15, the distance between the 16 color channels arranged in a 4 × 4 matrix increases horizontally and vertically starting from d = 0. The distance between color channels has a significant impact on communication performance: the number of channels from which a monitor can effectively send data simultaneously can be determined by appropriately adjusting the distance between them. When the channels are too close together, it is difficult for the camera to distinguish the colors at a distance.
To observe the effects of the distance between the channels, a 3 × 3 test field with a blue pixel in the center was created. Black, white, red, green, blue, yellow, purple, and cyan colors were placed around the blue color in the center, and the camera’s ability to detect the blue color was tested.
In Table 3, the first case, d = 0, corresponds to all color channels being adjacent. When black surrounded the blue center channel, the camera perceived the blue color as (24, 19, 250) instead of (0, 0, 255). Similarly, when yellow surrounded the blue center channel, the R, G, B values were perceived as (175, 191, 255) for d = 0, (93, 116, 247) for d = 20 pixels, and (53, 61, 255) for d = 40 pixels. As can be seen, decreasing the distance between channels causes the camera to detect the R, G, B values incorrectly, and this is one of the main factors affecting the bit error rate. The tests in Table 3 were repeated for d = 20 pixels and d = 40 pixels for all color values, and the effects of the inter-channel distance on the color detection performance of the camera are shown in detail.
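A minimal sketch of generating such a 3 × 3 test field; the layout follows the description above, while the function name and the choice of surround color in the example are illustrative.

```python
import numpy as np

def test_field(surround_rgb, d, ch=40):
    """Blue 40 x 40 centre channel surrounded by eight channels of one test color, spaced d pixels apart."""
    step = ch + d
    size = 3 * ch + 2 * d
    img = np.zeros((size, size, 3), dtype=np.uint8)
    for row in range(3):
        for col in range(3):
            y, x = row * step, col * step
            color = (0, 0, 255) if (row, col) == (1, 1) else surround_rgb   # blue in the centre
            img[y:y + ch, x:x + ch] = color
    return img

field = test_field((255, 255, 0), d=20)   # e.g., yellow surround with a 20-pixel gap, as in Table 3
```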
The (x, y) coordinates are utilized to determine chromaticity. Chromaticity coordinates are normalized with respect to the amount of light. The CIE system is the most commonly employed method for characterizing the composition of any color based on three primary colors. Having a metric that delineates the hue and vividness of a color, independent of light intensity, is beneficial [23,24]. The x and y values of the color coordinates are obtained from the following equations:
x = X / (X + Y + Z)
y = Y / (X + Y + Z)
The R, G, B values in Table 3 were converted to x,y values and then transferred to the CIE xy chromaticity diagram shown in Figure 16. This diagram shows more clearly the amount of misdetection of the original blue color by the camera. When there are different colors around the blue color and the channels are close to each other, there is a significant amount of color shift. For d = 40 pixels, the measurement results are within a narrower region, while for d = 0 they are spread over a very wide region.
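A minimal sketch of this conversion, assuming standard sRGB primaries, the sRGB gamma, and the D65 white point (the paper does not state which RGB-to-XYZ matrix was used):

```python
def srgb_to_xy(r, g, b):
    """Convert an 8-bit (R, G, B) reading to CIE 1931 (x, y) chromaticity coordinates."""
    def lin(c):                                    # undo the sRGB gamma
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = lin(r), lin(g), lin(b)
    X = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl    # sRGB (D65) RGB-to-XYZ matrix
    Y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    Z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    s = X + Y + Z
    return X / s, Y / s

# the blue-surround reading (11, 7, 255) at d = 0 maps to about (0.1512, 0.0616); compare Table 3
print(srgb_to_xy(11, 7, 255))
```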

4. Conclusions and Outlook

In this study, we present various methods to test the performance of an optical camera-based visible light communication system. A 21.5-inch Asus VP228HE monitor connected to an Asus X540U laptop via an HDMI cable was used as the transmitting unit of the designed system, and a Redmi Note 9 Pro cell phone was used as the receiving unit. On the transmitter side, the PyCharm, Python, Tkinter, and NumPy platforms were used to transmit the information messages as color-coded data packets; on the receiver side, the Pydroid, Python, Tkinter, and OpenCV platforms were used to reconstruct the original information messages from the images received by the camera.
The parameters affecting the communication performance in the designed OCC system can be listed as follows: the amount of daylight in the environment of the transmitter and the color temperature of the external lighting, the distance between the transmitter and the receiver, the number of color channels in a frame, the distance between the channels, the frame repetition rate, the type of monitor, the number of dead pixels, light bleed, the image acquisition capacity of the camera at the receiver per second, the efficiency of the image processing algorithm, and the type of color space used.
In the RGB color space, as the distance between the transmitter and receiver increases, the contrast values of the colors get closer to each other and the error rate increases during the decoding phase. It was also observed that as the communication speed increases, the error rate in the RGB color space increases more than in the HSV color space. Since decoding in the HSV color space is not based on the brightness level of the color, it is easier to distinguish between bright and dim colors there; therefore, by choosing the HSV color space instead of RGB in the communication system, the negative effects caused by environmental and hardware parameters are minimized. The effects of the distance between color channels on communication performance were discussed in detail. It was found that when the color channels at the transmitter are too close to each other, it is difficult for the camera at the receiver to distinguish the colors at a distance, which causes the R, G, B values to be perceived incorrectly. The amount of color shift is shown on the CIE xy chromaticity diagram.
In this study, it was shown that communication at a distance of 3.2 m is possible at rates below 1 kbps and that data can be transmitted over a distance of 1.2 m at 240 bps without error. Our suggestions for increasing the data rate and communication distance of the technique used are as follows:
  • Increase the number of channels;
  • Use an event camera;
  • Use an imaging lens and image sensor;
  • Choose an OLED monitor type instead of LCD;
  • Create color channels with power LEDs arranged in matrix form;
  • Enable automatic tuning of parameters with deep learning.
It has been shown that the OCC-based system presented in this study has much better daylight communication performance than on–off keying (OOK)-modulated VLC systems with a photodiode. Therefore, it is envisaged that the designed system can be used efficiently for data communication between vehicles in environments exposed to RF noise and in military vehicle convoys where Wi-Fi, Bluetooth, GSM, and GPRS communication is interrupted.

Author Contributions

Methodology, Y.E. and H.Z.D.; data curation, software, H.Z.D.; investigation, visualization, Y.E. and H.Z.D.; validation, supervision, Y.E.; writing—original draft preparation, Y.E. and H.Z.D.; writing—review and editing, Y.E. and H.Z.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by FIRAT University Scientific Research Projects Unit (MF.24.61).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Ghassemlooy, Z.; Popoola, W.; Rajbhandari, S. Optical Wireless Communications: System and Channel Modelling with MATLAB®, 2nd ed.; CRC Press: Boca Raton, FL, USA, 2019; pp. 9–11.
  2. Dimitrov, S.; Haas, H. Principles of LED Light Communications: Towards Networked Li-Fi; Cambridge University Press: Cambridge, UK, 2015; pp. 1–4.
  3. Aleksieva, V.; Valchanov, H.; Dinev, D. Comparison Study of Prototypes Based on LiFi Technology. In Proceedings of the 2019 International Conference on Biomedical Innovations and Applications (BIA), Varna, Bulgaria, 8–9 November 2019.
  4. Khalighi, M.A.; Uysal, M. Survey on Free Space Optical Communication: A Communication Theory Perspective. IEEE Commun. Surv. Tutor. 2014, 16, 2231–2258.
  5. Zolfi, R.; Erol, Y. Effects of Photodiode Array Size on Visible Light Communication (VLC) Performance. Fırat Univ. J. Eng. Sci. 2023, 35, 901–910.
  6. Jovicic, A.; Li, J.; Richardson, T. Visible Light Communication: Opportunities, Challenges and the Path to Market. IEEE Commun. Mag. 2013, 51, 26–32.
  7. Gao, D.; Zhang, J.; Wang, F.; Liang, J.; Wang, W. Design and Simulation of Ultra-Thin and High-Efficiency Silicon-Based Trichromatic PIN Photodiode Arrays for Visible Light Communication. Opt. Commun. 2020, 475, 126296.
  8. Liang, K.; Chow, C.W.; Liu, Y.; Yeh, C.H. Thresholding Schemes for Visible Light Communications with CMOS Camera Using Entropy-Based Algorithms. Opt. Express 2016, 24, 25641–25646.
  9. Do, T.H.; Yoo, M. Performance Analysis of Visible Light Communication Using CMOS Sensors. Sensors 2016, 16, 309.
  10. Kim, S.M.; Jeon, J.B. Experimental Demonstration of 4 × 4 MIMO Wireless Visible Light Communication Using a Commercial CCD Image Sensor. J. Inf. Commun. Converg. Eng. 2012, 10, 220–224.
  11. Haas, H. Wireless Data from Every Light Bulb. Available online: https://www.ted.com/talks/harald_haas_wireless_data_from_every_light_bulb (accessed on 1 October 2023).
  12. Chavez-Burbano, P.; Rabadan, J.; Guerra, V.; Perez-Jimenez, R. Flickering-Free Distance-Independent Modulation Scheme for OCC. Electronics 2021, 10, 1103.
  13. Chen, H.; Lai, X.Z.; Chen, P.; Liu, Y.T.; Yu, M.Y.; Liu, Z.H.; Zhu, Z.J. Quadrichromatic LED Based Mobile Phone Camera Visible Light Communication. Opt. Express 2018, 26, 17132–17144.
  14. Cahyadi, W.A.; Chung, Y.H.; Ghassemlooy, Z.; Hassan, N.B. Optical Camera Communications: Principles, Modulations, Potential and Challenges. Electronics 2020, 9, 1339.
  15. Shahjalal, M.; Hasan, M.K.; Chowdhury, M.Z.; Jang, Y.M. Smartphone Camera-Based Optical Wireless Communication System: Requirements and Implementation Challenges. Electronics 2019, 8, 913.
  16. Jung, S.Y.; Lee, J.H.; Nam, W.; Kim, B.W. Complementary Color Barcode-Based Optical Camera Communications. Wirel. Commun. Mob. Comput. 2020, 2020, 3898427.
  17. Huang, W.; Tian, P.; Xu, Z. Design and Implementation of a Real-Time CIM-MIMO Optical Camera Communication System. Opt. Express 2016, 24, 024567.
  18. Chen, Q.; Wen, H.; Deng, R.; Chen, M.; Xu, Q.; Zong, T.; Geng, K. Spaced Color Shift Keying Modulation for Camera-Based Visible Light Communication System Using Rolling Shutter Effect. Opt. Commun. 2019, 449, 19–23.
  19. Delgado Rajo, F.A.; Guerra, V.; Rabadan Borges, J.A.; Torres, J.R.; Perez-Jimenez, R. Color Shift Keying Communication System with a Modified PPM Synchronization Scheme. IEEE Photonics Technol. Lett. 2014, 26, 1851–1854.
  20. Vuong, D.; Yoo, M. Interpixel Interference Mitigation in Visible Light Communication Using Image Sensor. IEEE Access 2018, 6, 45543–45551.
  21. Dong, K.; Kong, M.; Wang, M. Error Performance Analysis for OOK Modulated Optical Camera Communication Systems. Opt. Commun. 2024, 574, 131121.
  22. Hamidnejad, E.; Gholami, A. Developing a Comprehensive Model for Underwater MIMO OCC System. Opt. Express 2023, 31, 31870–31883.
  23. Mehr, M.Y.; Driel, W.D.; Zhang, G.Q. Progress in Understanding Color Maintenance in Solid-State Lighting Systems. Engineering 2015, 1, 170–178.
  24. Kim, M.; Lee, J.; Yoon, K. An Experimental Study on Color Shift of Injection-Molded Mobile LGP Depending on Surface Micropattern. Polymers 2020, 12, 2610.
Figure 1. General view of the communication system.
Figure 2. Block diagram of the communication system.
Figure 3. Color-coded channels on the monitor screen.
Figure 4. Coordinates of color channels (transmitter).
Figure 5. Flow chart for the receiver.
Figure 6. Capturing images in OpenCV.
Figure 7. Receiver circuit Python software: (a) Label; (b) Canvas.
Figure 8. Masking with image processing.
Figure 9. Sample code for masking.
Figure 10. Monitor angle.
Figure 11. Size and color values of transmitter channels.
Figure 12. Overview of the test system.
Figure 13. Test scenario for detecting the number of erroneous bits.
Figure 14. Variation in the number of erroneous bits depending on data rate and distance.
Figure 15. Distance between channels: (a) d = 0 pixels, (b) d1 = 40 pixels, (c) d2 = 80 pixels.
Figure 16. CIE xy chromaticity diagram.
Table 1. Number of erroneous bits depending on the data rate (RGB).

Data Rate (bps)   Frames per Second   Transmission Time of 2400 Bits (s)   Erroneous Bits (120 cm)   Erroneous Bits (220 cm)   Erroneous Bits (320 cm)
48                1                   50                                   0 bit                     2 bit                     13 bit
240               5                   10                                   0 bit                     9 bit                     33 bit
480               10                  5                                    6 bit                     42 bit                    48 bit
960               20                  2.5                                  196 bit                   707 bit                   910 bit
Table 2. Number of erroneous bits depending on the data rate (HSV).

Data Rate (bps)   Frames per Second   Transmission Time of 2400 Bits (s)   Erroneous Bits (120 cm)   Erroneous Bits (220 cm)   Erroneous Bits (320 cm)
48                1                   50                                   0 bit                     0 bit                     0 bit
240               5                   10                                   0 bit                     2 bit                     7 bit
480               10                  5                                    4 bit                     18 bit                    34 bit
960               20                  2.5                                  154 bit                   643 bit                   832 bit
Table 3. Color coordinates (x, y) depending on the distance between channels. Each entry gives the R, G, B values read by the camera from the blue center channel and the corresponding chromaticity coordinates.

Surrounding Color   d = 0 pixels                        d = 20 pixels                       d = 40 pixels
                    R, G, B         x        y          R, G, B         x        y          R, G, B         x        y
Black               24, 19, 250     0.1534   0.0640     21, 20, 246     0.1532   0.0651     20, 17, 255     0.1526   0.0639
White               140, 121, 255   0.2231   0.1654     100, 95, 249    0.1947   0.1305     66, 73, 255     0.1721   0.0997
Red                 166, 103, 255   0.2398   0.1553     112, 115, 236   0.2120   0.1651     75, 61, 255     0.1734   0.0924
Green               30, 143, 255    0.1845   0.1763     27, 123, 244    0.1792   0.1571     34, 80, 255     0.1648   0.1015
Blue                11, 7, 255      0.1512   0.0616     13, 17, 255     0.1518   0.0635     13, 10, 255     0.1515   0.0622
Yellow              175, 191, 255   0.2588   0.2551     93, 116, 247    0.1970   0.1540     53, 61, 255     0.1652   0.0882
Purple              134, 72, 245    0.2153   0.1225     96, 117, 251    0.1974   0.1531     50, 32, 255     0.1602   0.0719
Cyan                85, 140, 222    0.2104   0.2076     28, 119, 255    0.1756   0.1444     20, 63, 255     0.1587   0.0862