1. Introduction
Currently, analog meters are still widely utilized in industrial production and laboratory equipment for data measurement and monitoring [1]. Despite advancements in digital technology and automation, analog meters remain indispensable in various fields, particularly in factory settings where manual reading of meter values continues to be a critical task [2,3].
However, this process is time-consuming and labor-intensive, especially in low-light environments such as nighttime or cloudy weather. In addition, the reduced visibility of the gauges can increase the likelihood of misreading or missing critical data points, potentially compromising the accuracy of the results.
Among analog meters, the dial gauge is a precision instrument commonly employed in a variety of tests, including creep testing. Creep tests evaluate the strength of materials under sustained stress by measuring deformation over time until the material breaks [4].
Figure 1 illustrates a fundamental setup of a creep testing system, showing the material under sustained stress. The three colored points at the bottom represent the rollers that contact the specimen and the arrows indicate the directions of the applied load and the corresponding reactions. Essentially, the middle colored point applies the force on the specimen, while the two outer colored points provide support.
In such tests, dial gauges [5] are used to measure material deformation, where they present several challenges. First, they lack the capability to output digital data, requiring experimenters to manually extract readings from recorded video footage, which is time-consuming and error-prone. Second, under low-light conditions, such as nighttime or cloudy weather, the visibility of a dial gauge is significantly reduced, leading to misread or missed data points and compromising measurement accuracy. Third, when the main needle of a dial gauge overlaps with the sub-needle, visual distinction becomes difficult, further increasing the risk of misinterpretation. Lastly, the determination of material failure currently relies on manual observation, which can delay the response and introduce variability into the evaluation process. These challenges collectively hinder the efficiency and reliability of creep test data collection and analysis.
Figure 2 shows a standard dial gauge.
To address the identified challenges in automating creep test data collection, we propose an innovative system employing Raspberry Pi [6] and a web camera to digitize analog dial gauge readings through image processing techniques. The system first utilizes the Hough transform [7] and edge detection [8] to identify the needle and calculate its angle. Then, the detected needle angle is converted into digital measurement data. To enhance accuracy, several image processing techniques such as dilation, erosion [9], and masking are applied to suppress background noise and mitigate interference from the sub-pointer. Additionally, the system integrates a smart lighting mechanism to ensure reliable performance under low-light conditions by dynamically activating and deactivating supplementary lighting based on gauge visibility.
For evaluation of the proposed system, we confirmed its effectiveness through extensive experiments under diverse lighting conditions, demonstrating improvements in measurement accuracy, lighting efficiency, and energy consumption. The proposed needle detection method achieved an average error of 0.00100 mm, which supports its sufficient precision in digitizing analog readings.
The implemented smart lighting mechanism effectively reduced the supplementary lighting duration, significantly decreasing energy consumption. In long-duration creep tests, especially those spanning several days or weeks, leaving the supplementary light on continuously can result in substantial cumulative power usage, particularly when multiple test systems are running in the same laboratory. Specifically, lighting time was reduced by 30.0% under the same experimental conditions compared to the conventional method that keeps the light constantly on. This reduction not only demonstrates the system's energy efficiency but also helps to minimize visual disturbance in shared laboratory environments. The system thus ensures robust performance in automating creep test data collection, maintaining consistent image quality under varying ambient lighting while providing precise measurements, sustainable operation, and adaptability across environmental conditions, making it highly suitable for practical applications.
The rest of this paper is organized as follows: Section 2 reviews related works on automated data collection and IoT-based solutions for mechanical testing instruments. Section 3 discusses the challenges in monitoring creep test instruments, focusing on manual monitoring issues and the importance of reliable data collection. Section 4 presents the proposed creep test assisting system, covering system architecture, image processing for needle detection, smart lighting, data processing, database integration, and user interface design. Section 5 evaluates the system through experiments and performance analysis. Section 6 concludes the paper and discusses future work.
4. Proposed Creep Test Assisting System
In this section, we present the design and functionality of the creep test assisting system. This system integrates sensor data collection, image processing, and smart decision-making to automate and enhance the efficiency of creep tests.
4.1. System Architecture
The system architecture is designed to provide a cost-effective modular approach for automating dial gauge monitoring. It integrates hardware and software components to ensure seamless operation and data reliability.
4.1.1. Hardware
Figure 3 illustrates the architecture of the creep test assisting system. The overall workflow of the system begins with the hardware components. The system primarily consists of a dial gauge, a web camera, and Raspberry Pi. The dial gauge is responsible for recording the material's micro-deformation in real time during the creep test under applied stress. To ensure the complete recording of its changes during each measurement cycle, the web camera is fixed directly in front of the gauge. The web camera is configured to capture high-resolution images at scheduled intervals, transmitting the images to Raspberry Pi via a USB interface for subsequent processing.
4.1.2. Image Processing
Upon entering the image processing module, the captured raw images are first converted to grayscale, reducing interference from unnecessary color information and improving processing efficiency. Next, Gaussian blur [24] is applied to the images to eliminate noise caused by light fluctuations or equipment vibrations. The denoised images are processed with Canny edge detection to extract critical features, including the scale circle and needle outline. Subsequently, the system employs the Hough transform algorithm to accurately locate the dial gauge's center and the edges of the scale circle, calculating the needle's angular position. This angular position is then mapped to the scale range of the dial gauge to derive numerical measurement values, ensuring highly accurate readings even in complex environments.
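To make the pipeline concrete, the following Python/OpenCV sketch chains the grayscale, blur, and edge detection stages described above; the blur kernel size and Canny thresholds are illustrative assumptions rather than the exact settings used in our system.

```python
import cv2

def extract_edges(image_path):
    """Sketch of the front end of the image processing pipeline:
    grayscale -> Gaussian blur -> Canny edges."""
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # drop unnecessary color information
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)   # suppress noise from light flicker/vibration
    edges = cv2.Canny(blurred, 50, 150)           # extract scale circle and needle outline
    return edges
```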
4.1.3. Smart Lighting
To ensure the image clarity in low-light conditions, the system incorporates a smart lighting module. By connecting to a light intensity sensor, Raspberry Pi continuously monitors the ambient light level in the laboratory and analyzes the brightness histogram of the captured images. When the light intensity sensor detects insufficient lighting or the image brightness is deemed inadequate, the system automatically activates the auxiliary LED light to provide uniform and stable illumination for the dial gauge. This process continues until the lighting conditions meet the requirements for accurate image processing.
4.1.4. Data Processing
The data processing module serves as the core of the decision-making capability. This module receives real-time measurement data from the image processing module and performs time-series analysis to record the stress and strain variations over time. Based on predefined fracture criteria, the system detects material fractures by identifying sharp drops in stress. Upon detecting such events, the system triggers a control signal to terminate the experiment and sends an experimental termination notification to laboratory personnel via the LINE messaging platform. This notification includes the experimental time, the measured value at the time of fracture, and the corresponding experimental image, enabling laboratory personnel to respond promptly and take further action.
4.1.5. Data Storage
All the experimental data, including timestamps, measured values, ambient light intensity, and image processing results, are stored in the local database of the data storage module. This database ensures secure and rapid data storage and retrieval, avoids the potential risks of network dependency, and supports offline access.
4.1.6. User Interface
The user interface module provides a platform for interactions between the system and the laboratory personnel. The user interface can be accessed through web or mobile applications, displaying real-time gauge readings, stress–strain variation curves, and environmental parameters in intuitive formats. Additionally, the interface supports browsing historical data and allows experimental data to be exported in standard formats for subsequent analysis or report generation.
4.2. Image Processing for Needle Detection
The methodology for reading the dial gauge is described here, including dial detection, image preprocessing, noise reduction, the Hough transform for needle detection, and the conversion of the detected needle angle into the measurement value.
4.2.1. Hough Transform for Circle Detection
Before applying the image preprocessing and the needle detection, the position and boundary of the dial are extracted using the Hough circle transform [25], which is a geometry-based detection method that identifies a circle by searching for the center and radius in the parameter space.
The Hough circle transform is defined by the following equation in the parameter space:

(x − a)² + (y − b)² = r²,

where (a, b) represents the circle center coordinates and r stands for the radius. To represent a circle in the parameter space, this equation can be rewritten as

a = x − r cos θ,  b = y − r sin θ.

By iterating this equation over each edge point (x, y) in the image with a different radius value r and recording the votes for each (a, b, r), the point with the highest accumulated votes is selected as the optimal circle center and radius.
To detect the dial gauge circle, the edge detection algorithm is first applied to the input image to extract salient edge points, ensuring that the input data for the circle detection are geometrically well defined. During the parameterized search phase, all possible circle combinations are examined within a predefined radius range (100 to 300 pixels) with a step size of 20 pixels. Each edge point is mapped to the parameter space based on potential radius values, generating a distribution of accumulated votes. The optimal combination of the center and radius is determined by selecting the highest-voted point in the parameter space. These detected optimal parameters are then used to crop the image and extract the dial region while recording the center coordinates and radius for further analysis. As shown in Figure 4, our program effectively detects the circle region for the dial gauge, providing a reliable basis for subsequent processing. The green circle highlights the detected dial gauge region used for subsequent image processing. The label above the gauge is written in Japanese, indicating the manufacturer's name.
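A minimal sketch of this step using OpenCV's built-in Hough circle transform is shown below. The radius bounds follow the 100 to 300 pixel range stated above, while dp, minDist, and the Canny/accumulator parameters (param1, param2) are assumed values; note that cv2.HoughCircles performs its own internal edge detection, so it approximates rather than reproduces our exact search procedure.

```python
import cv2
import numpy as np

def detect_dial(gray):
    """Locate the dial with the Hough circle transform and return
    (center_x, center_y, radius), or None if no circle is found."""
    blurred = cv2.medianBlur(gray, 5)  # reduce spurious edges before voting
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1,
                               minDist=200, param1=100, param2=50,
                               minRadius=100, maxRadius=300)
    if circles is None:
        return None
    x, y, r = np.round(circles[0, 0]).astype(int)  # highest-voted circle
    return x, y, r
```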
4.2.2. Preprocessing and Noise Reduction
To ensure the accurate detection of the primary needle and to minimize interference from secondary needles and background noise, the input image of the detected dial undergoes the following preprocessing steps. (1) The input image is converted from a color image to a grayscale image, and all areas except for the dial are cropped out to reduce unnecessary information. (2) The grayscale image undergoes binarization, setting each pixel value to 0 (black) or 255 (white) based on a set threshold and generating a high-contrast edge image. This step emphasizes the structural features of the needle and scale while eliminating redundant background noise. As shown in Figure 5a, the background is effectively removed after binarization, leaving the needle and dial structure more distinct. (3) Dilation and erosion operations are then applied to the binarized image. These steps further enhance the needle's contour and reduce false edges caused by noise or secondary needles:
Dilation: Expands the highlighted regions to fill gaps in the image, making the needle’s contour more continuous.
Erosion: Shrinks the highlighted regions to remove small noise points while refining the needle’s actual contour.
The dilation and erosion operations use kernels of size 3 × 3 and 7 × 7, respectively. This parameter selection achieves a balance between detection accuracy and computational efficiency. As shown in Figure 5b, the integrity of the needle edge is significantly improved after these operations.
Finally, a circular mask is applied to limit the analysis area to the center of the dial. The mask radius is calculated from h and w, the height and width of the cropped image containing only the dial. The masked image sets the areas outside the dial region to black, effectively eliminating background interference. As shown in Figure 5c, the processed image retains only the center region of the dial, free of extraneous background disturbances. With these preprocessing steps completed, the image is ready for needle detection.
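The preprocessing chain can be sketched as follows; the binarization threshold and the mask-radius fraction are assumed values, since the exact settings depend on the gauge and camera setup.

```python
import cv2
import numpy as np

def preprocess_dial(dial_bgr):
    """Grayscale, binarization, dilation/erosion, and a circular mask
    over the dial center, as described in the text."""
    gray = cv2.cvtColor(dial_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 100, 255, cv2.THRESH_BINARY_INV)  # dark needle becomes white
    binary = cv2.dilate(binary, np.ones((3, 3), np.uint8))  # 3x3: close gaps in the needle contour
    binary = cv2.erode(binary, np.ones((7, 7), np.uint8))   # 7x7: remove small noise points
    h, w = binary.shape
    mask = np.zeros_like(binary)
    radius = min(h, w) // 4                                  # assumed fraction of the dial size
    cv2.circle(mask, (w // 2, h // 2), radius, 255, thickness=-1)
    return cv2.bitwise_and(binary, mask)                     # black out everything off-center
```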
4.2.3. Hough Transform for Needle Detection
After locating the dial using the Hough circle transform, the next step involves detecting the needle's position on the dial using the Hough line transform. The detection stability and accuracy are further enhanced through geometric filtering and feature analysis.
For line detection, the Hough line transform maps the edge points from Cartesian coordinates to the parameter space, identifying potential lines there. In Cartesian coordinates, a line can be expressed as

y = ax + b,

where a is the line's slope and b is its y-intercept. However, this representation fails for vertical lines (x = c), where the slope a becomes undefined. The Hough line transform therefore uses the polar representation

ρ = x cos θ + y sin θ,

where ρ represents the perpendicular distance from the origin to the line and θ stands for the angle between the perpendicular and the horizontal axis. Edge points (x, y) from the image are transformed into the parameter space, where the intersections of sinusoidal curves indicate the existence of a line. The number of intersections corresponds to the line's likelihood.
In practice, the OpenCV function HoughLines() is used, where a voting threshold is set to select reliable lines. Higher vote counts correspond to longer and more stable lines. As shown in Figure 5d, the system successfully detects the needle's edges using the Hough transform, which serves as a basis for further analysis. Additionally, the blue lines in the figure represent the extensions of the detected needle edges.
To improve the line detection accuracy, the geometric filtering and the angle analysis are applied as follows:
First, eight candidate lines are detected using the Hough transform instead of the traditional two-line approach. This parameter selection is based on experimental observations, where six to eight edge lines are typically detected in the needle region of the dial. Increasing the number of candidate lines ensures that more potential needle edges are captured, reducing the risk of missing critical information. However, the number of candidate lines is limited to eight to prevent noise from introducing false positives. As shown in Figure 6a, our method successfully identifies eight candidate lines in the needle region, indicated by red lines in the figure.
Next, geometric filtering is applied to all candidate line combinations. Specifically, the midline of each pair of candidate lines is calculated, and it is verified whether this midline passes through the center of the dial. Additionally, the intersection points of the candidate lines are checked to ensure that they lie within the dial region. Only line pairs meeting these geometric criteria are considered potential needle outlines. As shown in Figure 6b, our program successfully identifies midlines that pass through the dial center and meet the geometric filtering conditions, as indicated by the blue lines in the figure.
Finally, the angle analysis is performed on the filtered line pairs. Experimental results show that the angle between the two edges of the needle is typically close to a fixed value, so the line pairs whose angles are close to this value are retained and treated as the needle's outlines. The average angle of these line pairs is then computed to determine the needle's actual position. As shown in Figure 6c, the final needle outlines are accurately identified based on the geometric and angle filtering criteria, as indicated by the red lines in the figure.
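The angle analysis over the candidate (ρ, θ) pairs can be sketched as follows. The typical edge angle (4° here) and the tolerance are hypothetical placeholders for the experimentally determined values, and the center/intersection checks of the geometric filtering step are omitted for brevity; averaging θ values also ignores wraparound near 0°/180°, which a full implementation would handle.

```python
import itertools
import numpy as np

def pick_needle_angle(lines, edge_angle_deg=4.0, tol_deg=2.0):
    """Keep line pairs whose mutual angle is close to the needle's
    typical edge angle and return the average midline angle in degrees."""
    needle_angles = []
    for (_, t1), (_, t2) in itertools.combinations(lines, 2):
        between = abs(np.degrees(t1 - t2))
        between = min(between, 180.0 - between)              # line angles are modulo 180 degrees
        if abs(between - edge_angle_deg) <= tol_deg:
            needle_angles.append(np.degrees((t1 + t2) / 2))  # midline of the two edges
    return float(np.mean(needle_angles)) if needle_angles else None
```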
4.2.4. Angle-Based Measurement Conversion Method
After detecting the needle, the detected needle angle is converted into the actual measurement value. The detailed calculation process is as follows.
First, the initial angle θ₀ of the needle is recorded at the start of the experiment when the needle points to the "0" scale. For each detected angle θ, the difference Δθ between these two angles is calculated using the following formula:

Δθ = θ − θ₀.

Next, the angle difference Δθ is converted into an actual measurement value L using the following equation:

L = (Δθ / 360°) × 1 mm,

since one full rotation of the main needle corresponds to a deformation of 1 mm.
Additionally, special conditions are considered. During prolonged experiments, the needle may rotate more than one full circle. To address such scenarios, an automatic correction mechanism is incorporated: when the accumulated needle angle exceeds 360°, the system adds 1 mm to the current measurement value to account for the additional rotation.
Furthermore, in cases where material rupture occurs during the experiment, the needle typically points to the dial's maximum scale value. The final measurement value for such cases is calculated from this maximum reading and l, the initial deformation value recorded at the start of the experiment.
These methods ensure the accuracy and reliability of measurements under various conditions, providing a clear basis for data processing and analysis.
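A compact sketch of this conversion is shown below, assuming angles are given in degrees and that the number of completed full rotations is tracked by the caller.

```python
def angle_to_length(theta_deg, theta0_deg, rotations=0):
    """Convert a detected needle angle into a length reading in mm.
    One full rotation of the main needle corresponds to 1 mm, as
    stated in the text; 'rotations' counts completed full turns."""
    delta = (theta_deg - theta0_deg) % 360.0  # wrap the angle difference into [0, 360)
    return rotations * 1.0 + delta / 360.0    # measurement value L in mm
```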
4.2.5. Material Fracture Detection
Figure 7 shows the states of the test machine before the material fracture.
The dial gauge indirectly measures material stress changes by detecting the slight vertical movement of the test machine's spindle caused by material deformation. When the specimen fractures, as shown in Figure 8, the test machine is separated from the dial gauge and comes to a complete stop. By monitoring the positional changes in the machine's spindle, the fracturing of the specimen is detected.
To achieve real-time detection, a prominent pink marker is affixed to the contact point between the test machine and the dial gauge. By processing the captured images, the system extracts the marker region and analyzes its motion trajectory to determine whether the specimen has fractured.
Specifically, the captured RGB [26] images are converted to the HSV [27] color space, which enables a more intuitive separation of color and brightness information. The system defines a precise HSV value range for the pink marker, a color chosen for its high contrast against typical laboratory backgrounds. To optimize noise removal, the HSV range was fine-tuned through iterative testing under different lighting conditions, ensuring reliable detection under variable brightness levels.
After creating the mask, the system analyzes the pixel distribution of the marker region and calculates its centroid position. By monitoring changes in the centroid’s coordinates, the system determines whether the spindle of the test machine has separated from the dial gauge. When a significant displacement of the centroid is detected and sustained over a period of time, the system concludes that the specimen has fractured.
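The marker extraction and centroid computation can be sketched as follows; the HSV bounds for the pink marker are assumed values standing in for the experimentally tuned range.

```python
import cv2
import numpy as np

PINK_LOW = np.array([150, 80, 80])     # assumed HSV lower bound for the pink marker;
PINK_HIGH = np.array([175, 255, 255])  # the actual range was tuned experimentally

def marker_centroid(frame_bgr):
    """Extract the pink marker region and return its centroid (cx, cy),
    or None when the marker is not visible."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, PINK_LOW, PINK_HIGH)  # binary mask of the marker region
    m = cv2.moments(mask)
    if m["m00"] == 0:                             # no marker pixels found
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]
```

A sustained jump in the returned y-coordinate, held over several consecutive frames, would then signal spindle separation as described above.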
4.3. Smart Lighting
This section outlines an intelligent lighting function in the proposed system for dynamically controlling the supplementary lighting according to ambient light conditions.
In laboratory environments, an intelligent lighting function can significantly enhance experimental efficiency and precision while promoting efficient energy utilization. During daylight hours, the web camera can clearly capture the dial gauge image without additional lighting. However, under insufficient lighting conditions at night or on cloudy days, the proper activation of supplementary lighting is essential to ensure accurate recognition of the dial gauge and needle.
To address this issue, an intelligent lighting function is designed for the creep test assisting system. It monitors ambient lighting conditions in real time and dynamically adjusts the supplementary light's operation. By maintaining high precision in experimental data collection under varying lighting conditions, the system effectively minimizes energy waste and reduces interference with other laboratory experiments.
To achieve the intelligent lighting control in this laboratory, the function combines a light sensor and a camera to ensure reliable and accurate decision-making. First, the light sensor, connected to Raspberry Pi via a Monostick [28] USB dongle, continuously collects real-time ambient light intensity data in the laboratory. Raspberry Pi compares the received light intensity data with a predefined threshold, which was determined through experiments under various laboratory lighting conditions. Experimental results showed that 300 lux [29] and an average RGB value of 180 provided optimal visibility for needle detection, ensuring high accuracy and minimal energy consumption.
Simultaneously, the camera captures real-time images of the dial gauge and converts them into grayscale images. The average RGB values of the pixels in the entire image are calculated to further evaluate the lighting conditions. If the light sensor’s detected value falls below the given threshold and the camera’s computed image brightness is insufficient, the function integrates the results of both assessments to turn on the supplementary light, ensuring clear images of the dial gauge. Once the lighting conditions return to normal, the function automatically turns off the supplementary light, thereby conserving energy and reducing disruption to other experiments.
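The combined decision rule reduces to a simple conjunction of the two thresholds reported above, as sketched below; the GPIO wiring of the LED is hardware-specific and omitted here.

```python
import numpy as np

LUX_THRESHOLD = 300         # light sensor threshold from our experiments
BRIGHTNESS_THRESHOLD = 180  # average grayscale value threshold from our experiments

def supplementary_light_needed(lux, gray_image):
    """Turn the LED on only when BOTH the sensor reading and the image
    brightness indicate insufficient lighting, as described in the text."""
    return lux < LUX_THRESHOLD and np.mean(gray_image) < BRIGHTNESS_THRESHOLD
```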
By leveraging the combined functionality of the light sensor and the camera, the function can dynamically control the supplementary light under complex lighting conditions, achieving the energy-efficient and effective lighting management.
The hardware components for the intelligent lighting function are illustrated in Figure 9. The light sensor, connected to Raspberry Pi via a USB dongle, collects ambient light data and transmits it in real time. Concurrently, the camera captures images of the dial gauge, which are analyzed to further evaluate light intensity. The integrated results of these methods are processed on Raspberry Pi to control the operation of the supplementary light.
4.4. Data Processing for Decision-Making
This section explains how the system dynamically evaluates experimental conditions by simultaneously measuring RGB [26] values and illuminance (lux). The system captures real-time images of the dial gauge using the camera, calculates the average RGB values across the entire image to characterize the color intensity, and uses the light sensor to measure lux values, which represent ambient light intensity [29]. This dual measurement mechanism not only assists in identifying the color details of the dial gauge, such as determining the exact position of the pink marker, but also evaluates whether the current lighting conditions are suitable for recognizing the dial gauge and needle.
Experimental results show that when the average RGB value of an image exceeds 180, it meets the requirements. As shown in Figure 10, the three-day experimental measurements of average RGB values indicate that when the RGB value is low, images are not discernible, whereas values above 180 allow for sufficient detail recognition. Additionally, to ensure that experiments are conducted under appropriate lighting conditions, we measured variations in the light intensity in the laboratory under different scenarios. As shown in Figure 11, the laboratory's light intensity fluctuates significantly due to natural light and the intermittent operation of other experimental equipment, which may cause temporary activation or deactivation of indoor lighting.
After extensive experimental validation, it was determined that when the measured light intensity exceeds 300 lux, Raspberry Pi can accurately recognize the dial gauge and the needle position under both natural and artificial lighting conditions. Consequently, the system establishes two thresholds based on experimental results: an average RGB value of 180 and a light intensity of 300 lux, which serve as the foundation for data processing and decision-making to ensure efficient and precise experimental performance.
4.5. Database
The proposed system implements a local database on Raspberry Pi to store and manage experimental data while providing real-time data access. Using the Flask [30] web application framework, a local server is built on Raspberry Pi. Experimental data are stored in the CSV format and made accessible to users via a web interface. Additionally, the server allows users to select specific dates or files for data visualization, dynamically generating corresponding line charts. This feature improves the efficiency of data management and enables experimenters to quickly monitor experimental progress.
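A minimal Flask sketch of such a local server is given below; the CSV file name and column names are hypothetical placeholders, and the browser-side chart rendering is omitted.

```python
from flask import Flask, send_file
import pandas as pd

app = Flask(__name__)
DATA_FILE = "creep_log.csv"  # hypothetical log file written by the measurement loop

@app.route("/data")
def download_data():
    """Serve the raw experimental data file for offline analysis."""
    return send_file(DATA_FILE, as_attachment=True)

@app.route("/chart/<date>")
def chart_data(date):
    """Return one day's measurements as JSON for a line chart (date: YYYY-MM-DD)."""
    df = pd.read_csv(DATA_FILE, parse_dates=["timestamp"])  # assumed column name
    day = df[df["timestamp"].dt.strftime("%Y-%m-%d") == date]
    return day.to_json(orient="records")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)  # reachable from other machines in the laboratory
```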
4.6. User Interface
The user interface consists of two components: a web interface and LINE Notify [31] notifications, which provide, respectively, real-time visualizations of experimental data and remote notifications of significant events.
Through the web interface, users can view the progress and results of ongoing experiments at any time and download experimental data files for offline analysis. As shown in Figure 12, users can intuitively monitor experimental progress and data trends via a browser. Experimental data are also available for download in the CSV format, facilitating further detailed analysis. Additionally, users can select specific dates or files through the web interface to generate corresponding line charts, enhancing the flexibility of data visualization.
To ensure that experimenters are always informed about the experimental status, the system employs the LINE Notify API for remote notifications. As shown in Figure 13, the system sends real-time notifications to users when the specimen fractures or the experiment concludes. The Japanese text in the figure indicates that the notification was received at 01:18. Furthermore, the system supports scheduled notifications, periodically sending updates on experimental progress and measurement results. The notification intervals can be freely configured by the user, ensuring that experimenters can stay updated on the experimental progress even when they are outside the laboratory.
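A sketch of such a notification call against the LINE Notify REST endpoint is shown below; the access token, message wording, and helper name are placeholders.

```python
import requests

TOKEN = "YOUR_LINE_NOTIFY_TOKEN"  # personal access token (placeholder)

def notify_fracture(elapsed, value_mm, image_path):
    """Send a fracture notification with the final reading and the
    corresponding experimental image via LINE Notify."""
    message = f"Specimen fractured after {elapsed}; final reading {value_mm} mm"
    with open(image_path, "rb") as img:
        requests.post(
            "https://notify-api.line.me/api/notify",
            headers={"Authorization": f"Bearer {TOKEN}"},
            data={"message": message},
            files={"imageFile": img},
            timeout=10,
        )
```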
5. Experimental Evaluation
In this section, we conduct experiments to evaluate the performance and reliability of the proposed system.
5.1. Experimental Environment
In the experimental setup, we utilize the hardware and software configurations listed in Table 1.
In our experiments, Raspberry Pi 4 Model B, equipped with a Logicool C920N web camera and a MONOSTICK USB dongle, served as the central processing unit, managing the image processing and IoT functionalities. The software configuration included Ubuntu [32] as the operating system, Python 3 [33] for the development environment, and OpenCV [34] for real-time image processing. During the creep test, the system collected measurement data at one-minute intervals from the start of the experiment until the test specimen fractured, enabling real-time fracture detection. The camera was positioned at the same height as the dial gauge and placed 20 cm away from it to ensure accurate readings. The experiments took place in the experiment room of the #1 Engineering Building at Okayama University, Okayama, Japan.
5.2. Results and Analysis
This section presents the results of two experiments conducted to evaluate the performance of the proposed creep test assistance system. These experiments lasted 155 h and 40 min, and 359 h and 4 min, respectively. Both experiments followed a similar procedure, from the material placement to the specimen rupture, with different applied stable pressures resulting in varying rupture durations. The results focus on the system’s accuracy, efficiency, robustness, and fracture detection capabilities, substantiated by detailed analysis and statistical evidence.
5.2.1. Experiment 1
During the first experiment, data were collected at one-minute intervals from the dial gauge, yielding a total of 9340 data points. The average measurement error was 0.00103 mm, indicating high precision in the needle position detection. The smart lighting function was activated 15 times under varying lighting conditions, with a total usage time of 108 h and 55 min, accounting for 69.9% of the total experimental duration.
Figure 14 provides a visual comparison between system measurements and manual readings, demonstrating their strong correlation. The result confirms that the system maintained consistent accuracy across different lighting environments, further validating the effectiveness of the smart lighting mechanism. This result also highlights the system's ability to reduce energy consumption while ensuring reliable measurement accuracy. By focusing on pre-rupture data, the system demonstrated its capability to monitor and record experimental data with minimal human intervention.
5.2.2. Experiment 2
In the second experiment, the system operated for 21,544 min (359 h and 4 min), collecting data points at the same one-minute intervals. The average measurement error was 0.00097 mm, indicating a further improvement in accuracy, likely due to the optimized image processing algorithms and long-term operational stability. The smart lighting function was activated 42 times due to fluctuating environmental conditions, with a total usage time of 251 h and 18 min, accounting for 70.1% of the total experimental duration.
Figure 15 shows a comparison between the system and manual measurements, confirming the system's consistent performance. The data reveal enhanced stability over extended durations, with error trends indicating minimal drift. The higher activation frequency of the smart lighting mechanism during this longer experiment underscores its dynamic adaptability to environmental changes. These findings highlight the system's robustness in maintaining high accuracy and energy efficiency over prolonged testing periods.
5.2.3. Fracture Detection
The fracture detection capability of the system was evaluated during the second experiment. Figure 16 illustrates the process where the "rupture state" variable transitioned from 0 (intact material) to 1 (ruptured material). During the test, the marker's y-coordinate gradually decreased from 120 px to 113 px as the material deformed. Upon rupture, the y-coordinate abruptly dropped to 33 px, triggering an update in the rupture state. This transition activated the LINE notification to the laboratory personnel, ensuring prompt awareness of the fracture event. The analysis demonstrates the system's sensitivity to rapid changes in material conditions and its ability to accurately classify rupture events. Additionally, the automated notification mechanism achieved an average response time of 5 s, further validating the system's practical applicability in laboratory environments. These results confirm the reliability of the fracture detection method described in Section 4 and highlight its effectiveness in ensuring timely interventions during critical experimental moments.
5.2.4. Summary
The final mean error of 0.00100 mm reflects the system's high accuracy across varying experimental durations.
Table 2 summarizes the key parameters and results of the experiments.
The two experimental results indicate that the creep test assisting system maintained an error below 0.0011 across both tests, meeting the laboratory precision standard. Data visualizations and statistical analysis results validate the system's long-term stability and consistent performance across both short-duration and long-duration tests. In addition, the system introduces functions that enhance its practical applicability: the smart lighting mechanism significantly reduces energy consumption by lighting only when necessary, showcasing its dynamic responsiveness and energy-saving potential; the material fracture detection function effectively identifies the transition from the intact to the ruptured state, with timely notifications ensuring rapid responses; and the overall system achieves robust automation performance without relying on costly robotic equipment. These results collectively demonstrate the system's advantages in terms of accuracy, reliability, energy efficiency, and usability for long-duration experimental monitoring in laboratory environments.
5.3. Discussion
In recent years, deep learning techniques, such as convolutional neural networks, have achieved remarkable results in image recognition tasks and have been successfully applied to gauge reading scenarios. However, our system is deployed on edge devices such as Raspberry Pi, which have limited computational capacity and power resources. Incorporating deep learning models directly may result in significant latency increases and higher energy consumption, which contradicts our design goals for real-time responsiveness and low-power operation. Furthermore, as the system must operate continuously for several days or weeks, sustained high computational loads could lead to system instability. MobileNet and Tiny-YOLO offer promising solutions for edge devices by balancing accuracy and resource usage. As a future direction, we plan to investigate the integration of such models, provided that the real-time performance of the system can be preserved.