Search Results (9,370)

Search Parameters:
Keywords = battery system

23 pages, 15956 KB  
Article
A Photovoltaic Light Sensor-Based Self-Powered Real-Time Hover Gesture Recognition System for Smart Home Control
by Nora Almania, Sarah Alhouli and Deepak Sahoo
Electronics 2025, 14(18), 3576; https://doi.org/10.3390/electronics14183576 - 9 Sep 2025
Abstract
Many gesture recognition systems with innovative interfaces have emerged for smart home control. However, these systems tend to be energy-intensive, bulky, and expensive. There is also a lack of real-time demonstrations of gesture recognition and subsequent evaluation of the user experience. Photovoltaic light sensors are self-powered, battery-free, flexible, portable, and easily deployable on various surfaces throughout the home. They enable natural, intuitive, hover-based interaction, which could create a positive user experience. In this paper, we present the development and evaluation of a real-time, hover gesture recognition system that can control multiple smart home devices via a self-powered photovoltaic interface. Five popular supervised machine learning algorithms were evaluated using gesture data from 48 participants. The random forest classifier achieved high accuracies. However, a one-size-fits-all model performed poorly in real-time testing. User-specific random forest models performed well with 10 participants, showing no significant difference in offline and real-time performance and under normal indoor lighting conditions. This paper demonstrates the technical feasibility of using photovoltaic surfaces as self-powered interfaces for gestural interaction systems that are perceived to be useful and easy to use. It establishes a foundation for future work in hover-based interaction and sustainable sensing, enabling human–computer interaction researchers to explore further applications. Full article
(This article belongs to the Special Issue Human-Computer Interaction in Intelligent Systems, 2nd Edition)

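The sketch below illustrates the kind of per-user classification pipeline this abstract describes: hand-crafted features from a light-sensor voltage trace fed to a random forest. The feature set, synthetic gesture data, and parameters are assumptions for illustration, not the authors' pipeline.

```python
# Illustrative sketch (not the authors' code): a per-user random forest on
# hover-gesture feature vectors extracted from a photovoltaic light-sensor trace.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def extract_features(trace):
    """Simple hand-crafted features from one voltage trace (hypothetical)."""
    return np.array([trace.mean(), trace.std(), trace.min(), trace.max(),
                     np.argmin(trace) / len(trace)])

# Synthetic stand-in for one participant's recordings: 4 gestures x 50 samples.
X, y = [], []
for gesture in range(4):
    for _ in range(50):
        t = np.linspace(0, 1, 200)
        dip = np.exp(-((t - 0.2 - 0.2 * gesture) ** 2) / 0.01)   # hover shadow
        trace = 1.0 - 0.5 * dip + 0.02 * rng.standard_normal(t.size)
        X.append(extract_features(trace))
        y.append(gesture)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("per-user offline accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```
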
70 pages, 6601 KB  
Review
A Comparative Study of Waveforms Across Mobile Cellular Generations: From 0G to 6G and Beyond
by Farah Arabian and Morteza Shoushtari
Telecom 2025, 6(3), 67; https://doi.org/10.3390/telecom6030067 - 9 Sep 2025
Abstract
Waveforms define the shape, structure, and frequency characteristics of signals, whereas modulation schemes determine how information symbols are mapped onto these waveforms for transmission. Their appropriate selection plays a critical role in determining the efficiency, robustness, and reliability of data transmission. In wireless communications, the choice of waveform influences key factors, such as network capacity, coverage, performance, power consumption, battery life, spectral efficiency (SE), bandwidth utilization, and the system’s resistance to noise and electromagnetic interference. This paper provides a comprehensive analysis of the waveforms and modulation schemes used across successive generations of mobile cellular networks, exploring their fundamental differences, structural characteristics, and trade-offs for various communication scenarios. It also situates this analysis within the historical evolution of mobile standards, highlighting how advances in modulation and waveform technologies have shaped the development and proliferation of cellular networks. It further examines criteria for waveform selection—such as SE, bit error rate (BER), throughput, and latency—and discusses methods for assessing waveform performance. Finally, this study presents a comparative evaluation of modulation schemes across multiple mobile generations, focusing on key performance metrics, with the BER analysis conducted through MATLAB simulations. Full article
(This article belongs to the Special Issue Advances in Wireless Communication: Applications and Developments)

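As a minimal stand-in for the BER analysis the review performs in MATLAB, the following Python sketch runs a Monte Carlo BER simulation for Gray-coded QPSK over AWGN and compares it with the closed-form expression. The modulation choice and parameters are assumptions, not the paper's setup.

```python
# Monte Carlo BER for Gray-coded QPSK over AWGN vs. the theory BER = Q(sqrt(2*Eb/N0)).
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(1)
n_bits = 200_000

for ebn0_db in (0, 2, 4, 6, 8):
    ebn0 = 10 ** (ebn0_db / 10)
    bits = rng.integers(0, 2, n_bits)
    # Map bit pairs to QPSK symbols (Gray mapping), unit average symbol energy.
    i = 1 - 2 * bits[0::2]
    q = 1 - 2 * bits[1::2]
    symbols = (i + 1j * q) / sqrt(2)
    n0 = 0.5 / ebn0                       # Eb = Es/2 = 0.5 for unit-energy symbols
    noise = sqrt(n0 / 2) * (rng.standard_normal(symbols.size)
                            + 1j * rng.standard_normal(symbols.size))
    r = symbols + noise
    bits_hat = np.empty(n_bits, dtype=int)
    bits_hat[0::2] = (r.real < 0).astype(int)
    bits_hat[1::2] = (r.imag < 0).astype(int)
    ber_sim = np.mean(bits_hat != bits)
    ber_theory = 0.5 * erfc(sqrt(ebn0))
    print(f"Eb/N0 = {ebn0_db} dB  simulated BER = {ber_sim:.4f}  theory = {ber_theory:.4f}")
```
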
16 pages, 2444 KB  
Article
Energy Consumption Analysis and Thermal Equilibrium Research of High-Voltage Lithium Battery Electric Forklifts
by Xia Wu, Junyi Chen, Tianliang Lin, Zhongshen Li, Cheng Miao and Wen Gong
Appl. Sci. 2025, 15(18), 9854; https://doi.org/10.3390/app15189854 - 9 Sep 2025
Abstract
With the escalation of global warming and environmental pollution, electric products characterized by zero emissions, low vibration, and minimal pollution are increasingly favored by consumers. As a pivotal loading and transportation tool, the electrification of forklifts progressed earlier and is relatively mature. However, the prevalent low-voltage systems (72 V or 80 V) in current electric forklifts exhibit issues such as elevated heat loss, restricted motor instantaneous power due to voltage constraints, susceptibility to electrical erosion, and challenges in achieving rapid charging. To address these challenges, a powertrain solution employing high-voltage lithium batteries (320 V) as energy storage units for electric forklifts is proposed. The key parameters of the high-voltage lithium battery were meticulously calculated and selected. The powertrain architecture of the high-voltage lithium battery electric forklift was analyzed, and operational conditions were thoroughly examined. To verify the superior energy efficiency performance of the proposed high-voltage electric forklift in comparison to its low-voltage counterparts, a test prototype was constructed, and comprehensive tests, including average energy consumption and thermal equilibrium assessments, were conducted. The test results demonstrated that under average energy consumption conditions, the operational duration ranged from 8.89 to 13.34 h, surpassing the 7.5 h achieved by low-voltage electric forklifts. The thermal equilibrium temperatures of all electrical control units remained below 43 °C, significantly lower than the 80 °C shutdown protection threshold allowed for low-voltage forklifts. These findings indicate that the proposed high-voltage lithium battery electric forklift exhibits relatively low energy consumption, significantly enhances overall operational efficiency, and ensures stable operation, providing a viable solution and reference for the electrification of forklifts and other construction machinery. Full article
(This article belongs to the Section Mechanical Engineering)

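The reported 8.89–13.34 h runtimes follow from the usual energy balance, runtime = usable pack energy / average power draw. The sketch below reproduces that arithmetic with assumed pack and load figures; the paper's measured values are not given here.

```python
# Back-of-envelope runtime estimate: hours = usable energy / average power.
# All numbers below are assumptions for illustration, not values from the paper.
pack_voltage_v = 320          # nominal high-voltage pack (per the abstract)
pack_capacity_ah = 330        # assumed pack capacity
usable_fraction = 0.9         # assumed usable depth of discharge

usable_energy_kwh = pack_voltage_v * pack_capacity_ah * usable_fraction / 1000

for avg_power_kw in (7.0, 9.0, 11.0):   # assumed average draw over the duty cycle
    print(f"avg power {avg_power_kw:4.1f} kW -> runtime {usable_energy_kwh / avg_power_kw:5.2f} h")
```
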
1612 KB  
Review
Machine Learning-Based Electric Vehicle Charging Demand Forecasting: A Systematized Literature Review
by Maher Alaraj, Mohammed Radi, Elaf Alsisi, Munir Majdalawieh and Mohamed Darwish
Energies 2025, 18(17), 4779; https://doi.org/10.3390/en18174779 - 8 Sep 2025
Abstract
The transport sector significantly contributes to global greenhouse gas emissions, making electromobility crucial in the race toward the United Nations Sustainable Development Goals. In recent years, the increasing competition among manufacturers, the development of cheaper batteries, the ongoing policy support, and people’s greater environmental awareness have consistently increased electric vehicles (EVs) adoption. Nevertheless, EVs charging needs—highly influenced by EV drivers’ behavior uncertainty—challenge their integration into the power grid on a massive scale, leading to potential issues, such as overloading and grid instability. Smart charging strategies can mitigate these adverse effects by using information and communication technologies to optimize EV charging schedules in terms of power systems’ constraints, electricity prices, and users’ preferences, benefiting stakeholders by minimizing network losses, maximizing aggregators’ profit, and reducing users’ driving range anxiety. To this end, accurately forecasting EV charging demand is paramount. Traditionally used forecasting methods, such as model-driven and statistical ones, often rely on complex mathematical models, simulated data, or simplifying assumptions, failing to accurately represent current real-world EV charging profiles. Machine learning (ML) methods, which leverage real-life historical data to model complex, nonlinear, high-dimensional problems, have demonstrated superiority in this domain, becoming a hot research topic. In a scenario where EV technologies, charging infrastructure, data acquisition, and ML techniques constantly evolve, this paper conducts a systematized literature review (SLR) to understand the current landscape of ML-based EV charging demand forecasting, its emerging trends, and its future perspectives. The proposed SLR provides a well-structured synthesis of a large body of literature, categorizing approaches not only based on their ML-based approach, but also on the EV charging application. In addition, we focus on the most recent technological advances, exploring deep-learning architectures, spatial-temporal challenges, and cross-domain learning strategies. This offers an integrative perspective. On the one hand, it maps the state of the art, identifying a notable shift toward deep-learning approaches and an increasing interest in public EV charging stations. On the other hand, it uncovers underexplored methodological intersections that can be further exploited and research gaps that remain underaddressed, such as real-time data integration, long-term forecasting, and the development of adaptable models to different charging behaviors and locations. In this line, emerging trends combining recurrent and convolutional neural networks, and using relatively new ML techniques, especially transformers, and ML paradigms, such as transfer-, federated-, and meta-learning, have shown promising results for addressing spatial-temporality, time-scalability, and geographical-generalizability issues, paving the path for future research directions. Full article
(This article belongs to the Topic Electric Vehicles Energy Management, 2nd Volume)
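
A typical ML formulation surveyed in such reviews casts charging demand forecasting as sliding-window regression: a window of past hourly demand predicts the next hour. The sketch below shows that setup on synthetic data with a placeholder model; it does not reproduce any specific reviewed method.

```python
# Sliding-window forecasting sketch: past 24 hourly demand values -> next-hour demand.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(2)
hours = np.arange(24 * 120)                                  # ~4 months, hourly
demand = 50 + 30 * np.sin(2 * np.pi * hours / 24) + 5 * rng.standard_normal(hours.size)

window = 24
X = np.stack([demand[i:i + window] for i in range(len(demand) - window)])
y = demand[window:]

split = int(0.8 * len(X))
model = GradientBoostingRegressor().fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("test MAE [kW]:", round(mean_absolute_error(y[split:], pred), 2))
```
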
4246 KB  
Article
PI-Based Current Constant Control with Ripple Component for Lifetime Extension of Lithium-Ion Battery
by Min-Ho Shin, Jin-Ho Lee and Jehyuk Won
Electronics 2025, 14(17), 3566; https://doi.org/10.3390/electronics14173566 - 8 Sep 2025
Abstract
This paper presents a proportional–integral (PI) control-based charging strategy that introduces a ripple component into the constant-current (CC) charging profile to regulate battery temperature and improve long-term performance. The proposed method is implemented within an on-board charger (OBC), where the ripple amplitude is adaptively adjusted based on battery temperature and internal resistance. While most prior studies focus on electrochemical characteristics, this work highlights the importance of analyzing current profiles from a power electronics and converter control perspective. The ripple magnitude is controlled in real time through gain tuning of the PI current controller, allowing temperature-aware charging. To validate the proposed method, experiments were conducted using a 11 kW OBC system and 70 Ah lithium-ion battery to examine the correlation between ripple amplitude and battery temperature rise, as well as its impact on internal resistance. The control strategy was evaluated under various thermal conditions and shown to be effective in mitigating temperature-related degradation through ripple-based modulation. Full article

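A rough picture of the control idea: a discrete PI current loop tracks a constant-current reference with a superimposed ripple whose amplitude is scheduled on battery temperature. The plant model, gains, ripple frequency, and temperature schedule below are assumptions, not the paper's OBC implementation.

```python
# Discrete PI current loop with a temperature-scheduled ripple on the CC reference.
import math

kp, ki, dt = 0.8, 40.0, 1e-4            # assumed PI gains and control period
L, R = 2e-3, 0.15                       # assumed output-filter inductance / resistance

def ripple_amplitude(temp_c):
    """Hypothetical schedule: shrink the ripple as the pack warms."""
    return max(0.0, 5.0 * (1 - (temp_c - 25.0) / 20.0))

i_meas, integ, temp_c = 0.0, 0.0, 30.0
i_cc = 70.0                              # constant-current setpoint [A]

for k in range(2000):
    t = k * dt
    i_ref = i_cc + ripple_amplitude(temp_c) * math.sin(2 * math.pi * 120 * t)
    err = i_ref - i_meas
    integ += err * dt
    v_cmd = kp * err + ki * integ        # PI output = applied voltage [V]
    # First-order RL plant step (forward Euler) standing in for the charger stage.
    i_meas += dt * (v_cmd - R * i_meas) / L

print(f"current after {2000 * dt * 1e3:.0f} ms: {i_meas:.1f} A (target ~{i_cc} A)")
```
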
3 pages, 133 KB  
Editorial
Thermal Management in Lithium-Ion Batteries: Latest Advances and Prospects
by Xianglin Li, Chuanbo Yang and Prahit Dubey
Batteries 2025, 11(9), 335; https://doi.org/10.3390/batteries11090335 - 7 Sep 2025
Abstract
We are excited to present a Special Issue (SI) for Batteries on battery thermal management systems (BTMS) [...] Full article
33 pages, 16564 KB  
Article
Design and Implementation of an Off-Grid Smart Street Lighting System Using LoRaWAN and Hybrid Renewable Energy for Energy-Efficient Urban Infrastructure
by Seyfettin Vadi
Sensors 2025, 25(17), 5579; https://doi.org/10.3390/s25175579 - 6 Sep 2025
Abstract
The growing demand for electricity and the urgent need to reduce environmental impact have made sustainable energy utilization a global priority. Street lighting, as a significant consumer of urban electricity, requires innovative solutions to enhance efficiency and reliability. This study presents an off-grid smart street lighting system that combines solar photovoltaic generation with battery storage and Internet of Things (IoT)-based control to ensure continuous and efficient operation. The system integrates Long Range Wide Area Network (LoRaWAN) communication technology for remote monitoring and control without internet connectivity and employs the Perturb and Observe (P&O) maximum power point tracking (MPPT) algorithm to maximize energy extraction from solar sources. Data transmission from the LoRaWAN gateway to the cloud is facilitated through the Message Queuing Telemetry Transport (MQTT) protocol, enabling real-time access and management via a graphical user interface. Experimental results demonstrate that the proposed system achieves a maximum MPPT efficiency of 97.96%, supports reliable communication over distances of up to 10 km, and successfully operates four LED streetlights, each spaced 400 m apart, across an open area of approximately 1.2 km—delivering a practical, energy-efficient, and internet-independent solution for smart urban infrastructure. Full article

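The perturb-and-observe MPPT step this system employs can be sketched in a few lines: perturb the operating voltage, keep the direction if power increased, reverse it otherwise. The toy PV curve and step size below are assumptions, not the paper's hardware values.

```python
# Minimal perturb-and-observe (P&O) MPPT sketch against a toy PV curve.
def pv_power(v):
    """Toy PV P-V curve with a single maximum near 16 V (hypothetical)."""
    i = max(0.0, 5.0 * (1 - (v / 21.0) ** 8))   # crude I-V roll-off
    return v * i

v, step = 12.0, 0.2          # starting operating voltage and perturbation [V]
p_prev = pv_power(v)
direction = +1

for _ in range(200):
    v += direction * step
    p = pv_power(v)
    if p < p_prev:           # power fell: reverse the perturbation direction
        direction = -direction
    p_prev = p

print(f"converged operating point: ~{v:.1f} V, {pv_power(v):.1f} W")
```
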
14 pages, 6680 KB  
Article
In Situ Engineered Plastic–Crystal Interlayers Enable Li-Rich Cathodes in PVDF-HFP-Based All-Solid-State Polymer Batteries
by Fei Zhou, Jinwei Tan, Feixiang Wang and Meiling Sun
Batteries 2025, 11(9), 334; https://doi.org/10.3390/batteries11090334 - 6 Sep 2025
Abstract
All-solid-state lithium batteries (ASSLBs) employing Li-rich layered oxide (LLO) cathodes are regarded as promising next-generation energy storage systems owing to their outstanding energy density and intrinsic safety. Polymer-in-salt solid electrolytes (PISSEs) offer advantages such as high room-temperature ionic conductivity, enhanced Li anode interfacial compatibility, and low processing costs; however, their practical deployment is hindered by poor oxidative stability especially under high-voltage conditions. In this study, we report the rational design of a bilayer electrolyte architecture featuring an in situ solidified LiClO4-doped succinonitrile (LiClO4–SN) plastic–crystal interlayer between a Li1.2Mn0.6Ni0.2O2 (LMNO) cathode and a poly (vinylidene fluoride-co-hexafluoropropylene) (PVDF-HFP)-based PISSE. This PISSE/SN–LiClO4 configuration exhibits a wide electrochemical stability window up to 4.7 V vs. Li+/Li and delivers a high ionic conductivity of 5.68 × 10−4 S cm−1 at 25 °C. The solidified LiClO4-SN layer serves as an effective physical barrier, shielding the PVDF-HFP matrix from direct interfacial contact with LMNO and thereby suppressing its oxidative decomposition at elevated potentials. As a result, the bilayer polymer-based cells with the LMNO cathode demonstrate an initial discharge capacity of ∼206 mAh g−1 at 0.05 C and exhibit good cycling stability with 85.7% capacity retention after 100 cycles at 0.5 C under a high cut-off voltage of 4.6 V. This work not only provides a promising strategy to enhance the compatibility of PVDF-HFP-based electrolytes with high-voltage cathodes through the facile in situ solidification of plastic interlayers but also promotes the application of LMNO cathode material in high-energy ASSLBs. Full article

17 pages, 4358 KB  
Article
Development of Real-Time Estimation of Thermal and Internal Resistance for Reused Lithium-Ion Batteries Targeted at Carbon-Neutral Greenhouse Conditions
by Muhammad Bilhaq Ashlah, Chiao-Yin Tu, Chia-Hao Wu, Yulian Fatkur Rohman, Akhmad Azhar Firdaus, Won-Jung Choi and Wu-Yang Sean
Energies 2025, 18(17), 4755; https://doi.org/10.3390/en18174755 - 6 Sep 2025
Abstract
The transition toward renewable-powered greenhouse agriculture offers opportunities for reducing operational costs and environmental impacts, yet challenges remain in managing fluctuating energy loads and optimizing agricultural inputs. While second-life lithium-ion batteries provide a cost-effective energy storage option, their thermal and electrical characteristics under real-world greenhouse conditions are poorly documented. Similarly, although plasma-activated water (PAW) shows potential to reduce chemical fertilizer usage, its integration with renewable-powered systems requires further investigation. This study develops an adaptive monitoring and modeling framework to estimate the thermal resistances (Ru, Rc) and internal resistance (Rint) of second-life lithium-ion batteries using operational data from greenhouse applications, alongside a field trial assessing PAW effects on beefsteak tomato cultivation. The adaptive control algorithm accurately estimated surface temperature (Ts) and core temperature (Tc), achieving a root mean square error (RMSE) of 0.31 °C, a mean absolute error (MAE) of 0.25 °C, and a percentage error of 0.31%. Thermal resistance values stabilized at Ru ≈ 3.00 °C/W (surface to ambient) and Rc ≈ 2.00 °C/W (core to surface), indicating stable thermal regulation under load variations. Internal resistance (Rint) maintained a baseline of ~1.0–1.2 Ω, with peaks up to 12 Ω during load transitions, confirming the importance of continuous monitoring for performance and degradation prevention in second-life applications. The PAW treatment reduced chemical nitrogen fertilizer use by 31.2% without decreasing total nitrogen availability (69.5 mg/L). The NO3-N concentration in PAW reached 134 mg/L, with an initial pH of 3.04 neutralized before application, ensuring no adverse effects on germination or growth. Leaf nutrient analysis showed lower nitrogen (1.83% vs. 2.28%) and potassium (1.66% vs. 2.17%) compared to the control, but higher magnesium content (0.59% vs. 0.37%), meeting Japanese adequacy standards. The total yield was 7.8 kg/m2, with fruit quality comparable between the PAW and control groups. The integration of adaptive battery monitoring with PAW irrigation demonstrates a practical pathway toward energy efficient and sustainable greenhouse operations. Full article
(This article belongs to the Section D: Energy Storage and Application)

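The reported Ru, Rc, and Rint values fit the common two-node lumped thermal model for a cell. The sketch below simulates that model and recovers Ru and Rc from steady-state temperatures; it is a simplified stand-in for the paper's adaptive estimation algorithm, and all parameter values are assumed.

```python
# Two-node lumped thermal model often used for cells:
#   Cc dTc/dt = I^2 * Rint - (Tc - Ts)/Rc
#   Cs dTs/dt = (Tc - Ts)/Rc - (Ts - Tamb)/Ru
Cc, Cs = 60.0, 30.0            # assumed heat capacities [J/K]
Rc, Ru = 2.0, 3.0              # "true" thermal resistances [K/W]
Rint, current, Tamb = 1.1, 2.0, 25.0
dt = 1.0

Tc = Ts = Tamb
for _ in range(20000):          # march to thermal steady state
    q = current ** 2 * Rint                     # ohmic heat [W]
    dTc = (q - (Tc - Ts) / Rc) / Cc
    dTs = ((Tc - Ts) / Rc - (Ts - Tamb) / Ru) / Cs
    Tc += dt * dTc
    Ts += dt * dTs

q = current ** 2 * Rint
print("estimated Rc:", round((Tc - Ts) / q, 2), "K/W")
print("estimated Ru:", round((Ts - Tamb) / q, 2), "K/W")
```
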
32 pages, 5016 KB  
Review
A Review on the Crashworthiness of Bio-Inspired Cellular Structures for Electric Vehicle Battery Pack Protection
by Tamana Dabasa, Hirpa G. Lemu and Yohannes Regassa
Computation 2025, 13(9), 217; https://doi.org/10.3390/computation13090217 - 5 Sep 2025
Abstract
The rapid shift toward electric vehicles (EVs) has underscored the critical importance of battery pack crashworthiness, creating a demand for lightweight, energy-absorbing protective systems. This review systematically explores bio-inspired cellular structures as promising solutions for improving the impact resistance of EV battery packs. Inspired by natural geometries, these designs exhibit superior energy absorption, controlled deformation behavior, and high structural efficiency compared to conventional configurations. A comprehensive analysis of experimental, numerical, and theoretical studies published up to mid-2025 was conducted, with emphasis on design strategies, optimization techniques, and performance under diverse loading conditions. Findings show that auxetic, honeycomb, and hierarchical multi-cell architectures can markedly enhance specific energy absorption and deformation control, with improvements often exceeding 100% over traditional structures. Finite element analyses highlight their ability to achieve controlled deformation and efficient energy dissipation, while optimization strategies, including machine learning, genetic algorithms, and multi-objective approaches, enable effective trade-offs between energy absorption, weight reduction, and manufacturability. Persistent challenges remain in structural optimization, overreliance on numerical simulations with limited experimental validation, and narrow focus on a few bio-inspired geometries and thermo-electro-mechanical coupling, for which engineering solutions are proposed. The review concludes with future research directions focused on geometric optimization, multi-physics modeling, and industrial integration strategies. Collectively, this work provides a comprehensive framework for advancing next-generation crashworthy battery pack designs that integrate safety, performance, and sustainability in electric mobility. Full article
(This article belongs to the Section Computational Engineering)

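Specific energy absorption, the headline crashworthiness metric here, is the area under the force-displacement crush curve divided by specimen mass. The short sketch below computes it for a synthetic plateau-type curve with assumed values.

```python
# Specific energy absorption (SEA): EA = integral of F dx, SEA = EA / mass.
import numpy as np

x = np.linspace(0.0, 0.06, 600)                              # crush displacement [m]
force = 20e3 + 5e3 * np.sin(40 * x) * np.exp(-x / 0.05)      # plateau force [N], assumed
mass = 0.150                                                 # specimen mass [kg], assumed

ea = float(np.sum(0.5 * (force[1:] + force[:-1]) * np.diff(x)))  # trapezoidal rule
print(f"EA  = {ea:.0f} J")
print(f"SEA = {ea / mass / 1000:.1f} kJ/kg")
print(f"mean crush force = {ea / x[-1] / 1000:.1f} kN")
```
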
20 pages, 7428 KB  
Article
The Impact of the Cooling System on the Thermal Management of an Electric Bus Battery
by Piotr Miś, Katarzyna Miś and Aleksandra Waszczuk-Młyńska
Appl. Sci. 2025, 15(17), 9776; https://doi.org/10.3390/app15179776 - 5 Sep 2025
Abstract
This paper presents a thermal study of a lithium-ion traction battery with different cooling configurations during simulated city driving and high-power charging. Four liquid cooling configurations—single or triple plates with straight or U-shaped tubes—were evaluated using finite element models in the Q-Bat Toolbox for MATLAB. Simulations were conducted using the Worldwide Harmonized Light Vehicles Test Cycle (WLTC) and a high-current charging profile based on the CHAdeMO standard (up to 400 A). The results indicate that while cooling is not strictly necessary under typical driving conditions, it significantly improves thermal stability and reduces peak temperatures. The best configuration reduced peak cell temperatures by 1.96% during driving and by 16% during fast charging. The cooling system also minimized temperature gradients within the battery, reducing the risk of degradation. Box-plot analysis confirmed that an efficient cooling system stabilizes the temperature distribution and smooths out extreme values. The results highlight the importance of thermal management for extending battery life and ensuring safe operation, particularly during fast charging conditions. Full article
(This article belongs to the Section Transportation and Future Mobility)

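The qualitative effect the study quantifies, a cooling plate capping peak cell temperature, can be illustrated with a single lumped thermal node under a pulsed heat load, with and without a coolant path. The numbers below are toy values, not the paper's finite element model.

```python
# Single-node illustration of why a cooling plate lowers peak cell temperature:
#   m*cp dT/dt = Q_loss(t) - (T - T_amb)/R_amb - (T - T_cool)/R_cool (if cooled)
import math

m_cp = 900.0                  # cell thermal mass [J/K], assumed
R_amb, T_amb = 8.0, 25.0      # passive path to ambient [K/W], ambient °C
R_cool, T_cool = 1.5, 25.0    # liquid-cooling path [K/W], coolant °C
dt = 1.0

def q_loss(t):
    """Pulsed ohmic losses standing in for a WLTC-like duty cycle [W]."""
    return 15.0 + 10.0 * (1 + math.sin(2 * math.pi * t / 120.0)) / 2

for cooled in (False, True):
    T, peak = T_amb, T_amb
    for k in range(3600):
        t = k * dt
        q_out = (T - T_amb) / R_amb + ((T - T_cool) / R_cool if cooled else 0.0)
        T += dt * (q_loss(t) - q_out) / m_cp
        peak = max(peak, T)
    print(("with cooling   " if cooled else "without cooling"), f"peak T = {peak:.1f} °C")
```
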
15 pages, 2114 KB  
Article
Discharge-Based DC-Bus Voltage Link Capacitor Monitoring with Repetitive Recursive Least Squares Method for Hybrid-Electric Aircraft
by Stanisław Oliszewski, Marcin Pawlak and Mateusz Dybkowski
Energies 2025, 18(17), 4743; https://doi.org/10.3390/en18174743 - 5 Sep 2025
Abstract
Hybrid-electric aircraft require a reliable power distribution architecture. The electrical drive system is connected to the power source via a DC-link composed mostly of capacitors—one of the faultiest power electronic components. In order to ensure the safe operation of the aircraft, DC-link capacitor condition monitoring is needed. The main requirements for such an algorithm are low data consumption and the possibility to use it in generator- or battery-powered systems. The proposed discharge-based repetitive recursive least squares (RRLS) method provides satisfactory estimates utilizing small data packages. Its execution during capacitor discharge makes it independent from the power source type. Based on the capacitor’s physical parameters, the computational complexity of the estimation process is reduced. Simulation validation and experimental tests were conducted. An analysis was carried out in a capacitance range between 705 μF and 1175 μF. The effective range of the algorithm is 881 μF–1044 μF, with an estimation error of less than 5%. Additionally, a range of changes in the time constant of the multiplier of 0.1–10 was tested in the simulation study. Full article
(This article belongs to the Special Issue Electric Machinery and Transformers III)

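A simplified stand-in for the discharge-based estimation idea: during discharge through a known bleed resistance, ln v(t) is linear in t with slope −1/(RC), so recursive least squares on the log-voltage recovers C. The resistance, noise level, and sampling below are assumptions, not the paper's RRLS formulation.

```python
# Estimating DC-link capacitance from a discharge transient with recursive least squares.
import numpy as np

R_bleed = 10.0                 # known discharge resistance [ohm], assumed
C_true = 940e-6                # "true" capacitance [F]
dt, n = 1e-4, 400
rng = np.random.default_rng(3)

t = np.arange(n) * dt
v = 400.0 * np.exp(-t / (R_bleed * C_true)) + 0.2 * rng.standard_normal(n)

# RLS fit of y = theta0 + theta1 * t with y = ln(v); slope theta1 = -1/(R*C).
theta = np.zeros(2)
P = np.eye(2) * 1e6            # large initial covariance
for tk, vk in zip(t, v):
    phi = np.array([1.0, tk])
    y = np.log(vk)
    k = P @ phi / (1.0 + phi @ P @ phi)
    theta += k * (y - phi @ theta)
    P = P - np.outer(k, phi @ P)

C_est = -1.0 / (R_bleed * theta[1])
print(f"estimated capacitance: {C_est * 1e6:.0f} uF (true {C_true * 1e6:.0f} uF)")
```
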
17 pages, 4289 KB  
Article
Performance Analysis of an Ice-Based Buoy Operating from the Packed Ice Zone to the Marginal Ice Zone with an Imaging System
by Guangyu Zuo, Haocai Huang and Huifang Chen
J. Mar. Sci. Eng. 2025, 13(9), 1717; https://doi.org/10.3390/jmse13091717 - 5 Sep 2025
Abstract
Arctic sea ice can be regarded as a sensitive indicator of climate change, and it has declined dramatically in recent decades. The swift decline in Arctic sea ice coverage leads to an expansion of the marginal ice zone (MIZ). In this study, an ice-based buoy with an imaging system is designed for the long-term observation of the changes in sea ice from the packed ice zone to the marginal ice zone in polar regions. The system composition, main buoy, image system, and buoy load were analyzed. An underwater camera supports a 640 × 480 resolution image acquisition, RS485 communication, stable operation at –40 °C, and long-term underwater sealing protection through a titanium alloy housing. During a continuous three-month field deployment in the Arctic, the system successfully captured images of ice-bottom morphology and biological attachment, demonstrating imaging reliability and operational stability under extreme conditions. In addition, the buoy employed a battery state estimation method based on the Extreme Learning Machine (ELM). Compared with LSTM, BP, BiLSTM, SAELSTM, and RF models, the ELM achieved a test set performance of RMSE = 0.05 and MAE = 0.187, significantly outperforming the alternatives and thereby improving energy management and the reliability of long-term autonomous operation. Laboratory flume tests further verified the power generation performance of the wave energy-assisted supply system. However, due to the limited duration of Arctic deployment, full year-round performance has not yet been validated, and the imaging resolution remains insufficient for biological classification. The results indicate that the buoy demonstrates strong innovation and application potential for long-term polar observations, while further improvements are needed through extended deployments and enhanced imaging capability. Full article

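An Extreme Learning Machine, as used here for battery state estimation, is a single hidden layer with fixed random weights and an output layer solved by least squares. The sketch below shows that structure on synthetic battery-like data; the layer size and data are illustrative assumptions.

```python
# Minimal Extreme Learning Machine (ELM): random fixed hidden layer + least-squares output.
import numpy as np

rng = np.random.default_rng(4)

# Synthetic "battery" regression: predict remaining capacity from
# (voltage, current, temperature) samples. Purely illustrative.
X = rng.uniform([3.0, -5.0, -40.0], [4.2, 5.0, 10.0], size=(1000, 3))
y = 2.0 * (X[:, 0] - 3.0) - 0.05 * X[:, 1] + 0.002 * X[:, 2] + 0.01 * rng.standard_normal(1000)
Xn = (X - X.mean(axis=0)) / X.std(axis=0)        # scale features before the random layer

n_hidden = 50
W = rng.standard_normal((Xn.shape[1], n_hidden)) # fixed random input weights
b = rng.standard_normal(n_hidden)

def hidden(x):
    return np.tanh(x @ W + b)

# Output weights by regularized least squares -- the only "training" step.
H = hidden(Xn[:800])
beta = np.linalg.solve(H.T @ H + 1e-3 * np.eye(n_hidden), H.T @ y[:800])

pred = hidden(Xn[800:]) @ beta
print("test RMSE:", round(float(np.sqrt(np.mean((pred - y[800:]) ** 2))), 4))
```
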
22 pages, 4693 KB  
Article
Experience-Driven NeuroSymbolic System for Efficient Robotic Bolt Disassembly
by Pengxu Chang, Zhigang Wang, Yanlong Peng, Ziwen He and Ming Chen
Batteries 2025, 11(9), 332; https://doi.org/10.3390/batteries11090332 - 5 Sep 2025
Abstract
With the rapid growth of electric vehicles, the efficient and safe recycling of high-energy battery packs, particularly the removal of structural bolts, has become a critical challenge. This study presents a NeuroSymbolic robotic system for battery disassembly, driven by autonomous learning capabilities. The system integrates deep perception modules, symbolic reasoning, and action primitives to achieve interpretable and efficient disassembly. To improve adaptability, we introduce an offline learning framework driven by a large language model (LLM), which analyzes historical disassembly trajectories and generates optimized action sequences via prompt-based reasoning. This enables the synthesis of new action primitives tailored to familiar scenarios. The system is validated on a real-world UR10e robotic platform across various battery configurations. Experimental results show a 17 s reduction in average disassembly time per bolt and a 154.4% improvement in overall efficiency compared with traditional approaches. These findings demonstrate that combining neural perception, symbolic reasoning, and LLM-guided learning significantly enhances robotic disassembly performance and offers strong potential for generalization in future battery recycling applications. Full article
(This article belongs to the Special Issue Batteries: 10th Anniversary)

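One way to picture the symbolic side of such a system: bolt-removal action primitives composed into sequences, with an experience store used to reuse the fastest known sequence for a familiar scenario. The toy sketch below is schematic only and does not reproduce the authors' LLM-driven pipeline.

```python
# Schematic of the symbolic layer only: action primitives plus an experience store.
from dataclasses import dataclass

@dataclass
class Primitive:
    name: str
    duration_s: float          # nominal execution time (assumed)

PRIMITIVES = {p.name: p for p in [
    Primitive("locate_bolt", 3.0),
    Primitive("align_tool", 4.0),
    Primitive("engage_socket", 2.0),
    Primitive("unscrew", 8.0),
    Primitive("retract", 1.5),
]}

# "Experience store": scenario -> previously executed sequences with measured times.
experience = {
    "rusted_bolt": [(["locate_bolt", "align_tool", "engage_socket", "unscrew", "retract"], 26.0),
                    (["locate_bolt", "engage_socket", "unscrew", "retract"], 19.5)],
}

def plan(scenario):
    """Reuse the fastest known sequence, else fall back to the full default order."""
    if scenario in experience:
        return min(experience[scenario], key=lambda sr: sr[1])[0]
    return list(PRIMITIVES)

sequence = plan("rusted_bolt")
print("chosen sequence:", " -> ".join(sequence))
print("nominal time:", sum(PRIMITIVES[s].duration_s for s in sequence), "s")
```
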
14 pages, 1079 KB  
Article
Estimation of Lead Acid Battery Degradation—A Model for the Optimization of Battery Energy Storage System Using Machine Learning
by Arief S. Budiman, Rayya Fajarna, Muhammad Asrol, Fitya Syarifa Mozar, Christian Harito, Bens Pardamean, Derrick Speaks and Endang Djuana
Electrochem 2025, 6(3), 33; https://doi.org/10.3390/electrochem6030033 - 5 Sep 2025
Abstract
Energy storage systems are becoming increasingly important as more renewable energy systems are integrated into the electrical (or power utility) grid. Low-cost and reliable energy storage is paramount if renewable energy systems are to be increasingly integrated into the power grid. Lead-acid batteries are widely used as energy storage for stationary renewable energy systems and agriculture due to their low cost, especially compared to lithium-ion batteries (LIB). However, lead-acid battery technology suffers from system degradation and a relatively short lifetime, largely due to its charging/discharging cycles. In the present study, we use Machine Learning methodology to estimate the battery degradation in an energy storage system. It uses two types of datasets: discharge condition and lead acid battery data. In the initial analysis, the Support Vector Regression (SVR) method with the RBF kernel showed poor results, with a low accuracy value of 0.0127 and RMSE 5377. On the other hand, the Long Short-Term Memory (LSTM) method demonstrated better estimation results with an RMSE value of 0.0688, which is relatively close to 0. Full article

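The SVR side of the comparison can be sketched as RBF-kernel regression on cycle-level features of a capacity-fade curve. The data below are synthetic and the hyperparameters assumed; the paper's LSTM counterpart is not reproduced here.

```python
# RBF-kernel SVR sketch on a synthetic lead-acid capacity-fade curve.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
cycles = np.arange(1, 601)
dod = rng.uniform(0.4, 0.9, cycles.size)                   # depth of discharge per cycle
capacity = 100 * (1 - 0.0004 * np.cumsum(dod)) + 0.3 * rng.standard_normal(cycles.size)

X = np.column_stack([cycles, dod])
X_tr, X_te, y_tr, y_te = train_test_split(X, capacity, test_size=0.2, random_state=0)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X_tr, y_tr)

rmse = np.sqrt(np.mean((model.predict(X_te) - y_te) ** 2))
print("held-out RMSE [% capacity]:", round(float(rmse), 3))
```
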