Search Results (28)

Search Parameters:
Keywords = multiple batches planning

19 pages, 2822 KB  
Article
A New Framework for Job Shop Integrated Scheduling and Vehicle Path Planning Problem
by Ruiqi Li, Jianlin Mao, Xing Wu, Wenna Zhou, Chengze Qian and Haoshuang Du
Sensors 2026, 26(2), 543; https://doi.org/10.3390/s26020543 - 13 Jan 2026
Viewed by 439
Abstract
With the development of the manufacturing industry, traditional fixed-process machining methods cannot adapt to changes in workshop operations and the demand for small batches and multiple orders. It is therefore necessary to introduce multiple robots to provide a more flexible production mode. Currently, most formulations of the Job Shop Scheduling Problem with Transportation (JSP-T) consider only job scheduling and vehicle task allocation, and do not address collision-free paths between vehicles. This article proposes a novel solution framework that integrates workshop scheduling, material-handling robot task allocation, and conflict-free path planning between robots. With the goal of minimizing the maximum completion time (makespan) including handling, this paper first establishes an extended JSP-T model that integrates handling time and robot paths, and provides the corresponding workshop layout map. Secondly, at the scheduling layer, an improved Deep Q-Network (DQN) method is used for dynamic scheduling to generate a feasible and optimal machining schedule. Subsequently, considering the robots' position information, the task sequence is assigned to the robot path execution layer. Finally, at the path execution layer, the Priority-Based Search (PBS) algorithm is applied to solve conflict-free paths for the handling robots, yielding an optimized makespan for all jobs under conflict-free handling. The experimental results show that, compared with algorithms such as PPO, the proposed scheduling algorithm improves makespan by 9.7%, and the PBS algorithm obtains optimized, conflict-free paths for multiple handling robots. The framework handles scheduling, task allocation, and conflict-free path planning in a unified optimization process, adapting well to job changes and supporting flexible manufacturing. Full article
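PBS itself searches over a tree of robot priority orderings; as a rough, self-contained illustration of the underlying idea, the sketch below plans robots one at a time on a grid in a fixed priority order, treating the paths of higher-priority robots as reserved cells. All names and the grid model are illustrative assumptions, not the paper's implementation, and only vertex conflicts are handled.

```python
from collections import deque

def plan_path(grid, start, goal, reserved, max_t=50):
    """BFS over (cell, time); reserved[t] holds cells occupied by
    higher-priority robots at time t (vertex conflicts only)."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([(start, 0, [start])])
    seen = {(start, 0)}
    while frontier:
        (r, c), t, path = frontier.popleft()
        if (r, c) == goal:
            return path
        if t >= max_t:
            continue
        for dr, dc in ((0, 0), (0, 1), (0, -1), (1, 0), (-1, 0)):  # wait or move
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0
                    and (nr, nc) not in reserved.get(t + 1, set())
                    and ((nr, nc), t + 1) not in seen):
                seen.add(((nr, nc), t + 1))
                frontier.append(((nr, nc), t + 1, path + [(nr, nc)]))
    return None

def prioritized_plan(grid, tasks):
    """Plan robots in a fixed priority order (PBS searches over such orders)."""
    reserved, paths = {}, []
    for start, goal in tasks:
        path = plan_path(grid, start, goal, reserved)
        if path is None:
            return None
        for t, cell in enumerate(path):
            reserved.setdefault(t, set()).add(cell)
        for t in range(len(path), 51):  # robot parks at its goal afterwards
            reserved.setdefault(t, set()).add(path[-1])
        paths.append(path)
    return paths
```

On an empty 3x3 grid, two robots swapping row ends each reach their goals without sharing a cell at any timestep.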

20 pages, 1518 KB  
Article
An Effective Hybrid Rescheduling Method for Wafer Chip Precision Packaging Workshops in Complex Manufacturing Environments
by Ziyue Wang, Weikang Fang and Yichen Yang
Micromachines 2025, 16(12), 1403; https://doi.org/10.3390/mi16121403 - 12 Dec 2025
Cited by 1 | Viewed by 450
Abstract
With the continuous development of semiconductor manufacturing and information technology, wafer chips are becoming smaller and more varied, which places high requirements on wafer chip precision manufacturing and packaging workshops. On the one hand, the market demand for multiple varieties and small batches increases the difficulty of scheduling. On the other hand, the complex manufacturing environment brings various types of dynamic events that disrupt production plans. Accordingly, this work studies the wafer chip precision packaging workshop rescheduling problem under machine breakdown, emergency order insertion, and modification of original orders. Firstly, a mathematical model for the addressed problem is established, and the rolling-horizon technique is adopted to deal with multiple dynamic events. Then, a hybrid algorithm combining an improved firefly optimization framework and a variable neighborhood search strategy is proposed. The population evolution mechanism is designed based on the location-updating law of fireflies in nature, while the variable neighborhood search is applied to avoid local optima and sufficiently explore the neighborhood. Finally, the results of comparative experiments and engineering cases indicate that the proposed method is effective, stable, and superior to current advanced algorithms. Full article
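The "location-updating law of fireflies" referenced above follows the canonical firefly algorithm, in which each firefly moves toward brighter ones with an attractiveness that decays with distance. A minimal sketch of one iteration (the parameter names `beta0`, `gamma`, `alpha` are the standard ones from the generic algorithm, not taken from the paper):

```python
import math
import random

def firefly_step(pop, brightness, beta0=1.0, gamma=1.0, alpha=0.2):
    """One iteration of the canonical firefly update: each firefly moves
    toward every brighter firefly, with attractiveness decaying with
    squared distance, plus a small random perturbation scaled by alpha."""
    new_pop = []
    for i, xi in enumerate(pop):
        xi = list(xi)
        for j, xj in enumerate(pop):
            if brightness[j] > brightness[i]:
                r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
                beta = beta0 * math.exp(-gamma * r2)  # attractiveness
                for d in range(len(xi)):
                    xi[d] += beta * (xj[d] - xi[d]) + alpha * (random.random() - 0.5)
        new_pop.append(xi)
    return new_pop
```

With the random term switched off, a dim firefly moves strictly toward a brighter one while the brightest stays put.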
(This article belongs to the Special Issue Future Trends in Ultra-Precision Machining)

29 pages, 2210 KB  
Article
Bi-Level Collaborative Optimization for Medical Consumable Order Splitting and Reorganization Considering Multi-Dimensional and Multi-Scale Characteristics
by Peng Jiang, Shunsheng Guo and Xu Luo
Appl. Sci. 2025, 15(14), 7627; https://doi.org/10.3390/app15147627 - 8 Jul 2025
Cited by 1 | Viewed by 956
Abstract
Medical consumable orders are characterized by diverse product types, small batch sizes, frequent orders, and high customization requirements, often leading to inefficient workshop scheduling and difficulties in meeting multiple production constraints. To address these challenges, this study proposes a bi-level optimization model for order splitting and reorganization considering multi-dimensional and multi-scale characteristics. The multi-dimensional characteristics encompass materials, processes, equipment, and work efficiency, while the multi-scale aspects involve finished products, components, assemblies, and parts. At the upper level, the model optimizes order task splitting by refining splitting strategies and preprocessing constraints to generate high-quality input for the reorganization phase. The lower level optimizes sub-task prioritization, batch sizes, and resource scheduling to develop a production plan that balances cost and efficiency. Subsequently, to solve this bi-level optimization problem, a hybrid bi-objective optimization algorithm is designed, integrating a collaborative iterative strategy to enhance solution efficiency and quality. Finally, a case study and comparative experiments validate the practicality and effectiveness of the proposed model and algorithm. Full article
(This article belongs to the Special Issue Fuzzy Control Systems and Decision-Making)

22 pages, 2752 KB  
Article
Direct Consideration of Process History During Intensified Design of Experiments Planning Eases Interpretation of Mammalian Cell Culture Dynamics
by Samuel Kienzle, Lisa Junghans, Stefan Wieschalka, Katharina Diem, Ralf Takors, Nicole Erika Radde, Marco Kunzelmann, Beate Presser and Verena Nold
Bioengineering 2025, 12(3), 319; https://doi.org/10.3390/bioengineering12030319 - 19 Mar 2025
Viewed by 1275
Abstract
Intra-experimental factor setting shifts in intensified design of experiments (iDoE) enhance understanding of bioproduction processes by capturing their dynamics and are thus essential to fulfill quality by design (QbD) ambitions. Determining the influence of process history on the cellular responses, often referred to as the memory effect, is fundamental for accurate predictions. However, current iDoE designs neither explicitly consider nor quantify the influence of process history. Therefore, we propose the one-factor-multiple-columns (OFMC)-format for iDoE planning. This format explicitly describes stage-dependent factor effects and potential memory effects as across-stage interactions (ASIs) during a bioprocess. To illustrate its utility, an OFMC-iDoE that considers the characteristic growth phases during a fed-batch process was planned. Data were analyzed using ordinary least squares (OLS) regression as previously described via stage-wise analysis of the time series and compared to direct modeling of end-of-process outcomes enabled by the OFMC-format. This article aims to provide the reader with a framework for planning and modeling iDoE data and highlights how the OFMC-format simplifies planning and data acquisition, eases modeling, and gives a straightforward quantification of potential memory effects. With the proposed OFMC-format, optimization of bioprocesses can identify which factor settings are most beneficial in which state of the mammalian culture and thus elevate performance and quality to the next level. Full article
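As an illustration of how an across-stage interaction (ASI) can be quantified with OLS, the sketch below fits a hypothetical two-stage end-of-process response whose design matrix carries one main-effect column per stage plus an `x1*x2` ASI column. All data, names, and coefficient values are synthetic assumptions, not the paper's dataset or model.

```python
import numpy as np

# Hypothetical two-stage iDoE: x1 and x2 are the (coded) factor settings
# applied in stage 1 and stage 2; the x1*x2 column is the across-stage
# interaction (ASI), i.e. the memory-effect term.
rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 40)
x2 = rng.uniform(-1, 1, 40)
y = 2.0 + 1.5 * x1 - 0.8 * x2 + 0.6 * x1 * x2 + rng.normal(0, 0.05, 40)

# Design matrix: intercept, stage main effects, ASI column.
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS fit
intercept, b1, b2, asi = coef
```

A nonzero `asi` estimate is the direct quantification of the memory effect that the OFMC-format enables.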
(This article belongs to the Special Issue From Residues to Bio-Based Products through Bioprocess Engineering)

22 pages, 21962 KB  
Article
A Mixed-Integer Linear Programming Model for Addressing Efficient Flexible Flow Shop Scheduling Problem with Automatic Guided Vehicles Consideration
by Dekun Wang, Hongxu Wu, Wengang Zheng, Yuhao Zhao, Guangdong Tian, Wenjie Wang and Dong Chen
Appl. Sci. 2025, 15(6), 3133; https://doi.org/10.3390/app15063133 - 13 Mar 2025
Cited by 9 | Viewed by 3686
Abstract
With the development of Industry 4.0, discrete manufacturing systems are accelerating their transformation toward flexibility and intelligence to meet the market demand for varied products and small-batch production. The flexible flow shop (FFS) paradigm enhances production flexibility, but existing studies often address FFS scheduling and automated guided vehicle (AGV) path planning separately, resulting in resource competition conflicts, such as equipment idle time and AGV congestion, which prolong the manufacturing cycle time and reduce system energy efficiency. To solve this problem, this study proposes an integrated production–transportation scheduling framework (FFSP-AGV). Using the adjacent-sequence modeling idea, a mixed-integer linear programming (MILP) model is established that accounts for production process constraints and AGV transportation task conflicts, with the aim of minimizing the makespan and improving overall operational efficiency. Systematic evaluations are carried out on multiple test instances of different scales using the CPLEX solver. The results show that, for small-scale instances (job count ≤ 10), the MILP model can generate optimal scheduling solutions within a practical computation time (several minutes). Moreover, there is a significant diminishing marginal effect of AGV quantity on makespan reduction: once the number of AGVs exceeds 60% of the parallel equipment capacity, their incremental contribution to cycle time reduction becomes much smaller. However, the computational complexity of the model increases exponentially with the number of jobs, making it impractical for large-scale problems (job count > 20). This research highlights the importance of integrated production–transportation scheduling for reducing manufacturing cycle time and reveals a threshold effect in AGV resource allocation, providing a theoretical basis for collaborative optimization in smart factories. Full article
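To make the makespan objective concrete, here is a minimal sketch for a plain permutation flow shop (a simplification that ignores the machine flexibility and AGV constraints the paper's MILP handles): it evaluates a schedule's makespan and, mirroring the scaling limit noted above, finds the optimum by exhaustive search over job permutations, which is viable only for small job counts.

```python
from itertools import permutations

def makespan(seq, proc):
    """Completion time of the last job in a permutation flow shop.
    proc[j][m] = processing time of job j on machine m."""
    n_m = len(proc[0])
    done = [0.0] * n_m  # machine-ready times along the current schedule
    for j in seq:
        t = 0.0
        for m in range(n_m):
            # job j starts on machine m once both the job (previous
            # machine) and the machine (previous job) are free
            t = max(t, done[m]) + proc[j][m]
            done[m] = t
    return done[-1]

def best_schedule(proc):
    """Exhaustive search over permutations -- exact but exponential,
    hence only practical for small job counts."""
    jobs = range(len(proc))
    return min(permutations(jobs), key=lambda s: makespan(s, proc))
```

On a two-job, two-machine instance this reproduces the Johnson's-rule result of sequencing the job with the shorter first-machine time first.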
(This article belongs to the Special Issue Multiobjective Optimization: Theory, Methods and Applications)

20 pages, 3536 KB  
Article
A Multi-Trigger Mechanism Design for Rescheduling Decision Assistance in Smart Job Shops Based on Machine Learning
by Rong Duan, Siqi Wang, Ya Liu, Wei Yan, Zhigang Jiang and Zhiqiang Pan
Sustainability 2025, 17(5), 2198; https://doi.org/10.3390/su17052198 - 3 Mar 2025
Cited by 1 | Viewed by 1471
Abstract
The empowerment of lean intelligent manufacturing technologies has provided a solid foundation for enterprises to achieve a balance between economic benefits and sustainable development. In production workshops, various disruptive factors, especially in multi-variety small-batch production environments, often lead to deviations from the planned schedule. This creates an urgent need to enhance the workshop’s dynamic responsiveness and self-regulation capabilities. Existing single-trigger mechanisms in job shops focus on changes in overall performance or deviations from production goals but lack a representation of the varying degrees of impact on different equipment under multiple disturbances. This results in either over-scheduling or under-scheduling in terms of scope, thereby impacting the optimization of production efficiency and resource utilization. To address this, this paper proposes a method for coordinated decision-making on rescheduling timing and location in intelligent job shops under disturbance environments. First, by analyzing the relationship between disturbance impact and the scope of rescheduling implementation, a mapping relationship is established between disturbance impact and disturbance response hierarchy. A trigger is set up on each piece of equipment to characterize the differences in the degree of impact on different equipment, which not only reduces the complexity of disturbance information processing but also provides support for specific location decisions for disturbance response. Second, a decision module for the triggers is constructed using a multilayer perceptron, establishing a mapping relationship between process and workpiece data attributes and response categories. Based on the basic processing units of the manufacturing process and the relevant quantitative indicators of the processed objects, disturbance response strategies are generated. Finally, through a case study, the proposed method is evaluated and validated in an intelligent factory setting. The new rescheduling decision support method can effectively make timing and location decisions for disturbance events. Full article

25 pages, 5681 KB  
Article
Multi-Batch Carrier-Based UAV Formation Rendezvous Method Based on Improved Sequential Convex Programming
by Zirui Zhang, Liguo Sun and Yanyang Wang
Drones 2024, 8(11), 615; https://doi.org/10.3390/drones8110615 - 26 Oct 2024
Viewed by 1914
Abstract
The limitations of the existing catapults necessitate multiple batches of take-offs for carrier-based unmanned aerial vehicles (UAVs) to form a formation. Because of the differences in takeoff time and location of each batch of UAVs, ensuring the temporal and spatial consistency and rendezvous efficiency of the formation becomes crucial. Concerning the challenges mentioned above, a multi-batch formation rendezvous method based on improved sequential convex programming (SCP) is proposed. A reverse solution approach based on the multi-batch rendezvous process is developed. On this basis, a non-convex optimization problem is formulated considering the following constraints: UAV dynamics, collision avoidance, obstacle avoidance, and formation consistency. An SCP method that makes use of the trust region strategy is introduced to solve the problem efficiently. Due to the spatiotemporal coupling characteristics of the rendezvous process, an inappropriate initial solution for SCP will inevitably reduce the rendezvous efficiency. Thus, an initial solution tolerance mechanism is introduced to improve the SCP. This mechanism follows the idea of simulated annealing, allowing the SCP to search for better reference solutions in a wider space. By utilizing the initial solution tolerance SCP (IST-SCP), the multi-batch formation rendezvous algorithm is developed correspondingly. Simulation results are obtained to verify the effectiveness and adaptability of the proposed method. IST-SCP reduces the rendezvous time from poor initial solutions without significantly increasing the computing time. Full article
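The initial solution tolerance mechanism is described as following the idea of simulated annealing; the acceptance rule at its core can be sketched as the standard Metropolis criterion (a generic sketch of that idea, not the paper's exact mechanism): worse reference solutions are sometimes tolerated, with a probability that shrinks as the temperature cools.

```python
import math
import random

def accept(delta_cost, temperature):
    """Metropolis rule: always accept improvements; accept a worse
    reference solution with probability exp(-delta/T), which shrinks
    as the temperature T cools, narrowing the search over time."""
    if delta_cost <= 0:
        return True
    return random.random() < math.exp(-delta_cost / temperature)
```

Early in the search (high temperature) this lets the optimizer escape poor initial reference trajectories; late in the search (low temperature) it behaves greedily.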

19 pages, 6739 KB  
Article
Artificial Neural Network Modeling in the Presence of Uncertainty for Predicting Hydrogenation Degree in Continuous Nitrile Butadiene Rubber Processing
by Chandra Mouli R. Madhuranthakam, Farzad Hourfar and Ali Elkamel
Processes 2024, 12(5), 999; https://doi.org/10.3390/pr12050999 - 15 May 2024
Cited by 9 | Viewed by 1847
Abstract
The transition from batch to continuous production in the catalytic hydrogenation of nitrile butadiene rubber (NBR) into hydrogenated NBR (HNBR) marks a significant advance for applications under demanding conditions. This study introduces a continuous process utilizing a static mixer (SM) reactor, which notably achieves a hydrogenation conversion rate exceeding 97%. We thoroughly review a mechanistic model of the SM reactor to elucidate the internal dynamics governing the hydrogenation process and address the inherent uncertainties in key parameters such as the Peclet number (Pe), dimensionless time (θτ), reaction coefficient (R), and flow rate coefficient (q). A comprehensive dataset generated from varied parameter values serves as the basis for training an artificial neural network (ANN), which is then compared against traditional models including linear regression, decision tree, and random forest in terms of efficacy. Our results clearly demonstrate the ANN’s superiority in predicting the degree of hydrogenation, achieving the lowest root mean squared error (RMSE) of 3.69 compared to 21.90 for linear regression, 4.94 for decision tree, and 7.51 for random forest. The ANN’s robust capability for modeling complex nonlinear relationships and dynamics significantly enhances decision-making, planning, and optimization of the reactor, reducing computational demands and operational costs. In other words, this approach allows users to rely on a single ML-based model instead of multiple mechanistic models for reflecting the effects of possible uncertainties. Additionally, a feature importance study validates the critical impact of time and element number on the hydrogenation process, further supporting the ANN’s predictive accuracy. These findings underscore the potential of ML-based models in streamlining and enhancing the efficiency of chemical production processes. Full article
(This article belongs to the Section Materials Processes)

32 pages, 16876 KB  
Article
Automated Grasp Planning and Finger Design Space Search Using Multiple Grasp Quality Measures
by Roshan Kumar Hota, Gaoyuan Liu, Bieke Decraemer, Barry Swevels, Sofie Burggraeve, Tom Verstraten, Bram Vanderborght and Greet Van de Perre
Robotics 2024, 13(5), 74; https://doi.org/10.3390/robotics13050074 - 9 May 2024
Cited by 3 | Viewed by 5345
Abstract
As the industry shifts to automated manufacturing and the assembly of parts in smaller batches, there is a clear need for an efficient design of grippers. This paper presents a method for automated grasp planning and finger design for multiple parts using four grasp quality measures that capture the following important requirements for grasping: (i) uniform contact force distribution; (ii) better gravity wrench resistance; (iii) robustness against gripper positioning error; and (iv) ability to resist larger external wrench on the object. We introduce the fingertip score to quantify the grasp performance of a fingertip design over all the objects. The method takes the CAD model of the objects as the input and outputs the optimal grasp location and the best finger design. We use the method for a three-point grasp with a parallel jaw gripper. We validate our method on two sets of objects. Results show how each grasp quality measure behaves on different objects and the variation in the fingertip score with finger design. Finally, we test the effectiveness of the optimal finger design experimentally. The three-point grasp is suitable for grasping objects larger than is possible with shape-matching fingertips. Full article
(This article belongs to the Special Issue Advanced Grasping and Motion Control Solutions)

28 pages, 3547 KB  
Article
Research on Multi-Objective Flexible Job Shop Scheduling Problem with Setup and Handling Based on an Improved Shuffled Frog Leaping Algorithm
by Jili Kong and Yi Yang
Appl. Sci. 2024, 14(10), 4029; https://doi.org/10.3390/app14104029 - 9 May 2024
Cited by 9 | Viewed by 4097
Abstract
The flexible job shop scheduling problem (FJSP), widely prevalent in many intelligent manufacturing industries, is one of the most classic problems in production scheduling and combinatorial optimization. In actual manufacturing enterprises, the setup of machines and the handling of jobs have an important impact on the scheduling plan. Furthermore, there is a trend for clusters of machines with similar functionalities to form work centers. Considering the above constraints, a new order-driven multi-equipment work-center FJSP model with setup and handling is established, with multiple objectives encompassing minimization of the makespan, the number of machine shutdowns, and the number of handling batches. An improved shuffled frog leaping algorithm is designed to solve it through optimization of the initial solution population, improvement of the evolutionary operations, and the incorporation of Pareto sorting. The algorithm also combines the speed-calculation method of the gravitational search algorithm to enhance the stability of the solution search. Standard FJSP benchmarks were selected to evaluate the effectiveness of the algorithm, and the experimental results confirm its satisfactory performance. Finally, a problem example is designed to demonstrate the algorithm's capability to generate an excellent scheduling plan. Full article

18 pages, 3880 KB  
Article
Persistent Monitoring for Points of Interest with Different Data Update Deadlines
by Qing Guo and Jian Peng
Sensors 2024, 24(4), 1224; https://doi.org/10.3390/s24041224 - 14 Feb 2024
Cited by 1 | Viewed by 1563
Abstract
In this paper, we study the regular sensory data collection of Points of Interest (PoIs) with multiple Unmanned Aerial Vehicles (UAVs) during an extended monitoring period, where each PoI is visited multiple times before its data update deadline to keep the data fresh. We observe that most existing studies ignored the important differences in the data stored in the PoIs, scheduled a plan that dispatched UAVs to visit all PoIs before the same deadline, and simply repeated the plan during the monitoring period, which undoubtedly increased the service cost of the UAVs. Considering the specific data update deadline of each PoI, we formulate a novel UAV cost minimization problem to collect the data stored in each PoI before its deadline by finding a series of plans for UAVs such that the service cost of the UAVs during the monitoring period is minimized; the service cost of the UAVs is composed of the consumed energy of the UAVs utilized for hovering for data collection and the consumed energy of the UAVs utilized for flying. To deal with the above NP-hard problem, we devise an approximation algorithm by grouping the PoIs and accessing them in batches. Then, we analyze the proposed algorithm and evaluate the performance of the algorithm through experimental simulations. The experimental results show that the proposed algorithm is very promising. Full article
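As a toy illustration of grouping PoIs by data-update deadline (not the paper's approximation algorithm), one common device in deadline scheduling is to round each deadline down to a power-of-two revisit period, so that all PoIs in a batch share a common schedule that still meets every member's deadline; the function name and input format are assumptions for the sketch.

```python
def batch_by_deadline(pois):
    """Group PoIs into batches whose shared revisit period is the
    largest power of two not exceeding each PoI's deadline, so one
    repeating plan per batch satisfies all its members.
    `pois` maps PoI name -> data-update deadline."""
    groups = {}
    for name, deadline in sorted(pois.items(), key=lambda kv: kv[1]):
        period = 1
        while period * 2 <= deadline:
            period *= 2  # round the deadline down to a power of two
        groups.setdefault(period, []).append(name)
    return groups
```

Each batch can then be served by one repeating UAV plan with period equal to the batch key, instead of forcing every PoI onto the tightest deadline.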
(This article belongs to the Section Remote Sensors)

18 pages, 732 KB  
Article
Efficient Skip Connections-Based Residual Network (ESRNet) for Brain Tumor Classification
by Ashwini B., Manjit Kaur, Dilbag Singh, Satyabrata Roy and Mohammed Amoon
Diagnostics 2023, 13(20), 3234; https://doi.org/10.3390/diagnostics13203234 - 17 Oct 2023
Cited by 19 | Viewed by 3657
Abstract
Brain tumors pose a complex and urgent challenge in medical diagnostics, requiring precise and timely classification due to their diverse characteristics and potentially life-threatening consequences. While existing deep learning (DL)-based brain tumor classification (BTC) models have shown significant progress, they encounter limitations like restricted depth, vanishing gradient issues, and difficulties in capturing intricate features. To address these challenges, this paper proposes an efficient skip connections-based residual network (ESRNet), leveraging the residual network (ResNet) with skip connections. ESRNet ensures smooth gradient flow during training, mitigating the vanishing gradient problem. Additionally, the ESRNet architecture includes multiple stages with increasing numbers of residual blocks for improved feature learning and pattern recognition. ESRNet utilizes residual blocks from the ResNet architecture, featuring skip connections that enable identity mapping. Through direct addition of the input tensor to the convolutional layer output within each block, skip connections preserve the gradient flow. This mechanism prevents vanishing gradients, ensuring effective information propagation across network layers during training. Furthermore, ESRNet integrates efficient downsampling techniques and stabilizing batch normalization layers, which collectively contribute to its robust and reliable performance. Extensive experimental results reveal that ESRNet significantly outperforms other approaches in terms of accuracy, sensitivity, specificity, F-score, and Kappa statistics, with median values of 99.62%, 99.68%, 99.89%, 99.47%, and 99.42%, respectively. Moreover, the achieved minimum performance metrics, including accuracy (99.34%), sensitivity (99.47%), specificity (99.79%), F-score (99.04%), and Kappa statistics (99.21%), underscore the exceptional effectiveness of ESRNet for BTC. Therefore, the proposed ESRNet showcases exceptional performance and efficiency in BTC, holding the potential to revolutionize clinical diagnosis and treatment planning. Full article
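The identity-mapping mechanism described above can be sketched in a few lines: the block's input is added directly to the transformed output before the final activation, so when the learned transform is zero the block reduces to (the ReLU of) the identity and gradients flow through the addition unchanged. A minimal NumPy sketch with dense layers standing in for convolutions (the two-layer shape is an illustrative assumption, not ESRNet's exact block):

```python
import numpy as np

def residual_block(x, w1, w2):
    """A minimal residual block: two linear transforms with a ReLU in
    between, plus an identity skip connection that adds the input
    tensor to the block output before the final activation."""
    relu = lambda z: np.maximum(z, 0)
    out = relu(x @ w1) @ w2   # the learned transform F(x)
    return relu(out + x)      # skip connection: x + F(x)
```

With zero weights the block passes its (rectified) input straight through, which is exactly the property that keeps gradients from vanishing in deep stacks of such blocks.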
(This article belongs to the Special Issue Artificial Intelligence in Clinical Medical Imaging)

23 pages, 9168 KB  
Article
Optimizing Size Consistency in Batch Roller Production: A Mixed Strategy Approach
by Weifeng Liu and Chengzu Ren
Appl. Sci. 2023, 13(19), 10890; https://doi.org/10.3390/app131910890 - 30 Sep 2023
Cited by 4 | Viewed by 1775
Abstract
The Double-Disk Straight Groove Lapping (DDSGL) technique, a novel approach to batch processing of bearing rollers, achieves high dimensional consistency by removing material through size comparison between multiple rollers in the processing area. To avoid collision between the rollers, the prevalent practice in DDSGL involves circulating the rollers in fixed linear sequences, an approach that impedes comprehensive size comparison throughout the entire batch of rollers. To counter this, we introduce a Dual-Channel Mixing Scheduling (DCMS) strategy that disrupts the roller sequence without triggering collisions. This strategy promotes extensive size comparison and enhances batch size consistency. To elucidate the operational principles of DCMS, we have developed a computational model grounded in DDSGL, designed simulation test plans under different mixing parameters, and summarized the number of direct comparisons, total comparisons, and differences in roller cycle times to determine the optimal combination of mixing parameters. Finally, structural modifications were made in the DDSGL system for validation studies under different mixing parameters. The test results show that the use of DCMS can reduce processing time by up to 50%, and the batch diameter change of the rollers can converge from 1.15 μm to as low as 0.76 μm. The industrial relevance of this research is significant; these improvements can lead to higher efficiency in the manufacturing process and improved quality of bearing rollers. Full article
(This article belongs to the Section Applied Industrial Technologies)

28 pages, 809 KB  
Article
Heuristics and Learning Models for Dubins MinMax Traveling Salesman Problem
by Abhishek Nayak and Sivakumar Rathinam
Sensors 2023, 23(14), 6432; https://doi.org/10.3390/s23146432 - 15 Jul 2023
Cited by 12 | Viewed by 2901
Abstract
This paper addresses a MinMax variant of the Dubins multiple traveling salesman problem (mTSP). This routing problem arises naturally in mission planning applications involving fixed-wing unmanned vehicles and ground robots. We first formulate the routing problem, referred to as the one-in-a-set Dubins mTSP problem (MD-GmTSP), as a mixed-integer linear program (MILP). We then develop heuristic-based search methods for the MD-GmTSP, using tour construction algorithms to generate initial feasible solutions relatively quickly and then improving these solutions with variants of the variable neighborhood search (VNS) metaheuristic. Finally, we also explore a graph neural network to implicitly learn policies for the MD-GmTSP using a learning-based approach; specifically, we employ an S-sample batch reinforcement learning method on a shared graph neural network architecture with distributed policy networks. All the proposed algorithms are implemented on modified TSPLIB instances, and their performance is evaluated. The results show that learning-based approaches work well for smaller instances, while the VNS-based heuristics find the best solutions for larger instances. Full article
(This article belongs to the Special Issue Aerial Robotics: Navigation and Path Planning)
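The VNS metaheuristic mentioned in the abstract above can be sketched in a few lines. This is a simplified illustration, not the paper's algorithm: it uses straight-line Euclidean tour lengths instead of Dubins paths, ignores the one-in-a-set target clusters, and uses just two hypothetical neighborhoods (moving a target out of the longest tour, and a 2-opt segment reversal) to reduce the MinMax objective.

```python
import math, random

def tour_len(tour, pts):
    """Length of a closed tour over point indices (Euclidean stand-in for Dubins)."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def makespan(tours, pts):
    """MinMax objective: length of the longest vehicle tour."""
    return max(tour_len(t, pts) for t in tours)

def vns_minmax(pts, m=2, iters=500, seed=1):
    rng = random.Random(seed)
    targets = list(range(len(pts)))
    rng.shuffle(targets)
    tours = [targets[i::m] for i in range(m)]   # round-robin initial split
    best = makespan(tours, pts)
    for _ in range(iters):
        cand = [t[:] for t in tours]
        if rng.random() < 0.5 and m > 1:
            # Neighborhood 1: move a target out of the longest tour
            src = max(range(m), key=lambda i: tour_len(cand[i], pts))
            if len(cand[src]) > 1:
                v = cand[src].pop(rng.randrange(len(cand[src])))
                dst = rng.choice([i for i in range(m) if i != src])
                cand[dst].insert(rng.randrange(len(cand[dst]) + 1), v)
        else:
            # Neighborhood 2: 2-opt segment reversal inside one tour
            t = rng.randrange(m)
            if len(cand[t]) > 3:
                i, j = sorted(rng.sample(range(len(cand[t])), 2))
                cand[t][i:j] = reversed(cand[t][i:j])
        val = makespan(cand, pts)
        if val < best:
            tours, best = cand, val
    return tours, best
```

A real Dubins variant would replace `math.dist` with the minimum-turn-radius Dubins path length between configurations; the neighborhood-switching skeleton stays the same.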

17 pages, 3945 KB  
Article
Spatio-Temporal Distribution Characteristics of Intangible Cultural Heritage and Tourism Response in the Beijing–Hangzhou Grand Canal Basin in China
by Mo Chen, Jiacan Wang, Jing Sun, Fang Ye and Hongyan Zhang
Sustainability 2023, 15(13), 10348; https://doi.org/10.3390/su151310348 - 30 Jun 2023
Cited by 34 | Viewed by 3954
Abstract
The Beijing–Hangzhou Grand Canal is renowned for being one of the longest and largest canals in the world. Running from Beijing to Hangzhou (north to south), it connects China’s five major water systems and has an important impact on the ecological environment and [...] Read more.
The Beijing–Hangzhou Grand Canal is renowned as one of the longest and largest canals in the world. Running from Beijing to Hangzhou (north to south), it connects China’s five major water systems and has an important impact on the ecological environment and economy of northern and southern China. It also boasts a large quantity of intangible cultural heritage (ICH). Clarifying the spatio-temporal distribution pattern of ICH in the Beijing–Hangzhou Grand Canal Basin and its influencing factors is essential for the protection and utilization of heritage resources and the formulation of management policies. In this study, 977 national ICH items in the Beijing–Hangzhou Grand Canal Basin are analyzed with the help of ArcGIS spatial analysis technology, SPSS regression analysis, and human geography research methods. The results show that the national ICH in the Beijing–Hangzhou Grand Canal Basin spans all categories but varies at the provincial scale, particularly between the northern and southern parts. Analysis using tools such as kernel density estimation, the standard deviation ellipse, and the center-of-gravity model shows that the ICH in the basin exhibits varying degrees of sub-type aggregation, differing directional characteristics across batches of ICH, and a center of gravity that tends to shift in multiple directions. The main factors affecting the spatio-temporal distribution pattern of ICH in the Beijing–Hangzhou Grand Canal Basin are natural geographical factors, socioeconomic factors, and policy environment factors. Moreover, there is a significant positive correlation between ICH resources and the tourism industry that cannot be ignored. This study provides an important reference for planning the reuse of ICH resource systems in northern and southern China. Full article
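The mean-center (center-of-gravity) and standard deviation ellipse tools named in the abstract above can be sketched directly. This is a minimal illustration on abstract (x, y) coordinates, not the study's ArcGIS workflow: the rotation angle here uses a simple `atan2` formulation of the principal direction rather than ArcGIS's exact standard deviational ellipse formula, and the point set is hypothetical.

```python
import math

def mean_center(points):
    """Center of gravity of a set of (x, y) points."""
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

def std_dev_ellipse(points):
    """Approximate standard deviational ellipse: center, rotation, axis spreads."""
    cx, cy = mean_center(points)
    dx = [x - cx for x, _ in points]
    dy = [y - cy for _, y in points]
    sxx = sum(d * d for d in dx)
    syy = sum(d * d for d in dy)
    sxy = sum(a * b for a, b in zip(dx, dy))
    # Orientation of the major axis (principal direction of the point cloud)
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    n = len(points)
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    sig_x = math.sqrt(sum((a * cos_t + b * sin_t) ** 2
                          for a, b in zip(dx, dy)) / n)
    sig_y = math.sqrt(sum((-a * sin_t + b * cos_t) ** 2
                          for a, b in zip(dx, dy)) / n)
    return (cx, cy), theta, (sig_x, sig_y)
```

Tracking how the returned center and orientation change between batches of heritage items is what reveals the multi-directional shift of the center of gravity described in the abstract.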
