Search Results (83)

Search Parameters:
Keywords = warehouse organization

9 pages, 196 KB  
Brief Report
Assessing the Frequency, Prescribing Patterns, and Characteristics of Patients Receiving Drugs with Pharmacogenomic (PGx) Guidelines Through an EMR: Follow-Up Analysis 5 Years Later
by George E. MacKinnon, Megan Mills and Ulrich Broeckel
Pharmacy 2026, 14(2), 53; https://doi.org/10.3390/pharmacy14020053 - 25 Mar 2026
Viewed by 64
Abstract
(1) Background: This follow-up retrospective analysis used electronic medical record (EMR) data from a health system to identify patients and medications prescribed in accordance with Clinical Pharmacogenetics Implementation Consortium (CPIC) guidelines. (2) Methods: This analysis included EMR data from a clinical research data warehouse encompassing 928,291 patients seen at an academic medical center between 2020 and 2024. The study evaluated 75 commercially available medications linked to 52 evidence-based CPIC pharmacogenomic (PGx) guidelines. (3) Results: Among the 928,291 patients, 709,673 medication orders were recorded, and 416,621 patients (44.8%) were prescribed at least 1 of the 75 CPIC-associated medications. By comparison, 845,518 patients had an encounter in 2015–2019, with 590,526 medication orders, and 335,849 patients (56.9%) had orders for CPIC-associated medications. One to three CPIC-associated medications accounted for 76.6% of patients in 2020–2024 compared to 75.6% in 2015–2019. (4) Conclusions: The findings demonstrate that the proportion of patients prescribed a CPIC-actionable medication remained just under half of those evaluated within a single institution’s EMR. About three-quarters of patients over the ten-year period had between one and three CPIC-associated medications identified, and the top five classes of medications remained the same in the two periods. This understanding of patient volume may help organizations as they begin to assess the implementation of PGx services.
(This article belongs to the Section Pharmacy Practice and Practice-Based Research)
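For readers who want to reproduce this kind of tally against their own EMR extract, a minimal pandas sketch follows; the table layout, column names, and the medication subset are illustrative assumptions, not the study's actual data model.

```python
import pandas as pd

# Hypothetical medication-order extract; real EMR schemas will differ.
orders = pd.DataFrame({
    "patient_id": [1, 1, 2, 3, 3, 3, 4],
    "medication": ["clopidogrel", "simvastatin", "ibuprofen",
                   "tacrolimus", "warfarin", "codeine", "metformin"],
})

# Medications covered by CPIC guidelines (small subset for illustration).
cpic_meds = {"clopidogrel", "simvastatin", "tacrolimus", "warfarin", "codeine"}

cpic_orders = orders[orders["medication"].isin(cpic_meds)]
meds_per_patient = cpic_orders.groupby("patient_id")["medication"].nunique()

n_patients = orders["patient_id"].nunique()
n_exposed = meds_per_patient.size
print(f"{n_exposed}/{n_patients} patients ({n_exposed / n_patients:.1%}) "
      "had >=1 CPIC-associated medication order")
# Share of exposed patients with one to three CPIC-associated medications:
print(f"{meds_per_patient.between(1, 3).mean():.1%} had 1-3 such medications")
```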
16 pages, 252 KB  
Review
The Role of Digitalization in Implementing Green Logistics Principles in Warehousing Operations: A Case Study
by Diana Šateikiene and Juliana Kovalevskaja
World 2026, 7(3), 43; https://doi.org/10.3390/world7030043 - 10 Mar 2026
Viewed by 351
Abstract
Warehouses are energy-intensive nodes in a logistics chain and critical hotspots for decarbonization efforts. Digitalization and Industry 4.0 technologies are increasingly promoted as enablers of greener warehousing; however, environmental benefits are often implied rather than empirically quantified. This study examines how digitalization, automation, and robotization support the implementation of green logistics principles in warehousing operations. The research combines a scientific literature review and document content analysis with semi-structured interviews with company managers and logistics professionals. The results indicate that implementing a warehouse management system (Vision Equinox), integrating information systems, and adopting RFID technology reduce paper-based processes, improve picking accuracy and internal routing, shorten loading and unloading times, and may decrease the risk of human error. Consequently, these technologies enable more efficient resource use and can contribute to lower energy consumption and a reduced environmental footprint associated with warehouse activities. The study concludes that digital technologies already serve as a systematic enabler of green logistics within the organization; however, their environmental benefits have not yet been quantified. Future research should therefore focus on measuring changes in energy use and CO2 emissions under different warehousing scenarios.
37 pages, 1099 KB  
Review
Deep Learning for e-Commerce: Recent Developments in Prediction, Personalization and Decision Intelligence
by Georgios Kostopoulos, Antonia Stefani, Vasilios Vasiliadis and Sotiris Kotsiantis
Appl. Sci. 2026, 16(5), 2263; https://doi.org/10.3390/app16052263 - 26 Feb 2026
Viewed by 583
Abstract
The rapid expansion of global e-commerce platforms has led to unprecedented volumes of heterogeneous, multimodal, and continuously evolving data, creating significant challenges for prediction, personalization, trust, and operational decision-making. Deep Learning has emerged as a core enabling technology for addressing these challenges, offering powerful representation learning, sequential reasoning, graph-based inference, and decision-centric optimization capabilities. This survey provides a comprehensive and decision-oriented review of recent advances in Deep Learning for e-commerce, covering consumer behavior prediction, demand forecasting, recommendation systems, sentiment and review intelligence, catalogue understanding, fraud detection, cybersecurity, and large-scale operational optimization. Beyond predictive and personalization tasks, the survey emphasizes decision intelligence, highlighting the growing role of Reinforcement Learning and integrated Artificial Intelligence systems in pricing, logistics, warehouse automation, and platform reliability. We organize the literature according to key e-commerce objectives and operational contexts, analyze methodological trends and deployment challenges, and discuss limitations related to scalability, robustness, interpretability, and cross-border adaptability. Finally, we identify open research directions toward unified multimodal foundation models, culturally adaptive intelligence, and trustworthy, sustainable Artificial Intelligence systems for next-generation e-commerce platforms.
(This article belongs to the Section Computing and Artificial Intelligence)
18 pages, 1070 KB  
Article
Predicting Toxicities and Survival Outcomes in De Novo Metastatic Hormone-Sensitive Prostate Cancer Using Clinical Features, Routine Blood Tests and Their Early Variations
by Giuseppe Salfi, Martino Pedrani, Amos Colombo, Lorenzo Ruinelli, Daniele Brenna, Chiara Maria Agrippina Clerici, Giovanna Pecoraro, Sara Merler, Caroline-Claudia Erhart, Marialuisa Puglisi, Fabio Turco, Luigi Tortola, Ursula Vogl, Silke Gillessen and Ricardo Pereira Mestre
Cancers 2025, 17(23), 3806; https://doi.org/10.3390/cancers17233806 - 27 Nov 2025
Viewed by 852
Abstract
Background: Conventional prognostic factors are typically assessed at diagnosis in metastatic hormone-sensitive prostate cancer (mHSPC). However, variations in vital signs and laboratory parameters occur during systemic treatment and may predict patients’ prognosis and anticipate organ-specific toxicity development. Methods: This single-center retrospective study included 363 patients with de novo mHSPC treated between 2014 and 2023. Clinical and laboratory data were systematically collected from the hospital data warehouse, from treatment initiation through the following seven months. Variations in vital parameters and blood test results were graded using CTCAE V5.0 (dynamic variables). Cox regression analyses were performed to explore the impact of dynamic variables on progression-free survival (PFS) and overall survival (OS). Machine learning (ML) models (Support Vector Classifier, Random Forest, and LGBM Classifier) were developed to predict single organ-specific toxicities and to identify good and poor responders based on 7-month PSA levels, PFS and OS. We compared the performance of ML models trained only on baseline factors (static models) with that of models integrating variables generated by vital-sign and blood-test monitoring within 3 and 7 months from treatment start (dynamic models). Results: Dynamic models failed to improve the prediction of single organ-specific toxicities. Univariable Cox analysis revealed that the development of hematological, liver, and kidney-related toxicity, as well as the development of electrolyte disturbances within 3 or 7 months, was associated with shorter PFS (p = 0.011, 0.007, 0.174, and 0.02, respectively) and/or OS (p = 0.001, 0.099, 0.012, and 0.001, respectively). In multivariable Cox analysis, increasing alkaline phosphatase levels (HR = 1.93, p = 0.009), decreasing albumin (HR = 1.92, p = 0.008) and development of hyponatremia (HR = 1.79, p = 0.033) were associated with a shorter OS. The combination of static and dynamic variables significantly improved the ability of ML models to identify poor responders (shorter PFS: AUC range 0.91–0.94 vs. 0.79–0.89). Conclusions: The integration of conventional prognostic factors with the detection of significant changes in vital signs and blood tests occurring early during systemic treatment in patients with de novo mHSPC may enhance patient stratification and improve prediction of survival outcomes. Multicenter validation studies are needed to confirm these results.
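The static-versus-dynamic comparison described in the abstract can be sketched with scikit-learn as follows; the data are synthetic and the feature names are assumptions, so this only illustrates the evaluation design, not the authors' models.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 363  # cohort size from the abstract; the features below are synthetic

# Static (baseline) features, e.g. age, PSA, metastatic burden at diagnosis.
X_static = rng.normal(size=(n, 5))
# Dynamic features, e.g. CTCAE-graded changes in ALP, albumin, sodium
# within 3-7 months of treatment start.
X_dynamic = rng.normal(size=(n, 4))
# Outcome: poor responder (synthetic label correlated with dynamic features).
y = (X_dynamic[:, 0] + 0.5 * X_static[:, 0] + rng.normal(size=n) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
auc_static = cross_val_score(clf, X_static, y, cv=5, scoring="roc_auc").mean()
auc_combined = cross_val_score(
    clf, np.hstack([X_static, X_dynamic]), y, cv=5, scoring="roc_auc").mean()
print(f"static AUC {auc_static:.2f} vs static+dynamic AUC {auc_combined:.2f}")
```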
25 pages, 3571 KB  
Article
GenAI Technology Approach for Sustainable Warehouse Management Operations: A Case Study from the Automotive Sector
by Sorina Moica, Tripon Lucian, Vassilis Kostopoulos, Adrian Gligor and Noha A. Mostafa
Sustainability 2025, 17(20), 9081; https://doi.org/10.3390/su17209081 - 14 Oct 2025
Viewed by 2594
Abstract
The emergence of Generative Artificial Intelligence (GenAI) is reshaping logistics and supply chain operations, offering new opportunities to improve efficiency, accuracy, and responsiveness. In the automotive manufacturing sector, where high-volume throughput and precision are critical, the integration of AI technologies into warehouse management represents a strategic advancement. This study presents a case analysis of the implementation of AI-driven reception processes at an automotive facility in Blaj, Romania. The research focuses on the transition from manual operations to automated recognition using industrial-grade imaging systems integrated with enterprise resource planning platforms. The integrated approach combines Value Stream Mapping, quantitative performance analysis, and statistical validation using the Wilcoxon Signed-Rank Test. The results reveal a substantial reduction in reception time of up to 79%, together with significant cost savings across various operational scales, improved data accuracy, and fewer logistics failures. To support broader industry adoption, the study proposes a Cleaner Logistics and Supply Chain Model, incorporating principles of sustainability, ethical compliance, and continuous improvement. This model serves as a strategic framework for organizations seeking to align AI adoption with long-term operational resilience and environmental responsibility. The findings validate the operational and financial advantages of AI-enabled warehouse management in achieving sustainable digital transformation in logistics.
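A minimal sketch of the statistical validation step, using scipy's Wilcoxon signed-rank test on paired reception times; the measurements are invented for illustration and merely echo the roughly 79% reduction reported.

```python
from scipy.stats import wilcoxon

# Paired reception times (minutes) before and after AI-assisted reception;
# values are invented for illustration only.
before = [42, 55, 38, 61, 47, 53, 66, 49, 58, 44]
after  = [ 9, 12,  8, 14, 10, 11, 15, 10, 13,  9]

stat, p = wilcoxon(before, after)  # paired, non-parametric test
reduction = 1 - sum(after) / sum(before)
print(f"Wilcoxon W={stat:.1f}, p={p:.4f}, mean reduction ~{reduction:.0%}")
```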
31 pages, 834 KB  
Article
A Systematic Lean-Driven Framework for Warehouse Optimization
by Bruno J. B. Julião, Marco S. Reis and Belmiro P. M. Duarte
Systems 2025, 13(9), 813; https://doi.org/10.3390/systems13090813 - 17 Sep 2025
Cited by 3 | Viewed by 6885
Abstract
Optimizing warehouse operations is a strategic priority for ensuring the timely and efficient flow of materials in industrial environments. In contexts with limited digital infrastructure, organizations often face persistent challenges such as inefficient picking, poor material traceability, and suboptimal space utilization, ultimately leading to productivity losses and operational delays. This paper introduces a systematic, lean-driven framework for warehouse optimization, structured around a sequential methodology involving Define, Improve, and Control. The approach begins with a comprehensive diagnostic phase to evaluate the current state and identify performance gaps. It then guides the development and implementation of targeted interventions aimed at eliminating waste, standardizing operations, and aligning resources with value-added activities. Finally, the framework supports long-term sustainability through continuous monitoring, process standardization, and performance control. The methodology is validated through its application in a parts warehouse within the glass transformation industry, highlighting its adaptability, practical relevance, and capacity to generate meaningful improvements, even in low-digitalization environments. The framework offers a scalable solution for organizations seeking to enhance warehouse performance through structured lean practices.
15 pages, 2172 KB  
Communication
Triangulating Timing, Tropism and Burden of Sarcoma Metastases: Toward Precision Surveillance and Therapy in a Real-World-Time Cohort
by Philip Heesen, Dario Feusi, Bettina Vogel, Gabriela Studer, Bruno Fuchs and on behalf of the Swiss Sarcoma Network
Cancers 2025, 17(18), 2944; https://doi.org/10.3390/cancers17182944 - 9 Sep 2025
Viewed by 845
Abstract
Background: Sarcoma surveillance guidelines still apply uniform imaging intervals based on tumor grade and stage that ignore histotype-specific metastatic behavior. We prospectively analyzed metastatic timing, organ tropism, and lesion burden across a real-world sarcoma cohort to generate an evidence base for risk-adapted follow-up and treatment stratification. Methods: In a prospective multicenter study, 1850 patients with suspected sarcoma were screened. SHAPEHub, a real-world-time data warehouse, captured clinicopathological variables and imaging. Adults with histologically confirmed soft-tissue or bone sarcoma (n = 295) formed the analytic cohort. Metastases were classified as synchronous (≤6 months) or metachronous (>6 months), lung-only versus multi-organ, and oligometastatic (≤5 lesions, ≤2 organs) versus polymetastatic. Time to metastatic event (TTME) was illustrated with Kaplan–Meier curves for the full cohort (descriptive); where subgroup comparisons are shown, log-rank tests are reported. Results: Ninety-three patients (31.5%) developed metastases after a median follow-up of 20.9 months. Metastatic risk was front-loaded: 36.6% were synchronous, and 67.8% of metachronous events occurred within year 1. The lung was the initial site in 62.4% of events, bone in 18.3%, and liver in 11.8%. Half of the lung-metastatic patients remained pulmonary-confined; the remainder followed a multi-organ route involving bone and lymph nodes. Oligometastatic spread predominated in the lung-only subgroup (61%) versus multi-organ (28%). Histotype influenced both timing and tropism: angiosarcoma and Ewing sarcoma metastasized earliest (median 3.7 and 5.0 months) and in a multi-organ pattern; leiomyosarcoma and undifferentiated pleomorphic sarcoma (UPS) were lung-dominant; Ewing sarcoma and epithelioid haemangioendothelioma were bone-tropic; and angiosarcoma was liver-tropic. Conclusions: Metastatic sarcoma displays three intersecting dimensions—early versus late onset, organ-specific tropism, and oligo- versus polymetastatic burden—none of which are addressed by the current “one-size-fits-all” surveillance. Recognizing these patterns delineates windows for tailored imaging and stratified therapy selection (e.g., local ablation for oligometastatic lung disease, intensified systemic regimens for early, polymetastatic spread). These findings lay the groundwork for precision-adapted surveillance and treatment protocols. Pattern-stratified trials and health-economic evaluations are now needed to assess whether this approach improves outcomes and optimizes resource allocation.
(This article belongs to the Section Methods and Technologies Development)
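The descriptive Kaplan–Meier and log-rank analysis can be sketched with the lifelines library (one common choice); the durations and censoring flags below are synthetic stand-ins, not SHAPEHub data.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)

# Synthetic time-to-metastatic-event data (months) for two histotype groups.
t_early = rng.exponential(6.0, size=40)    # e.g. angiosarcoma / Ewing-like
t_late = rng.exponential(24.0, size=60)    # e.g. lung-dominant histotypes
e_early = rng.random(40) < 0.8             # True = metastasis observed
e_late = rng.random(60) < 0.5              # False = censored at last follow-up

kmf = KaplanMeierFitter()
kmf.fit(t_early, event_observed=e_early, label="early-metastasizing")
print(f"median TTME: {kmf.median_survival_time_:.1f} months")

res = logrank_test(t_early, t_late,
                   event_observed_A=e_early, event_observed_B=e_late)
print(f"log-rank p = {res.p_value:.4f}")
```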
8 pages, 206 KB  
Article
Long COVID Frailty: A Comparative Analysis in a Veteran Population
by Jerry Bradley, Elizabeth Bast, Natasha M. Resendes, Fei Tang, Victor D. Cevallos, Dominique M. Tosi, Leonardo Tamariz, Ana Palacio and Iriana S. Hammel
COVID 2025, 5(8), 136; https://doi.org/10.3390/covid5080136 - 16 Aug 2025
Viewed by 898
Abstract
Long COVID is characterized by persistent symptoms affecting one or more organ systems for at least 3 months following a SARS-CoV-2 infection. Our study aimed to examine the characteristics of frailty seen in patients with Long COVID compared to the frailty seen in aging patients with multimorbidity. This is a retrospective cohort study conducted at the Miami Veterans Affairs Medical Center (VAMC). The data used to calculate the Fried phenotype through the Johns Hopkins frailty calculator were collected from two separate clinics: a Long COVID clinic and a geriatric frailty clinic. We obtained the VA Frailty Index from the VA Corporate Data Warehouse (CDW). We included 106 patients from the Long COVID clinic and 97 from the frailty clinic. Patients from the Long COVID clinic were significantly younger than those from the frailty clinic (60 ± 12.6 vs. 79.8 ± 5.8, p < 0.01). Patients with frailty in the Long COVID group experienced exhaustion (96.4% vs. 53.3%) and low activity (78.6% vs. 63.3%) at a higher rate than those in the geriatric frailty clinic. Long COVID may predispose patients to develop frailty that presents with a higher frequency of exhaustion and low activity.
(This article belongs to the Section Long COVID and Post-Acute Sequelae)
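A two-proportion test is one plausible way to compare the exhaustion rates quoted above; the denominators below are assumed solely to reproduce the reported percentages, so treat this as a sketch of the comparison, not the study's actual analysis.

```python
from statsmodels.stats.proportion import proportions_ztest

# Exhaustion among frail patients: Long COVID clinic vs geriatric clinic
# (rates from the abstract: 96.4% vs 53.3%; denominators are assumed here).
count = [27, 16]   # patients reporting exhaustion (assumed counts)
nobs  = [28, 30]   # frail patients assessed in each clinic (assumed)

z, p = proportions_ztest(count, nobs)
print(f"z = {z:.2f}, p = {p:.4f}")
```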
21 pages, 1245 KB  
Article
Geochemical Behaviour of Trace Elements in Diesel Oil-Contaminated Soil During Remediation Assisted by Mineral and Organic Sorbents
by Mirosław Wyszkowski and Natalia Kordala
Appl. Sci. 2025, 15(15), 8650; https://doi.org/10.3390/app15158650 - 5 Aug 2025
Cited by 2 | Viewed by 1120
Abstract
The topic of environmental pollution by petroleum products is highly relevant due to rapid urbanisation, including industrial development, road infrastructure and fuel distribution. Potential threat areas include refineries, fuel stations, pipelines, warehouses and transshipment bases, as well as sites affected by accidents or fuel spills. This study aimed to determine whether organic and mineral materials could mitigate the effects of diesel oil pollution on the soil’s trace element content. The materials used were compost, bentonite and calcium oxide. Diesel oil pollution had the most pronounced effect on the levels of Cd, Ni, Fe and Co. The levels of the first three elements increased, while the level of Co decreased by 53%. Lower doses of diesel oil (2.5 and 5 cm3 per kg of soil) induced an increase in the levels of the other trace elements, while higher doses caused a reduction, especially in Cr. All materials applied to the soil (compost, bentonite and calcium oxide) reduced the content of Ni, Cr and Fe. Compost and calcium oxide also increased Co accumulation in the soil. Bentonite had the strongest reducing effect on the Ni and Cr contents of the soil, reducing them by 42% and 53%, respectively. Meanwhile, calcium oxide had the strongest reducing effect on Fe and Co accumulation, reducing them by 12% and 31%, respectively. Inverse relationships were recorded for Cd (mainly bentonite), Pb (especially compost), Cu (mainly compost), Mn (mainly bentonite) and Zn (only compost) content in the soil. At the most contaminated site, the application of bentonite reduced the accumulation of Pb, Zn and Mn in the soil, while the application of compost reduced the accumulation of Cd. Applying various materials, particularly bentonite and compost, limits the content of certain trace elements in the soil. This has a positive impact on reducing the effect of minor diesel oil pollution on soil properties and can promote the proper growth of plant biomass.
15 pages, 5739 KB  
Article
Prevalence of Actionable Exposures to Pharmacogenetic Medications Among Solid Organ Transplant Recipients in a Population-Scale Biobank
by Alaa Radwan, Kimberly M. Deininger, Amrut V. Ambardekar, Heather D. Anderson, Nicholas Rafaels, Laura M. Saba, The Colorado Center for Personalized Medicine and Christina L. Aquilante
J. Pers. Med. 2025, 15(5), 185; https://doi.org/10.3390/jpm15050185 - 2 May 2025
Viewed by 1234
Abstract
Background/Objectives: Solid organ transplant (SOT) recipients are exposed to multiple medications, many of which have pharmacogenetic (PGx) prescribing recommendations. This study leveraged data from a population-scale biobank and an enterprise data warehouse to determine the prevalence of actionable exposures to PGx medications among kidney, heart, and lung transplant recipients during the first six months post-transplant. Methods: We conducted a retrospective analysis of adult SOT patients with genetic data available from the Colorado Center for Personalized Medicine (CCPM) biobank and clinical data from Health Data Compass (HDC). We evaluated 29 variants in 13 pharmacogenes and 42 Clinical Pharmacogenetics Implementation Consortium (CPIC) level A or B medications (i.e., sufficient evidence to recommend at least one prescribing action based on genetics). The primary outcome was actionable exposure to a PGx medication (i.e., actionable phenotype and a prescription for an affected PGx medication). Results: The study included 358 patients. All patients were prescribed at least one PGx medication, and 49.4% had at least one actionable exposure to a PGx medication during the first six months post-transplant. The frequency of actionable exposure was highest for tacrolimus (15.4%), followed by proton pump inhibitors (PPIs) (15.1%) and statins (12.8%). Statin actionable exposures significantly differed by transplant type, likely due to variations in prescribing patterns and actionable phenotypes for individual statins. Conclusions: Our findings highlight the potential clinical utility of PGx testing among SOT patients. Further studies are needed to address the impact on clinical outcomes and the optimal timing of PGx testing in the SOT population.
(This article belongs to the Section Pharmacogenetics)
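The core join logic (actionable phenotype combined with a prescription for an affected drug) might look like the following pandas sketch; all table layouts and gene-drug pairs here are illustrative assumptions rather than the CCPM/HDC schema.

```python
import pandas as pd

# Hypothetical extracts: actionable PGx phenotypes per patient (from the
# biobank) and post-transplant prescriptions (from the data warehouse).
phenotypes = pd.DataFrame({
    "patient_id": [1, 1, 2, 3],
    "gene": ["CYP3A5", "CYP2C19", "SLCO1B1", "CYP2C19"],
    "actionable": [True, True, True, False],
})
prescriptions = pd.DataFrame({
    "patient_id": [1, 2, 2, 3],
    "medication": ["tacrolimus", "atorvastatin", "omeprazole", "clopidogrel"],
})
# Gene-drug pairs with CPIC level A/B recommendations (subset, illustrative).
pairs = pd.DataFrame({
    "gene": ["CYP3A5", "SLCO1B1", "CYP2C19"],
    "medication": ["tacrolimus", "atorvastatin", "clopidogrel"],
})

# Actionable exposure = prescription for an affected drug AND an
# actionable phenotype for the matching gene in the same patient.
exposed = (prescriptions.merge(pairs, on="medication")
           .merge(phenotypes[phenotypes["actionable"]],
                  on=["patient_id", "gene"]))
print(exposed[["patient_id", "gene", "medication"]])
```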
26 pages, 3721 KB  
Article
Schema Understandability: A Comprehensive Empirical Study of Requirements Metrics
by Tanu Singh, Vinod Patidar, Manu Singh and Álvaro Rocha
Information 2025, 16(2), 155; https://doi.org/10.3390/info16020155 - 19 Feb 2025
Cited by 1 | Viewed by 2329
Abstract
Ensuring high-quality data warehouses is crucial for organizations, as they provide the reliable information needed for informed decision-making. While various methodologies emphasize the importance of requirements, conceptual, logical, and physical models in developing data warehouses, empirical quality assessment of these models remains underexplored, especially for requirements models. To bridge this gap, this study focuses on the assessment of requirements metrics for predicting the understandability of requirements schemas, a key indicator of model quality. In this empirical study, 28 requirements schemas were classified into understandable and non-understandable clusters using the k-means clustering technique. The study then employed six classification techniques—logistic regression, naive Bayes, linear discriminant analysis with decision tree, reinforcement learning, voting rule, and a hybrid approach—within both univariate and multivariate models to identify strong predictors of schema understandability. Results indicate that 13 out of 17 requirements metrics are robust predictors of schema understandability. Furthermore, a comparative performance analysis of the classification techniques reveals that the hybrid classifier outperforms the other techniques across key evaluation parameters, including accuracy, sensitivity, specificity, and AUC. These findings highlight the potential of requirements metrics as effective predictors of schema understandability, contributing to improved quality assessment and the development of better conceptual data models for data warehouses.
(This article belongs to the Special Issue Editorial Board Members’ Collection Series: "Information Systems")
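The two-step design (k-means clustering of schemas, then supervised prediction of cluster membership from the metrics) can be sketched with scikit-learn; the 28-by-17 metric matrix below is synthetic, standing in for the study's requirements metrics.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

# Synthetic stand-in for the 28 schemas x 17 requirements-metrics matrix.
X = rng.normal(size=(28, 17))
# Understandability score used only to form the two clusters (synthetic).
score = X[:, :3].sum(axis=1) + rng.normal(scale=0.5, size=28)

# Step 1: k-means splits schemas into understandable / non-understandable.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    score.reshape(-1, 1))

# Step 2: classify cluster membership from the metrics and report AUC.
auc = cross_val_score(LogisticRegression(max_iter=1000), X, labels,
                      cv=4, scoring="roc_auc").mean()
print(f"mean cross-validated AUC: {auc:.2f}")
```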
17 pages, 1250 KB  
Article
Quality Risk Management in the Final Operational Stage of Sterile Pharmaceutical Manufacturing: A Case Study Highlighting the Management of Sustainable Related Risks in Product Sterilization, Inspection, Labeling, Packaging, and Storage Processes
by Bassam Elmadhoun, Rawidh Alsaidalani and Frank Burczynski
Sustainability 2025, 17(4), 1670; https://doi.org/10.3390/su17041670 - 17 Feb 2025
Cited by 4 | Viewed by 8384
Abstract
Quality risk management, commonly known as QRM, is designed to systematically assess, control, communicate, and review potential risks at every stage of the pharmaceutical manufacturing process. The preservation of consistent product quality across the entirety of the product’s life cycle is of paramount importance. The aim of this article is to formulate a best practice guide that will assist pharmaceutical manufacturers in comprehending and implementing the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH) Q9: quality risk management principles. A widely recognized methodology for defining and monitoring risk mitigation strategies within the pharmaceutical sector is the Failure Mode and Effects Analysis (FMEA). ICH Q9 does not, however, offer detailed instructions for applying FMEA to real-world pharmaceutical situations. We previously provided real-world case studies that identify and mitigate risks in the early stages of the manufacturing process of sterile products, such as (1) supply chain and procurement; (2) logistics and warehousing; (3) raw material dispensing; (4) glass bottle washing and handling; (5) product filling; and (6) final product receiving and handling. The final steps of the sterile manufacturing process are the subject of the case study we present in this paper. We identify and control the risks related to (I) product sterilization; (II) product inspection, labeling, and packaging; (III) the finished product’s transfer to storage; and (IV) storing finished products in a warehouse. To optimize decision-making and reduce the risk of regulatory noncompliance, this case study describes a proactive strategy for the identification, management, and communication of risks associated with crucial tasks. While each organization’s products and methods are distinct, with varying tolerances for risk, certain stages and associated risks are common. Consequently, the examples provided here offer relevant insights into any pharmaceutical production environment. Managing sustainability-related risks and ensuring the transparency of pharmaceutical operations are now key to success. Left unmanaged, these risks can cause serious operational problems, reputational damage, and environmental and public-health harm.
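FMEA typically scores each failure mode with a risk priority number, RPN = severity × occurrence × detectability, each rated on a 1-10 scale; a minimal sketch follows, with failure modes and ratings invented for illustration rather than taken from the case study.

```python
# Minimal FMEA scoring sketch: the risk priority number (RPN) is the
# product of severity (S), occurrence (O), and detectability (D).
# Failure modes and ratings below are illustrative, not from the study.
failure_modes = [
    ("autoclave cycle interrupted during sterilization", 9, 2, 3),
    ("label mix-up during packaging", 8, 3, 4),
    ("temperature excursion in warehouse storage", 7, 4, 2),
]

THRESHOLD = 80  # action threshold; each organization sets its own tolerance
for name, s, o, d in sorted(failure_modes, key=lambda m: -(m[1] * m[2] * m[3])):
    rpn = s * o * d
    flag = "MITIGATE" if rpn >= THRESHOLD else "monitor"
    print(f"RPN {rpn:3d} [{flag}] {name}")
```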
19 pages, 653 KB  
Review
Revolutionizing Supply Chains: Unleashing the Power of AI-Driven Intelligent Automation and Real-Time Information Flow
by Mohammad Shamsuddoha, Eijaz Ahmed Khan, Md Maruf Hossan Chowdhury and Tasnuba Nasir
Information 2025, 16(1), 26; https://doi.org/10.3390/info16010026 - 6 Jan 2025
Cited by 27 | Viewed by 22320
Abstract
Artificial intelligence (AI) and smart automation are revolutionizing the global supply chain ecosystem at an accelerated pace, providing tremendous potential for resilience, innovation, efficacy, and profitability. This paper examines how AI, machine learning (ML), and robotic process automation (RPA) help supply chain operations adjust to risks and vulnerabilities. It focuses on how AI and other relevant technologies can enhance forecasting to predict actual demand, expedite logistics, increase warehouse efficiency, and support real-time decision-making. This study uses thematic analysis to identify AI-driven supply chain applications (logistics optimization, demand forecasting, and risk mitigation, among others) across 383 peer-reviewed articles (2017–2024). It provides a strategic framework for dealing with vulnerabilities, operational excellence, and resilient solutions. Additionally, the research investigates how AI contributes to supply chain resilience by predicting disruptions and automating risk mitigation strategies. The paper identifies critical success factors and challenges in adopting intelligent automation by analyzing real-world industry implementations. The findings inform a strategic framework for organizations aiming to leverage AI to achieve operational excellence, agility, and real-time information flow for effective decision-making.
(This article belongs to the Special Issue Feature Papers in Artificial Intelligence 2024)
17 pages, 2091 KB  
Article
The Assessment of the Influence of Low-Frequency Electromagnetic Fields Originated from the Power Infrastructure on Humans’ Health
by Leszek Sławomir Litzbarski, Marek Olesz, Grzegorz Redlarski, Piotr Mateusz Tojza, Arkadiusz Żak, Emanuel Gifuni, Zuzanna Cieślikowska and Mieszko Czapliński
Appl. Sci. 2024, 14(21), 9668; https://doi.org/10.3390/app14219668 - 23 Oct 2024
Cited by 2 | Viewed by 4186
Abstract
The objective of this study is to assess the impact of low-frequency electromagnetic fields (LF EMFs) generated by power infrastructure on the nearby environment. Measurements of electric (E) and magnetic (H) field intensities were conducted around high-voltage power lines, transformer stations and related facilities. Numerical simulations were also performed to model the distribution of field values around real buildings in close proximity to power delivery systems. Given the ongoing scientific debate regarding the effects of EMFs on living organisms, the current analysis was based on existing standards, particularly the ICNIRP 2010 guidelines, which set the maximum allowable E and magnetic induction (B) values at 5 kV/m and 200 μT, respectively. Stricter national regulations were also examined, such as Poland’s 1 kV/m E limit in residential areas and Belgium’s 10 μT limit for B. The results showed that while most cases complied with the ICNIRP 2010 standards, certain stricter local regulations were exceeded: 9 of 14 cases exceeded Poland’s E limit, and 8 failed to meet Belgium’s B requirement. The ICNIRP limit for B was exceeded at only one location, a warehouse near 110 kV power lines, in a critical case. These findings underscore the variability in regulatory standards and highlight the need for localized assessments of EMF exposure.
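Checking measurements against the layered limits cited in the abstract reduces to a simple comparison; the sketch below hard-codes those published limits, while the measurement values themselves are illustrative.

```python
# Compliance check against the exposure limits cited in the abstract:
# ICNIRP 2010 (E <= 5 kV/m, B <= 200 uT), Poland (E <= 1 kV/m, residential),
# Belgium (B <= 10 uT). Measurement values below are illustrative.
LIMITS = {"ICNIRP E": 5.0, "Poland E": 1.0,       # kV/m
          "ICNIRP B": 200.0, "Belgium B": 10.0}   # uT

measurements = [  # (site, E in kV/m, B in uT)
    ("under 110 kV line span", 1.8, 12.5),
    ("residential facade", 0.4, 3.2),
    ("warehouse near 110 kV lines", 4.1, 230.0),
]

for site, e_field, b_field in measurements:
    checks = {"ICNIRP E": e_field, "Poland E": e_field,
              "ICNIRP B": b_field, "Belgium B": b_field}
    exceeded = [k for k, v in checks.items() if v > LIMITS[k]]
    print(f"{site}: exceeds {exceeded or 'no limits'}")
```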
24 pages, 696 KB  
Article
A Performance Analysis of Hybrid and Columnar Cloud Databases for Efficient Schema Design in Distributed Data Warehouse as a Service
by Fred Eduardo Revoredo Rabelo Ferreira and Robson do Nascimento Fidalgo
Data 2024, 9(8), 99; https://doi.org/10.3390/data9080099 - 5 Aug 2024
Cited by 4 | Viewed by 5200
Abstract
A Data Warehouse (DW) is a centralized database that stores large volumes of historical data for analysis and reporting. In a world where enterprise data grows exponentially, new architectures are being investigated to overcome the deficiencies of traditional Database Management Systems (DBMSs), driving a shift towards more modern, cloud-based solutions that provide resources such as distributed processing, columnar storage, and horizontal scalability without the overhead of physical hardware management, i.e., a Database as a Service (DBaaS). Choosing the appropriate class of DBMS is a critical decision for organizations, and there are important differences that impact data volume and query performance (e.g., architecture, data models, and storage) to support analytics in a distributed cloud environment efficiently. In this sense, we carry out an experimental evaluation to analyze the performance of several DBaaS offerings and the impact of data modeling, specifically the usage of a partially normalized Star Schema and a fully denormalized Flat Table Schema, to further comprehend their behavior in different configurations and designs in terms of data schema, storage form, memory availability, and cluster size. The analysis is done on two volumes of data generated by a well-established benchmark, comparing the performance of the DW in terms of average execution time, memory usage, data volume, and loading time. Our results provide guidelines for efficient DW design, showing, for example, that denormalization of the schema does not guarantee improved performance, as solutions performed differently depending on their architecture. We also show that a Hybrid Processing (HTAP) NewSQL solution can outperform solutions that support only Online Analytical Processing (OLAP) in terms of overall execution time, but that the performance of each query is deeply influenced by its selectivity and by the number of join operations.
(This article belongs to the Section Information Systems and Data Management)
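The Star Schema versus Flat Table distinction can be illustrated with a toy pandas example; this shows only the modeling difference (join at query time versus up-front denormalization), not the benchmark or the cloud DBaaS behavior measured in the paper.

```python
import pandas as pd

# Toy star schema: a fact table with keys into a small dimension table.
dim_customer = pd.DataFrame({
    "customer_key": [1, 2, 3],
    "region": ["EU", "US", "EU"],
})
fact_sales = pd.DataFrame({
    "customer_key": [1, 2, 3, 1, 2],
    "revenue": [100.0, 250.0, 80.0, 120.0, 60.0],
})

# Star Schema query: join at query time, then aggregate.
star = (fact_sales.merge(dim_customer, on="customer_key")
        .groupby("region")["revenue"].sum())

# Flat Table Schema: the same data fully denormalized up front;
# the query skips the join, but every row carries the wider record.
flat = fact_sales.merge(dim_customer, on="customer_key")
flat_agg = flat.groupby("region")["revenue"].sum()

assert star.equals(flat_agg)  # identical answers, different storage layouts
print(star)
```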