Search Results (16)

Search Parameters:
Keywords = Hilbert space filling curve

22 pages, 957 KB  
Article
Strategic Capacity Planning Algorithm for Last-Mile Delivery Under High-Volume Demand Surges
by Didar Yedilkhan, Aidarbek Shalakhmetov, Bakbergen Mendaliyev and Nursultan Khaimuldin
Algorithms 2026, 19(4), 319; https://doi.org/10.3390/a19040319 - 18 Apr 2026
Viewed by 280
Abstract
Last-mile delivery companies can face demand surges where large-volume order requests exceed daily courier capacity. In such cases, fast and robust feasibility-first planning becomes more practical and valuable than building optimal routes. This paper proposes a hierarchical, computationally feasible decomposition pipeline that produces shift-feasible clusters under a strict shift-duration limit using travel-time-based duration estimates. While decomposition methods for large-scale VRPs are well established, they typically remain oriented toward route-construction quality within a single operational day or toward balancing customer counts, demand, or Euclidean territory partitions. In contrast, the proposed method targets a different decision problem: rapid feasibility-first strategic capacity planning for one-time extreme demand surges, where the primary requirement is to estimate, within seconds, a conservative upper bound on the number of courier shifts under a strict shift-duration limit. When end-to-end latency is evaluated from raw geographic points, including distance-matrix preparation for monolithic baselines, the proposed pipeline is 187 to 1315 times faster than matrix-based monolithic optimization on the common benchmark sizes. Methodologically, the contribution lies in combining (i) topology-preserving spatial linearization with a Hilbert Space-Filling Curve, (ii) adaptive greedy microclustering driven by empirical travel-time quantiles, and (iii) a lexicographic dynamic-programming merge that minimizes the number of shifts first and total travel time second. This yields a planning-oriented decomposition mechanism that is distinct from classical route-quality-centered hierarchical VRP approaches. Full article
(This article belongs to the Section Combinatorial Optimization, Graph, and Network Algorithms)
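The Hilbert linearization in step (i) can be illustrated with the classic bitwise index computation. This is a generic sketch (grid side `n` assumed a power of two, stop coordinates pre-quantized to grid cells), not the authors' implementation:

```python
def xy2d(n, x, y):
    """Distance d along the n x n Hilbert curve for grid cell (x, y).

    Classic bitwise algorithm; n must be a power of two.
    """
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if x & s else 0
        ry = 1 if y & s else 0
        d += s * s * ((3 * rx) ^ ry)
        # Rotate/flip the quadrant so the base pattern repeats at every scale.
        if ry == 0:
            if rx == 1:
                x = n - 1 - x
                y = n - 1 - y
            x, y = y, x
        s //= 2
    return d

# Sorting delivery points by Hilbert index keeps spatial neighbours adjacent
# in the 1-D order, which is what makes greedy microclustering cheap.
stops = [(3, 1), (0, 0), (1, 2), (2, 3)]
ordered = sorted(stops, key=lambda p: xy2d(4, p[0], p[1]))
```

Consecutive runs of this 1-D order can then be cut into candidate clusters before the travel-time feasibility checks.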

13 pages, 2648 KB  
Article
Virtual Optical Waveguides for Particle Transport and Sorting
by Liuhao Zhu, Xiaohe Zhang, Xiang Zang, Jun He, Bing Gu and Xi Xie
Photonics 2026, 13(4), 378; https://doi.org/10.3390/photonics13040378 - 16 Apr 2026
Viewed by 456
Abstract
Precise manipulation and directed transport of micro- and nano-particles are cornerstones of emerging lab-on-a-chip technologies. Traditional optofluidic systems that combine optical tweezers with microfluidic channels enable long-range transport. However, they rely on fixed physical boundaries that lack reconfigurability. To bridge this gap, we propose a reconfigurable virtual optical waveguide (VOW) based on a discretized beam-shaping strategy. By superposing two orthogonally polarized shaped beams, we construct interference-free optical channels without physical boundaries. This platform enables programmable transport along complex trajectories, including space-filling Hilbert curves that maximize interaction path length, and shields the transport channel from perturbations induced by surrounding particles. Crucially, the VOW offers multi-dimensional sorting capabilities: (i) it performs precise size-dependent sieving via tunable channel widths, and (ii) it functions as an intrinsic material filter by stably guiding scattering-dominated particles (e.g., gold) while rejecting gradient-dominated dielectric ones. This work establishes a versatile, contactless strategy for adaptive optical logistics and on-chip material purification. Full article
(This article belongs to the Special Issue Advances in Spin-Orbit Coupling of Light)

23 pages, 17441 KB  
Article
A Method for Automated Crop Health Monitoring in Large Areas Using Multi-Spectral Images and Deep Convolutional Neural Networks
by Oscar Andrés Martínez, Kevin David Ortega Quiñones and German Andrés Holguin-Londoño
AgriEngineering 2026, 8(3), 109; https://doi.org/10.3390/agriengineering8030109 - 13 Mar 2026
Viewed by 563
Abstract
Crop monitoring over large land extensions represents a central challenge in precision agriculture, especially in polyculture contexts where species with different nutritional needs are combined. This study presents a methodology to manage and analyze large volumes of multispectral images captured by unmanned aerial vehicles (UAVs) in order to identify and monitor crops at the plant level. The images are efficiently stored and retrieved using a Hilbert curve, which reduces the complexity of the search process from O(n²) to O(log n), where n represents the number of indexed data points. The system connects to a distributed Structured Query Language (SQL) database, allowing for fast image retrieval based on GPS coordinates and other metadata. Additionally, the Normalized Difference Vegetation Index (NDVI) is calculated using reflectance data from the red and near-infrared channels, adjusted by semantic segmentation masks generated with a U-Net model, which allows for species-specific evaluations. The methodology was evaluated on a 20,000 m² polyculture farm with coffee, avocado, and plantain crops, using a dataset of 270 aerial images partitioned into 70% for training and 30% for validation. The results show improvements in retrieval speed and precision with the Hilbert Space-Filling Curve (HSFC) approach, and an accuracy of 82.3% and a Mean Intersection over Union (MIoU) of 68.4% in species detection with the U-Net model. Overall, this integrated framework demonstrates scalable potential for precision agriculture in complex polyculture systems, facilitating efficient data management and targeted crop interventions. Full article
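The claimed O(log n) retrieval follows from keeping records sorted by their Hilbert key and binary-searching that order. A minimal sketch with hypothetical key values and image IDs, not the paper's database schema:

```python
import bisect

# Hypothetical pre-computed (hilbert_key, image_id) records, kept sorted by key.
records = [(3, "img_a"), (7, "img_b"), (12, "img_c"), (29, "img_d")]
keys = [k for k, _ in records]

def lookup(key):
    """Exact-key lookup by binary search: O(log n) instead of a linear scan."""
    i = bisect.bisect_left(keys, key)
    if i < len(keys) and keys[i] == key:
        return records[i][1]
    return None

def range_query(lo, hi):
    """All images with Hilbert key in [lo, hi]; because the curve preserves
    locality, a contiguous key range corresponds to a compact spatial region."""
    i = bisect.bisect_left(keys, lo)
    j = bisect.bisect_right(keys, hi)
    return [img for _, img in records[i:j]]
```

In the distributed SQL setting the same idea becomes an ordinary B-tree index on the Hilbert-key column.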

24 pages, 5019 KB  
Article
A Dual Stream Deep Learning Framework for Alzheimer’s Disease Detection Using MRI Sonification
by Nadia A. Mohsin and Mohammed H. Abdul Ameer
J. Imaging 2026, 12(1), 46; https://doi.org/10.3390/jimaging12010046 - 15 Jan 2026
Viewed by 635
Abstract
Alzheimer’s Disease (AD) is a progressive brain illness that affects millions of individuals across the world. It causes gradual damage to brain cells, leading to memory loss and cognitive dysfunction. Although Magnetic Resonance Imaging (MRI) is widely used in AD diagnosis, existing studies rely solely on visual representations, leaving alternative features unexplored. The objective of this study is to explore whether MRI sonification can provide complementary diagnostic information when combined with conventional image-based methods. In this study, we propose a novel dual-stream multimodal framework that integrates 2D MRI slices with their corresponding audio representations. MRI images are transformed into audio signals using multi-scale, multi-orientation Gabor filtering, followed by a Hilbert space-filling curve to preserve spatial locality. The image and sound modalities are processed using a lightweight CNN and YAMNet, respectively, then fused via logistic regression. The multimodal framework achieved its highest accuracy in distinguishing AD from Cognitively Normal (CN) subjects at 98.2%, with 94% for AD vs. Mild Cognitive Impairment (MCI) and 93.2% for MCI vs. CN. This work provides a new perspective and highlights the potential of audio transformation of imaging data for feature extraction and classification. Full article
(This article belongs to the Section AI in Imaging)

29 pages, 8032 KB  
Article
WH-MSDM: A W-Hilbert Curve-Based Multiscale Data Model for Spatial Indexing and Management of 3D Geological Blocks in Digital Earth Applications
by Genshen Chen, Gang Liu, Jiongqi Wu, Yang Dong, Zhiting Zhang, Xiangwu Zeng and Junping Xiong
Appl. Sci. 2025, 15(24), 13112; https://doi.org/10.3390/app152413112 - 12 Dec 2025
Viewed by 662
Abstract
Multiscale 3D geological characterization and joint analysis are increasingly important topics in spatial information science. However, the non-uniform spatial distribution of objects and scale heterogeneity in geological surveys lead to dispersed storage, long access paths, and limited query performance in managing multiscale 3D geological model data. This study presents a W-Hilbert curve-based multiscale data model (WH-MSDM) that improves data indexing and management through a unified data structure (UDS) for multi-scale blocks and a bidirectional mapping model (BMM) linking spatial coordinates to memory locations. It supports spatial, attribute, hybrid, and cross-scale queries for diverse retrieval tasks. By exploiting the space-filling properties of the W-Hilbert curve to linearize multidimensional geological data into a one-dimensional index, it preserves locality and increases query efficiency across scales. Experimental results on a real 3D geological model demonstrate that WH-MSDM outperforms three mainstream baselines in both unified data organization and diverse query workloads. It thus provides a data-model foundation for Digital Earth-oriented multiscale geological analysis. Full article

30 pages, 4790 KB  
Article
LDS3Pool: Pooling with Quasi-Random Spatial Sampling via Low-Discrepancy Sequences and Hilbert Ordering
by Yuening Ma, Liang Guo and Min Li
Mathematics 2025, 13(18), 3016; https://doi.org/10.3390/math13183016 - 18 Sep 2025
Viewed by 915
Abstract
Feature map pooling in convolutional neural networks (CNNs) serves the dual purpose of reducing spatial dimensions and enhancing feature invariance. Current pooling approaches face a fundamental trade-off: deterministic methods (e.g., MaxPool and AvgPool) lack regularization benefits, while stochastic approaches introduce beneficial randomness but can suffer from sampling biases and may require careful hyperparameter tuning (e.g., S3Pool). To address these limitations, this paper introduces LDS3Pool, a novel pooling method that leverages low-discrepancy sequences (LDSs) for quasi-random spatial sampling. LDS3Pool first linearizes 2D feature maps to 1D sequences using Hilbert space-filling curves to preserve spatial locality, then applies LDS-based sampling to achieve quasi-random downsampling with mathematical guarantees of uniform coverage. This framework provides the regularization benefits of randomness while maintaining comprehensive feature representation, without requiring sensitive hyperparameter tuning. Extensive experiments demonstrate that LDS3Pool consistently outperforms baseline methods across multiple datasets and a range of architectures, from classic models like VGG11 to modern networks like ResNet18, achieving significant accuracy gains with moderate computational overhead. The method’s empirical success is supported by a rigorous theoretical analysis, including a quantitative evaluation of the Hilbert curve’s superior, size-independent locality preservation. In summary, LDS3Pool represents a theoretically sound and empirically effective pooling method that enhances CNN generalization through a principled, quasi-random sampling framework. Full article
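The sampling stage can be sketched independently of any network code: a base-2 van der Corput sequence (the simplest low-discrepancy sequence) picks quasi-random positions from the Hilbert-linearized sequence. A generic illustration, not the LDS3Pool source:

```python
def van_der_corput(i, base=2):
    """i-th element of the base-b van der Corput low-discrepancy sequence, in [0, 1)."""
    v, denom = 0.0, 1.0
    while i > 0:
        i, r = divmod(i, base)
        denom *= base
        v += r / denom
    return v

def lds_downsample(seq, k):
    """Keep up to k quasi-randomly chosen positions of a 1-D (Hilbert-linearized)
    sequence; LDS spacing gives near-uniform coverage without a fixed stride."""
    n = len(seq)
    idx = sorted({int(van_der_corput(i) * n) for i in range(1, k + 1)})
    return [seq[j] for j in idx]
```

Because the Hilbert ordering keeps spatial neighbours adjacent, uniform coverage of the 1-D index translates into uniform spatial coverage of the 2-D feature map.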

35 pages, 24325 KB  
Article
Enhancing Digital Twin Fidelity Through Low-Discrepancy Sequence and Hilbert Curve-Driven Point Cloud Down-Sampling
by Yuening Ma, Liang Guo and Min Li
Sensors 2025, 25(12), 3656; https://doi.org/10.3390/s25123656 - 11 Jun 2025
Cited by 1 | Viewed by 1854
Abstract
This paper addresses the critical challenge of point cloud down-sampling for digital twin creation, where reducing data volume while preserving geometric fidelity remains an ongoing research problem. We propose a novel down-sampling approach that combines Low-Discrepancy Sequences (LDS) with Hilbert curve ordering to create a method that preserves both global distribution characteristics and local geometric features. Unlike traditional methods that impose uniform density or rely on computationally intensive feature detection, our LDS-Hilbert approach leverages the complementary mathematical properties of Low-Discrepancy Sequences and space-filling curves to achieve balanced sampling that respects the original density distribution while ensuring comprehensive coverage. Through four comprehensive experiments covering parametric surface fitting, mesh reconstruction from basic closed geometries, complex CAD models, and real-world laser scans, we demonstrate that LDS-Hilbert consistently outperforms established methods, including Simple Random Sampling (SRS), Farthest Point Sampling (FPS), and Voxel Grid Filtering (Voxel). Results show parameter recovery improvements often exceeding 50% for parametric models compared to the FPS and Voxel methods, nearly 50% better shape preservation than FPS as measured by the Point-to-Mesh Distance, and up to 160% better than SRS as measured by the Viewpoint Feature Histogram Distance on complex real-world scans. The method achieves these improvements without requiring feature-specific calculations, extensive pre-processing, or task-specific training data, making it a practical advance for enhancing digital twin fidelity across diverse application domains. Full article
(This article belongs to the Section Sensing and Imaging)

22 pages, 2200 KB  
Article
Intra- and Interpatient ECG Heartbeat Classification Based on Multimodal Convolutional Neural Networks with an Adaptive Attention Mechanism
by Ítalo Flexa Di Paolo and Adriana Rosa Garcez Castro
Appl. Sci. 2024, 14(20), 9307; https://doi.org/10.3390/app14209307 - 12 Oct 2024
Cited by 13 | Viewed by 5636
Abstract
Electrocardiography (ECG) is a noninvasive technology that is widely used for recording heartbeats and diagnosing cardiac arrhythmias. However, interpreting ECG signals is challenging and may require substantial time from medical specialists. The evolution of technology and artificial intelligence has led to advances in the study and development of automatic arrhythmia classification systems to aid in medical diagnoses. Within this context, this paper introduces a framework for classifying cardiac arrhythmias on the basis of a multimodal convolutional neural network (CNN) with an adaptive attention mechanism. ECG signal segments are transformed into images via the Hilbert space-filling curve (HSFC) and recurrence plot (RP) techniques. The framework is developed and evaluated using the MIT-BIH public database in alignment with AAMI guidelines (ANSI/AAMI EC57). The evaluations accounted for interpatient and intrapatient paradigms, considering variations in the input structure related to the number of ECG leads (lead MLII and V1 + MLII). The results indicate that the framework is competitive with those in state-of-the-art studies, particularly for two ECG leads. The accuracy, precision, sensitivity, specificity and F1 score are 98.48%, 94.15%, 80.23%, 96.34% and 81.91%, respectively, for the interpatient paradigm and 99.70%, 98.01%, 97.26%, 99.28% and 97.64%, respectively, for the intrapatient paradigm. Full article
(This article belongs to the Section Computing and Artificial Intelligence)
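Setting the recurrence-plot branch aside, the HSFC transform can be sketched with the inverse Hilbert mapping `d2xy`: walking the curve reads an image into a 1-D sequence in which neighbouring pixels stay close together. Function names here are illustrative, not the authors' code:

```python
def d2xy(n, d):
    """Grid cell (x, y) at distance d along the n x n Hilbert curve (n a power of two)."""
    x = y = 0
    t = d
    s = 1
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def image_to_signal(img):
    """Flatten an n x n image into a 1-D signal by walking the Hilbert curve,
    so spatially adjacent pixels remain adjacent in the resulting sequence."""
    n = len(img)
    return [img[y][x] for x, y in (d2xy(n, d) for d in range(n * n))]
```

The same traversal run in reverse maps a 1-D signal segment back onto a 2-D image, which is how the framework turns ECG-style sequences into HSFC images.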

18 pages, 5984 KB  
Article
Magnetic Performance of Eddy Current Suppressing Structures in Additive Manufacturing
by Carsten Klein, Christopher May and Matthias Nienhaus
Actuators 2024, 13(3), 94; https://doi.org/10.3390/act13030094 - 28 Feb 2024
Cited by 6 | Viewed by 3916
Abstract
Additively manufactured soft-magnetic components are inherently bulky, leading to significant eddy current losses when applied in electrical machines. Prior works have addressed this issue by implementing structures based on the Hilbert space-filling curve which include eddy-current-suppressing gaps, thereby reducing the fill factor of the soft-magnetic component. The present research investigates a number of space-filling curves, in addition to sheets, in order to find the optimal eddy-current-suppressing structure from an electromagnetic point of view. By means of both analysis and finite-element simulation, it was shown that sheets are superior at minimizing eddy current losses, while space-filling curves excel at maximizing the fill factor. Full article
(This article belongs to the Special Issue Electromagnetic Actuators)

13 pages, 27073 KB  
Article
A New Transformation Technique for Reducing Information Entropy: A Case Study on Greyscale Raster Images
by Borut Žalik, Damjan Strnad, David Podgorelec, Ivana Kolingerová, Luka Lukač, Niko Lukač, Simon Kolmanič, Krista Rizman Žalik and Štefan Kohek
Entropy 2023, 25(12), 1591; https://doi.org/10.3390/e25121591 - 27 Nov 2023
Cited by 4 | Viewed by 2111
Abstract
This paper proposes a new string transformation technique called Move with Interleaving (MwI). Four possible ways of rearranging 2D raster images into 1D sequences of values are applied, including scan-line, left-right, strip-based, and Hilbert arrangements. Experiments on 32 benchmark greyscale raster images of various resolutions demonstrated that the proposed transformation reduces information entropy to a similar extent as the combination of the Burrows–Wheeler transform followed by the Move-To-Front or the Inversion Frequencies. The proposed transformation MwI yields the best result among all the considered transformations when the Hilbert arrangement is applied. Full article
(This article belongs to the Special Issue Information Theory and Coding for Image/Video Processing)

19 pages, 8655 KB  
Article
An Efficient Parallel Algorithm for Polygons Overlay Analysis
by Yuke Zhou, Shaohua Wang and Yong Guan
Appl. Sci. 2019, 9(22), 4857; https://doi.org/10.3390/app9224857 - 13 Nov 2019
Cited by 11 | Viewed by 4344
Abstract
Map overlay analysis is essential for geospatial analytics, and large-scale spatial data processing poses challenges for geospatial map overlay analytics. In this study, we propose an efficient parallel algorithm for polygon overlay analysis, including active-slave spatial index decomposition for intersection, multi-strategy Hilbert-ordering decomposition, and a parallel spatial union algorithm. A multi-strategy spatial data decomposition mechanism is implemented, comprising a parallel spatial data index, Hilbert space-filling-curve sorting, and decomposition. The results of the experiments show that the parallel algorithm for polygon overlay analysis achieves high efficiency. Full article
(This article belongs to the Special Issue Recent Advances in Geographic Information System for Earth Sciences)

26 pages, 2546 KB  
Article
Distributed Learning Fractal Algorithm for Optimizing a Centralized Control Topology of Wireless Sensor Network Based on the Hilbert Curve L-System
by Jaime Moreno, Oswaldo Morales, Ricardo Tejeida, Juan Posadas, Hugo Quintana and Grigori Sidorov
Sensors 2019, 19(6), 1442; https://doi.org/10.3390/s19061442 - 23 Mar 2019
Cited by 11 | Viewed by 5361
Abstract
Wireless sensor networks (WSNs) consist of a large number of small devices or nodes, called microcontroller units (MCUs), located in homes and/or offices and operated through the internet from anywhere, making these devices smarter and more efficient. Quality-of-service routing is one of the critical challenges in WSNs, especially in surveillance systems. To improve the efficiency of the network, in this article we propose a distributed learning fractal algorithm (DLFA) to design the control topology of a wireless sensor network (WSN), whose nodes are the MCUs distributed in a physical space and connected to share sensor parameters such as CO2 concentration, humidity, and temperature within the space, or to adjust the intensity of light inside and outside the home or office. We start by defining the production rules of the L-system that generates the Hilbert fractal, since these rules facilitate the generation of this space-filling curve. We then model the optimization of a centralized control topology of WSNs and propose a DLFA to find the two nodes between which a device can establish a highly reliable link. We thus propose a software-defined network (SDN) with strong mobility, since it can be reconfigured depending on the number of nodes, and we employ target coverage because the DLFA only considers reliable links among devices. Finally, through laboratory tests and computer simulations, we demonstrate the effectiveness of our approach by means of fractal routing in WSNs, using a large number of WSN devices (from 16 to 64 sensors) for real-time monitoring of different parameters, in order to make WSNs efficient for application in a forthcoming Smart City. Full article
(This article belongs to the Special Issue Artificial Intelligence and Sensors)
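The L-system production rules referred to in the abstract are the standard Hilbert-curve grammar (A -> +BF-AFA-FB+, B -> -AF+BFB+FA-); a minimal rewriting sketch, not the paper's implementation:

```python
# Standard Hilbert-curve L-system: F = draw forward, +/- = turn 90 degrees,
# A and B are non-drawing placeholder symbols.
RULES = {"A": "+BF-AFA-FB+", "B": "-AF+BFB+FA-"}

def hilbert_lsystem(order):
    """Expand the axiom 'A' `order` times, then keep only drawable symbols."""
    s = "A"
    for _ in range(order):
        s = "".join(RULES.get(c, c) for c in s)
    return "".join(c for c in s if c in "F+-")
```

An order-k string contains 4^k - 1 'F' moves, one per edge of the curve that visits all 4^k cells, which is what lets the topology scale from 16 to 64 sensors by raising the order.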

13 pages, 2174 KB  
Article
A Method of HBase Multi-Conditional Query for Ubiquitous Sensing Applications
by Bo Shen, Yi-Chen Liao, Dan Liu and Han-Chieh Chao
Sensors 2018, 18(9), 3064; https://doi.org/10.3390/s18093064 - 12 Sep 2018
Cited by 4 | Viewed by 4081
Abstract
Big data gathered from real systems, such as public infrastructure, healthcare, smart homes, and industries, by sensor networks contain enormous value and need to be mined deeply, which depends on a data storing and retrieving service. HBase is playing an increasingly important part in the big data environment, since it provides a flexible pattern for storing extremely large amounts of unstructured data. Despite fast reads by RowKey, HBase does not natively support multi-conditional query, which is a common demand and operation in relational databases, especially for data analysis in ubiquitous sensing applications. In this paper, we introduce a method to construct a linear index by employing a Hilbert space-filling curve. As a RowKey-generating schema, the proposed method maps multiple index columns into a one-dimensional encoded sequence and then constructs a new RowKey. We also provide an R-tree-based optimization to reduce the computational cost of encoding query conditions. Without using a secondary index, experimental results indicate that the proposed method has better performance in multi-conditional queries. Full article
(This article belongs to the Special Issue Internet of Things and Ubiquitous Sensing)

19 pages, 5569 KB  
Article
A Parallel N-Dimensional Space-Filling Curve Library and Its Application in Massive Point Cloud Management
by Xuefeng Guan, Peter Van Oosterom and Bo Cheng
ISPRS Int. J. Geo-Inf. 2018, 7(8), 327; https://doi.org/10.3390/ijgi7080327 - 15 Aug 2018
Cited by 23 | Viewed by 7793
Abstract
Because of their locality-preservation properties, Space-Filling Curves (SFCs) have been widely used in massive point dataset management. However, the completeness, universality, and scalability of current SFC implementations are still not well resolved. To address this problem, a generic n-dimensional (nD) SFC library is proposed and validated in massive multiscale nD point management. The library supports two well-known types of SFCs (Morton and Hilbert) with an object-oriented design, and provides common interfaces for encoding, decoding, and nD box query. Parallel implementation permits effective exploitation of underlying multicore resources. During massive point cloud management, each xyz point is assigned an additional random level-of-detail (LOD) value l. A unique 4D SFC key is generated from each xyzl tuple with this library, and then only the keys are stored as flat records in an Oracle Index-Organized Table (IOT). The key-only schema benefits both data compression and multiscale clustering. Experiments show that the proposed nD SFC library provides complete functions and robust scalability for massive point management. When loading 23 billion Light Detection and Ranging (LiDAR) points into an Oracle database, the parallel mode takes about 10 h, an estimated four times faster than sequential loading. Furthermore, 4D queries using the Hilbert keys take about 1–5 s and scale well with the dataset size. Full article
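Of the two supported curve types, the Morton key is the simpler to sketch: the bits of the four coordinates (x, y, z, and the LOD value l) are interleaved into a single integer. A generic illustration of 4D bit interleaving, not the library's API:

```python
def morton4(x, y, z, l, bits=16):
    """Interleave `bits` low bits of four non-negative integer coordinates
    into a single 4D Morton key."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (4 * i)
        key |= ((y >> i) & 1) << (4 * i + 1)
        key |= ((z >> i) & 1) << (4 * i + 2)
        key |= ((l >> i) & 1) << (4 * i + 3)
    return key
```

Sorting records by such a key clusters points that are close in all four dimensions, which is what makes the key-only IOT storage effective; the Hilbert variant improves locality further at the cost of a more involved encoding.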

20 pages, 7870 KB  
Article
Approach to Accelerating Dissolved Vector Buffer Generation in Distributed In-Memory Cluster Architecture
by Jinxin Shen, Luo Chen, Ye Wu and Ning Jing
ISPRS Int. J. Geo-Inf. 2018, 7(1), 26; https://doi.org/10.3390/ijgi7010026 - 15 Jan 2018
Cited by 12 | Viewed by 6809
Abstract
The buffer generation algorithm is a fundamental function in GIS, identifying the area within a given distance of geographic features. Past research largely focused on buffer generation algorithms in a stand-alone environment. Moreover, dissolved buffer generation is data- and computing-intensive, so the improvement achievable in a stand-alone environment is limited for large-scale vector data. Recent parallel dissolved vector buffer algorithms, however, suffer from scalability problems, leaving room for further optimization. At present, the prevailing in-memory cluster-computing framework, Spark, provides promising efficiency for computing-intensive analysis, yet it has seldom been researched for buffer analysis. On this basis, we propose a cluster-computing-oriented parallel dissolved vector buffer generation algorithm, called the HPBM, that contains a Hilbert-space-filling-curve-based data partition method, a strategy for processing data skew and cross-boundary objects, and a depth-given tree-like merging method. Experiments are conducted in both stand-alone and cluster environments using real-world vector data that include points and roads. Compared with some existing parallel buffer algorithms, as well as various popular GIS software, the HPBM achieves a performance gain of more than 50%. Full article
(This article belongs to the Special Issue Geospatial Big Data and Urban Studies)
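The Hilbert-curve-based partition step can be sketched as sorting features by a precomputed curve key and cutting the order into contiguous chunks, so each worker receives a spatially coherent subset. `partition_by_key` is a hypothetical helper, not the HPBM code:

```python
def partition_by_key(items, keys, n_parts):
    """Sort items by their space-filling-curve key and split the order into
    n_parts contiguous chunks; neighbouring features land on the same worker,
    which reduces cross-partition dissolve work later."""
    order = sorted(range(len(items)), key=lambda i: keys[i])
    chunk = -(-len(items) // n_parts)  # ceiling division
    return [[items[i] for i in order[p * chunk:(p + 1) * chunk]]
            for p in range(n_parts)]
```

In the full algorithm this naive equal-size cut is refined by the data-skew strategy, since real key distributions are rarely balanced.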
