Algorithms, Volume 2, Issue 4 (December 2009) – 12 articles, Pages 1263–1525

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click its "PDF Full-text" link and open the file with the free Adobe Reader.

Research

Article
Compound Biorthogonal Wavelets on Quadrilaterals and Polar Structures
by Chong Zhao, Hanqiu Sun, Huawei Wang and Kaihuai Qin
Algorithms 2009, 2(4), 1263-1280; https://doi.org/10.3390/a2041263 - 28 Sep 2009
Abstract
In geometric models with high-valence vertices, current subdivision wavelets may not handle the special cases well enough to give multiresolution surfaces a good visual appearance. In this paper, we present novel biorthogonal polar subdivision wavelets, which can efficiently perform wavelet analysis on control nets with polar structures. Polar subdivision generates more natural subdivision surfaces around high-valence vertices and avoids the ripples and saddle points that Catmull-Clark subdivision may produce. Based on polar subdivision, our wavelet scheme supports special operations on the polar structures and is especially suitable for models where many facets join. For seamless fusion with the Catmull-Clark subdivision wavelet, we construct the wavelets in the circular and radial layers of the polar structures, so the subdivision wavelets can be combined smoothly for composite models formed from quadrilaterals and polar structures. The computations of wavelet analysis and synthesis are highly efficient and fully in-place. The experimental results confirm the stability of our proposed approach.
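
The fully in-place analysis and synthesis mentioned in the abstract is characteristic of lifting-scheme constructions of biorthogonal wavelets. The sketch below is a generic 1D lifting pair (the LeGall 5/3 filter bank with periodic boundaries), offered only as an illustration of in-place biorthogonal analysis; it is not the authors' polar-structure scheme.

```python
import numpy as np

def lifting_analysis(signal):
    # One level of biorthogonal wavelet analysis via lifting
    # (LeGall 5/3 filter bank, periodic boundaries, even-length input).
    # The predict and update steps overwrite the data in place, which is
    # the property highlighted in the abstract.
    s = np.asarray(signal, dtype=float).copy()
    even, odd = s[0::2], s[1::2]              # views into s: in-place updates
    odd -= 0.5 * (even + np.roll(even, -1))   # predict: wavelet details
    even += 0.25 * (odd + np.roll(odd, 1))    # update: coarse approximation
    return even.copy(), odd.copy()

def lifting_synthesis(even, odd):
    # Exact inverse: undo the lifting steps in reverse order.
    even = even - 0.25 * (odd + np.roll(odd, 1))
    odd = odd + 0.5 * (even + np.roll(even, -1))
    out = np.empty(even.size + odd.size)
    out[0::2], out[1::2] = even, odd
    return out
```

Round-tripping a signal through lifting_analysis and then lifting_synthesis reproduces it exactly, which is the perfect-reconstruction property such schemes rely on.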

Article
An Adaptive h-Refinement Algorithm for Local Damage Models
by Jonathan S. Pitt and Francesco Costanzo
Algorithms 2009, 2(4), 1281-1300; https://doi.org/10.3390/a2041281 - 06 Oct 2009
Abstract
An adaptive mesh refinement strategy is proposed for the local damage models that often arise from internal-state-variable-based continuum damage models. The proposed algorithm employs both the finite element method and the finite difference method to integrate the equations of motion of a linear elastic material with simple isotropic microcracking. The challenges of this problem include the time integration of coupled partial differential equations with time-dependent coefficients and the proper choice of solution spaces to yield a stable finite element formulation. Discontinuous elements are used to represent the damage field, as this reduction in regularity is believed to be more consistent with the physical nature of evolving microcracking. The adaptive mesh refinement algorithm relies on custom refinement indicators, two of which are presented and compared: one based on the time rate of change of the damage field, and one based on the energy release rate, i.e., the energy per unit volume available for damage to evolve. We evaluate the proposed algorithm and refinement indicators by comparing the predicted damage morphology on different meshes, thereby judging the capability of the proposed technique to address, but not eliminate, the mesh dependency present in solutions of the damage field.
(This article belongs to the Special Issue Numerical Simulation of Discontinuities in Mechanics)
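
As an illustration of how a rate-of-change refinement indicator can drive h-refinement, here is a minimal sketch; the quantile-based threshold and the flat array of per-element damage values are assumptions of this example, not the authors' formulation.

```python
import numpy as np

def flag_elements_for_refinement(damage_prev, damage_curr, dt, frac=0.3):
    # Flag the elements whose damage-rate indicator lies in the top `frac`
    # fraction of the mesh; flagged elements would then be h-refined.
    rate = np.abs(damage_curr - damage_prev) / dt   # per-element damage rate
    threshold = np.quantile(rate, 1.0 - frac)
    return rate >= threshold                        # True => refine element
```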

Article
Incentive Compatible and Globally Efficient Position Based Routing for Selfish Reverse Multicast in Wireless Sensor Networks
by Stephan Eidenbenz, Gunes Ercal-Ozkaya, Adam Meyerson, Allon Percus and Sarvesh Varatharajan
Algorithms 2009, 2(4), 1303-1326; https://doi.org/10.3390/a2041303 - 14 Oct 2009
Abstract
We consider the problem of all-to-one selfish routing in the absence of a payment scheme in wireless sensor networks, where a natural model for cost is the power required to forward; we refer to the resulting game as Locally Minimum Cost Forwarding (LMCF). Our objective is to characterize equilibria and their global costs in terms of stretch and diameter, and in particular to find incentive-compatible algorithms that are also close to globally optimal. We find that, although the social costs of LMCF equilibria have arbitrarily bad worst-case bounds and reaching optimal equilibria is computationally infeasible, there exist greedy and local incentive-compatible heuristics that achieve near-optimal global costs.
(This article belongs to the Special Issue Sensor Algorithms)
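
A minimal sketch of the kind of greedy, position-based forwarding rule studied in this setting, assuming a power cost of distance**alpha and a dictionary of node coordinates (both hypothetical choices for the example, not the paper's exact construction):

```python
import math

def greedy_next_hops(nodes, sink, alpha=2.0):
    # nodes: {node_id: (x, y)}; sink: (x, y); assumes 'sink' is not a node id.
    # Each node selfishly picks, among nodes strictly closer to the sink,
    # the neighbor minimizing its own transmission power ~ distance**alpha.
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    targets = dict(nodes)
    targets['sink'] = sink            # the sink itself is always a candidate
    next_hop = {}
    for u, pu in nodes.items():
        cands = [(dist(pu, pv) ** alpha, v) for v, pv in targets.items()
                 if v != u and dist(pv, sink) < dist(pu, sink)]
        if cands:
            next_hop[u] = min(cands, key=lambda t: t[0])[1]
    return next_hop
```

Each node considers only its own transmission power, which is what makes a rule of this shape incentive compatible in spirit.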

Article
Delaunay Meshing of Piecewise Smooth Complexes without Expensive Predicates
by Tamal K. Dey and Joshua A. Levine
Algorithms 2009, 2(4), 1327-1349; https://doi.org/10.3390/a2041327 - 11 Nov 2009
Abstract
Recently, a Delaunay refinement algorithm has been proposed that can mesh piecewise smooth complexes, which include polyhedra, smooth and piecewise smooth surfaces, and non-manifolds. However, this algorithm employs domain-dependent numerical predicates, some of which can be computationally expensive and hard to implement. In this paper we develop a refinement strategy that eliminates these complicated domain-dependent predicates. As a result we obtain a meshing algorithm that is practical and implementation-friendly.
(This article belongs to the Special Issue Computational Geometry)
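
To make the refinement idea concrete, here is a toy 2D Delaunay refinement loop, assuming scipy.spatial.Delaunay is available; it inserts circumcenters of triangles with a poor circumradius-to-shortest-edge ratio, and it ignores the boundary and non-manifold handling that the paper's piecewise-smooth-complex setting requires.

```python
import numpy as np
from scipy.spatial import Delaunay

def refine(points, ratio_bound=1.4, max_iters=20):
    # Ruppert-style toy loop: while some triangle has circumradius /
    # shortest-edge ratio above `ratio_bound`, insert its circumcenter
    # and recompute the Delaunay triangulation.
    pts = np.asarray(points, dtype=float)
    for _ in range(max_iters):
        tri = Delaunay(pts)
        bad = []
        for simplex in tri.simplices:
            a, b, c = pts[simplex]
            # Circumcenter from the perpendicular-bisector linear system:
            # 2(b-a).x = b.b - a.a  and  2(c-a).x = c.c - a.a
            A = 2 * np.array([b - a, c - a])
            rhs = np.array([b @ b - a @ a, c @ c - a @ a])
            cc = np.linalg.solve(A, rhs)
            R = np.linalg.norm(cc - a)
            shortest = min(np.linalg.norm(b - a), np.linalg.norm(c - b),
                           np.linalg.norm(a - c))
            if R / shortest > ratio_bound:
                bad.append(cc)
        if not bad:
            break
        pts = np.vstack([pts, bad[:1]])   # insert one circumcenter per pass
    return pts
```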

Article
CADrx for GBM Brain Tumors: Predicting Treatment Response from Changes in Diffusion-Weighted MRI
by Jing Huo, Kazunori Okada, Hyun J. Kim, Whitney B. Pope, Jonathan G. Goldin, Jeffrey R. Alger and Matthew S. Brown
Algorithms 2009, 2(4), 1350-1367; https://doi.org/10.3390/a2041350 - 16 Nov 2009
Abstract
The goal of this study was to develop a computer-aided therapeutic response (CADrx) system for early prediction of drug treatment response in glioblastoma multiforme (GBM) brain tumors using diffusion-weighted (DW) MR images. In conventional Macdonald assessment, tumor response is assessed nine weeks or more post-treatment; here, we investigate the ability of DW-MRI to assess response earlier, at five weeks post-treatment. The apparent diffusion coefficient (ADC) map, calculated from DW images, has been shown to reveal changes in the tumor microenvironment that precede morphologic tumor changes. ADC values in treated brain tumors could theoretically both increase due to cell kill (and thus reduced cell density) and decrease due to inhibition of edema. In this study, we investigated the effectiveness of features that quantify changes between pre- and post-treatment tumor ADC histograms for detecting treatment response. The study has three parts: first, tumor regions were segmented on T1w contrast-enhanced images by Otsu’s thresholding method and mapped from T1w images onto ADC images by a 3D region-of-interest (ROI) mapping tool using DICOM header information; second, ADC histograms of the tumor region were extracted from both the pre-treatment and the five-week post-treatment scans and fitted by a two-component Gaussian mixture model (GMM), from which the GMM features as well as standard histogram-based features were extracted; finally, supervised machine learning techniques were applied to classify responders and non-responders. The approach was evaluated on a dataset of 85 GBM patients under chemotherapy, of whom 39 responded and 46 did not, based on tumor volume reduction. We compared the AdaBoost, random forest, and support vector machine classification algorithms using ten-fold cross-validation; the best accuracy was 69.41%, with a corresponding area under the curve (Az) of 0.70.
(This article belongs to the Special Issue Machine Learning for Medical Imaging)
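
A hedged sketch of the histogram-modeling step described in the abstract: fit a two-component GMM to tumor ADC values with scikit-learn and assemble a feature vector, with the difference of pre- and post-treatment descriptors feeding an off-the-shelf classifier. The feature layout and classifier settings are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.ensemble import RandomForestClassifier

def gmm_features(adc_values):
    # Fit a two-component GMM to the tumor's ADC values and return a
    # feature vector of component means, weights, and standard deviations,
    # sorted by mean so features are comparable across patients.
    gmm = GaussianMixture(n_components=2, random_state=0)
    gmm.fit(np.asarray(adc_values, dtype=float).reshape(-1, 1))
    order = np.argsort(gmm.means_.ravel())
    return np.concatenate([gmm.means_.ravel()[order],
                           gmm.weights_[order],
                           np.sqrt(gmm.covariances_.ravel()[order])])

# Hypothetical downstream use: per-patient change features (post minus pre)
# and responder labels y, classified with one of the compared model families.
# X = np.vstack([gmm_features(post) - gmm_features(pre) for pre, post in scans])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
```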

Article
Methodology, Algorithms, and Emerging Tool for Automated Design of Intelligent Integrated Multi-Sensor Systems
by Kuncup Iswandy and Andreas König
Algorithms 2009, 2(4), 1368-1409; https://doi.org/10.3390/a2041368 - 18 Nov 2009
Abstract
The emergence of novel sensing elements, computing nodes, wireless communication, and integration technology provides unprecedented possibilities for the design and application of intelligent systems. Each new application system must be designed from scratch, employing sophisticated methods ranging from conventional signal processing to computational intelligence, and currently a significant part of this overall algorithmic chain still has to be assembled manually by experienced designers in a time- and labor-consuming process. In this work, this challenge is taken up, and a methodology and algorithms for the automated design of intelligent, integrated, and resource-aware multi-sensor systems employing multi-objective evolutionary computation are introduced. The proposed methodology tackles the challenge of rapid prototyping of such systems under realization constraints and additionally includes features of system-instance-specific self-correction for sustained operation in large volumes and in dynamically changing environments. Extending these concepts to reconfigurable hardware platforms yields so-called self-x sensor systems, where self-x stands for, e.g., self-monitoring, -calibrating, -trimming, and -repairing/-healing. Selected experimental results prove the applicability and effectiveness of our proposed methodology and emerging tool: with our approach, competitive results were achieved with regard to classification accuracy, flexibility, and design speed under additional design constraints.
(This article belongs to the Special Issue Sensor Algorithms)
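
The multi-objective selection at the heart of such automated design can be illustrated with a simple Pareto-front filter over candidate system configurations; the random accuracy/cost values below stand in for a hypothetical evaluation step and are not the authors' tool.

```python
import random

def dominates(a, b):
    # a dominates b: no worse in both objectives (maximize accuracy,
    # minimize resource cost) and strictly better in at least one.
    return (a['acc'] >= b['acc'] and a['cost'] <= b['cost']
            and (a['acc'] > b['acc'] or a['cost'] < b['cost']))

def pareto_front(population):
    # Keep only the non-dominated designs: the accuracy/cost trade-off
    # frontier a multi-objective evolutionary run converges toward.
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]

# Random stand-ins for evaluated sensor-system configurations; a real tool
# would obtain 'acc' and 'cost' by evaluating each evolved design.
population = [{'acc': random.random(), 'cost': random.random()}
              for _ in range(50)]
print(pareto_front(population))
```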

Article
A Framework for Bioacoustic Vocalization Analysis Using Hidden Markov Models
by Yao Ren, Michael T. Johnson, Patrick J. Clemins, Michael Darre, Sharon Stuart Glaeser, Tomasz S. Osiejuk and Ebenezer Out-Nyarko
Algorithms 2009, 2(4), 1410-1428; https://doi.org/10.3390/a2041410 - 18 Nov 2009
Abstract
Using Hidden Markov Models (HMMs) as a recognition framework for the automatic classification of animal vocalizations has a number of benefits, including the ability to handle duration variability through nonlinear time alignment, the ability to incorporate complex language or recognition constraints, and easy extension to continuous recognition and detection domains. In this work, we apply HMMs to several different species and bioacoustic tasks, using generalized spectral features that can be easily adjusted across species and HMM network topologies suited to each task. This experimental work includes a simple call-type classification task using one HMM per vocalization for repertoire analysis of Asian elephants, a language-constrained song recognition task using syllable models as base units for ortolan bunting vocalizations, and a stress-stimulus differentiation task on poultry vocalizations using a non-sequential model, namely a one-state HMM with Gaussian mixtures. Results show strong performance across all tasks and illustrate the flexibility of the HMM framework for a variety of species, vocalization types, and analysis tasks.
(This article belongs to the Special Issue Algorithms for Sound Localization and Sound Classification)
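
A minimal sketch of the one-HMM-per-call-type setup, assuming the hmmlearn package and precomputed spectral feature frames (both assumptions of this example, not details taken from the paper):

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM   # assumes the hmmlearn package

def train_call_models(training_data, n_states=5):
    # Train one Gaussian HMM per call type. training_data maps
    # call_type -> list of (n_frames, n_features) feature sequences.
    models = {}
    for call_type, sequences in training_data.items():
        X = np.vstack(sequences)                # stack all feature frames
        lengths = [len(s) for s in sequences]   # per-sequence frame counts
        models[call_type] = GaussianHMM(n_components=n_states).fit(X, lengths)
    return models

def classify(models, features):
    # Assign the call type whose HMM gives the highest log-likelihood.
    return max(models, key=lambda c: models[c].score(features))
```

Setting n_states=1 with Gaussian mixture emissions recovers the non-sequential model the abstract mentions for the poultry task.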

Article
Linear-Time Text Compression by Longest-First Substitution
by Ryosuke Nakamura, Shunsuke Inenaga, Hideo Bannai, Takashi Funamoto, Masayuki Takeda and Ayumi Shinohara
Algorithms 2009, 2(4), 1429-1448; https://doi.org/10.3390/a2041429 - 25 Nov 2009
Abstract
We consider grammar-based text compression with longest-first substitution (LFS), where non-overlapping occurrences of a longest repeating factor of the input text are replaced by a new non-terminal symbol. We present the first linear-time algorithm for LFS. Our algorithm employs a new data structure called the sparse lazy suffix tree. We also deal with a more sophisticated version of LFS, called LFS2, which allows better compression. The first linear-time algorithm for LFS2 is also presented.
(This article belongs to the Special Issue Data Compression)
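
The LFS rule itself is easy to state in code. Below is a naive quadratic-time sketch, assuming a lowercase input alphabet so that uppercase letters can serve as fresh non-terminals; the paper's contribution, the linear-time algorithm via sparse lazy suffix trees, is far more involved.

```python
import string

def lfs_compress(text, max_rules=len(string.ascii_uppercase)):
    # Repeatedly replace non-overlapping occurrences of a longest repeating
    # factor with a fresh non-terminal (uppercase letter; the input is
    # assumed lowercase). Quadratic time: for exposition only.
    rules = {}
    fresh = iter(string.ascii_uppercase)
    for _ in range(max_rules):
        best = ''
        for length in range(len(text) // 2, 1, -1):      # longest first
            for i in range(len(text) - length + 1):
                factor = text[i:i + length]
                if text.find(factor, i + length) != -1:  # disjoint repeat
                    best = factor
                    break
            if best:
                break
        if not best:
            break
        nt = next(fresh)
        rules[nt] = best
        text = text.replace(best, nt)   # str.replace is non-overlapping
    return text, rules

# Example: lfs_compress("abcabcabc") -> ('AAA', {'A': 'abc'})
```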

Article
Exact and Heuristic Algorithms for Thrift Cyclic Scheduling
by Michael J. Short
Algorithms 2009, 2(4), 1449-1472; https://doi.org/10.3390/a2041449 - 26 Nov 2009
Abstract
Non-preemptive schedulers, despite their many documented drawbacks, remain a very popular choice for practitioners of real-time and embedded systems. The non-preemptive ‘thrift’ cyclic scheduler, variations of which can be found in other application areas, has recently received considerable attention for the implementation of such embedded systems. A thrift scheduler provides a flexible and compact implementation model for periodic task sets with comparatively small overheads; additionally, it can overcome several of the problems associated with traditional ‘cyclic executives’. However, severe computational difficulties can still arise when designing schedules for non-trivial task sets. This paper is concerned with an optimization version of the offset-assignment problem, in which the objective is to assign task offsets such that the required CPU clock speed is minimized whilst ensuring that task overruns do not occur; the decision version of this problem is known to be complete for Σ₂ᵖ. The paper first considers the problem of candidate solution verification, itself strongly coNP-complete, and proposes a fast, exact algorithm for it; for any fixed number of tasks, its execution time is shown to be polynomial. The paper then proposes two heuristic algorithms of pseudopolynomial complexity for solving the offset-assignment problem and considers how redundant choices of offset combinations can be eliminated to speed up the search. The performance of these algorithms is then experimentally evaluated, before conclusions are drawn.
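
To illustrate the flavor of an offset-assignment heuristic (not the paper's algorithms), here is a greedy sketch that places each periodic task at the offset minimizing the peak per-tick demand over the hyperperiod; the unit-demand load model is an assumption of this example.

```python
from functools import reduce
from math import gcd

def assign_offsets(tasks):
    # tasks: list of (period, wcet) in ticks. Greedily place each task at
    # the offset that minimizes peak per-tick demand over the hyperperiod,
    # taking shorter-period tasks first.
    hyper = reduce(lambda a, b: a * b // gcd(a, b), (p for p, _ in tasks), 1)
    load = [0] * hyper
    assignment = []
    for period, wcet in sorted(tasks):
        def peak_if(offset):
            trial = load[:]
            for release in range(offset, hyper, period):
                for t in range(wcet):                 # wcet ticks of work
                    trial[(release + t) % hyper] += 1
            return max(trial)
        best = min(range(period), key=peak_if)
        for release in range(best, hyper, period):
            for t in range(wcet):
                load[(release + t) % hyper] += 1
        assignment.append((period, wcet, best))
    return assignment, max(load)   # peak load tracks required CPU speed
```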

Article
Predicting Radiological Panel Opinions Using a Panel of Machine Learning Classifiers
by Dmitriy Zinovev, Daniela Raicu, Jacob Furst and Samuel G. Armato III
Algorithms 2009, 2(4), 1473-1502; https://doi.org/10.3390/a2041473 - 30 Nov 2009
Abstract
This paper uses an ensemble of classifiers and active learning strategies to predict radiologists’ assessments of the nodules in the Lung Image Database Consortium (LIDC). In particular, the paper presents machine learning classifiers that model agreement among ratings on seven semantic characteristics: spiculation, lobulation, texture, sphericity, margin, subtlety, and malignancy. The ensemble of classifiers (which can be considered a computer panel of experts) uses 64 image features of the nodules across four categories (shape, intensity, texture, and size) to predict the semantic characteristics. The active learning begins the training phase with nodules on which the radiologists’ semantic ratings agree, and incrementally learns how to classify nodules on which the radiologists do not agree. Using our proposed approach, the classification accuracy of the ensemble of classifiers is higher than the accuracy of a single classifier. In the long run, our proposed approach can be used to increase consistency among radiological interpretations by providing physicians a “second read”.
(This article belongs to the Special Issue Machine Learning for Medical Imaging)
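
A hedged sketch of the active-learning loop described above, using a bagged-tree ensemble from scikit-learn as the "computer panel": training starts from samples with agreed ratings and grows by least-confident sampling. The index bookkeeping and the uncertainty measure are illustrative choices, not the paper's exact method.

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

def active_learning_loop(X, y, agreed_idx, rounds=10, batch=5):
    # Start training on samples whose ratings agree (`agreed_idx`), then
    # repeatedly add the pool samples the ensemble is least certain about.
    labeled = list(agreed_idx)
    pool = [i for i in range(len(X)) if i not in labeled]
    ensemble = BaggingClassifier(DecisionTreeClassifier(), n_estimators=25)
    for _ in range(rounds):
        ensemble.fit(X[labeled], y[labeled])
        if not pool:
            break
        proba = ensemble.predict_proba(X[pool])
        uncertainty = 1.0 - proba.max(axis=1)    # least-confident sampling
        pick = np.argsort(uncertainty)[-batch:]  # most uncertain samples
        for j in sorted(pick, reverse=True):     # pop high indices first
            labeled.append(pool.pop(j))
    return ensemble
```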

Article
Image Similarity to Improve the Classification of Breast Cancer Images
by Dave Tahmoush
Algorithms 2009, 2(4), 1503-1525; https://doi.org/10.3390/a2041503 - 01 Dec 2009
Abstract
Techniques in image similarity can be used to improve the classification of breast cancer images. Breast cancer images in the mammogram modality contain an abundance of non-cancerous structures that are similar to cancer, which makes classifying images as containing cancer especially difficult. Only the cancerous part of the image is relevant, so the techniques must learn to recognize cancer in noisy mammograms and extract features from that cancer to classify images appropriately. There are also many types, or classes, of cancer with different characteristics, across all of which the system must work. Mammograms come in sets of four, two images of each breast, which enables comparison of the left and right breast images to help determine relevant features and remove irrelevant ones. In this work, image feature clustering is performed to reduce the noise and the feature space, and the results are used in a distance function with a learned threshold to produce a classification. The threshold parameter of the distance function is learned simultaneously with the underlying clustering and then integrated to produce an agglomeration that is relevant to the images. This technique can diagnose breast cancer more accurately than commercial systems and previously published approaches.
(This article belongs to the Special Issue Machine Learning for Medical Imaging)
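
The bilateral comparison of left and right breast images via clustered features and a learned distance threshold can be sketched as follows; k-means, the L1 histogram distance, and the parameter names are assumptions of this illustration, not the paper's exact pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans

def classify_by_similarity(left_feats, right_feats, threshold, k=8):
    # Cluster the pooled left/right features, compare the two images'
    # cluster-occupancy histograms, and flag the case as suspicious when
    # the left-right distance exceeds a learned threshold.
    km = KMeans(n_clusters=k, n_init=10, random_state=0)
    km.fit(np.vstack([left_feats, right_feats]))
    h_left = np.bincount(km.predict(left_feats), minlength=k)
    h_right = np.bincount(km.predict(right_feats), minlength=k)
    h_left = h_left / h_left.sum()
    h_right = h_right / h_right.sum()
    distance = np.abs(h_left - h_right).sum()    # L1 histogram distance
    return distance > threshold                  # True => flag as cancer
```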

Other

New Book Received
Encyclopedia of Algorithms. Edited by Ming-Yang Kao; Springer-Verlag GmbH, 2008; 1220 pages, 183 figures, 38 tables; hardcover. Price: €309 / CHF 479.50. ISBN 978-0-387-30770-1
by Shu-Kun Lin
Algorithms 2009, 2(4), 1301-1302; https://doi.org/10.3390/a2041301 - 12 Oct 2009
Abstract
The Encyclopedia of Algorithms provides a comprehensive set of solutions to important algorithmic problems for students and researchers, including high-impact solutions from the most recent decade. [...]