
Table of Contents

Algorithms, Volume 2, Issue 2 (June 2009), Pages 623-878

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.

Research

Jump to: Review

Open Access Article Neural Network Modeling to Predict Shelf Life of Greenhouse Lettuce
Algorithms 2009, 2(2), 623-637; doi:10.3390/a2020623
Received: 24 October 2008 / Revised: 7 March 2009 / Accepted: 25 March 2009 / Published: 3 April 2009
Cited by 2 | PDF Full-text (89 KB) | HTML Full-text | XML Full-text
Abstract
Greenhouse-grown butter lettuce (Lactuca sativa L.) can potentially be stored for 21 days at constant 0°C. When storage temperature was increased to 5°C or 10°C, shelf life was shortened to 14 or 10 days, respectively, in our previous observations. Also, commercial shelf life of 7 to 10 days is common, due to postharvest temperature fluctuations. The objective of this study was to establish neural network (NN) models to predict the remaining shelf life (RSL) under fluctuating postharvest temperatures. A box of 12–24 lettuce heads constituted a sample unit. The end of the shelf life of each head was determined when it showed initial signs of decay or yellowing. Air temperatures inside a shipping box were recorded. Daily average temperatures in storage and averaged shelf life of each box were used as inputs, and the RSL was modeled as an output. An R² of 0.57 could be observed when a simple NN structure was employed. Since the "future" (or remaining) storage temperatures were unavailable at the time of making a prediction, a second NN model was introduced to accommodate a range of future temperatures and associated shelf lives. Using such 2-stage NN models, an R² of 0.61 could be achieved for predicting RSL. This study indicated that NN modeling has potential for cold chain quality control and shelf life prediction. Full article
(This article belongs to the Special Issue Neural Networks and Sensors)
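As a rough illustration of the kind of model the abstract describes, the sketch below trains a small feedforward network on synthetic (temperature, shelf life) inputs to predict remaining shelf life. The data-generating rule, network size, and learning rate are all assumptions for the example, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the paper's data: inputs are (daily average storage
# temperature in deg C, mean shelf life of the box in days); the target
# "remaining shelf life" follows a made-up linear rule for illustration.
X = rng.uniform([0.0, 7.0], [10.0, 21.0], size=(200, 2))
y = (X[:, 1] - 0.8 * X[:, 0]).reshape(-1, 1)      # hypothetical RSL in days

Xn = (X - X.mean(0)) / X.std(0)                   # normalize inputs

# One hidden tanh layer, linear output, trained by batch gradient descent.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    h = np.tanh(Xn @ W1 + b1)                     # hidden activations
    err = h @ W2 + b2 - y                         # prediction error
    gW2 = h.T @ err / len(Xn); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)              # backprop through tanh
    gW1 = Xn.T @ dh / len(Xn); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

pred = np.tanh(Xn @ W1 + b1) @ W2 + b2
r2 = 1 - ((pred - y) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

The paper's two-stage scheme would add a second network over candidate future temperature profiles; only the single-stage fit is sketched here.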
Open Access Article Pattern Recognition and Pathway Analysis with Genetic Algorithms in Mass Spectrometry Based Metabolomics
Algorithms 2009, 2(2), 638-666; doi:10.3390/a2020638
Received: 20 October 2008 / Revised: 2 February 2009 / Accepted: 26 March 2009 / Published: 3 April 2009
Cited by 11 | PDF Full-text (2169 KB) | HTML Full-text | XML Full-text
Abstract
A robust and complete workflow for metabolic profiling and data mining was described in detail. Three independent and complementary analytical techniques for metabolic profiling were applied: hydrophilic interaction chromatography (HILIC–LC–ESI–MS), reversed-phase liquid chromatography (RP–LC–ESI–MS), and gas chromatography (GC–TOF–MS), all coupled to mass spectrometry (MS). Unsupervised methods, such as principal component analysis (PCA) and clustering, and supervised methods, such as classification and PCA-DA (discriminant analysis), were used for data mining. Genetic algorithms (GA), a multivariate approach, were used to select the smallest subsets of potentially discriminative predictors. From the thousands of peaks found in total, the small subsets selected by GA were considered highly promising predictors allowing discrimination among groups. It was found that the small groups of top predictors selected with PCA-DA and with GA are different and unique. Annotated GC–TOF–MS data yielded identified feature metabolites. Metabolites putatively detected with LC–ESI–MS profiling require further elemental composition assignment with accurate mass measurement by Fourier-transform ion cyclotron resonance mass spectrometry (FT-ICR-MS) and structure elucidation by nuclear magnetic resonance spectroscopy (NMR). GA was also used to generate correlated networks for pathway analysis. Several case studies, comprising groups of plant samples bearing different genotypes and groups of samples of human origin, namely urine samples from patients and healthy volunteers, demonstrated that such a workflow, combining comprehensive metabolic profiling and advanced data mining techniques, provides a powerful approach for pattern recognition and biomarker discovery. Full article
(This article belongs to the Special Issue Algorithms and Molecular Sciences)
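The GA-based feature selection step can be sketched as follows: binary masks over peaks evolve by selection, one-point crossover, and bit-flip mutation, with a fitness that rewards group discrimination and penalizes subset size. The data, fitness function, and GA settings are illustrative assumptions, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a peak table: 60 samples x 40 peaks, two groups.
# Only peaks 3, 17 and 29 differ between groups (an assumption made here
# for illustration; the paper works with real MS profiles).
n, p = 60, 40
labels = np.repeat([0, 1], n // 2)
X = rng.normal(size=(n, p))
X[np.ix_(labels == 1, [3, 17, 29])] += 2.0

def fitness(mask):
    # Nearest-centroid accuracy on the selected peaks, minus a small
    # penalty on subset size so the GA prefers the smallest subsets.
    if not mask.any():
        return 0.0
    Z = X[:, mask]
    c0, c1 = Z[labels == 0].mean(0), Z[labels == 1].mean(0)
    pred = np.linalg.norm(Z - c1, axis=1) < np.linalg.norm(Z - c0, axis=1)
    return (pred == labels).mean() - 0.01 * mask.sum()

pop = rng.random((30, p)) < 0.1                      # sparse random masks
for _ in range(40):
    scores = np.array([fitness(m) for m in pop])
    elite = pop[np.argsort(scores)[::-1][:10]]       # keep the 10 best
    children = []
    for _ in range(20):
        a, b = elite[rng.integers(10)], elite[rng.integers(10)]
        cut = rng.integers(1, p)
        child = np.concatenate([a[:cut], b[cut:]])   # one-point crossover
        children.append(child ^ (rng.random(p) < 0.02))  # bit-flip mutation
    pop = np.vstack([elite, *children])

best = max(pop, key=fitness)
```

On this synthetic table the GA converges to a small mask dominated by the informative peaks.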
Open Access Article A Bayesian Algorithm for Functional Mapping of Dynamic Complex Traits
Algorithms 2009, 2(2), 667-691; doi:10.3390/a2020667
Received: 8 January 2009 / Revised: 6 March 2009 / Accepted: 24 March 2009 / Published: 21 April 2009
Cited by 9 | PDF Full-text (311 KB) | HTML Full-text | XML Full-text
Abstract
Functional mapping of dynamic traits measured in a longitudinal study was originally derived within the maximum likelihood (ML) context and implemented with the EM algorithm. Although ML-based functional mapping possesses many favorable statistical properties in parameter estimation, it may be computationally intractable for analyzing longitudinal data with high dimensions and high measurement errors. In this article, we derive a general functional mapping framework for quantitative trait locus mapping of dynamic traits within the Bayesian paradigm. Markov chain Monte Carlo techniques were implemented for functional mapping to estimate biologically and statistically sensible parameters that model the structures of the time-dependent genetic effects and the covariance matrix. The Bayesian approach is useful for handling difficulties in constructing confidence intervals as well as the identifiability problem, enhancing the statistical inference of functional mapping. We have undertaken simulation studies to investigate the statistical behavior of Bayesian-based functional mapping and used a real example with F2 mice to validate the utility and usefulness of the model. Full article
(This article belongs to the Special Issue Algorithms and Molecular Sciences)
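The Bayesian machinery rests on Markov chain Monte Carlo; a minimal random-walk Metropolis sampler (here for the mean of Gaussian data with a flat prior, far simpler than functional mapping) shows the accept/reject mechanics:

```python
import numpy as np

rng = np.random.default_rng(7)

# Gaussian observations with known unit variance; flat prior on the mean.
data = rng.normal(2.0, 1.0, 100)

def log_post(mu):
    return -0.5 * ((data - mu) ** 2).sum()

chain, mu = [], 0.0
lp = log_post(mu)
for _ in range(5000):
    prop = mu + rng.normal(0.0, 0.3)          # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:   # Metropolis acceptance
        mu, lp = prop, lp_prop
    chain.append(mu)

posterior_mean = np.mean(chain[1000:])        # discard burn-in
```

The posterior mean recovered from the chain agrees with the sample mean of the data, as it should under a flat prior.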
Open Access Article Fast Structural Alignment of Biomolecules Using a Hash Table, N-Grams and String Descriptors
Algorithms 2009, 2(2), 692-709; doi:10.3390/a2020692
Received: 30 November 2008 / Revised: 8 April 2009 / Accepted: 9 April 2009 / Published: 21 April 2009
Cited by 17 | PDF Full-text (465 KB) | HTML Full-text | XML Full-text
Abstract
This work presents a generalized approach for the fast structural alignment of thousands of macromolecular structures. The method uses string representations of a macromolecular structure and a hash table that stores n-grams of a certain size for searching. To this end, macromolecular structure-to-string translators were implemented for protein and RNA structures. A query against the index is performed in two hierarchical steps to unite speed and precision. In the first step the query structure is translated into n-grams, and all target structures containing these n-grams are retrieved from the hash table. In the second step all corresponding n-grams of the query and each target structure are subsequently aligned, and after each alignment a score is calculated based on the matching n-grams of query and target. The extendable framework enables the user to query and structurally align thousands of protein and RNA structures on a commodity machine and is available as open source from http://lajolla.sf.net. Full article
(This article belongs to the Special Issue Algorithms and Molecular Sciences)
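The two-step query scheme (retrieve targets sharing n-grams from a hash table, then score each candidate by its matching n-grams) can be sketched as below; the structural-alphabet strings are hypothetical, not output of the paper's structure-to-string translators:

```python
from collections import defaultdict

N = 3  # n-gram size

def ngrams(s, n=N):
    return [s[i:i + n] for i in range(len(s) - n + 1)]

# Step 1: index every target structure string under its n-grams.
targets = {
    "1abc": "HHHLLEEHHL",
    "2xyz": "EEELLHHHLL",
    "3foo": "LLLEEEHHHH",
}
index = defaultdict(set)
for name, s in targets.items():
    for g in ngrams(s):
        index[g].add(name)

def query(qs):
    # Step 2: retrieve candidates sharing n-grams with the query, then
    # score each candidate by the number of query n-grams it contains.
    qgrams = ngrams(qs)
    candidates = set().union(*(index.get(g, set()) for g in qgrams))
    scores = {c: sum(g in ngrams(targets[c]) for g in qgrams)
              for c in candidates}
    return sorted(scores.items(), key=lambda kv: -kv[1])
```

On this toy index, `query("HHHLLE")` ranks `1abc` first with four matching 3-grams.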
Open Access Article ALE-PSO: An Adaptive Swarm Algorithm to Solve Design Problems of Laminates
Algorithms 2009, 2(2), 710-734; doi:10.3390/a2020710
Received: 23 March 2009 / Revised: 13 April 2009 / Accepted: 16 April 2009 / Published: 21 April 2009
Cited by 5 | PDF Full-text (313 KB) | HTML Full-text | XML Full-text
Abstract
This paper presents an adaptive PSO algorithm whose numerical parameters can be updated following a scheduled protocol that respects some known convergence criteria, in order to improve the chances of reaching the global optimum of a hard combinatorial optimization problem, such as those encountered in global optimization problems of composite laminates. Some examples concerning hard design problems are provided, showing the effectiveness of the approach. Full article
(This article belongs to the Special Issue Numerical Simulation of Discontinuities in Mechanics)
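A minimal version of such a PSO with scheduled parameters is sketched below on a simple sphere objective: the inertia decays while the cognitive weight hands over to the social weight, shifting the swarm from exploration to exploitation. The schedule is an illustrative choice, not the paper's protocol:

```python
import numpy as np

rng = np.random.default_rng(2)

def sphere(x):                                # test objective, minimum 0 at origin
    return (x ** 2).sum(axis=1)

d, n_particles, iters = 5, 30, 200
x = rng.uniform(-5.0, 5.0, (n_particles, d))
v = np.zeros_like(x)
pbest = x.copy()
pbest_f = sphere(x)
gbest = pbest[pbest_f.argmin()].copy()

for t in range(iters):
    # Scheduled parameter protocol (illustrative): inertia w decays
    # 0.9 -> 0.4 while c1 decreases and c2 increases over the run.
    frac = t / iters
    w, c1, c2 = 0.9 - 0.5 * frac, 2.5 - 1.5 * frac, 1.0 + 1.5 * frac
    r1 = rng.random((n_particles, d))
    r2 = rng.random((n_particles, d))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    f = sphere(x)
    better = f < pbest_f
    pbest[better] = x[better]
    pbest_f[better] = f[better]
    gbest = pbest[pbest_f.argmin()].copy()
```

The real design problem is combinatorial (stacking sequences of laminates); the continuous sphere function is used here only to keep the sketch short.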
Open Access Article Application of an Image Tracking Algorithm in Fire Ant Motion Experiment
Algorithms 2009, 2(2), 735-749; doi:10.3390/a2020735
Received: 26 March 2009 / Revised: 23 April 2009 / Accepted: 24 April 2009 / Published: 30 April 2009
Cited by 2 | PDF Full-text (2044 KB) | HTML Full-text | XML Full-text
Abstract
An image tracking algorithm, originally used with particle image velocimetry (PIV) to determine the velocities of buoyant solid particles in water, is modified and applied in the present work to detect the motion of fire ants on a planar surface. A group of fire ant workers is placed at the bottom of a tub and excited with vibration of selected frequency and intensity. The moving fire ants are captured with an imaging system that successively acquires frames at high digital resolution. The background noise in the recordings is estimated by averaging hundreds of frames and removed from each frame. Individual fire ant images are identified with a recursive digital filter and then tracked between frames according to the size, brightness, shape, and orientation angle of the ant image. The speed of an individual ant is determined from the displacement of its images and the time interval between frames. The trail of each fire ant is determined from the image tracking results, and a statistical analysis is conducted for all the fire ants in the group. The purpose of the experiment is to investigate the response of fire ants to substrate vibration. Test results indicate that the fire ants move faster after being excited, but the number of active ones is not increased even after a strong excitation. Full article
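The background-averaging and centroid-tracking steps can be sketched on synthetic frames (a bright patch moving over noise; the frame rate, image size, and brightness are assumptions for the example):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for the recordings: a bright 3x3 "ant" moving one
# pixel per frame across a noisy 64x64 background.
frames = []
for t in range(20):
    img = 10.0 + rng.normal(0.0, 1.0, (64, 64))
    r, c = 20, 10 + t                         # ant moves right, 1 px/frame
    img[r:r + 3, c:c + 3] += 50.0
    frames.append(img)
frames = np.array(frames)

# Background = per-pixel average over all frames (the moving ant barely
# affects the mean); subtracting it isolates the moving object.
background = frames.mean(axis=0)
fg = frames - background

def centroid(img, thresh=20.0):
    ys, xs = np.nonzero(img > thresh)
    return ys.mean(), xs.mean()

# Track the centroid frame to frame; speed = displacement / frame interval.
cents = np.array([centroid(f) for f in fg])
dt = 1.0 / 30.0                               # assumed 30 fps
speeds = np.linalg.norm(np.diff(cents, axis=0), axis=1) / dt
```

With the ant moving one pixel per frame at the assumed 30 fps, the recovered speed is 30 pixels per second.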
Open Access Article Probabilistic Upscaling of Material Failure Using Random Field Models – A Preliminary Investigation
Algorithms 2009, 2(2), 750-763; doi:10.3390/a2020750
Received: 31 December 2008 / Revised: 2 March 2009 / Accepted: 22 April 2009 / Published: 30 April 2009
PDF Full-text (400 KB) | HTML Full-text | XML Full-text
Abstract
Complexity of failure is reflected in the sensitivity of strength to small defects and the wide scatter of macroscopic behaviors. In engineering practice, spatial information about materials at fine scales is only partially measurable. Random field (RF) models are important for addressing the uncertainty in spatial distribution. To transform an RF of micro-cracks into a failure probability at the full structural scale, crossing a number of length scales, the operator representing physical laws needs to be implemented in a multiscale framework and realized in a stochastic setting. Multiscale stochastic modeling of materials is emerging as a new methodology at this research frontier, providing a new multiscale perspective by upscaling fine-scale RFs. In this study, a preliminary framework of probabilistic upscaling is presented for bottom-up hierarchical modeling of failure propagation across micro-meso-macro scales. In the micro-to-meso process, the strength of a stochastic representative volume element (SRVE) is probabilistically assessed by using a lattice model. A mixed Weibull-Gaussian distribution is proposed to characterize the statistical strength of the SRVE, which can be used as input for the subsequent meso-to-macro upscaling process using smeared crack finite element analysis. Full article
(This article belongs to the Special Issue Numerical Simulation of Discontinuities in Mechanics)
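Sampling from such a mixed Weibull-Gaussian strength model is straightforward; the parameter values below are illustrative, not fitted SRVE values from the paper:

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(4)

# Mixed Weibull-Gaussian strength model: with probability p the strength
# follows a Weibull law (defect-controlled weak tail), otherwise a
# Gaussian bulk.
p = 0.3                      # Weibull mixture weight
k, lam = 2.0, 2.5            # Weibull shape and scale
mu, sigma = 4.0, 0.5         # Gaussian mean and standard deviation

def sample_strength(size):
    use_weibull = rng.random(size) < p
    s = rng.normal(mu, sigma, size)
    s[use_weibull] = lam * rng.weibull(k, use_weibull.sum())
    return s

s = sample_strength(100_000)
# Mixture mean in closed form: p * lam * Gamma(1 + 1/k) + (1 - p) * mu
mean_theory = p * lam * gamma(1.0 + 1.0 / k) + (1.0 - p) * mu
```

The empirical mean of the samples matches the closed-form mixture mean, a quick sanity check on the sampler.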
Open Access Article SDPhound, a Mutual Information-Based Method to Investigate Specificity-Determining Positions
Algorithms 2009, 2(2), 764-789; doi:10.3390/a2020764
Received: 22 December 2008 / Revised: 12 March 2009 / Accepted: 5 May 2009 / Published: 26 May 2009
Cited by 2 | PDF Full-text (1452 KB) | HTML Full-text | XML Full-text | Supplementary Files
Abstract
Considerable importance in molecular biophysics is attached to influencing the specific properties of a protein family by mutagenesis. The working hypothesis is that mutating residues at a few selected positions can affect specificity. Statistical analysis of homologous sequences can identify putative specificity-determining positions (SDPs) and help shed some light on the peculiarities underlying their functional role. In this work, we present an approach to identify such positions inspired by state-of-the-art mutual information-based SDP prediction methods. The algorithm based on this approach provides a systematic procedure to point at the relevant physical characteristics of putative SDPs and can investigate the effects of correlated mutations. The method is tested on two standard benchmarks in the field and further validated in the context of a biologically interesting problem: the multimerization of Intrinsically Fluorescent Proteins (IFPs). Full article
(This article belongs to the Special Issue Algorithms and Molecular Sciences)
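The core mutual-information computation behind SDP prediction can be sketched on a toy alignment in which one column covaries perfectly with the subfamily split (the sequences are hypothetical, for illustration only):

```python
import math
from collections import Counter

# Toy alignment: two functional subfamilies (A and B). Column 2 is
# constructed to covary with subfamily membership; the other columns
# are conserved or vary independently of the split.
seqs = ["MKVLD", "MKVLE", "MKVLD", "MRVLD",
        "MKILD", "MKILE", "MKILD", "MRILD"]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]

def mutual_information(column, labels):
    # MI between the residues in one alignment column and the labels.
    n = len(labels)
    joint = Counter(zip(column, labels))
    pc, pl = Counter(column), Counter(labels)
    mi = 0.0
    for (a, g), c in joint.items():
        pxy = c / n
        mi += pxy * math.log2(pxy / ((pc[a] / n) * (pl[g] / n)))
    return mi

mis = [mutual_information([s[j] for s in seqs], groups)
       for j in range(len(seqs[0]))]
sdp = max(range(len(mis)), key=mis.__getitem__)   # putative SDP column
```

Column 2 (V in subfamily A, I in subfamily B) scores a full bit of mutual information and is flagged; conserved and group-independent columns score zero.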
Open Access Article Security of the Bennett-Brassard Quantum Key Distribution Protocol against Collective Attacks
Algorithms 2009, 2(2), 790-807; doi:10.3390/a2020790
Received: 16 October 2008 / Revised: 17 April 2009 / Accepted: 7 May 2009 / Published: 3 June 2009
PDF Full-text (304 KB) | HTML Full-text | XML Full-text
Abstract
The theoretical Quantum Key-Distribution scheme of Bennett and Brassard (BB84) has been proven secure against very strong attacks, including collective attacks and joint attacks. Though the latter are the most general attacks, collective attacks are much easier to analyze, yet they are conjectured to be just as informative to the eavesdropper. Thus, collective attacks are likely to be useful in the analysis of many theoretical and practical schemes that still lack a proof of security, including practical BB84 schemes. We show how powerful tools developed in previous works for proving security against the joint attack are simplified when applied to the security of BB84 against collective attacks, while providing the same bounds on leaked information and the same error threshold. Full article
(This article belongs to the Special Issue Algorithms and Molecular Sciences)
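For readers unfamiliar with BB84 itself, the sifting stage that both security analyses build on can be simulated in a few lines (ideal channel, no eavesdropper, so the sifted error rate is zero and roughly half the positions survive):

```python
import numpy as np

rng = np.random.default_rng(5)

n = 2000
alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)    # 0 = rectilinear, 1 = diagonal
bob_bases = rng.integers(0, 2, n)

# Ideal channel, no eavesdropper: Bob's outcome equals Alice's bit when
# the bases match, and is uniformly random otherwise.
random_bits = rng.integers(0, 2, n)
bob_bits = np.where(alice_bases == bob_bases, alice_bits, random_bits)

# Sifting: keep only positions where the bases agree, then estimate the
# error rate (QBER) on the sifted key.
keep = alice_bases == bob_bases
sifted_a, sifted_b = alice_bits[keep], bob_bits[keep]
qber = (sifted_a != sifted_b).mean()
```

An eavesdropper performing a collective attack would raise this error rate; the security proofs bound how much key information can leak for a given observed QBER.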
Open Access Article Failure Assessment of Layered Composites Subject to Impact Loadings: a Finite Element, Sigma-Point Kalman Filter Approach
Algorithms 2009, 2(2), 808-827; doi:10.3390/a2020808
Received: 1 April 2009 / Accepted: 27 May 2009 / Published: 4 June 2009
Cited by 6 | PDF Full-text (616 KB) | HTML Full-text | XML Full-text
Abstract
We present a coupled finite element, Kalman filter approach to foresee impact-induced delamination of layered composites when mechanical properties are partially unknown. Since direct numerical simulations, which require all the constitutive parameters to be assigned, cannot be run in such cases, an inverse problem is formulated to allow for modeling as well as constitutive uncertainties. Upon space discretization through finite elements and time integration through the explicit α-method, the resulting nonlinear stochastic state model, wherein nonlinearities are due to delamination growth, is attacked with sigma-point Kalman filtering. Comparison with experimental data available in the literature, concerning inter-laminar failure of layered composites subject to low-velocity impacts, shows that the proposed procedure leads to an accurate description of the failure mode and to converged estimates of inter-laminar strength and toughness in good agreement with experimental data. Full article
(This article belongs to the Special Issue Numerical Simulation of Discontinuities in Mechanics)
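The sigma-point idea is to propagate a small, deterministically chosen set of points through the nonlinearity instead of linearizing it. A minimal unscented transform (shown on a generic 2-D example, not on the delamination model of the paper) looks like:

```python
import numpy as np

def unscented_transform(mean, cov, f, kappa=1.0):
    """Propagate a Gaussian (mean, cov) through f via sigma points."""
    n = len(mean)
    L = np.linalg.cholesky((n + kappa) * cov)     # scaled matrix square root
    pts = [mean] + [mean + L[:, i] for i in range(n)] \
                 + [mean - L[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    ys = np.array([f(p) for p in pts])            # push points through f
    ymean = (w[:, None] * ys).sum(axis=0)
    diff = ys - ymean
    ycov = (w[:, None, None] * diff[:, :, None] * diff[:, None, :]).sum(axis=0)
    return ymean, ycov
```

For a linear map the transform reproduces the exact mean and covariance; its value is that the same mechanics apply unchanged to nonlinear state models such as delamination growth.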
Open Access Article Bayesian Maximum Entropy Based Algorithm for Digital X-ray Mammogram Processing
Algorithms 2009, 2(2), 850-878; doi:10.3390/a2020850
Received: 5 December 2008 / Revised: 11 May 2009 / Accepted: 28 May 2009 / Published: 9 June 2009
PDF Full-text (1016 KB)
Abstract
Basics of Bayesian statistics in inverse problems using the maximum entropy principle are summarized in connection with the restoration of positive, additive images from various types of data, such as X-ray digital mammograms. An efficient iterative algorithm for image restoration from large data sets, based on the conjugate gradient method and Lagrange multipliers in nonlinear optimization of a specific potential function, was developed. The point spread function of the imaging system was determined by numerical simulations of inhomogeneous breast-like tissue with microcalcification inclusions of various opacities. The processed digital and digitized mammograms proved superior to their raw counterparts in terms of contrast, resolution, noise, and visibility of details. Full article
(This article belongs to the Special Issue Algorithms and Molecular Sciences)
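A heavily simplified 1-D version of entropy-regularized restoration is sketched below: blurred, noisy data are inverted by minimizing chi-square minus a small multiple of the entropy, with a positivity clip. The paper uses conjugate gradients, Lagrange multipliers, and a specific potential function; plain gradient descent and all parameter values here are only for illustration:

```python
import numpy as np

rng = np.random.default_rng(8)

# 1-D stand-in for a mammogram line profile: a few bright
# "microcalcifications", blurred by an assumed point spread function
# and corrupted by noise.
n = 64
truth = np.zeros(n)
truth[[20, 21, 40]] = [5.0, 4.0, 3.0]
kernel = np.array([0.25, 0.5, 0.25])          # assumed PSF
H = np.zeros((n, n))
for i in range(n):
    for k, w in zip((-1, 0, 1), kernel):
        H[i, (i + k) % n] = w                 # circulant blur matrix
data = H @ truth + rng.normal(0.0, 0.05, n)

# Minimize chi-square minus alpha * entropy, with S(f) = -sum f log f,
# by gradient descent, clipping to keep the image positive.
alpha = 0.01
f = np.full(n, 0.5)
for _ in range(2000):
    grad = 2.0 * H.T @ (H @ f - data) + alpha * (np.log(f) + 1.0)
    f = np.clip(f - 0.05 * grad, 1e-6, None)
```

The entropy term keeps the restored profile positive and suppresses the background while the bright inclusions are recovered.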

Review

Jump to: Research

Open Access Review Computer-Aided Diagnosis in Mammography Using Content-Based Image Retrieval Approaches: Current Status and Future Perspectives
Algorithms 2009, 2(2), 828-849; doi:10.3390/a2020828
Received: 29 April 2009 / Revised: 28 May 2009 / Accepted: 28 May 2009 / Published: 4 June 2009
Cited by 29 | PDF Full-text (178 KB) | HTML Full-text | XML Full-text
Abstract
With the rapid advance of digital imaging technologies, content-based image retrieval (CBIR) has become one of the most active research areas in computer vision. In the last several years, developing computer-aided detection and/or diagnosis (CAD) schemes that use CBIR to search for clinically relevant and visually similar medical images (or regions) depicting suspicious lesions has also been attracting research interest. CBIR-based CAD schemes have the potential to provide radiologists with a “visual aid” and to increase their confidence in accepting CAD-cued results in decision making. CAD performance and reliability depend on a number of factors, including the optimization of lesion segmentation, feature selection, reference database size, computational efficiency, and the relationship between the clinical relevance and visual similarity of the CAD results. By presenting and comparing a number of approaches commonly used in previous studies, this article identifies and discusses optimal approaches for developing CBIR-based CAD schemes and assessing their performance. Although preliminary studies have suggested that using CBIR-based CAD schemes might improve radiologists’ performance and/or increase their confidence in decision making, this technology is still at an early development stage. Much research work is needed before CBIR-based CAD schemes can be accepted in clinical practice. Full article
(This article belongs to the Special Issue Machine Learning for Medical Imaging)
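The retrieval core of a CBIR-based CAD scheme reduces to ranking reference lesions by feature distance and cueing a score from the neighbors' labels; the feature vectors and database below are hypothetical stand-ins for real segmented-lesion features:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical reference database: each row is a feature vector for one
# segmented lesion region, labeled 0 (benign) or 1 (malignant).
ref = np.vstack([rng.normal(0.0, 1.0, (50, 4)),
                 rng.normal(3.0, 1.0, (50, 4))])
labels = np.repeat([0, 1], 50)

def retrieve(query, k=7):
    # CBIR step: rank reference regions by Euclidean feature distance.
    d = np.linalg.norm(ref - query, axis=1)
    nearest = np.argsort(d)[:k]
    # CAD cue: fraction of malignant neighbors as a likelihood score.
    return nearest, labels[nearest].mean()

nearest, score = retrieve(np.full(4, 3.0))
```

In a deployed scheme the retrieved neighbors would also be shown to the radiologist as the “visual aid” the review discusses, which is what distinguishes CBIR-based CAD from a plain classifier.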

Journal Contact

MDPI AG
Algorithms Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
algorithms@mdpi.com
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18