Information, Volume 6, Issue 4 (December 2015) – 20 articles, Pages 576–894

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open it.
Article
An Interval-Valued Intuitionistic Fuzzy MADM Method Based on a New Similarity Measure
by Haiping Ren and Guofu Wang
Information 2015, 6(4), 880-894; https://doi.org/10.3390/info6040880 - 18 Dec 2015
Cited by 13 | Viewed by 3532
Abstract
Similarity measure is one of the most important measures for interval-valued intuitionistic fuzzy (IVIF) sets. This article puts forward a new similarity measure that considers the impacts of the membership degree, nonmembership degree and median point of IVIF sets. For multi-attribute decision-making (MADM) problems with partially known attribute weight information, a new weighting method is put forward by establishing a maximum-similarity optimization model to solve for the optimal weights. Further, a new decision-making method is developed on the basis of the proposed similarity measure, and an applied example demonstrates the effectiveness and feasibility of the proposed methods. Full article
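A minimal sketch of the general shape such a measure takes (the paper's exact formula is not reproduced here; the combination of interval and median-point distances below is illustrative only):

```python
# Hypothetical sketch of a similarity measure between two interval-valued
# intuitionistic fuzzy (IVIF) values. It combines the three ingredients the
# abstract names: membership-interval distance, nonmembership-interval
# distance and median-point distance. Not the paper's formula.

def ivif_similarity(x, y):
    """x, y = ([mu_lo, mu_hi], [nu_lo, nu_hi]) IVIF values on one element."""
    (mx, nx), (my, ny) = x, y
    # component-wise interval distances for membership and nonmembership
    d_mu = (abs(mx[0] - my[0]) + abs(mx[1] - my[1])) / 2
    d_nu = (abs(nx[0] - ny[0]) + abs(nx[1] - ny[1])) / 2
    # median (midpoint) distance, the extra term this paper emphasises
    d_med = abs((mx[0] + mx[1]) / 2 - (my[0] + my[1]) / 2)
    return 1 - (d_mu + d_nu + d_med) / 3

a = ([0.3, 0.5], [0.2, 0.4])
b = ([0.4, 0.6], [0.1, 0.3])
print(ivif_similarity(a, b))  # 1.0 for identical values, smaller otherwise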
Article
Codeword Structure Analysis for LDPC Convolutional Codes
by Hua Zhou, Jiao Feng, Peng Li and Jingming Xia
Information 2015, 6(4), 866-879; https://doi.org/10.3390/info6040866 - 14 Dec 2015
Viewed by 4077
Abstract
The codewords of a low-density parity-check (LDPC) convolutional code (LDPC-CC) are characterised as structured or non-structured. The number of structured codewords is dominated by the size of the polynomial syndrome former matrix H^T(D), while the number of non-structured ones depends on the particular monomials or polynomials in H^T(D). By evaluating the relationship between the codewords of the mother code and those of its super codes, the low-weight non-structured codewords in the super codes can be eliminated by appropriately choosing the monomials or polynomials in H^T(D), resulting in an improved distance spectrum for the mother code. Full article
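As a hedged illustration only (this example is not taken from the paper), a rate-1/2 LDPC convolutional code can be specified by a 2 × 1 polynomial syndrome former matrix, and every codeword v(D) = (v_1(D), v_2(D)) must satisfy the syndrome equation:

```latex
H^{\mathsf{T}}(D) =
\begin{pmatrix}
1 + D + D^{2} \\
1 + D^{2}
\end{pmatrix},
\qquad
v(D)\, H^{\mathsf{T}}(D)
  = v_1(D)\bigl(1 + D + D^{2}\bigr) + v_2(D)\bigl(1 + D^{2}\bigr) = 0 .
```

Choosing different monomials or polynomials as entries changes which low-weight non-structured codewords can satisfy this equation, which is the degree of freedom the paper exploits.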
Article
Effects of Semantic Features on Machine Learning-Based Drug Name Recognition Systems: Word Embeddings vs. Manually Constructed Dictionaries
by Shengyu Liu, Buzhou Tang, Qingcai Chen and Xiaolong Wang
Information 2015, 6(4), 848-865; https://doi.org/10.3390/info6040848 - 11 Dec 2015
Cited by 47 | Viewed by 6887
Abstract
Semantic features are very important for machine learning-based drug name recognition (DNR) systems. The semantic features used in most DNR systems are based on drug dictionaries manually constructed by experts. Building large-scale drug dictionaries is a time-consuming task, and adding new drugs to existing dictionaries immediately after they are developed is also a challenge. In recent years, word embeddings that contain rich latent semantic information of words have been widely used to improve the performance of various natural language processing tasks. However, they have not been used in DNR systems. Compared to semantic features based on drug dictionaries, the advantage of word embeddings is that they are learned in an unsupervised manner. In this paper, we investigate the effect of semantic features based on word embeddings on DNR and compare them with semantic features based on three drug dictionaries. We propose a conditional random fields (CRF)-based system for DNR. The skip-gram model, an unsupervised algorithm, is used to induce word embeddings on about 17.3 gigabytes (GB) of unlabeled biomedical text collected from MEDLINE (National Library of Medicine, Bethesda, MD, USA). The system is evaluated on the drug-drug interaction extraction (DDIExtraction) 2013 corpus. Experimental results show that word embeddings significantly improve the performance of the DNR system and are competitive with semantic features based on drug dictionaries. The F-score is improved by 2.92 percentage points when word embeddings are added to the baseline system, comparable with the improvements from the dictionary-based semantic features. Furthermore, word embeddings are complementary to the semantic features based on drug dictionaries. When both are added, the system achieves the best performance with an F-score of 78.37%, which outperforms the best system of the DDIExtraction 2013 challenge by 6.87 percentage points. Full article
(This article belongs to the Special Issue Recent Advances of Big Data Technology)
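A minimal sketch of this kind of pipeline (not the authors' code), assuming gensim ≥ 4.0 and sklearn-crfsuite are available; the sentences and BIO labels are toy placeholders:

```python
# Hedged sketch: skip-gram word embeddings (sg=1) induced on unlabeled text,
# then injected as real-valued features into a CRF-based drug name recognizer.
from gensim.models import Word2Vec
import sklearn_crfsuite

sentences = [["aspirin", "inhibits", "platelet", "aggregation"],
             ["ibuprofen", "is", "an", "nsaid"]]
w2v = Word2Vec(sentences, vector_size=50, window=5, sg=1, min_count=1)

def features(sent, i):
    """Surface features plus the word's embedding dimensions as features."""
    word = sent[i]
    feats = {"lower": word.lower(), "is_title": word.istitle()}
    if word in w2v.wv:
        for d, v in enumerate(w2v.wv[word]):
            feats[f"emb_{d}"] = float(v)  # CRFsuite accepts numeric features
    return feats

X = [[features(s, i) for i in range(len(s))] for s in sentences]
y = [["B-DRUG", "O", "O", "O"], ["B-DRUG", "O", "O", "O"]]  # toy BIO labels
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
crf.fit(X, y)
print(crf.predict(X)[0])
```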
Article
Datafication and the Seductive Power of Uncertainty—A Critical Exploration of Big Data Enthusiasm
by Stefan Strauß
Information 2015, 6(4), 836-847; https://doi.org/10.3390/info6040836 - 09 Dec 2015
Cited by 12 | Viewed by 5850
Abstract
This contribution explores the fine line between overestimated expectations and underrepresented moments of uncertainty that correlate with the prevalence of big data. Big data promises a multitude of innovative options for enhancing decision-making by employing algorithmic power to extract valuable information from large unstructured data sets. Datafication—the exploitation of raw data in many different contexts—can be seen as an attempt to tackle complexity and reduce uncertainty. The prospects for innovative applications that yield new insights and valuable knowledge are accordingly promising, in a variety of domains ranging from business strategy and security to health and medical research. However, big data also entails an increase in complexity that, together with growing automation, may trigger not merely uncertain but also unintended societal events. As a new source of networking power, big data carries inherent risks of creating new asymmetries and transforming possibilities into probabilities, which can inter alia affect the autonomy of the individual. To reduce these risks, the challenges ahead include improving data quality and interpretation, supported by new modalities that allow for scrutiny and verifiability of big data analytics. Full article
(This article belongs to the Special Issue Selected Papers from the ISIS Summit Vienna 2015)
Editorial
A Summary of the Special Issue “Cybersecurity and Cryptography”
by Qiong Huang and Guomin Yang
Information 2015, 6(4), 833-835; https://doi.org/10.3390/info6040833 - 08 Dec 2015
Cited by 1 | Viewed by 3192
Abstract
Nowadays in the cyber world, massive amounts of data are being collected, transmitted, and stored by different organizations and individuals. [...] Full article
(This article belongs to the Special Issue Cybersecurity and Cryptography)
Article
Information and Phylogenetic Systematic Analysis
by Walter Craig and Jonathon Stone
Information 2015, 6(4), 811-832; https://doi.org/10.3390/info6040811 - 08 Dec 2015
Cited by 1 | Viewed by 3900
Abstract
Information in phylogenetic systematic analysis has been conceptualized, defined, quantified, and used differently by different authors. In this paper, we start with the Shannon uncertainty measure I, applying it to cladograms containing only consistent character states. We formulate a general expression for I, utilizing a standard format for taxon-character matrices, and investigate the effect that adding data to an existing taxon-character matrix has on I. We show that I may increase when character vectors that encode autapomorphic or synapomorphic character states are added. However, as added character vectors accumulate, I tends to a limit, which generally is less than the maximum I. We show computationally and analytically that lim_{c→∞} I = log2(t), in which t enumerates taxa and c enumerates characters. For any particular t, upper and lower bounds on I exist. We use our observations to suggest several interpretations of the relationship between information and phylogenetic systematic analysis that previously eluded precise recognition. Full article
(This article belongs to the Section Information Theory and Methodology)
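A small numerical check of the limiting value (this is an illustration of the bound, not the paper's definition of I over taxon-character matrices): the Shannon measure of a uniform distribution over t taxa equals log2(t), the stated limit as characters accumulate.

```python
# Hedged illustration: Shannon measure of a uniform distribution over t taxa
# equals log2(t), matching lim_{c→∞} I = log2(t) from the abstract.
import math

def shannon(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

for t in (2, 4, 8, 16):
    uniform = [1 / t] * t
    print(t, shannon(uniform), math.log2(t))  # the two columns agree
```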
Review
Drug Name Recognition: Approaches and Resources
by Shengyu Liu, Buzhou Tang, Qingcai Chen and Xiaolong Wang
Information 2015, 6(4), 790-810; https://doi.org/10.3390/info6040790 - 25 Nov 2015
Cited by 34 | Viewed by 14302
Abstract
Drug name recognition (DNR), which seeks to recognize drug mentions in unstructured medical texts and classify them into pre-defined categories, is a fundamental task of medical information extraction and a key component of many medical relation extraction systems and applications. Considerable effort has been devoted to DNR, and great progress has been made over the last several decades. We present here a comprehensive review of studies on DNR, covering the challenges of DNR, the existing approaches and resources for it, and possible future directions. Full article
(This article belongs to the Section Information Theory and Methodology)
Article
A Non-Probabilistic Model of Relativised Predictability in Physics
by Alastair A. Abbott, Cristian S. Calude and Karl Svozil
Information 2015, 6(4), 773-789; https://doi.org/10.3390/info6040773 - 19 Nov 2015
Cited by 7 | Viewed by 3833
Abstract
Unpredictability is an important concept throughout physics and plays a central role in quantum information theory. Despite this, little effort has been devoted to studying generalised notions or models of (un)predictability in physics. In this paper, we continue the programme of developing a general, non-probabilistic model of (un)predictability in physics. We present a more refined model that is capable of studying different degrees of “relativised” unpredictability. This model is based on the ability of an agent, acting via uniform, effective means, to predict correctly and reproducibly the outcome of an experiment using finite information extracted from the environment. We use this model to further study the degree of unpredictability certified by different quantum phenomena, showing that quantum complementarity guarantees a form of relativised unpredictability that is weaker than that guaranteed by Kochen–Specker-type value indefiniteness. We further exemplify the difference between certification by complementarity and by value indefiniteness by showing that, unlike value indefiniteness, complementarity is compatible with the production of computable sequences of bits. Full article
(This article belongs to the Special Issue Selected Papers from the ISIS Summit Vienna 2015)
Article
Three Aspects of Information Science in Reality: Symmetry, Semiotics and Society
by Joseph E. Brenner
Information 2015, 6(4), 750-772; https://doi.org/10.3390/info6040750 - 17 Nov 2015
Cited by 4 | Viewed by 3900
Abstract
The 2nd International Conference on the Philosophy of Information (ICPI 2015) took place in Vienna, June 5–6, 2015, as a major section of the Vienna 2015 Summit Conference on the Response and Responsibility of the Information Sciences. At the ICPI, Wu Kun and others presented evidence for a current integration and convergence of the philosophy and science of information, under the influence of the unique characteristics of information itself. As I have shown, my extension of logic to real systems (Logic in Reality; LIR) applies to and explicates many of the properties of information processes. In this paper, I apply LIR as a framework for understanding the operation of information in three areas of science and philosophy that were discussed at the Summit. The utility of this approach in support of an information commons is suggested. Full article
(This article belongs to the Special Issue Selected Papers from the ISIS Summit Vienna 2015)
Article
Digital Information and Value
by Paul Walton
Information 2015, 6(4), 733-749; https://doi.org/10.3390/info6040733 - 10 Nov 2015
Cited by 11 | Viewed by 4802
Abstract
Digital information changes the ways in which people and organisations interact. This paper examines the nature of this change in the context of the author’s Model for Information (MfI). It investigates the relationship between outcomes and value, selection processes and some attributes of information, and explores how this relationship changes in the move from analogue to digital information. Selection processes shape the evolution of information ecosystems in which conventions are established for the ways in which information is used. The conventions determine norms for information friction and information quality, as well as the sources of information and channels used. Digital information reduces information friction, often dramatically, and changes information quality. The growing use of analytics in business increasingly delivers predictive or prescriptive digital information. These changes are happening faster than information ecosystem conventions can change. The relationships established in the paper enable an analysis of, and guide changes to, these conventions, enabling a more effective use of digital information. Full article
Article
Cable Capacitance Attack against the KLJN Secure Key Exchange
by Hsien-Pu Chen, Elias Gonzalez, Yessica Saez and Laszlo B. Kish
Information 2015, 6(4), 719-732; https://doi.org/10.3390/info6040719 - 30 Oct 2015
Cited by 18 | Viewed by 5937
Abstract
The security of the Kirchhoff-law-Johnson-(like)-noise (KLJN) key exchange system is based on the fluctuation-dissipation theorem of classical statistical physics. Similarly to quantum key distribution, in practical situations, due to the non-idealities of the building elements, there is a small information leak, which can be mitigated by privacy amplification or other techniques so that unconditional (information-theoretic) security is preserved. In this paper, the industrial cable and circuit simulator LTspice is used to evaluate the information leak due to one of the non-idealities in KLJN, the parasitic (cable) capacitance. Simulation results show that privacy amplification and/or capacitor killer (capacitance compensation) arrangements can effectively eliminate the leak. Full article
(This article belongs to the Special Issue Cybersecurity and Cryptography)
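A hedged toy simulation of the ideal, capacitance-free KLJN scheme (not the paper's LTspice model): Alice and Bob each terminate the wire with R_L or R_H, and an eavesdropper observing only the mean-square wire voltage cannot distinguish the two secure mixed cases (L,H) and (H,L), since that quantity is symmetric in the two resistors.

```python
# Hedged toy model: Johnson-noise voltage divider on an ideal KLJN wire.
import random, math

kT, df, N = 4.14e-21, 1e3, 200_000   # kT at ~300 K, bandwidth, samples
R_L, R_H = 1e3, 10e3

def mean_square_wire_voltage(Ra, Rb):
    s = 0.0
    for _ in range(N):
        ua = random.gauss(0, math.sqrt(4 * kT * Ra * df))  # Johnson noise
        ub = random.gauss(0, math.sqrt(4 * kT * Rb * df))
        uw = (ua * Rb + ub * Ra) / (Ra + Rb)  # voltage divider on the wire
        s += uw * uw
    return s / N

print(mean_square_wire_voltage(R_L, R_H))  # (L,H) ...
print(mean_square_wire_voltage(R_H, R_L))  # ... and (H,L) match for Eve
```

A parasitic cable capacitance breaks this symmetry slightly, which is exactly the leak the paper quantifies and compensates.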
Article
Batch Attribute-Based Encryption for Secure Clouds
by Chen Yang, Yang Sun and Qianhong Wu
Information 2015, 6(4), 704-718; https://doi.org/10.3390/info6040704 - 29 Oct 2015
Cited by 6 | Viewed by 6479
Abstract
Cloud storage is widely used by organizations due to its advantage of allowing universal access at low cost. Attribute-based encryption (ABE) is a kind of public key encryption suitable for cloud storage. The secret key of each user and the ciphertext are associated with an access policy and an attribute set, respectively; a user holding a secret key can decrypt a ciphertext only if the associated attributes match the predetermined access policy, which allows one to enforce fine-grained access control on outsourced files. One issue in existing ABE schemes is that they are designed for the users of a single organization: when one wants to share data with the users of different organizations, the owner needs to encrypt the messages to the receivers of one organization and then repeat this process for each further organization. This situation is exacerbated as more and more mobile devices use cloud services, since the ABE encryption process is time consuming and may quickly exhaust the power supplies of mobile devices. In this paper, we propose a batch attribute-based encryption (BABE) approach to address this problem in a provably-secure way. With our approach, the data owner can outsource data in batches to the users of different organizations simultaneously. The data owner is allowed to decide the receiving organizations and the attributes required for decryption. Theoretical and experimental analyses show that our approach is more efficient than traditional encryption implementations in both computation and communication. Full article
(This article belongs to the Special Issue Cybersecurity and Cryptography)
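The efficiency gain of batching can be illustrated by analogy with hybrid encryption (this is a stand-in, not the paper's BABE construction): the payload is encrypted once, and only a small data key is wrapped per recipient organization, instead of re-encrypting the whole payload for each organization.

```python
# Hedged analogy using symmetric keys in place of ABE policies.
from cryptography.fernet import Fernet

payload = b"outsourced file" * 1000
data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(payload)       # one pass over the data

org_keys = {org: Fernet.generate_key() for org in ("org-A", "org-B", "org-C")}
wrapped = {org: Fernet(k).encrypt(data_key)          # small per-org work
           for org, k in org_keys.items()}

# Each organization unwraps the data key, then decrypts the shared ciphertext.
recovered = Fernet(Fernet(org_keys["org-B"]).decrypt(wrapped["org-B"])).decrypt(ciphertext)
assert recovered == payload
```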
Article
The Development of Philosophy and Its Fundamental Informational Turn
by Kun Wu
Information 2015, 6(4), 693-703; https://doi.org/10.3390/info6040693 - 21 Oct 2015
Cited by 6 | Viewed by 12162
Abstract
Through the rescientification of philosophy and the philosophization of science, an entirely new concept of science and philosophy as part of general human knowledge is developing. In this concept, science and philosophy become intrinsically integrated and unified, forming dynamic feedback-loops which lead to further mutual transformation and integration. This development is taking place in the face of two kinds of dogmatism: one is naturalistic dogmatism, the other the dogmatism of consciousness philosophy. These two kinds of dogmatism are an inevitable consequence of the method of segmentation of the field of existence in traditional philosophy, namely: existence = matter + Spirit (mind). The development of the Science and Philosophy of Information reveals a world of information-by-itself lying between the worlds of matter and Spirit, and re-interprets the essence of the Spiritual world in the sense of prior information activities. Accordingly, we can describe the movements from matter to Spirit, and from Spirit to matter, in these activities as processes, eliminating their dualistic separation, and achieve an informational turn in philosophy, the first truly fundamental one. Full article
(This article belongs to the Special Issue Selected Papers from the ISIS Summit Vienna 2015)
Article
A New Efficient Optimal 2D Views Selection Method Based on Pivot Selection Techniques for 3D Indexing and Retrieval
by Hassan Silkan and Youssef Hanyf
Information 2015, 6(4), 679-692; https://doi.org/10.3390/info6040679 - 20 Oct 2015
Cited by 2 | Viewed by 5336
Abstract
In this paper, we propose a new method for 2D/3D object indexing and retrieval. The principle consists of an automatic selection of optimal views using an incremental algorithm based on pivot selection techniques for proximity searching in metric spaces. The selected views are afterwards described by four well-established descriptors from the MPEG-7 standard, namely the color structure descriptor (CSD), the scalable color descriptor (SCD), the edge histogram descriptor (EHD) and the color layout descriptor (CLD). We present our results on two databases: the Amsterdam Library of Object Images (ALOI-1000), consisting of 72,000 color images of views, and the Columbia Object Image Library (COIL-100), consisting of 7200 color images of views. The results demonstrate the performance of the developed method and its superiority over the k-means algorithm and the automatic selection of optimal views proposed by Mokhtarian et al. Full article
(This article belongs to the Special Issue Selected Papers from MedICT 2015)
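A minimal sketch of incremental pivot selection by max-min distance, the classic strategy for proximity search in metric spaces (not the authors' implementation; the view descriptors here are random placeholders):

```python
# Hedged sketch: views far from all previously chosen pivots are selected
# next, yielding a spread-out set of representative "optimal" views.
import numpy as np

def select_pivots(views, k):
    """views: (n, d) array of view descriptors; returns k pivot indices."""
    pivots = [0]  # seed with an arbitrary first view
    d_min = np.linalg.norm(views - views[0], axis=1)
    while len(pivots) < k:
        nxt = int(d_min.argmax())          # farthest view from current pivots
        pivots.append(nxt)
        d_min = np.minimum(d_min, np.linalg.norm(views - views[nxt], axis=1))
    return pivots

views = np.random.rand(72, 64)             # e.g., 72 views, 64-dim descriptors
print(select_pivots(views, 7))
```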
Article
Extending Deacon’s Notion of Teleodynamics to Culture, Language, Organization, Science, Economics and Technology (CLOSET)
by Robert K. Logan
Information 2015, 6(4), 669-678; https://doi.org/10.3390/info6040669 - 16 Oct 2015
Cited by 1 | Viewed by 4591
Abstract
Terrence Deacon’s (2012) notion developed in his book Incomplete Nature (IN) that living organisms are teleodynamic systems that are self-maintaining, self-correcting and self-reproducing is extended to human social systems. The hypothesis is developed that culture, language, organization, science, economics and technology (CLOSET) can be construed as living organisms that evolve, maintain and reproduce themselves and are self-correcting, and hence are teleodynamic systems. The elements of CLOSET are to a certain degree autonomous, even though they are obligate symbionts dependent on their human hosts for the energy that sustains them. Full article
(This article belongs to the Special Issue Selected Papers from the ISIS Summit Vienna 2015)
Article
Bayesian Angular Superresolution Algorithm for Real-Aperture Imaging in Forward-Looking Radar
by Yuebo Zha, Yin Zhang, Yulin Huang and Jianyu Yang
Information 2015, 6(4), 650-668; https://doi.org/10.3390/info6040650 - 15 Oct 2015
Cited by 10 | Viewed by 4918
Abstract
In real aperture imaging, the limited azimuth angular resolution seriously restricts the applications of the imaging system. This report presents a maximum a posteriori (MAP) approach based on the Bayesian framework for high angular resolution in real aperture radar. First, a Rayleigh statistic and an lq-norm (for 0 < q ≤ 1) sparse constraint are adopted to express the clutter property and the target scattering coefficient distribution, respectively. Then, the MAP objective function is established according to these hypotheses. Finally, a recursive iterative strategy is developed to estimate the original target scattering coefficient distribution and the clutter statistic. Simulations and experimental results are compared to verify the performance of the proposed algorithm. Full article
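As a hedged sketch of the general shape such an objective takes (the paper's exact likelihood and constants are not reproduced here), a MAP estimate with a sparsity-promoting lq prior can be written as:

```latex
\hat{x} \;=\; \arg\max_{x}\, p(x \mid y)
        \;=\; \arg\min_{x}\, \Bigl[ -\log p(y \mid x) \;+\; \lambda \lVert x \rVert_q^q \Bigr],
        \qquad 0 < q \le 1,
```

where p(y | x) would follow the Rayleigh clutter model, the lq term encodes the sparsity of the target scattering coefficients, and λ balances the two; the recursive iterative strategy then alternates between estimating x and the clutter statistic.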
Article
An Enhanced Quantum-Behaved Particle Swarm Optimization Based on a Novel Computing Way of Local Attractor
by Pengfei Jia, Shukai Duan and Jia Yan
Information 2015, 6(4), 633-649; https://doi.org/10.3390/info6040633 - 13 Oct 2015
Cited by 17 | Viewed by 4073
Abstract
Quantum-behaved particle swarm optimization (QPSO), a global optimization method, is a combination of particle swarm optimization (PSO) and quantum mechanics. It performs well in terms of search ability, convergence speed, solution accuracy and robustness. However, traditional QPSO still cannot guarantee finding the global optimum with probability 1 when the number of iterations is limited. A novel way of computing the local attractor for QPSO is proposed to improve QPSO’s global search performance; this novel QPSO is denoted EQPSO. It keeps the particles diverse at the early stage of the iterations and provides good local search ability at the later stage. We also analyse this way of computing the local attractor mathematically. The results on test functions are compared between EQPSO and other optimization techniques (including six PSO variants and seven other optimization algorithms), and the results found by EQPSO are better than those of the other methods considered. Full article
(This article belongs to the Section Information Theory and Methodology)
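For reference, a minimal sketch of canonical QPSO (not the paper's EQPSO variant): the local attractor p_i mixes the particle's personal best with the global best, and positions are sampled around it; EQPSO replaces exactly this computation of p_i.

```python
# Hedged sketch of standard QPSO; beta is held fixed for simplicity.
import numpy as np

def qpso(f, dim=2, n=20, iters=200, beta=0.75, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n, dim))
    pbest, pcost = x.copy(), np.apply_along_axis(f, 1, x)
    for _ in range(iters):
        gbest = pbest[pcost.argmin()]
        mbest = pbest.mean(axis=0)                     # mean best position
        phi = rng.random((n, dim))
        p = phi * pbest + (1 - phi) * gbest            # local attractor
        u = rng.random((n, dim))
        sign = np.where(rng.random((n, dim)) < 0.5, -1, 1)
        x = p + sign * beta * np.abs(mbest - x) * np.log(1 / u)
        cost = np.apply_along_axis(f, 1, x)
        improved = cost < pcost
        pbest[improved], pcost[improved] = x[improved], cost[improved]
    return pbest[pcost.argmin()], pcost.min()

print(qpso(lambda v: np.sum(v ** 2)))                  # sphere test function
```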
Article
Analysis of Two-Worm Interaction Model in Heterogeneous M2M Network
by Jinhua Ma, Zhide Chen, Jianghua Liu and Rongjun Zheng
Information 2015, 6(4), 613-632; https://doi.org/10.3390/info6040613 - 10 Oct 2015
Cited by 3 | Viewed by 4215
Abstract
With the rapid development of M2M (Machine-to-Machine) networks, the damage caused by malicious worms is becoming more and more serious. By considering the influence of network heterogeneity on worm spreading, we are the first to study the complex interaction dynamics between benign worms and malicious worms in a heterogeneous M2M network. We analyze and compare three worm propagation models based on different immunization schemes. By investigating the local stability of the worm-free equilibrium, we obtain the basic reproduction number R0. Besides, by using suitable Lyapunov functions, we prove that the worm-free equilibrium is globally asymptotically stable if R0 ≤ 1 and unstable otherwise. The dynamics of the worm models is completely determined by R0. In the absence of birth, death and users’ treatment, we obtain the final size formula of worms. This study shows that nodes with a higher node degree are more susceptible to infection than those with a lower node degree. In addition, the effects of various immunization schemes are studied. Numerical simulations verify our theoretical results. The research results are meaningful for further understanding the spread of worms in heterogeneous M2M networks and for enacting effective control tactics. Full article
(This article belongs to the Special Issue Cybersecurity and Cryptography)
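A hedged toy version of such an interaction model (homogeneous mixing, not the paper's heterogeneous degree-based formulation): a benign worm immunizes susceptible nodes against the malicious worm, and the long-run infected fractions follow from the competition between the two infection rates.

```python
# Hedged toy two-worm model integrated with SciPy; rates are illustrative.
import numpy as np
from scipy.integrate import odeint

beta_m, beta_b, delta = 0.5, 0.4, 0.1   # infection rates, recovery rate

def rhs(y, t):
    s, i_m, i_b = y                      # susceptible, malicious, benign
    return [-beta_m * s * i_m - beta_b * s * i_b + delta * i_m,
            beta_m * s * i_m - delta * i_m,
            beta_b * s * i_b]

t = np.linspace(0, 60, 600)
s, i_m, i_b = odeint(rhs, [0.98, 0.01, 0.01], t).T
print(f"final malicious fraction: {i_m[-1]:.4f}, benign: {i_b[-1]:.4f}")
```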
Article
Duplication Detection When Evolving Feature Models of Software Product Lines
by Amal Khtira, Anissa Benlarabi and Bouchra El Asri
Information 2015, 6(4), 592-612; https://doi.org/10.3390/info6040592 - 07 Oct 2015
Cited by 3 | Viewed by 6233
Abstract
After the derivation of specific applications from a software product line, the applications keep evolving with respect to new customers’ requirements. In general, evolutions in most industrial projects are expressed using natural language, because it is the easiest and most flexible way for customers to express their needs. However, this means of communication has shown its limits for detecting defects, such as inconsistency and duplication, when evolving the existing models of the software product line. The aim of this paper is to transform the natural language specifications of new evolutions into a more formal representation using natural language processing. An algorithm is then proposed to automatically detect duplication between these specifications and the existing product line feature models. To instantiate the proposed solution, a tool is developed that automates the two operations. Full article
(This article belongs to the Special Issue Selected Papers from MedICT 2015)
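A minimal sketch of the duplication-detection idea using TF-IDF cosine similarity (a simple stand-in for the paper's NLP pipeline, not the authors' tool; feature texts and the threshold are placeholders):

```python
# Hedged sketch: flag a new natural-language specification as a potential
# duplicate of existing feature descriptions by lexical similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

features = ["send payment notification by email",
            "export monthly report as pdf",
            "notify the customer by email when a payment is made"]
new_spec = "email the customer a notification after each payment"

vec = TfidfVectorizer(stop_words="english")
matrix = vec.fit_transform(features + [new_spec])
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
for feat, s in zip(features, scores):
    flag = "DUPLICATE?" if s > 0.5 else "ok"
    print(f"{s:.2f}  {flag:10s} {feat}")
```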
Article
A Backward Unlinkable Secret Handshake Scheme with Revocation Support in the Standard Model
by Yamin Wen, Zheng Gong and Lingling Xu
Information 2015, 6(4), 576-591; https://doi.org/10.3390/info6040576 - 07 Oct 2015
Cited by 1 | Viewed by 3863
Abstract
Secret handshake schemes have been proposed to achieve private mutual authentication, allowing the members of a certain organization to anonymously authenticate each other without exposing their affiliations. In this paper, a backward unlinkable secret handshake scheme with revocation support (BU-RSH) is constructed. A full-fledged secret handshake scheme must provide practical functionality such as unlinkability, revocation and traceability. The BU-RSH scheme achieves revocation as well as unlinkability and traceability. Moreover, the anonymity of revoked members is improved, so that the past transcripts of revoked members remain private, i.e., backward unlinkability. In particular, the BU-RSH scheme is provably secure in the standard model, assuming the intractability of the ℓ-hidden strong Diffie-Hellman problem and the subgroup decision problem. Full article
(This article belongs to the Special Issue Cybersecurity and Cryptography)