
Information, Volume 1, Issue 1 (September 2010) – 4 articles, Pages 1-59

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open it.

Editorial


Editorial
Information – A New Open Access Scientific Journal on Information Science, Information Technology, Data, Knowledge and Communication
by Shu-Kun Lin
Information 2010, 1(1), 1-2; https://doi.org/10.3390/info1010001 - 23 Jun 2010
Cited by 2 | Viewed by 4650
Abstract
We plan to expand our Open Access publishing project to encompass additional fundamental areas in science and technology and to provide publication opportunities for scientists working in these areas. To achieve these goals, we are in the process of launching new journals. [...]

Research


Article
Using Information Theory to Study Efficiency and Capacity of Computers and Similar Devices
by Boris Ryabko
Information 2010, 1(1), 3-12; https://doi.org/10.3390/info1010003 - 12 Aug 2010
Cited by 4 | Viewed by 5857
Abstract
We address the problem of estimating the efficiency and capacity of computers. The main goal of our approach is to give a method for comparing the capacity of different computers, which can have different sets of instructions, different kinds of memory, a different number of cores (or processors), etc. We define the efficiency and capacity of computers and suggest a method for their estimation based on the analysis of processor instructions and their execution times. We show how the suggested method can be applied to estimate computer capacity; in particular, this consideration gives a new look at the organization of computer memory. The obtained results may be of interest for practical applications.

Article
New Information Measures for the Generalized Normal Distribution
by Christos P. Kitsos and Thomas L. Toulias
Information 2010, 1(1), 13-27; https://doi.org/10.3390/info1010013 - 20 Aug 2010
Cited by 12 | Viewed by 6328
Abstract
We introduce a three-parameter generalized normal distribution, which belongs to the Kotz-type distribution family, to study the generalized entropy-type measures of information. For this generalized normal distribution, the Kullback-Leibler information is evaluated, which extends the well-known result for the normal distribution and plays an important role for the introduced generalized information measure. These generalized entropy-type measures of information are also evaluated and presented.
(This article belongs to the Special Issue What Is Information?)

Article
A Paradigm Shift in Biology?
by Gennaro Auletta
Information 2010, 1(1), 28-59; https://doi.org/10.3390/info1010028 - 13 Sep 2010
Cited by 10 | Viewed by 9012
Abstract
All new developments in biology deal with the issue of the complexity of organisms, often pointing out the necessity of updating our current understanding. However, it is impossible to think about a change of paradigm in biology without introducing new explanatory mechanisms. I shall introduce the mechanisms of teleonomy and teleology as viable explanatory tools. Teleonomy is the ability of organisms to build themselves through internal forces and processes (in the expression of the genetic program) rather than external ones, implying a freedom relative to the exterior; however, the organism is able to integrate internal and external constraints in a process of co-adaptation. Teleology is the mechanism through which an organism exercises informational control over another system in order to establish an equivalence class and select specific information for its metabolic needs. Finally, I shall examine some interesting processes in phylogeny, ontogeny, and epigeny in which these two mechanisms are involved.
(This article belongs to the Special Issue What Is Information?)
