Information, Volume 3, Issue 2 (June 2012) – 5 articles, Pages 175-255

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.
Article
The World Within Wikipedia: An Ecology of Mind
by Andrew M. Olney, Rick Dale and Sidney K. D’Mello
Information 2012, 3(2), 229-255; https://doi.org/10.3390/info3020229 - 18 Jun 2012
Cited by 4 | Viewed by 5107
Abstract
Human beings inherit an informational culture transmitted through spoken and written language. A growing body of empirical work supports the mutual influence between language and categorization, suggesting that our cognitive-linguistic environment both reflects and shapes our understanding. By implication, artifacts that manifest this cognitive-linguistic environment, such as Wikipedia, should represent language structure and conceptual categorization in a way consistent with human behavior. We use this intuition to guide the construction of a computational cognitive model, situated in Wikipedia, that generates semantic association judgments. Our unsupervised model combines information at the language-structure and conceptual-categorization levels to achieve state-of-the-art correlation with human ratings on semantic association tasks including WordSimilarity-353, semantic feature production norms, word association, and false memory.
(This article belongs to the Special Issue Cognition and Communication)
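
The evaluation the abstract describes, correlating model-generated association scores with human ratings on benchmarks such as WordSimilarity-353, can be illustrated with a minimal sketch. Cosine similarity over toy co-occurrence vectors stands in for the authors' actual model, and the vectors, word pairs, and ratings below are invented placeholders:

    # Illustrative sketch only, not the paper's model: score word pairs by
    # cosine similarity of co-occurrence vectors, then compare the scores
    # against human ratings with Spearman rank correlation.
    import math
    from scipy.stats import spearmanr

    # Toy co-occurrence vectors (rows of a word-by-context count matrix).
    vectors = {
        "tiger": [8, 1, 0, 5],
        "cat":   [6, 2, 1, 4],
        "stock": [0, 9, 7, 1],
    }

    def cosine(u, v):
        """Cosine similarity between two count vectors."""
        dot = sum(a * b for a, b in zip(u, v))
        norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
        return dot / norm if norm else 0.0

    # Word pairs with hypothetical human ratings (0-10 scale, in the style
    # of WordSimilarity-353; these numbers are placeholders).
    pairs = [("tiger", "cat", 7.35), ("tiger", "stock", 0.9), ("cat", "stock", 1.2)]

    model_scores = [cosine(vectors[w1], vectors[w2]) for w1, w2, _ in pairs]
    human_scores = [rating for _, _, rating in pairs]

    rho, _ = spearmanr(model_scores, human_scores)
    print(f"Spearman correlation with human ratings: {rho:.2f}")

Spearman's rank correlation is the customary score for WordSimilarity-353 because only the ordering of the similarity judgments, not their scale, is compared.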
Book Review
Mark Burgin’s Theory of Information
by Joseph E. Brenner
Information 2012, 3(2), 224-228; https://doi.org/10.3390/info3020224 - 1 Jun 2012
Cited by 2 | Viewed by 7349
Abstract
A review of a major, definitive source book on the foundations of information theory is presented.
(This article belongs to the Section Information Theory and Methodology)
Article
Information and Physics
by Vlatko Vedral
Information 2012, 3(2), 219-223; https://doi.org/10.3390/info3020219 - 11 May 2012
Cited by 13 | Viewed by 8345
Abstract
In this paper I discuss the question: what comes first, physics or information? The two have had a long-standing, symbiotic relationship for almost a hundred years, out of which we have learnt a great deal. Information theory has enriched our interpretations of quantum physics and, at the same time, offered us deep insights into general relativity through the study of black hole thermodynamics. Whatever the outcome of this debate, I argue that physicists will be able to benefit from continuing to explore connections between the two.
(This article belongs to the Special Issue Information and Energy/Matter)
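
The insight into general relativity mentioned in the abstract is usually summarized by the Bekenstein-Hawking entropy, stated here for context rather than taken from the paper. It ties an informational quantity (entropy) to a geometric one (the horizon area A):

    S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 G \hbar} = \frac{k_B A}{4 \ell_P^2},
    \qquad \ell_P = \sqrt{\frac{G \hbar}{c^3}}.

Measured in Planck units, a quarter of the horizon area counts the black hole's entropy in nats; dividing by ln 2 gives bits.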
Article
Physical Computation as Dynamics of Form that Glues Everything Together
by Gordana Dodig Crnkovic
Information 2012, 3(2), 204-218; https://doi.org/10.3390/info3020204 - 26 Apr 2012
Cited by 22 | Viewed by 9535
Abstract
A framework is proposed in which matter relates to energy in the way that structure relates to process and information relates to computation. In this scheme, matter corresponds to structure, which corresponds to information; energy corresponds to the ability to carry out a process, which corresponds to computation. The relationship between the two complementary parts of each dichotomous pair (matter/energy, structure/process, information/computation) is analogous to the relationship between being and becoming, where being is the persistence of an existing structure and becoming is the emergence of a new structure through the process of interactions. This approach presents a unified view built on two fundamental ontological categories: information and computation. Conceptualizing the physical world as an intricate tapestry of protoinformation networks evolving through processes of natural computation helps to build more coherent models of nature, connecting the non-living and living worlds. It presents a suitable basis for incorporating current developments in the understanding of biological/cognitive/social systems as generated by the complexification of physicochemical processes through the self-organization of molecules into dynamic adaptive complex systems by morphogenesis, adaptation and learning, all of which are understood as information processing.
(This article belongs to the Special Issue Information: Its Different Modes and Its Relation to Meaning)
Article
Beyond Bayes: On the Need for a Unified and Jaynesian Definition of Probability and Information within Neuroscience
by Christopher D. Fiorillo
Information 2012, 3(2), 175-203; https://doi.org/10.3390/info3020175 - 20 Apr 2012
Cited by 21 | Viewed by 11303
Abstract
It has been proposed that the general function of the brain is inference, which corresponds quantitatively to the minimization of uncertainty (or the maximization of information). However, there has been a lack of clarity about exactly what this means. Efforts to quantify information agree that it depends on probabilities (through Shannon entropy), but there has long been a dispute about the definition of probabilities themselves. The “frequentist” view is that probabilities are (or can be) essentially equivalent to frequencies, and that they are therefore properties of a physical system, independent of any observer of the system. E. T. Jaynes developed the alternative “Bayesian” definition, in which probabilities are always conditional on a state of knowledge through the rules of logic, as expressed in the maximum entropy principle. In doing so, Jaynes and others provided the objective means for deriving probabilities, as well as a unified account of information and logic (knowledge and reason). However, the neuroscience literature virtually never specifies any definition of probability, nor does it acknowledge any dispute concerning the definition. Although there has recently been tremendous interest in Bayesian approaches to the brain, even in the Bayesian literature it is common to find probabilities that are purported to come directly and unconditionally from frequencies. As a result, scientists have mistakenly attributed their own information to the neural systems they study. Here I argue that the adoption of a strictly Jaynesian approach will prevent such errors and will provide us with the philosophical and mathematical framework needed to understand the general function of the brain. Accordingly, our challenge becomes the identification of the biophysical basis of Jaynesian information and logic. I begin to address this issue by suggesting how we might identify a probability distribution over states of one physical system (an “object”) conditional only on the biophysical state of another physical system (an “observer”). The primary purpose in doing so is not to characterize information and inference in exquisite, quantitative detail, but to be as clear and precise as possible about what it means to perform inference and how the biophysics of the brain could achieve this goal.
(This article belongs to the Special Issue Information and Energy/Matter)
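
The maximum entropy principle invoked above has a compact standard formulation, given here for context rather than quoted from the paper: among all distributions consistent with what is known, choose the one that maximizes Shannon entropy,

    \max_{p} \; H(p) = -\sum_i p_i \log p_i
    \quad \text{subject to} \quad \sum_i p_i = 1, \quad \sum_i p_i f_k(i) = F_k \;\; (k = 1, \dots, m),

whose solution is the exponential-family distribution

    p_i = \frac{1}{Z(\lambda)} \exp\!\Big(-\sum_k \lambda_k f_k(i)\Big),
    \qquad Z(\lambda) = \sum_i \exp\!\Big(-\sum_k \lambda_k f_k(i)\Big),

with the multipliers \lambda_k fixed by the constraints. On this reading, the resulting probabilities encode a state of knowledge (the constraints F_k), not an observer-independent property of the system.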