Special Issue "Information Theory Applied to Animal Communication"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory".

Deadline for manuscript submissions: closed (31 December 2009)

Special Issue Editor

Guest Editor
Dr. Laurance R. Doyle

Carl Sagan Center for the Study of Life in the Universe, SETI Institute, 189 Bernardo Avenue, Mountain View, California, 94043, USA
Interests: photometric techniques for detecting extrasolar planets; information theory applied to animal communications; astro-ecology and remote detection of exobiological systems; quantum astronomy and cosmic-scale quantum measurement problems

Keywords

  • animal communication
  • information theory
  • entropy

Published Papers (6 papers)


Research

Open Access Article: Capacity Bounds and Mapping Design for Binary Symmetric Relay Channels
Entropy 2012, 14(12), 2589-2610; doi:10.3390/e14122589
Received: 1 September 2012 / Revised: 4 December 2012 / Accepted: 6 December 2012 / Published: 17 December 2012
Cited by 1 | PDF Full-text (313 KB) | HTML Full-text | XML Full-text
Abstract
Capacity bounds for a three-node binary symmetric relay channel with orthogonal components at the destination are studied. The cut-set upper bound and the rates achievable using decode-and-forward (DF), partial DF and compress-and-forward (CF) relaying are first evaluated. Then relaying strategies with finite memory-length are considered. An efficient algorithm for optimizing the relay functions is presented. The Boolean Fourier transform is then employed to unveil the structure of the optimized mappings. Interestingly, the optimized relay functions exhibit a simple structure. Numerical results illustrate that the rates achieved using the optimized low-dimensional functions are either comparable to those achieved by CF or superior to those achieved by DF relaying. In particular, the optimized low-dimensional relaying scheme can improve on DF relaying when the quality of the source-relay link is worse than or comparable to that of other links.
(This article belongs to the Special Issue Information Theory Applied to Animal Communication)
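The capacity bounds discussed in this abstract rest on standard binary information measures. As a minimal, hedged sketch (not the paper's optimization algorithm; the function names are my own), the binary entropy function and the capacity C = 1 - H(p) of a single binary symmetric channel can be computed as follows:

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H(p) in bits; H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity C = 1 - H(p) of a binary symmetric channel
    with crossover probability p."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0 (noiseless channel)
print(bsc_capacity(0.5))   # 0.0 (output independent of input)
```

Capacity is one bit per use for a noiseless channel and falls to zero when the crossover probability reaches 1/2, where the output carries no information about the input.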
Open Access Article: Arguments for the Integration of the Non-Zero-Sum Logic of Complex Animal Communication with Information Theory
Entropy 2010, 12(1), 127-135; doi:10.3390/e12010127
Received: 27 September 2009 / Revised: 28 December 2009 / Accepted: 20 January 2010 / Published: 21 January 2010
Cited by 1 | PDF Full-text (98 KB) | HTML Full-text | XML Full-text
Abstract
The outstanding level of knowledge attained today in research on animal communication, and the newly available technologies for studying visual, vocal and chemical signalling, allow an ever-increasing use of information theory as a sophisticated tool to improve our knowledge of the complexity of animal communication. Some considerations on the way information theory and intraspecific communication can be linked are presented here. Specifically, information theory may help us to explore interindividual variation under different environmental constraints and social scenarios, as well as the communicative features of social vs. solitary species.
(This article belongs to the Special Issue Information Theory Applied to Animal Communication)
Open Access Article: A Law of Word Meaning in Dolphin Whistle Types
Entropy 2009, 11(4), 688-701; doi:10.3390/e11040688
Received: 30 June 2009 / Accepted: 26 October 2009 / Published: 30 October 2009
Cited by 11 | PDF Full-text (233 KB) | HTML Full-text | XML Full-text
Abstract
We show that dolphin whistle types tend to be used in specific behavioral contexts, which is consistent with the hypothesis that dolphin whistles have some sort of “meaning”. Moreover, in some cases, it can be shown that the behavioral context in which a whistle tends to occur or not occur is shared by different individuals, which is consistent with the hypothesis that dolphins are communicating through whistles. Furthermore, we show that the number of behavioral contexts significantly associated with a certain whistle type tends to grow with the frequency of the whistle type, a pattern reminiscent of a law of word meanings stating, as a tendency, that the higher the frequency of a word, the higher its number of meanings. Our findings indicate that the presence of Zipf's law in dolphin whistle types cannot be explained in sufficient detail by a simplistic die-rolling experiment.
(This article belongs to the Special Issue Information Theory Applied to Animal Communication)
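The reference to Zipf's law can be made concrete with a small sketch (hypothetical whistle-type counts and a function name of my own, not the authors' data or method): fit the slope of log frequency against log rank; a slope near -1 is the classic Zipf pattern such studies test against.

```python
import math

def zipf_slope(counts):
    """Least-squares slope of log(frequency) vs. log(rank).
    A slope near -1 is the classic Zipf rank-frequency pattern."""
    freqs = sorted(counts, reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Hypothetical whistle-type counts following a perfect 1/rank law:
counts = [120, 60, 40, 30, 24, 20]
print(round(zipf_slope(counts), 2))  # -1.0
```

Real repertoires deviate from the ideal slope; the point of the paper above is that the distribution alone, Zipfian or not, does not settle whether the signals carry meaning.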
Open Access Article: Quantification of Information in a One-Way Plant-to-Animal Communication System
Entropy 2009, 11(3), 431-442; doi:10.3390/e11030431
Received: 15 July 2009 / Revised: 18 August 2009 / Accepted: 20 August 2009 / Published: 21 August 2009
Cited by 7 | PDF Full-text (121 KB) | HTML Full-text | XML Full-text
Abstract
In order to demonstrate possible broader applications of information theory to the quantification of non-human communication systems, we apply calculations of information entropy to a simple chemical communication from the cotton plant (Gossypium hirsutum) to the wasp (Cardiochiles nigriceps) studied by DeMoraes et al. The purpose of this chemical communication from cotton plants to wasps is presumed to be to allow the predatory wasp to more easily locate its preferred prey, one of two types of parasitic herbivores feeding on the cotton plants. By specifying which herbivore is feeding on them, the cotton plants preferentially attract the wasps to those individual plants. We interpret the emission of nine chemicals by the plants as individual signal differences (depending on the herbivore type) to be detected by the wasps, constituting a nine-signal one-way communication system across kingdoms (from the kingdom Plantae to the kingdom Animalia). We use fractional differences in the chemical abundances (emitted as a result of the two herbivore types) to calculate the Shannon information entropic measures (marginal, joint, and mutual entropies, as well as the ambiguity of the transmitted message). We then compare these results with the subsequent behavior of the wasps (calculating the equivocation in the message reception) for possible insights into the history and actual working of this one-way communication system.
(This article belongs to the Special Issue Information Theory Applied to Animal Communication)
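The entropic measures named in this abstract (marginal, joint, and mutual entropies, plus the equivocation) all follow from a joint distribution p(x, y). A minimal sketch with an invented 2x2 joint distribution, not the paper's nine-signal data, and function names of my own:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def channel_measures(joint):
    """Marginal, joint, and mutual entropies plus equivocation H(X|Y)
    from a joint distribution p(x, y) given as a nested list."""
    px = [sum(row) for row in joint]           # marginal of the sender X
    py = [sum(col) for col in zip(*joint)]     # marginal of the receiver Y
    hx, hy = entropy(px), entropy(py)
    hxy = entropy([p for row in joint for p in row])
    mutual = hx + hy - hxy       # I(X;Y): information actually conveyed
    equivocation = hx - mutual   # H(X|Y): uncertainty left after reception
    return hx, hy, hxy, mutual, equivocation

# Hypothetical joint distribution: herbivore type vs. wasp response.
joint = [[0.4, 0.1],
         [0.1, 0.4]]
hx, hy, hxy, i_xy, eq = channel_measures(joint)
print(round(hx, 3), round(i_xy, 3), round(eq, 3))  # 1.0 0.278 0.722
```

Here the wasp's response resolves only part of the one bit of uncertainty about the herbivore type; the remainder is the equivocation the authors compare against observed wasp behavior.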

Review

Open Access Review: The Variety of Information Transfer in Animal Sonic Communication: Review from a Physics Perspective
Entropy 2009, 11(4), 888-906; doi:10.3390/e11040888
Received: 13 October 2009 / Accepted: 17 November 2009 / Published: 18 November 2009
Cited by 3 | PDF Full-text (137 KB) | HTML Full-text | XML Full-text
Abstract
For many anatomical and physical reasons, animals of different genera use widely different communication strategies. While some are chemical or visual, the most common involve sound or vibration, and these signals can carry a large amount of information over long distances. The acoustic signal varies greatly from one genus to another depending upon animal size, anatomy, physiology, and habitat, as does the way in which information is encoded in the signal, but some general principles can be elucidated showing the possibilities and limitations for information transfer. Cases discussed range from insects through songbirds to humans.
(This article belongs to the Special Issue Information Theory Applied to Animal Communication)
Open Access Review: The Use of Ideas of Information Theory for Studying “Language” and Intelligence in Ants
Entropy 2009, 11(4), 836-853; doi:10.3390/e11040836
Received: 21 September 2009 / Accepted: 4 November 2009 / Published: 10 November 2009
Cited by 7 | PDF Full-text (1264 KB) | HTML Full-text | XML Full-text
Abstract
In this review we integrate the results of a long-term experimental study of ant “language” and intelligence based on fundamental ideas of information theory, such as the Shannon entropy, Kolmogorov complexity, and Shannon's equation connecting the length of a message (l) and its frequency (p), i.e., l = –log p for rational communication systems. This approach enabled us to obtain the following important results on ants' communication and intelligence: (i) to reveal “distant homing” in ants, that is, their ability to transfer information about remote events; (ii) to estimate the rate of information transmission; (iii) to reveal that ants are able to grasp regularities and to use them for “compression” of information; (iv) to reveal that ants are able to transfer to each other information about the number of objects; (v) to discover that ants can add and subtract small numbers. The results obtained show that information theory is not only an excellent mathematical theory, but that many of its findings may be considered laws of Nature.
(This article belongs to the Special Issue Information Theory Applied to Animal Communication)
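Shannon's relation l = –log p quoted in this abstract assigns shorter codes to more frequent messages. A minimal sketch, assuming base-2 logarithms (lengths in bits) and a function name of my own:

```python
import math

def ideal_code_length(p: float) -> float:
    """Shannon's ideal message length in bits, l = -log2(p),
    for a symbol occurring with probability p."""
    return -math.log2(p)

# Rare messages warrant longer codes; an efficient ("rational")
# communication system compresses the frequent ones.
for p in (0.5, 0.25, 0.125):
    print(p, ideal_code_length(p))  # 1.0, 2.0, and 3.0 bits
```

Averaging these lengths over the message distribution recovers the Shannon entropy, which is the sense in which the ant studies treat message length as a probe of the underlying code.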

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
entropy@mdpi.com
Tel.: +41 61 683 77 34
Fax: +41 61 302 89 18