Article

Quantification of Information in a One-Way Plant-to-Animal Communication System

Laurance R. Doyle
SETI Institute, Carl Sagan Center for the Study of Life in the Universe, 515 N. Whisman Road, Mountain View, CA 94043, USA
Entropy 2009, 11(3), 431-442; https://doi.org/10.3390/e110300431
Submission received: 15 July 2009 / Revised: 18 August 2009 / Accepted: 20 August 2009 / Published: 21 August 2009
(This article belongs to the Special Issue Information Theory Applied to Animal Communication)

Abstract:
In order to demonstrate possible broader applications of information theory to the quantification of non-human communication systems, we apply calculations of information entropy to a simple chemical communication from the cotton plant (Gossypium hirsutum) to the wasp (Cardiochiles nigriceps) studied by DeMoraes et al. The presumed purpose of this chemical communication from cotton plants to wasps is to allow the predatory wasp to more easily locate its preferred prey, one of two types of parasitic herbivores feeding on the cotton plants. By specifying which plant-eating herbivore is feeding on them, the cotton plants allow the wasps to be preferentially attracted to those individual plants. We interpret the emission of nine chemicals by the plants, in amounts that differ with herbivore type, as individual signals detected by the wasps, constituting a nine-signal one-way communication system across kingdoms (from the kingdom Plantae to the kingdom Animalia). We use the fractional differences in the chemical abundances emitted in response to the two herbivore types to calculate the Shannon information entropic measures (the marginal, joint, and mutual entropies, as well as the ambiguity of the transmitted message). We then compare these results with the subsequent behavior of the wasps (calculating the equivocation in the message reception) for possible insights into the history and actual working of this one-way communication system.
PACS Codes:
89.70.-a; 89.70.Cf; 87.10.Vg

1. Introduction

In a paper by DeMoraes et al. [1], the cotton plant Gossypium hirsutum (among other plants) was found, by gas chromatography, to emit specific quantities of nine chemicals indicative of one or the other of two particular herbivorous insects that feed on it: the tobacco budworm (Heliothis virescens) or the corn earworm (Helicoverpa zea). This multi-chemical communication was received by the predatory wasp Cardiochiles nigriceps (popularly named the "red-tailed wasp"), which is known to prefer to prey (lay eggs) on H. virescens over H. zea. DeMoraes et al. [1] demonstrated that this "specific plant signaling" system of "information-rich signals" (quoted from their paper), being received by the wasps, allowed them to preferentially select which plants to land on using only these chemical cues (since all evidence of the herbivores' presence, including dead leaves, had been removed prior to releasing the wasps into the cotton garden). However, in their paper they did not attempt to quantify the information that was actually transmitted to the wasps. We take this opportunity to demonstrate how such a communication system might be thus quantified, as well as to point out some important aspects to be aware of when applying information theory to such biological signaling systems.

2. Information Entropy Measures and the Zipf Statistic

Information theory, as first formulated by Shannon ([2], see also [3]) can be applied to quantify the information transmission rates within any communication system as long as the signals are correctly classified (one of the main difficulties in the application to non-human biological communication systems). For instances where only changes within the same communication system of signals are to be measured, the signal classification, in most cases, merely needs to be consistent between the two (or more) data sets to allow preliminary comparisons of such changes in the information content. This assumes, of course, that the signaling system is not in some kind of compressed code (in which case the de-coding algorithms should be considered as part of the total information content of that communication system). In addition to broad application to human and computer communication systems (e.g., [4,5]), information theory has already found application to many animal communication systems (e.g., reviewed in McCowan et al. [6]; see also [7]), as well as chemical signaling systems such as DNA (e.g., [8]).
In this paper we will employ the straightforward formulations of information entropy as given in the original work by Shannon [2,3,7]. We consider a two-part system, X, Y, where i and j are the categories of X and Y, respectively, and p(i) and p(j) are the associated marginal probabilities. The probability that Y = j given X = i is the conditional probability p_i(j), and the probability that X = i given Y = j is the conditional probability p_j(i). The probability of the joint occurrence of X = i and Y = j is then p(i,j) = p(i) p_i(j) = p(j) p_j(i), which equals p(i) p(j) when X and Y are independent.
The information entropy follows from the marginal probabilities for N (equals M in our case) signal types (representing the two plant message types, one for each herbivore type):
H(X) = -\sum_{i=1}^{N} p(i) \log_2 p(i)        (1a)

H(Y) = -\sum_{j=1}^{M} p(j) \log_2 p(j)        (1b)
We can also refer to Equations (1a) and (1b), the marginal entropies, as the "first-order entropies," after [5,6,7], because they are first-order approximations to the entropy of the system that do not take into account higher-order conditional probabilities (di-gram or tri-gram structure, for example). For a uniform distribution of N different signal types we see that:
H_0(X) = -\sum_{i=1}^{N} \frac{1}{N} \log_2 \frac{1}{N} = \log_2 N        (2)
which can be referred to as the "zero-order entropy." Throughout this analysis the total number of signals is assumed to be sufficiently well sampled, and statistically ergodic, that the frequencies of occurrence of the signal units approximate their actual probability distributions (a caveat that must be taken into account in any information theoretic analysis).
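
As an illustration, the following is a minimal Python sketch of Equations (1a), (1b), and (2); the four-signal counts in the example are hypothetical and are not data from [1].

```python
# Minimal sketch of the first-order (marginal) and zero-order entropies.
import math

def marginal_entropy(probs):
    """First-order entropy, -sum p*log2(p), in bits (Equations 1a/1b)."""
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

def zero_order_entropy(n_signal_types):
    """Zero-order entropy, log2(N), for a uniform distribution (Equation 2)."""
    return math.log2(n_signal_types)

# Hypothetical frequencies of occurrence for a four-signal system.
counts = [40, 30, 20, 10]
probs = [c / sum(counts) for c in counts]
print(round(marginal_entropy(probs), 2))          # 1.85 bits
print(round(zero_order_entropy(len(counts)), 2))  # 2.0 bits, the maximum for 4 types
```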
The next entropic measure allows us to take into account the information in two messages, usually in the comparison of a message sent with the message received in a two-way communication system. But in the problem considered here, we shall want to be able to distinguish between two chemical messages sent essentially at the same time. Based on the joint probability, p(i,j), the joint entropy is defined as:
H(X,Y) = -\sum_{i=1}^{N} \sum_{j=1}^{M} p(i,j) \log_2 p(i,j)        (3)
In our application we shall find the joint entropy to be most relevant, since the cotton garden may be expected to emit both sets of chemical signal distributions at once, and the wasp must distinguish between them. Each set consists of the same nine chemical signals, but in different ratios, as we shall see below. We note that, superficially, this should be a 1-bit decision process between two plant chemical emission types, since the order of the chemicals, in this case, does not represent additional choices (and assuming no higher-order conditional probabilistic structure).
We are interested in the measure that allows the two messages to be distinguished. If the two messages were completely independent of each other then, as mentioned, the joint entropy would equal the sum of the marginal entropies: H(X,Y) = H(X) + H(Y). However, if this is not the case, the difference between the sum of the first-order (marginal) entropies of the two messages, H(X) and H(Y), and the actual joint entropy, H(X,Y), is a measure of how far from independence these two message data sets are. Therefore I(X,Y), the mutual entropy, is the measure of the overlapping information transmitted from the two messages, and is given by:
I(X,Y) = H(X) + H(Y) − H(X,Y)        (4)
This is usually the entropy of X (the transmitted message data set) plus the entropy of Y (the received data set, or in this case the second transmitted data set), minus what is often called the "uncertainty" in the joint occurrence of the combination of X and Y. Again, if some correlation exists between the information in message sets H(X) and H(Y), then the conditional information entropies (Equations 5 and 6, below) are non-zero. The conditional information components of the entropies of the two data sets are thus designated as:
H(Y | X) = H(X,Y) − H(X)        (5)
which, in a two-way communication system, is known as the “ambiguity” and:
H(X | Y) = H(X,Y) − H(Y)        (6)
which, in a two-way communication system, is usually known as the "equivocation." However, as already noted, in our case both values will represent ambiguities because both messages under consideration are transmitted. In other words, the ambiguity will be between the two types of messages, while the uncorrelated portion of the two messages is what will allow the unequivocal identification of the H. virescens plants by the wasps (as we shall see below). In the case of the emitted chemical signals, then, both Equation 5 and Equation 6 represent values for ambiguity in the transmission. The equivocation (in reception of the messages) will be calculated later using the response of the wasps to the two transmitted messages. Before this, however, in the next section we shall examine the Zipf statistic for this communication system.
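
A minimal Python sketch of Equations (3) through (6), applied to a generic two-part system described by a joint probability table, is given below; the 2 x 2 distribution used in the example is hypothetical and serves only to show how the measures relate to one another.

```python
# Minimal sketch of the joint, mutual, and conditional entropies (Equations 3-6).
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

def two_part_entropies(p_xy):
    """p_xy is a 2D list with p_xy[i][j] = p(i, j), summing to 1."""
    p_x = [sum(row) for row in p_xy]                  # marginal p(i)
    p_y = [sum(col) for col in zip(*p_xy)]            # marginal p(j)
    h_x, h_y = entropy(p_x), entropy(p_y)
    h_xy = entropy([p for row in p_xy for p in row])  # joint entropy, Equation (3)
    return {
        "H(X)": h_x,
        "H(Y)": h_y,
        "H(X,Y)": h_xy,
        "I(X,Y)": h_x + h_y - h_xy,  # mutual entropy, Equation (4)
        "H(Y|X)": h_xy - h_x,        # ambiguity, Equation (5)
        "H(X|Y)": h_xy - h_y,        # equivocation, Equation (6)
    }

# Hypothetical joint distribution for a 2 x 2 system.
p_xy = [[0.4, 0.1],
        [0.1, 0.4]]
for name, value in two_part_entropies(p_xy).items():
    print(f"{name} = {value:.3f} bits")
```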

3. The Zipf Statistic

Before moving on to the calculation of the information theoretic measures, we want to take a more detailed look at the distribution of the probabilities of occurrence of the signals in this system by examining a plot known as "Zipf's Law," or the "Zipf statistic." Zipf's Law is simply another way of examining the distribution of the individual components of the first-order information entropy (Equations 1a and 1b) by performing a linear regression of the (base-10) logarithm of the frequencies of occurrence of the signals against the (base-10) logarithm of their rank. We use the term "rank" here to refer to the order from highest to lowest frequency of occurrence (from 1st to 9th in this example). As is well known in linguistic circles, Zipf's Law or the Zipf statistic results in a best-fit slope of about –1 for most human languages (letters, words, phonemes, etc.; see [9,10,11]). A Zipf's Law examination of the frequency-of-occurrence components has also previously been applied to various animal communication systems (see, e.g., [6,12], and references therein).
As a first-order measure of potential informational structure or coding within a communication system, the Zipf statistic can act as a guide. A slope around –1 in the Zipf plot in human language systems implies polysemy and may be a requirement for symbolic communication systems [10]. Here it may simply be indicative of the nine-chemical communication system having a high repetition rate. (We note that the number of signal types needs to be similar to allow even redundant systems to be inter-compared using Zipf’s Law). In Figure 1 we show the Zipf plot for the chemical units listed in Table 1, where the log base-10 of all three lists of probabilities of occurrence, p(v), p(z), and p(v,z), have been regressed against the log base-10 of their rank order.
An example of a non-human communication system with a Zipf slope close to –1 is the frequency distribution of adult bottlenose dolphin whistle signals (although this does not hold for the whistle distribution of juvenile dolphins) [6,7,11]. The shallower the slope, the less redundant the signaling system; for example, both human infant babbling and infant dolphin whistle "babbling" have slopes of around –0.8 [6,7,11]. A steeper negative slope of the Zipf statistic indicates more redundancy. Adolescent humans, already in the beginning processes of learning a specific language, may be expected to repeat a smaller vocabulary of sounds/words more often as they transition from a more random (but large number of) babbling sounds to specific words, thereby causing the negative Zipf slope to steepen. However, a Zipf slope as steep as –2, as indicated in the chemical communication system under consideration, is not generally encountered in either common human languages or the complex animal vocal communication systems studied to date (e.g., bottlenose dolphins, squirrel monkeys, humpback whales [12,14]).
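
The slope itself is obtained from an ordinary least-squares fit in log-log space, as in the following minimal sketch (not the authors' code); the ideal 1/rank frequencies in the example are illustrative only.

```python
# Minimal sketch of the Zipf statistic: slope of log10(frequency) vs. log10(rank).
import numpy as np

def zipf_slope(frequencies):
    """Return the best-fit slope and intercept of log10(frequency) against log10(rank)."""
    freqs = np.sort(np.asarray(frequencies, dtype=float))[::-1]  # rank 1 = most frequent
    ranks = np.arange(1, len(freqs) + 1)
    slope, intercept = np.polyfit(np.log10(ranks), np.log10(freqs), 1)
    return slope, intercept

# Hypothetical frequencies following an ideal Zipf distribution (frequency ~ 1/rank).
freqs = [1.0 / k for k in range(1, 10)]
slope, _ = zipf_slope(freqs)
print(f"Zipf slope = {slope:.2f}")  # -1.00 for an ideal Zipf distribution
```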
Figure 1. Zipf statistic. This is the log-10 of the frequency of occurrence of the nine chemical signals in Table 1 regressed against the log-10 of the rank (order from most to least frequent). R is the goodness of fit. Diamonds (green) are for the fractional chemical emissions from H. virescens infested plants, squares (blue) are for H. zea infested plants, and circles (red) are for both types of infestation taken together. Slopes for H. virescens, H. zea, and both together (joint occurrence) are –2.06, –2.65, and –1.58, respectively, indicating, not unexpectedly, a high level of repetition possible for this communication system. The steepest (most negative) slope was for the cotton plants infested with H. zea, as expected from the dominance of one particular chemical in this case (see Table 1).
One could suppose that "syntax-like" structure might exist in a chemical communication system (simple conditional probabilities between chemical units can be found even in DNA, for example [8]). But sequences of different chemical signals would have to be received in a time series of observations to ascertain whether this is the case. As far as information entropic measures are concerned, repetition appears as higher-order conditional probabilistic relationships between signals, just as complex syntactical rule structure does in human languages. Repetition might, therefore, be referred to as the "null" conditional probabilistic structure of a communication system. However, as pointed out, it represents no real additional grammatical structure within the communication system; to exclude it one must omit the terms with i = j = k, etc. in the summations deriving the entropic values (e.g., [7], Equation 6). Real syntax-like structure thus requires conditional probabilistic relationships between signals of different types. For chemical communication systems such higher-order structure (again, apart from repetition) would imply combinations and sequences of chemical unit differences that could produce new signal meanings when occurring in certain sequential orders. Sequences of individual chemicals were measured in the experiment under discussion [1], but only at low time resolution (ratios of chemicals were read out every three hours over a 48-hour period). The more complex the rules, the more ordered the signaling data set, such rules being a valuable tool for error recovery of messages [14,15]. So natural communication systems that improve the fitness or survival of a species might be expected to evolve such complex rule structure, at least to the capacity that a given species can produce and assimilate such complexity.

4. Information in the Chemical Signals

4.1. Information in the Transmitted Message

The most complex system (in terms of highest information entropy) would be one in which the wasps perceive all nine chemicals as the basic units of the communication system. The least complicated system would be one in which the wasps perceive only one of the chemicals (or perceive groups of chemicals as a single signal). For now we shall assume that all nine chemicals are detected by the wasps, and make the reasonable assumption that the detectability of each chemical is proportional to its abundance (i.e., the wasps have no more intrinsic sensitivity to one chemical than to another).
Here we are interested in the ability of the red-tailed wasp, using only the chemical signals emitted by the cotton plants, to distinguish plants that were infested by the tobacco budworm (H. virescens) from plants that were infested by the corn earworm (H. zea). We wish to calculate a change in information content (chemical information emitted from the cotton plants) that would allow such a choice to be made by the wasps, who, as stated, prefer H. virescens to H. zea. The choice of the wasps was determined by their landing on a given plant. We note that only a quantitative change in the amount of the same nine chemical signaling units is applied in our example here, so that artifacts in the information entropy calculation, occasioned by the method of signal classification itself, are essentially absent. We recognize that the individual chemical amounts used here (nanograms) are not likely the precise units used by the wasps to "interpret" the message. Nevertheless, it is reasonable to assume, as mentioned, that the amount of a given chemical present is proportional to the detectability of that chemical by a given species, which will give the same probabilities. (We also assume both that the minimum threshold for detectability by the wasps has been reached by the least abundant chemical and that saturation of the wasps' detection system by overabundance of a given chemical has not occurred.) For this precedent we might take an analogy from human language phonemes, which are largely meaningless in isolation (as opposed to words) but nevertheless can give a correct idea of human language complexity when quantified using information theoretic measures (e.g., [5,13]).
In the experiment of DeMoraes et al. [1] red-tailed wasps were, indeed, preferentially attracted to the cotton plants with the H. virescens infestation a statistically significant number of times over those infested with H. zea. All insects and dead leaves had been removed at the time of release of the wasps, as mentioned, so the wasps had only chemical signals with which to make a choice (i.e., no visual clues). Therefore any deviation from a random choice regarding which plants to land on may be assumed to be made by the wasps based solely on their detection of any quantitative differences in the chemical indicators (see [1], who make this argument, for details).
DeMoraes et al. [1] used a gas chromatograph to measure nanogram traces of these nine specific chemicals, and these measurements are shown in Table 1. Using the nanogram amounts of the nine signaling chemicals emitted in each of the two feeding herbivore cases, we calculate the frequency of occurrence of each chemical as its amount divided by the total chemical quantity for that herbivore-specific case (H. virescens or H. zea). The joint entropy is calculated using the total amount of a given chemical (from both types of plant emissions) divided by the total amount of chemicals emitted by all plants. In other words, for the distribution of individual chemical molecules within the wasp's environment (near either an H. virescens or an H. zea infested plant), these abundances represent the likelihood that the wasp will detect them, given an equally weighted perception, by nanogram amount, of each of these nine chemicals. For the chemicals shown in Table 1, the probabilities for H. virescens are given as p(i), and the probabilities for H. zea are given as p(j).
The joint probabilities, p(i,j), are, as mentioned, the sums of the nanogram amounts of a specific chemical given off by both cotton plant types divided by the total nanogram amount of all chemicals emitted from both plant types. Thus we have the simultaneous occurrences of each chemical (joint probabilities) in the final column, each component of which is used in the calculation of its entropy term and summed to get the final joint entropy of the system. We note that the total nanogram abundance of chemicals given off by the cotton plants infested with H. virescens is about 2.92 times the total nanogram amount given off by the cotton plants infested with H. zea, as we can see by summing the totals of the ng-labeled columns. (DeMoraes et al. note that over a 48-hour period 16 samples of the nine chemicals were taken, each time resulting in the same ratios for H. virescens compared to H. zea. However, a correction for this difference does not produce a significant change; we have applied this correction to the H. zea abundances and it reduces the joint entropy by only 0.1 bits.) The information theoretic measures should still be valid, then, as long as we can indeed assume that one chemical does not totally dominate the detectability of the others by saturating the environment being sampled by the wasps. DeMoraes et al. [1] state, "No significant difference was observed in the total volatile amount released by the two plant-herbivore complexes," so we can assume this applies to the total chemical mass given off during each of the wasp release experiments.
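
As a sketch of this bookkeeping (not the authors' code), the probability columns of Table 1 follow directly from the nanogram abundances:

```python
# Minimal sketch: probability columns of Table 1 from the nanogram abundances in [1].
virescens_ng = [13738, 53310, 111350, 120257, 35612, 5591, 9177, 1273, 3075]
zea_ng       = [215, 748, 6922, 5919, 88000, 546, 247, 9073, 9317]

p_v = [x / sum(virescens_ng) for x in virescens_ng]   # p(i), H. virescens column
p_z = [x / sum(zea_ng) for x in zea_ng]               # p(j), H. zea column
grand_total = sum(virescens_ng) + sum(zea_ng)
p_joint = [(x + y) / grand_total                      # p(i, j), final column
           for x, y in zip(virescens_ng, zea_ng)]

print([round(p, 4) for p in p_v])                     # 0.0389, 0.1509, 0.3151, ...
print(round(sum(virescens_ng) / sum(zea_ng), 2))      # about 2.92, the abundance ratio
```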
Proceeding on these assumptions, we use the probabilities derived in Table 1 (for N = 9) to calculate the information entropies for H. virescens (letting X = v) and H. zea (letting Y = z). The zero-order entropy, H0, is the same for both messages (data sets), since we use the same nine chemical signaling types in both cases. The first-order entropies, H(v) and H(z), the joint entropy, H(v,z), the conditional information entropies, H(z|v) and H(v|z), and the mutual entropy, I(v,z), are all calculated.
Table 1. Nanogram abundances and probabilities of occurrence of the chemical signaling units (abundances from [1]).
Chemical Signal | H. virescens (ng) | p(i) | H. zea (ng) | p(j) | p(i, j)
(Z)-3-Hexen-1-ol | 13,738 | 0.0389 | 215 | 0.0018 | 0.0294
α-Pinene | 53,310 | 0.1509 | 748 | 0.0062 | 0.1140
(Z)-3-Hexenyl acetate | 111,350 | 0.3151 | 6,922 | 0.0572 | 0.2493
(E)-β-Ocimene | 120,257 | 0.3403 | 5,919 | 0.0489 | 0.2660
(E)-4,8-Dimethyl-1,3,7-nonatriene | 35,612 | 0.1008 | 88,000 | 0.7274 | 0.2606
β-Caryophyllene | 5,591 | 0.0158 | 546 | 0.0045 | 0.0129
(E)-β-Farnesene | 9,177 | 0.0260 | 247 | 0.0020 | 0.0199
(E,E)-α-Farnesene | 1,273 | 0.0036 | 9,073 | 0.0750 | 0.0218
(E,E)-4,8,12-Trimethyl-1,3,7-tridecatetraene | 3,075 | 0.0087 | 9,317 | 0.0770 | 0.0261
Using Equations 1–6 and Table 1, we have: H0(v) = H0(z) = 3.17 bits, H(v) = 2.30 bits, H(z) = 1.46 bits, H(v,z) = 2.47 bits, H(z|v) = 0.17 bits, H(v|z) = 1.01 bits, and I(v,z) = 1.29 bits. Note again that H(z|v) and H(v|z) are both ambiguities in the transmission from the H. virescens and H. zea infested plants, respectively, while the reception of the message—as measured by the reaction of the wasps to the chemical signals (which we discuss below)—represents the equivocation in this communication system.
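
These values can be reproduced directly from the probability columns of Table 1; the following is a minimal sketch (not the authors' code) of that calculation:

```python
# Minimal sketch: entropy measures of Section 4.1 from the Table 1 probabilities.
import math

def H(probs):
    """Shannon entropy in bits: -sum p*log2(p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

p_v  = [0.0389, 0.1509, 0.3151, 0.3403, 0.1008, 0.0158, 0.0260, 0.0036, 0.0087]  # p(i)
p_z  = [0.0018, 0.0062, 0.0572, 0.0489, 0.7274, 0.0045, 0.0020, 0.0750, 0.0770]  # p(j)
p_vz = [0.0294, 0.1140, 0.2493, 0.2660, 0.2606, 0.0129, 0.0199, 0.0218, 0.0261]  # p(i, j)

H0 = math.log2(9)                        # 3.17 bits, zero-order entropy
Hv, Hz, Hvz = H(p_v), H(p_z), H(p_vz)    # about 2.30, 1.46, and 2.47 bits
print(round(Hvz - Hv, 2))                # H(z|v), about 0.17 bits (ambiguity)
print(round(Hvz - Hz, 2))                # H(v|z), about 1.01 bits (ambiguity)
print(round(Hv + Hz - Hvz, 2))           # I(v,z), about 1.29 bits (mutual entropy)
```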
In our measures above, the zero-order entropy, of course, is the highest bit rate possible for a communication system with nine signal types (assuming, as it does, a uniform distribution of the signals). Subsequent constraints (conditional probabilistic relationships) on that system will reduce the degrees of freedom of these nine signal types, since higher-order entropies take into account “rules” of constraint. If the signals from the two messages were completely independent, then we would have I(v,z) = H(v) + H(z) – H(v,z) = 0, and the mutual information (overlap of information between the two message systems) would be zero, but we find that the degree of mutual dependence (from Equation 4) is I(v,z)=1.29 bits. However, it is the non-overlapping (unambiguously transmitted) part of the information that allows a decision to be made by the wasps. This is the non-overlapping information in H(v) and H(z) which is the joint information, H(v,z) = 2.47 bits. If no information had overlapped, the joint information would have been, of course, H(v) + H(z) = 3.76 bits since p(v) and p(z) would have been independent. In information theory H(v) + H(z) ≥ H(v,z) but note that, in the individual rows in Table 1, this is not the case. However, the columns for all three must be summed to characterize the joint information of the whole system and here the inequality holds.
We also see that H(v) has a larger value than H(z); this is because there is a dominant probability of occurrence among the chemical signals emitted from the H. zea infested plants (the chemical (E)-4,8-dimethyl-1,3,7-nonatriene), which lowers the "choices" of other chemical signals being emitted in this case; that is, this chemical set has a reduced uncertainty as to which chemical will be emitted. Since the wasps can be expected to make a decision based upon non-overlapping information only, a plant-landing decision must be made from the information remaining in the total system once the ambiguities are removed. The unambiguous information transmitted, using this nine-chemical communication system from the cotton plants to the wasps, is thus 2.47 bits.

4.2. Information in the Received Message

On the assumption that the sole purpose of the wasps is to pick out the plants that have (or had) the herbivore H. virescens on them (that is, that there is no evolutionary benefit in imprecise landing determinations), we can quantify the equivocation in the reception of the chemical message by looking at the wasps' behavioral success in achieving their goal. It was observed that wasps landed on H. virescens plants from 70% to 75% of the time [1], so we can use Equation 3 to calculate the joint information required to reach this success rate. For the case of p(z) = 0.25 and p(v) = 0.75, we obtain H(v,z) = H(0.75, 0.25) = 0.81 bits, and for the case of p(z) = 0.30 and p(v) = 0.70, we obtain H(0.70, 0.30) = 0.88 bits of information. This, then, is the range of the information actually acted upon by the wasps, as quantified by their landing behavior.
From our previous calculations we saw that 2.47 bits of unambiguous information were transmitted from the cotton plants (assuming all nine chemicals come into play, an assumption we re-examine below), even though only 1.00 bit of information would be required to make a choice between two types of plants to land upon. Yet apparently only about 0.85 bits of information (the average of 0.81 bits and 0.88 bits) were unequivocally acted upon by the wasps, as measured from their actual landing success. If we assume that the unambiguously transmitted information was 2.47 bits and the unequivocal information received was 0.85 bits, then, applying Equation 6 above, we have H(X|Y) = H(X,Y) − H(Y) = 2.47 − 0.85 = 1.62 bits as the equivocation in the information apparently received or understood by the wasps (Equation 6 represents the equivocation this time). Thus the wasps apparently receive the chemical messages unequivocally with only about 34% efficiency. However, there are alternative possibilities that might be experimentally tested, as we discuss in the next section.
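
A short sketch of this reception-side arithmetic (again, not the authors' code) is given below:

```python
# Minimal sketch: entropy of the observed landing proportions and the equivocation.
import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

h_low  = H([0.75, 0.25])            # about 0.81 bits
h_high = H([0.70, 0.30])            # about 0.88 bits
received = (h_low + h_high) / 2.0   # about 0.85 bits actually acted upon
transmitted = 2.47                  # unambiguous bits from Section 4.1
print(round(transmitted - received, 2))  # about 1.62 bits of equivocation
print(round(received / transmitted, 2))  # about 0.34, i.e., roughly 34% efficiency
```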

5. Interpretation of Results

As already noted, we are not dealing here with a typical two-way communication system. Rather, we are dealing with a two-message system (made up of nine signals each) where the overlap (conditional probabilities) of each message may be considered "noise" to the other. It is the independent portions of each message, then, that guide the wasps to a correct decision. The subsequent behavior of the wasps can then be compared to the information transmitted. In other words, we are dealing solely with the non-ambiguous portion of the chemical messages sent by the cotton plant rather than the usual transmission and reception concept utilized in a two-way communication system (where equivocation instead measures reception fidelity of the same message). The unambiguous information transmitted to the wasps we found to be 2.47 bits, but the average information received (or at least acted upon) by the wasps, as determined by their landing responses, was only about 0.85 bits. Knowing (or assuming) that the goal or purpose of the transmission was to determine which plants to land upon, we can compare this with how well the wasps might have performed given the unambiguous bits transmitted.
Clearly, even though the cotton plants have developed a chemical “vocabulary” of nine signal types, this communication system is not being fully utilized since there are 1.62 bits of equivocation. One could posit that this communication system may still be evolving and so has not reached a high level of efficiency yet. So one may be measuring a current limit on the wasps’ ability to detect and process chemical information that is not yet altogether familiar (i.e., the communication system has not yet evolved to be very efficient).
Another possibility could be that there are (or could have been in the past) more herbivore types feeding on this cotton plant, and the plants may therefore have retained the capacity to encode for more herbivore types. For this nine-chemical communication system, measured to transmit 2.47 bits unambiguously, 2^2.47 ≈ 5.5 choices implies a "vocabulary" large enough to specify five types of herbivores. (We note that other trace chemicals may also have been present in the experiment but were apparently not detected [1].)
Alternatively, nine separate chemicals may simply be present as a residual of natural chemical pathways inherent in the plants' ability to manufacture chemicals. In this case not all nine chemicals may be needed, implying that the wasps may be receiving a smaller number of chemical types, but receiving them more efficiently. (They may also receive all the chemicals at once but perceive them as "vectors," that is, as a two-signal system made up of groups of chemicals.) To illustrate, let us take only the most abundant chemical signal for H. zea infested plants, (E)-4,8-dimethyl-1,3,7-nonatriene, as the only necessary chemical to be detected by the wasps. Following the previous procedure for the calculation of fractional probabilities, we obtain p(i) = 0.2881 and p(j) = 0.7119, giving H(v) = 0.35 bits and H(z) = 0.52 bits. If we assume, for now, that the abundance probabilities are independent in this case, then the joint entropy is the sum of the marginal entropies, and we obtain H(v,z) = 0.87 bits. This result is much closer to the average 0.85 bits of information derived solely from the quantified landing behavior of the wasps themselves. Under this interpretation, then, only one chemical may be of importance to the wasps in determining where to land. Thus, one may be able to use the quantification of information to motivate further experiments that might determine the number and types of chemicals that are of actual importance to the wasps in this communication system.
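
A brief sketch of this single-chemical scenario, using the Table 1 abundances of (E)-4,8-dimethyl-1,3,7-nonatriene (not the authors' code), is given below:

```python
# Minimal sketch: the single-chemical scenario of Section 5.
import math

v_ng, z_ng = 35612, 88000            # (E)-4,8-dimethyl-1,3,7-nonatriene, from Table 1
p_v = v_ng / (v_ng + z_ng)           # about 0.2881
p_z = z_ng / (v_ng + z_ng)           # about 0.7119
h_total = -(p_v * math.log2(p_v) + p_z * math.log2(p_z))
print(round(h_total, 2))             # about 0.87 bits, close to the 0.85 bits acted upon
```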

6. Summary and Conclusions

Measurements have been made of the information entropy of many human languages as well as computer languages, DNA, ant chemical signals [16], the bee dance [17], and many vocal animal communication systems, including the more complex communication systems of species such as bottlenose dolphins, squirrel monkeys, and humpback whales. Information theory has even been applied before to a simple communication system across phyla, between a vertebrate (a goby fish) and two species of crustaceans (both shrimp) [18]. But, to our knowledge, this may be the first attempt to quantify a communication system between the plant and animal kingdoms. From the Zipf's Law analysis we can say that this chemical communication system appears to be more highly redundant than human languages as well as the complex animal communication systems studied to date. Analyzing the data gathered by DeMoraes et al. [1], we found that the wasps' reception of the plant "communiqué," as evidenced by their success in finding the desired plants, indicates that the wasps may be inefficient "readers" of the chemical signals (i.e., evolutionarily new at this behavior) or, alternatively, may be "processing" only a few, or even only one, of the chemical signals, thereby making their landing selections inefficient. On the other hand, the cotton plants may have evolved to signal for more herbivore types. For example, it is known that cotton has many herbivorous pests; [19] lists seventeen separate infestation types. So it may not be surprising that the cotton plants might have a chemical vocabulary of at least nine signal types, which could theoretically, in "di-gram" combinations, allow up to 36 signal types in a two-combinatorial signaling system. The communication system studied here could have evolved to save the wasps energy in finding the right plant to land on. However, the advantage to the cotton plant is less clear, as the wasp does not destroy the herbivore immediately (using the herbivorous host for egg laying), so the herbivores remain feeding on the cotton plant for some time after the chemical signaling.
In conclusion, information theoretic analysis of non-human (and even non-animal) communication systems may be able to provide insights into the efficiency of transmission and reception, and perhaps even into the evolutionary development, of diverse types of signaling systems, even those that cross the plant and animal kingdoms, insights that might otherwise be inaccessible.

Acknowledgements

The author would like to express his appreciation to Dr. Mike Rechlin of Principia College, Illinois for first bringing specific literature on plant chemical communication systems to his attention. I would also like to thank Drs. Brenda McCowan and Sean Hanser of University of California, Davis, J. Ellen Blue of SRIC in Menlo Park, California, and Chris Neller of the SETI Institute for their help and encouragement. The author wishes to also thank the seven referees of this paper for their very helpful comments spanning a range of diverse disciplines. Their encouragement and insights were greatly appreciated and significantly improved this paper. Finally thanks to Dr. Peter Harremoës for helpful recommendations for updating the mathematical notation.

References

  1. DeMoraes, C.M.; Lewis, W.J.; Paré, P.W.; Alborn, H.T.; Tumlinson, J.H. Herbivore-Infested Plants Selectively Attract Parasitoids. Nature 1998, 393, 570–573. [Google Scholar]
  2. Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423, 623–656. [Google Scholar]
  3. Quastler, H. A Primer on Information Theory. In Information Theory in Biology; Yockey, H.P., Platzman, R.L., Quastler, H., Eds.; Pergamon Press: New York, NY, USA, 1958; pp. 3–49. [Google Scholar]
  4. Pierce, J.R. An Introduction to Information Theory: Symbols, Signals, and Noise; Dover Publications: New York, NY, USA, 1980. [Google Scholar]
  5. Yaglom, A.M.; Yaglom, I.M. Probability and Information; Reidel Publishers: Boston, MA, USA, 1983. [Google Scholar]
  6. McCowan, B.; Hanser, S.F.; Doyle, L.R. Quantitative Tools for Comparing Animal Communication Systems: Information Theory Applied to Bottlenose Dolphin Whistle Repertoires. Anim. Behav. 1999, 57, 409–419. [Google Scholar] [PubMed]
  7. McCowan, B.; Doyle, L.R.; Kaufman, A.B.; Hanser, S.; Burgess, C. Detection and Estimation of Complexity and Contextual Flexibility in Nonhuman Animal Communication. In Evolution of Communicative Flexibility; Complexity, Creativity, and Adaptability in Human and Animal Communication; Oller, D.K., Griebel, U., Eds.; MIT Press: Cambridge, MA, USA, 2008; pp. 281–303. [Google Scholar]
  8. Yockey, H.P. Information Theory, Evolution, and the Origin of Life; Cambridge University Press: Cambridge, UK, 2005. [Google Scholar]
  9. Zipf, G.K. Human Behaviour and the Principle of Least Effort; Addison-Wesley Press: Cambridge, MA, USA, 1949. [Google Scholar]
  10. Cancho, R.F.; Solé, R.V. Least Effort and the Origins of Scaling in Human Language. Proc. Natl. Acad. Sci. USA 2003, 100, 788–791. [Google Scholar]
  11. McCowan, B.; Doyle, L.R.; Jenkins, J.M.; Hanser, S.F. The Appropriate Use of Zipf's Law in Animal Communication Studies. Anim. Behav. 2005, 69, F1–F7. [Google Scholar]
  12. McCowan, B.; Doyle, L.R.; Hanser, S. Using Information Theory to Assess the Diversity, Complexity, and Development of Communicative Repertoires. J. Comp. Psychol. 2002, 116, 166–172. [Google Scholar] [PubMed]
  13. Steinberg, J.B. Information Theory as an Ethological Tool. In Quantitative Methods in the Study of Animal Behavior; Hazlett, A., Ed.; Academic Press: New York, NY, USA, 1977; pp. 47–74. [Google Scholar]
  14. Doyle, L.R.; McCowan, B.; Hanser, S.F.; Bucci, T.; Chyba, C.; Blue, J.E. Applicability of Information Theory to the Quantification of Responses to Anthropogenic Noise by Southeast Alaskan Humpback Whales. Entropy 2008, 10, 33–46. [Google Scholar] [CrossRef]
  15. Hanser, S.F.T. Toward the Social and Acoustic Ecology of Social Foraging Humpback Whales (Megaptera novaeangliae) in Southeast Alaska. Ph.D. Dissertation, University of California: Davis, CA, USA, 2009. [Google Scholar]
  16. Wilson, E.O. Chemical Communication Among Workers of the Fire Ant Solenopsis Saevissima 2. An Information Analysis of the Odour Trail. Anim. Behav. 1962, 10, 148–158. [Google Scholar]
  17. Haldane, J.; Spurway, H. A Statistical Analysis of Communication in Apis Mellifera and a Comparison with Communication in Other Animals. Insectes Soc. 1954, 1, 247–283. [Google Scholar] [CrossRef]
  18. Preston, J. Communication Systems and Social Interactions in a Goby-Shrimp Symbiosis. Anim. Behav. 1978, 26, 791–802. [Google Scholar] [CrossRef]
  19. Pests of Cotton. Available online: http://ipm.ncsu.edu/AG271/cotton/cotton.html (accessed on 19 August 2009).
