1. Introduction
Ion channels are transmembrane proteins that fluctuate between open (ion-conducting) and closed (non-conducting) conformational states. These conformational fluctuations are called channel gating. The primary experimental source of information about channel gating is the patch clamp method [
1]. It allows for recording single-channel activity in the form of a time series of ionic currents (
Figure 1) under controlled conditions. Depending on the ion current amplitude, open and closed channel states can be distinguished. The main characteristics of the exemplary signal, presented in
Figure 1, are common for most ion channel types.
Markovian diagrams, with specific interconnections between the particular states and the respective transition probabilities, represent the standard kinetic model of these conformational changes [
3,
4,
5]. In the simplest case, channel activity can be represented by only one open and one closed state, but frequently a more sophisticated approach involving several open and/or closed states is adopted. The Markovian approach to modeling gating dynamics has remained the most popular throughout the years [
6]. It allows us to correctly describe many kinetic characteristics of the empirical system, such as the open-state probability or the dwell-time distributions of the open and closed channel states. Nevertheless, the dynamical properties of channel gating still require clarification in many aspects. Ion current sequences are the richest source of information about the channel system’s dynamics. Patch clamp recordings require appropriate data analysis methods that account for their highly complex characteristics and nonlinear properties [
7,
8].
Quantifying the complexity of a physical system is a tricky task. Historically, Ludwig Boltzmann introduced a measure of the number of possible states of the microscopic world, connecting it to Rudolf Clausius’ concept of entropy and the irreversibility of macroscopic processes. Entropy lies at the heart of the second law of thermodynamics, which states that the entropy of an isolated system cannot decrease with time, implying that it reaches its maximum at equilibrium. Entropy is a thermodynamic property that measures the amount of thermal energy in a system that is unavailable for doing useful work. It is now associated with disorder, randomness, or uncertainty, rather than with the actual state-counting process. As such, it can serve as a measure of complexity: the more complex the system, the higher the entropy it possesses. In 1948, Claude Shannon introduced a measure of missing information as an analog of thermodynamic entropy [
9]. Since then, information entropy has become a key concept of information theory. It has gained considerable popularity and effectiveness among the range of techniques applicable to biological signals. The appeal of entropy measures lies in their ability to characterize the rate at which a dynamical system creates valuable information, to quantify the level of uncertainty, or to indirectly describe the number of available states, all of which can have a direct impact on many biological aspects [
10]. Different kinds of entropy measures, in the form of the Shannon, Kolmogorov, approximate, or sample entropy, are used in the analysis of electrophysiological signals, including cardiac rate variability [
11,
12], electromyography (EMG) [
13], and electroencephalography (EEG) [
14], to name but a few.
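To make the most basic of these measures concrete, a minimal sketch of a histogram-based Shannon entropy of a sampled signal might look as follows (the bin count is an arbitrary illustrative choice, not a value taken from this work):

```python
import numpy as np

def shannon_entropy(signal, bins=64):
    """Shannon entropy (in bits) of the amplitude histogram of a signal.

    The probability of each amplitude 'state' is estimated from the
    normalized bin counts; empty bins are skipped since p*log2(p) -> 0.
    """
    counts, _ = np.histogram(signal, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log2(p)))
```

A signal spread uniformly over four equally populated bins yields exactly 2 bits, while a constant signal yields 0, matching the intuition that entropy counts the effectively accessible states.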
The ion channel signal reflects the complexity of the channel’s switching among its available states. Only a few reports have addressed entropy-based patch clamp data analysis [
2,
15,
16]. In [
15], the values of the sample entropy of signals characterizing the activity of the large-conductance calcium-activated potassium (BK) channel were investigated in the context of the effects of membrane strain and possible changes in membrane morphology after a series of suction impulses. The work [
16] describes the utilization of information entropy in distinguishing the patch clamp signals of mitochondrial BK channels (mitoBK) obtained for different cell types. In [
2], the authors applied the concept of entropy to the classification of mitoBK channel activity under changing experimental conditions (voltages). In addition, they used multiscale entropy to select the optimal sampling frequency of the ion current recordings.
Deterministic forces and stochastic thermal fluctuations shape single-channel activity. In the works mentioned above, the entropy values were based only on experimental ion channel recordings. The resulting complexity description therefore combined characteristics of the signal shaped both by deterministic protein–protein, protein–ligand, or protein–lipid interactions and by the thermal fluctuations of the helices forming the channel and of its membrane surroundings. It is challenging to extract from the experimental data information about the relative impact of deterministic forces and stochastic thermal fluctuations on channel gating and signal entropy. Therefore, applications of information entropy to patch clamp recordings have so far been limited to detecting differences in signal complexity under appropriately changed experimental conditions. In this work, we decided to go a step further and provide a more detailed description of how the properties of the conformational space and the number of possible states of the channel influence entropy.
To that end, we use data obtained from a simulation of a relatively simple Markovian channel gating model and investigate the effects of changes in the conformational diffusion space parameters on signal complexity. The investigated parameters refer to the channel’s energetic landscape that governs the open–closed fluctuations in a confined space. Moreover, we decided to describe the signal using several different entropy measures, concentrating on two leading groups. We consider the standard Shannon information entropy and its frequency-based analog, the spectral entropy. These two measures are based on probability distributions or power spectral density functions, and the interpretation of their values is limited to a statistical description of the data. For a comprehensive characterization of ion channel activity, we also selected entropy measures that quantify the information rate while accounting for the system’s dynamics. Sample entropy (SampEn) is one such measure, with proven effectiveness in biological signal classification [
17] and the analysis of EEG and EMG signals [
18]. However, popular entropy measures can sometimes underestimate the valuable information in complex data sets. The slope entropy, for instance, has a higher discriminating power in application to complex and numerically demanding data [
18]. Since its introduction in 2019, the slope entropy algorithm has gained popularity for biological time series analysis. Moreover, the measure of slope entropy was successfully applied in fever diagnoses [
19]. It has also served as a valuable feature in machine-learning-based ECG signal classification [
20]. The method has been used for other types of signals, such as bearing fault signals [
21] and ship-radiated noise signals [
22].
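To make the dynamics-aware measures concrete, here is a minimal, unoptimized sketch of the sample entropy algorithm (the embedding dimension m and the absolute tolerance r are illustrative defaults, not the values used in this study):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B), where B counts pairs of length-m templates
    closer than r (Chebyshev distance, self-matches excluded) and A counts
    the analogous pairs of length m+1.  Here r is an absolute tolerance."""
    x = np.asarray(x, dtype=float)
    n = len(x)

    def match_count(length):
        # The same number of templates (n - m) is used for both lengths,
        # following the standard SampEn convention.
        templates = np.array([x[i:i + length] for i in range(n - m)])
        count = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= r))
        return count

    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")
```

For a perfectly regular series every template matches every other, so A/B = 1 and SampEn = 0; irregular series yield progressively larger values.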
Given the effectiveness of these techniques, we decided to compare the slope and sample entropy results in order to test possible scenarios of information loss in such a complex ion channel system. In particular, we test the response to changes in the energetic landscape of the conformational states (corresponding to the stability of the channel’s available open and closed conformations) and in the relative noise intensity (temperature).
3. Results
The representative trajectories obtained by model simulation are presented in
Figure 3. As one can see, the overall signal characteristics mimic the patch clamp recordings. The effects of the steepness of potential wells (
b) can be observed in the form of a varying range of the single-channel current amplitudes, which correspond to the open and closed states, at a fixed noise level (determined by the
D parameter). Thus, after appropriate parameter optimization, the model is expected to enable the imitation of the experimental ionic current distributions.
The following results of the numerical simulations are presented for the simplest possible setup with only two allowed states—one open and one closed—as the proposed complexity measures are not sensitive to the number of channel states (provided they all exhibit the same characteristic open/closed conductance). Regardless of the number of states in the ion channel kinetics, the ionic current would always take the same value in any open (or, similarly, closed) state. The possibility of channel current sublevels is not considered here, since it is rarely exhibited in biological systems. The proposed measures are all based on the values in the data series, and since there is no difference in current value between, say, two distinct open states, one would not see any change in entropy.
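For orientation, the type of dynamics analyzed here can be sketched as follows. This is only an illustrative reimplementation, not the exact model of this work: the quadratic well shape, the positions of the open/closed minima, and the Markovian switching rates below are all assumed values.

```python
import numpy as np

def simulate_gating(t_max=100.0, dt=1e-3, b=4.0, D=0.05,
                    k_open=0.5, k_close=0.5, seed=0):
    """Euler-Maruyama integration of an overdamped Langevin equation,
    dx = -U'(x) dt + sqrt(2 D) dW, with a quadratic well U(x) = b (x - x_s)^2
    whose minimum x_s jumps between a closed (x = 0) and an open (x = 1)
    position according to a two-state Markov process."""
    rng = np.random.default_rng(seed)
    n = int(t_max / dt)
    x = np.empty(n)
    x[0] = 0.0
    state = 0                          # 0 = closed, 1 = open
    minima = np.array([0.0, 1.0])      # illustrative well positions
    noise_scale = np.sqrt(2.0 * D * dt)
    for i in range(1, n):
        # Markovian switching of the conformational state
        rate = k_open if state == 0 else k_close
        if rng.random() < rate * dt:
            state = 1 - state
        # drift from the quadratic well: -U'(x) = -2 b (x - x_s)
        drift = -2.0 * b * (x[i - 1] - minima[state])
        x[i] = x[i - 1] + drift * dt + noise_scale * rng.standard_normal()
    return x
```

In this sketch, the steepness b controls how tightly the current fluctuates around each conductance level, while D sets the thermal noise intensity, mirroring the two parameters varied in the results.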
3.1. Temperature Influence
Calculating the information entropy for white Gaussian noise, for which all the possible states are allowed, would result in the maximum entropy value [
31]. In terms of Langevin dynamics, this situation describes the free Brownian particle with no potential force present. In such a situation, with no other energy scale to compare to, the effective information loss will be the same, regardless of the noise intensity (heat bath temperature), and the function of (any) entropy
versus D will remain constant for any value of
D.
If, however, one restricts the particle’s ability to travel by placing it inside a confining potential, the whole picture changes. The Brownian particle cannot pass through the barriers of the proposed quadratic potential and remains located around the potential minimum, in either the open or the closed state. Increasing the temperature increases the thermal energy available to the particle. This, in turn, enables the particle to travel further against the gradient and visit distant locations in the potential well, causing the global complexity to rise.
We can confirm the expectations for three of the four selected entropy measures. In
Figure 4, the four selected entropy measures (Shannon, spectral, sample, and slope) are shown
versus the noise intensity
D. In this scenario, the particle encounters the energy barrier, which confines it around the potential minimum of either the open or the closed state. The value of the Shannon entropy rises steadily with increasing thermal energy
D, up to the point where it saturates at a certain intensity. Spectral and sample entropies remain constant over a broad range of intensities D; the former is slightly more sensitive to temperature change and starts to increase at a somewhat lower intensity than the latter. The slope entropy behaves somewhat differently and reveals a non-monotonic tendency with increasing D: it initially drops slightly, reaches a minimum near unity, and then starts to increase. The existence of extrema in a complexity measure usually indicates the occurrence of critical phenomena. Here, we could not find any such behavior in the vicinity of the minimum, and its presence remains a puzzle. As the SlopEn algorithm rather aggressively compresses the information carried in the data, this effect may be caused by the algorithm’s structure.
Above a certain noise intensity, the structure of the signal is no longer similar to that registered in the patch clamp experiment, and the analysis becomes meaningless. In
Figure 5, we present the trajectories (ionic currents) for selected values of the noise intensity, together with the corresponding distributions (histograms). Please note the vanishing separation of states in the distributions for the higher noise intensities (up to D = 10).
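The rise of the Shannon entropy with noise intensity can be reproduced in a few lines. The sketch below is a deliberately simplified stand-in for the full Langevin dynamics: it replaces the simulated trajectory with an idealized two-level current plus additive Gaussian noise of intensity D, and the fixed bin grid is an arbitrary choice.

```python
import numpy as np

def hist_entropy(x, edges):
    """Shannon entropy (bits) over a fixed bin grid."""
    counts, _ = np.histogram(x, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(1)
levels = rng.integers(0, 2, 20000).astype(float)   # idealized open/closed current
edges = np.linspace(-2.0, 3.0, 201)                # fixed bins for comparability
noise_intensities = (0.001, 0.01, 0.1, 1.0)
entropies = [hist_entropy(levels + np.sqrt(D) * rng.standard_normal(20000), edges)
             for D in noise_intensities]
# Entropy grows monotonically with D: the hotter the bath, the more
# amplitude 'states' the signal effectively visits.
```

On a fixed bin grid, each decade of D broadens the amplitude distribution and so strictly increases the histogram entropy, in line with the behavior of the Shannon curve described above.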
3.2. Localization
Reducing the particle’s ability to travel should reduce the information complexity and randomness of the system, as it limits the number of possible states of the Brownian particle. In our case, the most direct way to influence the localization is to adjust the landscape parameter
b of the potential (
10). In
Figure 6, we present all four selected entropy measures (colors and line types stay the same as in
Figure 4).
Increasing the parameter
b causes the narrowing of the potential walls; see
Figure 2 for details. It will also cause the probability distribution of possible states to narrow. In
Figure 7a, the distributions of the positions of the Brownian particle are plotted. Inspecting these graphs, one expects a decreasing Shannon entropy, which depends on the number of possible states (the width of the histograms).
The narrower the distribution, the fewer accessible states, and the lower the Shannon entropy, as it is given solely as a function of the distribution. Therefore, it is not surprising that the Shannon entropy curve decreases monotonically as a function of the increasing value of
b, regardless of the scaled temperature considered—cf. solid blue lines on both panels of
Figure 6.
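This monotonic decrease follows directly from the Boltzmann statistics of an overdamped particle: assuming a well that is locally of the form U(x) = b x², the stationary distribution is Gaussian with variance D/(2b), so larger b means a narrower histogram. A quick numerical check (the bin grid and parameter values are illustrative, not those used in this work):

```python
import numpy as np

def hist_entropy(x, edges):
    """Shannon entropy (bits) over a fixed bin grid."""
    counts, _ = np.histogram(x, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(4)
edges = np.linspace(-3.0, 3.0, 241)     # fixed bins for comparability
D = 0.05                                # fixed noise intensity (illustrative)
# Steeper wells (larger b) -> stationary std sqrt(D / (2 b)) shrinks
entropies = [hist_entropy(rng.normal(0.0, np.sqrt(D / (2.0 * b)), 20000), edges)
             for b in (0.5, 1.0, 2.0, 4.0)]
```

Each doubling of b halves the stationary variance and removes about half a bit of histogram entropy, reproducing the monotonic trend of the solid blue curves.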
Spectral entropy, on the other hand, behaves somewhat unexpectedly. The PSD inspection can explain its increase—cf.
Figure 7b. The area under the curve increases with parameter
b, regardless of the noise intensity. This, in turn, should increase an entropy measure based on this characteristic—cf. the orange dashed line in
Figure 6. However, for the higher noise intensity, the tendency is not monotonic, and the spectral entropy assumes a minimum at an intermediate value of b.
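For reference, the spectral entropy used in this comparison can be sketched as the Shannon entropy of the normalized periodogram (the normalization by the number of frequency bins is one common convention, not necessarily the one adopted here):

```python
import numpy as np

def spectral_entropy(x):
    """Shannon entropy of the normalized power spectral density, divided by
    log2(number of frequency bins): ~1 for white noise (flat PSD), ~0 for a
    pure tone (all power concentrated in a single bin)."""
    psd = np.abs(np.fft.rfft(x - np.mean(x))) ** 2
    psd = psd[1:]                 # drop the (zeroed) DC bin
    p = psd / psd.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)) / np.log2(len(psd)))
```

The more evenly the signal’s power is spread across frequencies, the closer the value gets to its maximum of 1.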
Sample entropy (dashed-dotted green line in
Figure 6) is a decreasing function of the growing landscape parameter for the low noise intensity, a behavior known for the classical Shannon entropy. The lower complexity is somewhat expected for an algorithm based on estimating the frequency of the possible template vectors: the narrower the potential wells, the lower the number of unique patterns. For the higher scaled temperature, SampEn surprisingly reveals a maximum at an intermediate value of b. Similarly to the spectral entropy, one may suspect some critical phenomenon around that value [
45,
46].
In contrast to the temperature dependence, slope entropy remains constant for all examined values of b. This suggests that the increments of the ionic current over time are statistically identical regardless of the shape of the conformational potential. Still, one can notice this measure’s lower, although still constant, value for the higher noise intensity. Being based on the increments rather than on the values themselves, this information-loss estimate remains insensitive to localization changes.
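The increment-based character of the slope entropy, which explains this insensitivity, can be seen in a minimal sketch of the algorithm (the thresholds gamma and delta and the embedding length m are illustrative defaults):

```python
import numpy as np
from collections import Counter

def slope_entropy(x, m=3, gamma=1.0, delta=1e-3):
    """Slope entropy: consecutive increments are symbolized into five classes
    (+2 steep rise, +1 rise, 0 flat, -1 drop, -2 steep drop) using the
    thresholds gamma > delta, and the Shannon entropy of the (m-1)-symbol
    pattern frequencies is returned (in nats)."""
    d = np.diff(np.asarray(x, dtype=float))
    symbols = np.select(
        [d > gamma, d > delta, d >= -delta, d >= -gamma],
        [2, 1, 0, -1],
        default=-2,
    )
    w = m - 1  # pattern length in symbols
    patterns = Counter(tuple(symbols[i:i + w]) for i in range(len(symbols) - w + 1))
    total = sum(patterns.values())
    p = np.array([c / total for c in patterns.values()])
    return float(-np.sum(p * np.log(p)))
```

Because only the increments enter the symbolization, reshaping the distribution of the current values while keeping their increments statistically unchanged leaves the result untouched.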
4. Discussion
The movement of ions across a biological membrane through channels is passive, meaning that it occurs without any input of energy from the cell. The electrostatic potential gradient and the difference in ion concentration between the two sides of the membrane drive this movement. Active ion transport, in contrast, is the movement of ions across a biological membrane against the gradient, from an area of lower concentration to an area of higher concentration. This process requires energy input from the cell, often in the form of adenosine triphosphate. The energy input pumps ions across the membrane against their gradient, creating concentration and electrochemical gradients. Active ion transport plays a crucial role in maintaining ionic balance and cell homeostasis; examples include the sodium–potassium pump and the proton pump. Most works model one of these mechanisms by employing continuum or polarizable models; see [
44] and references therein.
In this work, we focused solely on modeling the ionic current without assuming a priori the type of transport across the cell membrane. We built a simple model with a two-dimensional parameter space by means of the overdamped Langevin equation. The main goal was to describe the potential effects of temperature and of the channel’s energetic landscape for conformational changes on the ionic current’s complexity, without going into unnecessary model details. To accurately describe the information entropy of the biosystem, we used two classic measures of randomness, the Shannon and spectral entropies, as well as two relatively new quantifiers that are very successful in the classification of biological signals, the sample and slope entropies.
The Shannon entropy exhibits typical, predictable behavior when we force the system to change its number of possible states, either by increasing the temperature (the entropy also increases) or by forcing a stronger localization of the Brownian particle (the entropy decreases). The spectral entropy behaves as expected with temperature changes and increases as we heat the system. It is slightly less predictable with respect to changes in the localization of the Brownian particle: for particles with a reduced possibility of random motion, the PSD-based entropy increases at relatively low temperatures, whereas for higher temperatures this randomness characteristic shows a minimum for a specific shape of the potential well.
Complexity measures based on system state vectors calculated directly from the raw data values seem less predictable than their classical counterparts. The sample entropy behaves similarly to the classical Shannon measure as a function of temperature and increases with it, showing an increase in the unpredictability of the biosystem. Additionally, similarly to the Shannon entropy, it reacts to the restriction of the particle’s motion, showing a decrease in entropy for particles restrained from visiting states distant from the minima, at least at lower temperatures. At higher temperatures, where the position close to the minimum is less well defined and the particles can travel much higher along the potential walls, SampEn shows a maximum for a particular potential configuration. There may be a critical phenomenon in the dynamics, although we have yet to find any evidence of it during a careful examination of the trajectories and state vectors. Slope entropy exhibits slightly different behavior with increasing temperature: we find a shallow minimum as a function of D. It seems surprising that this measure does not respond to the increased localization of the Brownian particle. This can be explained by the unchanged probability of occurrence of the ion current increments.
5. Conclusions
The effects of two parameters affecting single-channel gating on the complexity of the corresponding time series of ionic currents have been analyzed. In particular, we analyzed the effects of temperature and steepness of potential wells separating a channel’s open and closed functional states on the values of signals’ information entropy. The first analyzed parameter, scaled temperature
D, is mostly related to the signal-to-noise ratio in the experimental patch clamp recordings. The second one,
b, describes the effects of energetic and spatial constraints on the channel gate dynamics. Since both aspects are hardly controllable in a biological system, our studies were performed on simulated data, in which the gating mechanism is governed by the popular two-state Markovian approach [
1]. The simulated current trajectories allow us to directly observe the effects of thermal fluctuations, which represent the perturbations within the recordings, as well as of the gating constraints, on the characteristics of the signal representing the single-channel activity. The signals’ entropy was calculated with four different measures (Shannon, spectral, sample, and slope entropies), and the obtained results allowed us to observe the changes in the signals’ complexity.
The increase in temperature should be accompanied by an increase in entropy. In turn, stronger restrictions on the conformational diffusion, represented by the b parameter, are anticipated to lower the complexity. These predictions are based on thermodynamic considerations of the availability of states and of the complexity of the channel current values under the planned conditions. Our results allow us to conclude that only the Shannon definition provides an entropy measure that directly reflects the relations discussed above. Thus, the reliability and performance of the Shannon entropy in the analysis of time series describing single-channel gating dynamics can be highly rated and recommended. The results obtained with the other entropy measures are less straightforward to interpret. Therefore, their utility in ion channel-dedicated studies needs further examination.
Additional current states reported in the literature (so-called “sublevels”) can be incorporated to make the model more universal. Including additional open and closed states of different characteristic conductances could increase the randomness of the model, which could better imitate the currents recorded through some ion channel types in biological membranes. This can lead to a better understanding of ion transport processes and how they contribute to the functioning of biological membranes.