Article

An Autonomous Monitoring System with Microwatt Technology for Exploring the Lives of Arctic Subnivean Animals

by Davood Kalhor 1,*, Mathilde Poirier 2,3, Gilles Gauthier 2,3, Clemente Ibarra-Castanedo 1 and Xavier Maldague 1
1 Department of Electrical and Computer Engineering, Université Laval, Quebec City, QC G1V 0A6, Canada
2 Department of Biology, Université Laval, Quebec City, QC G1V 0A6, Canada
3 Centre for Northern Studies, Université Laval, Quebec City, QC G1V 0A6, Canada
* Author to whom correspondence should be addressed.
Electronics 2024, 13(16), 3254; https://doi.org/10.3390/electronics13163254
Submission received: 23 June 2024 / Revised: 26 July 2024 / Accepted: 8 August 2024 / Published: 16 August 2024

Abstract

Understanding subnivean life is crucial, particularly because the small animals inhabiting this poorly known habitat play a major role in food webs. However, challenges such as remoteness and prolonged, harsh winters in the Arctic have hampered our understanding of subnivean ecology in this region. To address this problem, we present an improved autonomous, low-power system for monitoring small mammals under the snow in the Arctic. It comprises a compact camera paired with a single-board computer for video acquisition, a low-power microcontroller-based circuit that regulates the timing of video acquisition, and motion detection circuits. We also introduce a novel low-power method of gathering complementary information on animal activities using passive infrared sensors. The system is meticulously designed to withstand extreme cold, prolonged operation periods, and the limited energy provided by batteries, and its efficacy is demonstrated through laboratory tests and field trials in the Canadian Arctic. Notably, our system achieves a standby power consumption of approximately 60 µW, a seventy-fold reduction compared to previous equipment. The system recorded unique videos of animal life under the snow in the High Arctic. It equips ecologists with enhanced capabilities to study subnivean life in the Arctic, potentially providing insights to address longstanding questions in ecology.

1. Introduction

Understanding the Arctic environment is crucial for several reasons, including the important role of terrestrial Arctic ecosystems in the global carbon cycle [1]. Arctic ecosystems and their unique biodiversity are also highly sensitive to ongoing climate change [2], which is amplified at high latitudes [3]. Despite the long and harsh Arctic winter, which lasts up to eight months, a diverse array of animals live throughout the year in these regions. The subnivean environment, created by the snow cover, serves as a refuge for the tundra biome, shielding inhabitants such as small mammals from extreme cold and predators [4]. Arctic predators such as snowy owls (Bubo scandiacus), ermines, and foxes critically depend on small subnivean mammals for their survival [5,6]. Arctic lemmings are well known for their cyclic population fluctuations, a phenomenon that has intrigued ecologists for more than a century [7,8,9,10]. Changes in their population dynamics in some regions, possibly linked to a changing winter climate, are a source of concern, as they could lead to cascading effects on the entire tundra food web [6]. However, these phenomena remain poorly known, in part due to the paucity of information on the winter ecology of lemmings.
To overcome this obstacle, direct investigation of several aspects of the winter life of lemmings beneath the snow, including their reproductive and social behaviors, is necessary [11]. The significance of winter reproduction in lemming population dynamics seems to be substantial and is potentially influenced by snow conditions [12,13]. However, these small mammals are commonly studied using traditional methods, such as live trapping or monitoring winter nests [14,15], that are predominantly conducted during summer months due to the logistical complexities and high costs associated with accessing study sites in winter. Moreover, the formidable challenges posed by extremely low winter temperatures and extensive snow cover further compound these logistic hurdles.
Automated camera systems (also known as camera traps) [16,17] present a compelling alternative to address these challenges. Ethological studies utilizing camera traps offer several advantages over traditional methods. In contrast to live trapping, camera trapping is minimally invasive and enables unbiased, direct monitoring of animals in their natural habitats. Moreover, camera traps can be more cost-effective, requiring fewer on-site visits and personnel. Consequently, over the past two decades, camera traps have gained increasing popularity as an efficient tool for studying a diverse range of animals, including carnivores (e.g., tigers and leopards [18,19]), herbivores (e.g., elephants and deer [20,21]), birds [22], fishes [23], and more recently for studying squamates [24,25] and small mammals [26,27].
Numerous studies have demonstrated the efficacy of camera trapping for various applications in ecology and conservation: estimating population densities [28,29], collecting behavioral data and activity patterns [16,30,31], gathering occupancy data and estimating animal abundance [32,33,34] (including for endangered and cryptic species [19,35,36,37]), investigating the impact of human disturbances such as road construction, transportation, and military activities on activity patterns and habitat use [38,39], monitoring and detecting illegal human activities such as poaching [40], and estimating dispersal distances (i.e., the movement from the birthplace) [18,41]. While camera traps have been extensively utilized in various environments such as semi-arid regions, aquatic habitats, and tropical rainforests, their application in the Arctic has been notably limited, particularly for studying animals inhabiting the subnivean space, with few exceptions [42,43]. Utilizing a commercial camera from Reconyx Inc., a Norwegian team [42,43] successfully obtained photographs of small mammals (such as voles, shrews, and stoats) living in the subnivean space in northern Norway during winter. An analogous effort was undertaken in the Canadian High Arctic utilizing a camera from the same manufacturer [44] but yielded minimal success attributed to the system’s inadequate performance in the High Arctic. A significant difference between these investigations lies in the contrasting environmental conditions, as the northern Norwegian climate is subarctic, leading to higher temperatures and snowfall compared to the Canadian High Arctic. These distinctions are crucial, as Canadian researchers encounter significant issues with frosting, attributable to lower temperatures and thinner snow cover [44,45].
The majority of research with camera traps has relied on commercial camera systems, with insufficient attention given to developing systems tailored to specific environments such as the subnivean space [46]. Humbert et al. [47] developed a camera trap for underwater ecosystem surveillance utilizing a Raspberry Pi 3B+ (RPI3B), a camera connected to the RPI3B, and a customized board attached on top of the RPI3B with components such as a SparkFun PicoBuck module (to drive an array of LEDs) and a voltage regulator for powering the RPI3B. The system captures images of underwater organisms when movement is detected by computer vision algorithms. A major drawback of this system is its high power consumption in standby (30 mA), which limits its application for long-duration deployments. For example, to sustain only the standby current over a 300-day period, this system necessitates an exceedingly heavy and costly battery with a capacity of more than 200 Ah, making it impractical for many real-world applications, including in our study. Camacho et al. [48] designed a camera system for wildlife inventory purposes in the Amazon Rainforest. This system includes a motion detection sensor, a camera, an SD card, and a microcontroller that communicates with the two aforementioned components via a serial bus. Upon each trigger event, determined by the microcontroller monitoring the sensor signal, two photos are captured and saved onto the SD card. This system has certain limitations (see [46]) that make it unsuitable for studying subnivean animals in the Arctic.
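As a quick sanity check of that battery figure (using the same 300-day deployment length assumed later in this article), sustaining a 30 mA standby current alone requires a capacity of roughly

30\,\mathrm{mA} \times 300\,\mathrm{days} \times 24\,\mathrm{h/day} \approx 216\,\mathrm{Ah},

which is consistent with the more-than-200 Ah value quoted above.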
Although many camera-based ecological studies, including [42,47,48], have focused on capturing still images, there is a growing recognition of the advantages of moving images. While still images offer valuable insights, they lack the comprehensive information required to fully understand animal behaviors. In contrast, videography not only addresses this limitation but also significantly enhances the likelihood of species identification.
Building upon our previous experience [46], we believe that a system developed to monitor small mammals inhabiting the subnivean space in Arctic regions must meet several critical specifications, which are:
  • Ability to withstand the very low temperatures of the High Arctic during winter;
  • Resistance to icing and flooding (during the snowmelt season in spring);
  • Minimization of frost formation on the camera lens and motion detection sensors;
  • Capability to operate in the absence of daylight for extended periods without disturbing animals;
  • Capacity to take videos rather than photos;
  • Low power consumption and efficient energy management to allow a long recording season (lasting up to 10 months) with an affordable and relatively lightweight battery;
  • Autonomous operation with minimal risk of failure or malfunction;
  • Ability to collect data on subnivean animals 24 h a day throughout the recording season.
To the best of our knowledge, none of the existing camera systems, including those reviewed here, fully satisfy all these critical requirements. The system proposed in [46] was an attempt to build such a system. Although this was the first time that an automated camera system managed to capture valuable video footage of lemmings during winter in the High Arctic, it had some shortcomings. Notably, its energy consumption was inefficient, requiring 5.4 Ah of battery capacity to supply the standby current (750 µA) over a 300-day season. Additionally, the frequency of video recordings was constrained by setting a minimum time interval of one hour between two consecutive recordings, a measure taken to overcome the extreme power limitations imposed by working at remote Arctic study sites where regular battery replacements are impractical [46]. A direct consequence of using a time interval is the risk of missing information on the potential presence of an animal during this interval.
In this study, we aim to address these challenges by developing a more energy-efficient monitoring system. Our approach involves minimizing the energy consumption of each module within the system. For instance, we introduce a new real-time clock circuit specifically designed to reduce current consumption. We also present a novel method for collecting complementary information from animal activities using passive infrared sensor signals. A detailed description of this system is provided, and its performance is extensively evaluated through laboratory tests and field deployment in the Canadian High Arctic.

2. Hardware and Software Design and Implementation

The block diagram depicted in Figure 1 illustrates the key components comprising the proposed system, designated as ArcÇav 2. These components can be grouped into four units: the filming box, video acquisition unit (VAU), low-power processing unit (LPPU), and power unit (PU). Upon detection of a living animal in the filming box, motion detection modules send a signal to the LPPU. If some additional conditions are satisfied, the LPPU instructs the VAU to capture a brief video. Subsequent to the completion of this task and the storage of the video in external memory, the VAU is powered off, and the LPPU returns to power-down mode (sleep). The system is meticulously designed for minimal power consumption during standby. A comprehensive description of the principal components of the ArcÇav 2 and their respective functions is presented in the subsequent sections.

2.1. Filming Box

In order to capture the behaviors of small mammals beneath the snow as they circulate through a network of tunnels [49], the use of a specialized chamber becomes imperative. The filming chamber is an improved model of the aluminum box described in [46]. It has two openings connected to 2-inch pipes A and B, which facilitate the ingress and egress of small animals. We positioned the camera on the lateral side of the box (position C in Figure A1, Appendix A) to mitigate the risk of hoarfrost accumulation on the camera window [44].

2.2. Camera and Near-Infrared Illumination

Given the extended period during which the filming box remains beneath the snowpack and the absence of natural sunlight, inclusion of an artificial light source is necessary. We employed near-infrared illumination to prevent the disruptive effects of visible light on animals [44]. We chose a Raspberry Pi Camera Module v2.1, which is equipped with an 8-megapixel Sony IMX219 image sensor for filming. This camera offers several advantageous features such as a relatively wide angle of view (Figure A2), a compact size, and seamless compatibility with the chosen single-board computer (SBC); see [46] for further details.
To ensure comprehensive coverage of the filming box, the camera is positioned at a specific height (Figure A2a), which affords a substantial viewing range within the box, particularly the areas encompassing entrances A and B (Figure A2b,c). Nevertheless, fixing the camera to the lateral side of the box presents a drawback due to its fixed-focus lens. As a result, image sharpness varies as the animals move closer to or farther from the camera. To address this issue, we manually adjusted the maximum focus to a distance of 25 cm. A more detailed exploration of the optimal focal length is presented in [44].
The system generates the radiant energy needed for the near-infrared spectrum video recording using infrared emitting diodes (IR emitters). The technical specifications and configuration of the IR emitters are provided in Appendix A (Figure A2a,d).

2.3. Real-Time Clock Module

The single-board computer employed lacks a built-in mechanism for timekeeping when powered off. Given the critical importance of time and date records in studying animal behavior, we addressed this limitation by incorporating the DS3231, a low-power real-time clock (RTC) chip with an accuracy of ±3.5 ppm (equivalent to only ±2 min of error per year) over a temperature range of −40 °C to +85 °C. This device periodically measures the temperature (i.e., temperature conversion cycles; see Section 3.2) and corrects the oscillator frequency accordingly, thereby achieving an accuracy several times superior to typical crystal RTCs. The DS3231 includes a “power control” unit, connected to the VDD pin, that automatically switches to a secondary power supply, which is provided on the backup power-supply input (the BAT pin), when the primary power level drops below a certain threshold. The device also features an active-low interrupt and two time-of-day alarms. Figure 2a illustrates a conventional RTC circuit (RTC-M1) using this chip.
Communication with the chip is done via the serial data (SDA) and serial clock (SCL) pins using the inter-integrated circuit (I2C) serial interface. Furthermore, the SQW/INT multifunction pin is configured as an interrupt (INT) by setting the INTCN bit in the control register to 1. When either alarm register set matches the timekeeping registers, the corresponding flag (A1F or A2F) in the status register is set, and the SQW/INT pin is asserted. This pin is used to notify the main microcontroller of the RTC alarms.
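As an illustration of this configuration (a minimal Arduino-style C sketch only, not the actual firmware of this system), the following code enables alarm 1 on the SQW/INT pin and clears the alarm flags afterwards. The register addresses (control register 0x0E, status register 0x0F) and the 0x68 I2C address are taken from the DS3231 datasheet; the helper names are our own.

#include <Wire.h>

#define DS3231_ADDR  0x68   /* 7-bit I2C address of the DS3231 */
#define REG_CONTROL  0x0E   /* control register: INTCN (bit 2), A1IE (bit 0) */
#define REG_STATUS   0x0F   /* status register: OSF (bit 7), A2F (bit 1), A1F (bit 0) */

static void ds3231_write(uint8_t reg, uint8_t val) {
  Wire.beginTransmission(DS3231_ADDR);
  Wire.write(reg);
  Wire.write(val);
  Wire.endTransmission();
}

static uint8_t ds3231_read(uint8_t reg) {
  Wire.beginTransmission(DS3231_ADDR);
  Wire.write(reg);
  Wire.endTransmission();
  Wire.requestFrom(DS3231_ADDR, (uint8_t)1);
  return Wire.read();
}

void rtc_enable_alarm1_interrupt(void) {
  /* INTCN = 1 routes the alarms to the SQW/INT pin; A1IE = 1 enables alarm 1 */
  ds3231_write(REG_CONTROL, (1 << 2) | (1 << 0));
}

void rtc_clear_alarm_flags(void) {
  uint8_t status = ds3231_read(REG_STATUS);
  /* clear A1F and A2F so the open-drain INT line is released */
  ds3231_write(REG_STATUS, status & ~((1 << 1) | (1 << 0)));
}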
Despite the good specifications of the DS3231, its energy consumption is problematic for our goals because its active supply current is about 200 µA for V_DD = 3.6 V. After investigating the chip behavior, we identified the "power control" unit as the source of the excessive current. To mitigate this problem, we bypassed this function of the chip by supplying power to the device solely via the BAT pin while maintaining two power sources. In the new design (circuit RTC-M2, Figure 2b), the current drawn by the RTC chip (U1) is reduced to around 2 µA (the current required for timekeeping and temperature measurement). When the primary power source (V_CC1) is available, it is applied to the BAT input via D1. In this condition, transistor Q1 (a p-channel MOSFET) remains off because V_GS = 0 and V_CC1 > V_B2, and consequently, no current passes through R5. When V_CC1 falls below a threshold value (V_th = 2.93 V), the RST pin of U2, a chip designed for supervisory monitoring of microprocessor and digital systems with ultra-low current draw (1 µA maximum), transitions to a low state within a short period (the power-down reset delay, t_doff). This change pulls the gate of Q1 below the transistor's threshold voltage, causing Q1 to turn on and thereby connecting the backup battery (B2) to the U1 BAT pin. We selected this transistor because of its low gate threshold voltage (V_GS(th)), low drain-to-source on-resistance (R_DS(on)), and very low leakage current (less than 1 µA).
Although the transition delay t_doff of U2 is as small as 20 µs, capacitor C1 is still essential to provide the energy U1 requires within this period if V_CC1 quickly falls below V_BATmin. The proper capacitance of C1 can be calculated using the following equation:
C_1 > \frac{I \cdot \Delta t}{V_{th} - V_{D1} - V_{BATmin}}, \qquad (1)
where I is the current consumed by the chip U1 (via pin BAT) during the transition period, V_D1 denotes the voltage drop across D1, and V_BATmin = 2.3 V is the minimum voltage that U1 needs to continue operating. This equation was derived from the capacitor current–voltage relation, i = C\,dv/dt, subject to the constraint that the voltage of C1 (V_C1) during the transition should not fall below V_BATmin: V_th − V_D1 − ΔV_C1 > V_BATmin.
During normal operation, the U1 current for timekeeping is 1.2 µA, and we can show that a 100 pF capacitor satisfies the equation. However, during temperature measurements, which occur once every few seconds, the current increases up to 560 µA (for V_CC1 = 3.3 V) for a short period. In the rare scenario where the power source transition occurs during this period, a significantly larger capacitor would be required. Recalculating C1 for the worst-case scenario, i.e., I = 560 µA, V_D1 = 0.25 V (at I = 560 µA), V_th = 2.857 V (the minimum value over a temperature range of −40 °C to +85 °C), and t_doff = 40 µs (at −40 °C, the transition period doubles), yields a capacitance of around 73 nF. Recognizing the crucial role of this capacitor in ensuring proper functioning of the RTC after loss of the main power source, we opted for a larger capacitor (220 nF). With this capacitor, the voltage at the BAT pin never drops below 2.51 V.
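Plugging the worst-case numbers quoted above into Equation (1) reproduces the stated capacitance:

C_1 > \frac{560\,\mu\mathrm{A} \times 40\,\mu\mathrm{s}}{2.857\,\mathrm{V} - 0.25\,\mathrm{V} - 2.3\,\mathrm{V}} = \frac{22.4\,\mathrm{nC}}{0.307\,\mathrm{V}} \approx 73\,\mathrm{nF}.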
While the proposed circuit significantly reduces current consumption by a factor of 50–100, it does have a minor drawback. Once V_CC1 surpasses the preset threshold (V_th), the reset pin of U2 remains in a low state for a duration ranging from 100 to 200 milliseconds. During this interval, if V_CC1 exceeds V_B2 + V_F (where V_F denotes the forward voltage of D1), there is a risk of charging the battery at a rate of (V_CC1 − V_B2 − V_D1)/(R5 + R6), which is problematic for non-rechargeable batteries. To address this concern, we revised the circuit by replacing diode D1 with two back-to-back p-channel MOSFETs, as depicted in Figure 2c (i.e., RTC-M3). Capacitor C1 can be determined using Equation (1), with V_D1 substituted with zero. The new circuit offers an additional advantage: under the worst conditions, the minimum voltage at the U1 BAT pin (rtc_vbat) is 0.25 V (i.e., V_F at I = 560 µA) higher than in the previous design when the same capacitor is used. Alternatively, to sustain the same minimum level, the capacitor can be halved.
Our system necessitates two RTC modules: one for the main microcontroller and the other for the SBC. These modules differ slightly, with the SBC not utilizing any alarm interrupt; hence, there is no need for the installation of R1 (Figure 2).

2.4. Single-Board Computer

For the purpose of image acquisition and data storage, we employed the Raspberry Pi Zero (RPi0), an SBC with minimal peripherals. This very cost-effective and compact board is equipped with a Broadcom BCM2835 system-on-chip. Communication with peripherals is facilitated through several ports, including a camera serial interface (CSI) connector and a micro-SD card slot; see [46] for technical specifications. The power source, rated at 5 V, can be provided either via a micro-USB connection or the dedicated pins of a 40-pin connector.
The camera is connected to the board CSI connector using a 15-pin ribbon cable. To keep accurate time, the SBC communicates with the RTC module through the I2C serial interface. Additionally, one digital input is allocated for receiving commands from the main microcontroller, dictating whether to carry out a daily routine or initiate the recording of an animal behavior within the filming box. Three other general-purpose input/output (GPIO) pins are configured as digital outputs to regulate the operation of the IR emitters, while the fourth digital output serves to signal the main microcontroller upon completion of the video acquisition task.
Drawing upon the experimental findings and computations outlined in [46], we opted to install a SanDisk Ultra 32 GB Class 10 SDHC card into the micro-SD card slot, providing ample local storage capacity for our needs.

2.5. Motion Detection Circuit

To detect the presence of an animal within the filming box, we designed the circuit illustrated in Figure 3. Central to this configuration is a low-power passive infrared (PIR) motion detection sensor that yields a digital signal upon animal movement within its designated viewing angle: 90 degrees for standard motion and 44 degrees for subtle movements. This sensor operates at a mere 1–1.6 µA in sleep mode, which is a highly desirable attribute for our system.
To mitigate undue interruptions to the primary microcontroller (see Section 2.6), we incorporate a rising-edge-triggered pulse generator (circuit model MD-M1, Figure 3a). Upon application of a negative edge to the TRIG pin of chip U3, capacitor C4 begins charging through R4. When the voltage across the capacitor reaches two-thirds of V_CC1, an internal flip-flop resets the output and discharges the capacitor via an internal transistor. This state persists until a subsequent trigger signal is received. The pulse width (t_p) is determined by the following equation [50]:
t_p = -\ln(1/3)\,R_4 C_4 \approx 1.1\,R_4 C_4. \qquad (2)
Given the selected values for R4 and C4, the generated pulse width is approximately one second. The first part of the circuit (comprising components U2, R2, R3, and C2) generates a brief falling-edge pulse essential for the proper operation of U3, with diode D1 safeguarding the TRIG pin of this chip from voltage spikes caused by C2 during the low-to-high transition on the U2 output pin.
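For example, with hypothetical component values of R4 = 910 kΩ and C4 = 1 µF (the actual values used in the circuit are not listed here), Equation (2) gives

t_p \approx 1.1 \times 910\,\mathrm{k\Omega} \times 1\,\mu\mathrm{F} \approx 1.0\,\mathrm{s},

in line with the roughly one-second pulse reported above.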
The power consumption of this circuit is modest (see Section 3.1), as the quiescent current of U2, a CMOS gate, stands at merely 10 nA, and U3 necessitates a supply current of approximately 30 µA. The trigger and threshold currents of U3 fall within the picoampere range and are thus negligible. Despite this, diminishing the power consumption of the motion detection (MD) circuit to below 2–3 µA is desirable. To achieve this objective, a second circuit is designed (MD-M2, Figure 3b). It closely mirrors the behavior of the MD-M1 circuit but is implemented through a program (elaborated in Section 2.8) running on an 8-bit CMOS microcontroller with nanowatt technology. The chip current consumption in sleep mode is exceptionally low (as minute as 50 nA), and its watchdog timer (WDT) current approaches 2 µA for a supply voltage of 3 V.
The PIR sensor is equipped with an open-drain p-channel MOSFET at its output. Consequently, a pull-down resistor (R1) is needed to avert a floating state at the input of U2 when the MOSFET is inactive. While a larger resistor helps reduce the current drawn through R1, it also prolongs the falling time of the sensor output by up to 3(R1 ∥ R_pin)C_pin, where C_pin is the capacitance of the input pin of U2 (≤50 pF for both circuits). R_pin, on the order of gigaohms, can be disregarded in the computation. Even for R1 = 10 MΩ and C_pin = 50 pF, the increase in the falling time remains below 1.5 milliseconds, which is inconsequential for our study. However, to preempt potential issues associated with large resistors, such as larger voltage errors for the same leakage current, thermal noise, and increased susceptibility to environmental conditions (humidity and dust), we chose to keep this resistor below 5 MΩ.
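Using the extreme values mentioned above, and neglecting R_pin in the parallel combination,

3\,(R_1 \parallel R_{pin})\,C_{pin} \approx 3 \times 10\,\mathrm{M\Omega} \times 50\,\mathrm{pF} = 1.5\,\mathrm{ms},

which is the worst-case falling-time increase quoted in the text.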
Our system can be configured with up to three MD modules, referred to as MDx, where x ∈ {A, B, C} indicates the location of the module's PIR sensor (i.e., PIRx); see Section 2.9 for further details.

2.6. Main Microcontroller

To regulate the frequency and time of video acquisition and to manage a few other tasks, we employed the ATmega328P, an 8-bit AVR microcontroller (µC) with 23 programmable I/O lines, 32 KB flash memory (program memory), and 2 KB internal SRAM. This 32-pin chip boasts peripheral features, including three timers, a WDT with an on-chip oscillator, an 8-channel 10-bit analog-to-digital converter (ADC), a 2-wire serial interface (compatible with the I2C protocol), and a serial peripheral interface (SPI) port. The ATmega328P offers external and internal interrupts, including interrupt and wake-up on pin change, as well as various power-saving modes. In active mode, the power supply current ranges from a few hundred microamperes to 16 mA, depending on the oscillator frequency and supply voltage, while the power-down mode sees a very low current (≤1 µA).
For rapid prototyping, we utilized an Arduino Pro Mini µC board, which came with an ATmega328P and additional elements, including a reset button, a voltage regulator, and an 8 MHz crystal. To minimize power consumption in both the active and sleep modes, we modified the board as illustrated in Appendix A (Figure A3). Although the ATmega328P can operate within a voltage range of 1.8 V to 5.5 V, we supplied it with a 3.3 V power source to reduce power consumption, as the supply current at 5.5 V is roughly 2–3 times that at 3.3 V. The chip was configured to communicate with the RTC module through the I2C serial interface for programming the RTC, reading the time, and resetting alarms. The SQW/INT pin of the RTC (the rtc_int port in Figure 2) was connected to PD2 (external interrupt 0 input) of the µC, allowing it to be notified of the RTC alarm even during sleep without the need for frequent checks of the RTC alarm flags.
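The wake-on-alarm arrangement described here can be sketched as follows in Arduino-style C for the ATmega328P. This is an illustrative sketch rather than the deployed firmware: the ISR body, the use of the internal pull-up, and the placement of the hourly routine are assumptions; only the choice of PD2 (INT0) for the RTC interrupt comes from the text.

#include <Arduino.h>
#include <avr/sleep.h>
#include <avr/interrupt.h>

#define RTC_INT_PIN 2                  /* Arduino pin 2 = PD2 = external interrupt INT0 */

volatile bool rtcAlarmFired = false;

static void rtcAlarmIsr(void) {
  /* SQW/INT stays low until the alarm flag is cleared, so detach to avoid retriggering */
  detachInterrupt(digitalPinToInterrupt(RTC_INT_PIN));
  rtcAlarmFired = true;
}

static void sleepUntilRtcAlarm(void) {
  /* only a LOW level on INT0 can wake the ATmega328P from power-down mode */
  attachInterrupt(digitalPinToInterrupt(RTC_INT_PIN), rtcAlarmIsr, LOW);
  set_sleep_mode(SLEEP_MODE_PWR_DOWN); /* deepest sleep mode, <= 1 uA */
  noInterrupts();
  sleep_enable();
  interrupts();
  sleep_cpu();                         /* execution resumes here after wake-up */
  sleep_disable();
}

void setup(void) {
  pinMode(RTC_INT_PIN, INPUT_PULLUP);  /* the RTC INT output is open drain, active low */
}

void loop(void) {
  sleepUntilRtcAlarm();
  if (rtcAlarmFired) {
    rtcAlarmFired = false;
    /* hourly routine would go here: read the RTC, measure battery voltage and
       temperature, then clear the RTC alarm flags so the INT line is released */
  }
}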
Motion detection (MD) modules (MDA, MDB, and MDC) signal the presence of a moving animal via three different pins of the main µC, which are linked to distinct interrupt service routines as described in Section 2.8. The µC also monitors the main battery voltage status through the power unit, employing two pins (one analog input and one digital output), as detailed in Section 2.7 and Section 2.8. When the specific conditions outlined in Section 2.8 are met, the µC activates the VAU through a miniature latch relay, directing the VAU to either execute a daily routine or capture a brief video of an animal in the filming box. Upon task completion, the VAU is shut down.
The µC additionally records the ambient temperature using a DS18B20 sensor, with communication facilitated through a 1-wire serial interface. Collected data, including the date, motion detection counts (calculated from interrupts received from the MD modules), and temperatures, are written to an SD card using the SPI serial port of the µC.

2.7. Power Unit

The exclusive power source of the system is a single 6 V battery that should be able to sustain its operation for a duration of up to 10 months under the extremely cold conditions of the Arctic winter. Drawing upon the test results and energy estimations outlined in [46], we opted for the same battery model.
Two regulated voltages are derived from the battery: 3.3 V (V_CC1) and 5 V (V_CC2). V_CC2 powers the SBC, while V_CC1 supplies the remaining components of the system, including the microcontrollers, motion detection circuitry, and RTCs. To reduce the quiescent current of the power supply circuit and completely eliminate the quiescent current of the VAU during periods of inactivity, the V_CC2 supply is switched on solely during video acquisition.
The regulator used to generate V_CC2 has a maximum operating input voltage (V_IN) of 6 V, which is slightly lower than the fully charged battery voltage. To safeguard against overvoltage, a Schottky diode with a forward voltage of 0.4 V is incorporated. However, this approach limits the full utilization of the battery capacity, as the minimum V_IN must not fall below 5.5 V. To overcome this limitation, which also exists in [46], we added a relay to bypass the diode (D3) when the voltage drops below the threshold V_B1thL = 5.95 V (Figure 4). The selected relay is a double-pole double-coil latch type with a maximum contact resistance of 75 mΩ. Both poles are paralleled to minimize the dropout voltage across the contacts. To set or reset the relay, a 3 V pulse with a minimum width of 10 ms is applied to the relevant pins. This process, detailed in Section 2.8, is controlled by microcontroller U1. A voltage divider, consisting of resistors R3 and R4 (Figure 4), supplies half of the battery voltage to be measured by U1. The same divider is also utilized by the main microcontroller to assess the battery status. To minimize the current drawn by these resistors, they are connected to the battery only during measurement, by means of transistors Q1 and Q2, with Q2 serving the additional purpose of safeguarding against reverse polarity.
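A minimal Arduino-style C sketch of how a microcontroller might read the battery through this divider is shown below. The pin assignments, the 2 ms settling delay, and the 3.3 V ADC reference are illustrative assumptions; in the actual system, the divider is switched in and out through the power unit's microcontroller (Section 2.8), whereas here the switching is collapsed into a single enable pin for brevity.

#include <Arduino.h>

#define PIN_DIVIDER_EN  7      /* hypothetical output that switches the divider in (Q1/Q2) */
#define PIN_VBAT_SENSE  A0     /* hypothetical analog input tied to the R3/R4 midpoint */

/* Returns the battery voltage in volts. The divider presents half of the battery
   voltage, so the reading is doubled; a 3.3 V ADC reference is assumed.
   pinMode(PIN_DIVIDER_EN, OUTPUT) is assumed to have been called in setup(). */
float readBatteryVoltage(void) {
  digitalWrite(PIN_DIVIDER_EN, HIGH);        /* put the divider in service */
  delay(2);                                  /* let the divider node settle */
  uint16_t raw = analogRead(PIN_VBAT_SENSE); /* 10-bit result, 0..1023 */
  digitalWrite(PIN_DIVIDER_EN, LOW);         /* disconnect to avoid wasting current */
  float halfVbat = raw * 3.3f / 1023.0f;
  return 2.0f * halfVbat;
}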

2.8. Processing Algorithms

The entire process is orchestrated by four algorithms: one runs on the µC of the motion detection module (µCMD), another on the µC of the power unit (µCPU), the third on the µC of the low-power processing unit (µCLPPU, also referred to as the main µC), and the fourth on the single-board computer.
The µCMD procedure (Algorithm A1, Appendix B) generates regular pulses from PIR signals. To minimize µC energy consumption, two choices were made. Firstly, we utilized the low-current WDT of the device, which operates from a low-frequency internal oscillator (LFINOSC = 31 kHz). Secondly, the µC is kept in power-down mode (Sleep) as long as possible while all unnecessary peripherals are powered off. As detailed in Algorithm A1, µCMD enters Sleep mode shortly after power-up and remains in this state until it is awakened by the first PIR signal (pir_out, the voltage at the GP2 pin; Figure 3b). It then disables the external interrupt for this signal (designated as ExtInt in Algorithm A1), sets the MD module output (md_out) high, configures the WDT to t1 = 66 ms, enables the WDT, and returns to Sleep. When the WDT time-out occurs, the device wakes up, pulls md_out low, sets the WDT to t2 = 1057 ms, clears the WDT, and returns to Sleep again. During this time, pir_out is ignored. On the second WDT wake-up, the WDT is disabled and ExtInt is re-enabled (allowing pir_out to interrupt the device) before the µC goes back to Sleep mode.
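The flow of Algorithm A1 can be summarized in plain C as below. The register-level helpers (mcu_sleep(), wdt_start(), etc.) are hypothetical placeholders, since they depend on the particular 8-bit microcontroller used; only the t1 = 66 ms and t2 = 1057 ms timings come from the text, and the interrupt service routines (not shown) are assumed to set pir_edge and increment wdt_wakeups.

#include <stdint.h>
#include <stdbool.h>

/* Hypothetical hardware-abstraction helpers; the real firmware accesses the
   microcontroller registers directly. */
extern void mcu_sleep(void);             /* enter power-down mode until an interrupt */
extern void ext_int_enable(bool on);     /* enable/disable the PIR rising-edge interrupt */
extern void wdt_start(uint16_t ms);      /* arm the watchdog timer as a wake-up source */
extern void wdt_stop(void);
extern void md_out_write(bool level);    /* drive the module output (md_out) */

volatile bool    pir_edge = false;       /* set by the external (PIR) interrupt ISR */
volatile uint8_t wdt_wakeups = 0;        /* incremented by the WDT ISR */

int main(void) {
  md_out_write(false);
  ext_int_enable(true);

  for (;;) {
    mcu_sleep();                         /* ~50 nA until a PIR edge or WDT time-out */

    if (pir_edge) {                      /* animal movement detected */
      pir_edge = false;
      ext_int_enable(false);             /* ignore further PIR edges for now */
      md_out_write(true);                /* start the output pulse */
      wdt_wakeups = 0;
      wdt_start(66);                     /* t1: pulse width */
    } else if (wdt_wakeups == 1) {
      md_out_write(false);               /* end of pulse */
      wdt_start(1057);                   /* t2: masking period, PIR still ignored */
    } else if (wdt_wakeups >= 2) {
      wdt_stop();
      ext_int_enable(true);              /* ready for the next detection */
    }
  }
}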
The µCPU procedure (Algorithm A2, Appendix B) handles two tasks: deciding on the battery path (via diode D3 or the relay contacts, Figure 4) and allowing µCLPPU to measure the battery voltage using the voltage divider of the PU (R3 and R4, Figure 4). The procedure begins by resetting relay K1, ensuring the system is initially supplied through the diode. It then measures the battery voltage after putting the voltage divider in service. If the voltage is below the threshold V_B1thL, it sets the relay to bypass the diode and then sleeps immediately, awaiting external triggers for wake-up. The first ext_trig signal, sent by µCLPPU, triggers the device to count the ext_trig interrupts within a period of 512 µs. Based on the number of interrupts received, it then decides what action to take: (1) check the battery voltage and change the relay state if necessary; (2) put the voltage divider in service, allowing the main µC to measure the battery voltage by itself; (3) disconnect the voltage divider from the battery; or (4) do nothing. It is important to mention that µCLPPU uses the same signal to power the temperature sensor. This approach was taken to achieve two goals: (1) dealing with the limited number of main µC pins and the absence of a communication bus while preventing µCPU from consuming any unnecessary power during the temperature measurement period and (2) allowing µCLPPU to command µCPU to perform different tasks using merely one pin. For additional information, including details on the battery voltage measurement, please refer to Appendix B.
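The command-decoding idea (counting ext_trig pulses within a 512 µs window) could be sketched in C as follows. The helper names are hypothetical, and the exact mapping of pulse counts to actions (1)–(4) is illustrative, since Algorithm A2 is not reproduced here.

#include <stdint.h>
#include <stdbool.h>

/* Hypothetical helpers standing in for register-level code. */
extern uint8_t count_ext_trig_pulses_for_512us(void);  /* count edges in a 512 us window */
extern void    check_battery_and_update_relay(void);   /* measure V_B1, set/reset relay K1 */
extern void    divider_connect(bool on);               /* switch the R3/R4 divider in or out */
extern void    sleep_until_ext_trig(void);

void ucpu_command_loop(void) {
  for (;;) {
    sleep_until_ext_trig();                     /* the first ext_trig pulse wakes the device */
    switch (count_ext_trig_pulses_for_512us()) {
      case 1:  check_battery_and_update_relay(); break;  /* action (1) */
      case 2:  divider_connect(true);            break;  /* action (2): let the main uC measure */
      case 3:  divider_connect(false);           break;  /* action (3) */
      default:                                   break;  /* action (4): do nothing */
    }
  }
}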
The system performs three main tasks: collecting daily sensory data (S1), recording a daily video (S2), and recording a motion-triggered video upon animal entry (S3). As illustrated in Figure 5, the main µC is either directly involved in performing these tasks or manages the relevant processes.
Following initialization, it checks the battery status with the help of µCPU and goes into power-down mode (a specific sleep mode with minimal peripherals) indefinitely if the battery energy is insufficient. Otherwise, normal processing begins by configuring RTC1 to alarm every hour (T1) and enabling motion detection interrupts. Alarm T1 wakes up µCLPPU every hour to check whether it is time to perform S1 or S2, in addition to measuring the battery voltage and temperature. For S1, data such as the hourly motion detection (MD) counts and the battery voltage are written to a file created on SD card 1. For S2, a short daily video is recorded with the help of the video acquisition (VA) unit. After a task is handled, it is removed from the request queue, and µCLPPU returns to power-down mode if there are no more tasks to handle.
When µCLPPU exits sleep mode due to the reception of a signal (md_out) from any of the MD modules (MDA, MDB, or MDC), it increments the relevant MD counter and goes back to sleep if VA is not permitted. Otherwise, it immediately turns on the SBC. If MDC is triggered, it also asks the SBC to start recording a video. Receiving a signal from the other MD modules does not trigger video recording until MD module C detects that an animal has entered the filming box. If no animal is detected inside the filming box within 10 s, the SBC is powered off, and µCLPPU returns to sleep. At the end of the VA procedure, the second timer of RTC1 (T2) is set to alarm µCLPPU in one hour, the SBC is powered off, and µCLPPU goes into sleep mode. Until alarm T2 is received, no VA will be permitted, but counting of the MD signals continues regardless.
The video acquisition procedure running on the SBC is very similar to that in [46], except that in the current system, the SBC is powered off by µCLPPU, and two modes of video recording are available: daily and motion-triggered. The microcontrollers were programmed in C, using the MPLAB X IDE for the MD modules and the power unit, and the Arduino IDE for the main microcontroller. Adopting the approach suggested in [46], bare-metal programming was used to address the slow booting and shutdown problems reported there. The code running on the SBC was written in Free Pascal and compiled on the Ultibo platform. Readers can find details on the VA procedure in [46], including the use of libraries such as MMAL API, MMC, and VC4.

2.9. Final Configuration

All electronic components and boards (such as the camera, IR emitters, and PIRC), except PIRA, PIRB, and the temperature sensor, are carefully arranged within a compact electronic box (Figure 6a). This box, alongside the battery, is then securely positioned within a customized enclosure (Figure 6b) of protection class IP68 to protect them from water, snow, dust, and wildlife. A precisely crafted opening in the electrical enclosure allows the camera to film animals entering the filming box. This aperture is shielded by a protective window made of borosilicate glass with an anti-reflective coating. The window has a high transmission rate within the wavelength range of 0.35 to 2 µm, encompassing both the visible and near-infrared spectra (0.4–1.4 µm; see [44] for optical details). The window is attached to the electrical enclosure with epoxy, complemented by waterproof tape and silicone gel to establish a robust watertight seal. Three supplementary openings are created: one for the PIRC sensor, which is shielded by an IR plastic cover, and two for the cables of the remaining PIR sensors placed outside the electrical enclosure. These openings are sealed using cable glands of a high protection class. The battery is secured to the enclosure's bottom plate with galvanized aircraft cable and plastic tie wraps.
The external PIR sensors (PIRA and PIRB), including one paired with a temperature sensor, are assembled on a compact board and housed within a small box. These modules are affixed to entry pipes A and B (G and H in Figure 6c). The associated cables are protected by a combination of electrical conduits and flexible tubes enveloped in expandable stainless steel sleeves. This configuration allows for rotation of the sensors along with the pipes, which facilitates field deployment. Finally, the filming box and the electrical enclosure are mounted on a U-shaped aluminum frame. The frame consolidates all system components, making field deployment easier and, more importantly, enabling precise alignment of the camera with respect to the filming box.
We made two versions of the system: ArcÇav 2.1, equipped with MD-M1 and RTC-M1 modules, and ArcÇav 2.2, equipped with MD-M2 and RTC-M2 modules. Both versions require two RTC modules but can be configured with either one or three MD modules. For the current study, both versions were configured with three MD modules for field trials.

3. Laboratory Tests

3.1. Motion Detection Module

First, we assessed the performance of both models of the motion detection module (MD-M1 and MD-M2, Figure 3) without the PIR sensor. To accomplish this, we simulated the output of the PIR sensor (pir_out) by generating a sequence of input signals using a push button or a signal generator. In the case of MD-M1 (Figure 7a,b), the rising edge of the input signal triggered the module to generate a one-second pulse. Subsequent input pulses neither retriggered the module nor extended the output until the output had first returned to a low state.
The MD-M2 model (Figure 7c,d) produced a signal with a pulse width (t_p) of approximately 81 ms following a delay (t_d) of less than 13 ms. Similar to MD-M1, this model is non-retriggerable, with a masking period (t_m) of approximately 1.3 s. Inputs with a pulse width smaller than 10 ms (t_pmin) did not yield an output, as they were considered noise. This approach filters out weak activations of the PIR sensor that may arise from sources other than animal movement.
The time delay introduced by this module was negligible but could be further reduced to a few microseconds by decreasing t_pmin or eliminating the input signal checking process (lines 14–15 of Algorithm A1), and also by using a higher oscillator frequency, such as 4 MHz instead of 31 kHz. However, this can lead to a significant increase in current consumption.
Model MD-M1 without the sensor exhibited a continuous consumption of approximately 27.5 µA, and the presence of the input signal had no significant impact on this current. However, the situation was more complex for MD-M2, as depicted in Figure 8a,b. Immediately after a rising edge in the input (pir_out), the current sharply rose to a peak of 13.9 µA, followed by a rapid drop to 1.2 µA. Two more pulses occurred after intervals t2 (62 ms) and t4 (1.19 s). This pattern corresponds to the circuit design, where each pulse represents a wake-up event: the first pulse results from a rising edge in the input, and the subsequent two from WDT interrupts. During periods t2 and t4, the µC was in sleep mode with the WDT running. Consequently, the current during these periods was significantly higher than during deep sleep (before t1 and after t5, when all peripherals, including the WDT, were off and the measured current was 30–50 nA). Figure 8c illustrates the results of a test conducted to confirm that the MD module remained insensitive to input signals during the masking period.
We also examined the electrical characteristics of the PIR motion detection sensor (Figure 8d). When the sensor detected a moving target within its viewing angle, its output switched to the ON state, drawing approximately 2.5 µA, including the current drawn by resistor R1 (Figure 3). In the absence of a detected target for a period of 4–5 s, the sensor switched to sleep mode, drawing a very small current of around 1 µA.

3.2. Real-Time Clock Module

The first model of the RTC module (RTC-M1) exhibited a standby current ranging between 100 µA and 140 µA. However, in the other two models (RTC-M2 and RTC-M3), this standby current decreased significantly to approximately 2.23 µA, with about 1 µA attributed to U2 and the remaining portion primarily drawn by U1 (Figure 2). Notably, there was negligible variation between the current consumption of RTC-M2 and RTC-M3. During temperature conversion cycles, occurring every 10 s, the current spiked to 220 µA, as depicted in Figure 9a. Because the conversion time was as small as 10 ms (Figure 9b), the average current consumption only increased to 2.45 µA. It is noteworthy that the U1 datasheet specifies a typical conversion time of 125 ms (with a worst-case scenario not exceeding 200 ms), a maximum temperature conversion current of 575 µA, and a conversion cycle repeated every 64 s. Repeated testing consistently yielded the same results. Recalculating the average current consumption based on the values outlined in the datasheet for the worst-case scenario, the anticipated average current consumption for either RTC-M2 or RTC-M3 is approximately 4 µA.
Given the imperative of maintaining a voltage of ≥2.3 V at the BAT pin for proper functioning of the U1 chip, we scrutinized this voltage (rtc_vbat, Figure 2) across various scenarios: when V_CC1 was present (On), absent (Off), and during On–Off and Off–On transitions. The results of these tests are given in Figure 9c,d. This voltage can be represented by the following equation:
rtc\_vbat = \begin{cases} V_{CC1} - V_{D1}, & \text{if } V_{CC1} \geq V_{th} \\ V_{B2}, & \text{otherwise} \end{cases} \qquad (3)
In normal operation, during timekeeping, rtc_vbat (the voltage at the BAT pin) was approximately 3.2 V, with a minor dropout of around 100 mV across resistor R4 and diode D1 (Figure 2). Under these conditions, the RTC module drew a current of less than 3 µA, resulting in an extremely low dropout on the resistor (3 µA × 10 Ω = 30 µV). Additionally, we confirmed that the voltage drop across diode D1 (V_D1) was approximately 100 mV for currents below 5 µA. However, during temperature conversion, the RTC current surged to approximately 220 µA, leading to a dropout of around 250 mV across D1 and an insignificant dropout of 2 mV across the resistor. Whenever V_CC1 dropped below V_th, the BAT pin was connected to the backup battery via Q1, facilitated by U2 asserting a low voltage on its reset pin, which is linked to the gate of Q1.
An interval of approximately 184 ms elapsed between power-on of the main supply (V_CC1) and the restoration of rtc_vbat to 3.2 V. This delay stemmed from an internal delay within U2, which is designed to overcome glitches. Throughout this period, a current of around 650 µA charged the backup battery. Figure 9e,f illustrates the outcomes of testing RTC-M3; it exhibited behavior akin to that of RTC-M2, with two notable exceptions. First, rtc_vbat was either V_CC1 (when V_CC1 > V_th) or V_B2, owing to minuscule dropouts on Q1, Q2, and Q3. Second, during the power-on time (t_don), the backup battery remained uncharged, as transistors Q2 and Q3 remained inactive. It is noteworthy that Equation (3), with V_D1 replaced by 0, can be used to represent the voltage rtc_vbat of RTC-M3.

3.3. Energy Expenditure

In addition to the standby mode, we measured the total energy consumption of the proposed system for four distinct tasks: hourly routine, daily routine, motion detection counting, and video acquisition. The average current at standby, inclusive of the RTC temperature conversion current, remained below 8 µA and 10 µA (equivalent to 60 µW) for the second version of the system (ArcÇav 2.2) with one and three motion detection modules, respectively. This current is merely one three-thousandth of the current consumed by the system presented in [47], roughly one hundredth of that consumed in [48], and one seventy-fifth of the amount consumed in [46].
Triggered by the rtc_int signal, the main µC awakened every hour to perform the hourly routine (measuring battery voltage and temperature, Section 2.8). The main µC took 6 ms to wake up, capture the RTC alarm, and reset it. Shortly after, the battery was connected to the voltage divider, enabling the main µC to measure the battery voltage (Figure 10a). This was followed by the measurement of the ambient temperature, which was a much lengthier process. The second pulse of Ch3 (vbat_h) indicates the second voltage measurement, this time performed by the PU. Whenever the voltage was below the threshold, diode D3 was bypassed by relay K1 (Figure 4). The entire process lasted approximately 266 ms with an average current of around 4 mA.
The initial step of the daily routine (Figure 10b) started after powering up the SD card. This step, which lasted for 625 ms with an average current of about 28 mA, included SD card initialization followed by the recording of collected data such as ambient temperatures and motion detection counts onto the SD card. The last steps of the daily routine (lasting 885 ms) were identical to the hourly routine. It is worth noting that the majority of the energy supplied to the SD card was utilized for the initialization purpose.
We examined how the main µC (Atmel ATmega328P) responded to signals from the MD module. Upon receiving a rising-edge signal from this module (md_out), the main µC woke up from sleep, incremented the relevant counter depending on the triggered sensor, and returned to sleep. This process took 6.5 ms, and the current reached a peak of around 3.6 mA (t1 in Figure 10d). The current then reverted to the standby level within 3.6 ms. The subsequent peak in the current indicated another wake-up event, triggered by the falling edge of md_out. The ATmega328P has only two pins with an external interrupt feature, which can be configured to trigger on a falling or rising edge. These two pins were allocated to the interrupts received from the RTC module and one of the MD modules. Therefore, for the other two MD modules, we utilized two other pins featuring only interrupt-on-change functionality. Results for the scenario where the motion detection output was connected to a pin with interrupt-on-change capability are presented in Figure 10c,d. In this case, the µC had to wake up twice for each md_out pulse: on the rising and falling edges. For these pins, the rising edge was identified by the program running on the µC.
When the conditions outlined in Section 2.8 are satisfied, the detection of movement can lead to video acquisition. Figure 10e illustrates test results for this case. There was a delay of approximately 20 ms between the rising edge of the MD module output and powering on the SBC, whose startup delay was approximately 2.4 s. To gauge the startup delay of the SBC, one of its unused pins (Ch3 in Figure 10e) was temporarily configured as a digital output. This pin was set high as soon as the startup was completed and forced back low at the end of video acquisition. There was a sharp increase of about 240 mA in the current at the end of the startup, corresponding to the activation of the IR emitters. When the camera was capturing a video, the current surged again, fluctuating between 440 mA and 650 mA, with an average of 545 mA.
To estimate the total energy expenditure of the system over a recording season (typically 9–10 months), we calculated the daily energy consumption for each task (Table 1). The calculations were done for two versions of the system (ArcÇav 2.1 and ArcÇav 2.2), each with two variations: one with a single MD module (MDC) and the other with three MD modules (MDA, MDB, and MDC). For these calculations, we assumed that video acquisition (VA) is triggered 24 times (once an hour) per day, recording 8-second clips (i.e., S3 or VA2). This represents the maximum possible number of VAs, as the interval between subsequent videos is set to one hour. The system also captures a three-second video (i.e., S2 or VA1) daily at a predetermined time. To estimate the energy required for the motion detection counting task, we assumed that one or more animals continuously moved inside the filming box and the entrance pipes for a total of one hour per day.
The total energy consumption of ArcÇav 2.2 was significantly lower than that of ArcÇav 2.1: by 17% and 14% for the system with three and one MD modules, respectively (Table 1). In particular, the standby mode of the second version consumed about one thirtieth of the energy of the first version. The energy consumed by MD counting for ArcÇav 2.1 was slightly less than 1% of the total energy consumed by the entire system. For the same task, ArcÇav 2.2 consumed several times less energy: around 30 µAh and 77 µAh per day for the system with one and three MD modules, respectively. This amount is negligible compared to the total system consumption. The combined energy consumption of the MD counting task and standby mode for ArcÇav 2.2 was less than 1% of the total energy consumed by the entire system, achieving one of the goals of this study. For a recording season of 300 days under the described conditions, the total energy consumed by ArcÇav 2.2 with three MD modules should be approximately 9.9 Ah, of which a negligible amount (67 mAh) will be used in standby mode, and almost a third of that amount will go to the MD counting task.
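As a consistency check on these seasonal figures (300 days, values taken from the text and Table 1):

E_{MD} \approx 77\,\mu\mathrm{Ah/day} \times 300\,\mathrm{days} \approx 23\,\mathrm{mAh} \approx \tfrac{1}{3} \times 67\,\mathrm{mAh}, \qquad E_{total} \approx 9.9\,\mathrm{Ah} \;\Rightarrow\; \approx 33\,\mathrm{mAh/day}.

The 67 mAh standby figure spread over the 7200 h season corresponds to an average standby current of roughly 9 µA, consistent with the sub-10 µA value reported at the beginning of this section.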

4. Field Results

To study the behavior of lemmings beneath the snow, eight units of ArcÇav 2.1 (all equipped with three MD modules) were installed on Bylot Island, Nunavut, Canada (73° N, 80° W) in August 2021; a detailed description of the study site is given in [51].
The deployment of units took place at sites exhibiting a likelihood of use by lemmings, such as those with deep snow accumulation and with evidence of lemming activity during the preceding winter (such as winter nests and fecal remains [51]). We retrieved data from these units 10 months later in late May or early June 2022. In August 2022, we further deployed ten units of ArcÇav 2.1 and two units of ArcÇav 2.2 at the same site and retrieved data in May or June 2023. For sample figures of the retrieval process, see Appendix C.
During the first recording season, five units operated throughout the entire period (two stopped recording data after a few days and another one was not set up properly). The units took a total of 981 motion-triggered videos, 384 of which included an animal (brown lemming [Lemmus trimucronatus], collared lemming [Dicrostonyx groenlandicus], or ermine [Mustela richardsonii]; see Table 2). These videos were classified into three categories: (1) videos featuring animals with sufficient quality for species identification (designated as "High quality"); (2) videos showing animal presence but with poor image quality, hindering species identification (termed "Low quality"); and (3) videos devoid of any animal presence. In some cases, animals were filmed while staying inside pipe A, and those recordings were categorized as "Low quality". Samples of animal-containing videos are provided in the Supplementary Materials (see Video S1).
Nearly 40% of the videos featured an animal (see samples of high-quality images in Figure 11). Approximately 16% of animal-containing videos were of low quality due to issues such as blurred images (largely attributed to minimal camera–animal distance) and poor views (e.g., partial visibility due to placement within entrance pipes or dead zones of the box). These results represent a noticeable improvement over those reported in [46], where these percentages were 30 to 35%.
The system was programmed to capture a short daily video, gather ambient temperature readings, and record hourly motion detection (MD) counts, all of which were stored in a text file generated once per day (see Section 2.8). We present an overview of the number of files written to the SD cards and the duration of system operation for each unit (Table 3). Each day of operation should have yielded one daily routine video (DRV) and one daily text (DT) file, but occasional absences of these files were observed due to a programming bug in the main µC. This was rectified for units deployed in 2022, and the issue nearly disappeared (Table 3), except for unit U3 (some missing daily videos) and unit U6 (missing daily text files). The source of this problem was likely loose connections in the SD card slots, possibly due to temperature-induced contraction or expansion or improper card installation during setup.
During the second recording season (2022–2023), only 13 MT videos were recorded per unit on average compared to 140 in the previous season even though more systems were deployed (12 vs. 7, respectively; Unit 3 was excluded in 2021–2022). Very few videos featured an animal, suggesting a drastic decline in the lemming population between these two years, which was confirmed by summer trapping [52]. Analysis of daily videos recorded during the winter 2022–2023 revealed no sign of animal visits, such as fecal remnants or vegetation transport, underscoring this population decline.
Examining the seasonal pattern of motion detection (MD) counts can provide quick insights into the intensity of species activity and its temporal distribution (Figure 12). We see a strong decline in animal detections from August to October 2021 and a near absence of detections throughout the winter, which is indicative of a large decline in animal activity or numbers at the study site. This was confirmed when we related the number of motion detections to the number of videos recorded by the same unit during the same month (Figure 13). We found a strong positive relationship between the monthly MD counts and the monthly video counts, which was highly significant (p < 0.001) and had a correlation coefficient of 0.78 (R² = 0.61, N = 50). The large variation around the predicted line can be partly explained by the fact that video recordings were limited to one per hour, whereas all motion detections were counted. Therefore, an animal passing through the filming box repeatedly or exploring the box for several seconds (or minutes) during a one-hour period would yield many motion detections but only one video. In contrast, an animal passing through only once and remaining inside the box for a few seconds would still result in one video but significantly fewer detections than in the previous case.
Temperature recording by the ArcÇav 2 system can also provide useful information on the thermal environment experienced by lemmings under the snow. Minimum subnivean temperatures exhibited considerably less short-term temporal variation than minimum air temperatures at the study site and remained mostly warmer from December to April (Figure 14), likely due to snow accumulation atop the units. From August 2021 to June 2022, the highest minimum temperature recorded was −15 °C (by unit U1), while the lowest was −28 °C (by units U5 and U6 in February 2022). In 2022–2023, unit U1 again registered the highest minimum temperature at −15 °C, whereas units U5 and U10 experienced a colder winter, with minimum temperatures of −34 °C and −29 °C, respectively.

5. Discussion

Building upon our previous endeavor [46], which successfully captured footage of lemmings under the snow in the High Arctic—a more demanding environment than the Subarctic [42]—our latest monitoring system offers several enhancements.
First and foremost, we have achieved a reduction in standby power consumption of over 70-fold, significantly enhancing energy efficiency, which was the main objective of this study. This frees up a considerable portion of the battery capacity that was previously dedicated to supplying standby current in the system proposed in [46] (ArcÇav 1). The saved energy can now be used to record more videos (e.g., by reducing the one-hour interval between consecutive video recordings to 40 min). Generally speaking, the frequency of video recordings in a camera-trapping system is constrained by setting a minimum time interval between consecutive recordings, a measure taken to circumvent power limitations. However, this approach carries the inherent risk of missing valuable information on potential animal presence during these intervals. Besides improving power efficiency, we have introduced a novel metric for monitoring animal activity: the motion detection count. This metric is measured by a subsystem with minimal energy consumption, only about one hundredth of the energy required for video acquisition. MD counts can provide supplementary information on the activity level in the filming area between video recordings, thereby eliminating the risk of losing such valuable data. Statistical analyses strongly support that the MD count can be a reliable index of animal activity, validating our hypothesis in this study. These two improvements, introduced in ArcÇav 2, can alleviate the constraints on the frequency of video recordings inherent in ArcÇav 1 as well as in existing camera-trapping systems. Furthermore, we have noticeably decreased the proportion of videos featuring no animal, blurred images, or obscured views.
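As a back-of-the-envelope illustration (our own arithmetic based on Table 1, not a measured result): the ArcÇav 2.2 configuration with one MD module spends about 0.18 mAh per day in standby, so a standby draw seventy times larger (comparable to the previous system) would cost roughly 12.6 mAh per day, while one motion-triggered video costs about 32.1/24 ≈ 1.3 mAh. The difference therefore corresponds to roughly nine additional motion-triggered videos per day:

\[
70 \times 0.18~\text{mAh/day} \approx 12.6~\text{mAh/day}, \qquad
E_{\text{video}} \approx \frac{32.1~\text{mAh}}{24} \approx 1.3~\text{mAh},
\]
\[
\frac{12.6 - 0.18}{1.3} \approx 9~\text{additional motion-triggered videos per day}.
\]

The exact figures depend on video length and on the number of MD modules installed.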
We attribute the increased frequency of videos featuring animals with the ArcÇav 2 system over previous versions [46] to a series of enhancements. These include relocating entrance B (see Appendix A), adjusting the positioning of PIR sensor C (shifting it leftward to avoid triggering due to animal movements outside the filming box in front of pipe A; see Figure A2), integrating additional MD modules to detect animal activity within entrance pipes A and B, and implementing a new triggering algorithm that activates the SBC upon animal entry into pipes A or B. When a lemming passes through either entrance pipe at its typical pace, it requires 1–2 s to enter the filming box. This duration partially compensates for the system startup delay and significantly increases the chance of capturing a video while the animal is still inside the filming box.
Similarly, the reduced occurrence of low-quality videos can be attributed to several modifications. These include relocating entrance B to ensure its visibility within the camera field of view, decreasing the ratio of the camera dead zone to the observable area within the filming box, and extending the length of the filming box. The effectiveness of these adjustments is further complemented by the new triggering algorithm. Additionally, employing a larger number of IR emitters with increased radiant intensity, wider half-intensity angles, and improved positioning around the camera has contributed to the acquisition of higher-quality videos compared to those presented in [46].
The utilization of ArcÇav 2 promises a deeper understanding of lemming winter ecology, providing insights into their activity patterns, social behaviors, and reproductive activities. Such data are invaluable for elucidating the population dynamics of rodents, which are of critical importance to the food chain in many ecosystems. While acknowledging that the filming box may influence natural behaviors of animals and potentially limit our inferences, we believe it currently offers the most reliable means of gathering such information.
ArcÇav 2 records a short video every day, which can provide significant additional information. For instance, daily videos can aid in monitoring environmental changes such as foggy conditions and timing of frost formation on the camera lens. Another valuable data point acquired by ArcÇav 2 is the subnivean temperature, which, when combined with other data such as MD counts and MT videos, provides a foundation for testing hypotheses related to the impact of snow depth on animal population dynamics and the extent of such influence.
Another benefit of ArcÇav 2 is that a simplified version, without the camera and related components, can serve solely as an MD counting tool: a significantly cheaper (approximately 200 USD) and more compact system. Such a unit can fit into a small electronic box installed on a pipe, like parts G or H depicted in Figure 6. This streamlined setup would require a substantially smaller battery, potentially with a capacity as small as 100 mAh, to operate for an entire year (see Table 1). The simplified version of the equipment could therefore be deployed in large quantities, either independently or alongside a smaller number of ArcÇav 2 units. This approach has the potential to yield larger sample sizes, thereby enhancing the reliability and validity of research findings regarding the population dynamics of studied species at a larger spatial scale.
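A rough check of the 100 mAh figure follows from the daily energy budget in Table 1. The sketch below uses the ArcÇav 2.2 entries with one MD module; which bookkeeping tasks a camera-less variant would retain is our assumption, and real batteries derate in the cold, so this should be read as an optimistic upper bound rather than a specification.

```python
# Rough battery-life check from a daily energy budget (values taken from Table 1,
# ArcÇav 2.2 with one MD module). The task list kept by the camera-less variant
# is an assumption for illustration.

def battery_life_days(capacity_mah: float, daily_mah: float) -> float:
    """Days of operation for a given battery capacity and daily consumption."""
    return capacity_mah / daily_mah

standby = 0.180          # mAh/day
md_counting = 0.030      # mAh/day
hourly_routine = 0.007   # mAh/day
sd_card_writing = 0.005  # mAh/day

md_only_daily = standby + md_counting + hourly_routine + sd_card_writing  # ≈ 0.22 mAh/day
print(f"MD-only unit, 100 mAh battery: {battery_life_days(100, md_only_daily):.0f} days")
```

With these figures, a 100 mAh cell would last on the order of 450 days, comfortably more than one year.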
Despite the proven utility and reliability of ArcÇav 2 in studying the lives of Arctic subnivean animals, as demonstrated by our experimental results, there are areas where this monitoring system can still be enhanced. While the modifications introduced in ArcÇav 2 increased the percentage of videos featuring animals compared to [46], a significant proportion of videos still lack animal presence. This may result from several scenarios, such as animals swiftly traversing the pipes and filming box, or briefly entering the box before exiting through the same pipe in less than 2–3 s. Such events activate motion detection module C and thereby trigger video recording, but because of the relatively long startup delay of the system (≈3 s), the animal has already left by the time recording begins. The RPi0 (the SBC used for video acquisition) normally runs a full operating system, which results in a very slow boot time (over 70 s [46]) that is unacceptable for our application. The approach adopted in this study to address this problem yielded a significantly shorter boot time (≈3 s), but this is still not ideal. Much faster triggering of video recordings would be possible if the RPi0 were kept powered on, but this would prohibitively increase energy consumption and prevent recording for more than a few days or weeks. Another solution would be to lengthen the inlet pipes to compensate for the delay between motion detection and the start of video recording; however, longer pipes may complicate field installation. Alternatively, a technically demanding but more effective solution would be to reduce the startup delay to less than one second, or even less than 100 ms. Implementing this idea with the current generation of SBCs, however, poses significant, perhaps insurmountable, challenges for technical reasons beyond the scope of the current study.
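To illustrate why keeping the RPi0 permanently powered (mentioned above) is not viable, assume an idle draw on the order of 100 mA, a ballpark figure we introduce here purely for illustration and did not measure in this study. The resulting daily consumption dwarfs the roughly 33 mAh per day of the complete ArcÇav 2.2 system (Table 1):

\[
100~\text{mA} \times 24~\text{h} = 2.4~\text{Ah/day}, \qquad
\frac{2400~\text{mAh/day}}{33~\text{mAh/day}} \approx 73.
\]

Under such a load, even a battery sized for a full winter season would be exhausted within days to a few weeks.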
Although animals briefly entering the filming box or quickly passing through it probably account for most of the empty videos, PIR sensors may also be triggered by unwanted stimuli. Two potential sources of such stimuli can be identified. First, sudden fluctuations in air temperature, caused, for instance, by gusts of cold or warm air passing through the entrance pipes, can activate the sensor if they induce a sufficient temperature variation. We confirmed such a scenario by simulating a rapid temperature change in laboratory experiments using a hot air blower. Adding an elbow to the end of the entrance pipes may solve this potential issue. Second, based on some MT videos captured by the system, large arthropods such as spiders may also trigger the PIR sensors; however, further testing is required to conclusively verify this observation.
Despite the superiority of RTC-M3 over RTC-M2, as supported by laboratory tests (see Section 3.2), we have not yet had the opportunity to deploy ArcÇav units equipped with this module in the field because the RTC-M3 was completed at a later stage of our project. The RTC-M3 is designed to eliminate the risk of charging the RTC backup battery, which can be problematic for non-rechargeable batteries. However, in applications where the system is not frequently powered on and off, this issue becomes less significant because the risk occurs only during the power transition (switching from the OFF to the ON state). Moreover, when the RTC backup battery is new, its voltage is high, resulting in a very low charging current. Finally, the charging period caused by the internal delay of chip U2 in the RTC-M3 circuit is quite short. Nevertheless, we suggest equipping future ArcÇav units with the RTC-M3, particularly if the system must be powered on and off frequently. Since the RTC-M3 eliminates the risk of charging the RTC backup battery, it prevents shortening of the battery lifespan and avoids issues such as leakage and rupture.

6. Conclusions

In this paper, we presented the design and testing of ArcÇav 2: an autonomous monitoring system tailored for collecting behavioral and activity data on small mammals beneath the snow in the Arctic. Building on our previous system (ArcÇav 1 [46]), the first autonomous camera system to successfully film lemmings in the High Arctic, ArcÇav 2 offers several enhancements, including a reduction in standby power consumption of over 70-fold. Our approach involves minimizing the energy consumption of each module and introducing a new real-time clock circuit with minimal current draw. We also developed a novel method for collecting complementary information on animal activity (i.e., MD counts) using passive infrared sensor signals. Although existing camera systems, to our knowledge, use this sensor solely to trigger video recording, our method efficiently gathers additional data from it while consuming an extremely small amount of energy. Camera-trapping systems generally limit the frequency of video recordings by setting a minimum interval between recordings to save power, at the cost of risking the loss of valuable information on animal presence during these intervals. MD counts offer supplementary data on activity levels between recordings, thereby eliminating this risk. To our knowledge, our system is the only camera system equipped with this feature.
With sufficient deployment, this tool could facilitate the estimation of small mammal habitat use and population dynamics through occupancy modeling, as demonstrated in prior studies [43,54]. Integrating MD counts as a complementary data source, coupled with motion-triggered videos, could yield more precise occupancy models and population estimates than relying solely on videos. A complementary analysis combining information from videos acquired by the ArcÇav system with data from more traditional methods (live-trapping and winter nests of lemmings [14]) from 2018 to 2023 shows good performance of the system [52].
Our equipment provides biologists with an unprecedented opportunity to study subnivean animals throughout the entire winter season. Moreover, while initially developed for studying lemmings in remote areas with harsh climates such as that of the High Arctic, this technology holds promise for studying various small mammals in less extreme environments. It is worth noting that before deploying units to the Arctic, we conducted preliminary tests of two prototypes in the Montmorency Forest (north of Quebec City, QC, Canada) during the winters of 2020 and 2021, and we successfully monitored the subnivean behavior of small animals such as voles, shrews, and squirrels (unpublished work).
Despite some shortcomings (discussed in Section 5), the current ArcÇav system enables year-round monitoring of small animals in their natural habitats, even under extremely harsh environmental conditions. If deployed on a large spatial scale, it will provide biologists with valuable information needed to address longstanding questions in animal ecology, such as the population dynamics of Arctic rodents. Furthermore, we hope this study has laid the groundwork for further research and encourages the development of more resilient and power-efficient monitoring systems tailored to unique environments, such as the subnivean space with its harsh conditions.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/electronics13163254/s1, Video S1: Sample videos recorded by ArcÇav 2 at Bylot Island.

Author Contributions

Conceptualization, D.K. (electronics), M.P. (ecology), X.M. (electronics) and G.G. (ecology); methodology, D.K. (hardware and software design and implementation) and M.P. (ecological aspects and field deployment); software, D.K.; validation, D.K. (laboratory tests and field results) and M.P. (field results); formal analysis, D.K. and M.P.; investigation, D.K. (laboratory tests) and M.P. (fieldwork); resources, X.M., G.G. and C.I.-C.; writing—original draft preparation, D.K. and G.G.; writing—review and editing, D.K., M.P., X.M., G.G. and C.I.-C.; visualization, D.K.; supervision, X.M., G.G. and C.I.-C.; project administration, X.M. and G.G.; funding acquisition, X.M. and G.G. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Sentinel North program of Université Laval, funded by the Canada First Research Excellence Fund, the Natural Sciences and Engineering Research Council of Canada (NSERC), the Network of Centers of Excellence, ArcticNet, Polar Knowledge Canada, the Polar Continental Shelf Program of Natural Resources Canada, and the Canada Research Chair Program in Infrared Vision.

Institutional Review Board Statement

Field testing on Bylot Island was approved by the Animal Welfare Committee of Université Laval (protocol Nos. 2019-253, VRR-18-050) in accordance with the guidelines of the Canadian Council on Animal Care and by Parks Canada (permit No. SIR-2021-39399).

Data Availability Statement

The original contributions presented in the study are included in the article and Supplementary Materials.

Acknowledgments

The authors would like to thank Marco Béland for providing them with access to his workshop and tools and for assisting with the preparation of the filming boxes and customizing the electrical enclosures and electronic boxes. They also thank Louis-Pierre Ouellet and Emy Gagnon for the analysis of lemming videos and Gabriel Bergeron and Camille Gaudreau-Rousseau for their valuable help in the field while collecting data and deploying the camera systems.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ADC: Analog-to-Digital Converter
API: Application Programming Interface
CMOS: Complementary Metal–Oxide–Semiconductor
CSI: Camera Serial Interface
DI: Digital Input
DO: Digital Output
ExtInt: External Interrupt
GPIO: General-Purpose Input/Output
IDE: Integrated Development Environment
I2C: Inter-Integrated Circuit
INT: Interrupt
IoC: Interrupt on Change
LED: Light Emitting Diode
LPPU: Low-Power Processing Unit
µC: Microcontroller
MD: Motion Detection
MMAL: MultiMedia Abstraction Layer
MOSFET: Metal–Oxide–Semiconductor Field Effect Transistor
NIR: Near Infrared
PIR: Passive Infrared
PU: Power Unit
RPi0: Raspberry Pi Zero
RTC: Real-Time Clock
SBC: Single-Board Computer
SCL: Serial Clock
SDA: Serial Data
SPI: Serial Peripheral Interface
VA: Video Acquisition
VAU: Video Acquisition Unit
WDT: WatchDog Timer

Appendix A

Figure A1. Filming box, crafted from 1.6 mm (1/16 inch) aluminum sheets. The right image is a cross section of the box along a vertical plane passing through the middle of pipe A. Dimensions (L × W × H): 40 × 20 × 20 cm. A small animal may use pipes A and B to freely enter or exit the box. C: location of the opening for the camera and sensor; a detailed layout is given in Figure A2.
Figure A2. (a) Layout of the camera side of the filming box. It shows the actual positions of the camera, motion detection sensor, and infrared emitting diodes (IR emitters). H1 and H2 represent the heights of the camera and the sensor, respectively. (b,c) Lateral and top views of the filming box and the camera field of view. The gray areas are the dead zone of the camera. (d) Camera and near infrared (NIR) illumination board. The system uses nine TSHG6400 IR emitters featuring a peak wavelength of 850 nm (see [44] for the rationale behind selecting this wavelength), a forward voltage of 1.5 V, an angle of half intensity spanning ±22 degrees, and a radiant intensity of 45 mW/sr (at 100 mA). The IR emitters are positioned in the vicinity of the camera to achieve a relatively uniform distribution of NIR light. To ensure proper functioning, each trio of IR emitters (D1–D3 in the center, D4–D6 on the right, and D7–D9 on the left) is connected in series and individually supplied through a separate driver circuit, configured to deliver 65 mA to D1–D3 and 90 mA to the remaining emitters.
Figure A3. ATmega328P microcontroller board preparation. Numbers 1–4 indicate the components that should be removed. Removal of components 3 (jumper SJ1) and 4 (LED D3) from the board is necessary, whereas removal is optional for components 1 and 2. (a) Before modifications. (b) After modifications. A schematic of this board is given in [55].

Appendix B

Algorithm A1: Procedure of the µC of the motion detection module.
1:  procedure µCMD
      /* DI: Digital input, DO: Digital output, IoC3: Interrupt on change for GP3
       * ExtInt: External interrupt on GP2, SFE: Sleep forever, WDT: Watchdog timer
       * MD: Motion detection */
      /**** Initialization and Configuration ****/
2:      t1 ← 66 ms;  t2 ← 1057 ms;  t_pmin ← 10 ms
3:      state ← PWR_ON
4:      Define GP0 as md_out, GP2 as pir_out, GP3 as shdn
5:      Configure GP2 and GP3 as DI, GP0 as DO, and LFINTOSC (31 kHz) as clock source
      /**** Main procedure ****/
6:      md_out ← 0
7:      Wait a few seconds      ▹ Until other units are ready
8:      Disable WDT;  Enable IoC3;  Enable ExtInt
9:      Go to sleep
10:     while system is ON do
11:       if wake-up due to shdn interrupt then
12:         state ← SFE;  md_out ← 0;  Disable WDT, IoC3, and ExtInt
13:       else if state is MD and wake-up due to pir_out interrupt then
14:         if pir_out does not remain high within t_pmin then
15:           Return to sleep      ▹ Consider input as noise
16:         state ← WDT1;  md_out ← 1;  Set WDT for t1
17:         Disable ExtInt;  Enable WDT;  Clear WDT
18:       else if state is WDT1 and wake-up due to WDT interrupt then
19:         state ← WDT2;  md_out ← 0;  Set WDT for t2
20:         Enable WDT;  Clear WDT
21:       else if state is WDT2 and wake-up due to WDT interrupt then
22:         Disable WDT;  state ← MD;  Enable ExtInt
23:       Go to sleep
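To make the timing of Algorithm A1 concrete, the following host-side sketch models the debounce (t_pmin), output pulse (t1), and masking period (t2) as a simple event loop. It is a behavioral model for illustration only, not the µC firmware; the PIR event times and the helper name count_detections are our own inventions.

```python
# Host-side behavioral model of the MD-module state machine in Algorithm A1.
# PIR "events" are made-up (start time, pulse width) pairs in seconds.

T_PMIN = 0.010   # minimum PIR pulse width accepted as real motion (t_pmin = 10 ms)
T1 = 0.066       # width of the md_out pulse (t1 = 66 ms)
T2 = 1.057       # masking period that follows the pulse (t2 = 1057 ms)


def count_detections(pir_events):
    """Return (md_pulse_times, count) for a list of (start_s, width_s) PIR pulses."""
    md_pulses = []
    rearm_at = 0.0                       # time at which PIR interrupts are re-enabled
    for start, width in sorted(pir_events):
        if start < rearm_at:             # still inside the pulse + masking window: ignored
            continue
        if width < T_PMIN:               # too short: treated as noise (steps 14-15)
            continue
        md_pulses.append(start)          # md_out goes high for T1 ...
        rearm_at = start + T1 + T2       # ... then PIR input is ignored until re-armed
    return md_pulses, len(md_pulses)


if __name__ == "__main__":
    # An animal lingering in the box produces many PIR pulses but a bounded MD count.
    events = [(0.00, 0.5), (0.20, 0.3), (0.90, 0.4), (2.00, 0.5), (2.05, 0.002), (5.00, 0.6)]
    pulses, n = count_detections(events)
    print(f"{n} detections at t = {pulses}")
```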
Algorithm A2: Procedure of the µC of the power unit.
1:  procedure µCPU
      /**** Initialization and Configuration ****/
2:      Define GP0 as vbat_h, GP2 as ext_trig, GP3 as shdn
3:      Define GP1 as con_res_to_bat, GP4 as set_rel, GP5 as reset_rel
4:      Configure GP2 and GP3 as DI, GP0 as AI, and GP1, GP4, and GP5 as DO
5:      Configure ADC      ▹ ADC: analog-to-digital converter
      /**** Main procedure ****/
6:      set_rel ← 0;  reset_rel ← 1;  con_res_to_bat ← 0;  Wait PW_REL
7:      reset_rel ← 0;  rel_state ← RESET
8:      Wait a few seconds      ▹ Until other units are ready
9:      vbat_h ← getBatVol;  rel_state ← changeRelState
10:     Disable WDT;  Enable IoC3 and ExtInt
11:     Go to sleep
12:     while system is ON do
13:       if shdn interrupt then
14:         con_res_to_bat ← 0;  Disable WDT, IoC3, and ExtInt
15:         reset_rel ← 1;  Wait PW_REL
16:         reset_rel ← 0;  rel_state ← RESET;  Go to sleep
17:       if ext_trig interrupt then
18:         gp2int_cnt ← 1;  Set TMR0;  Enable TMR0
19:         while TMR0 is NOT overflowed do
20:           if ext_trig interrupt then
21:             gp2int_cnt ← gp2int_cnt + 1
22:         Disable ExtInt and TMR0
23:         if gp2int_cnt == 4 then
24:           vbat_h ← getBatVol;  rel_state ← changeRelState
25:         else if gp2int_cnt == 6 then
26:           con_res_to_bat ← 1;  Set WDT for 264 ms;  Enable WDT
27:         else if gp2int_cnt == 2 then
28:           con_res_to_bat ← 0;  Disable WDT
29:       if WDT interrupt then
30:         con_res_to_bat ← 0;  state ← 0;  Disable WDT
31:       gp2int_cnt ← 0;  Enable ExtInt;  Go to sleep

Change the relay state:
32: function changeRelState
33:     if 2 × vbat_h ≤ V_B1thL & rel_state is RESET then
34:       set_rel ← 1;  Wait PW_REL
35:       set_rel ← 0;  rel_state ← SET
36:     else if 2 × vbat_h ≥ V_B1thH & rel_state is SET then
37:       reset_rel ← 1;  Wait PW_REL
38:       reset_rel ← 0;  rel_state ← RESET
39:     return rel_state

Battery voltage measurement:
40: function getBatVol
41:     Disable ExtInt;  con_res_to_bat ← 1;  Wait 100 µs
42:     Enable ADC;  Wait t_ACQ      ▹ the acquisition time
43:     Start conversion
44:     while conversion is not done do
45:       Wait 4 µs
46:     Disable ADC;  con_res_to_bat ← 0
47:     return ADC result
To measure the battery voltage, one of the on-chip analog-to-digital converters (ADCs) of µCPU is used. The A/D conversion involves configuring the ADC module, acquisition, conversion, and reading the result. We set the µC oscillator frequency (F_OSC) to 4 MHz and the ADC clock source to F_OSC/8, resulting in a 2 µs A/D clock period (T_AD). The conversion takes 11 × T_AD (22 µs), but computing the acquisition time (t_ACQ) depends on a few parameters, such as elements R3, R4, and C3 of the PU circuit (Figure 4), the ADC amplifier settling time, the capacitance and resistance of the input pin, and the temperature. The reader may refer to the datasheet of the µC for details. The chosen t_ACQ = 100 µs accommodates worst-case scenarios.
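Putting these figures together, and including the 100 µs settling wait applied after the voltage divider is connected in getBatVol, a single battery-voltage measurement occupies the µC for only a fraction of a millisecond (our own summation of the values stated above):

\[
T_{AD} = \frac{8}{F_{OSC}} = \frac{8}{4~\text{MHz}} = 2~\mu\text{s}, \qquad
t_{\text{conv}} = 11 \times T_{AD} = 22~\mu\text{s},
\]
\[
t_{\text{getBatVol}} \approx \underbrace{100~\mu\text{s}}_{\text{divider settling}} + \underbrace{100~\mu\text{s}}_{t_{ACQ}} + \underbrace{22~\mu\text{s}}_{\text{conversion}} \approx 222~\mu\text{s},
\]

which makes the energy cost of each measurement negligible compared with the other tasks in Table 1.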

Appendix C

Figure A4. Retrieval process. Bylot Island. (a) ArcÇav 2.1, 28 May 2022. (b) ArcÇav 2.2, 31 May 2023.

References

  1. Elias, S. Chapter 11—Changes in Terrestrial Environments. In Threats to the Arctic; Elias, S., Ed.; Elsevier: Amsterdam, The Netherlands, 2021; pp. 323–365. [Google Scholar] [CrossRef]
  2. Post, E.; Alley, R.B.; Christensen, T.R.; Macias-Fauria, M.; Forbes, B.C.; Gooseff, M.N.; Iler, A.; Kerby, J.T.; Laidre, K.L.; Mann, M.E.; et al. The polar regions in a 2 °C warmer world. Sci. Adv. 2019, 5, eaaw9883. [Google Scholar] [CrossRef]
  3. Rantanen, M.; Karpechko, A.Y.; Lipponen, A.; Nordling, K.; Hyvärinen, O.; Ruosteenoja, K.; Vihma, T.; Laaksonen, A. The Arctic has warmed nearly four times faster than the globe since 1979. Commun. Earth Environ. 2022, 3, 168. [Google Scholar] [CrossRef]
  4. Berteaux, D.; Gauthier, G.; Domine, F.; Ims, R.A.; Lamoureux, S.F.; Lévesque, E.; Yoccoz, N. Effects of changing permafrost and snow conditions on tundra wildlife: Critical places and times. Arct. Sci. 2016, 3, 65–90. [Google Scholar] [CrossRef]
  5. Gilg, O.; Sittler, B.; Sabard, B.; Hurstel, A.; Sané, R.; Delattre, P.; Hanski, I. Functional and numerical responses of four lemming predators in high arctic Greenland. Oikos 2006, 113, 193–216. [Google Scholar] [CrossRef]
  6. Schmidt, N.M.; Ims, R.A.; Høye, T.T.; Gilg, O.; Hansen, L.H.; Hansen, J.; Lund, M.; Fuglei, E.; Forchhammer, M.C.; Sittler, B. Response of an arctic predator guild to collapsing lemming cycles. Proc. R. Soc. B Biol. Sci. 2012, 279, 4417–4422. [Google Scholar] [CrossRef] [PubMed]
  7. Elton, C.S. Periodic Fluctuations in the Numbers of Animals: Their Causes and Effects. J. Exp. Biol. 1924, 2, 119–163. [Google Scholar] [CrossRef]
  8. Turchin, P.; Oksanen, L.; Ekerholm, P.; Oksanen, T.; Henttonen, H. Are lemmings prey or predators? Nature 2000, 405, 562. [Google Scholar] [CrossRef]
  9. Oli, M.K. Population cycles in voles and lemmings: State of the science and future directions. Mammal Rev. 2019, 49, 226–239. [Google Scholar] [CrossRef]
  10. Andreassen, H.P.; Sundell, J.; Ecke, F.; Halle, S.; Haapakoski, M.; Henttonen, H.; Huitu, O.; Jacob, J.; Johnsen, K.; Koskela, E.; et al. Population cycles and outbreaks of small rodents: Ten essential questions we still need to solve. Oecologia 2021, 195, 601–622. [Google Scholar] [CrossRef] [PubMed]
  11. Krebs, C.J. Of lemmings and snowshoe hares: The ecology of northern Canada. Proc. R. Soc. B Biol. Sci. 2010, 278, 481–489. [Google Scholar] [CrossRef]
  12. Fauteux, D.; Gauthier, G.; Berteaux, D. Seasonal demography of a cyclic lemming population in the Canadian Arctic. J. Anim. Ecol. 2015, 84, 1412–1422. [Google Scholar] [CrossRef]
  13. Domine, F.; Gauthier, G.; Vionnet, V.; Fauteux, D.; Dumont, M.; Barrere, M. Snow physical properties may be a significant determinant of lemming population dynamics in the high Arctic. Arct. Sci. 2018, 4, 813–826. [Google Scholar] [CrossRef]
  14. Fauteux, D.; Gauthier, G.; Mazerolle, M.J.; Coallier, N.; Bêty, J.; Berteaux, D. Evaluation of invasive and non-invasive methods to monitor rodent abundance in the Arctic. Ecosphere 2018, 9, e02124. [Google Scholar] [CrossRef]
  15. Schmidt, N.M.; van Beest, F.M.; Dupuch, A.; Hansen, L.H.; Desforges, J.P.; Morris, D.W. Long-term patterns in winter habitat selection, breeding and predation in a density-fluctuating, high Arctic lemming population. Oecologia 2021, 195, 927–935. [Google Scholar] [CrossRef] [PubMed]
  16. Caravaggi, A.; Banks, P.B.; Burton, A.C.; Finlay, C.M.; Haswell, P.M.; Hayward, M.W.; Rowcliffe, M.J.; Wood, M.D. A review of camera trapping for conservation behaviour research. Remote Sens. Ecol. Conserv. 2017, 3, 109–122. [Google Scholar] [CrossRef]
  17. Cordier, C.P.; Smith, D.A.E.; Smith, Y.E.; Downs, C.T. Camera trap research in Africa: A systematic review to show trends in wildlife monitoring and its value as a research tool. Glob. Ecol. Conserv. 2022, 40, e02326. [Google Scholar] [CrossRef]
  18. Singh, R.; Qureshi, Q.; Sankar, K.; Krausman, P.R.; Goyal, S.P. Use of camera traps to determine dispersal of tigers in semi-arid landscape, western India. J. Arid. Environ. 2013, 98, 105–108. [Google Scholar] [CrossRef]
  19. Alexander, J.S.; Zhang, C.; Shi, K.; Riordan, P. A granular view of a snow leopard population using camera traps in Central China. Biol. Conserv. 2016, 197, 27–31. [Google Scholar] [CrossRef]
  20. Parsons, A.W.; Forrester, T.; McShea, W.J.; Baker-Whatton, M.C.; Millspaugh, J.J.; Kays, R. Do occupancy or detection rates from camera traps reflect deer density? J. Mammal. 2017, 98, 1547–1557. [Google Scholar] [CrossRef]
  21. La Torre, S.M.D.; Jacobson, S.L.; Chodorow, M.; Yindee, M.; Plotnik, J.M. Day and night camera trap videos are effective for identifying individual wild Asian elephants. PeerJ 2023, 11, e15130. [Google Scholar] [CrossRef]
  22. Sánchez-García, C.; Armenteros, J.A.; Alonso, M.E.; Larsen, R.T.; Lomillos, J.M.; Gaudioso, V.R. Water-site selection and behaviour of red-legged partridge Alectoris rufa evaluated using camera trapping. Appl. Anim. Behav. Sci. 2012, 137, 86–95. [Google Scholar] [CrossRef]
  23. Williams, K.; De Robertis, A.; Berkowitz, Z.; Rooper, C.; Towler, R. An underwater stereo-camera trap. Methods Oceanogr. 2014, 11, 1–12. [Google Scholar] [CrossRef]
  24. Welbourne, D.J.; Paull, D.J.; Claridge, A.W.; Ford, F. A frontier in the use of camera traps: Surveying terrestrial squamate assemblages. Remote Sens. Ecol. Conserv. 2017, 3, 133–145. [Google Scholar] [CrossRef]
  25. Welbourne, D.; Claridge, A.; Paull, D.; Ford, F. Improving terrestrial squamate surveys with camera-trap programming and hardware modifications. Animals 2019, 9, 388. [Google Scholar] [CrossRef] [PubMed]
  26. Sirén, A.P.; Pekins, P.J.; Abdu, P.L.; Ducey, M.J. Identification and density estimation of American martens (Martes americana) using a novel camera-trap method. Diversity 2016, 8, 3. [Google Scholar] [CrossRef]
  27. Littlewood, N.A.; Hancock, M.H.; Newey, S.; Shackelford, G.; Toney, R. Use of a novel camera trapping approach to measure small mammal responses to peatland restoration. Eur. J. Wildl. Res. 2021, 67, 12. [Google Scholar] [CrossRef]
  28. Villette, P.; Krebs, C.J.; Jung, T.S.; Boonstra, R. Can camera trapping provide accurate estimates of small mammal (Myodes rutilus and Peromyscus maniculatus) density in the boreal forest? J. Mammal. 2016, 97, 32–40. [Google Scholar] [CrossRef]
  29. Hedges, L.; Lam, W.Y.; Campos-Arceiz, A.; Rayan, D.M.; Laurance, W.F.; Latham, C.J.; Saaban, S.; Clements, G.R. Melanistic leopards reveal their spots: Infrared camera traps provide a population density estimate of leopards in Malaysia. J. Wildl. Manag. 2015, 79, 846–853. [Google Scholar] [CrossRef]
  30. Prasad, S.; Pittet, A.; Sukumar, R. Who really ate the fruit? A novel approach to camera trapping for quantifying frugivory by ruminants. Ecol. Res. 2010, 25, 225–231. [Google Scholar] [CrossRef]
  31. Liu, X.; Wu, P.; Songer, M.; Cai, Q.; He, X.; Zhu, Y.; Shao, X. Monitoring wildlife abundance and diversity with infra-red camera traps in Guanyinshan Nature Reserve of Shaanxi Province, China. Ecol. Indic. 2013, 33, 121–128. [Google Scholar] [CrossRef]
  32. Karanth, K.U.; Nichols, J.D.; Kumar, N.S.; Hines, J.E. Assessing Tiger Population Dynamics Using Photographic Capture–Recapture Sampling. Ecology 2006, 87, 2925–2937. [Google Scholar] [CrossRef]
  33. Kenney, A.J.; Boutin, S.; Jung, T.S.; Murray, D.L.; Johnson, N.; Krebs, C.J. Motion-sensitive cameras track population abundance changes in a boreal mammal community in southwestern Yukon, Canada. J. Wildl. Manag. 2024, 88, e22564. [Google Scholar] [CrossRef]
  34. Gil-Sánchez, J.M.; Antorán-Pilar, E. Camera-trapping for abundance estimation of otters in seasonal rivers: A field evaluation. Eur. J. Wildl. Res. 2020, 66, 72. [Google Scholar] [CrossRef]
  35. Davies, H.F.; Rioli, W.; Puruntatameri, J.; Roberts, W.; Kerinaiua, C.; Kerinauia, V.; Womatakimi, K.B.; Gillespie, G.R.; Murphy, B.P. Estimating site occupancy and detectability of the threatened partridge pigeon (Geophaps smithii) using camera traps. Austral Ecol. 2019, 44, 868–879. [Google Scholar] [CrossRef]
  36. Suzuki, K.K.; Ando, M. Early and efficient detection of an endangered flying squirrel by arboreal camera trapping. Mammalia 2019, 83, 372–378. [Google Scholar] [CrossRef]
  37. Suuri, B.; Baatargal, O.; Badamdorj, B.; Reading, R.P. Assessing wildlife biodiversity using camera trap data on the Mongolian marmot (Marmota sibirica) colonies. J. Arid. Environ. 2021, 188, 104409. [Google Scholar] [CrossRef]
  38. Oberosler, V.; Groff, C.; Iemma, A.; Pedrini, P.; Rovero, F. The influence of human disturbance on occupancy and activity patterns of mammals in the Italian Alps from systematic camera trapping. Mamm. Biol. 2017, 87, 50–61. [Google Scholar] [CrossRef]
  39. Dertien, J.S.; Doherty Jr, P.F.; Bagley, C.F.; Haddix, J.A.; Brinkman, A.R.; Neipert, E.S. Evaluating Dall's sheep habitat use via camera traps. J. Wildl. Manag. 2017, 81, 1457–1467. [Google Scholar] [CrossRef]
  40. Hossain, A.N.M.; Barlow, A.; Barlow, C.G.; Lynam, A.J.; Chakma, S.; Savini, T. Assessing the efficacy of camera trapping as a tool for increasing detection rates of wildlife crime in tropical protected areas. Biol. Conserv. 2016, 201, 314–319. [Google Scholar] [CrossRef]
  41. Kumbhojkar, S.; Yosef, R.; Mehta, A.; Rakholia, S. A camera-trap home-range analysis of the Indian leopard (Panthera pardus fusca) in Jaipur, India. Animals 2020, 10, 1600. [Google Scholar] [CrossRef]
  42. Soininen, E.M.; Jensvoll, I.; Killengreen, S.T.; Ims, R.A. Under the snow: A new camera trap opens the white box of subnivean ecology. Remote Sens. Ecol. Conserv. 2015, 1, 29–38. [Google Scholar] [CrossRef]
  43. Mölle, J.P.; Kleiven, E.F.; Ims, R.A.; Soininen, E.M. Using subnivean camera traps to study Arctic small mammal community dynamics during winter. Arct. Sci. 2022, 8, 183–199. [Google Scholar] [CrossRef]
  44. Pusenkova, A.; Poirier, M.; Kalhor, D.; Galstian, T.; Gauthier, G.; Maldague, X. Optical design challenges of subnivean camera trapping under extreme Arctic conditions. Arct. Sci. 2022, 8, 313–328. [Google Scholar] [CrossRef]
  45. Sturm, M.; Benson, C.S. Vapor transport, grain growth and depth-hoar development in the subarctic snow. J. Glaciol. 1997, 43, 42–59. [Google Scholar] [CrossRef]
  46. Kalhor, D.; Poirier, M.; Pusenkova, A.; Maldague, X.; Gauthier, G.; Galstian, T. A camera trap to reveal the obscure world of the Arctic subnivean ecology. IEEE Sens. J. 2021, 21, 28025–28036. [Google Scholar] [CrossRef]
  47. Humbert, J.W.; Onthank, K.L.; Williams, K. The open-source camera trap for organism presence and underwater surveillance (OCTOPUS). HardwareX 2023, 13, e00394. [Google Scholar] [CrossRef] [PubMed]
  48. Camacho, L.; Baquerizo, R.; Palomino, J.; Zarzosa, M. Deployment of a set of camera trap networks for wildlife inventory in the western Amazon rainforest. IEEE Sens. J. 2017, 17, 8000–8007. [Google Scholar] [CrossRef]
  49. Poirier, M.; Gauthier, G.; Domine, F. What guides lemmings movements through the snowpack? J. Mammal. 2019, 100, 1416–1426. [Google Scholar] [CrossRef]
  50. Analog Devices. General-Purpose RC Timer ICM7555. Available online: https://www.analog.com/media/en/technical-documentation/data-sheets/ICM7555-ICM7556.pdf (accessed on 13 February 2020).
  51. Duchesne, D.; Gauthier, G.; Berteaux, D. Habitat selection, reproduction and predation of wintering lemmings in the Arctic. Oecologia 2011, 167, 967–980. [Google Scholar] [CrossRef]
  52. Poirier, M.; Kalhor, D.; Pusenkova, A.; Gauthier, G.; Maldague, X.; Galstian, T. Suitability of Camera Traps to Monitor Lemming Activity under the Snow in the High Arctic; Université Laval: Quebec City, QC, Canada, 2024. [Google Scholar]
  53. CEN. Climate Station Data from Bylot Island in Nunavut, Canada, v. 1.13 (1992–2023). Nordicana D2. 2024. Available online: https://nordicana.cen.ulaval.ca/dpage.aspx?doi=45039SL-EE76C1BDAADC4890 (accessed on 20 May 2024).
  54. Kleiven, E.F.; Nicolau, P.G.; Sørbye, S.H.; Aars, J.; Yoccoz, N.G.; Ims, R.A. Using camera traps to monitor cyclic vole populations. Remote Sens. Ecol. Conserv. 2023, 9, 390–403. [Google Scholar] [CrossRef]
  55. Team Arduino. Arduino Pro Mini schematic. Available online: https://www.arduino.cc/en/uploads/Main/Arduino-Pro-Mini-schematic.pdf (accessed on 12 November 2021).
Figure 1. Block diagram of the proposed monitoring system, ArcÇav 2. SBC: single-board computer, RTC: real-time clock module, NIR: near infrared, BVM: battery voltage monitoring module, and µC: microcontroller.
Figure 2. RTC module circuit diagram. (a) Model RTC-M1. (b) Model RTC-M2. (c) Model RTC-M3.
Figure 3. Motion detection circuit diagram. (a) Model MD-M1. (b) Model MD-M2.
Figure 4. Battery voltage monitoring circuit diagram, which is a part of the power unit.
Figure 5. System processing flow diagram. PU: power unit; µC: microcontroller; µCLPPU: µC of the low-power processing unit; SBC: single-board computer; VA: video acquisition; acq: acquisition; bat: battery; temp: temperature; int: interrupt; MD: motion detection; MDx denotes either of the MD modules, where x ∈ [A, B, C]; DR: daily routine; and MT: motion-triggered.
Figure 6. ArcÇav 2 system. (a) Electronic box (A) with PIRC sensor (B) and camera and IR emitters (C). (b) Opened electrical enclosure (D) with the electronic box (A) and battery (E). (c) A fully assembled unit with the closed electrical enclosure (D), filming box (F), and PIRB (G) and PIRA (H) sensors.
Figure 7. Motion detection module timing. (a,b) Model MD-M1, tp = 1.01 s. (c,d) Model MD-M2, td = 12.8 ms, tp = 81.2 ms, tm ≈ 1.3 s. Ch1: pir_out (module input), Ch2: md_out (module output). Subfigures (b,d) display the same signals as Subfigures (a,c), respectively, but on different time scales. The time unit division (M) and the voltage (or current) unit division of all signals (in this figure, Ch1 and Ch2) are defined at the bottom of each diagram.
Figure 8. Test results of the MD-M2 model of the motion detection module. (a–c) Module without the sensor; Ch1: pir_out (module input), Ch2: md_out (module output), and Ch3: module supply current. WDT: watchdog timer, td = 12.8 ms, tp = 81.2 ms, t1 = 26.4 ms, t2 = 61.6 ms, t3 = 18.8 ms, t4 = 1.19 s, t5 = 15.2 ms, and tm ≈ 1.3 s (masking period). Subfigure (b) is a detailed illustration of Subfigure (a) on a smaller time scale, with some areas in the middle of t4 (indicated by the gray bar) omitted. Subfigures (a,c) compare the performance of MD-M2 under one PIR pulse and multiple PIR pulses. (d) PIR sensor current curve example; Ch1: pir_out (sensor output), Ch3: sensor current.
Figure 9. RTC module test results. (a,b) Typical supply current of RTC-M2 on two different time scales. (c) Behavior of RTC-M2 during normal operation. (d) Behavior of RTC-M2 when the main supply (VCC1) switches. (e,f) Behavior of RTC-M3 when the main supply switches. Subfigure (f) is a detailed illustration of Subfigure (e) on a smaller voltage scale, with some areas in the middle (indicated by the gray bar) omitted. Ch1: VCC1 = 3.3 V; Ch2: VB2 = 2.98 V (Subfigures (c,d); 3.0 V for Subfigures (e,f)); Ch3: rtc_vbat (Figure 2), Vth = 2.91 V, VD1 = 0.1 V, and tdon = 184 ms.
Figure 10. System performance for different tasks. (a) Hourly routine. Ch1: rtc_int (Figure 2); Ch2: ext_trig (Figure 4); Ch3: vbat_h (Figure 4); Ch4: system current. (b) Daily routine. Ch1: rtc_int signal (Figure 2); Ch2: vbat_h; Ch3: SD card power; Ch4: system current. (c,d) Motion detection counting, shown on two different time scales. Ch1: md_out (Figure 3); Ch2: system current. (e) Video acquisition; Ch1: md_out; Ch2: SBC power (VCC2); Ch3: SBC internal signal (see text); Ch4: system current.
Figure 11. Sample images from videos taken by ArcÇav 2 during the 2021–2022 recording season. First row: ermine. Second row: collared lemming. Third row: brown lemming.
Figure 12. Average daily motion detection counts per month for 5 units (U1 to U8) from August 2021 to May 2022.
Figure 13. Monthly motion detection counts vs. monthly videos recorded by 5 units (U1 to U8) from August 2021 to May 2022. The black line represents the result of regression analysis with a correlation coefficient of 0.78 (p < 0.001).
Figure 14. Minimum daily subnivean temperatures recorded by ArcÇav 2. (a) From August 2021 to May 2022. (b) From August 2022 to May 2023. Minimum air temperature was recorded 2 m above the ground by a weather station (BYLCAMP) at the site [53].
Table 1. Typical energy consumption of the ArcÇav 2 system on a daily basis.

| Task | Model/Condition | Avg. Current [mA] | Operation Period [s] | Daily Energy [mAh] |
|---|---|---|---|---|
| Standby | ArcÇav 2.1 with 1 MD | 0.231 | 82,523 | 5.29 |
| | ArcÇav 2.1 with 3 MDs | 0.288 | 82,523 | 6.59 |
| | ArcÇav 2.2 with 1 MD | 0.0078 | 82,523 | 0.180 |
| | ArcÇav 2.2 with 3 MDs | 0.0097 | 82,523 | 0.223 |
| MD counting | ArcÇav 2.1 with 1 MD | 0.258 | 3600 | 0.258 |
| | ArcÇav 2.1 with 3 MDs | 0.369 | 3600 | 0.369 |
| | ArcÇav 2.2 with 1 MD | 0.030 | 3600 | 0.030 |
| | ArcÇav 2.2 with 3 MDs | 0.077 | 3600 | 0.077 |
| Hourly routine | 23 times per day | 4.07 | 6.11 | 0.007 |
| SD card writing | once per day | 21.3 | 0.885 | 0.005 |
| VA1 (S2) | One 3-second video | 350.4 | 6 | 0.584 |
| VA2 (S3) | 24 8-second videos | 437.5 | 264 | 32.1 |
| Total | ArcÇav 2.1 with 1 MD | | | 38.2 |
| | ArcÇav 2.1 with 3 MDs | | | 39.6 |
| | ArcÇav 2.2 with 1 MD | | | 32.9 |
| | ArcÇav 2.2 with 3 MDs | | | 33.0 |

ArcÇav 2.1: system with RTC-M1 and MD-M1. ArcÇav 2.2: system with RTC-M2 and MD-M2. RTC-M1, RTC-M2, MD-M1, and MD-M2 are different models of the RTC and MD modules as described in the text. VA1: video acquisition 1 (daily video). VA2: video acquisition 2 (motion-triggered video). One of the hourly routines is included in the daily data recording (SD card writing) task.
Table 2. Motion-triggered videos recorded by ArcÇav 2 at Bylot Island from August 2021 to June 2022.

| Unit | High Quality (Qty. / %) | Low Quality (Qty. / %) | No Animal (Qty. / %) | Total (Qty.) |
|---|---|---|---|---|
| Unit 1 | 81 / 35.2 | 9 / 3.9 | 140 / 60.9 | 230 |
| Unit 2 | 49 / 45.4 | 12 / 11.1 | 47 / 43.5 | 108 |
| Unit 3 * | – | – | – | – |
| Unit 4 | 15 / 60.0 | 1 / 4.0 | 9 / 36.0 | 25 |
| Unit 5 | 21 / 40.4 | 0 / 0.0 | 31 / 59.6 | 52 |
| Unit 6 | 61 / 28.4 | 8 / 3.7 | 146 / 67.9 | 215 |
| Unit 7 | 6 / 40.0 | 2 / 13.3 | 7 / 46.7 | 15 |
| Unit 8 | 88 / 26.2 | 31 / 9.2 | 217 / 64.6 | 336 |
| All Units | 321 / 32.7 | 63 / 6.4 | 597 / 60.9 | 981 |

* This unit was not properly put into service.
Table 3. Daily routine files recorded by each ArcÇav 2 unit (U1 to U12) at Bylot Island from August 2021 to June 2022 and August 2022 to June 2023.

| Period | Parameter | U1 | U2 | U3 | U4 | U5 | U6 | U7 | U8 | U9 | U10 | U11 | U12 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2021–2022 | NDS | 313 | 281 | * | 284 | 284 | 285 | 282 | 280 | – | – | – | – |
| | NDRV | 292 | 266 | * | 4 | 267 | 276 | 4 | 267 | – | – | – | – |
| | NDT | 312 | 281 | * | 6 | 283 | 283 | 4 | 274 | – | – | – | – |
| 2022–2023 | NDS | 296 | 293 | 296 | 278 | 289 | 294 | 281 | 294 | 291 | 294 | 294 | 290 |
| | NDRV | 296 | 293 | 97 | 278 | 289 | 294 | 281 | 294 | 291 | 294 | 294 | 290 |
| | NDT | 296 | 293 | 296 | 278 | 289 | 103 | 281 | 294 | 291 | 294 | 294 | 290 |

* This unit was not properly put into service during this period. NDS: Number of days in service. NDRV: Number of daily routine video files. NDT: Number of daily text files.
