Article

AI-Driven Cell Tracking to Enable High-Throughput Drug Screening Targeting Airway Epithelial Repair for Children with Asthma

by Alphons Gwatimba 1,2,*, Tim Rosenow 1,3, Stephen M. Stick 1,4,5,6, Anthony Kicic 1,4,6,7, Thomas Iosifidis 1,6,7,† and Yuliya V. Karpievitch 1,8,†
1 Wal-Yan Respiratory Research Centre, Telethon Kids Institute, University of Western Australia, Perth, WA 6009, Australia
2 School of Computer Science and Software Engineering, University of Western Australia, Nedlands, WA 6009, Australia
3 Centre for Microscopy, Characterisation and Analysis, University of Western Australia, Nedlands, WA 6009, Australia
4 Division of Paediatrics, Medical School, University of Western Australia, Nedlands, WA 6009, Australia
5 Department of Respiratory and Sleep Medicine, Perth Children’s Hospital, Nedlands, WA 6009, Australia
6 Centre for Cell Therapy and Regenerative Medicine, School of Medicine, University of Western Australia, Nedlands, WA 6009, Australia
7 School of Population Health, Curtin University, Bentley, WA 6102, Australia
8 School of Biomedical Sciences, University of Western Australia, Nedlands, WA 6009, Australia
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
J. Pers. Med. 2022, 12(5), 809; https://doi.org/10.3390/jpm12050809
Submission received: 14 April 2022 / Revised: 8 May 2022 / Accepted: 13 May 2022 / Published: 17 May 2022
(This article belongs to the Special Issue Precision Medicine in Childhood Asthma)

Abstract:
The airway epithelium of children with asthma is characterized by aberrant repair that may be therapeutically modifiable. The development of epithelial-targeting therapeutics that enhance airway repair could provide a novel treatment avenue for childhood asthma. Drug discovery efforts utilizing high-throughput live cell imaging of patient-derived airway epithelial culture-based wound repair assays can be used to identify compounds that modulate airway repair in childhood asthma. Manual cell tracking has been used to determine cell trajectories and wound closure rates, but is time consuming, subject to bias, and infeasible for high-throughput experiments. We therefore developed software, EPIC, that automatically tracks low-resolution low-framerate cells using artificial intelligence, analyzes high-throughput drug screening experiments and produces multiple wound repair metrics and publication-ready figures. Additionally, unlike available cell trackers that perform cell segmentation, EPIC tracks cells using bounding boxes and thus has simpler and faster training data generation requirements for researchers working with other cell types. EPIC outperformed publicly available software in our wound repair datasets by achieving human-level cell tracking accuracy in a fraction of the time. We also showed that EPIC is not limited to airway epithelial repair for children with asthma but can be applied in other cellular contexts by outperforming the same software in the Cell Tracking with Mitosis Detection Challenge (CTMC) dataset. The CTMC is the only established cell tracking benchmark dataset that is designed for cell trackers utilizing bounding boxes. We expect our open-source and easy-to-use software to enable high-throughput drug screening targeting airway epithelial repair for children with asthma.

1. Introduction

Frequent exposure to pathogens, allergens and pollutants results in damage to the epithelial cell layer lining the airways [1]. Following an injury that disrupts the epithelium, cells at the wound site, referred to as leading edge cells, migrate to heal the injured epithelium and restore the physical cellular barrier [1,2]. However, in patients with asthma, airway epithelial cells fail to restore epithelial integrity following injury, resulting in further damage and inflammation. Furthermore, dysregulated epithelial repair, even in children with mild asthma, may contribute to the persistence of asthma into adulthood [3]. Accordingly, it is essential to identify therapeutic treatments that restore the repair properties of wounded epithelial cells by screening and assessing the efficacy of different novel or repurposed drugs. High-throughput drug discovery and repurposing experiments focusing on enhancing airway wound healing will constitute a precision-medicine approach to asthma treatment. Such an approach will target the underlying disease processes and reduce disease burden at the early stages of childhood asthma.
Various in vitro assays can be used to study wound repair mechanisms [4] and screen therapeutics that modulate cell migration and repair [5]. The in vitro scratch assay is widely used to assess wound repair outcomes and is scalable for high-throughput screening purposes [6,7,8]. While the method is primarily used to quantitatively assess the migration characteristics of cell populations, it has been extended to include the analysis of individual cell trajectories using time lapse live cell microscopy [5,9,10]. Cell trajectories are currently obtained by manually tracking a number of leading edge cells (commonly 20) across multiple time lapse image frames generated using the extended in vitro scratch assay [5]. Cell migration metrics, such as velocity and directionality, are then quantified from the cell trajectories and used to assess wound repair outcomes, such as the acceleration or delaying of cell migration for wound healing, due to an administered drug.
Manual cell tracking is laborious, slow, subject to bias [11,12] and cannot be feasibly used to analyze the large volumes of data generated in high-throughput drug screening experiments. While automated cell tracking solutions exist [13,14,15], many require cells to be fluorescently labelled [10,16,17], which can cause undesirable changes to cell behavior or even result in cell death [17]. Many solutions are also not truly fully automated, which is infeasible for high-throughput experiments. For example, fully automated tracking without first manually selecting the cells to track is presented as a convenience primarily for fluorescent cells in [11] and suggested mainly for relatively easy use cases in [18]. Additionally, some solutions do not publicly release the software [19,20] and most do not include the capacity to automatically perform useful wound repair analyses, such as automatic cell migration metric quantification [19,21,22,23,24]. In contrast, if this capacity is included, the solution is hindered by one or more of the previously mentioned limitations [25].
Automated cell trackers generally operate by first outlining the exact shapes of, or segmenting, cells in images before tracking their movements across multiple frames [15]. Cell segmentation is commonly performed using traditional segmentation algorithms [26], such as intensity thresholding [17,27]. However, such methods often struggle to resolve individual cells [16], especially in images with high cell density [28] such as wound repair images. Additionally, optimal segmentation parameter selection is difficult and time consuming [29,30]. Instead, artificial intelligence (AI), or more precisely deep learning (DL)-based, segmentation methods [31], which are increasingly utilized by many cell trackers, better identify individual cells and require minimal runtime parameter finetuning [16,24,32,33]. However, to achieve such performance these systems must be trained using images containing many cells that have been manually segmented. Specifically, to obtain the training data, the exact outline of hundreds of cells must be carefully drawn in multiple images (Supplementary Figure S1, Additional File S1), which is time consuming and often challenging for users [34,35]. Alternatively, AI-based methods that do not rely on stringent object segmentation but instead use rectangular bounding boxes to enclose objects for detection [36] require training data that are easier and faster to generate [34,35]. Specifically, obtaining the training data only requires two clicks of a mouse to specify the upper-leftmost and lower-rightmost corners of a bounding box for each cell in the dataset (Supplementary Figure S1, Additional File S1). Most mainstream object trackers use bounding boxes to detect objects [37,38,39], such as pedestrians and cars. For example, ByteTrack [40] accurately tracks detected objects by first associating high score bounding boxes across frames followed by low score ones. TransMOT [41] is a computationally efficient tracker that models the spatial temporal relationships of objects detected using bounding boxes. On the other hand, cell trackers are overwhelmingly segmentation dependent, thereby restricting end users. For example, while a recent cell tracking algorithm (CellTrack R-CNN) could generate and utilize bounding boxes, a trained segmentation model (Mask R-CNN [42]) was still required to track cells [43].
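To illustrate how lightweight bounding box annotation can be, the following minimal sketch (an illustration only, not EPIC's actual annotation tooling) converts the two clicked corners for one cell into a COCO-style annotation entry; the function and field names are hypothetical.

```python
# Illustrative sketch: convert two clicked corners per cell into a COCO-style
# bounding box annotation ([x, y, width, height]). Hypothetical names, not EPIC's
# actual annotation pipeline.
import json

def clicks_to_coco_annotation(image_id, cell_id, corner1, corner2):
    """Build one annotation from the upper-left and lower-right mouse clicks."""
    (x1, y1), (x2, y2) = corner1, corner2
    x_min, y_min = min(x1, x2), min(y1, y2)
    w, h = abs(x2 - x1), abs(y2 - y1)
    return {
        "id": cell_id,
        "image_id": image_id,
        "category_id": 1,            # single "cell" class
        "bbox": [x_min, y_min, w, h],
        "area": w * h,
        "iscrowd": 0,
    }

# Example: two mouse clicks bounding one cell in frame 0
annotation = clicks_to_coco_annotation(image_id=0, cell_id=1,
                                        corner1=(112, 340), corner2=(131, 362))
print(json.dumps(annotation, indent=2))
```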
Importantly, most automated trackers rely on algorithms designed for image sequences that are neither low resolution (objects smaller than 32 by 32 px [44]) nor low framerate (frame intervals longer than 25 min [19]). At a higher framerate, objects are imaged frequently, are displaced by only a few pixels between adjacent frames, and are thus easier to track [45,46,47]. At a lower framerate, however, the movements of cells, especially those exhibiting drug-accelerated migration, are often larger than the distances to nearby cells, making tracking harder [19]. In addition, low-resolution cells are harder to detect [44]. Existing solutions do not target challenging use cases combining low resolution and low framerate [22,23,48,49,50] and hence cannot guarantee reliable cell tracking accuracy in such settings.
Here, we present EPIC, a fully automated cell tracking software solution that overcomes all the previously mentioned limitations (EPIC is available at: https://github.com/AlphonsG/EPIC-BBox-Cell-Tracking, accessed on 9 February 2022). EPIC uses state-of-the-art AI-based Vision Transformers [51], which we have previously applied to cell image analysis tasks [52], to accurately detect unstained cells using bounding boxes, decreasing the labor and time needed to generate training data compared to widely used segmentation-based approaches. We developed a custom object tracking algorithm for high accuracy tracking in low-resolution and low-framerate image sequences. After completing cell tracking, EPIC automatically generates reports with several cell migration metrics and publication-ready figures. We evaluated EPIC using an airway epithelial cell wound repair dataset generated under high-throughput drug screening experimental conditions. EPIC produced cell migration metrics that were comparable to the current gold standard method, manual cell tracking, and outperformed publicly available automated trackers tested on the same dataset. We also showed that EPIC is not limited to airway epithelial repair for children with asthma but can be applied in other cellular contexts. Specifically, EPIC also outperformed the same automated trackers on the recent Cell Tracking with Mitosis Detection Challenge (CTMC) dataset [36], the only established cell tracking benchmark dataset designed for cell trackers utilizing bounding boxes, which is challenging and diverse with 14 different cell lines. We expect our open-source and easy-to-use software to enable high-throughput drug screening targeting airway epithelial repair for children with asthma.

2. Materials and Methods

2.1. Datasets

2.1.1. Wound Repair Dataset

We generated a wound repair dataset under high-throughput drug screening experimental conditions. We obtained human telomerase reverse transcriptase modified airway epithelial cells (NuLi-1) [53] from the American Type Culture Collection (ATCC, Manassas, VA, USA) and cultured cells as previously described [54] using bronchial epithelial basal medium (BEBM™, Lonza, Basel, Switzerland) supplemented with SingleQuot growth additives (Lonza). We utilized an established in vitro scratch assay to assess epithelial cell repair responses to wounding as previously described [5]. Briefly, we established monolayer cell cultures in IncuCyte® ImageLock 96-well plates (Essen Bioscience Inc., Ann Arbor, MI, USA) in culture media lacking epidermal growth factor. We wounded confluent monolayer cultures using the IncuCyte® 96-well WoundMaker Tool (Essen Bioscience) [5]. We then treated subsets of cell cultures with either a specific Akt inhibitor (10 µM MK2206; Sigma-Aldrich, St. Louis, MO, USA) to inhibit cell migration, or a ROCK inhibitor (10 µM Y27632; Sigma-Aldrich) to accelerate cell migration post wounding. Multiple time lapse 1620 by 1176 px image sequences of wounded cells were captured at a resolution of 1.33 µm/px at 30-min intervals over 10.5 h (22 frames) using the IncuCyte® ZOOM system (Essen Bioscience) [5]. We refer to image sequences containing Akt-inhibited, untreated and ROCK-inhibited cells as three types of experiments: delayed, control and accelerated, respectively.

2.1.2. CTMC Dataset

Unlike the long-established Cell Tracking Challenge [55], the recently and publicly released CTMC dataset [36] is the only established cell tracking benchmark dataset designed for cell trackers that detect cells using bounding boxes instead of performing cell segmentation. Hence, we were able to use the CTMC dataset as an independent validation dataset for EPIC. It consists of 86 diverse image sequences/videos covering 14 different cell lines from animals, such as humans and rabbits, and features various cell types, such as myoblasts, fibroblasts and epithelial cells. The 320 by 400 px videos were collected over 300 to 4440 s (depending on the video) using Nikon TE2000 differential interference contrast microscopy at 30-s intervals and at an approximate resolution of 0.35 µm/px, and are fully annotated with bounding boxes.

2.2. Algorithm

We trained a state-of-the-art AI system based on Vision Transformers to detect low-resolution airway epithelial cell nuclei and whole cells in our wound repair dataset and the CTMC dataset, respectively, using bounding boxes. We then developed a custom algorithm capable of tracking cells at low resolution and low framerate. The algorithm extracts two appearance and four motion features from cells detected in every frame of an image sequence. The extracted features are used to link cells with the same identity across frames with a custom multi-stage tracklet association and tracklet cleaving strategy based on combinatorial optimization [56,57,58]. We also developed a custom algorithm that can automatically identify the leading edges in wound repair images by analyzing the cell densities across the image plane. Full method details are given in the Supplementary Materials (Supplementary Methods, Additional File S1).
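As a simplified illustration of how detections can be linked across frames by combinatorial optimization, the sketch below matches bounding boxes between two consecutive frames using only centroid distance and the Hungarian algorithm. EPIC's actual tracker additionally uses two appearance and four motion features with multi-stage tracklet association and cleaving, so this is a minimal stand-in rather than the full algorithm.

```python
# Minimal sketch of frame-to-frame association by combinatorial optimization,
# assuming detections are (x, y, w, h) bounding boxes and using only centroid
# distance as the matching cost.
import numpy as np
from scipy.optimize import linear_sum_assignment

def centroids(boxes):
    boxes = np.asarray(boxes, dtype=float)
    return boxes[:, :2] + boxes[:, 2:] / 2.0  # (x + w/2, y + h/2)

def associate(prev_boxes, curr_boxes, max_dist=50.0):
    """Return (prev_idx, curr_idx) matches whose centroids lie within max_dist px."""
    cost = np.linalg.norm(centroids(prev_boxes)[:, None, :]
                          - centroids(curr_boxes)[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)      # Hungarian algorithm
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_dist]

prev_frame = [(10, 10, 20, 20), (100, 50, 22, 18)]
curr_frame = [(105, 55, 22, 18), (14, 12, 20, 20)]
print(associate(prev_frame, curr_frame))  # [(0, 1), (1, 0)]
```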

2.3. Performance Evaluation

We compared EPIC’s cell tracking performance in our wound repair dataset to manual cell tracking, as previously described [5]. We also compared EPIC’s cell tracking performance in our wound repair and the CTMC dataset to that of Viterbi [59] (offered as part of the Baxter Algorithms package [60]) and DeepSORT [61], the best performing publicly available automated trackers of their category benchmarked by Anjum and Gurari in the CTMC [36]. Full details of our comparisons and statistical inference are given in the Supplementary Materials (Supplementary Methods, Additional File S1).

3. Results and Discussion

3.1. Dataset Comparison

Our wound repair dataset had approximately 3.8 times lower resolution (1.33 µm/px) than the CTMC dataset (0.35 µm/px). Furthermore, all cells in our dataset were classified as small objects (bounding box area smaller than 32 by 32 px) [44]. The CTMC dataset also contained cells in the small object category, as well as in the medium (between 32 by 32 and 96 by 96 px) and large (greater than 96 by 96 px) object categories [44]. Our dataset had approximately 60 times lower framerate (~0.033 frames/min) than the CTMC dataset (2 frames/min) (Figure 1a,b). The average cell density in the CTMC dataset was approximately 13, and at most ~28, cells per frame [36]. Conversely, each of our experiments contained over 1000 cells per frame, more than 70 times as many (Figure 1c). These statistics reinforce the low-resolution, low-framerate and high-density nature of our wound repair dataset, as is characteristic of data generated in high-throughput drug screening wound repair experiments.
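The ratios quoted above follow directly from the stated pixel sizes and frame intervals, as the short calculation below shows (values taken from the text).

```python
# Arithmetic behind the dataset comparison (values from the text above).
ctmc_um_per_px, wound_um_per_px = 0.35, 1.33
ctmc_frames_per_min = 2.0            # 30-s frame intervals
wound_frames_per_min = 1.0 / 30.0    # 30-min frame intervals

print(round(wound_um_per_px / ctmc_um_per_px, 1))         # ~3.8x lower resolution
print(round(ctmc_frames_per_min / wound_frames_per_min))  # ~60x lower framerate
```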

3.2. Wound Repair Dataset

3.2.1. Cell Detection

The training of the cell detection model was completed in ~30 min. The model achieved average precision and recall values of 87% and 91%, respectively. These metrics can be interpreted as the model correctly detecting ~90% of cells in each image. Figure 2 shows the detection performance on a full-sized wound repair image where most cells were detected as evidenced by the many bounding boxes (Figure 2, middle panel). These bounding boxes were well localized (Figure 2), which is notable given that the low-resolution and high-density cells are classified as small objects and are hence challenging to detect [44]. Importantly, the model was able to ignore cell debris (Figure 2, red arrow in inset), reinforcing EPIC’s ability to robustly learn the appearances of cells.
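For readers unfamiliar with these detection metrics, the sketch below shows one common way precision and recall can be computed for bounding box detections by greedy intersection-over-union (IoU) matching against ground truth boxes; this is an illustration, not necessarily the exact evaluation code used here.

```python
# Hedged sketch: precision/recall for (x, y, w, h) bounding boxes via greedy
# IoU matching against ground truth.
def iou(a, b):
    """Intersection-over-union of two (x, y, w, h) boxes."""
    ax2, ay2 = a[0] + a[2], a[1] + a[3]
    bx2, by2 = b[0] + b[2], b[1] + b[3]
    iw = max(0.0, min(ax2, bx2) - max(a[0], b[0]))
    ih = max(0.0, min(ay2, by2) - max(a[1], b[1]))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

def precision_recall(predictions, ground_truths, iou_threshold=0.5):
    matched, tp = set(), 0
    for pred in predictions:
        best, best_iou = None, iou_threshold
        for i, gt in enumerate(ground_truths):
            if i not in matched and iou(pred, gt) >= best_iou:
                best, best_iou = i, iou(pred, gt)
        if best is not None:
            matched.add(best)
            tp += 1
    precision = tp / len(predictions) if predictions else 0.0
    recall = tp / len(ground_truths) if ground_truths else 0.0
    return precision, recall

preds = [(10, 10, 20, 20), (60, 60, 18, 18), (200, 200, 15, 15)]
gts = [(12, 11, 20, 20), (61, 58, 18, 20)]
print(precision_recall(preds, gts))  # (0.667, 1.0) on this toy example
```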

3.2.2. Cell Tracking

As shown in Table 1, EPIC tracked over a hundred detected cells from the 1st to the 22nd frame without fragmentation in each of the nine total tested delayed, control and accelerated experiments (with three technical replicates per experiment type). From those cell tracks, we randomly sampled 20 leading edge cell tracks per experiment for wound repair analysis as is standard in the literature [5].
In contrast, DeepSORT failed to track any cells from the 1st to the 22nd frame without fragmentation in the same experiments, and with no cells tracked in the initial frames, no leading edge cell tracks could be defined and hence sampled for wound repair analysis (Table 1).
Viterbi had the most variable cell tracking performance in the same experiments, tracking anywhere from 0 to 2726 cells from the 1st to the 22nd frame without fragmentation depending on the experiment (Table 1). We noted that experiments with many (≥1883) or few (≤37) cell tracks (Table 1, Column 6) corresponded to experiments with lower (≤2283) or higher (≥3238) average numbers of cells per frame (as detected by EPIC), respectively. We posit that Viterbi’s non-AI-based image segmentation algorithm struggled to resolve the low-resolution high-density cells, a previously mentioned limitation of the approach, resulting in reduced numbers of generated cell tracks compared to experiments with lower cell density. Nevertheless, we were able to sample the 20 leading edge cell tracks for wound repair analysis in five of the experiments. Viterbi only generated 37 and 13 cell tracks in the control B and delayed C experiments among which there were only four and two leading edge cell tracks, respectively, that could be sampled for wound repair analysis. Viterbi failed to generate any cell tracks in the remaining control A and C experiments (Table 1).
Wound Repair Analysis
The sampled leading edge cell tracks that were generated by EPIC were largely similar to 20 randomly selected and manually tracked leading edge cells in the corresponding experiments, while sampled leading edge cell tracks generated by Viterbi were largely dissimilar to those manual cell tracks (Figure 3). Unlike Viterbi, both manual cell tracking and EPIC generated visibly shorter tracks for delayed experiments (Figure 3, top row) and longer tracks for accelerated experiments (Figure 3, bottom row) as compared to control (Figure 3, middle row), indicating that overall Viterbi cell tracks did not resemble the migration patterns expected of leading edge cells.
We used the manually and automatically generated leading edge cell tracks sampled from the nine total tested delayed, control and accelerated experiments to compute six cell migration metrics: Euclidean distance, accumulated distance, velocity, directionality, Y-forward migration index and end point angle. Cell migration metrics produced by manual cell tracking and EPIC were similar and both were different from metrics produced by Viterbi (Figure 4). The cell migration metrics of EPIC, Viterbi and manual cell tracking are shown in Figure 4 and Supplementary Table S3 (Additional File S1).
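For reference, the following sketch computes the six metrics from a single cell track using their common definitions (e.g., directionality as the ratio of Euclidean to accumulated distance); the exact formulas used by EPIC are given in the Supplementary Methods and may differ in detail.

```python
# Hedged sketch of the six migration metrics computed from one cell track;
# definitions follow common usage and may differ from EPIC's implementation.
import math

def migration_metrics(track, frame_interval_min=30.0):
    """track: list of (x, y) positions in µm, one per frame."""
    steps = [math.dist(track[i], track[i + 1]) for i in range(len(track) - 1)]
    accumulated = sum(steps)
    dx, dy = track[-1][0] - track[0][0], track[-1][1] - track[0][1]
    euclidean = math.hypot(dx, dy)
    total_time = frame_interval_min * (len(track) - 1)
    return {
        "euclidean_distance_um": euclidean,
        "accumulated_distance_um": accumulated,
        "velocity_um_per_min": accumulated / total_time,
        "directionality": euclidean / accumulated if accumulated else 0.0,
        "y_forward_migration_index": dy / accumulated if accumulated else 0.0,
        "end_point_angle_deg": math.degrees(math.atan2(dy, dx)),
    }

# A toy 22-frame track drifting towards the wound (positive y direction)
toy_track = [(0.0, 1.5 * i + (0.4 if i % 2 else -0.4)) for i in range(22)]
print(migration_metrics(toy_track))
```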
Statistical comparison of the cell migration metrics produced by EPIC cell tracks indicated that there were no statistically significant differences to those produced by manual cell tracking for all metrics (Figure 5; Supplementary Tables S4 and S5, Additional File S1). On the other hand, we obtained mostly highly statistically significant differences when comparing Viterbi and manual cell tracks, indicating vast inaccuracies in metrics produced by Viterbi (Figure 5; Supplementary Tables S4 and S6, Additional File S1).
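A minimal example of such a pairwise comparison, using SciPy's two-sample Wilcoxon-Mann-Whitney test on made-up velocity values (not data from this study), is shown below.

```python
# Minimal sketch of a pairwise two-sample Wilcoxon-Mann-Whitney comparison;
# the velocity values are illustrative only, not data from the study.
from scipy.stats import mannwhitneyu

manual_velocity = [0.41, 0.38, 0.45, 0.40, 0.43, 0.39, 0.44, 0.42]
epic_velocity   = [0.40, 0.39, 0.46, 0.41, 0.42, 0.40, 0.43, 0.44]

stat, p_value = mannwhitneyu(manual_velocity, epic_velocity, alternative="two-sided")
print(f"U = {stat}, p = {p_value:.3f}")  # p > 0.05 -> no significant difference
```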
We found that wound repair analysis outcomes of untreated and drug-treated cells according to EPIC were equivalent to those of the current gold standard method, manual cell tracking, while generating at least 20 leading edge cell tracks across the 22 image frames without any user labor required. In contrast, Viterbi did not consistently track at least 20 leading edge cells across 22 frames. Additionally, Viterbi produced wound repair outcomes contradicting the expectations for untreated and drug-treated cells, and many of Viterbi’s cell tracks were visually incorrect. For instance, closer inspection revealed that, unlike EPIC (Supplementary Figure S5, Additional Files S1 and S2), many of Viterbi’s cell tracks did not correspond to real cells (Supplementary Figure S5, Additional Files S1 and S3). These results reinforce the challenging nature of the low-resolution, low-framerate and high cell density wound repair dataset even for well-established automated trackers.

3.2.3. Automated Leading Edge Identification (EPIC)

EPIC automatically identified the leading edges of the wound in all experiments (Figure 6). By visual inspection, automatically detected leading edges were well localized even in the presence of significant cell debris. Additionally, we have shown that manual cell tracks sampled with respect to manually defined leading edges are comparable to cell tracks generated by EPIC and sampled with respect to automatically defined leading edges, reinforcing the accuracy of automated leading edge identification (Figure 6).
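One simple way to identify leading edges from cell detections, sketched below for illustration, is to profile the cell density along the vertical image axis and take the boundaries of the low-density band (the wound) as the two edges; EPIC's actual algorithm is described in the Supplementary Methods, and this simplified stand-in may differ from it.

```python
# Illustrative sketch of leading-edge identification from the vertical cell
# density profile: the wound appears as a low-density band, and its upper and
# lower boundaries approximate the two horizontal leading edges.
import numpy as np

def find_leading_edges(cell_centroids_y, image_height, bin_px=20, frac=0.2):
    bins = np.arange(0, image_height + bin_px, bin_px)
    counts, _ = np.histogram(cell_centroids_y, bins=bins)
    sparse = counts < frac * counts.max()          # bins inside the wound
    wound_bins = np.flatnonzero(sparse)
    if wound_bins.size == 0:
        return None                                # no wound detected
    top_edge = bins[wound_bins[0]]                 # y of upper leading edge
    bottom_edge = bins[wound_bins[-1] + 1]         # y of lower leading edge
    return top_edge, bottom_edge

rng = np.random.default_rng(0)
# Dense cells above y=500 and below y=700, sparse in between (the wound)
ys = np.concatenate([rng.uniform(0, 500, 800), rng.uniform(700, 1176, 800),
                     rng.uniform(500, 700, 20)])
print(find_leading_edges(ys, image_height=1176))   # approximately (500, 700)
```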

3.2.4. Runtimes

EPIC detected and tracked thousands of cells in all the nine experiments in ~20 min (Table 2). In contrast, Viterbi and DeepSORT processed the same experiments in ~3 h and ~30 h, respectively (Table 2).
The tracking of hundreds of small objects in an image sequence is a challenging task outside the intended use cases of most existing trackers. Accordingly, EPIC’s efficient design and performance enhancements, such as advanced multicore processing, resulted in processing times orders of magnitude faster than those of Viterbi and DeepSORT. We anticipate that Viterbi’s and DeepSORT’s long processing times render these methods infeasible for use in high-throughput drug screening pipelines involving thousands of experiments. Although manual cell tracking is infeasible for high-throughput drug screening, the 20 cells from each delayed or control experiment were manually tracked within 10 min. Cells from each accelerated experiment were manually tracked within 30 min due to the increased difficulty of tracking fast-moving cells at the low framerate. The total manual tracking time for all experiments (180 cells) was 2.5 h.
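The kind of multicore processing referred to above can be as simple as running one worker process per experiment, as in the hedged sketch below; analyse_experiment() is a hypothetical placeholder, not EPIC's actual API.

```python
# Sketch of per-experiment multicore processing using the standard library;
# analyse_experiment() is a hypothetical placeholder for detection, tracking
# and analysis of one experiment.
from multiprocessing import Pool

def analyse_experiment(experiment_dir):
    # placeholder: detect, track and analyse all frames in one experiment
    return experiment_dir, "done"

if __name__ == "__main__":
    experiments = [f"experiment_{i:02d}" for i in range(9)]
    with Pool(processes=4) as pool:           # run 4 experiments in parallel
        for name, status in pool.imap_unordered(analyse_experiment, experiments):
            print(name, status)
```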

3.3. CTMC Dataset

EPIC outperformed Viterbi and DeepSORT in cell tracking accuracy on the CTMC dataset (Supplementary Materials, Additional Files S1 and S4). Therefore, in addition to accurately tracking hundreds of detected cells in our wound repair dataset, EPIC also accurately detected and tracked cells in the CTMC dataset, which featured a higher framerate, higher resolution, 14 different cell lines and cells with extended cytoplasm. The sustained cell tracking accuracy of EPIC and its outperformance of other automated tools in the distinctly different and challenging dataset indicates robust underlying detection and tracking methods, and a system that is not limited to airway epithelial repair for children with asthma but can be applied in other cellular contexts.

3.4. Software Features and Report

We utilized object-oriented software design principles, such as the Factory Method creational design pattern, to allow developers to easily substitute custom object detectors and even tracking algorithms into EPIC to better suit other use cases, making the system highly flexible. For researchers using EPIC ‘out-of-the-box’, it is a cross-platform application that is simple to use through a command line or graphical user interface. Through the available commands, users can perform object detection, tracking and analysis of time lapse images in common formats, such as TIFF and JPEG. EPIC also automatically performs tracking and migration analyses of all cells in wound repair experiments. Generated cell tracks can be exported in multiple formats, such as the ImageJ Manual Tracking File [62] and MOTChallenge [63] formats, for further external analyses. Importantly, EPIC generates an HTML report containing cell migration metrics, publication-ready figures such as cell trajectory plots, images and videos visualizing cell detections and tracks, and statistics such as the number of detected cells per frame (Additional File S5). Overall, EPIC can easily integrate into drug screening pipelines ‘out-of-the-box’ and is straightforward to use for non-programmers.
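As an illustration of the Factory Method idea described above, the sketch below registers detector classes by name so that a custom detector can be swapped in without modifying the tracking code; the class and function names are illustrative and are not EPIC's actual API.

```python
# Hedged sketch of a Factory Method style detector registry; names are
# illustrative, not EPIC's actual classes.
from abc import ABC, abstractmethod

class Detector(ABC):
    @abstractmethod
    def detect(self, image):
        """Return a list of (x, y, w, h) bounding boxes for one image."""

class ViTDetector(Detector):
    def detect(self, image):
        return []          # placeholder for a Vision Transformer model

class ThresholdDetector(Detector):
    def detect(self, image):
        return []          # placeholder for a classical baseline

DETECTOR_REGISTRY = {"vit": ViTDetector, "threshold": ThresholdDetector}

def create_detector(name: str) -> Detector:
    """Factory method: look the detector class up by name and instantiate it."""
    return DETECTOR_REGISTRY[name]()

detector = create_detector("vit")
print(type(detector).__name__)   # ViTDetector
```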

4. Conclusions

EPIC automatically tracked unstained cells (including drug-treated and untreated control) as accurately as manual cell tracking in a challenging low-resolution, low-framerate and high cell density dataset at higher volume and speed. This is unlike tested publicly available automated trackers, which underperformed on such a challenging dataset and hence cannot be reliably used for high-throughput wound repair analyses. EPIC also outperformed the same trackers on the diverse and challenging CTMC dataset, which includes 14 different cell lines, reinforcing that EPIC is not limited to airway epithelial repair for children with asthma but can be applied in other cellular contexts. EPIC tracks cells detected using state-of-the-art AI-based Vision Transformers and bounding boxes with a custom tracking algorithm. This results in highly accurate cell tracking with decreased labor and time necessary for training data generation compared to widely used segmentation-based approaches. We expect our open-source and easy-to-use software to enable high-throughput drug screening targeting airway epithelial repair for children with asthma.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/jpm12050809/s1, Additional Files S1–S5.

Author Contributions

Conceptualization, Y.V.K., T.I., T.R., S.M.S. and A.K.; methodology, A.G., Y.V.K., T.I. and T.R.; software, A.G.; validation, A.G., Y.V.K. and T.I.; formal analysis, A.G., Y.V.K. and T.I.; investigation, T.I.; resources, Y.V.K., T.I., T.R., S.M.S. and A.K.; data curation, A.G., Y.V.K. and T.I.; writing—original draft preparation, A.G., Y.V.K. and T.I.; writing—review and editing, A.G., Y.V.K. and T.I.; visualization, A.G., Y.V.K. and T.I.; supervision, T.R., Y.V.K. and T.I.; project administration, Y.V.K., T.I., T.R., S.M.S. and A.K.; funding acquisition, Y.V.K., T.I., T.R., S.M.S. and A.K. All authors have read and agreed to the published version of the manuscript.

Funding

This work was funded by the Wal-yan Respiratory Research Centre Inspiration Award (2020), BHP-Telethon Kids Blue Sky Award (2019) and Cystic Fibrosis Charitable Endowment Charles Bateman Charitable Trust. S.M.S. is an NHMRC Practitioner Fellow (NHMRC1117668). A.K. is a Rothwell Family Fellow.

Institutional Review Board Statement

Not applicable as this study utilized a commercially-available immortalized airway epithelial cell line, NuLi-1 (ATCC, Manassas, VA, USA).

Data Availability Statement

The datasets generated and/or analyzed during the current study are available in the developed software’s repository along with the code, https://github.com/AlphonsG/EPIC-BBox-Cell-Tracking (accessed on 9 February 2022), and MOTChallenge website, https://motchallenge.net/data/CTMC-v1/ (accessed on 9 February 2022).

Acknowledgments

The authors would like to thank Samantha McLean and Scott Winslow for assisting with cell culture maintenance.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Allahverdian, S. Basic Mechanism of Airway Epithelial Repair: Role of IL-13 and EGFR Glycosylation. Ph.D. Thesis, University of British Columbia, Vancouver, BC, Canada, 2008. [Google Scholar]
  2. Kicic, A.; Hallstrand, T.S.; Sutanto, E.N.; Stevens, P.T.; Kobor, M.S.; Taplin, C.; Paré, P.D.; Beyer, R.P.; Stick, S.M.; Knight, D.A. Decreased fibronectin production significantly contributes to dysregulated repair of asthmatic epithelium. Am. J. Respir. Crit. Care Med. 2010, 181, 889–898. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  3. Kicic, A.; Sutanto, E.N.; Stevens, P.T.; Knight, D.A.; Stick, S.M. Intrinsic biochemical and functional differences in bronchial epithelial cells of children with asthma. Am. J. Respir. Crit. Care Med. 2006, 174, 1110–1118. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  4. Lin, J.-Y.; Lo, K.-Y.; Sun, Y.-S. A microfluidics-based wound-healing assay for studying the effects of shear stresses, wound widths, and chemicals on the wound-healing process. Sci. Rep. 2019, 9, 20016. [Google Scholar] [CrossRef] [PubMed]
  5. Iosifidis, T.; Sutanto, E.N.; Buckley, A.G.; Coleman, L.; Gill, E.E.; Lee, A.H.; Ling, K.M.; Hillas, J.; Looi, K.; Garratt, L.W.; et al. Aberrant cell migration contributes to defective airway epithelial repair in childhood wheeze. JCI Insight 2020, 5, e133125. [Google Scholar] [CrossRef] [Green Version]
  6. Ranzato, E.; Martinotti, S.; Burlando, B. Wound healing properties of jojoba liquid wax: An in vitro study. J. Ethnopharmacol. 2011, 134, 443–449. [Google Scholar] [CrossRef]
  7. Pinto, B.I.; Tabor, A.J.; Stearns, D.M.; Diller, R.B.; Kellar, R.S. A Bench-Top In Vitro Wound Assay to Demonstrate the Effects of Platelet-Rich Plasma and Depleted Uranium on Dermal Fibroblast Migration. Appl. In Vitro Toxicol. 2016, 2, 151–156. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  8. Pinto, B.I.; Cruz, N.D.; Lujan, O.R.; Propper, C.R.; Kellar, R.S. In Vitro Scratch Assay to Demonstrate Effects of Arsenic on Skin Cell Migration. JoVE 2019, 144, e58838. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  9. Liang, C.-C.; Park, A.; Guan, J.-L. In vitro scratch assay: A convenient and inexpensive method for analysis of cell migration in vitro. Nat. Protoc. 2007, 2, 329–333. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  10. Meijering, E.; Dzyubachyk, O.; Smal, I. Methods for cell and particle tracking. Methods Enzymol. 2012, 504, 183–200. [Google Scholar]
  11. Cordelières, F.P.; Petit, V.; Kumasaka, M.; Debeir, O.; Letort, V.; Gallagher, S.J.; Larue, L. Automated Cell Tracking and Analysis in Phase-Contrast Videos (iTrack4U): Development of Java Software Based on Combined Mean-Shift Processes. PLoS ONE 2013, 8, e81266. [Google Scholar] [CrossRef]
  12. Sacan, A.; Ferhatosmanoglu, H.; Coskun, H. CellTrack: An open-source software for cell tracking and motility analysis. Bioinformatics 2008, 24, 1647–1649. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  13. Aow Yong, L.Y.; Sulong, G. Automated cell migration tracking technique: A review. J. Teknol. 2015, 75. [Google Scholar] [CrossRef] [Green Version]
  14. Emami, N.; Sedaei, Z.; Ferdousi, R. Computerized cell tracking: Current methods, tools and challenges. Vis. Inform. 2021, 5, 1–13. [Google Scholar] [CrossRef]
  15. Ulman, V.; Maška, M.; Magnusson, K.E.G.; Ronneberger, O.; Haubold, C.; Harder, N.; Matula, P.; Matula, P.; Svoboda, D.; Radojevic, M.; et al. An objective comparison of cell-tracking algorithms. Nat. Methods 2017, 14, 1141–1152. [Google Scholar] [CrossRef] [PubMed]
  16. Tsai, H.-F.; Gajda, J.; Sloan, T.F.W.; Rares, A.; Shen, A.Q. Usiigaci: Instance-aware cell tracking in stain-free phase contrast microscopy enabled by machine learning. SoftwareX 2019, 9, 230–237. [Google Scholar] [CrossRef]
  17. Al-Zaben, N.; Medyukhina, A.; Dietrich, S.; Marolda, A.; Hünniger, K.; Kurzai, O.; Figge, M.T. Automated tracking of label-free cells with enhanced recognition of whole tracks. Sci. Rep. 2019, 9, 3317. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  18. Piccinini, F.; Kiss, A.; Horvath, P. CellTracker (not only) for dummies. Bioinformatics 2016, 32, 955–957. [Google Scholar] [CrossRef]
  19. Hayashida, J.; Bise, R. Cell tracking with deep learning for cell detection and motion estimation in low-frame-rate. In Medical Image Computing and Computer Assisted Intervention–MICCAI 2019; Springer: Berlin/Heidelberg, Germany, 2019; pp. 397–405. [Google Scholar]
  20. Bise, R.; Kanade, T.; Yin, Z.; Huh, S. Automatic cell tracking applied to analysis of cell migration in wound healing assay. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society IEEE Engineering in Medicine and Biology Society Conference, Boston, MA, USA, 30 August–3 September 2011; Volume 2011, pp. 6174–6179. [Google Scholar]
  21. Van Valen, D.A.; Kudo, T.; Lane, K.M.; Macklin, D.N.; Quach, N.T.; DeFelice, M.M.; Maayan, I.; Tanouchi, Y.; Ashley, E.A.; Covert, M.W. Deep Learning Automates the Quantitative Analysis of Individual Cells in Live-Cell Imaging Experiments. PLoS Comput. Biol. 2016, 12, e1005177. [Google Scholar] [CrossRef] [Green Version]
  22. Hayashida, J.; Nishimura, K.; Bise, R. MPM: Joint Representation of Motion and Position Map for Cell Tracking. In Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 13–19 June 2020; pp. 3822–3831. [Google Scholar]
  23. Nishimura, K.; Hayashida, J.; Wang, C.; Ker, D.F.E.; Bise, R. Weakly-supervised cell tracking via backward-and-forward propagation. In Computer Vision–ECCV 2020; Vedaldi, A., Bischof, H., Brox, T., Frahm, J.-M., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 104–121. [Google Scholar]
  24. Hernandez, D.E.; Chen, S.W.; Hunter, E.E.; Steager, E.B.; Kumar, V. Cell Tracking with Deep Learning and the Viterbi Algorithm. In Proceedings of the 2018 International Conference on Manipulation, Automation and Robotics at Small Scales (MARSS), Nagoya, Japan, 4–8 July 2018; pp. 1–6. [Google Scholar]
  25. DuChez, B.J. Automated Tracking of Cell Migration with Rapid Data Analysis. Curr. Protoc. Cell Biol. 2017, 76, 12–16. [Google Scholar] [CrossRef]
  26. Fu, K.S.; Mui, J.K. A survey on image segmentation. Pattern Recognit. 1981, 13, 3–16. [Google Scholar] [CrossRef]
  27. Hilsenbeck, O.; Schwarzfischer, M.; Skylaki, S.; Schauberger, B.; Hoppe, P.S.; Loeffler, D.; Kokkaliaris, K.D.; Hastreiter, S.; Skylaki, E.; Filipczyk, A.; et al. Software tools for single-cell tracking and quantification of cellular and molecular properties. Nat. Biotechnol. 2016, 34, 703–706. [Google Scholar] [CrossRef] [PubMed]
  28. Zhi, X.-H.; Meng, S.; Shen, H.-B. High density cell tracking with accurate centroid detections and active area-based tracklet clustering. Neurocomputing 2018, 295, 86–97. [Google Scholar] [CrossRef]
  29. Hilsenbeck, O.; Schwarzfischer, M.; Loeffler, D.; Dimopoulos, S.; Hastreiter, S.; Marr, C.; Theis, F.J.; Schroeder, T. fastER: A user-friendly tool for ultrafast and robust cell segmentation in large-scale microscopy. Bioinformatics 2017, 33, 2020–2028. [Google Scholar] [CrossRef] [PubMed]
  30. Kasprowicz, R.; Suman, R.; O’Toole, P. Characterising live cell behaviour: Traditional label-free and quantitative phase imaging approaches. Int. J. Biochem. Cell Biol. 2017, 84, 89–95. [Google Scholar] [CrossRef] [PubMed]
  31. Ghosh, S.; Das, N.; Das, I.; Maulik, U. Understanding Deep Learning Techniques for Image Segmentation. ACM Comput. Surv. 2019, 52, 1–35. [Google Scholar] [CrossRef] [Green Version]
  32. Moen, E.; Borba, E.; Miller, G.; Schwartz, M.; Bannon, D.; Koe, N.; Camplisson, I.; Kyme, D.; Pavelchek, C.; Price, T. Accurate cell tracking and lineage construction in live-cell imaging experiments with deep learning. bioRxiv. 2019, 803205. [Google Scholar] [CrossRef]
  33. Lugagne, J.-B.; Lin, H.; Dunlop, M.J. DeLTA: Automated cell segmentation, tracking, and lineage reconstruction using deep learning. PLoS Comput. Biol. 2020, 16, e1007673. [Google Scholar] [CrossRef] [Green Version]
  34. Dai, J.; He, K.; Sun, J. BoxSup: Exploiting Bounding Boxes to Supervise Convolutional Networks for Semantic Segmentation. In Proceedings of the 2015 IEEE International Conference on Computer Vision (ICCV), Washington, DC, USA, 7–13 December 2015; pp. 1635–1643. [Google Scholar]
  35. Hu, R.; Dollár, P.; He, K.; Darrell, T.; Girshick, R. Learning to Segment Every Thing. In Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 4233–4241. [Google Scholar]
  36. Anjum, S.; Gurari, D. CTMC: Cell Tracking with Mitosis Detection Dataset Challenge. In Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Seattle, WA, USA, 14–19 June 2020; pp. 4228–4237. [Google Scholar]
  37. Dendorfer, P.; Osep, A.; Milan, A.; Schindler, K.; Cremers, D.; Reid, I.; Roth, S.; Leal-Taixé, L. MOTChallenge: A Benchmark for Single-Camera Multiple Target Tracking. Int. J. Comput. Vis. 2021, 129, 845–881. [Google Scholar] [CrossRef]
  38. Luo, W.; Xing, J.; Milan, A.; Zhang, X.; Liu, W.; Kim, T.-K. Multiple object tracking: A literature review. Artif. Intell. 2021, 293, 103448. [Google Scholar] [CrossRef]
  39. Ciaparrone, G.; Luque Sánchez, F.; Tabik, S.; Troiano, L.; Tagliaferri, R.; Herrera, F. Deep learning in video multi-object tracking: A survey. Neurocomputing 2020, 381, 61–88. [Google Scholar] [CrossRef] [Green Version]
  40. Zhang, Y.; Sun, P.; Jiang, Y.; Yu, D.; Weng, F.; Yuan, Z.; Luo, P.; Liu, W.; Wang, X. ByteTrack: Multi-Object Tracking by Associating Every Detection Box. arXiv 2022, arXiv:2110.06864. [Google Scholar] [CrossRef]
  41. Chu, P.; Wang, J.; You, Q.; Ling, H.; Liu, Z. TransMOT: Spatial-Temporal Graph Transformer for Multiple Object Tracking. arXiv 2021, arXiv:2104.00194. [Google Scholar] [CrossRef]
  42. He, K.; Gkioxari, G.; Dollár, P.; Girshick, R. Mask R-CNN. In Proceedings of the 2017 IEEE International Conference on Computer Vision (ICCV), Venice, Italy, 22–29 October 2017; pp. 2980–2988. [Google Scholar]
  43. Chen, Y.; Song, Y.; Zhang, C.; Zhang, F.; O’Donnell, L.; Chrzanowski, W.; Cai, W. Celltrack R-CNN: A Novel End-To-End Deep Neural Network For Cell Segmentation And Tracking in Microscopy Images. In Proceedings of the 2021 IEEE 18th International Symposium on Biomedical Imaging (ISBI), Nice, France, 13–16 April 2021; pp. 779–782. [Google Scholar]
  44. Kisantal, M.; Wojna, Z.; Murawski, J.; Naruniec, J.; Cho, K. Augmentation for Small Object Detection; Aircc Publishing Corporation: Chennai, India, 2019. [Google Scholar]
  45. Chen, Y.; Quelhas, P.; Campilho, A. Low frame rate cell tracking: A Delaunay graph matching approach. In Proceedings of the 2011 IEEE International Symposium on Biomedical Imaging: From Nano to Macro, Chicago, IL, USA, 30 March–2 April 2011; pp. 1015–1018. [Google Scholar]
  46. Zhou, Z.; Wang, F.; Xi, W.; Chen, H.; Gao, P.; He, C. Joint Multi-frame Detection and Segmentation for Multi-cell Tracking. In Image and Graphics; Zhao, Y., Barnes, N., Chen, B., Westermann, R., Kong, X., Lin, C., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 435–446. [Google Scholar]
  47. Ling, H.; Wu, Y.; Blasch, E.; Chen, G.; Lang, H.; Bai, L. Evaluation of visual tracking in extremely low frame rate wide area motion imagery. In Proceedings of the 14th International Conference on Information Fusion, Chicago, IL, USA, 5–8 July 2011; pp. 1–8. [Google Scholar]
  48. Barry, D.J.; Durkin, C.H.; Abella, J.V.; Way, M. Open source software for quantification of cell migration, protrusions, and fluorescence intensities. J. Cell Biol. 2015, 209, 163–180. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  49. Hu, T.; Xu, S.; Wei, L.; Zhang, X.; Wang, X. CellTracker: An automated toolbox for single-cell segmentation and tracking of time-lapse microscopy images. Bioinformatics 2021, 37, 285–287. [Google Scholar] [CrossRef] [PubMed]
  50. Winter, M.; Mankowski, W.; Wait, E.; Temple, S.; Cohen, A.R. LEVER: Software tools for segmentation, tracking and lineaging of proliferating cells. Bioinformatics 2016, 32, 3530–3531. [Google Scholar] [CrossRef] [Green Version]
  51. Dosovitskiy, A.; Beyer, L.; Kolesnikov, A.; Weissenborn, D.; Zhai, X.; Unterthiner, T.; Dehghani, M.; Minderer, M.; Heigold, G.; Gelly, S.; et al. An Image is Worth 16 × 16 Words: Transformers for Image Recognition at Scale. In Proceedings of the International Conference on Learning Representations, Virtual Event, Austria, 3–7 May 2021. [Google Scholar]
  52. Gwatimba, A.; Ho, J.; Iosifidis, T.; Karpievitch, Y.V. Rainbow: Automated air-liquid interface cell culture analysis using deep optical flow. J. Open Source Softw. 2022, 7, 4080. [Google Scholar] [CrossRef]
  53. Zabner, J.; Karp, P.; Seiler, M.; Phillips, S.L.; Mitchell, C.J.; Saavedra, M.; Welsh, M.; Klingelhutz, A.J. Development of cystic fibrosis and noncystic fibrosis airway cell lines. Am. J. Physiol.-Lung Cell. Mol. Physiol. 2003, 284, L844–L854. [Google Scholar] [CrossRef] [Green Version]
  54. Looi, K.; Troy, N.M.; Garratt, L.W.; Iosifidis, T.; Bosco, A.; Buckley, A.G.; Ling, K.M.; Martinovich, K.M.; Kicic-Starcevich, E.; Shaw, N.C.; et al. Effect of human rhinovirus infection on airway epithelium tight junction protein disassembly and transepithelial permeability. Exp. Lung Res. 2016, 42, 380–395. [Google Scholar] [CrossRef] [Green Version]
  55. Maška, M.; Ulman, V.; Svoboda, D.; Matula, P.; Matula, P.; Ederra, C.; Urbiola, A.; España, T.; Venkatesan, S.; Balak, D.M.; et al. A benchmark for comparison of cell tracking algorithms. Bioinformatics 2014, 30, 1609–1617. [Google Scholar] [CrossRef]
  56. Roth, M.; Bäuml, M.; Nevatia, R.; Stiefelhagen, R. Robust multi-pose face tracking by multi-stage tracklet association. In Proceedings of the 21st International Conference on Pattern Recognition (ICPR2012), Tsukuba, Japan, 11–15 November 2012; pp. 1012–1016. [Google Scholar]
  57. Wang, G.; Wang, Y.; Zhang, H.; Gu, R.; Hwang, J.-N. Exploit the Connectivity: Multi-Object Tracking with TrackletNet. In Proceedings of the 27th ACM International Conference on Multimedia, Nice, France, 21–25 October 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 482–490. [Google Scholar]
  58. Ma, C.; Yang, C.; Yang, F.; Zhuang, Y.; Zhang, Z.; Jia, H.; Xie, X. Trajectory Factory: Tracklet Cleaving and Re-Connection by Deep Siamese Bi-GRU for Multiple Object Tracking. In Proceedings of the 2018 IEEE International Conference on Multimedia and Expo (ICME), San Diego, CA, USA, 23–27 July 2018; pp. 1–6. [Google Scholar]
  59. Magnusson, K.E.G.; Jaldén, J.; Gilbert, P.M.; Blau, H.M. Global Linking of Cell Tracks Using the Viterbi Algorithm. IEEE Trans. Med. Imaging 2015, 34, 911–929. [Google Scholar] [CrossRef] [Green Version]
  60. Magnusson, K. klasma/BaxterAlgorithms. 2021. Available online: https://github.com/klasma/BaxterAlgorithms (accessed on 17 September 2021).
  61. Wojke, N.; Bewley, A.; Paulus, D. Simple online and realtime tracking with a deep association metric. In Proceedings of the 2017 IEEE International Conference on Image Processing (ICIP), Beijing, China, 17–20 September 2017; pp. 3645–3649. [Google Scholar]
  62. Cordelières, F. Manual Tracking. 2004. Available online: https://imagej.nih.gov/ij/plugins/manual-tracking.html (accessed on 19 October 2021).
  63. Milan, A.; Leal-Taixe, L.; Reid, I.; Roth, S.; Schindler, K. MOT16: A Benchmark for Multi-Object Tracking. arXiv 2016, arXiv:1603.00831. [Google Scholar]
Figure 1. CTMC and wound repair dataset comparison. (A,B): A cell detected in the CTMC (A) and wound repair (B) dataset in two consecutive frames (left and right panels). Due to higher framerate, the cell displacement in panel (A) is almost unnoticeable compared to the cell in panel (B) captured at low framerate, which moved far away. (C): Equal sized image crops showing the difference in cell density between the CTMC (left) and wound repair dataset (right).
Figure 2. EPIC cell detection in a low-resolution and high-density wound repair image of the control experiment type. Left panel: unlabeled raw image, middle panel: same image with cell detections marked with blue bounding boxes, and right panel: enlarged region of middle panel showing accurate cell detections (blue boxes) and a large unlabeled region of cell debris (indicated by red arrow).
Figure 3. Cell trajectories of the leading edge cells tracked using manual cell tracking, EPIC and Viterbi in a delayed, control and accelerated experiment. Cell trajectories generated using manual cell tracking and EPIC indicate that cells primarily migrated in a positive vertical direction towards the wound region. In contrast, cell trajectories generated using Viterbi do not resemble leading edge cell tracks, instead suggesting that cells moved in more dispersed horizontal and vertical directions, including away from the wound area, contradicting EPIC and manual cell trajectories.
Figure 4. Cell migration metrics produced by manual cell tracking, EPIC and Viterbi. Various shapes and bars represent the mean and standard deviations, respectively, of cell migration metrics produced by manual (black circle), EPIC (blue triangle) and Viterbi (red star) cell tracking in delayed (Del), control (Con) or accelerated (Acc) experiments.
Figure 5. Comparison of the cell migration metrics produced by EPIC and Viterbi to manual cell tracking. Each symbol represents a p-value for pairwise comparisons of the sampled cell tracks from EPIC and manual cell tracking (blue filled) and Viterbi and manual cell tracking (black empty) for the delayed, control and accelerated experiments. We performed pairwise comparisons using two-sample Wilcoxon–Mann–Whitney tests. The statistical significance level was set to p < 0.05 (indicated by the dashed grey line). The solid grey line indicates y = 0. Metrics are shown in the following order and are abbreviated for clarity in the figure: Euclidean distance, accumulated distance, velocity, directionality, Y-forward migration index and end point angle.
Figure 6. Automatically identified leading edges visualized as two horizontal red lines in the first frame of a control experiment.
Table 1. A summary of the number of cells tracked from the 1st to the 22nd frame without fragmentation by EPIC, DeepSORT and Viterbi.
Experiment (Replicate)   EPIC (Total / Sampled Cell Tracks)   DeepSORT (Total / Sampled Cell Tracks)   Viterbi (Total / Sampled Cell Tracks)
Accelerated (A)          127 / 20                             0 / 0                                    2046 / 20
Accelerated (B)          236 / 20                             0 / 0                                    2008 / 20
Accelerated (C)          211 / 20                             0 / 0                                    2019 / 20
Control (A)              783 / 20                             0 / 0                                    0 / 0
Control (B)              539 / 20                             0 / 0                                    37 / 4
Control (C)              586 / 20                             0 / 0                                    0 / 0
Delayed (A)              146 / 20                             0 / 0                                    1883 / 20
Delayed (B)              725 / 20                             0 / 0                                    2726 / 20
Delayed (C)              1006 / 20                            0 / 0                                    13 / 2
Table 2. Total runtimes for EPIC, DeepSORT and Viterbi in the 9 experiments.
                     EPIC     DeepSORT   Viterbi
Total Running Time   20 min   30 h       3 h
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

