Article
Peer-Review Record

Fossel: Efficient Latency Reduction in Approximating Streaming Sensor Data

Sustainability 2020, 12(23), 10175; https://doi.org/10.3390/su122310175
by Fatima Abdullah, Limei Peng and Byungchul Tak *
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Reviewer 3: Anonymous
Reviewer 4: Anonymous
Submission received: 13 November 2020 / Revised: 2 December 2020 / Accepted: 3 December 2020 / Published: 5 December 2020
(This article belongs to the Special Issue IoT Data Processing and Analytics for Computational Sustainability)

Round 1

Reviewer 1 Report

The authors have comprehensively updated the manuscript based on the reviewers' comments; therefore, an accept decision is recommended.

Author Response

Thank you for your review.

Reviewer 2 Report

In my opinion, the authors have satisfied all of the reviewers' requests.

Author Response

Thank you for your review.

Reviewer 3 Report

Thank you for those revisions.

Author Response

Thank you for your review.

Reviewer 4 Report

The authors answered all of my questions and suggestions. I suggest accepting the paper for publication in its current version.

Author Response

Thank you for your review.

This manuscript is a resubmission of an earlier submission. The following is a list of the peer review reports and author responses from that submission.


Round 1

Reviewer 1 Report

Summary: This article presents a latency optimization strategy for streaming IoT data in fog networks. The evaluations show latency improvements from reductions in both processing and network delays. The authors use a reservoir sampling technique to reduce the number of computations performed while processing IoT data.

Detailed Comments:

  1. This article presents an optimization technique based on reservoir sampling. It might be difficult for the reader to understand the sampling process of an IoT sensor over a data stream. Therefore, can you present this pictorially, perhaps with a figure?
  2. I think the general convention for a contracted author list is, e.g., “Doe et al.”; however, several occurrences do not follow this practice, so please double-check: “Rabkinet.al”, “Prosperi et.al”, etc. Similarly, the general convention is to write “etc.”, but the authors use “etc”. Please double-check for such occurrences throughout the article.
  3. Does Fig. 2 show the system module, a fog node, or an end-to-end telematics system over the network? Can you provide more context for the figure?
  4. In Table 1, what is the source of the latency reduction? Is it from the sampling process, from processing delay improvements, or from bandwidth conservation resulting in a reduced queue length? Please explain.
  5. How does your method improve end-to-end latency, especially for ultra-low-latency communications and applications?
  6. Can you show and describe the latency components that exist in IoT fog deployments for streaming sensor data?
  7. In the proposed sampling process, how do you dynamically prioritize critical sensor data?
  8. Tables 3 and 4 show marginal improvements; for Table 4, can the data be translated into actual battery savings, perhaps through analytical evaluations?
  9. For Fig. 7, what are QEF and QEC? Please explain.
  10. Citations can be improved; for instance, consider reviewing:
    1. Schulz, Philipp, et al. "Latency critical IoT applications in 5G: Perspective on the design of radio interface and network architecture." IEEE Communications Magazine 55.2 (2017): 70-78.
    2. Nasrallah, Ahmed, et al. "Ultra-low latency (ULL) networks: The IEEE TSN and IETF DetNet standards and related 5G ULL research." IEEE Communications Surveys & Tutorials 21.1 (2018): 88-145.
    3. Maiti, Prasenjit, et al. "An effective approach of latency-aware fog smart gateways deployment for iot services." Internet of Things 8 (2019): 100091.

Reviewer 2 Report

The paper describes the authors' work on an approach for the reduction of latency in sensor data streaming in edge/IoT architectures.

The problem is relevant, and the authors correctly identify the metrics needed to evaluate the quality of their solution. The proposed approach is compared to another solution from the literature by means of a simulator.

 

The paper needs very thorough proofreading: the English needs revision, and there are dangling sentences and many typos. This seriously affects the presentation and limits the reader's ability to appreciate the value of the contribution.

The authors should improve the problem statement and better position their work with respect to the literature. The description of the algorithm should also be improved. In Section 3.2, the sentence "Now it is clear" should be avoided.

The organization of the proposed system and the role and nature of the fog nodes and layers should, in general, be further clarified. Given the potentially high number of layers, I expected simulated tests with a larger number of sensing nodes and with a wider variety of scenarios, in order to actually demonstrate scalability and capture the effects of a larger installation (which is not unlikely in IoT solutions for the applications that the authors point out as being of possible interest). Please improve the description of Figure 2.

The proposed solution is presented at a very high level of abstraction. This is positive, but an evaluation of general performance at that abstract level should be provided to prove the advantages of the approach (e.g., an analysis in terms of computational complexity, or other possible analyses).

The experimental section should provide more details on the simulated setup, motivating the choices and, to define the context, detailing the kind of actual technologies to which the authors refer; in general, the parameters used to dimension the simulations should be motivated (what are the sources of such values? How are the energy consumption parameters estimated?). The choice of the simulator should also be motivated and the simulator itself presented, as, to the best of my knowledge, it is not widely known to the community (e.g., what are the advantages of using it instead of more popular simulators such as, let us say, ns3? A systematic literature review is available on the use of ns3 for IoT/WSN; is any similar work, or a survey, available about the proposed simulator?). As the authors are proposing a general solution, more cases should be presented to allow a solid and thorough evaluation of the advantages of the proposed approach.

As only one other approach is used for comparison of the results, a detailed comparison of the two approaches should be provided (including a table comparing their main features and pros/cons).

 

The conclusions should be strengthened.

Reviewer 3 Report

This paper proposes a framework and an optimization algorithm for latency reduction utilizing a sampling technique. This can be a useful contribution to streaming data analytics.

I have a few comments:

  1. Please justify clearly (in the Introduction and in the Conclusion) how this paper fits the aims and scope of the journal.
  2. A more detailed description of the reservoir sampling would be required in Section 3.3. Are 'k' and 'K' the same? Please also explain the probability term 'k/i'.
  3. Figure 3: Can the plot be produced in a black-and-white, print-friendly form?

 

 

Reviewer 4 Report

Summary:

This paper addresses an important issue: the delays and latencies of data processing and networking in the context of IoT, fog, and cloud computing. The paper proposes techniques based on reservoir sampling, optimization, and approximate computing. The proposed approach efficiently reduces resource utilization compared to state-of-the-art approaches.

 

Some typos:

Line 38: “Due to the exponential growth of IoT data and, it has”

Line 145: “the effectiveness of proposed”

Line 213: “similarly all paths and there delays are initialize”

 

Strengths:

The paper is well written and organized.

The related work is well organized and clear.

There is enough information in the introduction to understand the problem and the proposed solution.

 

 

Comments:

Section 3.1: using “n” both for the number of data items and for the number of fog layers is misleading. It is also used for the number of paths in Section 3.5, line 202. The authors should use distinct notation for these different quantities.

Line 172: the authors explain how cluster and systematic sampling work without showing how the reservoir method works and why it is suited to this use case.

Section 3.3: I think there is a mistake in the explanation of the reservoir method. If K is the size of the sample in terms of the number of items, then K/i will be greater than or equal to 1 for the first K items (i ≤ K), which is not a probability (between 0 and 1).
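For reference, here is a minimal sketch of the standard reservoir sampling procedure (Algorithm R); the function and variable names are illustrative only and not taken from the manuscript. It shows that the first k items fill the reservoir directly, and the replacement probability k/i is applied only once i > k, so that probability always lies in (0, 1):

```python
import random

def reservoir_sample(stream, k):
    """Maintain a uniform random sample of k items from a stream of unknown length."""
    reservoir = []
    for i, item in enumerate(stream, start=1):
        if i <= k:
            # The first k items fill the reservoir directly (no probability involved).
            reservoir.append(item)
        else:
            # Item i is kept with probability k/i (< 1 because i > k);
            # if kept, it replaces a uniformly chosen slot in the reservoir.
            j = random.randint(1, i)
            if j <= k:
                reservoir[j - 1] = item
    return reservoir

# Example: sample 5 readings from a simulated stream of 1000 sensor values.
sample = reservoir_sample(range(1000), 5)
```

Clarifying in Section 3.3 whether the manuscript follows this scheme, and what k, K, and i denote, would resolve this concern.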

Line 215: if the average service rate and the average arrival rate of a fog node are constant, it seems that the delay will also be constant over time, which is not realistic.
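To make this point concrete, under a simple M/M/1 queueing assumption (used here purely for illustration; the manuscript's exact delay model may differ), the mean sojourn time at a fog node depends only on the arrival rate and the service rate:

\[ W = \frac{1}{\mu - \lambda}, \qquad \lambda < \mu, \]

so holding both rates constant does indeed yield a time-invariant average delay, whereas real IoT traffic is typically bursty and time-varying.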

Section 3.5: what is the frequency of calculating the new paths and optimal nodes for sampling? Is it done periodically or triggered by an event?

Section 4: The evaluation methodology is not clear. What kind of data set has been used? Synthetic or real?

What are the values of the parameters (number of layers, number of nodes, etc.)?

Section 4.5: the comparisons in the tables and figures are redundant; for example, Table 1 and Figure 3 present the same information.

 

General comment:

The authors state that they used approximate computing and optimization to reduce latencies, but they do not explain how the reservoir sampling approach outperforms SRS, which was used in reference 10. It would be interesting to test different sampling approaches and to show the differences between them and how they interact with the optimization approach.
