Article
Peer-Review Record

Method of Validating Satellite Surface Reflectance Product Using Empirical Line Method

Remote Sens. 2023, 15(9), 2240; https://doi.org/10.3390/rs15092240
by Meghraj K C 1,2, Larry Leigh 2,*, Cibele Teixeira Pinto 2 and Morakot Kaewmanee 2
Reviewer 1:
Reviewer 2: Anonymous
Reviewer 3:
Reviewer 4:
Submission received: 14 March 2023 / Revised: 18 April 2023 / Accepted: 21 April 2023 / Published: 23 April 2023

Round 1

Reviewer 1 Report

This work provides an in-depth investigation of the development and evaluation of the Landsat L2 surface reflectance products. The Empirical Line Method (ELM) is validated against several limited ground measurements for a bright target (Algodones) and against AVIRIS for a dark target (Salton Sea). The authors validated the surface reflectance products for L9-OLI-2, L8-OLI, and L5-TM SR, with excellent statistical results. This work is well organized and written, but there are some minor comments for clarification.

Major comments.

Line 650: The authors applied the ‘average of three cross-cal factors (0.993)’. What if 0.995 and 0.996 are closer to the truth and 0.988 was an extreme outlier? What is the uncertainty of your ASD ground measurements? Please explain the uncertainty of your absolute calibration.

Line 704: The same question applies to AVIRIS as a calibration reference. What is the uncertainty level of the AVIRIS-based absolute calibration? Were there any dates on which you could collect ASD, AVIRIS, and Landsat data at the same time?

Here are my minor comments to the authors.

Line 12: Please define L2C2. Is this a type of Landsat product?

Line 22-26: The results are reported as accuracy and precision for the Algodones Dunes and as MAE for the Salton Sea. Could you be consistent? Please define accuracy and precision for your results; people have different perspectives on these terms as used in remote sensing.

Line 32: What does “0.5 to 1 reflectance units” mean? Do you mean 50% to 100%? Please specify these values as percentages.

Line 28: The validation results are presented as RMSE values and accuracies. Could you define RMSE in terms of which quantities? What is the relationship between RMSE and accuracy? Does RMSE represent precision? I found the equations in Section 3.3; you may want to briefly explain these values in the abstract.
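One common convention in radiometric validation, offered here as context rather than as the authors' definition, treats accuracy as the mean residual (bias), precision as the residual standard deviation, and RMSE as the combination of the two:

```python
import numpy as np

# Hypothetical residuals: predicted SR minus reference SR (reflectance units).
# These values are illustrative only, not the paper's data.
residuals = np.array([0.004, -0.002, 0.003, 0.001, -0.005, 0.002])

accuracy = residuals.mean()          # systematic offset (bias)
precision = residuals.std(ddof=0)    # scatter about the bias
rmse = np.sqrt((residuals ** 2).mean())

# With the population std (ddof=0), the three are linked exactly:
# rmse^2 == accuracy^2 + precision^2
assert np.isclose(rmse ** 2, accuracy ** 2 + precision ** 2)
```

Under this convention, RMSE alone does not separate bias from scatter, which is why reporting accuracy and precision explicitly (as the reviewer requests) is informative.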

Line 49: A brief definition of (L2C2) is needed in abstract.

Line 53: How do you know your results are precise and accurate? What is your reference? Please specify briefly.

Line 123: Please add a Landsat RGB image for the Algodones Dunes and Salton Sea, which will give readers a better understanding of your targets, or place Figure 2 here. Please also explain the path/row (WRS) system here.

Line 220: I assume there were field campaigns in which you collected Landsat, ASD, and AVIRIS data at the same time, but there is no Field Campaign section.

Line 241: Please define SBAF.
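For context, SBAF (Spectral Band Adjustment Factor) is commonly computed by weighting a hyperspectral reflectance profile with the relative spectral responses (RSRs) of the two sensors being compared; this is the general convention, not necessarily the authors' exact formulation. A schematic sketch with made-up curves:

```python
import numpy as np

# Hypothetical hyperspectral surface reflectance profile (wavelength in nm);
# a slowly varying spectrum for illustration only.
wl = np.linspace(500.0, 600.0, 101)
rho = 0.2 + 0.001 * (wl - 500.0)

# Made-up Gaussian RSRs for two sensors' green bands (illustrative widths/centers)
rsr_a = np.exp(-0.5 * ((wl - 555.0) / 15.0) ** 2)
rsr_b = np.exp(-0.5 * ((wl - 560.0) / 20.0) ** 2)

def band_reflectance(rho, rsr):
    # Band-averaged reflectance: RSR-weighted mean of the spectrum
    # (uniform wavelength grid, so the grid spacing cancels)
    return np.sum(rho * rsr) / np.sum(rsr)

# SBAF adjusts sensor B's band-averaged reflectance to sensor A's spectral response
sbaf = band_reflectance(rho, rsr_a) / band_reflectance(rho, rsr_b)
```

Multiplying sensor B's measurement by this ratio compensates for the two bands sampling slightly different parts of the surface spectrum.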

Line 268: Which sensor / date was used for Figure 2?

Line 288: Is there any reason that the authors use ‘Absolute’ for the surface reflectance model?

Line 403: Does this equation work for all bands, or is it only applicable to certain bands?

Line 490-491: The authors claim that applying the cross-cal factor gives the absolute calibration. So are the AVIRIS and ASD reflectance trends considered absolute calibration? Have the authors tried cross-calibrating ASD against AVIRIS to confirm their similarity?

Line 808: Excellent results. What would be the possible reasons for the slightly larger differences in the short-wavelength bands? Could this be a possible on-orbit calibration (solar-diffuser-based calibration) error? How many collections did the authors use for this table? When was the collection time: early in the mission lifetime or later?

Line 856: Very nice results again. Please specify the test collections used to produce this table.

Line 898: Better results than the previous two. Again, please specify the test collections used to produce this table.

Author Response

Thank you for your valuable comments and suggestions. Please find the attached file containing our responses to the comments.

Author Response File: Author Response.pdf

Reviewer 2 Report

The authors introduce in this work a validation methodology for surface reflectance using ELM, and they use it to validate results from three sensors: L9-OLI-2, L8-OLI, and L5-TM. An SR model using the ELM method is first developed from L8 and later calibrated and compared with the actual SR products from the three missions. All considered, I find the authors did a detailed study covering different aspects of the method in order to improve the accuracy of the results. The results show that the proposed method is very competitive in terms of product validation. I also find the article clearly written, and I think it is essentially ready for publication, although I would suggest considering the following comments:

- When describing the ELM method and the use of DN, it is not specified exactly which DN values are meant. Perhaps this should be mentioned for those less familiar with Landsat; e.g., are these raw DN values from the sensor?

- Personally, I would have liked some more detail on calibrating the SR model, in particular, for the uncertainty study, the influence of the number of ground-truth measurements on the accuracy. Perhaps something to consider for a future study.

- I do not understand what is meant by the sentence in line 99: "Normalization produces the poorest result, which does not require additional information".

- I think lines 335-338 repeat the same explanation "Another ROI of Algodones dunes was taken to investigate decreasing negative trend, and the temporal trend of L2C2 SR data was observed on that ROI. While observing the same temporal trend, the decreasing trend of L2C2 SR was observed on another ROI of Algodones dunes."

- I do not see an explanation of why Figure 3 has fewer data points than Figure 11(a). Should it not be the same amount of data?

- The black line in Figure 5 is not distinguishable from the black points.

- The Monte Carlo analysis considers uncertainties which, in particular in the case of the ASD measurements, are based on repeatability. That does not account for other effects such as a systematic bias. I assume this is ruled out through the ASD lab calibration, but in practice it can be ruled out only to a certain level. Perhaps this should be mentioned.

- In Figure 11(a) and (c) one can actually see two sinusoidal behaviours, one on top of the other. I do not think this is explained; one can guess seasonal variation for the sinusoidal part, but I guess the shift between the two has to do with BRDF and view angles. Is that so?
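To illustrate the DN comment above: ELM reduces, per band, to a linear regression from image digital numbers to ground-measured reflectance over at least two contrasting targets. A minimal sketch with made-up numbers (target names and values are illustrative, not the paper's data):

```python
import numpy as np

# Hypothetical per-band data: mean DN over a dark and a bright target,
# paired with ground-measured surface reflectance (illustrative values only)
dn = np.array([7200.0, 21500.0])        # e.g. a dark water target, a bright dune target
reflectance = np.array([0.05, 0.42])

# ELM fits a gain and bias so that SR = gain * DN + bias
gain, bias = np.polyfit(dn, reflectance, 1)

# Applying the fit to a small hypothetical DN image yields surface reflectance
image_dn = np.array([[8000.0, 15000.0],
                     [20000.0, 12000.0]])
sr = gain * image_dn + bias
```

With only two targets the fit is exact at both anchor points; additional targets (or repeated campaigns) would turn this into an over-determined least-squares fit and allow an uncertainty estimate, which connects to the ground-truth sample-size comment above.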


Author Response

Thank you for your valuable comments and suggestions. Please find the attached file containing our responses to the comments.

Author Response File: Author Response.pdf

Reviewer 3 Report

The article is devoted to the important problem of estimating the quality of satellite information on ground surface reflectance. The method is based on reflectance models for the Algodones Dunes and Salton Sea sites. The manuscript shows that the considered model can be used to estimate the quality of these data for the Landsat 5, 8, and 9 satellites. To prove the applicability of these models, ground-based measurements for these areas of the ground surface are also used. Previous publications are given in sufficient detail and generally represent the current state of the problem well. The results presented have significant practical value. Nevertheless, I have a few questions, which are listed below:

1) Line 114: “The ELM approach has been used on relatively thin atmospheric layers primarily because it necessitates the identification of at least two homogenous targets with contrasting reflectance…” What is meant by “relatively thin atmospheric layers”? What limitations on the atmospheric optical thickness does the method have?

2) Are the models and approach used applicable to other parts of the ground surface?

3) In the text of Sections 2.1, 2.2, and 2.3, we can see that the descriptions of the Landsat 9, Landsat 8, and Landsat 5 satellites differ in structure: Section 2.1 describes the composition and application of the Landsat 9 sensors, as well as the improvements over previous satellites; Section 2.2 contains information about the spatial resolution of the Landsat 8 instruments, a description of the orbit, etc.; Section 2.3 provides the launch date, instrument names, a description of the satellite's orbit, etc. I recommend that the authors slightly modify these sections so that they are more uniform in the information they contain.

4) Line 257: “Algodones dunes and Salton Sea ROIs were selected based on previous knowledge of the sites and available ground measurements from ASD and AVIRIS, allowing us to use them in an ELM development.” If I am not mistaken, the source of the ground measurements is not given in the manuscript. Were the measurements made by the authors or taken from elsewhere? If these data are available, it is better to add an appropriate reference.

5) Line 265: “… via using using a time series anaylsis of data from Landsat 8.” Misprints.

6) Equation (2). How was this model obtained? Can it be considered a Taylor series up to quadratic terms for a function of 4 variables?

7) The coefficients of equation (2) are an important part of the obtained results. However, if I am not mistaken, their values are not given in the manuscript. I consider it necessary to include them in the supplementary materials.

8) Figures 4-6 show normalized reflectance units. If I am not mistaken, there is no explanation of how this normalization was performed.

9) Equations (4)-(5). For example, in [Vanhellemont, Q. and Ruddick, K. Turbid wakes associated with offshore wind turbines observed with Landsat 8. Remote Sensing of Environment 2014, 145, 105–115], the following factors are taken into account in satellite remote sensing of the water surface: 1) specularly reflected solar radiation, 2) radiation reflected by foam and whitecaps, and 3) radiation scattered by the upper water layer. Does the model in (5)-(6) take all these factors into account? If not, which ones does it omit, and what error does this create?

10) When using formula (5), the key values are its coefficients. Their values, as far as I can see, are not given in the manuscript. I consider it necessary to add them to the supplementary materials.

11) In formulas (8), (9), (12), (14), and (15), I consider it necessary to introduce notation and not use whole words.

12) There are no integration limits in equation (10).

13) Line 568: “Finally, the standard deviation of 160 surface reflectance images gave the overall absolute pixel level uncertainty of the ELM SR model.” When using the Monte Carlo method, the key issue is the error of the resulting estimate. The question arises: are 160 surface reflectance images enough to get a statistically reliable estimate?

14) If I understood correctly, Section 4.1 considers the results for the green band and Section 4.2 the results for the red band. Were the remaining bands analyzed? If so, how much better/worse are their results (for example, are the histograms of the residual plots very different)?

15) Line 717: “The histogram ranged from -0.004 to +0.004.” If we turn to Figure 12d, we can see that the boundaries of the histogram are different. The authors are requested to check the correctness of similar statements about Figures 11b,d and 12b,d.

16) Line 755: “In the CA and blue bands, we can see more scatteredness for low reflective regions or dark sites below 10% reflectance in the dynamic reflectance range. This might be due to the higher contribution of aerosol particles have on the lower reflective areas.” This explanation is questionable. The reason is better described on line 772: “and this might be due to the high atmospheric effect.” As far as I understand, with a decrease in reflectance, the portion of radiation from the observed pixel decreases, while the portion of background radiation (scattered in the atmosphere and created by the adjacency effect) increases. In the CA and blue bands, the influence of these factors is greater, which may explain the observed effect.

17) An analysis of figures 13, 15, and 16 and tables 2, 3, and 4 shows that for the Blue channel, the “Accuracy”, “Precision”, and RMSE values are lower for the older TM instrument than for the OLI and OLI-2 instruments. What is the reason? At first glance, newer instruments should give results with less error.

18) Line 984: “Based on the SBAF corrected model, gain and bias were calculated using ELM approach, which can be applied to Landsat 9 and Landsat 5 images to produce SR.” If I am not mistaken, SBAF values are given only for some situations, and the gain and bias are not given in the manuscript. It would be useful to add these values to the supplementary materials.
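On the sample-size question in comment 13): for approximately normally distributed draws, the relative standard error of a sample standard deviation scales roughly as 1/√(2(n−1)), about 5.6% for n = 160, and this can be checked empirically. A rough sketch under that normality assumption, with an illustrative uncertainty value:

```python
import numpy as np

n = 160  # number of Monte Carlo surface reflectance realizations

# Approximate relative standard error of a sample standard deviation
# for normally distributed data: 1 / sqrt(2 * (n - 1))
rel_se = 1.0 / np.sqrt(2.0 * (n - 1))  # about 0.056 for n = 160

# Empirical check: repeat the n-draw std estimate many times and
# measure how much the estimate itself scatters
rng = np.random.default_rng(0)
true_sigma = 0.01  # illustrative pixel-level SR uncertainty
stds = rng.normal(0.0, true_sigma, size=(5000, n)).std(axis=1, ddof=1)
empirical_rel_se = stds.std() / true_sigma
```

On this reasoning, 160 realizations pin the standard deviation down to within a few percent of its value, which is one way the authors could quantify the reliability the reviewer asks about.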

In general, given the high practical significance of the results, I will recommend this manuscript for publication once the above wishes have been taken into account.

Comments for author File: Comments.pdf

Author Response

Thank you for your valuable comments and suggestions. Please find the attached file containing our responses to the comments.

Author Response File: Author Response.pdf

Reviewer 4 Report

Comments are texted in the manuscript. Please see in the manuscript.

Comments for author File: Comments.pdf

Author Response

Thank you for your valuable comments and suggestions. Please find the attached file containing our responses to the comments.

Author Response File: Author Response.pdf

Round 2

Reviewer 3 Report

All my comments and recommendations were taken into account. The manuscript can be published in its present form.

Reviewer 4 Report

I have no further comments.
