Article
Peer-Review Record

Real-Time Application of Computer Graphics Improvement Techniques Using Hyperspectral Textures in a Virtual Reality System

Electronics 2021, 10(22), 2852; https://doi.org/10.3390/electronics10222852
by Francisco Díaz-Barrancas *,†, Halina Cwierz † and Pedro J. Pardo †
Reviewer 1: Anonymous
Reviewer 2:
Submission received: 20 October 2021 / Revised: 14 November 2021 / Accepted: 17 November 2021 / Published: 19 November 2021
(This article belongs to the Special Issue New Advances in Visual Computing and Virtual Reality)

Round 1

Reviewer 1 Report

The authors present an approach to render three-dimensional objects with hyperspectral textures within a virtual reality scenario. They present an extensive work; however, certain important aspects are missing and require the authors' attention.

  1. Many paragraphs are too short, containing only a couple of sentences (lines 34-39, 40-44, 139-142, 143-145, etc.). It is suggested to combine certain paragraphs in a systematic fashion.
  2. Lines 43-44: Recheck the sentence as it is grammatically incorrect and does not convey what is intended by the authors.
  3. Section 2 – Related work: It is poorly written and does not discuss the related methodologies, systems, and frameworks available, or what is missing (which should create the baseline for the proposed work). Moreover, only 3 papers are cited in this section. This section in its current form does not convey any meaningful insight into the existing literature. The authors should put serious effort into improving this section.
  4. The number of references is far too low. The authors can easily find up-to-date literature on hyperspectral texturing in VR.
  5. It is better to perform extensive testing of the proposed system. It is recommended to include more colors (e.g. different shades of red and green, and possibly blue colors, which may be synthesized if natural fruit is difficult to find, as well as mixes of different colors). At the least, well-known colors such as yellow, orange, purple, pink, brown, etc. should be included in the setup, and the color chromaticity of the physical and virtual colors should be reported (similar to Fig. 11).
  6. Algorithms 1, 2: How are the constants chosen to compute r, g, b, and gamma, and why? Will the static constants produce similar colors with all types of physical colors and different color ranges? There must be a separate sub-section on this point in which the authors justify the chosen constants, their alternative values, and how these constants are computed. A strong justification is required; otherwise, the proposed solution will be considered static (meaning it will always require manual fixing for different scenes and scenarios). The authors should also investigate whether these static constants can be auto-computed (based on certain logic).
  7. Overall, there are serious formatting issues that should be fixed.
  8. English language should be improved.

 

Author Response

The authors present an approach to render three-dimensional objects with hyperspectral textures within a virtual reality scenario. They present an extensive work; however, certain important aspects are missing and require the authors' attention.

1. Many paragraphs are too short, containing only a couple of sentences (lines 34-39, 40-44, 139-142, 143-145, etc.). It is suggested to combine certain paragraphs in a systematic fashion.

 

We have combined several paragraphs into one.

 

2. Lines 43-44: Recheck the sentence as it is grammatically incorrect and does not convey what is intended by the authors.

We have rewritten the sentence to correct grammatical errors.

3. Section 2 – Related work: It is poorly written and does not discuss the related methodologies, systems, and frameworks available, or what is missing (which should create the baseline for the proposed work). Moreover, only 3 papers are cited in this section. This section in its current form does not convey any meaningful insight into the existing literature. The authors should put serious effort into improving this section.

 

We have improved this section, including more information about the colorimetric transformations and our framework. We believe we have substantially improved the paper; all changes are marked in yellow.

 

4. The number of references is far too low. The authors can easily find up-to-date literature on hyperspectral texturing in VR.

We have included more references in the Introduction and Related Work sections.

5. It is better to perform extensive testing of the proposed system. It is recommended to include more colors (e.g. different shades of red and green, and possibly blue colors, which may be synthesized if natural fruit is difficult to find, as well as mixes of different colors). At the least, well-known colors such as yellow, orange, purple, pink, brown, etc. should be included in the setup, and the color chromaticity of the physical and virtual colors should be reported (similar to Fig. 11).

 

We have added more elements with different colors to the scenes. The ColorChecker also allows various color patches to be checked. In addition, we now perform a color-difference test between the real texture and the one obtained after performing the calculations in the virtual reality system, and we have added a subsection to the Results comparing these RGB textures with those enhanced by our hyperspectral technique.

 

6. Algorithms 1, 2: How are the constants chosen to compute r, g, b, and gamma, and why? Will the static constants produce similar colors with all types of physical colors and different color ranges? There must be a separate sub-section on this point in which the authors justify the chosen constants, their alternative values, and how these constants are computed. A strong justification is required; otherwise, the proposed solution will be considered static (meaning it will always require manual fixing for different scenes and scenarios). The authors should also investigate whether these static constants can be auto-computed (based on certain logic).

 

You are right. We have added a subsection explaining the steps needed to obtain the matrix. This matrix is device-dependent and needs to be updated only when the device changes, not for every scenario. In fact, obtaining this matrix is also done automatically, as explained in the color characterization section.
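As a purely illustrative sketch of how such a device matrix can be obtained automatically (this is not the authors' code, and the data are made up), one common approach is to fit a 3x3 RGB-to-XYZ matrix by least squares from a set of measured patches:

```python
import numpy as np

# Illustrative data only: linearised RGB values sent to the display, and the
# XYZ tristimulus values measured for each patch (e.g. with a spectroradiometer).
rgb_linear = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
    [1.0, 1.0, 1.0],
    [0.5, 0.5, 0.5],
])
xyz_measured = np.array([
    [41.2,  21.3,   1.9],
    [35.8,  71.5,  11.9],
    [18.0,   7.2,  95.0],
    [95.0, 100.0, 108.9],
    [47.6,  50.0,  54.5],
])

# Least-squares fit of a 3x3 matrix M such that xyz ≈ M @ rgb for every patch.
M, _, _, _ = np.linalg.lstsq(rgb_linear, xyz_measured, rcond=None)
M = M.T  # so that xyz = M @ rgb for a column vector rgb

print("Fitted RGB -> XYZ characterization matrix:\n", M)
```

Because the matrix depends only on the display's primaries, the fit needs to be repeated only when the device changes, as the authors note above.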

7. Overall, there are serious formatting issues that should be fixed.

We have corrected several issues. If the article is finally accepted for publication, we commit to sending the paper to an editorial editing service.

8. English language should be improved.

We have corrected some sentences and reviewed the language. However, if accepted for publication, we will send the document to an editorial service for language editing.

Author Response File: Author Response.pdf

Reviewer 2 Report

Real-time application of computer graphics improvement techniques using hyperspectral textures in a virtual reality system

Francisco Díaz-Barrancas, Halina Cwierz and Pedro J. Pardo

 

In this paper, Díaz-Barrancas and colleagues tackle the challenge of representing colors with higher accuracy in an HMD for immersive, three-dimensional viewing under changing illumination. They address the issue with a minimization routine that replaces the RGB color coordinates of object images with the closest reflectance spectrum of real objects, via a sort of spectral mapping to known RGB colors.

 

The paper is well organized with solid references, but I think the main result of the paper is hidden inside the conclusions, and results are missing to support the main claims.

 

224-225 – “This allows us to simulate our objects with a color representation more faithful to the one used by the classic system up to now.”

 

This is the core of the article and a very strong statement. But where are the results that support this idea?

Yes, there are results confirming that when colors are reproduced in the HMD and compared with the real colors, the errors are perceptible but manageable. But nothing is said about the error that may exist between using the original RGB and the closest spectral reflectance converted into new RGB.

The results presented compare flat colors with their representation in the HMD. But what about the hyperspectral textures of real fruits? What are the errors? In particular, what are the chromatic errors when comparing the original RGB of the fruit with the RGB obtained after selecting a real spectrum of a real lemon? I suspect it is the 1.8 value that shows up in the Conclusions, but there is nothing in the paper that supports this value or proves it is correct.

 

Also:

77 – “Moreover, this reflectance guarantees an experience similar to what our visual system can perceive in a physical environment.” – Overall, this is a truthful sentence, but not in the case of 3D reality goggles, as only RGB will be used. If the original data are available in the form of reflectances, things can be worked out to obtain a better RGB, but in the end we only work with RGB. So, can the authors elaborate on this idea and explain the full meaning of this sentence?

 

 

94 – Section 3.1 - What was measured by the Minolta device is missing from the description. It is important to describe this here so that it can be related to Equation 2 and to the numbers presented in Algorithm 1.

 

128 – “In addition, in Fig. 3 we can also see its power spectral curve.” - This sentence is not clear. The spectral curve is from the LED that is used to light the object to be scanned, correct? If so, was the LED in the scanner itself, or was it used as an external light source?

 

134 – “This texture is generated in bytes” - What is the meaning of this sentence? Any information used in digital format is in bytes... can you please clarify your idea?

 

141 – “absolute white capsule” Please provide spectral reflectance, maker and model of the white standard.

 

144 – word illuminants - Since they are physical and are metamers of the intended illuminants, these should be labeled as light sources.

 

169 - Figure 7 is not informative and can be removed. What is important at this point is to explain what was used to decide that an image rendered by deferred HDPR is better than one rendered by forward HDPR: visual image comparison, or image metrics computed against a ground truth?

 

 

Author Response

In this paper, Díaz-Barrancas and colleagues tackle the challenge of representing colors with higher accuracy in an HMD for immersive, three-dimensional viewing under changing illumination. They address the issue with a minimization routine that replaces the RGB color coordinates of object images with the closest reflectance spectrum of real objects, via a sort of spectral mapping to known RGB colors.

 

The paper is well organized with solid references, but I think the main result of the paper is hidden inside the conclusions, and results are missing to support the main claims.

 Thank you very much for your contribution. We are grateful for your feedback and comments.

224-225 – “This allows us to simulate our objects with a color representation more faithful to the one used by the classic system up to now.”

 This is the core of the article and a very strong statement. But where are the results that support this idea?

You are right. We planned the work based on images and a qualitative comparison of results, but we also have quantitative results, so we have now included them. We consider that this information has definitively improved our work.

Yes, there are results confirming that when colors are reproduced in the HMD and compared with the real colors, the errors are perceptible but manageable. But nothing is said about the error that may exist between using the original RGB and the closest spectral reflectance converted into new RGB.

The results presented compare flat colors with their representation in the HMD. But what about the hyperspectral textures of real fruits? What are the errors? In particular, what are the chromatic errors when comparing the original RGB of the fruit with the RGB obtained after selecting a real spectrum of a real lemon? I suspect it is the 1.8 value that shows up in the Conclusions, but there is nothing in the paper that supports this value or proves it is correct.

We have added the chromatic error in several steps. First, chromatic characterization of the system (HMD, PC, and graphics card): average error of 1.8 DE00 units. Second, chromatic reproduction error, measured directly through the lens of the HMD and compared against real physical NCS samples: average errors around 3.5 DE00 units. Finally, we have compared the chromatic difference of both methods (with and without hyperspectral textures) using the theoretically computed values as reference, finding that in all cases the hyperspectral method is better than the RGB method.
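As an illustration only (not the paper's actual data or code), the sketch below shows how such DE00 comparisons can be scripted with the colour-science Python package; the CIELAB values are placeholders.

```python
import numpy as np
import colour  # pip install colour-science

# Placeholder CIELAB values (not the paper's data): theoretical reference
# colours, and the colours reproduced with the plain-RGB pipeline and with
# the hyperspectral-texture pipeline.
lab_reference     = np.array([[65.0,  20.0, 45.0], [40.0, -30.0, 15.0]])
lab_rgb_method    = np.array([[63.5,  24.0, 41.0], [42.0, -26.5, 18.0]])
lab_hyperspectral = np.array([[64.6,  21.2, 44.1], [40.6, -29.1, 15.8]])

# CIEDE2000 differences of each method against the reference values.
de_rgb = colour.delta_E(lab_reference, lab_rgb_method, method="CIE 2000")
de_hyp = colour.delta_E(lab_reference, lab_hyperspectral, method="CIE 2000")

print("Mean DE00, RGB-only textures:     ", de_rgb.mean())
print("Mean DE00, hyperspectral textures:", de_hyp.mean())
```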

Also:

77 – “Moreover, this reflectance guarantees an experience similar to what our visual system can perceive in a physical environment.” – Overall, this is a truthful sentence, but not in the case of 3D reality goggles, as only RGB will be used. If the original data are available in the form of reflectances, things can be worked out to obtain a better RGB, but in the end we only work with RGB. So, can the authors elaborate on this idea and explain the full meaning of this sentence?

Thank you so much for this comment. We are often so focused on our own work that we assume every reader knows its framework. We have included the following paragraph to explain the question of spectral matching versus metameric matching:

“Currently, the RGB color space is used to specify the color of such objects, as the use of the RGB color space is widespread across all digital devices. However, this color space is not the most appropriate if we want to ensure maximum color fidelity, because it is device-dependent. Moreover, if we want to reproduce inside a virtual reality scene how light interacts with objects in the real world, it is necessary to use a more adequate color space such as CIE 1931 XYZ. This device-independent color space allows us to compute the color of real objects using the spectral reflectance of the objects' surfaces and the spectral power distribution of the light source. Of course, we still need to use the RGB color space, because it is the native color space of all HMDs. It is necessary to remark that we are still working in the framework of metameric matching of real-world colors using the light stimuli generated by the displays located inside the HMDs. In this paradigm we have only three primary lights, mixed in the proper amounts to match the desired color. There is no spectral matching between the color generated at the display and the real-world color used as reference. Like all tri-primary color systems, we are only matching colors through the phenomenon of metamerism. This is why it is important to include spectral information about the light source and the objects in virtual reality systems, because a metameric match does not hold up under changes in the light source.”
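To make the pipeline described in that paragraph concrete, here is a minimal, self-contained sketch (not the authors' Unity implementation): it integrates a surface reflectance against a light-source SPD and colour-matching functions to obtain CIE 1931 XYZ, then maps XYZ to linear RGB with a device matrix. The spectral data and the Gaussian approximations of the CMFs are purely illustrative, and the sRGB/D65 matrix stands in for the HMD's own characterization matrix.

```python
import numpy as np

# Made-up 10 nm spectral data over 400-700 nm; real code would use tabulated
# CIE 1931 colour-matching functions and measured reflectances/SPDs.
wavelengths = np.arange(400, 701, 10)                          # nm
reflectance = np.clip(0.1 + 0.8 * (wavelengths > 560), 0, 1)   # toy "yellowish" surface
spd_light   = np.ones_like(wavelengths, dtype=float)           # toy equal-energy source

# Toy colour-matching functions (Gaussian approximations, illustration only).
def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

x_bar = 1.06 * gauss(wavelengths, 600, 38) + 0.36 * gauss(wavelengths, 445, 20)
y_bar = 1.00 * gauss(wavelengths, 555, 42)
z_bar = 1.78 * gauss(wavelengths, 450, 25)

# Tristimulus integration: XYZ = k * sum( reflectance * SPD * CMF ).
stimulus = reflectance * spd_light
k = 100.0 / np.sum(spd_light * y_bar)
XYZ = k * np.array([np.sum(stimulus * x_bar),
                    np.sum(stimulus * y_bar),
                    np.sum(stimulus * z_bar)])

# Device-dependent step: map XYZ to the display's native RGB with its
# characterization matrix (here the sRGB/D65 matrix as a stand-in).
M_xyz_to_rgb = np.array([[ 3.2406, -1.5372, -0.4986],
                         [-0.9689,  1.8758,  0.0415],
                         [ 0.0557, -0.2040,  1.0570]])
rgb_linear = M_xyz_to_rgb @ (XYZ / 100.0)
print("XYZ:", XYZ, "linear RGB:", np.clip(rgb_linear, 0, 1))
```

Swapping the SPD for a different light source and re-running the integration is what a spectral workflow allows, whereas a fixed RGB texture cannot react to such a change; that is the metamerism point made in the quoted paragraph.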

94 – Section 3.1 - What was measured by the Minolta device is missing from the description. It is important to describe this here so that it can be related to Equation 2 and to the numbers presented in Algorithm 1.

We had wrongly assumed that the procedure would be understood. To clarify the measurement procedure, we have added a paragraph explaining that the central capsule was measured with different RGB values in 5 steps. This measurement was performed with software automation while changing the color of the capsule.
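For illustration only, here is a sketch of the kind of software automation described, assuming "5 steps" means five digital levels per RGB channel; the display and instrument calls are hypothetical placeholders, not the authors' tooling.

```python
import itertools

# Assumed interpretation: five digital levels per channel, 5**3 = 125 patches.
levels = [0, 64, 128, 191, 255]
patches = list(itertools.product(levels, repeat=3))

for rgb in patches:
    # display_on_capsule(rgb)   # hypothetical call into the VR application
    # xyz = read_instrument()   # hypothetical call to the measuring device
    pass

print(f"{len(patches)} RGB patches queued for automated measurement")
```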

128 – “In addition, in Fig. 3 we can also see its power spectral curve.” - This sentence is not clear. The spectral curve is from the LED that is used to light the object to be scanned, correct? If so, was the LED in the scanner itself, or was it used as an external light source?

The power spectral curve corresponds to the scanner's LED. We have rewritten the sentence to make this clear. The scanner has its own LED, as the reviewer rightly notes.

 134 – “This texture is generated in bytes” - What is the meaning of this sentence? Any information used in digital format is in bytes... can you please clarify your idea?

In this sentence we wanted to convey that a .bmp or .png file was not read directly in Unity. We transformed the .bmp file into a .bytes file and read it with Unity's byte-file facilities. Once the bytes have been read, we use the information as such. We have introduced a sentence in the document to specify this.
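As a rough illustration of this preprocessing step (the file names are illustrative and this is not the authors' tooling), a texture can be dumped to a raw .bytes file like this, after which a Unity script can load it, e.g. via TextAsset.bytes or File.ReadAllBytes:

```python
from PIL import Image
import numpy as np

# Read the texture and dump its pixel data as a raw byte stream.
img = np.asarray(Image.open("texture.bmp").convert("RGB"), dtype=np.uint8)

with open("texture.bytes", "wb") as f:
    f.write(img.tobytes())  # raw RGB triplets, row-major

print("Wrote", img.size, "values =", img.nbytes, "bytes")
```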

141 – “absolute white capsule” Please provide spectral reflectance, maker and model of the white standard.

We have added the manufacturer, the model, and the spectral reflectance value (99.9% across the visible range). We have not included a figure, because the reflectance is essentially flat and the figure would not be informative.

144 – word illuminants - Since they are physical and are metamers of the intended illuminants, these should be labeled as light sources.

Absolutely true. We have replaced the word 'illuminants' with 'light sources' wherever the context referred to a metamer.

169 - Figure 7 is not informative and can be removed. What is important at this point is to explain what was used to decide that an image rendered by deferred HDPR is better than one rendered by forward HDPR: visual image comparison, or image metrics computed against a ground truth?

The reviewer makes a good point in this annotation and we fully agree to remove the figure from the paper.

Author Response File: Author Response.pdf

Round 2

Reviewer 1 Report

The authors have addressed most of the major concerns I raised in my previous review. Moderate English editing is still required, however, before the article can finally be accepted for publication.

Reviewer 2 Report

The manuscript was greatly improved and all my major questions and concerns were properly addressed. 
