Article

The Implementation of a WebGPU-Based Volume Rendering Framework for Interactive Visualization of Ocean Scalar Data

State Key Laboratory of Marine Geology, Tongji University, Shanghai 200092, China
*
Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(5), 2782; https://doi.org/10.3390/app15052782
Submission received: 18 January 2025 / Revised: 18 February 2025 / Accepted: 4 March 2025 / Published: 5 March 2025
(This article belongs to the Special Issue Data Visualization Techniques: Advances and Applications)

Abstract

Visualization contributes to an in-depth understanding of ocean variables and phenomena, and web-based three-dimensional visualization of ocean data has gained significant attention in oceanographic research. However, many challenges remain when performing real-time interactive visualization of large-volume heterogeneous scalar datasets in a web environment. In this study, we propose a WebGPU-based volume rendering framework for the interactive visualization of ocean scalar data. The ray casting algorithm, optimized with early ray termination and adaptive sampling methods, is adopted as the core volume rendering algorithm to visualize three-dimensional gridded data preprocessed from regular and irregular gridded volume datasets generated by ocean numerical modeling, utilizing the Babylon.js rendering engine and WebGPU technology. Moreover, the framework integrates a set of interactive visual analysis tools, providing functionalities such as volume cutting, value-based spatial data filtering, and time-series animation playback, enabling users to effectively display, navigate, and explore multidimensional datasets. Finally, we conducted several experiments to evaluate the visual effects and performance of the framework. The results suggest that the proposed WebGPU-based volume rendering framework is a feasible web-based solution for visualizing and analyzing large-scale gridded ocean scalar data.

1. Introduction

Oceanographic studies typically involve observations and numerical modeling. With the rapid development of computational and numerical modeling technologies, ocean numerical modeling data and the derived reanalysis datasets have become the main contributors to ocean scientific data. Ocean numerical modeling data are time-varying, multidimensional, multiscale, and multivariate, usually characterized by larger volumes and higher spatiotemporal resolutions compared to observational data. Ocean numerical modeling is generally performed on three-dimensional grids, producing simulation data of different scalar and vector fields structured in regular or irregular grids. Many important ocean environmental variables, such as temperature, salinity, density, and chlorophyll concentration, are categorized as scalar data. These scalar data, described solely by numerical magnitude without directional attributes, are essential for oceanographic studies. However, the heterogeneous and complex nature of ocean scalar data poses challenges in data management, presentation, and analysis [1], especially when relying solely on traditional data visual analysis approaches, such as data tables and static charts. Ocean data visualization approaches [2,3,4,5,6], as more advanced techniques, can address these limitations by intuitively representing the geospatial characteristics of large-volume ocean spatiotemporal datasets, contributing to an in-depth understanding of ocean water masses and dynamic changes within ocean environmental phenomena. Therefore, this study focuses on developing a visualization framework for large-scale three-dimensional ocean scalar data on both regular and irregular grids, providing efficient data representation and interactive visual analysis capabilities for oceanographers and stakeholders.
Ocean data visualization approaches are commonly categorized into two-dimensional and three-dimensional techniques [7]. Traditional two-dimensional data visualization techniques, typically based on Geographic Information Systems (GISs) in the form of 2D maps, have been widely adopted to depict and analyze ocean variables [8,9,10]. However, using 2D thematic maps to visualize complex data with three-dimensional spatial characteristics presents limitations. In contrast to two-dimensional visualization techniques, three-dimensional visualization techniques can better show the entire structure information of water masses and display the scattered features as a whole in three-dimensional space, enabling researchers to explore and comprehend water masses from multiple perspectives efficiently, thus supporting research in the field of ocean data analysis [11,12]. Driven by the fast-paced advancements in computer graphics, the scope and sophistication of ocean data visualization research are continuously evolving, with an increasing focus on three-dimensional visualization techniques due to their ability to reveal spatiotemporal variability and complex characteristics of water masses in recent years.
Three-dimensional data visualization encompasses various techniques, among which volume rendering is regarded as one of the most valuable. The primary techniques of volume rendering include indirect and direct volume rendering [13]. Indirect volume rendering, also known as iso-surfacing [14], includes algorithms like marching cubes and marching tetrahedra, which typically display three-dimensional data as constant-value contour surfaces. These algorithms can effectively capture and represent contour information of multidimensional volume data while providing advantages in computational efficiency and resource consumption. However, volume visualization based on a conventional iso-surface has limitations in demonstrating the complete internal structure of the ocean scalar data used to characterize water masses. By comparison, direct volume rendering algorithms can visualize the data as a continuous three-dimensional volume, clearly displaying the potential features within the volume data. Over the past few decades, volume rendering has become a key area of research in three-dimensional data visualization, with diverse applications in many fields, such as medicine [15], geology [16], and meteorology [17].
Most three-dimensional ocean data visualization tools have been developed based on a sophisticated Client/Server (C/S) architecture, such as those desktop-based three-dimensional frameworks for an interactive visualization of large-scale ocean data developed by Lv et al. [18] and Tian et al. [19]. As web technologies continue to advance rapidly, the Browser/Server (B/S) architecture has increasingly been adopted in many visualization studies over the traditional C/S architecture. The B/S architecture centralizes the core part on the server side, offering greater openness, integrability, and shareability while simplifying the system’s development, maintenance, and usage processes. It has thus become one of the leading technical solutions for building data visualization frameworks and systems across various fields [20,21,22,23]. WebGL, a JavaScript-based graphics rendering Application Programming Interface (API) based on the OpenGL ES standard, is currently the preferred technology for developing web-based 3D platforms, and numerous studies have been conducted on three-dimensional interactive ocean data visualization utilizing WebGL. For instance, Lu et al. [24] used 3D tiles and WebGIS technology to achieve a real-time online three-dimensional visualization of weather radar data, allowing for visualizing data from different horizontal planes by setting the vertical interval. Li et al. [25] designed an adaptive ray casting spherical volume rendering method based on the open-source web API Cesium.js to express ocean scalar fields. Liu et al. [26] developed a three-dimensional interactive visualization framework for ocean eddies in web browsers that supports eddy extraction, tracking, and querying, facilitating cross-platform scientific collaboration. 
In our previous work [27], Plotly.js (v1.47.4), an open-source graphical JavaScript library based on WebGL, was adopted to develop a web-based three-dimensional interactive visualization framework for time-varying and large-volume ocean forecasting data.
Nonetheless, due to WebGL’s lack of explicit support for parallel processing and inability to fully utilize modern graphics hardware [28], WebGL-based visualization frameworks sometimes face difficulties in meeting the gradually growing performance demands. Challenges remain in visualizing large-scale data efficiently on web platforms [29], particularly in interactive and real-time visualization scenarios. Therefore, there is a need to introduce advanced high-performance rendering technologies for smooth and high-quality three-dimensional visualization and dynamic interactions in a web environment.
As the latest WebGPU technology emerges, these limitations are expected to be overcome. WebGPU is a JavaScript API built on modern native graphics APIs, including Vulkan, Metal, and Direct3D 12, and it uses the WebGPU Shading Language (WGSL) as its shading language. It enables more direct access to modern multicore GPUs, providing efficient and flexible graphics programming interfaces for web application developers. Compared to WebGL, a mature and widely adopted web graphics technology, WebGPU is considered the next-generation trend for web graphics rendering, offering improvements in performance, functionality, and design [30], and presenting significant advantages in the following aspects: First, WebGPU effectively reduces the overhead of frequent data transfers between the CPU and GPU, thereby improving the performance of web-based 3D applications. Additionally, it provides powerful parallel computing capabilities, along with general-purpose computation functionality beyond rendering, allowing for the rapid handling of large-scale computational tasks. Moreover, WebGPU's low-level nature allows developers to create highly realistic and sophisticated 3D graphics by harnessing more of the GPU's power. Given the high-performance processing capabilities of WebGPU, it has been suggested that it will become one of the key technologies for developing web-based 3D platforms in the near future [31]. However, research and applications of WebGPU are still in their infancy, particularly in the field of ocean data visualization.
As a further improvement of our previous work [27], we propose a three-dimensional volume rendering framework to implement the efficient and high-quality visualization of massive ocean scalar data by utilizing WebGPU technology, with the goal of exploring the feasibility and potential applications of WebGPU-based volume rendering in the oceanographic scientific visualization field. In this work, modeling outputs generated with various types of grids and different spatial resolutions are first preprocessed with the aim of enhancing the suitability of the original dataset for subsequent visualization. Then, the ray casting algorithm, optimized with early ray termination and adaptive sampling methods, is adopted as the core volume rendering algorithm, with Babylon.js as the rendering engine, to visualize both large-scale regular and irregular gridded datasets in a web environment. Moreover, interactive visual analysis tools are also incorporated to enable an in-depth exploration of the volume data. Finally, several experiments were performed to evaluate the visual effects and efficiency of the proposed WebGPU-based framework using the ocean numerical modeling datasets describing the ocean temperature field (temperature data), flow field (flow velocity magnitude data), and acoustic field (acoustic propagation loss data).
The remainder of the paper is organized as follows: Section 2 provides a detailed description of the adopted methods and the implementation of the WebGPU-based volume rendering. Section 3 presents and discusses the visual effects of the three-dimensional visualization and conducts comparative analyses of the proposed framework's efficiency. Finally, Section 4 concludes the paper.

2. Methods

2.1. Preprocessing of Regular and Irregular Gridded Ocean Scalar Data

The two datasets used in this study comprise simulation results of ocean environmental parameters and of the ocean acoustic field, respectively, provided by the South China Sea Institute of Oceanology (SCSIO). The first dataset contains scalar data such as salinity, temperature, and flow velocity magnitude, using the World Geodetic System 1984 (WGS84) coordinate system. In the horizontal direction, the dataset is structured in regular grids and uniformly distributed with a spatial resolution of 1/60° × 1/60°. In contrast, the vertical grid spacing is uneven, with a resolution of 5 m from 0 m to −500 m and 100 m from −500 m to −2000 m, for a total of 116 vertical layers. The temporal resolution of the dataset is 1 h, consisting of 121 time steps in total. Figure 1a illustrates the spatial distribution structure of the dataset.
The second dataset, containing acoustic propagation loss data, is structured in irregular grids horizontally. This three-dimensional dataset features a cylindrical grid structure, which is centered around a certain point and consists of multiple circular surfaces. On each horizontal plane, the data points are distributed on concentric circles rather than arranged in typical latitude−longitude grids, with one data point located every 3° along each circle. The maximum radius of the concentric circles is approximately 50 km, with a radial step size of 90 m between adjacent circles. Vertically, the maximum depth of the dataset extends to approximately −3700 m, with a vertical step size of 6 m. Its spatial distribution structure is illustrated in Figure 1b. In particular, to provide a uniform spatial reference for data visualization, the Universal Transverse Mercator (UTM) coordinates of each data point at the i-th depth layer, where i ranges from 1 to 630, are first calculated as detailed in Equation (1). Specifically, $\theta$ and $r$ represent the angle and the distance between the data point and the central point, respectively, with $x_{center}$ and $y_{center}$ denoting the UTM coordinates of the central point in the horizontal plane. Then, the calculated UTM coordinates (x, y, z) need to be converted into WGS84 coordinates (longitude, latitude, altitude) to maintain consistency with the coordinate system of the first dataset.
$$x = \cos\left(\frac{\theta \pi}{180}\right) r + x_{center}, \qquad y = \sin\left(\frac{\theta \pi}{180}\right) r + y_{center}, \qquad z = 6i \tag{1}$$
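As a quick illustration, Equation (1) can be sketched in Python as follows; the function name and argument layout are hypothetical, and the 6 m layer spacing follows the dataset description:

```python
import math

def cylindrical_to_utm(theta_deg, r, x_center, y_center, i, dz=6.0):
    """Convert a point given by angle theta (degrees), radius r (m), and
    depth-layer index i into UTM-like Cartesian coordinates, following
    Equation (1). dz = 6 m matches the dataset's vertical step size."""
    theta = theta_deg * math.pi / 180.0
    x = math.cos(theta) * r + x_center
    y = math.sin(theta) * r + y_center
    z = dz * i
    return x, y, z
```

The resulting (x, y, z) triples would then be converted to WGS84 coordinates (e.g., with PyProj, as described in Section 2.4).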
The two datasets are typically stored in the self-describing binary data format NetCDF (Network Common Data Form). Although NetCDF files are ideal for storing scientific data [32], challenges arise in efficiently loading large-volume NetCDF files and rendering them in web browsers. Therefore, for fast and smooth visualization in web-based environments, a more lightweight and compact format is needed. JSON (JavaScript Object Notation), a readable and web-friendly format with a simple, text-based syntax, is commonly used in web application development for data preprocessing and exchange tasks due to its low-overhead data storage and transmission capabilities. To improve the efficiency of further processing, the original NetCDF files are converted into a series of JSON files according to variables and time dimensions. Moreover, additional preprocessing steps involving data interpolation and volume texture generation are carried out to address the technical challenges for volume rendering posed by the uneven structure of the data grids, making them suitable for shader programming.
  • Data interpolation: Concerning the first dataset, interpolation in the vertical direction is necessary due to its nonuniform depth intervals. We utilize the spline interpolation method to interpolate the data in the depth range from 0 m to −2000 m with a 5 m interval, resulting in a total of 401 vertical layers. Similarly, horizontal interpolation is required for the second dataset. The cubic spline interpolation method is employed to generate regular latitude and longitude grids, and the horizontal grid size of each depth layer is 300 × 300 after interpolation, containing a total of 90,000 data points.
  • Volume texture generation: Volume textures, also known as 3D textures, are generally used to store volume data, enabling color mapping during volume rendering. Volume textures are a logical extension of traditional 2D textures. Whereas a 2D texture is typically a picture that provides surface color information for a 3D model, a volume texture is composed of multiple 2D textures grouped together to describe three-dimensional spatial data. Volume textures are particularly well suited for shader programming as they can be efficiently accessed and manipulated in shaders, and their sizes are compact enough to conserve memory resources consumed during the visualization process. Therefore, in this study, we ultimately process the data into volume textures. To generate the volume texture, the data in each JSON file are first organized into a three-dimensional array with a shape of ($N_{longitude}$, $N_{latitude}$, $N_{depth}$), where $N$ represents the number of data layers in each corresponding dimension. After mapping the data values to the range of 0 to 255, the data are subsequently converted into a RAW file since the RAW image format is one of the most commonly used volume texture file formats. The data mapping process is illustrated in Equation (2), where $v$ represents the original data value, $v'$ represents the mapped data value, and $v_{min}$ and $v_{max}$ denote the minimum and maximum data values, respectively.
$$v' = \frac{v - v_{min}}{v_{max} - v_{min}} \times 255 \tag{2}$$
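The mapping in Equation (2) together with the RAW export might be sketched as below; `to_volume_texture` is a hypothetical helper, and mapping NaN (land/seabed) points to 0 is an illustrative assumption rather than the paper's exact null-value scheme:

```python
import numpy as np

def to_volume_texture(volume, raw_path=None):
    """Map a float volume of shape (Nlon, Nlat, Ndepth) to uint8 following
    Equation (2), and optionally dump it as a RAW file (plain binary bytes,
    no header). NaNs are mapped to 0 here as a simplifying assumption."""
    v = np.asarray(volume, dtype=np.float64)
    vmin, vmax = np.nanmin(v), np.nanmax(v)
    mapped = (v - vmin) * 255.0 / (vmax - vmin)
    mapped = np.nan_to_num(mapped, nan=0.0)
    tex = mapped.astype(np.uint8)
    if raw_path is not None:
        tex.tofile(raw_path)  # RAW format: raw voxel bytes only
    return tex
```

Because RAW files carry no metadata, the array shape and byte order must be communicated to the renderer separately.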

2.2. Ray Casting Algorithm Optimized with Early Ray Termination and Adaptive Sampling Methods

Technically, there are many well-established direct volume rendering algorithms that have seen extensive applications in visualization research, including ray casting [33,34], shear-warp [35], texture slicing [36], and splatting [37]. Among these, ray casting, which is based on image sequences [38], is one of the most widely used volume rendering algorithms due to its intuitive nature, straightforward implementation, and capability to produce high-quality visual effects. Thus, we employ it as the core algorithm to visualize the ocean scalar fields. The fundamental principle of the ray casting algorithm applied in this study is as follows:
First, for each pixel on the image plane, a ray is emitted from the viewpoint in the direction towards the pixel and traverses the three-dimensional volume data. Next, equidistant sampling is performed along the ray with a fixed step size $\Delta t$, as described in Equation (3), where $p_i$ represents the texture coordinates of the i-th sampling point, and $p_{start}$ indicates the texture coordinates of the entry point where the ray intersects the volume. Additionally, $d$ denotes the direction of the ray, and $t_i$ represents the distance traversed by the ray within the volume. Subsequently, each sampling point's voxel value can be obtained by interpolation based on the calculated texture coordinates and the volume texture, which stores the volume data. The color and opacity values of the sampling point are then determined according to the voxel value using a specific color mapping method.
$$p_i = p_{start} + t_i \, d, \qquad t_i = i \, \Delta t \tag{3}$$
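A minimal sketch of the equidistant sampling in Equation (3), with rays and texture coordinates represented as plain tuples (the helper name is hypothetical):

```python
def sample_points(p_start, d, dt, n):
    """Texture coordinates of the first n equidistant sampling points along
    a ray, per Equation (3): p_i = p_start + (i * dt) * d."""
    return [tuple(p + i * dt * c for p, c in zip(p_start, d))
            for i in range(n)]
```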
Finally, each pixel's color and opacity values are composited from the color and opacity values of the sampling points in a front-to-back order, as described in Equation (4). Specifically, $C_i$ represents the color value of the i-th sampling point, and $A_i$ denotes its opacity value. The accumulated color and opacity values at the i-th sampling point are represented as $C_i^{\Delta}$ and $A_i^{\Delta}$, respectively, whereas $C_{i-1}^{\Delta}$ and $A_{i-1}^{\Delta}$ represent the accumulated values at the previous sampling point. For each pixel, the sampling and calculation processes are completed once the ray finishes traversing the volume data, yielding the color and opacity of the current pixel and ultimately producing the final rendered image.
$$C_i^{\Delta} = \left(1 - A_{i-1}^{\Delta}\right) A_i C_i + C_{i-1}^{\Delta}, \qquad A_i^{\Delta} = \left(1 - A_{i-1}^{\Delta}\right) A_i + A_{i-1}^{\Delta} \tag{4}$$
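The front-to-back compositing of Equation (4) can be checked numerically with a small scalar-color sketch (RGB colors work the same way, component-wise; the function name is hypothetical):

```python
def composite_front_to_back(samples):
    """Accumulate (color, opacity) pairs in front-to-back order per
    Equation (4). `samples` is an iterable of (C_i, A_i) scalars."""
    C_acc, A_acc = 0.0, 0.0
    for C_i, A_i in samples:
        C_acc = (1.0 - A_acc) * A_i * C_i + C_acc
        A_acc = (1.0 - A_acc) * A_i + A_acc
    return C_acc, A_acc
```

Note that the accumulated opacity approaches 1 monotonically, which is exactly the property the early ray termination optimization exploits.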
Although the ray casting algorithm offers many advantages, it has limitations in terms of performance. On the one hand, interpolation involving a large number of numerical operations is employed during rendering, significantly increasing the complexity of computations. Moreover, whenever the viewpoint changes, such as zooming in, zooming out, or panning in the volume rendering viewer, the sampling and calculation processes need to be re-executed to generate the updated volume rendering result, which requires a substantial computational overhead. As a result, the efficiency of real-time rendering may be affected, particularly in interactive applications.
Therefore, researchers have proposed various optimization methods to reduce the computation and memory overhead, such as early ray termination [39], adaptive sampling [40], and octrees [41,42]. To achieve smoother, more real-time visualization, we employ several methods to improve the efficiency of the ray casting algorithm in this study. The early ray termination method is first utilized as it is an efficient, straightforward, and easy-to-integrate optimization strategy. The core principle of early ray termination is that, as the ray traverses the volume data, the contribution of subsequent sampling to the pixel's final color can be ignored if the accumulated opacity value at a particular sampling point is already sufficiently large. Once the accumulated opacity value exceeds a specified threshold, the sampling process terminates immediately, avoiding unnecessary computations and reducing the overall computational load. Furthermore, since seabed topography and islands typically exist within ocean modeling computational domains, a large number of grid points are assigned null values, generating continuous null-value regions where the opacity of the data points is zero. Given this characteristic of the ocean scalar data, this study also employs an adaptive sampling method to leap over the null-value space quickly. As shown in Equation (5), if the absolute value of the difference between $A_i^{\Delta}$ and $A_{i-1}^{\Delta}$ is less than a sufficiently small pre-set threshold $\epsilon$ ($\epsilon > 0$), the ray is considered to have entered a null-value region, and the sampling step size is appropriately increased. This optimization can efficiently accelerate the sampling process while preserving the original high-quality visual effects.
$$\left| A_i^{\Delta} - A_{i-1}^{\Delta} \right| < \epsilon \tag{5}$$
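Combining Equations (4) and (5), the optimized traversal loop might look like the following sketch; the 0.98 opacity threshold and 1.5× step enlargement mirror the values given in Section 2.4, while `sample_fn`, the parameter defaults, and the loop structure are illustrative assumptions rather than the shader's exact implementation:

```python
def cast_ray(sample_fn, t_max, dt, alpha_threshold=0.98, eps=1e-4, leap=1.5):
    """Ray traversal with early ray termination and adaptive sampling.
    sample_fn(t) -> (color, opacity) at distance t along the ray.
    When the accumulated opacity barely changes (|dA| < eps), the ray is
    assumed to be inside a null-value region and the step is enlarged."""
    C_acc = A_acc = 0.0
    t = 0.0
    while t <= t_max:
        C_i, A_i = sample_fn(t)
        A_prev = A_acc
        C_acc = (1.0 - A_acc) * A_i * C_i + C_acc   # Equation (4)
        A_acc = (1.0 - A_acc) * A_i + A_acc
        if A_acc > alpha_threshold:                 # early ray termination
            break
        # Equation (5): leap through null-value regions with a larger step
        step = dt * leap if abs(A_acc - A_prev) < eps else dt
        t += step
    return C_acc, A_acc
```

For fully opaque samples, the loop stops after only a few steps instead of traversing the whole volume; for empty space, it advances with the enlarged step.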

2.3. Interactive Visual Analysis Tools for Real-Time Rendering

In the process of 3D ocean data visualization, interactive visualization techniques are effective for exploring complex ocean data [19]. It is often necessary to provide users with the flexibility to interact with and manipulate the visualization result in real time. This involves allowing users to switch between different datasets, dynamically adjust various rendering parameters, or modify the camera angles and distances to explore the data from multiple perspectives. The goal is to ensure that operations performed by users through mouse or keyboard inputs are immediately reflected in the results. To achieve this, a comprehensive and user-friendly set of interactive visual analysis tools is indispensable, and the proposed framework accordingly integrates such a set. These tools include the following main functionalities:
  • Data source selection and switching: In oceanographic research, the data required for visualization are often diverse and complex, often necessitating frequent switching between different data sources for analysis. This tool pre-loads a list of all available datasets, allowing users to quickly switch between them by simply selecting the desired data source from a dropdown menu.
  • Basic rendering parameter setting: This feature allows for a dynamic adjustment of key rendering parameters, including the sampling step size in the ray casting algorithm and transparency, either by using the mouse-controlled sliders or by directly entering values through the keyboard to enhance the visual representation of the data. Users can also modify the colorbar through a dropdown menu.
  • Bounding box and axes overlaying: To allow intuitive spatial references for the data’s distribution, users can overlay the volume rendering result with the bounding box, as well as horizontal (latitude and longitude) and vertical (depth) axes through checkboxes. The inclusion of the bounding box helps to visually frame the dataset, while the axes enable users to easily understand the position of the data within the geographic coordinate system.
  • Volume cutting: The framework provides a volume cutting feature that allows users to slice the 3D data along coordinate axes, enabling a detailed examination of specific regions or layers within the volume, which is particularly useful for analyzing complex structures or features that may not be immediately visible in the full three-dimensional representation.
  • Spatial data filtering based on value: This tool helps users focus on specific regions of interest (ROI) by adjusting the minimum and maximum scalar field values, thereby filtering out the non-relevant data and displaying the relevant portion, enabling a more targeted analysis of ocean variables.
  • Time-series animation playback: This feature is especially useful for representing the dynamic processes and changes in ocean environmental phenomena over time. Users can specify a time range of interest and visualize the corresponding time-varying volume data in an animation format.
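Conceptually, the value-based filtering tool above zeroes the opacity contribution of voxels outside the selected range; a hypothetical transfer-function wrapper illustrates the idea (in the actual framework this comparison happens per sample inside the fragment shader):

```python
def filter_by_value(opacity_of, vmin, vmax):
    """Wrap a transfer function so that scalar values outside [vmin, vmax]
    become fully transparent, mimicking value-based spatial filtering."""
    def filtered(v):
        return opacity_of(v) if vmin <= v <= vmax else 0.0
    return filtered
```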

2.4. Implementation of WebGPU-Based Volume Rendering Framework with Babylon.js

There are many JavaScript libraries, such as Babylon.js, Three.js, vtk.js, and Orillusion, that support WebGPU API, enabling web developers to leverage the capabilities of the GPU for building high-performance and interactive web-based 3D applications or visualizing three-dimensional data. Among these, Babylon.js and Three.js are among the most popular and advanced web-based 3D rendering engines, with support for both WebGL and WebGPU APIs. They not only provide a robust and stable ability to build realistic scenes but also offer flexible capabilities to support the development of scientific data visualization. As a result, they can facilitate a better integration of these two aspects, making it easier to combine data visualization with diverse 3D scenes, thereby showcasing their advantages in terms of building compatible and extensible data visualization frameworks. In particular, Babylon.js (https://www.babylonjs.com/, accessed 16 February 2025), a powerful, beautiful, simple, and completely open-source web rendering engine developed by Microsoft and other contributors, offers several distinctive advantages: First, Babylon.js integrates comprehensive rendering capabilities and performance optimization strategies, making it suitable for handling complex 3D applications with performance-intensive requirements. Second, Babylon.js benefits from its active community and support forum, ensuring timely updates and continuous improvements to stay aligned with the latest WebGPU advancements. Third, Babylon.js is also a game engine designed for the web, and in addition to offering new ways of visualizing spatial information, game engines introduce numerous possibilities for interactivity, animation, and simulation [43]. Finally, and most importantly, Babylon.js is a pioneering 3D rendering engine that supports WebGPU, and it has extensive experience in leveraging this cutting-edge technology. 
Babylon.js began providing full support for WebGPU as early as version 5.0, and related functionalities are currently under active development, further solidifying its position as a leader in the field.
Given the benefits of this 3D rendering engine, we develop a volume rendering framework utilizing the HTML5 standard and JavaScript combined with Babylon.js (v7.29.0) and WebGPU, along with Python (v3.10.13) scripts for data preprocessing. The framework achieves a real-time three-dimensional visualization of large-scale gridded ocean scalar data, providing interactive visual analysis capabilities and enabling users to display, navigate, and explore large multidimensional datasets from different perspectives in a web environment. The main workflow of the proposed framework is illustrated in Figure 2 and detailed in the following:
  • Data preprocessing and loading: The data preprocessing tasks are conducted using Python scripts, which run on the server side and are automatically triggered when new datasets are available. The conversion from NetCDF files to JSON files is first performed using the netCDF4 library. Then, after the JSON files are initially read using the NumPy library, the PyProj library is employed to transform the coordinates of the ocean acoustic field forecast dataset from the UTM coordinate system to the WGS84 coordinate system. Data interpolation for both datasets is then carried out using the SciPy library. Finally, using the NumPy library, all datasets are reorganized into RAW files, which serve as volume textures for subsequent rendering. Specifically, the processed files are dynamically retrieved on demand according to front-end requests without the need to load all data from the server at once.
  • Initialization of the volume rendering viewer: First, a blank canvas is established as the carrier for displaying the rendering results, and the WebGPU engine is initialized. A 3D scene is then constructed to serve as the container for all graphic objects, with the engine handling the task of rendering them. To facilitate a flexible exploration of the volume rendering results from different viewpoints, a camera is created in the scene and configured to support the rotation, zooming, and panning of the results through interactive actions such as mouse drags and keyboard inputs. Additionally, detailed legend information, including elements such as the title, colorbar, unit, and numerical labels, is also attached to the viewer, providing contextual information for the data visualization.
  • Building vertex and fragment shaders: WGSL is employed to build vertex and fragment shaders for volume rendering, and a corresponding material is generated based on these shaders. In the vertex shader, coordinates are transformed from the local system to the global system, and the direction vector between the viewpoint and each position is calculated and passed to the fragment shader. In the fragment shader, for each pixel, a ray originating from the viewpoint, with its direction determined by the input direction vector, traverses the volume data. Next, the entry and exit points where the ray intersects the volume are calculated, and adaptive sampling is performed from the entry point to the exit point along the ray. The basic sampling step size is initialized by users but is increased to 1.5 times the user-defined value when the ray enters a null-value region. The values of the sampling points are retrieved from the current volume texture and mapped to corresponding colors based on a color texture. Each pixel's color and opacity values are continuously calculated until the accumulated opacity value exceeds the specified threshold of 0.98 or the ray reaches the exit point. Finally, a box mesh is created in the scene, with its material set to the customized material, generating the final volume rendering result.
  • Integration of interactive visual analysis tools: The developed framework integrates a set of interactive visual analysis tools for an in-depth exploration of the multidimensional volume datasets, enabling a dynamic adjustment of multiple rendering parameters. When users interact with the tools, the framework maps the user inputs via sliders, dropdown menus, or checkboxes to corresponding rendering parameters and automatically updates the fragment shader with the new parameters, triggering real-time re-rendering. These tools encompass a range of functionalities, including data source selection and switching, a basic rendering parameter setting, bounding box and axes overlaying, volume cutting, spatial data filtering based on value, and time-series animation playback.
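The ray marching described in the fragment shader above can be sketched in plain JavaScript as follows. This is a CPU-side illustration only (in the framework this logic runs in WGSL on the GPU), and all function and parameter names are assumptions for illustration, not identifiers from the actual shaders.

```javascript
// Slab-method intersection of a ray with the axis-aligned volume bounds.
// Returns [tEntry, tExit] along the ray, or null when the ray misses the box.
function rayBoxIntersect(origin, dir, boxMin, boxMax) {
  let tEntry = -Infinity, tExit = Infinity;
  for (let i = 0; i < 3; i++) {
    const inv = 1 / dir[i];
    let t0 = (boxMin[i] - origin[i]) * inv;
    let t1 = (boxMax[i] - origin[i]) * inv;
    if (t0 > t1) [t0, t1] = [t1, t0];
    tEntry = Math.max(tEntry, t0);
    tExit = Math.min(tExit, t1);
  }
  return tEntry <= tExit ? [Math.max(tEntry, 0), tExit] : null;
}

// Front-to-back compositing with early ray termination and adaptive
// sampling: the step grows to 1.5x the user-defined size inside null-value
// regions, mirroring the behavior described in the text. `volume.sample`
// stands in for the volume-texture lookup plus color-texture mapping.
function marchRay(origin, dir, volume, baseStep, opacityThreshold = 0.98) {
  const hit = rayBoxIntersect(origin, dir, volume.min, volume.max);
  if (!hit) return { color: [0, 0, 0], alpha: 0 };
  let [t, tExit] = hit;
  const color = [0, 0, 0];
  let alpha = 0;
  while (t < tExit && alpha < opacityThreshold) { // early ray termination
    const p = [origin[0] + t * dir[0], origin[1] + t * dir[1], origin[2] + t * dir[2]];
    const sample = volume.sample(p);
    if (sample === null) {          // null-value region: take a larger step
      t += 1.5 * baseStep;
      continue;
    }
    const a = sample.opacity * (1 - alpha); // front-to-back compositing
    for (let i = 0; i < 3; i++) color[i] += sample.color[i] * a;
    alpha += a;
    t += baseStep;
  }
  return { color, alpha };
}
```

A usage example: for a unit volume and a ray entering along the x-axis, `rayBoxIntersect([-1, 0.5, 0.5], [1, 0, 0], [0, 0, 0], [1, 1, 1])` yields the entry and exit distances `[1, 2]`, and `marchRay` then accumulates color between those two points.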

3. Results and Discussion

To evaluate the visual effects and performance of the proposed WebGPU-based volume rendering framework, we conducted a series of experiments using the two datasets mentioned in Section 2.1, with a total original file size of approximately 2.9 GB. Each single RAW file generated from the ocean environmental parameters forecast dataset, which includes temperature and flow velocity magnitude data, contains 119 × 115 × 401 (5,487,685) data points, and the RAW file generated from the ocean acoustic field forecast dataset, which consists of acoustic propagation loss data, contains 300 × 300 × 630 (56,700,000) data points. The experiments were performed in Google Chrome 129 on the Windows 11 operating system. The hardware environment included an Intel i5-12600KF processor (Intel, Santa Clara, CA, USA), 32 GB of RAM, and an Nvidia RTX 3070 graphics card with 8 GB of Video RAM (NVIDIA, Santa Clara, CA, USA).

3.1. Visual Effects of Volume Rendering

This section examines the visual effects of the proposed WebGPU-based volume rendering framework, with the first experiment comparing the visualization results across various rendering configurations. Figure 3 illustrates the volume rendering results of the temperature and flow velocity magnitude data structured in regular grids. Similarly, Figure 4 illustrates the volume rendering results of the ocean acoustic field structured in irregular grids.
As demonstrated in this experiment, the framework can effectively present the complex three-dimensional structures of various volume datasets and enable an insightful visual exploration of large-scale ocean scalar data. By adjusting and optimizing the rendering parameters, the framework can achieve ideal visual effects, accurately revealing the internal features of ocean scalar fields. Figure 3a illustrates the gradual decrease in temperature of the water mass with increasing depth, indicating that surface seawater has a higher temperature than the water at deeper levels in the modeling computational domain. Figure 3b shows the spatial arrangement of regions with different flow velocity magnitudes, depicting the flow characteristics of the water mass at different locations. Meanwhile, as illustrated in Figure 4, the propagation attenuation pattern of the acoustic field in a complex marine environment is effectively demonstrated, and a clear representation of the spatial distribution of acoustic wave energy is provided. Comparing Figure 3a and Figure 4a with Figure 3b and Figure 4b shows that the internal spatiotemporal distribution characteristics of the water masses can be observed more clearly by adjusting the transparency parameter, which is inversely related to the opacity of each data point, together with an appropriate colorbar; the level of displayed detail can be controlled by modifying the sampling step size. The experimental results indicate that the proposed framework can provide both a clear and accurate data representation while delivering high-quality visual effects, offering a comprehensive web-based visualization solution.
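As a concrete illustration of the transparency and colorbar parameters discussed above, the following sketch shows one plausible way a sampled scalar value could be mapped to a color and opacity. The function name, the colorbar representation, and the exact mapping are assumptions for illustration, not the framework’s actual implementation.

```javascript
// Map a sampled scalar value to a color and opacity. The user-facing
// transparency parameter is inversely related to per-sample opacity
// (transparency 0 means fully opaque), and a 1D colorbar (an array of
// [r, g, b] entries) maps normalized values to colors.
function mapSample(value, vMin, vMax, colorbar, transparency) {
  // Normalize the scalar value into [0, 1] over the data range.
  const t = Math.min(1, Math.max(0, (value - vMin) / (vMax - vMin)));
  // Nearest-entry lookup into the colorbar, clamped to the last entry.
  const idx = Math.min(colorbar.length - 1, Math.floor(t * colorbar.length));
  // Transparency is inversely related to opacity.
  const opacity = 1 - transparency;
  return { color: colorbar[idx], opacity };
}
```

With a two-entry blue-to-red colorbar, for example, a mid-range value maps to the red entry with opacity 0.9 when the transparency parameter is 0.1, matching the qualitative effect seen when moving from Figures 3a/4a to Figures 3b/4b.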
Additionally, we conducted an experiment on the developed interactive visual analysis tools to demonstrate their capabilities. Figure 5 illustrates the results of the interactive visual analysis of the ocean scalar data.
Figure 5a,b show the volume rendering results of the flow velocity magnitude data, cut along a single direction with vertical and horizontal cutting planes using the volume cutting tool, respectively. Figure 5c further demonstrates the multidirectional cutting capability of this tool, enabling a multifaceted spatial exploration of the acoustic propagation loss data. Unlike traditional slicing methods, the proposed functionality allows for the observation of both the profile and the preserved data block, enabling users to intuitively identify and analyze valuable information on the cutting plane as well as within the data block. Figure 5d presents the volume data within the ROI, filtered with a user-specified value range using the value-based spatial data filtering tool, highlighting the acoustic propagation loss data that users aim to visualize. This tool can effectively isolate the desired feature, enabling a better understanding of the ocean environmental phenomena of interest. The experimental results indicate that these user-friendly interactive visual analysis tools can allow users to gain scientific insights into ocean numerical modeling outputs through straightforward interactive operations, providing a more customizable and flexible data visualization experience.
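The per-sample logic behind the volume cutting and value-based filtering tools described above can be illustrated with a small predicate like the one below; the plane representation and all names are assumptions for illustration rather than the framework’s actual code.

```javascript
// Decide whether a sampling point contributes to the rendered result.
// A sample is discarded when it lies on the removed side of any active
// cutting plane, or when its value falls outside the user-specified range.
// Each plane is { normal: [x, y, z], offset: d }: samples with
// dot(normal, pos) > d lie on the cut-away side. Multiple planes give the
// multidirectional cutting shown in Figure 5c.
function keepSample(pos, value, cutPlanes, valueRange) {
  for (const { normal, offset } of cutPlanes) {
    const d = normal[0] * pos[0] + normal[1] * pos[1] + normal[2] * pos[2];
    if (d > offset) return false; // cut away by this plane
  }
  // Value-based spatial data filtering over the user-specified range.
  return value >= valueRange[0] && value <= valueRange[1];
}
```

In the marching loop, samples failing this predicate would simply be skipped, so the profile on the cutting plane and the preserved data block both remain visible.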

3.2. Performance Analysis of Volume Rendering

Several experiments were conducted in this section to evaluate the performance of the proposed WebGPU-based volume rendering framework. During the experiments, the maximum memory usage of our framework was approximately 700 MB, which is within an acceptable range for most modern hardware configurations. As this work is an enhancement of our previous research [27], the first experiment compares the performance of the visualization solution proposed in this study with that of our previous work, which was based on the Plotly.js graphic library and applied the texture slicing algorithm to render the volume data as translucent image stacks. It is important to emphasize that the number of temperature data points used in this study is approximately three times that of our previous work; similarly, for the acoustic propagation loss data, the number of data points is approximately 1.3 times that of the previous visualization solution. Figure 6 shows the rendering time for the volume data using the two visualization solutions.
As observed during the experiment, there is a noticeable difference in rendering time between the two visualization solutions. As illustrated in Figure 6, the proposed visualization solution renders the volume data significantly faster than the one in our previous work. The visualization solution developed in our previous work presents limitations in processing large datasets and may suffer from slow responses during interactive operations. In contrast, the visualization solution proposed in this study significantly improves the loading speed of volume rendering, reducing the rendering time by a factor of approximately 11.7 for the temperature data and 5.7 for the acoustic propagation loss data. The experiment demonstrates that the visualization solution based on Babylon.js and WebGPU proposed in this study enables efficient three-dimensional visualization and responsive real-time interactions, and it is a feasible solution for visualizing and analyzing large-scale gridded data in a web environment.
To further evaluate the performance advantages of WebGPU, a WebGL-based volume rendering framework was developed with Babylon.js and the same ray casting algorithm, offering the same functionalities as the WebGPU version for comparative analysis. In the second experiment, we compared the frame rate and GPU utilization of both versions using the same rendering parameters and data, as shown in Figure 7.
Figure 7a shows a frame rate increase of approximately 40 frames per second (fps) when rendering the temperature data (5,487,685 data points) with the WebGPU version; for the acoustic propagation loss data (56,700,000 data points), the improvement is approximately 25 fps. The results suggest that the WebGPU version achieves a higher frame rate than the WebGL version, indicating that the WebGPU-based visualization solution provides a smoother visualization experience than the WebGL-based one. In Figure 7b, it is evident that the GPU utilization of the WebGPU version is higher than that of the WebGL version during the rendering process, suggesting that WebGPU can utilize the power of the GPU more effectively. Thus, the WebGPU-based volume rendering framework can ensure immediate feedback in a web environment. The experimental results further confirm the performance advantages and excellent potential of WebGPU in large-scale three-dimensional data visualization, especially in visualization tasks requiring high real-time performance and interactivity. We can conclude that visualization solutions based on WebGPU are expected to play an important role in the field of three-dimensional visualization.

4. Conclusions

The challenges in the real-time interactive visualization of large-volume heterogeneous scalar datasets in a web environment have motivated our design and implementation of the WebGPU-based volume rendering framework for an interactive visualization of large-scale ocean scalar data. The framework proposed in this study first applies the ray casting algorithm, optimized with early ray termination and adaptive sampling methods, as the core algorithm for volume rendering. Next, by completing the initialization of the volume rendering viewer and building vertex and fragment shaders with the Babylon.js 3D rendering engine and the latest WebGPU technology, the framework successfully achieves efficient three-dimensional volume rendering of massive gridded data, which are preprocessed through steps including format conversion, data interpolation, and volume texture generation from complex regular and irregular gridded ocean numerical modeling datasets. It also integrates interactive visual analysis functionalities, including volume cutting, value-based spatial data filtering, and time-series animation playback, supporting visualization with different configurations of parameters. Finally, the WebGPU-based framework is compared to our previous work and an additionally developed WebGL-based version. The experimental results show that the proposed framework performs well in displaying structures and details of ocean scalar data, and it can provide high-quality visual effects while enabling smooth real-time rendering. Although this framework is designed for the visualization of ocean data, we believe that the framework can be easily customized to visualize other scientific data or integrated into different visualization scenarios with some modifications. 
Its extensibility, along with the technical solutions accumulated from our previous work, also allows for straightforward integration with the visualization of other data types, such as seabed terrain or real-time sensor data. The findings further highlight WebGPU’s advantages in visualization tasks requiring high real-time performance and interactivity. In conclusion, visualization solutions based on WebGPU are expected to contribute to the advancement of data visualization and become valuable technical tools in future visualization studies of oceanography, environmental monitoring, and other related research fields. To enable a more comprehensive and insightful analysis of ocean data, future research will focus on incorporating a dynamic three-dimensional visualization of ocean vector fields into this framework, developing more interactive visual analysis tools to enhance intelligent analysis capabilities, and building WebVR applications on top of the existing work to provide a more intuitive presentation of ocean data. Additionally, WebGPU still faces some challenges, such as a steep learning curve for new developers and a relatively small number of available tools. As it is still under continuous development, issues related to instability and compatibility across different browsers and browser versions remain challenging. To address these problems, we will pay close attention to the development of WebGPU and continuously upgrade our framework to enhance its stability and reliability, while continuing to optimize it in conjunction with other emerging web technologies, such as WebAssembly.

Author Contributions

Conceptualization, J.Y. and R.Q.; methodology, J.Y.; software, J.Y.; validation, J.Y.; data curation, Z.X.; writing—original draft preparation, J.Y.; writing—review and editing, J.Y. and R.Q.; supervision, R.Q. and Z.X.; project administration, R.Q.; funding acquisition, R.Q. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Key Research and Development Program of China (2021YFC2800500).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data are not publicly available due to confidentiality agreements related to the project. Access to the data should be requested by contacting the corresponding author, who will require a detailed explanation of the intended use of the data.

Acknowledgments

The authors are grateful to Rufu Qin for his support in the theoretical aspects of this study. The authors thank Zhounan Xu for his help in data curation and preprocessing. The authors thank Rufu Qin for his help in reviewing and editing this paper. We also thank the reviewers and editors for their suggestions to improve the quality of this paper.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Huang, D.; Zhao, D.; Wei, L.; Wang, Z.; Du, Y. Modeling and Analysis in Marine Big Data: Advances and Challenges. Math. Probl. Eng. 2015, 2015, 384742. [Google Scholar] [CrossRef]
  2. Wang, Y.; Li, F.; Zhang, B.; Li, X. Development of a component-based interactive visualization system for the analysis of ocean data. Big Earth Data 2022, 6, 219–235. [Google Scholar] [CrossRef]
  3. Buck, V.; Stäbler, F.; González, E.; Greinert, J. Digital Earth Viewer: A 4D Visualisation Platform for Geoscience Datasets. In Proceedings of the 9th Workshop on Visualisation in Environmental Sciences (EnvirVis), Virtual Event, Switzerland, 14 June 2021; The Eurographics Association: Eindhoven, The Netherlands, 2021; pp. 33–37. [Google Scholar]
  4. Liu, S.; Chen, G.; Yao, S.; Tian, F.; Liu, W. A framework for interactive visual analysis of heterogeneous marine data in an integrated problem solving environment. Comput. Geosci. 2017, 104, 20–28. [Google Scholar] [CrossRef]
  5. Zheng, H.; Shao, Q.; Chen, J.; Shan, Y.; Qin, X.; Ma, J.; Xu, X. LIC color texture enhancement algorithm for ocean vector field data based on HSV color mapping and cumulative distribution function. Acta Oceanol. Sin. 2022, 41, 171–180. [Google Scholar] [CrossRef]
  6. Shi, Q.; Ai, B.; Wen, Y.; Feng, W.; Yang, C.; Zhu, H. Particle System-Based Multi-Hierarchy Dynamic Visualization of Ocean Current Data. ISPRS Int. J. Geo-Inf. 2021, 10, 667. [Google Scholar] [CrossRef]
  7. Xie, C.; Li, M.; Wang, H.; Dong, J. A survey on visual analysis of ocean data. Vis. Inform. 2019, 3, 113–128. [Google Scholar] [CrossRef]
  8. Fang, G.; Wang, D.; Huang, H.; Chen, J. A WebGIS system on the base of satellite data processing system for marine application. In Proceedings of the Remote Sensing for Environmental Monitoring, GIS Applications, and Geology VII, Florence, Italy, 17–20 September 2007; pp. 562–570. [Google Scholar]
  9. Spondylidis, S.; Topouzelis, K.; Kavroudakis, D.; Vaitis, M. Mesoscale Ocean Feature Identification in the North Aegean Sea with the Use of Sentinel-3 Data. J. Mar. Sci. Eng. 2020, 8, 740. [Google Scholar] [CrossRef]
  10. Zhang, Y.; Li, G.; Yue, R.; Liu, J.; Shan, G. PEViz: An in situ progressive visual analytics system for ocean ensemble data. J. Vis. 2023, 26, 423–440. [Google Scholar] [CrossRef]
  11. Gao, X.; Zhang, T. Three dimensional visualization analysis for marine field data based on 3D-GIS. In Proceedings of the 6th International Symposium on Digital Earth: Models, Algorithms, and Virtual Reality, Beijing, China, 9–12 September 2009; pp. 331–337. [Google Scholar]
  12. Su, T.; Cao, Z.; Lv, Z.; Liu, C.; Li, X. Multi-Dimensional visualization of large-scale marine hydrological environmental data. Adv. Eng. Softw. 2016, 95, 7–15. [Google Scholar] [CrossRef]
  13. Xu, C.; Sun, G.; Liang, R. A survey of volume visualization techniques for feature enhancement. Vis. Inform. 2021, 5, 70–81. [Google Scholar] [CrossRef]
  14. El Seoud, M.S.A.; Mady, A.S. A comprehensive review on volume rendering techniques. In Proceedings of the 8th International Conference on Software and Information Engineering, Cairo, Egypt, 9–12 April 2019; pp. 126–131. [Google Scholar]
  15. Zhang, Q.; Eagleson, R.; Peters, T.M. Volume Visualization: A Technical Overview with a Focus on Medical Applications. J. Digit. Imaging 2011, 24, 640–664. [Google Scholar] [CrossRef] [PubMed]
  16. Liu, R.; Guo, H.; Yuan, X. Seismic structure extraction based on multi-scale sensitivity analysis. J. Vis. 2014, 17, 157–166. [Google Scholar] [CrossRef]
  17. Song, Y.; Ye, J.; Svakhine, N.; Lasher-Trapp, S.; Baldwin, M.; Ebert, D. An Atmospheric Visual Analysis and Exploration System. IEEE Trans. Vis. Comput. Graph. 2006, 12, 1157–1164. [Google Scholar] [CrossRef]
  18. Lv, T.; Fu, J.; Li, B. Design and Application of Multi-Dimensional Visualization System for Large-Scale Ocean Data. ISPRS Int. J. Geo-Inf. 2022, 11, 491. [Google Scholar] [CrossRef]
  19. Tian, F.; Mao, Q.; Zhang, Y.; Chen, G. i4Ocean: Transfer function-based interactive visualization of ocean temperature and salinity volume data. Int. J. Digit. Earth 2021, 14, 766–788. [Google Scholar] [CrossRef]
  20. Ates, O.; Appukuttan, S.; Fragnaud, H.; Fragnaud, C.; Davison, A.P. NeoViewer: Facilitating reuse of electrophysiology data through browser-based interactive visualization. SoftwareX 2024, 26, 101710. [Google Scholar] [CrossRef]
  21. Diblen, F.; Hendriks, L.; Stienen, B.; Caron, S.; Bakhshi, R.; Attema, J. Interactive Web-Based Visualization of Multidimensional Physical and Astronomical Data. Front. Big Data 2021, 4, 626998. [Google Scholar] [CrossRef]
  22. Chen, T.-T.; Sun, Y.-C.; Chu, W.-C.; Lien, C.-Y. BlueLight: An Open Source DICOM Viewer Using Low-Cost Computation Algorithm Implemented with JavaScript Using Advanced Medical Imaging Visualization. J. Digit. Imaging 2023, 36, 753–763. [Google Scholar] [CrossRef]
  23. Fan, D.; Liang, T.; He, H.; Guo, M.; Wang, M. Large-Scale Oceanic Dynamic Field Visualization Based on WebGL. IEEE Access 2023, 11, 82816–82829. [Google Scholar] [CrossRef]
  24. Lu, M.; Wang, X.; Liu, X.; Chen, M.; Bi, S.; Zhang, Y.; Lao, T. Web-Based real-time visualization of large-scale weather radar data using 3D tiles. Trans. GIS 2021, 25, 25–43. [Google Scholar] [CrossRef]
  25. Li, W.; Liang, C.; Yang, F.; Ai, B.; Shi, Q.; Lv, G. A Spherical Volume-Rendering Method of Ocean Scalar Data Based on Adaptive Ray Casting. ISPRS Int. J. Geo-Inf. 2023, 12, 153. [Google Scholar] [CrossRef]
  26. Liu, L.; Silver, D.; Bemis, K. Visualizing Three-Dimensional Ocean Eddies in Web Browsers. IEEE Access 2019, 7, 44734–44747. [Google Scholar] [CrossRef]
  27. Qin, R.; Feng, B.; Xu, Z.; Zhou, Y.; Liu, L.; Li, Y. Web-based 3D visualization framework for time-varying and large-volume oceanic forecasting data using open-source technologies. Environ. Model. Softw. 2021, 135, 104908. [Google Scholar] [CrossRef]
  28. Usta, Z. WebGPU: A New Graphic API for 3D WebGIS Applications. In Proceedings of the 8th International Conference on GeoInformation Advances, Istanbul, Turkey, 11–12 January 2024; pp. 377–382. [Google Scholar]
  29. Wang, Z.; Yang, L. Performance optimization methods for large scene in WebGL. In Proceedings of the 6th International Conference on Computer Information Science and Application Technology (CISAT 2023), Hangzhou, China, 26–28 May 2023; pp. 1360–1365. [Google Scholar]
  30. Chickerur, S.; Balannavar, S.; Hongekar, P.; Prerna, A.; Jituri, S. WebGL vs. WebGPU: A Performance Analysis for Web 3.0. Procedia Comput. Sci. 2024, 233, 919–928. [Google Scholar] [CrossRef]
  31. Yu, G.; Liu, C.; Fang, T.; Jia, J.; Lin, E.; He, Y.; Fu, S.; Wang, L.; Wei, L.; Huang, Q. A survey of real-time rendering on Web3D application. Virtual Real. Intell. Hardw. 2023, 5, 379–394. [Google Scholar] [CrossRef]
  32. Rew, R.; Davis, G. NetCDF: An interface for scientific data access. IEEE Comput. Graph. Appl. 1990, 10, 76–82. [Google Scholar] [CrossRef]
  33. Feng, C.; Qin, T.; Ai, B.; Ding, J.; Wu, T.; Yuan, M. Dynamic typhoon visualization based on the integration of vector and scalar fields. Front. Mar. Sci. 2024, 11, 1367702. [Google Scholar] [CrossRef]
  34. Jia, Z.; Chen, D.; Wang, B. Research on Improved Ray Casting Algorithm and Its Application in Three-Dimensional Reconstruction. Shock Vib. 2021, 2021, 8718523. [Google Scholar] [CrossRef]
  35. Qu, N.; Yan, Y.; Cheng, T.; Li, T.; Wang, Y. Construction of Underground 3D Visualization Model of Mining Engineering Based on Shear-Warp Volume Computer Rendering Technology. Mob. Inf. Syst. 2022, 2022, 8472472. [Google Scholar] [CrossRef]
  36. Zhang, X.; Yue, P.; Chen, Y.; Hu, L. An efficient dynamic volume rendering for large-scale meteorological data in a virtual globe. Comput. Geosci. 2019, 126, 1–8. [Google Scholar] [CrossRef]
  37. Ess, E.; Sun, Y. Visualizing 3D vector fields with splatted streamlines. In Proceedings of the Visualization and Data Analysis 2006, San Jose, CA, USA, 15–19 January 2006. [Google Scholar]
  38. Levoy, M. Display of surfaces from volume data. Comput. Des. 1988, 20, 29–37. [Google Scholar] [CrossRef]
  39. Weiler, M.; Kraus, M.; Merz, M.; Ertl, T. Hardware-Based ray casting for tetrahedral meshes. In Proceedings of the IEEE Conference on Visualization, Seattle, WA, USA, 19–24 October 2003; pp. 333–340. [Google Scholar]
  40. Wang, H.; Xu, G.; Pan, X.; Liu, Z.; Lan, R.; Luo, X.; Zhang, Y. A Novel Ray-Casting Algorithm Using Dynamic Adaptive Sampling. Wirel. Commun. Mob. Comput. 2020, 2020, 8822624. [Google Scholar] [CrossRef]
  41. Wang, J.; Bi, C.; Deng, L.; Wang, F.; Liu, Y.; Wang, Y. A composition-free parallel volume rendering method. J. Vis. 2021, 24, 531–544. [Google Scholar] [CrossRef]
  42. Jing, G.; Song, W. An octree ray casting algorithm based on multi-core cpus. In Proceedings of the International Symposium on Computer Science and Computational Technology, ISCSCT, Shanghai, China, 20–22 December 2008; pp. 783–787. [Google Scholar]
  43. Keil, J.; Edler, D.; Schmitt, T.; Dickmann, F. Creating Immersive Virtual Environments Based on Open Geospatial Data and Game Engines. KN J. Cartogr. Geogr. Inf. 2021, 71, 53–65. [Google Scholar] [CrossRef]
Figure 1. Schematic showing the general structure of the datasets: (a) the ocean environmental parameters forecast dataset; (b) the ocean acoustic field forecast dataset.
Figure 2. Main workflow of the WebGPU-based volume rendering framework for interactive visualization of large-scale ocean scalar data.
Figure 3. Volume rendering results of regular gridded data: (a) the visualization of the complete temperature data with a transparency value of 0 and a basic sampling step size of 0.2; (b) the visualization of the complete flow velocity magnitude data with a transparency value of 0.1 and a basic sampling step size of 0.4.
Figure 4. Volume rendering results of irregular gridded data: (a) the visualization of the complete acoustic propagation loss data with a transparency value of 0 and a basic sampling step size of 0.1; (b) the visualization of the same data with a transparency value of 0.3 and a basic sampling step size of 0.3.
Figure 5. Interactive visualization results: (a) the flow velocity magnitude data cut along the latitude of 9.127 °N; (b) the flow velocity magnitude data cut at a vertical depth of −175 m; (c) the acoustic propagation loss data cut along the longitude of 115.589 °E and the latitude of 9.749 °N; (d) the ROI of the acoustic propagation loss data filtered with the data values ranging from 175.901 dB to 313.694 dB. Each red box in the subfigures indicates the key corresponding interactive visual analysis tool.
Figure 6. Rendering time comparison of volume rendering of the temperature and acoustic propagation loss data between the visualization solution proposed in this study and the one proposed in our previous work.
Figure 7. Performance comparison of volume rendering of the temperature and acoustic propagation loss data between the WebGPU and WebGL versions based on Babylon.js: (a) frame rate comparison; (b) GPU utilization comparison.

Share and Cite

Yu, J.; Qin, R.; Xu, Z. The Implementation of a WebGPU-Based Volume Rendering Framework for Interactive Visualization of Ocean Scalar Data. Appl. Sci. 2025, 15, 2782. https://doi.org/10.3390/app15052782

