
Big Data in Seismology: Methods and Applications

A special issue of Applied Sciences (ISSN 2076-3417). This special issue belongs to the section "Earth Sciences".

Deadline for manuscript submissions: closed (30 September 2023) | Viewed by 8589

Special Issue Editors


Guest Editor
Department of Electronic Engineering, Hellenic Mediterranean University, 3 Romanou Str., Chalepa, 73133 Chania, Greece
Interests: seismic big data; deep learning; spatial data processing; seismic data processing; earth science informatics; analysis and visualization of earth science data; seismic sequences

Guest Editor
Department of Electronic Engineering, Hellenic Mediterranean University, 3 Romanou Str., Chalepa, 73133 Chania, Greece
Interests: seismic big data; deep learning; heterogeneous parallel processing; spatio-temporal algorithms

Guest Editor
Department of Surveying and Geoinformatics Engineering, University of West Attica, 12243 Athens, Greece
Interests: geoinformatics; spatial data processing; geodetic satellite positioning; networks of permanent GNSS reference stations; geodetic reference systems; monitoring tectonic displacements with GNSS

Special Issue Information

Dear Colleagues,

We invite you to contribute to this Special Issue on Big Data in Seismology: Methods and Applications.

The term "big data" has grown in prominence over the last decade, and the field of big geospatial data has consequently received considerable attention across disciplines. This Special Issue outlines the development of big data theory and practice as a scientific area, as well as its key characteristics and processing strategies. Big data sources in the Earth sciences include, for instance, meteorology, seismic exploration, and remote sensing; seismic monitoring data, when combined with other geophysical data, can likewise become big data. In addition, cloud computing services now give researchers instant access to processing power.

Additionally, parallel and distributed computing enable scientists to run many computations concurrently, often distributed across multiple workstations. Data have long played a significant role in the geosciences, and recent scientific and professional advances in data collection and in analysis with big data and machine learning techniques have opened new potential for their extensive use.

Since the Mediterranean lies on the tectonic boundary between the African and Eurasian plates, it is a particularly seismically active region. It is also a highly populated area with significant exposure to seismic risk. Applying approaches that detect and analyze earthquakes would yield rich, high-resolution datasets, which would in turn help shed more light on the underlying mechanisms. We aim to publish original research on earthquake swarms and complex seismic sequences in volcanic and tectonic environments, advancing our understanding of the physics of the processes at their genesis.

For this Special Issue, we encourage articles, reviews, and technical reports that combine theory and experiments to process seismic data using parallel computing and deep learning. Appropriate articles will discuss the most recent advancements and cutting-edge research in the geospatial big data field, with a focus on the following issues:

  • Reporting and characterizing swarm-like and complex sequences in terms of spatial and temporal evolution, scaling properties, and triggering processes;
  • Data selection and acquisition;
  • Seismotectonic and seismicity of the region;
  • Artificial intelligence and geoscience-based design methodologies (machine learning and deep learning);
  • The role of data science in both conventional and emerging problems in the geosciences;
  • Needs and perspectives for the use of data in geosciences;
  • Simulating the essential mechanical conditions resulting in rupture dynamics using laboratory and numerical models.
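One of the scaling properties mentioned above, the Gutenberg-Richter b-value, is commonly estimated from catalog magnitudes with Aki's maximum-likelihood formula, b = log10(e) / (mean(M) - Mc). A minimal sketch follows; the catalog is synthetic and the completeness magnitude Mc is a hypothetical value, not taken from any paper in this issue.

```python
import math
import random

random.seed(42)

# Synthetic catalog: magnitudes drawn from an exponential (Gutenberg-Richter)
# distribution with a true b-value of 1.0 above completeness magnitude Mc.
Mc = 2.5
true_b = 1.0
beta = true_b * math.log(10)  # rate parameter of the exponential magnitude tail
mags = [Mc + random.expovariate(beta) for _ in range(5000)]

# Aki (1965) maximum-likelihood estimate: b = log10(e) / (mean(M) - Mc)
mean_mag = sum(mags) / len(mags)
b_est = math.log10(math.e) / (mean_mag - Mc)
print(round(b_est, 2))  # close to the true value of 1.0
```

On a real catalog, events below the completeness threshold Mc must be discarded first, and magnitudes reported in bins call for the usual half-bin correction to Mc.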

Dr. Alexandra Moshou
Dr. Antonios Konstantaras
Dr. Michail Gianniou
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Applied Sciences is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (3 papers)


Research

22 pages, 12112 KiB  
Article
Estimating Liquefaction Susceptibility Using Machine Learning Algorithms with a Case of Metro Manila, Philippines
by Joenel Galupino and Jonathan Dungca
Appl. Sci. 2023, 13(11), 6549; https://doi.org/10.3390/app13116549 - 27 May 2023
Cited by 2 | Viewed by 4306
Abstract
Soil liquefaction is a phenomenon that can occur when soil loses strength and behaves like a liquid during an earthquake. A site investigation is essential for determining a site's susceptibility to liquefaction, and these investigations frequently generate project-specific geotechnical reports. However, many of these reports are stored unused after construction projects are completed. This study suggests that, when consolidated and integrated, these unused reports can provide valuable information for identifying potential challenges such as liquefaction. The study evaluates liquefaction susceptibility by considering several geotechnical factors modeled by machine learning algorithms. It estimated site-specific characteristics such as ground elevation, groundwater table elevation, SPT N-value, soil type, and fines content. Using a calibrated model represented by an equation, the investigation determined several soil properties, including the unit weight and peak ground acceleration (PGA). The study estimated PGA using a linear model, which revealed a significant positive correlation (R2 = 0.89) between PGA, earthquake magnitude, and distance from the seismic source. For the Marikina West Valley Fault, the study also assessed the liquefaction hazard for an anticipated magnitude 7.5 earthquake and delineated a map that was validated against prior studies.
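The abstract does not give the linear PGA model explicitly, so the sketch below fits a generic attenuation-style form, ln(PGA) = a + b*M + c*ln(R), by ordinary least squares. The data, coefficients, and functional form are illustrative assumptions, not the authors' calibrated equation.

```python
import numpy as np

# Hypothetical training data: magnitude M, source distance R (km), and
# observed peak ground acceleration (g). Values are illustrative only:
# the "observations" are generated from known coefficients so the fit
# can be checked against them.
M = np.array([5.0, 5.5, 6.0, 6.5, 7.0, 7.5])
R = np.array([40.0, 35.0, 30.0, 25.0, 20.0, 15.0])
pga = np.exp(0.8 * M - 1.1 * np.log(R) - 3.0)

# Ordinary least squares on ln(PGA) = a + b*M + c*ln(R)
X = np.column_stack([np.ones_like(M), M, np.log(R)])
coef, *_ = np.linalg.lstsq(X, np.log(pga), rcond=None)
a, b, c = coef

def predict_pga(mag, dist_km):
    """Predict PGA (g) for a given magnitude and distance (km)."""
    return np.exp(a + b * mag + c * np.log(dist_km))

print(predict_pga(7.5, 10.0))
```

Working in log space keeps the model linear in its coefficients, which is why a single least-squares solve suffices.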
(This article belongs to the Special Issue Big Data in Seismology: Methods and Applications)

14 pages, 2352 KiB  
Article
Calculation of Theoretical Travel Time and Automatic Picking of Actual Travel Time in Seismic Data
by Wenqi Gao, Youxue Wang, Yang Yang, Sanxi Peng, Songping Yu, Lu Liu and Lei Yan
Appl. Sci. 2023, 13(3), 1341; https://doi.org/10.3390/app13031341 - 19 Jan 2023
Viewed by 1707
Abstract
We used a ray tracing technique based on the IASP91 Earth model to calculate travel times in order to identify seismic phases. The technique can calculate travel times for the phases listed in conventional travel-time tables. Waveform data received from stations in the Guangxi area were selected for analysis and discussion. The numerical modeling and its application show good agreement, in terms of absolute differences, between the calculated travel times and the theoretical travel times from the IASP91 tables. Relative residuals are determined directly from the actual arrival-time picks obtained during correlation analysis, demonstrating the validity of the travel-time method for picking seismic phases by correlation analysis.
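The correlation-based picking mentioned in the abstract can be sketched as follows: the lag that maximizes the cross-correlation between two traces gives their relative travel-time shift. The synthetic wavelet and the 0.25 s delay below are illustrative assumptions, not the authors' data or code.

```python
import numpy as np

dt = 0.01                 # sample interval in seconds
t = np.arange(0, 10, dt)  # 10 s of samples

def pulse(t0):
    """A Gaussian-enveloped 5 Hz wavelet arriving at time t0."""
    return np.exp(-((t - t0) ** 2) / 0.1) * np.cos(2 * np.pi * 5 * (t - t0))

trace_a = pulse(3.0)         # reference trace
trace_b = pulse(3.0 + 0.25)  # same phase arriving 0.25 s later

# Full cross-correlation; the lag of the peak is the relative time shift.
xcorr = np.correlate(trace_b, trace_a, mode="full")
lag_samples = np.argmax(xcorr) - (len(trace_a) - 1)
shift = lag_samples * dt
print(shift)  # relative delay of trace_b with respect to trace_a
```

Resolution here is limited to one sample interval; sub-sample precision would require interpolating around the correlation peak.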
(This article belongs to the Special Issue Big Data in Seismology: Methods and Applications)

16 pages, 4745 KiB  
Article
An Interactive System Based on the IASP91 Earth Model for Earthquake Data Processing
by Wenqi Gao, Youxue Wang and Songping Yu
Appl. Sci. 2022, 12(22), 11846; https://doi.org/10.3390/app122211846 - 21 Nov 2022
Viewed by 1579
Abstract
An interactive human-computer data processing system, based on the IASP91 Earth model and implemented on the Intel Fortran platform, was designed for visualizing earthquake data. The system reads and processes broadband seismic data acquired by field stations, covering the reading and import of raw data, pre-processing, identification of seismic phases, and cross-correlation travel-time picking. During development, shortcomings were addressed and functions were gradually refined and enhanced, making data processing easier and faster. The system has already processed more than 1000 large seismic events recorded by the station from 2013 to 2018. Practical application shows that this human-computer interaction system is easy to operate, accurate, fast, and flexible, and is an effective tool for processing seismic data.
(This article belongs to the Special Issue Big Data in Seismology: Methods and Applications)
