1. Introduction
The availability of an accurate and effective method for measuring traffic quality and the congestion level on a segment or part of a network is a fundamental aspect of the planning, design, and control of a transport system. In the field of road transport, and for highways in particular, several decades of study and research have pursued this specific objective, debating the meaning of circulation quality and congestion in order to define models and operating procedures capable of representing and observing them.
Over the years, the growing availability of information about mobility and traffic has led to further considerations related to using these data to monitor traffic quality and to control/manage infrastructure systems and mobility demand. These systems are known as Intelligent Transportation Systems (ITSs) [1]. As noted in [2], although the ITS research topic in the highway sector has been central in the last twenty years, it has only occasionally been transformed into medium- and long-term projects. Today, however, the proliferation of intelligent technology, which we often refer to as smart technology, is the primary driver of innovation in the highway sector [3]. The smart road represents the frontier of innovation in the road and highway sector, where the most innovative automation and communication technologies, big data, and artificial intelligence are integrated to offer a transport service suited to the highly dynamic needs of modern mobility [3].
From this point of view, it is evident that the availability of data on the movements of road users has experienced an incredible boost in recent years; today, data perform the lion's share of the work of smart technology, especially in the highway field. The ubiquitous deployment of tracking technology based on personal and compact GPS or cell phone devices has made available, for instantaneous and massive use, various online information systems and platforms capable of representing the state of use of the road network. The analysis of this information, commonly referred to as big data and complemented by social applications with information shared by users, has made it possible to offer the public immediate tools for inquiring about the traffic situation in a specific territorial context. Online platforms, among which we can mention the widely used services provided by Google Traffic, TomTom Move, Bing Traffic, Inrix, and Waze, provide information on travel times and travel speeds on roads or road sections on simple and easily navigable maps, both in real time (live traffic) and as average values over selectable daily and hourly intervals (typical seasonal daily situations). On the one hand, we can say that these tools provide a simple and direct representation of the quality of circulation. On the other hand, the information they transfer, in terms of average speeds or travel times, or even more aggregated qualitative indices, is only partial for those dealing with the planning, design, and control of transport systems and networks.
Traffic engineering relies on multidimensional knowledge of traffic flow phenomena, and average information about speed or travel time for a sample of users, if not accompanied by additional information, is inadequate to characterize infrastructure operations. Only through in-depth knowledge of traffic phenomena and the macroscopic and microscopic study of the involved variables can we represent traffic conditions and circulation quality and evaluate the distance from congestion.
Starting with Greenshields's seminal works in the first half of the last century [4], several perspectives have been proposed in the search for a method to study traffic behavior and measure circulation quality. From this point of view, we can first differentiate two approaches [5]: the first defines circulation quality through specific quantitative indices built on traffic variables, while the second associates circulation quality with standardized classes of service.
Among the first traffic flow scholars of the last century, Greenshields, Platt, and Drew proposed studies and methods related to the first approach, adopting, from time to time, specific quality indices [6,7,8,9,10]. As highlighted in [5], this approach has not had significant applications, probably due to what appears to be an abstract definition of the quality of circulation, a certain arbitrariness in the selection of variables, and, often, the difficulty of measuring them in actual situations on the road.
The second approach includes the methods and procedures of the different editions, up to the latest, current one, of the Highway Capacity Manual (HCM) [11], which is widespread internationally and adopted in various national contexts. The generalized speed–flow curves for standard cases, periodically updated in the various HCM editions, and their easy applicability for technicians and practitioners have made the fortune of this approach, which is now extremely widespread all over the world for the analysis of the quality of circulation.
In addition to these two approaches, we can add a third one that links traffic circulation quality to some reliability measures, i.e., the reliability approach. As in [12], we can consider a system reliable if it can fully perform the tasks for which it was designed. From this point of view, system reliability is the probability that it adequately performs its tasks within a given time interval and under specified environmental conditions. Going beyond descriptive studies and applications based on the analysis of travel time reliability (e.g., [13,14,15,16,17,18,19]), the approach that we define here in terms of reliability collects the theoretical contributions of the analysis of speed random processes [1,12,20,21,22,23,24,25] and of the probabilistic analysis of breakdown phenomena and capacity [26,27,28,29,30]. In general, we can observe that the contributions of the probabilistic analysis of breakdowns have found various applications in recent years, up to being included in the HCM (in a simplified way) for the estimation of the probabilistic distribution of the lane capacity [11]. Despite its numerous applications in the estimation of statistical models for short-term forecasts, of which we find an interesting review in [31], the analysis of speed processes has been scarcely used in the assessment of circulation quality, perhaps due to its greater theoretical complexity and computational burden in practical applications.
Aim of Research
In the context briefly outlined here, the work presented in this paper addresses the two lines of investigation cited above for the analysis of reliability, i.e., stochastic speed processes and the probabilistic analysis of breakdown phenomena and capacity, to propose a single procedural framework able to integrate both and to connect them with the critical elements of LOS analyses according to the HCM.
Thus, the paper examines the probability of a traffic breakdown, the probabilistic distribution of lane capacity, and the probability of a lane operating at any LOS, based on the analysis of the stochastic processes of vehicle speeds in a freeway section. The objective is to propose a more general approach, going beyond the relationships that we can find in the literature, which were obtained under certain initial and boundary conditions.
The proposed procedure can be useful for evaluating circulation quality and the Level of Service on a freeway segment from the reliability point of view, by observing, from time to time, how the speed processes develop in the situation under observation, both for historical data and for applications in near real-time.
The paper is organized as follows: Section 2 addresses the primary literature references and the theoretical and applicative assumptions for measuring the quality of freeway traffic. Section 3 presents some fundamental aspects of the analysis of speed processes, with further details in Appendix A. Section 4 describes the model characterization and the quantitative procedure that the paper proposes. Finally, Section 5 presents an explicative application to a real case on the Italian Brenner A22 freeway.
3. Speed Random Process
3.1. Speed Random Process Definition
To analyze the reliability of freeway traffic, it is necessary to investigate the behaviors that characterize a vehicular flow made up of vehicles that follow each other. Considering the sequence of vehicles passing a section $S$ of the leftmost lane of a freeway carriageway, we consider the instants $t$ at which each component of the vehicle succession passes through section $S$ and the speed $v_t$ of the vehicle passing at instant $t$. Therefore, the sequence $v_1, v_2, \ldots, v_t, \ldots$ is the realization of the random process that we indicate as the speed process.
If we denote by $m_t$ the dynamic mean value, or level, at instant $t$, conditioned by the previous realization of the speed process up to $t-1$, it can be verified that $v_t = m_t + a_t$ [12,20,25]. It means that the vehicle passing at time $t$ has a speed deviation $a_t$ with respect to dynamic speed level $m_t$ due to the above realization, where $a_t$ is a random variable with mean zero and variance $\sigma_a^2$ [12,20,25]. If we consider the whole vehicle succession, all $a_t$ are independent and identically distributed random variables with zero mean and variance $\sigma_a^2$. Some experiments have proved [20] that if at instant $t-1$ there is a deviation $a_{t-1}$ from level $m_{t-1}$, this deviation influences the level at $t$, i.e., the conditional mean of the previous realization up to $t-1$, by a quantity $C \cdot a_{t-1}$, with $C$ being a coefficient between 0 and 1. Thus, we can write $m_t = m_{t-1} + C \cdot a_{t-1}$, and setting $\theta = 1 - C$, it results that $v_t - v_{t-1} = a_t - \theta \cdot a_{t-1}$ (see Appendix A), which corresponds to an ARIMA (0,1,1) model [12,20,25].
It can be shown (see Appendix A) that the study of traffic reliability involves the analysis of the speed process defined by $v_t = m_t + a_t$. With the level increments $m_t - m_{t-1} = C \cdot a_{t-1}$ and their distribution completely defined by $C^2\sigma_a^2$, this same quantity represents a measure of the reliability of the speed process, i.e., of the greater or lesser probability that the flow is stable over time. If, on the one hand, $C$ is small (at the limit equal to zero, with low flow rate and large spacings) or if $\sigma_a^2$ is small even in the presence of a non-low $C$, the probability that the process deviates appreciably from a constant value over time is low, and the process can be considered stable, with vehicles that condition each other little; speeds appear normally distributed with constant mean equal to level $m$ and variance $\sigma_a^2$ coinciding with that of residuals $a_t$, and their sequence is the realization of a renewal process. If, on the other hand, $C$ and $\sigma_a^2$ are not too small, the probability of a consistent deviation from the constant value over time is high, and the speed process is constantly unstable. In any case, parameters $C$ and $\sigma_a^2$ are estimable considering the process defined by $z_t = v_t - v_{t-1} = a_t - \theta \cdot a_{t-1}$, which is a stationary process with identically distributed terms, that is, a first-order moving average (MA (1)) process [12,20,25].
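The mechanism just described can be sketched in code. The following minimal Python example (our own illustration; the function name and parameter values are assumptions, with speeds in km/h) generates one realization of the speed process $v_t = m_t + a_t$ with level updates $m_t = m_{t-1} + C \cdot a_{t-1}$, i.e., the ARIMA (0,1,1) form with $\theta = 1 - C$:

```python
import random

def simulate_speed_process(n, m0=110.0, C=0.3, sigma_a=2.0, seed=42):
    """Generate one realization of the ARIMA (0,1,1) speed process:
    v_t = m_t + a_t, with level updates m_t = m_{t-1} + C * a_{t-1}.
    Parameter values are illustrative (speeds in km/h)."""
    rng = random.Random(seed)
    m, a_prev, speeds = m0, 0.0, []
    for _ in range(n):
        m += C * a_prev              # level conditioned by the past deviation
        a = rng.gauss(0.0, sigma_a)  # i.i.d. deviation: zero mean, variance sigma_a^2
        speeds.append(m + a)
        a_prev = a
    return speeds
```

With $C = 0$ the level never moves, and the speeds reduce to independent normal draws around $m_0$: the stable, renewal-process case described above.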
3.2. Speed Process and Flow Rate Analysis
Going back to section $S$ on the leftmost lane of a freeway carriageway, we can consider a succession $\tau_1, \tau_2, \ldots, \tau_j, \ldots$ of time intervals of duration $\tau$ covering an interval of total duration $T$ with an almost constant flow rate. Let $n_1, n_2, \ldots, n_j, \ldots$ be the sequence of the numbers of vehicles transited in each interval, each one the realization of a random process. Considering the succession of the vehicles passing during $\tau_j$ as the realization of a first-order stationary process, if $Q_j$ is the flow rate in generic interval $\tau_j$, i.e., $Q_j = n_j/\tau$, and $m_t$ is the dynamic mean speed at the instant of the passage of the $t$-th vehicle, density can be estimated as the ratio between $Q_j$ and $m_t$, i.e., $k_t = Q_j/m_t$. If flow rate $Q_j$ is constant during $\tau_j$, random variations in speed level $m_t$ generate random variations in density $k_t$. If these random variations produce a $k_t$ exceeding a limit value $k_{lim}$, we have traffic instability at the instant of the passage of the $t$-th vehicle. Thus, the probability of having a certain instant during $\tau_j$ with $k_t \geq k_{lim}$ identifies the probability that a traffic block phenomenon may occur in $S$ during $\tau_j$, given a density threshold $k_{lim}$.
Instability, as a random event, depends on the random process of the speed level. Thus, the probability that the instability event does not occur in a certain instant defines the reliability of the flow. In these terms, the reliability of a constant traffic flow for a time interval $\tau$ can be defined as the conditional probability that dynamic speed level $m_t$ does not decrease during $\tau$ in such a way that density can reach limit density $k_{lim}$ (provided that at the start of $\tau$, the level of speed corresponds to a stable flow). If this occurs, the control mechanism implemented by the drivers goes into crisis, and the flow becomes unstable.
Thus, if we consider a sequence of speed values measured over a reasonably long period of time $T$, during which flow varies in a sufficiently wide range for the same environmental conditions, traffic composition, and driver population, the whole sequence of the detected speeds can be subdivided into sequences that are the realizations of homogeneous processes. The generic $j$-th process is characterized by specific values of $C_j$ and $\sigma_{a,j}^2$, lasting $\tau_j$, with a certain flow $Q_j$, and by a density $k_t = Q_j/m_t$ at the instant of the passage of the $t$-th vehicle of the process realization. The values of $C_j$ and $\sigma_{a,j}^2$ can be estimated considering that, for each identified sequence, $z_t = v_t - v_{t-1}$ is the realization of a first-order moving average (MA (1)) process.
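The subdivision of a transit record into sub-sequences and the computation of their macroscopic quantities can be sketched as follows (a simplified Python illustration; the function name and the fixed-size grouping are assumptions, anticipating the 50-vehicle grouping used later in the paper):

```python
def subsequences(timestamps, speeds, n=50):
    """Split a transit record into consecutive n-vehicle sub-sequences and
    compute, for each, the hourly flow rate Q (veh/h), the harmonic mean
    speed v_hm (km/h), and the approximate density k = Q / v_hm (veh/km).
    timestamps are in seconds and strictly increasing; speeds are in km/h."""
    out = []
    for i in range(0, len(speeds) - n + 1, n):
        ts, vs = timestamps[i:i + n], speeds[i:i + n]
        duration = ts[-1] - ts[0]            # seconds spanned by the n transits
        Q = 3600.0 * (n - 1) / duration      # n transits -> n - 1 headways
        v_hm = n / sum(1.0 / v for v in vs)  # harmonic mean speed
        out.append((Q, v_hm, Q / v_hm))
    return out
```

For example, 50 transits at constant 2 s headways and 100 km/h give a flow rate of 1800 veh/h and a density of 18 veh/km.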
3.3. Speed Process and Traffic Stream Reliability
Parameter $C^2\sigma_a^2$, which we can estimate for each sequence $j$, appears to be an increasing linear function of the logarithm of average density $k$. Some previous studies [12,20,21,22] demonstrated that, regardless of the different circumstances in which the observations are made, $C^2\sigma_a^2$ can be well represented by a linear equation in $\ln k$. The same studies also confirmed that this linear trend passes through the point [$k = 30$ veh/km ($\ln k = 3.4$); $C^2\sigma_a^2 = 1.5$ m$^2$ s$^{-2}$]. The linear equation, and in particular its angular coefficient $b$, summarizes the current characteristics of the traffic flow with respect to its vehicular composition, the population of drivers, and the environment. Its intersection with the axis of density identifies average limit density $k_{lim}$, at which the traffic flow is only possible if the speed fluctuations do not exist and beyond which there is a high probability of a crisis of the traffic flow. In these terms, the limit value represents a measure of the critical density.
In consideration of the above, we can observe that reliability depends on the following:
A time interval $\tau$, in which we evaluate the probability that an instability event does not occur;
A flow rate $Q$, which we assume is constant throughout $\tau$;
The angular coefficient $b$ of the straight line of $C^2\sigma_a^2$ versus $\ln k$, which summarizes the characteristics of the traffic flow.
Through simulations of the random process of the speed levels between zero speed and maximum speed, it is possible to obtain [12,20,21,22,23] the expressions of reliability $R$ for a traffic flow in the leftmost lane of a freeway carriageway depending on $\tau$ (in minutes), $Q$ (in vehicles/hour/lane), and $b$ (in m$^2$ km s$^{-2}$). Considering the level of speed at the beginning of $\tau$ (corresponding to a stable flow) expressed by the linear equation discussed above, according to [20], we can consider the regression of $R$ with respect to $\tau$, $Q$, and $b$, given by Equation (1).
Furthermore, starting from this expression, it is possible to obtain the value of flow rate $Q$ that transits on the leftmost lane of a freeway carriageway with a reliability value $\Phi$, given by Equation (2).
If we set a reliability value $\Phi$ close to one (generally, $\Phi = 0.8$–$0.9$) and a conventional period (e.g., $\tau = 15$ min), the $Q$ value obtained with Equation (2) is capacity $Q_c$ of the fastest lane (i.e., the leftmost lane). The instability, which is reached with probability $1 - \Phi$ when the flow rate on the leftmost lane reaches its capacity $Q_c$ ($Q/Q_c = 1$), determines the flow instability over the entire carriageway. In this regard, in a two-lane carriageway, we can determine the carriageway capacity by considering the experimental relationships between flow rates $Q_1$ and $Q_2$ on the two lanes, as the total flow rate $Q_{tot}$ on the entire carriageway varies [12,46].
The knowledge of reliability $R$ in a section and of its variation over time can be used to prepare functional criteria for activating traffic control strategies [1]. These can prevent the onset of instability phenomena by acting on the values of $b$ and $Q$. Based on Equation (1), in fact, if $Q$ and $b$ are high, it is possible to increase $R$ by reducing $b$ (hence, variance $C^2\sigma_a^2$) even without acting on traffic flow rate $Q$. If, on the other hand, the value of $R$ below the reliability threshold occurs with a value of $b$ that is already low, then the increase in $R$ cannot be achieved, except with an adjustment of flow rate $Q$ [20].
As we have seen, the analysis of the speed process allows us to obtain a formulation of reliability $R$ expressed by Equation (1) and to derive the probabilistic distribution of capacity based on Equation (2), according to the method introduced in [12,20]. It should be emphasized, however, that Equations (1) and (2) are regressions obtained by simulating random processes under specific boundary conditions. Therefore, they are not directly generalizable but require a simulation and regression process for each application, to be contextualized from time to time. On the other hand, current simulation techniques and improved computational performance make it possible to study stability through the analysis of the speed processes in near-real-time mode, with evaluations carried out under the current flow conditions. This type of analysis is the novelty proposed in this paper. Details about the proposed model are presented in Section 4, and an exemplificative application is presented in Section 5. Thus, the main goal of the proposed method is to go beyond the use of standardized functions from the literature. As we said, these functions result from regressions on simulations made once and for all with well-defined assumptions and boundary conditions [12,20,21,22,23], leading to Equations (1) and (2).
4. Simulation and Analysis Procedure
4.1. Speed Process Simulation Model for Reliability Analysis
As mentioned in the previous sections, the study of the reliability of traffic in the leftmost lane of a freeway carriageway, with hourly flow rate $Q$ passing through a section in a time interval $\tau$ with a dynamic average speed level $m_t$, involves the analysis of the speed random process defined by $v_t = m_t + a_t$, with $m_t = m_{t-1} + C \cdot a_{t-1}$ and the distribution of the level increments fully defined by $C^2\sigma_a^2$, according to an ARIMA (0,1,1) model.
To estimate the probability that, during a certain time interval $\tau$, hourly flow rate $Q$ can determine a congestion phenomenon, i.e., when average density $k$ exceeds a threshold value $k_{lim}$ with the mean speed below the corresponding limit value, it is possible to simulate the realization of a large number of speed sequences on the basis of parameters $C$ and $\sigma_a^2$. For crisis probability estimation, we can verify how many simulated sequences produce $k \geq k_{lim}$. Parameters $C$ and $\sigma_a^2$, which particularize the ARIMA (0,1,1) model, can be estimated on the basis of the recorded speed sequence, considering the random process defined by the speed differences, i.e., $z_t = v_t - v_{t-1} = a_t - \theta \cdot a_{t-1}$, which is a stationary process with identically distributed terms, according to a first-order moving average (MA (1)) process.
In a real-life situation, with a succession of vehicle transits, we can assume that the related speed sequence is the realization of the process with parameters $\theta$ and $\sigma_a^2$. Considering the recorded speed values $v_t$, the estimates of $\theta$ and $\sigma_a^2$ can be obtained using the models of econometric statistics, taking care to test the primary hypothesis of adequacy of the autoregressive model.
Let us consider a speed sequence over a time period $T$, during which hourly flow rate $Q$ varies within a sufficiently wide range of values for the same environmental conditions, traffic composition, and driver population. The sequence of the detected speeds can be divided into sub-sequences, which are the realizations of homogeneous speed processes. The generic $j$-th random process, consisting of $n$ speed values, can be characterized by specific values of $C_j$ and $\sigma_{a,j}^2$, by a certain hourly flow rate $Q_j$, and by a density that we can approximate with $k_j = Q_j/\bar{v}_j$, regarding the instants of sub-period $\tau_j$ that sees the passage of the $n$ vehicles.
Following [12,20], in practical applications, it is possible to identify sub-sequences with a fixed number of transits $n$ (e.g., 50 vehicles each) and with a variable duration $\tau_j$. For each sub-sequence of $n$ vehicles, it is possible to determine the corresponding hourly flow rate ($Q_j$) and the average speed level ($\bar{v}_j$) and to approximate the average density ($k_j = Q_j/\bar{v}_j$). As indicated above, for each sub-sequence, the values of $C_j$ and $\sigma_{a,j}^2$ can be estimated considering the first-order moving average (MA (1)) process of the deviations of the vehicle speeds $z_t = v_t - v_{t-1}$. After verifying the adequacy of the MA (1) model according to specific hypothesis tests, for each interval $\tau_j$ that subdivides $T$, the estimated models can be used to simulate a large number $N$ of speed sub-sequences that are homogeneous with respect to the real one. Based on the estimates obtained for the parameters of each process, we can simulate the speeds of the vehicles in transit in a particular test interval $\tau^*$, e.g., the 5 min interval following $\tau_j$. Thus, if $Q_j$ is constant even in test interval $\tau^*$ and speeds continue to be generated according to the same random process, a large number $N$ of sequences of deviations $z_t$ in $\tau^*$ can be obtained through a Monte Carlo simulation. From the $i$-th sequence of deviations, we can generate the $i$-th speed sequence in $\tau^*$, assuming, for example, a starting value equal to the overall average ($\bar{v}_j$) of speeds in $\tau_j$. In this way, assuming that $v_1 = \bar{v}_j$ is the speed of the first vehicle of the $i$-th simulated sequence, the speed values of the other vehicles in the sequence can be generated accordingly as $v_2 = v_1 + z_2$ for the second vehicle, $v_3 = v_2 + z_3$ for the third vehicle, etc., until the end of period $\tau^*$.
Assuming that density is the MOE used to identify the onset of a crisis phenomenon, following the view proposed by the latest editions of the HCM, and supposing $Q_j$ to be constant in test interval $\tau^*$ after $\tau_j$, we can calculate, for each 50 veh sub-sequence and for each Monte Carlo iteration $i$, mean speed $\bar{v}_i^*$ at the end of $\tau^*$ and density $k_i^* = Q_j/\bar{v}_i^*$. Having identified a density threshold value $k_{lim}$ for the traffic crisis, we can calculate the number ($N_c$) of simulated speed sequences for which $k_i^* \geq k_{lim}$. Thus, ratio $N_c/N$ estimates the crisis probability, and its complement can be used to evaluate reliability for flow rate $Q_j$ recorded in $\tau_j$ during the next $\tau^*$-long interval.
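The Monte Carlo step for a single sub-sequence can be sketched as follows (illustrative Python with assumed parameter names; the paper's implementation uses Matlab). $N$ test paths are generated from the fitted process with the flow rate held constant, and reliability is the fraction of paths whose density stays below the threshold:

```python
import random

def reliability(Q, v_start, C, sigma_a, k_lim=28.0, n_veh=50, n_sims=200, seed=0):
    """Estimate R = 1 - N_c / N: the probability that, with flow rate Q (veh/h)
    held constant over the test interval, the simulated density Q / v_mean
    stays below k_lim (veh/km). Speed paths follow the fitted process
    v_t = m_t + a_t, with m_t = m_{t-1} + C * a_{t-1}."""
    rng = random.Random(seed)
    exceed = 0                                   # N_c: simulated crisis count
    for _ in range(n_sims):
        m, a_prev, total = v_start, 0.0, 0.0
        for _ in range(n_veh):
            m += C * a_prev
            a = rng.gauss(0.0, sigma_a)
            total += m + a
            a_prev = a
        v_mean = total / n_veh                   # mean simulated speed in the test interval
        if v_mean <= 0.0 or Q / v_mean >= k_lim:
            exceed += 1
    return 1.0 - exceed / n_sims
```

A moderate flow rate around a high, stable speed level yields a reliability close to one, while a high flow rate around a low speed level yields a value close to zero.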
4.2. Application of the Product Limit Method for the Probabilistic Analysis of Traffic Performance
The results obtained through the simulations of the speed processes according to the methodology proposed in Section 4.1 can be used to produce further assessments regarding the distribution of the flow values with respect to various density thresholds on the leftmost lane of the carriageway. A first analysis can be obtained by considering $k_{lim}$ as the limit density for reaching the capacity. For this purpose, we can use the Product Limit Method (PLM) [41], considering the sequence of flow rates ($Q_j$) during $T$ and, for each of them, in correspondence with a certain $\tau_j$ and a fixed following test interval $\tau^*$, the $N$ simulated values of density $k_i^*$. Using average speed level $\bar{v}_j$, we evaluate density $k_j$ for each term $Q_j$ of the flow rate sequence. If $k_j \geq k_{lim}$, then $Q_j$ is beyond the capacity; therefore, it is excluded from the subsequent analysis, since it does not contain information on the value of the same capacity. If $k_j < k_{lim}$, then the $N$ simulated values of $k_i^*$ take on importance in the evaluations. Thus, the entire set of $Q_j$ values that we find during $T$ with $k_j < k_{lim}$ is divided into two sub-sets:
{A} is the set of $Q_j$ values for which $k_i^* < k_{lim}$, indicating the density threshold not exceeded both in $\tau_j$ and in test time $\tau^*$ in Monte Carlo iteration $i$;
{B} is the set of $Q_j$ values for which $k_i^* \geq k_{lim}$, indicating the density threshold not exceeded in $\tau_j$ but exceeded in test time $\tau^*$ in Monte Carlo iteration $i$.
In this way, in the case of $0 < N_c < N$, based on the Monte Carlo simulations, the value of each $Q_j$ appears $N - N_c$ times in dataset {A} and $N_c$ times in dataset {B}.
The Product Limit Method that we propose in these analyses presents similarities with that indicated by Brilon (e.g., [27,28]), based on van Toorenburg's approach [42]. The difference is that in the application we discuss here, the breakdown is not evaluated in the different observation intervals $\tau_j$, but based on the probabilistic results of the $N$ simulations for the test interval $\tau^*$ that would hypothetically follow each $\tau_j$, under the conditions of flow constancy and speed process homogeneity.
The method is based on the estimate of the survival function of the hourly flow rate in consideration of the crisis limit imposed by threshold $k_{lim}$. The capacity distribution (i.e., the probability of density exceeding $k_{lim}$) can be written as
$F_c(q) = 1 - S_c(q)$,
where $S_c(q)$ is the survival function of the flow rate, and can be estimated with
$\hat{F}_c(q) = 1 - \prod_{j:\, Q_j \leq q,\; Q_j \in \{B\}} \frac{n_j - d_j}{n_j}$,
where $q$ is a certain value for the hourly flow rate between a minimum and a maximum considered for the evaluation; $Q_j$ is the hourly flow rate of generic interval $\tau_j$; $n_j$ is the number of times when $Q_i \geq Q_j$; $d_j$ is the number of times when $k_i^* \geq k_{lim}$ for intervals with flow rate $Q_j$; and finally, {B} is the set of $Q_j$ values for which $k_i^* \geq k_{lim}$ in the test interval.
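This estimator can be sketched as follows (our own illustrative Python; with the weighting described above, each $Q_j$ would be entered $N$ times in the overall sample and $N_c$ times in the breakdown subset):

```python
from collections import Counter

def plm_capacity_cdf(flows_B, flows_all):
    """Product Limit estimate of the capacity distribution F_c(q).
    flows_all: all flow rates observed with density below the threshold;
    flows_B: the subset followed by a (simulated) breakdown, set {B}.
    Returns the sorted (q, F_c(q)) steps of the estimated distribution."""
    surv, steps = 1.0, []
    for q, d in sorted(Counter(flows_B).items()):
        at_risk = sum(1 for x in flows_all if x >= q)  # n_j: intervals with flow >= q
        surv *= (at_risk - d) / at_risk                # survival update at this flow level
        steps.append((q, 1.0 - surv))
    return steps
```

Grouping tied flow values through the counter reproduces the $(n_j - d_j)/n_j$ factor of the estimator exactly when several breakdowns occur at the same flow rate.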
As known, the PLM does not require the assumption of a specific type of distribution function. However, it should be noted that the maximum value of the estimated distribution function reaches the unit value only if the maximum flow observed in the section belongs to set {B}. On the other hand, if the maximum observed flow does not belong to {B}, then the estimated distribution function $\hat{F}_c(q)$ stops at a value of less than 1; therefore, the complete trend cannot be estimated. To overcome this problem, it is necessary to hypothesize the mathematical form assumed by distribution function $F_c(q)$. As in [27], we can consider Weibull functions as follows:
$F_c(q) = 1 - e^{-(q/\beta)^{\alpha}}$,
where $\alpha$ and $\beta$ are distribution parameters that can be estimated with the maximum likelihood approach [27].
For $k_{lim}$, the threshold values can be assumed based on the analysis of the dispersion of the experimental points in the two-dimensional diagrams of the macroscopic traffic variables or through the calibration of the Fundamental Diagram according to a preselected mathematical formulation that allows the value of the critical density at capacity to be identified [45]. However, a conventional value can also be assumed, for example, using the density threshold that identifies the limit between LOS E and LOS F according to the latest editions of the HCM [11], i.e., considering $k_{lim} = 28$ vehicles/kilometer/lane.
The PLM analysis can be extended towards the probability of exceeding the limit densities for the Levels of Service that come before LOS F in the HCM density ranges. To this end, we can proceed by reiterating the PLM after choosing a new value for $k_{lim}$. To explore the probabilities of exceeding the flow rates for LOS A, B, C, and D (thus of having LOS B, C, D, and E), we can use the limit density values provided by the HCM, i.e., 7, 11, 16, and 22 vehicles/kilometer/lane. In this way, probabilistic charts for the LOS analysis based on the hourly flow can be created, as we see in the case study in Section 5 of this paper.
It should be noted that no considerations are made regarding the type of vehicles in transit. We only consider vehicles, which we can assume here belong to a single homogeneous class, i.e., passenger cars. If the flow consists of different vehicle classes, for example, passenger cars and freight vehicles, this classification should be taken into consideration. Even the reference densities of the HCM consider equivalent vehicle units, based on adequate homogenization coefficients [11]. These aspects, excluded at this stage of the study, will be the subject of future investigations and may produce an even more robust generalization of the procedure.
5. Application of the Simulation Model to a Real Case
In this section, for illustrative purposes, we propose applying the simulation model described in Section 4 to a case study on the Italian freeway network, a two-lane section of the A22 Brenner freeway between the Trento Sud and Rovereto Nord toll stations. The detection devices were placed at Km 156 on the southbound carriageway on a straight and flat segment, at a sufficient distance from the exit ramp at Rovereto Nord (about 2.5 km).
Figure 1 shows the location along the route of the A22 and the positioning of the counting devices. The data, received from the freeway concessionaire and relating to the transits between 1 June 2014 and 30 June 2014, were analyzed after being validated by the same concessionaire with the relative quality and measurement-consistency controls.
The data collected using the cross-sectional monitoring systems (inductive loop detectors) allowed us to qualify each transit with the following information: transit instant (Unix date format, in seconds from 1 January 1970); identification code of the detection apparatus; lane identification; space headway; time headway; compliant/wrong direction flag; speed in km/h; vehicle length in cm.
The database included 732,700 vehicle passages on the two lanes, 416,057 on the rightmost lane and 316,643 on the leftmost lane, with an average daily traffic of about 24,400 total daily vehicles. Individual transit data were aggregated over 5 min intervals, obtaining the flow rate in veh/h/lane and the harmonic mean of the speeds as an approximation of the space mean speed.
Figure 2 shows the scatter diagrams of the experimental points in the speed–flow plane in 5 min intervals, both as regards the carriageway as a whole (a) and with detail of the leftmost lane (b).
The analysis of the reliability using the method proposed in Section 4 can be based on observing the vehicle sequences on the leftmost lane of the freeway carriageway grouped into sub-sequences consisting of 50 vehicles each. In the case study, we proceeded by dividing the whole vehicle succession on the leftmost lane in the monitoring interval (i.e., 316,643 transits between 1 June 2014 and 30 June 2014) into sequences of 50 vehicles and calculating for each of them the corresponding hourly flow rate ($Q_j$), the harmonic mean speed ($\bar{v}_j$), and the mean density ($k_j = Q_j/\bar{v}_j$). However, the application, made here on historical data, can also be produced in near-real-time mode, based on the sequences of 50 vehicles found in the current state as time progresses.
These time series considered one term for each sub-sequence of 50 vehicles, placed at the instant at which the passage of the last vehicle of the sub-sequence took place, measured in seconds. The time origin was the instant of the first transit of the first sub-sequence (start of monitoring) on the leftmost lane.
For each sub-sequence of 50 vehicles, we can assume that a first-order moving average (MA (1)) process is the generating process for speed deviations $z_t = v_t - v_{t-1}$. Based on this assumption, discussed in the previous sections and detailed in Appendix A, we can estimate the $\theta$ and $\sigma_a^2$ parameters of the 6332 MA (1) models (i.e., one model for each sequence of 50 vehicles). It should be specified that, before estimating the MA (1) models, we verified the stationarity of the time series with the ADF (augmented Dickey–Fuller) unit root test. Based on the ADF test, all 6332 series of $z_t$ confirmed their stationarity with 95% confidence.
The MA (1) models were estimated using the regARIMA function in Matlab 2020a with specification (0,0,1). For each of the 6332 series of $z_t$, the adequacy of the MA (1) model was evaluated using the Ljung–Box portmanteau test, verifying the null hypothesis that the residuals show no autocorrelation. In 92% of cases (5845 series out of 6332), there was insufficient evidence to reject the null hypothesis of no residual autocorrelation (20 lags). Thus, also in this case study, it was possible to confirm, as in the literature, the adequacy of the MA (1) model to represent the succession of speed deviations $z_t$.
Using each of the 6332 MA (1) models estimated from the sequences of 50 vehicles passing on the leftmost lane, we simulated 200 alternative realizations for each sequence. The simulations were obtained with the Monte Carlo method using the simulate function in Matlab 2020a, which allows one to simulate a sample path from an ARIMA model (MA (1) in this case). For each sub-sequence of 50 vehicles, which took place in an interval of variable duration $\tau_j$, the vehicle speed successions were simulated ($i = 1, 2, \ldots, 200$) for a test interval $\tau^*$ represented by the following 5 min. As mentioned, the hypothesis was to consider hourly flow rate $Q_j$, corresponding to the 50 passages in each $\tau_j$, to also be constant in test interval $\tau^*$ and the speeds to continue to be generated according to the same random process of the original sub-sequence.
Thus, for each sub-sequence, we simulated 200 sequences of deviations in the test interval, and starting from them, we generated the corresponding speed sequences, assuming a starting value equal to the mean speed of each sub-sequence. In this way, for each sequence and each simulation, we obtained the successive simulated vehicle speeds, up to the last simulated vehicle. Based on the simulated speeds, for each of the 6332 sub-sequences and each of the 200 alternative realizations, we calculated the average speed at the end of the test interval and the corresponding density.
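The simulation step can be sketched as follows. This is a minimal Python stand-in for the Matlab simulate workflow described above; the flow rate, mean speed, and MA(1) parameters are illustrative values, not the paper's estimates.

```python
import random

def simulate_ma1_speeds(mean_speed, theta, sigma, n_vehicles, rng):
    """Simulate one realization of an MA(1) deviation process and
    return the implied speed sequence around mean_speed (km/h)."""
    eps_prev = rng.gauss(0, sigma)
    speeds = []
    for _ in range(n_vehicles):
        eps = rng.gauss(0, sigma)
        speeds.append(mean_speed + eps + theta * eps_prev)
        eps_prev = eps
    return speeds

# Illustrative sub-sequence parameters (NOT from the paper's dataset):
# 50 passages in 5 min correspond to an hourly flow rate of 600 veh/h
rng = random.Random(7)
q = 600.0            # hourly flow rate, vehicles/hour/lane
theta, sigma = 0.4, 6.0
mean_speed = 110.0   # km/h

densities = []
for _ in range(200):                 # 200 Monte Carlo realizations
    v = simulate_ma1_speeds(mean_speed, theta, sigma, 50, rng)
    v_avg = sum(v) / len(v)          # average speed in the test interval
    densities.append(q / v_avg)      # fundamental relation k = q / v
```

A more careful implementation would use the harmonic (space-mean) speed rather than the arithmetic mean when converting to density; the arithmetic mean is kept here only for brevity.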
Figure 3 and Figure 4 show two heatmap graphs of the average speed and density values obtained in the simulation with 200 iterations (x-axis) for the 6332 sequences of 50 vehicles (y-axis).
Using the simulated densities, it is possible to identify the number of sequences, out of the 200 simulated for each sub-sequence, for which the density exceeds a given threshold. This value can be used to estimate the reliability of the flow rate observed in the sub-sequence during the subsequent test interval. As the threshold, we assumed the conventional density value that identifies the entrance to LOS F according to the latest editions of the HCM [11], i.e., 28 vehicles/kilometer/lane.
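Given the simulated densities for one sub-sequence, the reliability estimate reduces to a relative frequency. A minimal sketch (the function name and toy data are hypothetical; 28 veh/km/lane is the LOS F threshold used above):

```python
def reliability(simulated_densities, k_threshold=28.0):
    """Estimated probability that density stays below the threshold
    (entrance to LOS F, 28 veh/km/lane per the HCM) in the test interval."""
    n_ok = sum(1 for k in simulated_densities if k < k_threshold)
    return n_ok / len(simulated_densities)

# Toy example: 200 simulated densities, 30 of which exceed the threshold
densities = [20.0] * 170 + [30.0] * 30
print(reliability(densities))   # -> 0.85
```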
Figure 5 shows the trend of the reliability over the entire duration of the time series, while Figure 6 shows a zoom of a time window in which the reliability exhibits substantial reductions.
Figure 7 shows the trend of the flow rate for the whole monitoring period, with the superimposition (red dots) of the values for which the reliability resulted to be less than 80%, i.e., with a probability greater than 20% of exceeding the conventional density at capacity per lane identified by the HCM. Figure 8 shows the scatter diagram of the flow rate values and of the respective reliability values during the monitoring period.
As mentioned in Section 4.2, the results obtained through the simulations of the speed processes can be used to produce further evaluations concerning the distribution of the flow rates for various density thresholds. A first analysis can be obtained by considering the density at capacity as the limit density for reaching lane capacity and using the PLM [41]. The PLM can also be applied by extending the analysis to investigate the probability that the limits of LOS C, D, and E are exceeded, using the corresponding limit density values provided by the HCM. The dotted curves in
Figure 9 show the trend of the reliability functions obtained for the different density thresholds, which correspond to the probabilities of exceeding each threshold as a function of the flow rate, obtained considering the 200 simulations of the 6332 sub-sequences of 50 vehicles. The continuous curves in Figure 9 show the reliability trend fitted using Weibull functions. The interpolating functions showed an excellent fit, as shown by the graphical comparison with the simulated data points, with high values of R².
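The paper does not detail its fitting procedure, so the following is only one common way to fit a Weibull reliability function: linearize the survival form and apply ordinary least squares. The synthetic points below come from a known Weibull, not from the paper's data.

```python
import math

def fit_weibull_reliability(points):
    """Fit R(q) = exp(-(q/lam)**beta) by linearising:
    ln(-ln R) = beta*ln q - beta*ln lam, then ordinary least squares."""
    xs = [math.log(q) for q, r in points]
    ys = [math.log(-math.log(r)) for q, r in points]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    beta = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)
    lam = math.exp(mx - my / beta)    # intercept = -beta*ln(lam)
    return beta, lam

# Synthetic reliability points from a known Weibull (beta=8, lam=2000 veh/h)
pts = [(q, math.exp(-(q / 2000.0) ** 8)) for q in range(1200, 2200, 100)]
beta, lam = fit_weibull_reliability(pts)
```

On noisy empirical reliability points, a weighted or nonlinear least-squares fit would typically be preferred over this log-log regression.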
Figure 10 shows the trend of the reliability function of the hourly flow rates for exceeding the density at capacity, together with the Weibull function estimated according to [27]. The latter is represented by points (green dots), obtained using the same dataset and considering, as the breakdown threshold, a speed value of 80 km/h and fixed 5 min intervals for data aggregation. Table 1 shows the two estimated functions and their parameters, which appear essentially superimposable, as shown in Figure 10.
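For reference, the PLM used above for the capacity distribution [41] follows the Kaplan–Meier product-limit idea: intervals that end in a breakdown contribute a survival factor, while censored intervals (no breakdown) only enlarge the risk set. The sketch below is a simplified formulation with hypothetical data; real implementations must also handle tied flow values and the exact definition of the pre-breakdown flow.

```python
def plm_capacity_distribution(observations):
    """Product Limit Method sketch for the capacity distribution F(q).
    observations: (flow, is_breakdown) pairs, one per time interval."""
    obs = sorted(observations)          # ascending flow; censored first on ties
    n = len(obs)
    survival = 1.0
    F = []                              # (flow, F(flow)) at breakdown flows
    for i, (q, is_breakdown) in enumerate(obs):
        if is_breakdown:
            at_risk = n - i             # intervals with flow >= q
            survival *= (at_risk - 1) / at_risk
            F.append((q, 1.0 - survival))
    return F

# Toy data: flows in veh/h with breakdown flags (hypothetical)
data = [(1500, False), (1700, True), (1800, False), (1900, True), (2000, True)]
dist = plm_capacity_distribution(data)
```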
Based on the reliability functions estimated with the density thresholds corresponding to the limits between LOS B and C, LOS C and D, LOS D and E, and LOS E and F, Figure 11 shows the probability curves of exceeding the density limit of each LOS.
These curves can represent a helpful tool for the probabilistic analysis of the performance of the leftmost lane and of the entire freeway section. Figure 11 shows, as an example, the case of a flow rate equal to 1500 vehicles/hour/lane. From the intersection with the different probability curves, it is possible to identify the probability of exceeding each density limit. In the case study, based on the data collected for the entire monitoring period and on the simulations of the speed processes, we can state that a flow rate of 1500 vehicles/hour in the leftmost lane exceeds the LOS B limit in this lane with a probability of 99.5%, the LOS C limit with 33.2%, the LOS D limit with 10.3%, and the LOS E limit with 7.1%. Using the same curves, we can say that a flow rate of 1500 vehicles/hour/lane generates LOS A or B with 0.5% probability, LOS C with 66.3% probability, LOS D with 22.9% probability, LOS E with 3.2% probability, and LOS F with 7.1% probability. For this kind of analysis, we can use the probabilistic distributions of the LOSs for each value of the hourly flow rate shown in Figure 12.
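The conversion from exceedance probabilities at the LOS boundaries to a distribution over LOS classes is a simple differencing, which can be made explicit as follows (the function name is hypothetical; the input values are the ones read from Figure 11 for 1500 vehicles/hour/lane):

```python
def los_distribution(p_exceed):
    """Convert exceedance probabilities at the LOS boundary densities
    into a probability distribution over the LOS classes.
    p_exceed maps boundary labels 'B/C','C/D','D/E','E/F' to P(k > limit)."""
    pb, pc, pd, pe = (p_exceed[k] for k in ('B/C', 'C/D', 'D/E', 'E/F'))
    return {
        'A-B': 1.0 - pb,   # density never exceeds the B/C limit
        'C':   pb - pc,    # exceeds B/C but not C/D, and so on
        'D':   pc - pd,
        'E':   pd - pe,
        'F':   pe,
    }

# Exceedance probabilities for q = 1500 vehicles/hour/lane (from Figure 11)
dist = los_distribution({'B/C': 0.995, 'C/D': 0.332, 'D/E': 0.103, 'E/F': 0.071})
```

This reproduces the LOS distribution quoted in the text (0.5% for A-B, 66.3% for C, 22.9% for D, 3.2% for E, and 7.1% for F).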
6. Conclusions
The reliability approach to circulation quality characterizes the most recent research, which resorts to a probabilistic description of traffic phenomena and of the effects of the interactions between vehicles on the quality of traffic. In this context, the paper retraces the probabilistic point of view on traffic circulation quality, which concerns the analysis of breakdown phenomena and capacity on the one hand and the analysis of reliability based on the random processes of vehicle speeds on the other. In particular, the paper explores the main aspects of the latter approach, addressing and deepening the literature on the description of the speed random processes in the leftmost lane of a motorway carriageway, in which congestion phenomena occur first.
According to the reviewed literature, the study of the reliability of the traffic in the leftmost lane of a freeway carriageway involves the analysis of the speed random process according to an ARIMA (0,1,1) model. The literature provides regression equations for the reliability function and the capacity distribution. However, these regressions were obtained by simulating random processes under specific boundary conditions. For this reason, the paper highlights that the regression equations for traffic reliability and lane capacity are not directly generalizable and that the simulation and regression process must be repeated to particularize them to each specific case.
For this purpose, the work outlines a general procedure based on the estimation and simulation of ARIMA models for the speed random processes in a freeway section, so as to assess the traffic reliability function using historical data or a near-real-time information flow obtained from speed monitoring devices. The paper introduces a further novelty into this reliability analysis, proposing a method for studying the distribution of the flow rate with respect to various density thresholds on the leftmost lane of the carriageway. For these analyses, the study employs the PLM, which is widely used in the breakdown and capacity analysis approach. This choice reflects the purpose of this research, which was to enclose in a single operative framework the probabilistic approaches to the quality of circulation (i.e., the random speed processes and the probabilistic analysis of breakdown phenomena and capacity in the critical lane) and to connect them with the critical elements of LOS analyses according to the HCM.
The procedure outlined in the paper starts from analyzing and simulating the speed processes of vehicle sequences to evaluate the traffic reliability, i.e., the probability of not exceeding a critical density threshold in the leftmost lane in a 5 min interval with a constant flow rate. It then obtains the probabilistic distribution of the corresponding flow rate values through the application of the PLM. This distribution represents the probabilistic distribution of the capacity if we take the critical density at capacity as the threshold, e.g., by setting the value indicated by the HCM as the limit for LOS E. Further distributions can be characterized by varying the threshold value, considering the density values that separate the Levels of Service, from A/B to E/F, according to the HCM.
Finally, we would like to underline a current limitation of this procedure: it considers only one vehicle class. In illustrating the procedure, we considered vehicles that can be assumed to belong to a single homogeneous class, i.e., passenger cars. This assumption holds in the case study concerning the A22 del Brennero, where heavy vehicles cannot use the leftmost lane due to the overtaking ban on the infrastructure. In the most general case, vehicle classification should be considered if the flow consists of different vehicle classes, for example, passenger cars and freight vehicles. The reference densities of the HCM themselves consider equivalent passenger car units based on adequate homogenization coefficients. These aspects, which the study does not address at this stage, will be the subject of future investigations and may produce an even more robust generalization of the procedure.