Article

Directional Dependence Orders of Random Vectors

by Enrique de Amo 1, María del Rosario Rodríguez-Griñolo 2 and Manuel Úbeda-Flores 1,*
1 Department of Mathematics, University of Almería, 04120 Almería, Spain
2 Department of Economics, Quantitative Methods and Economic History, Pablo de Olavide University, 41013 Seville, Spain
* Author to whom correspondence should be addressed.
Mathematics 2024, 12(3), 419; https://doi.org/10.3390/math12030419
Submission received: 30 November 2023 / Revised: 23 January 2024 / Accepted: 24 January 2024 / Published: 27 January 2024
(This article belongs to the Special Issue Dependence Modeling with Copulas and Their Applications)

Abstract: In this paper, we define a multivariate order based on the concept of orthant directional dependence and study some of its properties. The relationships with other dependence orders given in the literature are also studied. We analyze the order between two random vectors in terms of their associated copulas and illustrate our results with several examples.

1. Introduction

There are various approaches to examining how random variables relate in terms of dependence. Jogdeo [1] highlights that this area stands as one of the most extensively researched subjects within probability and statistics. A multivariate model should be analyzed for the type of dependence structure that it covers, so one can know whether a particular model might be usable for a given application or dataset. Among the types of dependence studied in the literature, we focus on negative or positive dependence. A positive dependence notion is any criterion which can mathematically describe the tendency of the components of an n-variate random vector to assume concordant values [2]. In this work, we make no attempt to be exhaustive in covering all the dependence concepts studied in the literature; we restrict our attention to some dependence structures.
Let (Ω, 𝓕, P) be a probability space, where Ω is a non-empty set, 𝓕 is a σ-algebra of subsets of Ω, and P is a probability measure on 𝓕. Let n be a natural number such that n ≥ 2, and let X = (X_1, X_2, …, X_n) be a random vector from Ω to R̄^n = [−∞, +∞]^n with distribution function F and survival function F̄, where F̄(x_1, x_2, …, x_n) = P[X_1 > x_1, X_2 > x_2, …, X_n > x_n] for all x_i ∈ R̄. It is said that X (or F) is positive upper orthant-dependent (PUOD) if
F̄(x_1, x_2, …, x_n) ≥ ∏_{i=1}^n P[X_i > x_i] for all x_i ∈ R̄, (1)
and X (or F) is positive lower orthant-dependent (PLOD) if
F(x_1, x_2, …, x_n) ≥ ∏_{i=1}^n P[X_i ≤ x_i] for all x_i ∈ R̄ (2)
(see, e.g., [3]). If both (1) and (2) hold, then X or F is said to be positive orthant-dependent (POD). Note that, for the bivariate case, (1) and (2) are equivalent (this is not the case in higher dimensions: see [4] (Example 5.26)), and in this case the dependence property is called positive quadrant dependence (PQD). If the inequalities (1) and (2) are reversed, then it is said that X is negative upper orthant-dependent (NUOD) and negative lower orthant-dependent (NLOD), respectively. For more details on these and other dependence concepts, see, e.g., [2,3,5,6,7,8].
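Both inequalities are easy to verify numerically in a simple case. The following sketch (our own illustration, not from the paper) checks (1) and (2) for a comonotone pair (U, U) with uniform marginals against the independent benchmark:

```python
# Comonotone pair (U, U), U uniform on (0, 1): P[U > x, U > y] = 1 - max(x, y)
# and P[U <= x, U <= y] = min(x, y); both orthant inequalities hold against
# the independent benchmark with the same uniform marginals.
xs = [i / 20 for i in range(1, 20)]
for x in xs:
    for y in xs:
        assert 1 - max(x, y) >= (1 - x) * (1 - y) - 1e-12   # PUOD inequality (1)
        assert min(x, y) >= x * y - 1e-12                   # PLOD inequality (2)
```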
In [9], the orthant dependence according to a direction is defined as follows: let α = (α_1, α_2, …, α_n) ∈ R^n be such that |α_i| = 1 for all i = 1, 2, …, n. X or F is said to be orthant positive- (respectively, negative-) dependent according to the direction α, written PD(α) (respectively, ND(α)), if
P[∩_{i=1}^n (α_i X_i > x_i)] ≥ ∏_{i=1}^n P[α_i X_i > x_i] for all x_i ∈ R̄ (3)
(respectively, we reverse the inequality in (3)). Directional coefficients which detect dependence in multivariate distributions are studied in [10].
If X is both PD(α) (respectively, ND(α)) and PD(−α) (respectively, ND(−α)), then X is said to be strongly positively (respectively, negatively) dependent according to the direction α, written SPD(α) (respectively, SND(α)).
Note that, for α = 1 = (1, 1, …, 1), the concepts of PD(α) (respectively, ND(α)) and PUOD (respectively, NUOD) are the same; and for α = −1 = (−1, −1, …, −1), the concepts of PD(α) (respectively, ND(α)) and PLOD (respectively, NLOD) are the same.
In the following, we restrict our study to the positive concepts PLOD, PUOD, PD(α), and SPD(α). Similar results can be obtained for the respective negative concepts.
The positive dependence concepts defined above result from comparing a multivariate vector with a random vector of independent random variables with the same corresponding univariate margins. Of course, comparisons can also be made via dependence orderings. Several dependence (partial) orderings, which compare the amount of dependence in two different random vectors of the same length and with the same marginal distributions, have been studied (see, e.g., [3,5,11,12,13]). In particular, in a parametric family of multivariate distributions, the parameter is interpretable as a dependence parameter if the amount of dependence is increasing, or decreasing, as the parameter increases. This motivates the interest in comparing two multivariate distributions in the sense of some dependence concept.
The next definition recalls some dependence orderings, where Γ_n(F_1, F_2, …, F_n), n ≥ 2, denotes the class of all the n-dimensional distributions with univariate marginals F_1, F_2, …, F_n (that is, the Fréchet class), and a function φ: R^n → R is supermodular if it satisfies
φ(x) + φ(y) ≤ φ(x ∧ y) + φ(x ∨ y)
for any x, y ∈ R^n, where x ∧ y = (x_1 ∧ y_1, x_2 ∧ y_2, …, x_n ∧ y_n) and x ∨ y = (x_1 ∨ y_1, x_2 ∨ y_2, …, x_n ∨ y_n), with ∧ and ∨ denoting the minimum and maximum operators, respectively.
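Supermodularity is easy to test numerically; the sketch below (the choice of test function φ(x_1, x_2) = x_1x_2 and the helper names are ours) checks the defining inequality at randomly drawn pairs of points:

```python
import random

# phi(x1, x2) = x1 * x2 has nonnegative cross difference, hence is supermodular
# on R^2; we test the defining inequality at randomly drawn pairs of points.
random.seed(0)

def phi(x1, x2):
    return x1 * x2

for _ in range(1000):
    x = (random.uniform(-5, 5), random.uniform(-5, 5))
    y = (random.uniform(-5, 5), random.uniform(-5, 5))
    lo = (min(x[0], y[0]), min(x[1], y[1]))   # x ∧ y, componentwise minimum
    hi = (max(x[0], y[0]), max(x[1], y[1]))   # x ∨ y, componentwise maximum
    assert phi(*x) + phi(*y) <= phi(*lo) + phi(*hi) + 1e-9
```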
Definition 1. 
Let X and Y be two random vectors with respective distribution functions F and G in Γ_n(F_1, F_2, …, F_n), n ≥ 2, and survival functions F̄ and Ḡ. It is said that:
(i) 
X is smaller than Y in the positive upper orthant dependence order (denoted by X ⪯_PUOD Y) if F̄(x) ≤ Ḡ(x) for all x ∈ R^n.
(ii) 
X is smaller than Y in the positive lower orthant dependence order (denoted by X ⪯_PLOD Y) if F(x) ≤ G(x) for all x ∈ R^n.
(iii) 
X is smaller than Y in the positive orthant dependence order (denoted by X ⪯_POD Y) if F̄(x) ≤ Ḡ(x) and F(x) ≤ G(x) for all x ∈ R^n.
(iv) 
X is smaller than Y in the supermodular order (denoted by X ⪯_sm Y) if E[φ(X)] ≤ E[φ(Y)] for any supermodular function φ: R^n → R, provided the expectations exist.
Note that, for the bivariate case, the ⪯_POD order in Definition 1(iii) can be said to be the ⪯_PQD order, and from (i), (ii) and (iii), we have
X ⪯_POD Y ⟺ X ⪯_PUOD Y and X ⪯_PLOD Y. (4)
Moreover,
X ⪯_sm Y ⟹ X ⪯_POD Y (5)
(see [13] (Equation (9.A.17))), and, for n = 2, the orders ⪯_sm and ⪯_PQD are equivalent. For further details about these and other dependence orders, see, e.g., [2,3,5,13,14,15] and the references therein.
Joe [5] (p. 39) postulated a number of desirable axioms that a multivariate positive dependence order should satisfy. Later, Colangelo et al. [2] gave the following slight variation of these postulates:
P1.
It should be a pre-order (reflexive and transitive).
P2.
It should be antisymmetric.
P3.
It should imply the PQD order of every (corresponding) bivariate marginal distribution.
P4.
It should be closed under marginalization.
P5.
It should be closed under limits in distribution.
P6.
It should be closed under the permutation of the components.
P7.
It should be closed under component-wise strictly increasing transformation.
P8.
It should be maximal at the upper Fréchet bound; and, in the bivariate case, it should be minimal at the lower Fréchet bound.
Our main goal in this work is to study, in any dimension greater than or equal to 2, the presence of orders that are not “detected” by the well-known PLOD and PUOD orders; for this, we use the notion of orthant dependence according to the direction of a vector.
This paper is structured as follows. In Section 2, we define a new order based on the concept of positive dependence given in [9] and study some of its properties. Also, the relationships with other dependence orders given in the literature are studied. In Section 3, we study the comparison of two random vectors in terms of their associated copulas and provide several examples.

2. New Definitions and Basic Properties

Based on the PD( α ) notion given in (3), and with the aim of comparing the strength of the positive dependence in a particular direction of two underlying multivariate distributions, we provide the following dependence orderings.
Definition 2 (The PD( α ) order).
Let X = (X_1, X_2, …, X_n) and Y = (Y_1, Y_2, …, Y_n) be two random vectors with respective distribution functions F and G and survival functions F̄ and Ḡ. Let α = (α_1, α_2, …, α_n) ∈ R^n be such that |α_i| = 1 for all i = 1, 2, …, n. X is said to be smaller than Y in the positive dependence according to the direction α order, denoted by X ⪯_PD(α) Y, if, for every x = (x_1, x_2, …, x_n) ∈ R̄^n, we have
P[∩_{i=1}^n (α_i X_i > x_i)] ≤ P[∩_{i=1}^n (α_i Y_i > x_i)]. (6)
Note that, from Equations (3) and (6), X is PD(α) if, and only if, X_Ind ⪯_PD(α) X, where X_Ind is a random vector with the same univariate marginals as X but with independent components. Also observe that X ⪯_PD(α) Y is equivalent to αX ⪯_PUOD αY or, equivalently, −αX ⪯_PLOD −αY, where αX = (α_1X_1, …, α_nX_n) and similarly for αY.
From the SPD(α) notion, a stronger order than the PD(α) order can be defined as follows.
Definition 3 (The SPD( α ) order).
Let X = (X_1, X_2, …, X_n) and Y = (Y_1, Y_2, …, Y_n) be two random vectors with respective distribution functions F and G and survival functions F̄ and Ḡ. Let α = (α_1, α_2, …, α_n) ∈ R^n be such that |α_i| = 1 for all i = 1, 2, …, n. X is said to be smaller than Y in the strongly positive dependence according to the direction α order, denoted by X ⪯_SPD(α) Y, if X ⪯_PD(α) Y and X ⪯_PD(−α) Y.
Based on [2] (Example 2.15), the next example shows two vectors which are ordered in the sense of the PD( α ) order but not in the SPD( α ) one.
Example 1. 
Let X_1, X_2, X_3 be three independent and identically distributed Bernoulli random variables with a common parameter 0.7, and let Y = (Y_1, Y_2, Y_3) be a random vector such that P[Y_1 = 0, Y_2 = 0, Y_3 = 1] = 0.2, P[Y_1 = 1, Y_2 = 1, Y_3 = 1] = 0.5 and P[Y_1 = 0, Y_2 = 1, Y_3 = 0] = P[Y_1 = 1, Y_2 = 0, Y_3 = 0] = P[Y_1 = 1, Y_2 = 1, Y_3 = 0] = 0.1. Note that Y_1, Y_2 and Y_3 are Bernoulli distributed random variables with parameter 0.7. After some straightforward calculations, it can be proven that X ⪯_PD(1,1,1) Y, given that P[X_1 > x_1, X_2 > x_2, X_3 > x_3] ≤ P[Y_1 > x_1, Y_2 > x_2, Y_3 > x_3] for all (x_1, x_2, x_3). However, P[X_1 ≤ 0, X_2 ≤ 0, X_3 ≤ 0] = 0.3³ > 0 = P[Y_1 ≤ 0, Y_2 ≤ 0, Y_3 ≤ 0], and thus X ⪯_PD(−1,−1,−1) Y does not hold. Therefore, X ⪯_SPD(1,1,1) Y does not hold.
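The "straightforward calculations" of Example 1 can be reproduced by brute-force enumeration; the following sketch (the p.m.f. dictionaries and helper names are our own, not from the paper) verifies both claims on a threshold grid separating the atoms of ±X_i:

```python
from itertools import product

# Example 1 by brute force: pX is the joint p.m.f. of three independent
# Bernoulli(0.7) components; pY is the joint p.m.f. of Y given above.
pX = {(a, b, c): (0.7 if a else 0.3) * (0.7 if b else 0.3) * (0.7 if c else 0.3)
      for a, b, c in product((0, 1), repeat=3)}
pY = {(0, 0, 1): 0.2, (1, 1, 1): 0.5, (0, 1, 0): 0.1,
      (1, 0, 0): 0.1, (1, 1, 0): 0.1}

def orthant_prob(pmf, alpha, x):
    """P[alpha_i * V_i > x_i for all i] for a discrete p.m.f."""
    return sum(p for v, p in pmf.items()
               if all(a * vi > xi for a, vi, xi in zip(alpha, v, x)))

# Thresholds -1.5, -0.5, 0.5 separate the atoms of alpha_i * V_i in {-1, 0, 1}.
grid = list(product((-1.5, -0.5, 0.5), repeat=3))

# X is smaller than Y in the PD(1,1,1) order ...
assert all(orthant_prob(pX, (1, 1, 1), x) <= orthant_prob(pY, (1, 1, 1), x) + 1e-12
           for x in grid)
# ... but not in the PD(-1,-1,-1) order: 0.3**3 > 0.
assert orthant_prob(pX, (-1, -1, -1), (-0.5,) * 3) > \
       orthant_prob(pY, (-1, -1, -1), (-0.5,) * 3)
```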
Some closure properties of the PD(α) order are described in the next theorem.
Theorem 1. 
Let X = (X_1, X_2, …, X_n) and Y = (Y_1, Y_2, …, Y_n) be two random vectors in the same Fréchet class.
(a)
If X ⪯_PD(α) Y, then X_I ⪯_PD(α′) Y_I for each I ⊆ {1, 2, …, n}, where X_I and Y_I are the corresponding subvectors and α′ is the subvector of α whose components are in I. In other words, the PD(α) order is closed under marginalization.
(b)
If X ⪯_PD(α) Y, then
(g_1(X_1), g_2(X_2), …, g_n(X_n)) ⪯_PD(α) (g_1(Y_1), g_2(Y_2), …, g_n(Y_n)),
whenever g_i, i = 1, 2, …, n, are n real-valued and increasing functions.
(c)
If X ⪯_PD(α) Y and U ⪯_PD(β) V, with X and Y independent of U and V, respectively, then (X, U) ⪯_PD(α,β) (Y, V).
Proof. 
Let J = {1, 2, …, n} and I ⊆ J.
Firstly, since X ⪯_PD(α) Y, for any (x_1, x_2, …, x_n) ∈ R̄^n, it follows that
P[∩_{i∈I} (α_i X_i > x_i)] = P[∩_{i∈I} (α_i X_i > x_i), ∩_{i∈J∖I} (α_i X_i > y_i)] ≤ P[∩_{i∈I} (α_i Y_i > x_i), ∩_{i∈J∖I} (α_i Y_i > y_i)] = P[∩_{i∈I} (α_i Y_i > x_i)],
where y_i is the left endpoint of the support of α_i X_i for every i ∈ J∖I, whence part (a) is proven.
By considering I = {i ∈ J : α_i > 0}, part (b) follows from the following:
P[∩_{i=1}^n (α_i g_i(X_i) > x_i)] = P[∩_{i∈I} (g_i(X_i) > x_i), ∩_{i∈J∖I} (g_i(X_i) < −x_i)] = P[∩_{i∈I} (X_i > g_i^{-1}(x_i)), ∩_{i∈J∖I} (X_i < g_i^{-1}(−x_i))] ≤ P[∩_{i∈I} (Y_i > g_i^{-1}(x_i)), ∩_{i∈J∖I} (Y_i < g_i^{-1}(−x_i))] = P[∩_{i=1}^n (α_i g_i(Y_i) > x_i)].
Finally, for part (c), let U and V be two random vectors of dimension m. It follows that
P[∩_{i=1}^n (α_i X_i > x_i), ∩_{i=1}^m (β_i U_i > y_i)] = P[∩_{i=1}^n (α_i X_i > x_i)] · P[∩_{i=1}^m (β_i U_i > y_i)] ≤ P[∩_{i=1}^n (α_i Y_i > x_i)] · P[∩_{i=1}^m (β_i V_i > y_i)] = P[∩_{i=1}^n (α_i Y_i > x_i), ∩_{i=1}^m (β_i V_i > y_i)],
completing the proof. □
Note that the properties in Theorem 1 are some of the desirable postulates that a multivariate positive dependence order should satisfy (specifically, P4 and P7). Moreover, the ⪯_PD(α) order is also reflexive, transitive, and antisymmetric.
The following example shows that the ⪯_PD(α) order does not imply the PQD order of every (corresponding) bivariate marginal; that is, postulate P3 is not satisfied.
Example 2. 
Let X_1, X_2, X_3 be three independent Bernoulli random variables with respective parameters 0.5, 0.9, and 0.8, and let Y = (Y_1, Y_2, Y_3) be a random vector such that P[Y_1 = 0, Y_2 = 0, Y_3 = 1] = 0.1, P[Y_1 = 1, Y_2 = 1, Y_3 = 1] = 0.5 and P[Y_1 = 0, Y_2 = 1, Y_3 = 0] = P[Y_1 = 0, Y_2 = 1, Y_3 = 1] = 0.2. Note that Y_1, Y_2 and Y_3 are Bernoulli distributed random variables with parameters 0.5, 0.9, and 0.8, respectively. After some straightforward calculations, it can be proven that Y ⪯_PD(−1,1,1) X, given that P[Y_1 ≤ x_1, Y_2 > x_2, Y_3 > x_3] ≤ P[X_1 ≤ x_1, X_2 > x_2, X_3 > x_3] for all (x_1, x_2, x_3). However, P[Y_1 > 0, Y_2 > 0] = 0.5 > 0.45 = P[X_1 > 0, X_2 > 0], and thus (Y_1, Y_2) ⪯_PQD (X_1, X_2) does not hold.
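Example 2 can be verified the same way; again, the p.m.f. dictionaries and helper below are our own illustration, not from the paper:

```python
from itertools import product

# Example 2 by brute force.
p = (0.5, 0.9, 0.8)   # Bernoulli parameters of X_1, X_2, X_3
pX = {(a, b, c): (p[0] if a else 1 - p[0]) * (p[1] if b else 1 - p[1])
      * (p[2] if c else 1 - p[2]) for a, b, c in product((0, 1), repeat=3)}
pY = {(0, 0, 1): 0.1, (1, 1, 1): 0.5, (0, 1, 0): 0.2, (0, 1, 1): 0.2}

def orthant_prob(pmf, alpha, x):
    """P[alpha_i * V_i > x_i for all i] for a discrete p.m.f."""
    return sum(q for v, q in pmf.items()
               if all(a * vi > xi for a, vi, xi in zip(alpha, v, x)))

# Y is smaller than X in the PD(-1,1,1) order on a separating threshold grid ...
grid = list(product((-1.5, -0.5, 0.5), repeat=3))
assert all(orthant_prob(pY, (-1, 1, 1), x) <= orthant_prob(pX, (-1, 1, 1), x) + 1e-12
           for x in grid)
# ... yet the PQD comparison of the (1,2)-marginals fails: 0.5 > 0.45.
assert orthant_prob(pY, (1, 1, 1), (0, 0, -1)) > orthant_prob(pX, (1, 1, 1), (0, 0, -1))
```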
Now, we prove that the PD(α) order is closed under weak convergence, where →_st denotes convergence in distribution.
Theorem 2. 
Let {X_j}_{j∈N} and {Y_j}_{j∈N} be two sequences of random vectors such that X_j →_st X and Y_j →_st Y. If X_j ⪯_PD(α) Y_j for all j ∈ N, then X ⪯_PD(α) Y.
Proof. 
If X_j ⪯_PD(α) Y_j for all j ∈ N, then αX_j ⪯_PUOD αY_j for all j ∈ N. Moreover, by using the continuous mapping theorem [16,17], it follows that αX_j →_st αX and αY_j →_st αY. Thus, given that the PUOD order is closed under weak convergence, we have αX ⪯_PUOD αY, whence X ⪯_PD(α) Y, which completes the proof. □
In addition, some results related to the Fréchet upper bound for the bi- and trivariate cases are given. Recall that the Fréchet upper bound F⁺ in the class Γ_n(F_1, F_2, …, F_n), n ≥ 2, is defined as F⁺(x_1, x_2, …, x_n) = min{F_1(x_1), F_2(x_2), …, F_n(x_n)}.
Proposition 1. 
Let X and X⁺ be two bivariate random vectors with respective distribution functions F and F⁺ in Γ_2(F_1, F_2), and let α = (α_1, α_2) be such that |α_i| = 1, i = 1, 2.
(i) 
If α_1 · α_2 = 1, then X ⪯_PD(α) X⁺.
(ii) 
If α_1 · α_2 = −1, then X⁺ ⪯_PD(α) X.
Proof. 
First, note that, for α = (1, 1) or α = (−1, −1), the PD(α) order is equivalent to the PQD order between random vectors, and it is well known (see [13], p. 390) that X ⪯_PQD X⁺. Thus, the result in (i) holds.
Now, consider α = (−1, 1). It follows that
P[X_1 ≤ x_1, X_2 > x_2] = F_1(x_1) − P[X_1 ≤ x_1, X_2 ≤ x_2] ≥ F_1(x_1) − P[X_1⁺ ≤ x_1, X_2⁺ ≤ x_2] = P[X_1⁺ ≤ x_1, X_2⁺ > x_2],
where the inequality follows from (i), that is, F(x_1, x_2) ≤ F⁺(x_1, x_2) for all (x_1, x_2). Thus, X⁺ ⪯_PD(−1,1) X. The proof for α = (1, −1) is obtained analogously, and therefore (ii) holds. □
The following example shows that, for the trivariate case, the results in Proposition 1 do not hold.
Example 3. 
Let Y = (Y_1, Y_2, Y_3) be a random vector defined as in Example 2, and let Y⁺ = (Y_1⁺, Y_2⁺, Y_3⁺) be a random vector with the same univariate marginals as Y and joint distribution function given by F_{Y⁺}(y_1, y_2, y_3) = min{F_1(y_1), F_2(y_2), F_3(y_3)}, that is, the trivariate upper Fréchet bound. It is easy to show that, for α = (−1, −1, 1), we have 0.1 = P[Y_1 ≤ 0.5, Y_2 ≤ 0.5, Y_3 > 0.5] > P[Y_1⁺ ≤ 0.5, Y_2⁺ ≤ 0.5, Y_3⁺ > 0.5] = 0, but 0.1 = P[Y_1 ≤ 0.5, Y_2 ≤ 1, Y_3 > 0.5] < P[Y_1⁺ ≤ 0.5, Y_2⁺ ≤ 1, Y_3⁺ > 0.5] = 0.3. Moreover, for α = (−1, −1, −1), we obtain 0 = P[Y_1 ≤ 0.5, Y_2 ≤ 0.5, Y_3 ≤ 0.5] < P[Y_1⁺ ≤ 0.5, Y_2⁺ ≤ 0.5, Y_3⁺ ≤ 0.5] = 0.1, but 0.2 = P[Y_1 ≤ 0.5, Y_2 ≤ 1, Y_3 ≤ 0.5] > P[Y_1⁺ ≤ 0.5, Y_2⁺ ≤ 1, Y_3⁺ ≤ 0.5] = 0.1.
To conclude this section, regarding the relationships with other stochastic orders, we summarize some straightforward results:
(a)
For n = 2, and α = (1, 1) or α = (−1, −1), then
X ⪯_PD(α) Y is equivalent to X ⪯_PQD Y.
(b)
If n > 2 and α = 1, then
X ⪯_PD(α) Y is equivalent to X ⪯_PUOD Y. (7)
(c)
If n > 2 and α = −1, then
X ⪯_PD(α) Y is equivalent to X ⪯_PLOD Y. (8)
(d)
If α = 1, from (4), (7) and (8),
X ⪯_SPD(α) Y if and only if X ⪯_POD Y.
(e)
From (5), if X ⪯_sm Y, then X ⪯_SPD(α) Y with α = 1.
(f)
In the general case, for α = (α_1, α_2, …, α_n) with |α_i| = 1, i = 1, 2, …, n, from (4), it follows that
X ⪯_SPD(α) Y ⟺ αX ⪯_POD αY and −αX ⪯_POD −αY.
(g)
Finally, X ⪯_PD(α) Y does not generally imply that αX ⪯_sm αY. For instance, for α = 1, this can be seen by using Example 1 and taking into account that X ⪯_sm Y implies P[X ≤ x] ≤ P[Y ≤ x] for all x.

3. Directional Dependence Orders and Copulas

Copulas are a very useful tool for studying the positive dependence properties of a random vector, since the copula contains the dependence properties of the corresponding multivariate distribution function independently of the marginal distributions, as well as for scale-free measures of dependence; they also represent a starting point for constructing families of distributions (see [18]). Our goal now is to study some of the orders given in the previous section in terms of copulas.
For n ≥ 2, an n-dimensional copula (n-copula, for short) is the restriction to [0, 1]^n of a continuous n-dimensional distribution function whose univariate margins are uniform on [0, 1]. The importance of copulas in statistics is described in the following result due to Abe Sklar [19]: let X be a random vector with joint distribution function F and one-dimensional marginal distributions F_1, F_2, …, F_n, respectively. Then, there exists an n-copula C (which is uniquely determined on ×_{i=1}^n Range F_i) such that
F(x) = C(F_1(x_1), F_2(x_2), …, F_n(x_n)) for all x ∈ R̄^n
(for a complete proof of this result, see [20]). Thus, copulas link joint distribution functions to their one-dimensional margins. For a survey on copulas, see [4,21]; some results about positive dependence properties and orderings by using copulas can be found, for instance, in [4,5,9,22,23,24,25].
Let X be a random vector with associated n-copula C, and let X_{ij} denote the pair of random variables given by components i and j of X. C_{ij} denotes the (i,j)-margin of C, i.e., C_{ij}(u_i, u_j) = C(1, …, 1, u_i, 1, …, 1, u_j, 1, …, 1), for every 1 ≤ i < j ≤ n, which is the 2-copula associated with the pair X_{ij}.
Archimedean copulas are an important class of copulas because they are easily constructed and possess many nice properties; there is a great variety of families of copulas in this class, and they have important applications in different areas. Let φ be a continuous and non-increasing function from [0, +∞] to [0, 1] such that φ(0) = 1 and φ(+∞) = 0, and let φ^{-1} be the inverse of φ. Then, the function given by
C_φ(u) = φ(∑_{i=1}^n φ^{-1}(u_i)), u ∈ [0, 1]^n,
is an n-copula if and only if (−1)^k φ^{(k)}(t) ≥ 0 for k = 0, 1, …, n − 2, where φ^{(k)} denotes the k-th derivative of φ, and (−1)^{n−2} φ^{(n−2)} is non-increasing and convex. In such a case, we say that C_φ is an Archimedean n-copula, and the function φ is called a generator of C_φ. For more details, see [4,26].
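As an illustration of this construction (our own sketch, with our parameter and helper names), the Clayton generator used later in Example 7 yields an Archimedean n-copula; the code builds the 3-dimensional case and checks two basic copula properties:

```python
import math

# Building an Archimedean n-copula from a generator: here phi is the Clayton
# generator phi(t) = (1 + gamma*t)**(-1/gamma), with phi_inv its inverse.
def clayton_ncopula(u, gamma=1.0):
    phi = lambda t: (1.0 + gamma * t) ** (-1.0 / gamma)
    phi_inv = lambda s: (s ** (-gamma) - 1.0) / gamma
    return phi(sum(phi_inv(ui) for ui in u))

# Sanity checks: uniform univariate margins and a Fréchet-type upper bound.
assert abs(clayton_ncopula((0.3, 1.0, 1.0)) - 0.3) < 1e-12
c = clayton_ncopula((0.4, 0.6, 0.7))
assert 0.0 < c < min(0.4, 0.6, 0.7)
```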
In this section, we deal with the study of n-copulas associated with random vectors which are ordered in the sense of the PD( α ) order.

3.1. The Bivariate Case

For simplicity, we start our study with the bivariate case.
Theorem 3. 
Let X = (X_1, X_2) and Y = (Y_1, Y_2) be two random vectors with respective associated 2-copulas C_X and C_Y. Let α = (α_1, α_2) be such that |α_i| = 1, i = 1, 2.
(i) 
If α_1 · α_2 = 1, then X ⪯_PD(α) Y if and only if C_X(u, v) ≤ C_Y(u, v) for all (u, v) ∈ [0, 1]².
(ii) 
If α_1 · α_2 = −1, then X ⪯_PD(α) Y if and only if C_Y(u, v) ≤ C_X(u, v) for all (u, v) ∈ [0, 1]².
Proof. 
Consider the random vectors X′ = (α_1 X_1, α_2 X_2) and Y′ = (α_1 Y_1, α_2 Y_2), with α = (α_1, α_2) and |α_i| = 1, i = 1, 2, and assume that X ⪯_PD(α) Y. Then, from Definition 2, it holds that
F̄_{X′}(x_1, x_2) ≤ Ḡ_{Y′}(x_1, x_2) (9)
for all (x_1, x_2) ∈ R², where F̄_{X′} and Ḡ_{Y′} are the respective survival functions of X′ and Y′. By using the relationship between the survival function of a random vector and its associated survival copula (see [4], p. 32), (9) is equivalent to
Ĉ_{X′}(F̄_{α_1X_1}(x_1), F̄_{α_2X_2}(x_2)) ≤ Ĉ_{Y′}(F̄_{α_1X_1}(x_1), F̄_{α_2X_2}(x_2)), (10)
where Ĉ_{X′} and Ĉ_{Y′} are the respective survival copulas associated with X′ and Y′. Moreover, given that X and Y are in the same Fréchet class, and by considering the relationship between the copula and the corresponding survival copula, (10) is equivalent to
C_{X′}(F_{α_1X_1}(x_1), F_{α_2X_2}(x_2)) ≤ C_{Y′}(F_{α_1X_1}(x_1), F_{α_2X_2}(x_2)). (11)
Therefore, X ⪯_PD(α) Y is equivalent to C_{X′}(u, v) ≤ C_{Y′}(u, v) for all u, v ∈ [0, 1].
Since copulas are invariant under strictly increasing transformations of their components (see [4] (Theorem 2.4.3)), we have that, for α = (1, 1), X ⪯_PD(α) Y is equivalent to C_X(u, v) ≤ C_Y(u, v) for all u, v ∈ [0, 1]. Furthermore, using [4] (Theorem 2.4.4), it follows that, for α = (−1, −1), C_{X′}(u, v) = u + v − 1 + C_X(1 − u, 1 − v), and analogously for C_{Y′}(u, v). So, for this case, X ⪯_PD(α) Y is equivalent to C_X(1 − u, 1 − v) ≤ C_Y(1 − u, 1 − v) for all u, v ∈ [0, 1]; that is, (i) is obtained.
By using [4] (Theorem 2.4.4), the result in (ii) is obtained following the same steps as above, which completes the proof. □
In the sequel, with the use of copulas, for the PD(α) order we will use the notations X ⪯_PD(α) Y and C_X ⪯_PD(α) C_Y interchangeably.
Example 4. 
Let C_θ^CA be the parametric family of Cuadras–Augé 2-copulas given by C_θ^CA(u, v) = (uv)^{1−θ} min{u, v}^θ for every (u, v) ∈ [0, 1]², where θ ∈ [0, 1] (see [4,27]). In [4] (Example 2.19), it is shown that C_{θ_1}^CA ⪯_PD(1,1) C_{θ_2}^CA for θ_1 ≤ θ_2. Furthermore, if α_1 · α_2 = 1 (respectively, α_1 · α_2 = −1), we have C_{θ_1}^CA ⪯_PD(α) C_{θ_2}^CA if and only if θ_1 ≤ θ_2 (respectively, θ_2 ≤ θ_1).
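The pointwise ordering stated in Example 4 can be confirmed numerically; the parameter values and grid in this sketch are our own choices:

```python
# Pointwise ordering of two Cuadras-Augé copulas: theta1 <= theta2 should
# give C_theta1 <= C_theta2 on (0, 1)^2.
def cuadras_auge(u, v, theta):
    return (u * v) ** (1 - theta) * min(u, v) ** theta

grid = [i / 20 for i in range(1, 20)]
t1, t2 = 0.2, 0.7   # arbitrary parameters in [0, 1] with t1 <= t2
assert all(cuadras_auge(u, v, t1) <= cuadras_auge(u, v, t2) + 1e-12
           for u in grid for v in grid)
```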
Example 5. 
Let C_{φ,δ}^AMH be the Ali–Mikhail–Haq (AMH) Archimedean 2-copula [28] given by
C_{φ,δ}^AMH(u, v) = uv / (1 − δ(1 − u)(1 − v))
for all (u, v) ∈ [0, 1]², with δ ∈ [−1, 1] and generator φ(t) = (1 − δ)/(e^t − δ). In [4] (Exercise 2.32), it is stated that C_{φ,δ_1}^AMH ⪯_PD(1,1) C_{φ,δ_2}^AMH for δ_1 ≤ δ_2. Furthermore, if α_1 · α_2 = 1 (respectively, α_1 · α_2 = −1), we have C_{φ,δ_1}^AMH ⪯_PD(α) C_{φ,δ_2}^AMH if, and only if, δ_1 ≤ δ_2 (respectively, δ_2 ≤ δ_1).
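A numerical confirmation for the AMH family, analogous to the previous one (the parameters are our own choices):

```python
# AMH copulas are pointwise increasing in delta: for delta1 <= delta2,
# C_delta1 <= C_delta2 on (0, 1)^2.
def amh(u, v, delta):
    return u * v / (1 - delta * (1 - u) * (1 - v))

grid = [i / 20 for i in range(1, 20)]
d1, d2 = -0.5, 0.8   # arbitrary parameters in [-1, 1] with d1 <= d2
assert all(amh(u, v, d1) <= amh(u, v, d2) + 1e-12 for u in grid for v in grid)
```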

3.2. The Trivariate Case

Usually, the properties and results obtained for 2-copulas become more difficult to develop in higher dimensions. Next, we illustrate this fact, focusing on the trivariate case for the sake of simplicity.
Following the same development as that of [9] (Theorem 2), the next result holds.
Theorem 4. 
Let X and Y be two trivariate random vectors with respective associated 3-copulas C_X and C_Y. Let C_{X_{ij}} and C_{Y_{ij}} denote the (i,j)-margins of C_X and C_Y, respectively, for 1 ≤ i < j ≤ 3. Then, for all (u, v, w) ∈ [0, 1]³, we have:
(i) 
X ⪯_PD(−1,−1,−1) Y if and only if
C_X(u, v, w) ≤ C_Y(u, v, w).
(ii) 
X ⪯_PD(−1,−1,1) Y if and only if
C_{X_{12}}(u, v) − C_X(u, v, w) ≤ C_{Y_{12}}(u, v) − C_Y(u, v, w).
(iii) 
X ⪯_PD(−1,1,−1) Y if and only if
C_{X_{13}}(u, w) − C_X(u, v, w) ≤ C_{Y_{13}}(u, w) − C_Y(u, v, w).
(iv) 
X ⪯_PD(1,−1,−1) Y if and only if
C_{X_{23}}(v, w) − C_X(u, v, w) ≤ C_{Y_{23}}(v, w) − C_Y(u, v, w).
(v) 
X ⪯_PD(−1,1,1) Y if and only if
C_X(u, v, w) − C_{X_{12}}(u, v) − C_{X_{13}}(u, w) ≤ C_Y(u, v, w) − C_{Y_{12}}(u, v) − C_{Y_{13}}(u, w).
(vi) 
X ⪯_PD(1,−1,1) Y if and only if
C_X(u, v, w) − C_{X_{12}}(u, v) − C_{X_{23}}(v, w) ≤ C_Y(u, v, w) − C_{Y_{12}}(u, v) − C_{Y_{23}}(v, w).
(vii) 
X ⪯_PD(1,1,−1) Y if and only if
C_X(u, v, w) − C_{X_{13}}(u, w) − C_{X_{23}}(v, w) ≤ C_Y(u, v, w) − C_{Y_{13}}(u, w) − C_{Y_{23}}(v, w).
(viii) 
X ⪯_PD(1,1,1) Y if and only if
C_{X_{12}}(u, v) + C_{X_{13}}(u, w) + C_{X_{23}}(v, w) − C_X(u, v, w) ≤ C_{Y_{12}}(u, v) + C_{Y_{13}}(u, w) + C_{Y_{23}}(v, w) − C_Y(u, v, w).
Proof. 
Let X be the random vector with joint distribution function F_X and associated 3-copula C_X. For α = (−1, −1, −1), writing x_i in place of −x_i (and assuming continuity), we have
P[α_1 X_1 > x_1, α_2 X_2 > x_2, α_3 X_3 > x_3] = P[X_1 ≤ x_1, X_2 ≤ x_2, X_3 ≤ x_3] = F_X(x_1, x_2, x_3)
for all (x_1, x_2, x_3) ∈ R̄³. From Sklar's theorem, we have
F_X(x_1, x_2, x_3) = C_X(F_1(x_1), F_2(x_2), F_3(x_3)) = C_X(u_1, u_2, u_3)
for all (u_1, u_2, u_3) ∈ [0, 1]³, whence part (i) easily follows.
To prove part (ii), note that, for α = (−1, −1, 1), we similarly have
P[α_1 X_1 > x_1, α_2 X_2 > x_2, α_3 X_3 > x_3] = P[X_1 ≤ x_1, X_2 ≤ x_2, X_3 > x_3] = P[X_1 ≤ x_1, X_2 ≤ x_2] − P[X_1 ≤ x_1, X_2 ≤ x_2, X_3 ≤ x_3] = F_{1,2}(x_1, x_2) − F_X(x_1, x_2, x_3)
for all (x_1, x_2, x_3) ∈ R̄³, where F_{i,j} denotes the (i,j)-margin of F_X for 1 ≤ i < j ≤ 3. Parts (iii) and (iv) can be proved in a similar way.
Part (v), and similarly parts (vi) and (vii), follows from the fact that, for α = (−1, 1, 1) and for all (x_1, x_2, x_3) ∈ R̄³, we have
P[α_1 X_1 > x_1, α_2 X_2 > x_2, α_3 X_3 > x_3] = P[X_1 ≤ x_1, X_2 > x_2, X_3 > x_3] = P[X_1 ≤ x_1, X_2 > x_2] − P[X_1 ≤ x_1, X_2 > x_2, X_3 ≤ x_3] = P[X_1 ≤ x_1] − P[X_1 ≤ x_1, X_2 ≤ x_2] − P[X_1 ≤ x_1, X_3 ≤ x_3] + P[X_1 ≤ x_1, X_2 ≤ x_2, X_3 ≤ x_3] = F_1(x_1) − F_{1,2}(x_1, x_2) − F_{1,3}(x_1, x_3) + F_X(x_1, x_2, x_3).
Finally, for part (viii), note that, for α = (1, 1, 1), for every (x_1, x_2, x_3) ∈ R̄³, and using the decomposition in part (v), we have
P[X_1 > x_1, X_2 > x_2, X_3 > x_3] = P[X_2 > x_2, X_3 > x_3] − P[X_1 ≤ x_1, X_2 > x_2, X_3 > x_3] = P[X_2 > x_2] − P[X_2 > x_2, X_3 ≤ x_3] − P[X_1 ≤ x_1, X_2 > x_2, X_3 > x_3] = 1 − F_2(x_2) − F_3(x_3) + F_{2,3}(x_2, x_3) − P[X_1 ≤ x_1, X_2 > x_2, X_3 > x_3] = 1 − F_1(x_1) − F_2(x_2) − F_3(x_3) + F_{1,2}(x_1, x_2) + F_{1,3}(x_1, x_3) + F_{2,3}(x_2, x_3) − F_X(x_1, x_2, x_3),
which completes the proof. □
Example 6. 
Let C_θ^FGM be the one-parameter 3-copula given by
C_θ^FGM(u) = u_1u_2u_3 (1 + θ(1 − u_1)(1 − u_2)(1 − u_3)), u ∈ [0, 1]³,
with θ ∈ [−1, 1]. C_θ^FGM belongs to the Farlie–Gumbel–Morgenstern family of 3-copulas (see [4,21]). Consider two members of this family, say C_{θ_1}^FGM and C_{θ_2}^FGM. For ∏_{i=1}^3 α_i = −1 (respectively, ∏_{i=1}^3 α_i = 1), we have that C_{θ_1}^FGM ⪯_PD(α) C_{θ_2}^FGM if and only if θ_1 ≤ θ_2 (respectively, θ_2 ≤ θ_1).
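The FGM 3-copulas are pointwise increasing in θ, which by Theorem 4(i) gives the ordering of Example 6 in the all-minus direction; a numerical check (our own sketch, with our parameter choices):

```python
# FGM three-copulas: for theta1 <= theta2 the copulas are ordered pointwise.
def fgm3(u, v, w, theta):
    return u * v * w * (1 + theta * (1 - u) * (1 - v) * (1 - w))

grid = [i / 10 for i in range(1, 10)]
t1, t2 = 0.1, 0.9   # arbitrary parameters with t1 <= t2
assert all(fgm3(u, v, w, t1) <= fgm3(u, v, w, t2) + 1e-12
           for u in grid for v in grid for w in grid)
```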
An additional example involving a bi-parametric family of three-copulas and the three-copula for three independent random variables is given in [9] (Example 3).
The following result, in which =_st denotes equality in distribution, shows that if two (trivariate) random vectors are ordered in all directions, then they have the same distribution.
Theorem 5. 
Given two trivariate random vectors X and Y, we have that X ⪯_PD(α) Y for every direction α ∈ R³ with |α_i| = 1, i = 1, 2, 3, if and only if X =_st Y.
Proof. 
Assume that X ⪯_PD(α) Y for all such directions α. From items (i) and (iii) in Theorem 4 and for every (u, v, w) ∈ [0, 1]³, it follows that C_{X_{13}}(u, w) − C_{Y_{13}}(u, w) ≤ C_X(u, v, w) − C_Y(u, v, w) ≤ 0, and therefore C_{X_{13}}(u, w) ≤ C_{Y_{13}}(u, w). On the other hand, from (ii) and (v), we obtain 0 ≤ (C_X(u, v, w) − C_Y(u, v, w)) − (C_{X_{12}}(u, v) − C_{Y_{12}}(u, v)) ≤ C_{X_{13}}(u, w) − C_{Y_{13}}(u, w), and therefore C_{Y_{13}}(u, w) ≤ C_{X_{13}}(u, w). Combining both results, it follows that C_{X_{13}}(u, w) = C_{Y_{13}}(u, w). Similarly, using the remaining items, we obtain C_{X_{12}}(u, v) = C_{Y_{12}}(u, v) and C_{X_{23}}(v, w) = C_{Y_{23}}(v, w). From (iii), we then have C_Y(u, v, w) ≤ C_X(u, v, w) + C_{Y_{13}}(u, w) − C_{X_{13}}(u, w) = C_X(u, v, w), and hence, together with (i), we conclude C_Y(u, v, w) = C_X(u, v, w). Finally, marginalizing in the directions ±1 shows that the univariate marginals also coincide, whence X =_st Y. □

3.3. The PD(−1) Order for Archimedean n-Copulas

The next result, whose proof can be found in [29] for the bivariate case and in [30] for the general case, shows, under some conditions, the PD(−1) ordering of two Archimedean n-copulas. For that, we recall that a function f defined on [0, +∞] is super-additive if f(x + y) ≥ f(x) + f(y) for all x, y ∈ [0, +∞].
Proposition 2. 
For two Archimedean n-copulas C_1 and C_2 with respective generators φ_1 and φ_2, if φ_2^{-1} ∘ φ_1 is super-additive, then C_1 ⪯_PD(−1) C_2.
As an application of Proposition 2, we provide an example.
Example 7. 
For all t ∈ [0, +∞], given the generators φ_1(t) = (1 − δ)/(e^t − δ), with δ ∈ [0, 1), and φ_2(t) = (1 + γt)^{−1/γ}, with γ > 0, we consider the generalizations to n dimensions of the AMH family of Archimedean 2-copulas, denoted by C_{n,φ_1,δ}^AMH, given in Example 5 (see [31]), and a Clayton subfamily of Archimedean n-copulas, denoted by C_{n,φ_2,γ}^C (see [31,32]). For the sake of simplicity, we consider γ = 1. Since (φ_2^{-1} ∘ φ_1)(t) = (e^t − δ)/(1 − δ) − 1 for all t ∈ [0, +∞], we have that, for all x, y ∈ [0, +∞],
(φ_2^{-1} ∘ φ_1)(x + y) ≥ (φ_2^{-1} ∘ φ_1)(x) + (φ_2^{-1} ∘ φ_1)(y)
if and only if
(e^{x+y} − δ)/(1 − δ) − 1 ≥ (e^x − δ)/(1 − δ) − 1 + (e^y − δ)/(1 − δ) − 1,
which is equivalent to
e^{x+y} ≥ e^x + e^y − 1,
i.e.,
(e^x − 1)(e^y − 1) ≥ 0,
whence φ_2^{-1} ∘ φ_1 is super-additive, and hence, from Proposition 2, we have C_{n,φ_1,δ}^AMH ⪯_PD(−1) C_{n,φ_2,1}^C.
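A numerical confirmation of Example 7 for n = 3 (the grid, the values of δ, and the helper names are our own choices): the AMH copula should lie below the Clayton copula with γ = 1 pointwise, which is the PD(−1) ordering:

```python
import math

# AMH and Clayton (gamma = 1) Archimedean 3-copulas, built from their generators.
def amh3(u, delta):
    phi = lambda t: (1 - delta) / (math.exp(t) - delta)
    phi_inv = lambda s: math.log((1 - delta) / s + delta)
    return phi(sum(map(phi_inv, u)))

def clayton3(u):
    return 1.0 / (1.0 + sum(1.0 / ui - 1.0 for ui in u))   # gamma = 1

pts = [(a / 10, b / 10, c / 10) for a in range(1, 10)
       for b in range(1, 10) for c in range(1, 10)]
for delta in (0.0, 0.3, 0.7):
    assert all(amh3(u, delta) <= clayton3(u) + 1e-12 for u in pts)
```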
Remark 1. 
As Nelsen [4] notes, verifying the super-additivity of φ_2^{-1} ∘ φ_1 is not easy, but there exist several results that give sufficient conditions for that super-additivity (in principle, for the bivariate case) which can be generalized to n dimensions. We refer to [4,29,33] for more details.
However, in general, Archimedean copulas are not ordered in the sense of the PD(α) order for α ≠ −1, as the following example shows.
Example 8. 
For all t [ 0 , + ] , given the generator ϕ 3 , β ( t ) = e x p t 1 / β , with β [ 1 , + ] , we consider the Gumbel–Hougaard family of Archimedean two-copulas (see [4,34,35]). A generalization to n dimensions of this family, which we denote by C n , ϕ 3 , β GH , can be found in [4] (Example 4.25). We consider two members of this family, i.e., C n , ϕ 3 , β i GH for i = 1 , 2 . In [4], (Example 4.12), it is shown that C 2 , ϕ 3 , β 2 GH P D ( 1 ) C 2 , ϕ 3 , β 1 GH if and only if β 2 β 1 . From Proposition 2, it is easy to prove that “ C n , ϕ 3 , β 2 GH P D ( 1 ) C n , ϕ 3 , β 1 GH if, and only if, β 2 β 1 ” is also satisfied.
Now, let us consider the direction $\boldsymbol{\alpha}=(-1,-1,1)$. For $(u,v,w)=(0.43,0.52,0.43)$, by using Theorem 4(ii), we have $C^{GH}_{3,\phi_3,3}(0.43,0.52,1)-C^{GH}_{3,\phi_3,3}(0.43,0.52,0.43)=0.06>0.02=C^{GH}_{3,\phi_3,8}(0.43,0.52,1)-C^{GH}_{3,\phi_3,8}(0.43,0.52,0.43)$; however, for $(u,v,w)=(0.29,0.26,0.1)$, we obtain $C^{GH}_{3,\phi_3,3}(0.29,0.26,1)-C^{GH}_{3,\phi_3,3}(0.29,0.26,0.1)=0.11<0.14=C^{GH}_{3,\phi_3,8}(0.29,0.26,1)-C^{GH}_{3,\phi_3,8}(0.29,0.26,0.1)$; therefore, these three-copulas are not ordered in this direction $\boldsymbol{\alpha}$.
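Both comparisons in Example 8 can be reproduced numerically. A short Python sketch, assuming the usual Gumbel–Hougaard form $C(u,v,w)=\exp\{-[(-\ln u)^{\beta}+(-\ln v)^{\beta}+(-\ln w)^{\beta}]^{1/\beta}\}$ (the helper names `gumbel3` and `d` are ours):

```python
import math

def gumbel3(u, v, w, beta):
    """Gumbel-Hougaard 3-copula: exp(-((-ln u)^b + (-ln v)^b + (-ln w)^b)^(1/b))."""
    s = (-math.log(u))**beta + (-math.log(v))**beta + (-math.log(w))**beta
    return math.exp(-s**(1.0 / beta))

# PD(1) ordering: C_{beta2} <= C_{beta1} pointwise when beta2 <= beta1
# (checked on a grid for beta2 = 3, beta1 = 8).
pts = [0.1 * k for k in range(1, 10)]
assert all(gumbel3(u, v, w, 3) <= gumbel3(u, v, w, 8) + 1e-12
           for u in pts for v in pts for w in pts)

# The directional comparison C(u,v,1) - C(u,v,w) flips between the two points:
d = lambda b, u, v, w: gumbel3(u, v, 1.0, b) - gumbel3(u, v, w, b)
assert d(3, 0.43, 0.52, 0.43) > d(8, 0.43, 0.52, 0.43)  # strict, as in the text
assert d(3, 0.29, 0.26, 0.1) < d(8, 0.29, 0.26, 0.1)    # reversed inequality
```

The two opposite strict inequalities confirm that the two members of the family are not comparable in this direction.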

3.4. Directional Coefficients

One of the most important nonparametric measures of association between the components of a continuous random pair $(X,Y)$ is Spearman's rho, which we denote by $\rho(C)$, where $C$ is the two-copula associated with the pair $(X,Y)$; it is given by
$$\rho(C)=12\int_{[0,1]^2}C(u,v)\,\mathrm{d}u\,\mathrm{d}v-3$$
(see [4] and the references therein for more details). This measure of association—in fact, a measure of concordance [36]—provides information about the magnitude and direction of the association between the random variables.
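As a concrete illustration, $\rho(C)$ can be approximated by a midpoint-rule evaluation of the integral above. The sketch below uses the bivariate FGM copula $C_\theta(u,v)=uv[1+\theta(1-u)(1-v)]$, a standard example for which $\rho(C_\theta)=\theta/3$ is well known (the helper name `rho` is ours):

```python
def rho(C, n=400):
    """Midpoint-rule approximation of rho(C) = 12 * int_{[0,1]^2} C - 3."""
    h = 1.0 / n
    pts = [(k + 0.5) * h for k in range(n)]
    return 12.0 * sum(C(u, v) for u in pts for v in pts) * h * h - 3.0

# Bivariate FGM copula C_theta(u, v) = uv(1 + theta(1-u)(1-v)).
fgm = lambda theta: (lambda u, v: u * v * (1 + theta * (1 - u) * (1 - v)))

for theta in (-1.0, 0.0, 0.5, 1.0):
    assert abs(rho(fgm(theta)) - theta / 3.0) < 1e-3
```

Since the FGM copula is polynomial, the midpoint rule converges quickly and a 400-point grid per coordinate is more than enough for the tolerance used.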
As an immediate consequence of Theorem 3, we have the following result.
Corollary 1. 
Let $\mathbf{X}=(X_1,X_2)$ and $\mathbf{Y}=(Y_1,Y_2)$ be two random vectors with respective associated two-copulas $C_{\mathbf{X}}$ and $C_{\mathbf{Y}}$. Let $\boldsymbol{\alpha}=(\alpha_1,\alpha_2)$ be a direction such that $|\alpha_i|=1$, $i=1,2$.
(i) 
If $\alpha_1\cdot\alpha_2=1$, then $\mathbf{X}\preceq_{PD(\boldsymbol{\alpha})}\mathbf{Y}$ implies $\rho(C_{\mathbf{X}})\le\rho(C_{\mathbf{Y}})$.
(ii) 
If $\alpha_1\cdot\alpha_2=-1$, then $\mathbf{X}\preceq_{PD(\boldsymbol{\alpha})}\mathbf{Y}$ implies $\rho(C_{\mathbf{Y}})\le\rho(C_{\mathbf{X}})$.
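A minimal numerical illustration of Corollary 1(i), under the assumption (consistent with the definitions above) that for $\boldsymbol{\alpha}=(1,1)$ the $PD(\boldsymbol{\alpha})$ order amounts to the pointwise ordering of the two-copulas; the bivariate FGM family is used because its Spearman's rho, $\theta/3$, is known in closed form:

```python
# FGM family: C_theta(u, v) = uv(1 + theta(1-u)(1-v)); theta1 <= theta2
# gives C_theta1 <= C_theta2 pointwise, hence the rho values are ordered too.
fgm = lambda theta: (lambda u, v: u * v * (1 + theta * (1 - u) * (1 - v)))
C_low, C_high = fgm(0.2), fgm(0.8)

pts = [0.05 * k for k in range(1, 20)]
assert all(C_low(u, v) <= C_high(u, v) for u in pts for v in pts)

# Conclusion of Corollary 1(i): rho(C_X) <= rho(C_Y), here theta/3 on each side.
assert 0.2 / 3.0 <= 0.8 / 3.0
```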
Now, we consider the trivariate case (given the complexity in higher dimensions). For a trivariate random vector $(X_1,X_2,X_3)$ of continuous random variables uniform on $[0,1]$, whose distribution function is the 3-copula $C$, the directional $\rho$-coefficients are defined, for each direction $(\alpha_1,\alpha_2,\alpha_3)$ with $|\alpha_i|=1$ for $i=1,2,3$, as
$$\rho_3^{(\alpha_1,\alpha_2,\alpha_3)}(C)=8\int_{[0,1]^3}Q^{\alpha_1\alpha_2\alpha_3}(x_1,x_2,x_3)\,\mathrm{d}x_1\,\mathrm{d}x_2\,\mathrm{d}x_3,$$
where
$$Q^{\alpha_1\alpha_2\alpha_3}(x_1,x_2,x_3)=P\left[\bigcap_{i=1}^{3}(\alpha_i X_i>\alpha_i x_i)\right]-\prod_{i=1}^{3}P\left[\alpha_i X_i>\alpha_i x_i\right]$$
(see [10]). Unlike Spearman's rho, the coefficient $\rho_3^{(\alpha_1,\alpha_2,\alpha_3)}$ is not a multivariate measure of association.
Example 9. 
Consider the subfamily of FGM three-copulas given by (12). Then, it is easy to show that: (i) for $\prod_{i=1}^{3}\alpha_i=1$, we have $\rho_3^{(\alpha_1,\alpha_2,\alpha_3)}\left(C^{FGM}_{\theta}\right)=-\theta/27$; and (ii) for $\prod_{i=1}^{3}\alpha_i=-1$, we have $\rho_3^{(\alpha_1,\alpha_2,\alpha_3)}\left(C^{FGM}_{\theta}\right)=\theta/27$.
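Assuming the standard trivariate FGM form $C_\theta(u,v,w)=uvw[1+\theta(1-u)(1-v)(1-w)]$ (our reading of the subfamily (12); its bivariate margins are the independence copula), the directional coefficients can be checked by quadrature for the two extreme directions (the helper name `rho3` is ours):

```python
def rho3(alpha, theta, n=40):
    """Midpoint-rule value of 8 * int Q^alpha for the trivariate FGM copula."""
    h = 1.0 / n
    pts = [(k + 0.5) * h for k in range(n)]
    C = lambda x, y, z: x * y * z * (1 + theta * (1 - x) * (1 - y) * (1 - z))
    total = 0.0
    for x in pts:
        for y in pts:
            for z in pts:
                if alpha == (-1, -1, -1):
                    # Q = P(X < x, Y < y, Z < z) - xyz
                    q = C(x, y, z) - x * y * z
                else:  # alpha == (1, 1, 1): survival-function version
                    cbar = 1 - x - y - z + x*y + x*z + y*z - C(x, y, z)
                    q = cbar - (1 - x) * (1 - y) * (1 - z)
                total += q
    return 8.0 * total * h**3

theta = 0.5
assert abs(rho3((-1, -1, -1), theta) - theta / 27.0) < 1e-3   # product -1
assert abs(rho3((1, 1, 1), theta) + theta / 27.0) < 1e-3      # product +1
```

The quadrature reproduces the closed-form values $\pm\theta/27$ under this assumed parametrization; directions with the same product $\prod_i\alpha_i$ give the same coefficient by the symmetry of the FGM copula.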
As a consequence of our findings, we have the following result.
Corollary 2. 
Let $\mathbf{X}$ and $\mathbf{Y}$ be two trivariate vectors of continuous random variables uniform on $[0,1]$ whose respective distribution functions are the three-copulas $C_{\mathbf{X}}$ and $C_{\mathbf{Y}}$. Let $\boldsymbol{\alpha}=(\alpha_1,\alpha_2,\alpha_3)$ be a direction such that $|\alpha_i|=1$, $i=1,2,3$. If $\mathbf{X}\preceq_{PD(\boldsymbol{\alpha})}\mathbf{Y}$, then $\rho_3^{(\alpha_1,\alpha_2,\alpha_3)}(C_{\mathbf{X}})\le\rho_3^{(\alpha_1,\alpha_2,\alpha_3)}(C_{\mathbf{Y}})$.
We note that Corollary 2 generalizes the known bivariate fact that the POD order between two random vectors implies the corresponding order between their Spearman's $\rho$ coefficients (see, for instance, [5]).

4. Conclusions

In this paper, we have defined a multivariate order based on the concept of orthant directional dependence and studied several of its properties, including its relationships with other dependence orders given in the literature. We have also analyzed the order between two random vectors in terms of their associated copulas and illustrated the results with several examples.
The study of Baire category results for the stochastic orders introduced in this work (similar to those investigated in [37]) is a focal point for future research.

Author Contributions

Methodology, M.d.R.R.-G. and M.Ú.-F.; Validation, E.d.A., M.d.R.R.-G. and M.Ú.-F.; Formal analysis, E.d.A.; Investigation, E.d.A.; Writing—original draft, M.d.R.R.-G.; Writing—review & editing, M.Ú.-F.; Supervision, M.Ú.-F. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by the program ERDF-Andalucía 2014-2020 (University of Almería) under research project UAL2020-AGR-B1783 and project PID2021-122657OB-I00 by the Ministry of Science and Innovation (Spain). The first author is also partially supported by the CDTIME of the University of Almería.

Data Availability Statement

No new data were created or analyzed in this study.

Acknowledgments

The authors acknowledge the comments and suggestions of three anonymous reviewers.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Jogdeo, K. Concepts of dependence. In Encyclopedia of Statistical Sciences; Kotz, S., Johnson, N.L., Eds.; Wiley: New York, NY, USA, 1982; Volume 1, pp. 324–334.
2. Colangelo, A.; Scarsini, M.; Shaked, M. Some positive dependence stochastic orders. J. Multivar. Anal. 2006, 97, 46–78.
3. Colangelo, A.; Scarsini, M.; Shaked, M. Some notions of multivariate positive dependence. Insur. Math. Econ. 2005, 37, 13–26.
4. Nelsen, R.B. An Introduction to Copulas, 2nd ed.; Springer: New York, NY, USA, 2006.
5. Joe, H. Multivariate Models and Dependence Concepts; Chapman & Hall: London, UK, 1997.
6. Puccetti, G.; Wang, R. Extremal dependence concepts. Statist. Sci. 2015, 30, 485–517.
7. Shaked, M. A general theory of some positive dependence notions. J. Multivar. Anal. 1982, 12, 199–218.
8. Zamani, Z.; Mohtashami Borzadaran, G.R.; Amini, M. On a new positive dependence concept based on the conditional mean inactivity time order. Comm. Statist. Theory Methods 2017, 46, 1779–1787.
9. Quesada-Molina, J.J.; Úbeda-Flores, M. Directional dependence of random vectors. Inf. Sci. 2012, 215, 67–74.
10. Nelsen, R.B.; Úbeda-Flores, M. Directional dependence in multivariate distributions. Ann. Inst. Stat. Math. 2012, 64, 677–685.
11. Kimeldorf, G.; Sampson, A.R. Positive dependence orderings. Ann. Inst. Statist. Math. 1987, 39, 113–128.
12. Kimeldorf, G.; Sampson, A.R. A framework for positive dependence. Ann. Inst. Statist. Math. 1989, 41, 31–45.
13. Shaked, M.; Shanthikumar, J.G. Stochastic Orders; Springer: New York, NY, USA, 2007.
14. Fernández-Ponce, J.M.; Rodríguez-Griñolo, M.R. New properties of the orthant convex-type stochastic orders. Test 2017, 26, 618–637.
15. Müller, A.; Scarsini, M. Some remarks on the supermodular order. J. Multivar. Anal. 2000, 73, 107–119.
16. Billingsley, P. Convergence of Probability Measures, 2nd ed.; John Wiley & Sons, Inc.: New York, NY, USA, 1999.
17. van der Vaart, A.W.; Wellner, J.A. Weak Convergence and Empirical Processes, 2nd ed.; Springer: New York, NY, USA, 2000.
18. Fisher, N.I. Copulas. In Encyclopedia of Statistical Sciences; Kotz, S., Read, C.B., Banks, D.L., Eds.; Wiley: New York, NY, USA, 1997; Volume 1, pp. 159–163.
19. Sklar, A. Fonctions de répartition à n dimensions et leurs marges. Publ. Inst. Statist. Univ. Paris 1959, 8, 229–231.
20. Úbeda-Flores, M.; Fernández-Sánchez, J. Sklar's theorem: The cornerstone of the Theory of Copulas. In Copulas and Dependence Models with Applications; Úbeda Flores, M., de Amo Artero, E., Durante, F., Fernández Sánchez, J., Eds.; Springer: Cham, Switzerland, 2017; pp. 241–258.
21. Durante, F.; Sempi, C. Principles of Copula Theory; Chapman & Hall/CRC: Boca Raton, FL, USA, 2016.
22. Müller, A.; Scarsini, M. Archimedean copulae and positive dependence. J. Multivar. Anal. 2006, 93, 434–445.
23. Navarro, J.; Pellerey, F.; Sordo, M.A. Weak dependence notions and their mutual relationships. Mathematics 2021, 9, 81.
24. Wei, Z.; Wang, T.; Nguyen, P.A. Multivariate dependence concepts through copulas. Int. J. Approx. Reason. 2015, 65, 24–33.
25. Wei, Z.; Wang, T.; Panichkitkosolkul, W. Dependence and association concepts through copulas. In Modeling Dependence in Econometrics—Advances in Intelligent Systems and Computing; Huynh, V.N., Kreinovich, V., Sriboonchitta, S., Eds.; Springer: Cham, Switzerland, 2014; Volume 251, pp. 113–126.
26. McNeil, A.J.; Nešlehová, J. Multivariate Archimedean copulas, d-monotone functions and ℓ1-norm symmetric distributions. Ann. Stat. 2009, 37, 3059–3097.
27. Cuadras, C.M.; Augé, J. A continuous general multivariate distribution and its properties. Comm. Statist. Theory Methods 1981, 10, 339–353.
28. Ali, M.M.; Mikhail, N.N.; Haq, M.S. A class of bivariate distributions including the bivariate logistic. J. Multivar. Anal. 1978, 8, 405–412.
29. Schweizer, B.; Sklar, A. Probabilistic Metric Spaces; Dover Publications, Inc.: New York, NY, USA, 2011.
30. Li, X.; Fang, R. Ordering properties of order statistics from random variables of Archimedean copulas with applications. J. Multivar. Anal. 2015, 133, 304–320.
31. Pérez, A.; Prieto-Alaiz, M.; Chamizo, F.; Liebscher, E.; Úbeda-Flores, M. Nonparametric estimation of the multivariate Spearman's footrule: A further discussion. Fuzzy Sets Syst. 2023, 467, 108489.
32. Clayton, D.G. A model for association in bivariate life tables and its application in epidemiological studies of familial tendency in chronic disease incidence. Biometrika 1978, 65, 141–151.
33. Genest, C.; MacKay, J. Copules archimédiennes et familles de lois bidimensionnelles dont les marges sont données. Canad. J. Statist. 1986, 14, 145–159.
34. Gumbel, E.J. Distributions des valeurs extrêmes en plusieurs dimensions. Publ. Inst. Statist. Univ. Paris 1960, 9, 171–173.
35. Hougaard, P. A class of multivariate failure time distributions. Biometrika 1986, 73, 671–678.
36. Scarsini, M. On measures of concordance. Stochastica 1984, 8, 201–218.
37. Durante, F.; Fernández-Sánchez, J.; Ignazzi, C. Baire category results for stochastic orders. Rev. Real Acad. Cienc. Exactas Fis. Nat. A Mat. 2022, 116, 188.