Article

Multivariate Extended Gamma Distribution

by
Dhannya P. Joseph
Department of Statistics, Kuriakose Elias College, Mannanam, Kottayam, Kerala 686561, India
Axioms 2017, 6(2), 11; https://doi.org/10.3390/axioms6020011
Submission received: 7 June 2016 / Revised: 7 April 2017 / Accepted: 13 April 2017 / Published: 24 April 2017

Abstract

In this paper, I consider multivariate analogues of the extended gamma density, which provide multivariate extensions to Tsallis statistics and superstatistics. By making use of the pathway parameter $\beta$, the multivariate generalized gamma density can be obtained from the model considered here. Some of its special cases and limiting cases are also mentioned. The conditional density, the best predictor function, regression theory, etc., connected with this model are also introduced.

1. Introduction

Consider the generalized gamma density of the form
$$ g(x) = c_1\, x^{\gamma} e^{-a x^{\delta}}, \qquad x \ge 0,\ a > 0,\ \delta > 0,\ \gamma + 1 > 0, \tag{1} $$
where $c_1 = \frac{\delta\, a^{\frac{\gamma+1}{\delta}}}{\Gamma\left(\frac{\gamma+1}{\delta}\right)}$ is the normalizing constant. Note that this is a generalization of several standard statistical densities, such as the gamma, Weibull, exponential, Maxwell-Boltzmann, Rayleigh and many more. We extend the generalized gamma density by using the pathway model of [1], which gives the extended function
$$ g_1(x) = c_2\, x^{\gamma}\left[1 + a(\beta-1)x^{\delta}\right]^{-\frac{1}{\beta-1}}, \qquad x \ge 0,\ \beta > 1,\ a > 0,\ \delta > 0, \tag{2} $$
where $c_2 = \frac{\delta\,(a(\beta-1))^{\frac{\gamma+1}{\delta}}\,\Gamma\left(\frac{1}{\beta-1}\right)}{\Gamma\left(\frac{\gamma+1}{\delta}\right)\Gamma\left(\frac{1}{\beta-1}-\frac{\gamma+1}{\delta}\right)}$ is the normalizing constant.
Note that $g_1(x)$ is a generalized type-2 beta model. Also $\lim_{\beta \to 1} g_1(x) = g(x)$, so that $g_1$ can be considered an extended form of $g(x)$. As the pathway parameter $\beta$ varies, a path is created, along which one can see the movement of the function $g_1(x)$ towards the generalized gamma density. From Figure 1 we can see that, as $\beta$ moves away from 1, the function $g_1(x)$ moves away from the origin and becomes thicker-tailed and less peaked. Along the path created by $\beta$ we thus obtain densities with thicker or thinner tails compared to the generalized gamma density. Observe that for $\beta < 1$, writing $\beta - 1 = -(1-\beta)$ in Equation (2) produces the generalized type-1 beta form, which is given by
$$ g_2(x) = c_3\, x^{\gamma}\left[1 - a(1-\beta)x^{\delta}\right]^{\frac{1}{1-\beta}}, \qquad 1 - a(1-\beta)x^{\delta} \ge 0,\ \beta < 1,\ a > 0,\ \delta > 0, $$
where $c_3 = \frac{\delta\,(a(1-\beta))^{\frac{\gamma+1}{\delta}}\,\Gamma\left(\frac{1}{1-\beta}+1+\frac{\gamma+1}{\delta}\right)}{\Gamma\left(\frac{\gamma+1}{\delta}\right)\Gamma\left(\frac{1}{1-\beta}+1\right)}$ is the normalizing constant (see [2]).
From Figure 1, one can see the movement of the extended gamma density $g_1(x)$ towards the generalized gamma density for various values of the pathway parameter $\beta$. Beck and Cohen's superstatistics belong to the case in (2) [3,4]. For $\gamma = 1$, $a = 1$, $\delta = 1$ we have Tsallis statistics [5,6] for $\beta > 1$ from (2).
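The pathway behaviour just described can be illustrated numerically. The short Python sketch below is an added illustration (not part of the original paper): it evaluates $g(x)$ and $g_1(x)$ using the constants $c_1$ and $c_2$ given above, checks that $g_1$ integrates to one, and shows that $g_1$ approaches $g$ as $\beta \to 1^{+}$. The parameter choices $\gamma = 1$, $a = 1$, $\delta = 2$ and the use of scipy are my own assumptions.

```python
import numpy as np
from scipy.special import gammaln
from scipy.integrate import quad

def g(x, a=1.0, delta=2.0, gam=1.0):
    """Generalized gamma density (1)."""
    r = (gam + 1.0) / delta
    c1 = delta * a**r / np.exp(gammaln(r))
    return c1 * x**gam * np.exp(-a * x**delta)

def g1(x, beta, a=1.0, delta=2.0, gam=1.0):
    """Extended (pathway) gamma density (2), beta > 1."""
    r = (gam + 1.0) / delta
    s = 1.0 / (beta - 1.0)
    # log of the normalizing constant c2, computed on the log scale for stability
    log_c2 = (np.log(delta) + r * np.log(a * (beta - 1.0))
              + gammaln(s) - gammaln(r) - gammaln(s - r))
    return np.exp(log_c2) * x**gam * (1.0 + a * (beta - 1.0) * x**delta)**(-s)

# each admissible member of the family integrates to one
print(quad(lambda t: g1(t, beta=1.5), 0, np.inf)[0])       # ~ 1.0

# the pathway: g1 approaches the generalized gamma density as beta -> 1+
xs = np.linspace(0.1, 3.0, 6)
print(np.abs(g1(xs, beta=1.001) - g(xs)).max())            # close to 0
```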
Several multivariate extensions of the univariate gamma distribution exist in the literature [7,8,9]. In this paper we consider a multivariate analogue of the extended gamma density (2) and some of its properties.

2. Multivariate Extended Gamma

Various multivariate generalizations of the pathway model are discussed in the papers of Mathai [10,11]. Here we consider the multivariate case of the extended gamma density of the form (2). For $x_i \ge 0$, $i = 1, 2, \ldots, n$, let
$$ f_{\beta}(x_1, x_2, \ldots, x_n) = k_{\beta}\, x_1^{\gamma_1} x_2^{\gamma_2} \cdots x_n^{\gamma_n} \left[1 + (\beta-1)\left(a_1 x_1^{\delta_1} + a_2 x_2^{\delta_2} + \cdots + a_n x_n^{\delta_n}\right)\right]^{-\frac{\eta}{\beta-1}}, \tag{3} $$
$$ \beta > 1,\ \eta > 0,\ \delta_i > 0,\ a_i > 0,\ i = 1, 2, \ldots, n, $$
where $k_{\beta}$ is the normalizing constant, which will be given later. This multivariate analogue can also produce multivariate extensions to Tsallis statistics [5,12] and superstatistics [3]. Here the variables are not independently distributed, but as $\beta \to 1$ the variables $X_1, X_2, \ldots, X_n$ become independently distributed generalized gamma variables. That is,
$$ \lim_{\beta \to 1} f_{\beta}(x_1, x_2, \ldots, x_n) = f(x_1, x_2, \ldots, x_n) = k\, x_1^{\gamma_1} x_2^{\gamma_2} \cdots x_n^{\gamma_n}\, e^{-b_1 x_1^{\delta_1} - \cdots - b_n x_n^{\delta_n}}, \tag{4} $$
$$ x_i \ge 0,\ b_i = \eta a_i > 0,\ \delta_i > 0,\ i = 1, 2, \ldots, n, $$
where $k = \prod_{i=1}^{n} \frac{\delta_i\, b_i^{\frac{\gamma_i+1}{\delta_i}}}{\Gamma\left(\frac{\gamma_i+1}{\delta_i}\right)}$, $\gamma_i + 1 > 0$, $i = 1, 2, \ldots, n$.
The following are the graphs of the bivariate extended gamma with $\gamma_1 = 1$, $\gamma_2 = 1$, $a_1 = 1$, $a_2 = 1$, $\delta_1 = 2$, $\delta_2 = 2$ and various values of the pathway parameter $\beta$. From Figure 2, Figure 3 and Figure 4, we can see the effect of the pathway parameter $\beta$ in the model.
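The claim that the components are dependent for $\beta > 1$ but become independent generalized gamma variables as $\beta \to 1$ can be illustrated numerically. The following sketch is my own illustration (not from the paper): it normalizes the bivariate kernel of (3) by quadrature and compares the joint density at the point $(1, 1)$ with the product of the two marginals there. The value $\eta = 5$ is an arbitrary assumption, since the figure caption does not fix $\eta$.

```python
import numpy as np
from scipy.integrate import dblquad, quad

# Bivariate case of (3) with the Figure 2-4 parameters:
# gamma_i = 1, a_i = 1, delta_i = 2; eta = 5.0 is an arbitrary choice here.
ETA = 5.0

def kernel(x1, x2, beta):
    q = 1.0 + (beta - 1.0) * (x1**2 + x2**2)
    return x1 * x2 * q**(-ETA / (beta - 1.0))

def make_density(beta):
    # normalize numerically instead of using the closed form k_beta
    mass, _ = dblquad(lambda y, x: kernel(x, y, beta), 0, np.inf, 0, np.inf)
    return lambda x1, x2: kernel(x1, x2, beta) / mass

for beta in (2.0, 1.05):
    f = make_density(beta)
    m1 = quad(lambda x2: f(1.0, x2), 0, np.inf)[0]   # marginal of X1 at 1
    m2 = quad(lambda x1: f(x1, 1.0), 0, np.inf)[0]   # marginal of X2 at 1
    # a ratio of 1 would mean independence at the point (1, 1);
    # it differs from 1 for beta = 2 and moves towards 1 as beta -> 1
    print(beta, f(1.0, 1.0) / (m1 * m2))
```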

Special Cases and Limiting Cases

  • When $\beta \to 1$, (3) becomes the density of independently distributed generalized gamma variables. This includes multivariate analogues of the gamma, exponential, chi-square, Weibull, Maxwell-Boltzmann, Rayleigh, and related models.
  • If $n = 1$, $a_1 = 1$, $\delta_1 = 1$, $\beta = 2$, then (3) is identical with the type-2 beta density.
  • If $\beta = 2$, $a_1 = a_2 = \cdots = a_n = 1$, $\delta_1 = \delta_2 = \cdots = \delta_n = 1$ in (3), then (3) becomes the type-2 Dirichlet density,
    $$ D(x_1, x_2, \ldots, x_n) = d\, x_1^{\nu_1 - 1} x_2^{\nu_2 - 1} \cdots x_n^{\nu_n - 1} \left[1 + x_1 + x_2 + \cdots + x_n\right]^{-(\nu_1 + \cdots + \nu_{n+1})}, \qquad x_i \ge 0, \tag{5} $$
    where $\nu_i = \gamma_i + 1$, $i = 1, 2, \ldots, n$, $\nu_{n+1} = \eta - (\nu_1 + \cdots + \nu_n)$ and $d$ is the normalizing constant (see [13,14]).
A sample of the surface for $n = 2$ is given in Figure 5.
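The type-2 Dirichlet special case can be confirmed directly with the normalizing constant (9) given in the next section. The sketch below is an added illustration under my own (arbitrary) parameter choices: with $\beta = 2$, $a_i = \delta_i = 1$, the constant $k_{\beta}$ coincides with the type-2 Dirichlet constant $d$, and (3) and (5) agree pointwise.

```python
import numpy as np
from scipy.special import gammaln

gam = np.array([1.0, 0.5, 2.0])          # gamma_i, i = 1, 2, 3 (arbitrary choices)
eta = 8.0
nu = gam + 1.0                           # nu_i = gamma_i + 1
nu_next = eta - nu.sum()                 # nu_{n+1}

# k_beta from (9) with beta = 2, a_i = delta_i = 1
log_k = gammaln(eta) - np.sum(gammaln(nu)) - gammaln(eta - nu.sum())
# type-2 Dirichlet constant d: Gamma(nu_1 + ... + nu_{n+1}) / prod Gamma(nu_i)
log_d = gammaln(nu.sum() + nu_next) - np.sum(gammaln(nu)) - gammaln(nu_next)
print(np.exp(log_k), np.exp(log_d))      # identical

x = np.array([0.4, 1.2, 0.7])            # an arbitrary point with x_i >= 0
f3 = np.exp(log_k) * np.prod(x**gam) * (1.0 + x.sum())**(-eta)                          # density (3)
D  = np.exp(log_d) * np.prod(x**(nu - 1.0)) * (1.0 + x.sum())**(-(nu.sum() + nu_next))  # density (5)
print(f3, D)                             # identical
```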

3. Marginal Density

We can find the marginal density of $X_i$ by integrating out $X_1, X_2, \ldots, X_{i-1}, X_{i+1}, \ldots, X_{n-1}, X_n$. First let us integrate out $X_n$; then the joint density of $X_1, X_2, \ldots, X_{n-1}$, denoted by $f_1$, is given by
$$ f_1(x_1, x_2, \ldots, x_{n-1}) = \int_{x_n > 0} f_{\beta}(x_1, x_2, \ldots, x_n)\, \mathrm{d}x_n $$
$$ = k_{\beta}\, x_1^{\gamma_1} x_2^{\gamma_2} \cdots x_{n-1}^{\gamma_{n-1}} \left[1 + (\beta-1)\left(a_1 x_1^{\delta_1} + \cdots + a_{n-1} x_{n-1}^{\delta_{n-1}}\right)\right]^{-\frac{\eta}{\beta-1}} \int_{x_n > 0} x_n^{\gamma_n} \left[1 + C x_n^{\delta_n}\right]^{-\frac{\eta}{\beta-1}} \mathrm{d}x_n, \tag{6} $$
where $C = \frac{(\beta-1)a_n}{1 + (\beta-1)\left(a_1 x_1^{\delta_1} + \cdots + a_{n-1} x_{n-1}^{\delta_{n-1}}\right)}$. Putting $y = C x_n^{\delta_n}$ and integrating, we get
$$ f_1(x_1, x_2, \ldots, x_{n-1}) = k_{\beta}\, \frac{\Gamma\left(\frac{\eta}{\beta-1}-\frac{\gamma_n+1}{\delta_n}\right)\Gamma\left(\frac{\gamma_n+1}{\delta_n}\right)}{\delta_n \left[a_n(\beta-1)\right]^{\frac{\gamma_n+1}{\delta_n}} \Gamma\left(\frac{\eta}{\beta-1}\right)}\, x_1^{\gamma_1} x_2^{\gamma_2} \cdots x_{n-1}^{\gamma_{n-1}} \left[1 + (\beta-1)\left(a_1 x_1^{\delta_1} + a_2 x_2^{\delta_2} + \cdots + a_{n-1} x_{n-1}^{\delta_{n-1}}\right)\right]^{-\left(\frac{\eta}{\beta-1}-\frac{\gamma_n+1}{\delta_n}\right)}, \tag{7} $$
$$ x_i \ge 0,\ i = 1, 2, \ldots, n-1,\quad a_i > 0,\ \delta_i > 0,\ i = 1, 2, \ldots, n,\quad \beta > 1,\ \eta > 0,\quad \frac{\eta}{\beta-1}-\frac{\gamma_n+1}{\delta_n} > 0,\ \gamma_n + 1 > 0. $$
In a similar way we can integrate out $X_1, X_2, \ldots, X_{i-1}, X_{i+1}, \ldots, X_{n-1}$. The marginal density of $X_i$, denoted by $f_2$, is then given by
$$ f_2(x_i) = k_2\, x_i^{\gamma_i} \left[1 + (\beta-1) a_i x_i^{\delta_i}\right]^{-\left(\frac{\eta}{\beta-1} - \frac{\gamma_1+1}{\delta_1} - \cdots - \frac{\gamma_{i-1}+1}{\delta_{i-1}} - \frac{\gamma_{i+1}+1}{\delta_{i+1}} - \cdots - \frac{\gamma_n+1}{\delta_n}\right)}, \tag{8} $$
where $x_i \ge 0$, $\beta > 1$, $\delta_i > 0$, $\eta > 0$,
$$ k_2 = \frac{\delta_i\,(a_i(\beta-1))^{\frac{\gamma_i+1}{\delta_i}}\, \Gamma\left(\frac{\eta}{\beta-1} - \frac{\gamma_1+1}{\delta_1} - \cdots - \frac{\gamma_{i-1}+1}{\delta_{i-1}} - \frac{\gamma_{i+1}+1}{\delta_{i+1}} - \cdots - \frac{\gamma_n+1}{\delta_n}\right)}{\Gamma\left(\frac{\gamma_i+1}{\delta_i}\right) \Gamma\left(\frac{\eta}{\beta-1} - \frac{\gamma_1+1}{\delta_1} - \cdots - \frac{\gamma_n+1}{\delta_n}\right)}, $$
$$ \gamma_i + 1 > 0,\quad \frac{\eta}{\beta-1} - \frac{\gamma_1+1}{\delta_1} - \cdots - \frac{\gamma_{i-1}+1}{\delta_{i-1}} - \frac{\gamma_{i+1}+1}{\delta_{i+1}} - \cdots - \frac{\gamma_n+1}{\delta_n} > 0,\quad \frac{\eta}{\beta-1} - \frac{\gamma_1+1}{\delta_1} - \cdots - \frac{\gamma_n+1}{\delta_n} > 0. $$
If we take any subset of $(X_1, \ldots, X_n)$, the marginal densities belong to the same family. In the limiting case they will also become independently distributed generalized gamma variables.
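As a numerical sanity check on (6)-(8) (an added illustration, not part of the original derivation), the sketch below takes $n = 2$, integrates $x_2$ out of the joint kernel of (3) by quadrature, and compares the result with the closed-form marginal (8) of $X_1$, whose constant $k_2$ is given above. The joint density is normalized numerically, so the check does not rely on (9), which is derived next; all parameter values are arbitrary assumptions of mine.

```python
import numpy as np
from scipy.special import gammaln
from scipy.integrate import quad, dblquad

# illustrative parameter choices (not prescribed by the paper)
gam1, gam2 = 1.0, 0.5    # gamma_1, gamma_2
d1, d2 = 2.0, 1.5        # delta_1, delta_2
a1, a2 = 1.0, 2.0
beta, eta = 1.3, 4.0

def joint_kernel(x1, x2):
    q = 1.0 + (beta - 1.0) * (a1 * x1**d1 + a2 * x2**d2)
    return x1**gam1 * x2**gam2 * q**(-eta / (beta - 1.0))

mass, _ = dblquad(lambda y, x: joint_kernel(x, y), 0, np.inf, 0, np.inf)

def marginal_numeric(x1):
    val, _ = quad(lambda x2: joint_kernel(x1, x2), 0, np.inf)
    return val / mass

def marginal_closed(x1):
    # f_2(x_1) from (8) with i = 1, n = 2
    r1, r2 = (gam1 + 1.0) / d1, (gam2 + 1.0) / d2
    rho = eta / (beta - 1.0) - r2            # exponent parameter in (8)
    log_k2 = (np.log(d1) + r1 * np.log(a1 * (beta - 1.0))
              + gammaln(rho) - gammaln(r1) - gammaln(rho - r1))
    return np.exp(log_k2) * x1**gam1 * (1.0 + (beta - 1.0) * a1 * x1**d1)**(-rho)

for x in (0.3, 1.0, 2.5):
    print(x, marginal_numeric(x), marginal_closed(x))   # the two columns agree
```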

Normalizing Constant

Integrating out $x_i$ from (8) and equating to 1, we get the normalizing constant $k_{\beta}$ as
$$ k_{\beta} = \frac{\delta_1 \delta_2 \cdots \delta_n\, (a_1(\beta-1))^{\frac{\gamma_1+1}{\delta_1}} (a_2(\beta-1))^{\frac{\gamma_2+1}{\delta_2}} \cdots (a_n(\beta-1))^{\frac{\gamma_n+1}{\delta_n}}\, \Gamma\left(\frac{\eta}{\beta-1}\right)}{\Gamma\left(\frac{\gamma_1+1}{\delta_1}\right) \Gamma\left(\frac{\gamma_2+1}{\delta_2}\right) \cdots \Gamma\left(\frac{\gamma_n+1}{\delta_n}\right) \Gamma\left(\frac{\eta}{\beta-1} - \frac{\gamma_1+1}{\delta_1} - \cdots - \frac{\gamma_n+1}{\delta_n}\right)}, \tag{9} $$
$$ \delta_i > 0,\ a_i > 0,\ \gamma_i + 1 > 0,\ i = 1, 2, \ldots, n,\quad \beta > 1,\ \eta > 0,\quad \frac{\eta}{\beta-1} - \frac{\gamma_1+1}{\delta_1} - \cdots - \frac{\gamma_n+1}{\delta_n} > 0. $$
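A quick numerical check of (9) for $n = 2$ (my own illustration, with arbitrary parameter values): the closed-form $k_{\beta}$ should equal the reciprocal of the integral of the unnormalized product in (3).

```python
import numpy as np
from scipy.special import gammaln
from scipy.integrate import dblquad

# illustrative parameters (any admissible values would do)
gam = np.array([1.0, 0.5]); dlt = np.array([2.0, 1.5])
a = np.array([1.0, 2.0]); beta, eta = 1.3, 4.0

r = (gam + 1.0) / dlt
s = eta / (beta - 1.0)

# closed-form normalizing constant (9), on the log scale for stability
log_kb = (np.sum(np.log(dlt) + r * np.log(a * (beta - 1.0)) - gammaln(r))
          + gammaln(s) - gammaln(s - r.sum()))
kb_closed = np.exp(log_kb)

def kernel(x1, x2):
    q = 1.0 + (beta - 1.0) * (a[0] * x1**dlt[0] + a[1] * x2**dlt[1])
    return x1**gam[0] * x2**gam[1] * q**(-s)

mass, _ = dblquad(lambda y, x: kernel(x, y), 0, np.inf, 0, np.inf)
print(kb_closed, 1.0 / mass)   # the two numbers coincide
```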

4. Joint Product Moment and Structural Representations

Let $(X_1, \ldots, X_n)$ have the multivariate extended gamma density (3). By observing the normalizing constant in (9), we can easily obtain the joint product moment for arbitrary $(h_1, \ldots, h_n)$:
$$ E\left(x_1^{h_1} x_2^{h_2} \cdots x_n^{h_n}\right) = k_{\beta}\, \frac{\Gamma\left(\frac{\gamma_1+h_1+1}{\delta_1}\right) \cdots \Gamma\left(\frac{\gamma_n+h_n+1}{\delta_n}\right) \Gamma\left(\frac{\eta}{\beta-1} - \frac{\gamma_1+h_1+1}{\delta_1} - \cdots - \frac{\gamma_n+h_n+1}{\delta_n}\right)}{\delta_1 \delta_2 \cdots \delta_n\, (a_1(\beta-1))^{\frac{\gamma_1+h_1+1}{\delta_1}} \cdots (a_n(\beta-1))^{\frac{\gamma_n+h_n+1}{\delta_n}}\, \Gamma\left(\frac{\eta}{\beta-1}\right)} $$
$$ = \frac{\Gamma\left(\frac{\eta}{\beta-1} - \frac{\gamma_1+h_1+1}{\delta_1} - \cdots - \frac{\gamma_n+h_n+1}{\delta_n}\right) \prod_{i=1}^{n} \Gamma\left(\frac{\gamma_i+h_i+1}{\delta_i}\right)}{\Gamma\left(\frac{\eta}{\beta-1} - \frac{\gamma_1+1}{\delta_1} - \cdots - \frac{\gamma_n+1}{\delta_n}\right) \prod_{i=1}^{n} \left[a_i(\beta-1)\right]^{\frac{h_i}{\delta_i}} \Gamma\left(\frac{\gamma_i+1}{\delta_i}\right)}, \tag{10} $$
$$ \frac{\eta}{\beta-1} - \sum_{i=1}^{n} \frac{\gamma_i+h_i+1}{\delta_i} > 0,\ \gamma_i + h_i + 1 > 0,\ \frac{\eta}{\beta-1} - \sum_{i=1}^{n} \frac{\gamma_i+1}{\delta_i} > 0,\ \gamma_i + 1 > 0,\ a_i > 0,\ \beta > 1,\ \delta_i > 0,\ i = 1, 2, \ldots, n. $$
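The product-moment formula (10) can be checked numerically in the same bivariate setting (an added illustration with my own parameter and order choices): the closed-form value for, say, $h_1 = 1$, $h_2 = 2$ should agree with the corresponding double integral.

```python
import numpy as np
from scipy.special import gammaln
from scipy.integrate import dblquad

gam = np.array([1.0, 0.5]); dlt = np.array([2.0, 1.5])
a = np.array([1.0, 2.0]); beta, eta = 1.3, 6.0
h = np.array([1.0, 2.0])                      # orders of the moment

r  = (gam + 1.0) / dlt                        # (gamma_i + 1)/delta_i
rh = (gam + h + 1.0) / dlt                    # (gamma_i + h_i + 1)/delta_i
s  = eta / (beta - 1.0)

# second expression in (10), on the log scale
log_m = (gammaln(s - rh.sum()) - gammaln(s - r.sum())
         + np.sum(gammaln(rh) - gammaln(r) - (h / dlt) * np.log(a * (beta - 1.0))))
moment_closed = np.exp(log_m)

def density_kernel(x1, x2):
    q = 1.0 + (beta - 1.0) * (a[0] * x1**dlt[0] + a[1] * x2**dlt[1])
    return x1**gam[0] * x2**gam[1] * q**(-s)

mass, _ = dblquad(lambda y, x: density_kernel(x, y), 0, np.inf, 0, np.inf)
num, _ = dblquad(lambda y, x: x**h[0] * y**h[1] * density_kernel(x, y),
                 0, np.inf, 0, np.inf)
print(moment_closed, num / mass)              # should agree
```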
Property 1.
The joint product moment of the multivariate extended gamma density can be written as
$$ E\left(x_1^{h_1} x_2^{h_2} \cdots x_n^{h_n}\right) = \frac{\Gamma\left(\frac{\eta}{\beta-1} - \sum_{i=1}^{n} \frac{\gamma_i+h_i+1}{\delta_i}\right)}{\Gamma\left(\frac{\eta}{\beta-1} - \sum_{i=1}^{n} \frac{\gamma_i+1}{\delta_i}\right)} \prod_{i=1}^{n} E\left(y_i^{h_i}\right), \tag{11} $$
where the $Y_i$'s are generalized gamma random variables having density function
$$ f_{y}(y_i) = c_i\, y_i^{\gamma_i}\, e^{-a_i(\beta-1)\, y_i^{\delta_i}}, \qquad y_i \ge 0,\ \beta > 1,\ a_i > 0,\ \delta_i > 0, \tag{12} $$
where $c_i = \frac{\delta_i\, [a_i(\beta-1)]^{\frac{\gamma_i+1}{\delta_i}}}{\Gamma\left(\frac{\gamma_i+1}{\delta_i}\right)}$, $\gamma_i + 1 > 0$, $i = 1, 2, \ldots, n$, is the normalizing constant.
Property 2.
Letting $h_2 = \cdots = h_n = 0$ in (10), we get
$$ E\left(x_1^{h_1}\right) = \frac{\Gamma\left(\frac{\eta}{\beta-1} - \frac{\gamma_1+h_1+1}{\delta_1} - \frac{\gamma_2+1}{\delta_2} - \cdots - \frac{\gamma_n+1}{\delta_n}\right) \Gamma\left(\frac{\gamma_1+h_1+1}{\delta_1}\right)}{\Gamma\left(\frac{\eta}{\beta-1} - \frac{\gamma_1+1}{\delta_1} - \cdots - \frac{\gamma_n+1}{\delta_n}\right) \left[a_1(\beta-1)\right]^{\frac{h_1}{\delta_1}} \Gamma\left(\frac{\gamma_1+1}{\delta_1}\right)}, \tag{13} $$
$$ \frac{\eta}{\beta-1} - \frac{\gamma_1+h_1+1}{\delta_1} - \sum_{i=2}^{n} \frac{\gamma_i+1}{\delta_i} > 0,\ \frac{\gamma_1+h_1+1}{\delta_1} > 0,\ \frac{\eta}{\beta-1} - \sum_{i=1}^{n} \frac{\gamma_i+1}{\delta_i} > 0,\ \gamma_1 + 1 > 0,\ a_1 > 0,\ \beta > 1,\ \delta_1 > 0. $$
Equation (13) is the $h_1$th moment of a random variable with density function of the form (8),
$$ f_3(x_1) = k_3\, x_1^{\gamma_1} \left[1 + (\beta-1) a_1 x_1^{\delta_1}\right]^{-\left(\frac{\eta}{\beta-1} - \frac{\gamma_2+1}{\delta_2} - \cdots - \frac{\gamma_n+1}{\delta_n}\right)}, \tag{14} $$
where $k_3$ is the normalizing constant. Then
$$ E\left(x_1^{h_1}\right) = k_3 \int_{0}^{\infty} x_1^{\gamma_1 + h_1} \left[1 + a_1(\beta-1) x_1^{\delta_1}\right]^{-\left[\frac{\eta}{\beta-1} - \frac{\gamma_2+1}{\delta_2} - \cdots - \frac{\gamma_n+1}{\delta_n}\right]} \mathrm{d}x_1. \tag{15} $$
Making the substitution $y = a_1(\beta-1) x_1^{\delta_1}$, the integral takes the form of a type-2 beta integral, and we easily obtain the $h_1$th moment as in (13).
Property 3.
Letting $h_3 = \cdots = h_n = 0$ in (10), we get
$$ E\left(x_1^{h_1} x_2^{h_2}\right) = \frac{\Gamma\left(\frac{\eta}{\beta-1} - \frac{\gamma_1+h_1+1}{\delta_1} - \frac{\gamma_2+h_2+1}{\delta_2} - \frac{\gamma_3+1}{\delta_3} - \cdots - \frac{\gamma_n+1}{\delta_n}\right) \Gamma\left(\frac{\gamma_1+h_1+1}{\delta_1}\right) \Gamma\left(\frac{\gamma_2+h_2+1}{\delta_2}\right)}{\Gamma\left(\frac{\eta}{\beta-1} - \frac{\gamma_1+1}{\delta_1} - \cdots - \frac{\gamma_n+1}{\delta_n}\right) \left[a_1(\beta-1)\right]^{\frac{h_1}{\delta_1}} \left[a_2(\beta-1)\right]^{\frac{h_2}{\delta_2}} \Gamma\left(\frac{\gamma_1+1}{\delta_1}\right) \Gamma\left(\frac{\gamma_2+1}{\delta_2}\right)}, \tag{16} $$
$$ \frac{\eta}{\beta-1} - \frac{\gamma_1+h_1+1}{\delta_1} - \frac{\gamma_2+h_2+1}{\delta_2} - \sum_{i=3}^{n} \frac{\gamma_i+1}{\delta_i} > 0,\ \frac{\eta}{\beta-1} - \sum_{i=1}^{n} \frac{\gamma_i+1}{\delta_i} > 0,\ \beta > 1,\ \gamma_i + h_i + 1 > 0,\ \gamma_i + 1 > 0,\ a_i > 0,\ \delta_i > 0,\ i = 1, 2. $$
This is the joint product moment of the bivariate extended gamma density, denoted by $f_4$ and given by
$$ f_4(x_1, x_2) = k_4\, x_1^{\gamma_1} x_2^{\gamma_2} \left[1 + (\beta-1)\left(a_1 x_1^{\delta_1} + a_2 x_2^{\delta_2}\right)\right]^{-\left(\frac{\eta}{\beta-1} - \frac{\gamma_3+1}{\delta_3} - \cdots - \frac{\gamma_n+1}{\delta_n}\right)}, \tag{17} $$
where $k_4$ is the normalizing constant. Equation (17) is obtained by integrating out $X_3, \ldots, X_n$ from (3). Putting $h_4 = \cdots = h_n = 0$ in (10), we get the joint product moment of the trivariate extended gamma density, and so on.
Theorem 1.
When $(X_1, \ldots, X_n)$ has the density in (3), then
$$ E\left\{x_1^{h_1} \cdots x_n^{h_n} \left[1 + (\beta-1)\left(a_1 x_1^{\delta_1} + \cdots + a_n x_n^{\delta_n}\right)\right]^{h}\right\} = k_{\beta}\, \frac{\Gamma\left(\frac{\gamma_1+h_1+1}{\delta_1}\right) \cdots \Gamma\left(\frac{\gamma_n+h_n+1}{\delta_n}\right) \Gamma\left(\frac{\eta}{\beta-1} - h - \frac{\gamma_1+h_1+1}{\delta_1} - \cdots - \frac{\gamma_n+h_n+1}{\delta_n}\right)}{\delta_1 \delta_2 \cdots \delta_n\, (a_1(\beta-1))^{\frac{\gamma_1+h_1+1}{\delta_1}} \cdots (a_n(\beta-1))^{\frac{\gamma_n+h_n+1}{\delta_n}}\, \Gamma\left(\frac{\eta}{\beta-1} - h\right)} $$
$$ = \frac{\Gamma\left(\frac{\eta}{\beta-1}\right) \Gamma\left(\frac{\eta}{\beta-1} - h - \frac{\gamma_1+h_1+1}{\delta_1} - \cdots - \frac{\gamma_n+h_n+1}{\delta_n}\right) \prod_{i=1}^{n} \Gamma\left(\frac{\gamma_i+h_i+1}{\delta_i}\right)}{\Gamma\left(\frac{\eta}{\beta-1} - \frac{\gamma_1+1}{\delta_1} - \cdots - \frac{\gamma_n+1}{\delta_n}\right) \Gamma\left(\frac{\eta}{\beta-1} - h\right) \prod_{i=1}^{n} \left[a_i(\beta-1)\right]^{\frac{h_i}{\delta_i}} \Gamma\left(\frac{\gamma_i+1}{\delta_i}\right)}, \tag{18} $$
$$ \frac{\eta}{\beta-1} - h - \sum_{i=1}^{n} \frac{\gamma_i+h_i+1}{\delta_i} > 0,\ \gamma_i + h_i + 1 > 0,\ \frac{\eta}{\beta-1} - \sum_{i=1}^{n} \frac{\gamma_i+1}{\delta_i} > 0,\ \gamma_i + 1 > 0,\ a_i > 0,\ \beta > 1,\ \eta > 0,\ \delta_i > 0,\ i = 1, 2, \ldots, n. $$
Corollary 1.
When $(X_1, \ldots, X_n)$ has the density in (3), then, putting $h_1 = \cdots = h_n = 0$ in (18),
$$ E\left\{\left[1 + (\beta-1)\left(a_1 x_1^{\delta_1} + \cdots + a_n x_n^{\delta_n}\right)\right]^{h}\right\} = \frac{\Gamma\left(\frac{\eta}{\beta-1}\right) \Gamma\left(\frac{\eta}{\beta-1} - h - \frac{\gamma_1+1}{\delta_1} - \cdots - \frac{\gamma_n+1}{\delta_n}\right)}{\Gamma\left(\frac{\eta}{\beta-1} - \frac{\gamma_1+1}{\delta_1} - \cdots - \frac{\gamma_n+1}{\delta_n}\right) \Gamma\left(\frac{\eta}{\beta-1} - h\right)}, \tag{19} $$
$$ \frac{\eta}{\beta-1} - h - \sum_{i=1}^{n} \frac{\gamma_i+1}{\delta_i} > 0,\ \frac{\eta}{\beta-1} - h > 0,\ \frac{\eta}{\beta-1} - \sum_{i=1}^{n} \frac{\gamma_i+1}{\delta_i} > 0,\ a_i > 0,\ \beta > 1,\ \eta > 0,\ \delta_i > 0,\ i = 1, 2, \ldots, n. $$

4.1. Variance-Covariance Matrix

Let $X = (X_1, \ldots, X_n)'$ be an $n \times 1$ vector. The variance-covariance matrix is obtained as $E[(X - E(X))(X - E(X))']$. Its elements are of the form
$$ E\left[(X - E(X))(X - E(X))'\right] = \begin{bmatrix} \operatorname{Var}(x_1) & \operatorname{Cov}(x_1, x_2) & \cdots & \operatorname{Cov}(x_1, x_n) \\ \operatorname{Cov}(x_2, x_1) & \operatorname{Var}(x_2) & \cdots & \operatorname{Cov}(x_2, x_n) \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{Cov}(x_n, x_1) & \operatorname{Cov}(x_n, x_2) & \cdots & \operatorname{Var}(x_n) \end{bmatrix}, \tag{20} $$
where
$$ \operatorname{Cov}(x_i, x_j) = E(x_i x_j) - E(x_i)E(x_j), \qquad i, j = 1, 2, \ldots, n,\ i \ne j, \tag{21} $$
and
$$ \operatorname{Var}(x_i) = E(x_i^2) - \left[E(x_i)\right]^2, \qquad i = 1, 2, \ldots, n. \tag{22} $$
The $E(x_i x_j)$'s are obtained from (10) by putting $h_i = h_j = 1$ and all other $h_k = 0$, $k = 1, 2, \ldots, n$, $k \ne i, j$. The $E(x_i)$'s and $E(x_i^2)$'s are obtained from (10) by putting $h_i = 1$ and $h_i = 2$, respectively, with all other $h_k = 0$, $k = 1, 2, \ldots, n$, $k \ne i$. For example,
$$ E(x_1 x_2) = \int_{0}^{\infty}\!\int_{0}^{\infty} x_1 x_2\, f_4(x_1, x_2)\, \mathrm{d}x_1\, \mathrm{d}x_2. \tag{23} $$
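The following sketch (added here for illustration, with arbitrary parameter values) builds the variance-covariance matrix of a bivariate extended gamma directly from the moment formula (10), using the recipe just described: $E(x_i x_j)$ with $h_i = h_j = 1$, $E(x_i)$ with $h_i = 1$, and $E(x_i^2)$ with $h_i = 2$.

```python
import numpy as np
from scipy.special import gammaln

gam = np.array([1.0, 0.5]); dlt = np.array([2.0, 1.5])
a = np.array([1.0, 2.0]); beta, eta = 1.3, 6.0

def moment(h):
    """Joint product moment E(x1^h1 x2^h2) from Equation (10)."""
    h = np.asarray(h, dtype=float)
    r  = (gam + 1.0) / dlt
    rh = (gam + h + 1.0) / dlt
    s  = eta / (beta - 1.0)
    log_m = (gammaln(s - rh.sum()) - gammaln(s - r.sum())
             + np.sum(gammaln(rh) - gammaln(r)
                      - (h / dlt) * np.log(a * (beta - 1.0))))
    return np.exp(log_m)

n = 2
mean = np.array([moment(np.eye(n)[i]) for i in range(n)])      # E(x_i)
cov = np.empty((n, n))
for i in range(n):
    for j in range(n):
        h = np.zeros(n); h[i] += 1; h[j] += 1                  # h_i = h_j = 1, or h_i = 2 on the diagonal
        cov[i, j] = moment(h) - mean[i] * mean[j]
print(cov)   # the variance-covariance matrix (20)
```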

4.2. Normalizing Constant

Integrating out $x_i$ from (8) and equating to 1, we again obtain the normalizing constant $k_{\beta}$ as
$$ k_{\beta} = \frac{\delta_1 \delta_2 \cdots \delta_n\, (a_1(\beta-1))^{\frac{\gamma_1+1}{\delta_1}} (a_2(\beta-1))^{\frac{\gamma_2+1}{\delta_2}} \cdots (a_n(\beta-1))^{\frac{\gamma_n+1}{\delta_n}}\, \Gamma\left(\frac{\eta}{\beta-1}\right)}{\Gamma\left(\frac{\gamma_1+1}{\delta_1}\right) \Gamma\left(\frac{\gamma_2+1}{\delta_2}\right) \cdots \Gamma\left(\frac{\gamma_n+1}{\delta_n}\right) \Gamma\left(\frac{\eta}{\beta-1} - \frac{\gamma_1+1}{\delta_1} - \cdots - \frac{\gamma_n+1}{\delta_n}\right)}, $$
in agreement with (9).

5. Regression Type Models and Limiting Approaches

The conditional density of $X_i$ given $X_1, X_2, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n$, denoted by $f_5$, is given by
$$ f_5(x_i \mid x_1, x_2, \ldots, x_{i-1}, x_{i+1}, \ldots, x_n) = \frac{f_{\beta}(x_1, x_2, \ldots, x_n)}{f_6(x_1, x_2, \ldots, x_{i-1}, x_{i+1}, \ldots, x_n)} $$
$$ = \frac{\delta_i\, [a_i(\beta-1)]^{\frac{\gamma_i+1}{\delta_i}}\, \Gamma\left(\frac{\eta}{\beta-1}\right)}{\Gamma\left(\frac{\gamma_i+1}{\delta_i}\right) \Gamma\left(\frac{\eta}{\beta-1} - \frac{\gamma_i+1}{\delta_i}\right)}\, x_i^{\gamma_i} \left[1 + \frac{(\beta-1) a_i x_i^{\delta_i}}{1 + (\beta-1)\left(a_1 x_1^{\delta_1} + a_2 x_2^{\delta_2} + \cdots + a_{i-1} x_{i-1}^{\delta_{i-1}} + a_{i+1} x_{i+1}^{\delta_{i+1}} + \cdots + a_n x_n^{\delta_n}\right)}\right]^{-\frac{\eta}{\beta-1}} $$
$$ \times \left[1 + (\beta-1)\left(a_1 x_1^{\delta_1} + a_2 x_2^{\delta_2} + \cdots + a_{i-1} x_{i-1}^{\delta_{i-1}} + a_{i+1} x_{i+1}^{\delta_{i+1}} + \cdots + a_n x_n^{\delta_n}\right)\right]^{-\frac{\gamma_i+1}{\delta_i}}, \tag{24} $$
where $f_6$ is the joint density of $X_1, X_2, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n$. Taking the limit as $\beta \to 1$ in Equation (24), we see that the conditional density takes the form of a generalized gamma density, given by
$$ \lim_{\beta \to 1} f_5(x_i \mid x_1, x_2, \ldots, x_{i-1}, x_{i+1}, \ldots, x_n) = \frac{\delta_i\, (\eta a_i)^{\frac{\gamma_i+1}{\delta_i}}}{\Gamma\left(\frac{\gamma_i+1}{\delta_i}\right)}\, x_i^{\gamma_i}\, e^{-\eta a_i x_i^{\delta_i}}, \tag{25} $$
$$ x_i \ge 0,\ \delta_i > 0,\ \eta > 0,\ \gamma_i + 1 > 0. $$
Theorem 2.
Let $(X_1, X_2, \ldots, X_n)$ have the multivariate extended gamma density (3). Then the limiting case, as $\beta \to 1$, of the conditional density $f_5(x_i \mid x_1, x_2, \ldots, x_{i-1}, x_{i+1}, \ldots, x_n)$ is the generalized gamma density (25).
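Theorem 2 can be illustrated numerically (my own sketch, with arbitrary parameter values): for $n = 2$, fix $x_1$, form the conditional density (24) of $X_2$ given $X_1 = x_1$, confirm it integrates to one, and compare it with the generalized gamma limit (25) for a value of $\beta$ close to 1.

```python
import numpy as np
from scipy.special import gammaln
from scipy.integrate import quad

gam1, gam2 = 1.0, 0.5
d1, d2 = 2.0, 1.5
a1, a2 = 1.0, 2.0
eta, x1_fixed = 4.0, 0.7       # conditioning value x_1 (arbitrary)

def conditional(x2, beta):
    """Conditional density (24) of X_2 given X_1 = x1_fixed (n = 2)."""
    r2 = (gam2 + 1.0) / d2
    s = eta / (beta - 1.0)
    D = 1.0 + (beta - 1.0) * a1 * x1_fixed**d1
    log_c = (np.log(d2) + r2 * np.log(a2 * (beta - 1.0))
             + gammaln(s) - gammaln(r2) - gammaln(s - r2))
    return (np.exp(log_c) * x2**gam2
            * (1.0 + (beta - 1.0) * a2 * x2**d2 / D)**(-s) * D**(-r2))

def gg_limit(x2):
    """Generalized gamma limit (25)."""
    r2 = (gam2 + 1.0) / d2
    log_c = np.log(d2) + r2 * np.log(eta * a2) - gammaln(r2)
    return np.exp(log_c) * x2**gam2 * np.exp(-eta * a2 * x2**d2)

print(quad(lambda t: conditional(t, beta=1.3), 0, np.inf)[0])      # ~ 1.0
xs = np.linspace(0.05, 2.0, 5)
print(np.abs(conditional(xs, beta=1.001) - gg_limit(xs)).max())    # small
```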

Best Predictor

The conditional expectation $E(x_n \mid x_1, \ldots, x_{n-1})$ is the best predictor of $X_n$, best in the sense of minimizing the expected squared error. Variables which are preassigned are usually called independent variables and the others are called dependent variables. In this context, $X_n$ is the dependent (predicted) variable and $X_1, \ldots, X_{n-1}$ are the preassigned or independent variables. This 'best' predictor is the regression function of $X_n$ on $X_1, \ldots, X_{n-1}$.
$$ E(x_n \mid x_1, \ldots, x_{n-1}) = \int_{x_n = 0}^{\infty} x_n\, f_7(x_n \mid x_1, \ldots, x_{n-1})\, \mathrm{d}x_n $$
$$ = \frac{\delta_n\, [a_n(\beta-1)]^{\frac{\gamma_n+1}{\delta_n}}\, \Gamma\left(\frac{\eta}{\beta-1}\right)}{\Gamma\left(\frac{\gamma_n+1}{\delta_n}\right) \Gamma\left(\frac{\eta}{\beta-1} - \frac{\gamma_n+1}{\delta_n}\right)} \left[1 + (\beta-1)\left(a_1 x_1^{\delta_1} + a_2 x_2^{\delta_2} + \cdots + a_{n-1} x_{n-1}^{\delta_{n-1}}\right)\right]^{-\frac{\gamma_n+1}{\delta_n} + \frac{\eta}{\beta-1}} $$
$$ \times \int_{x_n = 0}^{\infty} x_n^{\gamma_n + 1} \left[1 + (\beta-1)\left(a_1 x_1^{\delta_1} + a_2 x_2^{\delta_2} + \cdots + a_n x_n^{\delta_n}\right)\right]^{-\frac{\eta}{\beta-1}} \mathrm{d}x_n. \tag{26} $$
Here $f_7(x_n \mid x_1, \ldots, x_{n-1})$ is the conditional density (24) with $i = n$. We can evaluate the above integral as in the case of Equation (6). After simplification we get the best predictor of $X_n$ at preassigned values of $X_1, \ldots, X_{n-1}$ as
$$ E(x_n \mid x_1, \ldots, x_{n-1}) = \left[a_n(\beta-1)\right]^{-\frac{1}{\delta_n}}\, \frac{\Gamma\left(\frac{\eta}{\beta-1} - \frac{\gamma_n+2}{\delta_n}\right) \Gamma\left(\frac{\gamma_n+2}{\delta_n}\right)}{\Gamma\left(\frac{\gamma_n+1}{\delta_n}\right) \Gamma\left(\frac{\eta}{\beta-1} - \frac{\gamma_n+1}{\delta_n}\right)} \left[1 + (\beta-1)\left(a_1 x_1^{\delta_1} + a_2 x_2^{\delta_2} + \cdots + a_{n-1} x_{n-1}^{\delta_{n-1}}\right)\right]^{\frac{1}{\delta_n}}, \tag{27} $$
$$ \delta_n > 0,\ a_n > 0,\ \beta > 1,\ x_i > 0,\ i = 1, 2, \ldots, n-1,\quad \frac{\eta}{\beta-1} - \frac{\gamma_n+2}{\delta_n} > 0,\ \gamma_n + 1 > 0. $$
We can take the limit $\beta \to 1$ in (27). To take the limit, apply Stirling's approximation for gamma functions (see, for example, [15]),
$$ \Gamma(z + a) \approx (2\pi)^{\frac{1}{2}}\, z^{z + a - \frac{1}{2}}\, e^{-z}, \qquad \text{for } |z| \to \infty \text{ and } a \text{ bounded}, \tag{28} $$
to the gamma functions in (27). Then we get
$$ \lim_{\beta \to 1} E(x_n \mid x_1, \ldots, x_{n-1}) = \frac{\Gamma\left(\frac{\gamma_n+2}{\delta_n}\right)}{(a_n \eta)^{\frac{1}{\delta_n}}\, \Gamma\left(\frac{\gamma_n+1}{\delta_n}\right)}, \tag{29} $$
which is the first moment of the generalized gamma density given in (25).
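The regression function (27) can be cross-checked numerically. The sketch below is an added illustration under my own parameter choices: for $n = 2$, the conditional mean of $X_2$ given $X_1 = x_1$, computed by quadrature directly from the joint kernel of (3), should match the closed form (27) as reconstructed above, and it increases in the conditioning value, as the positive exponent $1/\delta_n$ indicates.

```python
import numpy as np
from scipy.special import gammaln
from scipy.integrate import quad

gam1, gam2 = 1.0, 0.5
d1, d2 = 2.0, 1.5
a1, a2 = 1.0, 2.0
beta, eta = 1.3, 6.0
s = eta / (beta - 1.0)

def joint_kernel(x1, x2):
    # unnormalized bivariate density (3); the constant cancels in the ratio below
    q = 1.0 + (beta - 1.0) * (a1 * x1**d1 + a2 * x2**d2)
    return x1**gam1 * x2**gam2 * q**(-s)

def cond_mean_numeric(x1):
    num = quad(lambda t: t * joint_kernel(x1, t), 0, np.inf)[0]
    den = quad(lambda t: joint_kernel(x1, t), 0, np.inf)[0]
    return num / den

def best_predictor(x1):
    """Closed-form regression function E(X_2 | X_1 = x_1), Equation (27) with n = 2."""
    log_val = (-(1.0 / d2) * np.log(a2 * (beta - 1.0))
               + gammaln(s - (gam2 + 2.0) / d2) + gammaln((gam2 + 2.0) / d2)
               - gammaln((gam2 + 1.0) / d2) - gammaln(s - (gam2 + 1.0) / d2))
    return np.exp(log_val) * (1.0 + (beta - 1.0) * a1 * x1**d1)**(1.0 / d2)

for x1 in (0.3, 1.0, 2.0):
    print(x1, cond_mean_numeric(x1), best_predictor(x1))   # agree; increasing in x1
```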

6. Multivariate Extended Gamma When β < 1

Consider the case where the pathway parameter $\beta$ is less than 1. The pathway model then has the form
$$ g(x) = K\, x^{\gamma} \left[1 - a(1-\beta) x^{\delta}\right]^{\frac{\eta}{1-\beta}}, \qquad \beta < 1,\ a > 0,\ \delta > 0,\ \eta > 0,\ 1 - a(1-\beta)x^{\delta} \ge 0, \tag{30} $$
where $K$ is the normalizing constant. Here $g(x)$ is the generalized type-1 beta model. Let us consider a multivariate case of the above model,
$$ g_{\beta}(x_1, x_2, \ldots, x_n) = K_{\beta}\, x_1^{\gamma_1} x_2^{\gamma_2} \cdots x_n^{\gamma_n} \left[1 - (1-\beta)\left(a_1 x_1^{\delta_1} + a_2 x_2^{\delta_2} + \cdots + a_n x_n^{\delta_n}\right)\right]^{\frac{\eta}{1-\beta}}, \tag{31} $$
$$ \beta < 1,\ \eta > 0,\ \delta_i > 0,\ a_i > 0,\ i = 1, 2, \ldots, n,\quad 1 - (1-\beta)\left(a_1 x_1^{\delta_1} + a_2 x_2^{\delta_2} + \cdots + a_n x_n^{\delta_n}\right) \ge 0, $$
where $K_{\beta}$ is the normalizing constant, which can be obtained by solving
$$ K_{\beta} \int \cdots \int x_1^{\gamma_1} \cdots x_n^{\gamma_n} \left[1 - (1-\beta)\left(a_1 x_1^{\delta_1} + \cdots + a_n x_n^{\delta_n}\right)\right]^{\frac{\eta}{1-\beta}} \mathrm{d}x_1 \cdots \mathrm{d}x_n = 1. \tag{32} $$
Integration over $x_n$ yields
$$ K_{\beta} \int \cdots \int x_1^{\gamma_1} x_2^{\gamma_2} \cdots x_{n-1}^{\gamma_{n-1}} \left[1 - (1-\beta)\left(a_1 x_1^{\delta_1} + \cdots + a_{n-1} x_{n-1}^{\delta_{n-1}}\right)\right]^{\frac{\eta}{1-\beta}} \int_{0}^{u} x_n^{\gamma_n} \left[1 - C_1 x_n^{\delta_n}\right]^{\frac{\eta}{1-\beta}} \mathrm{d}x_n\, \mathrm{d}x_1 \cdots \mathrm{d}x_{n-1}, \tag{33} $$
where $u = \left[\frac{1 - (1-\beta)\left(a_1 x_1^{\delta_1} + \cdots + a_{n-1} x_{n-1}^{\delta_{n-1}}\right)}{a_n(1-\beta)}\right]^{\frac{1}{\delta_n}}$ and $C_1 = \frac{(1-\beta) a_n}{1 - (1-\beta)\left(a_1 x_1^{\delta_1} + \cdots + a_{n-1} x_{n-1}^{\delta_{n-1}}\right)}$. Letting $y = C_1 x_n^{\delta_n}$, the inner integral becomes a type-1 Dirichlet integral, and the normalizing constant can be obtained as
$$ K_{\beta} = \frac{\prod_{j=1}^{n}\left[\delta_j\, \left((1-\beta) a_j\right)^{\frac{\gamma_j+1}{\delta_j}}\right] \Gamma\left(1 + \frac{\eta}{1-\beta} + \frac{\gamma_1+1}{\delta_1} + \cdots + \frac{\gamma_n+1}{\delta_n}\right)}{\Gamma\left(\frac{\gamma_1+1}{\delta_1}\right) \cdots \Gamma\left(\frac{\gamma_n+1}{\delta_n}\right) \Gamma\left(1 + \frac{\eta}{1-\beta}\right)}. \tag{34} $$
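A numerical check of (34) for $n = 2$ (an added illustration with arbitrary parameter values): integrate the unnormalized kernel of (31) over its bounded support $\{(x_1, x_2): x_i \ge 0,\ (1-\beta)(a_1 x_1^{\delta_1} + a_2 x_2^{\delta_2}) \le 1\}$ and compare the reciprocal with the closed form.

```python
import numpy as np
from scipy.special import gammaln
from scipy.integrate import dblquad

gam = np.array([1.0, 0.5]); dlt = np.array([2.0, 1.5])
a = np.array([1.0, 2.0]); beta, eta = 0.6, 3.0       # beta < 1 here

r = (gam + 1.0) / dlt
m = eta / (1.0 - beta)

# closed-form K_beta from (34), on the log scale
log_K = (np.sum(np.log(dlt) + r * np.log((1.0 - beta) * a) - gammaln(r))
         + gammaln(1.0 + m + r.sum()) - gammaln(1.0 + m))
K_closed = np.exp(log_K)

def kernel(x1, x2):
    t = 1.0 - (1.0 - beta) * (a[0] * x1**dlt[0] + a[1] * x2**dlt[1])
    return x1**gam[0] * x2**gam[1] * t**m if t > 0 else 0.0

x1_max = (1.0 / ((1.0 - beta) * a[0]))**(1.0 / dlt[0])
def x2_max(x1):
    rem = 1.0 - (1.0 - beta) * a[0] * x1**dlt[0]
    return (rem / ((1.0 - beta) * a[1]))**(1.0 / dlt[1])

mass, _ = dblquad(lambda y, x: kernel(x, y), 0, x1_max, 0, x2_max)
print(K_closed, 1.0 / mass)   # should coincide
```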
When $\beta \to 1$, (31) becomes the density of independently distributed generalized gamma variables. By observing the normalizing constant in (34), we can easily obtain the joint product moment for arbitrary $(h_1, \ldots, h_n)$:
$$ E\left(x_1^{h_1} x_2^{h_2} \cdots x_n^{h_n}\right) = K_{\beta}\, \frac{\Gamma\left(\frac{\gamma_1+h_1+1}{\delta_1}\right) \cdots \Gamma\left(\frac{\gamma_n+h_n+1}{\delta_n}\right) \Gamma\left(1 + \frac{\eta}{1-\beta}\right)}{\prod_{j=1}^{n}\left[\delta_j\, \left((1-\beta) a_j\right)^{\frac{\gamma_j+h_j+1}{\delta_j}}\right] \Gamma\left(1 + \frac{\eta}{1-\beta} + \frac{\gamma_1+h_1+1}{\delta_1} + \cdots + \frac{\gamma_n+h_n+1}{\delta_n}\right)} $$
$$ = \frac{\Gamma\left(1 + \frac{\eta}{1-\beta} + \frac{\gamma_1+1}{\delta_1} + \cdots + \frac{\gamma_n+1}{\delta_n}\right) \Gamma\left(\frac{\gamma_1+h_1+1}{\delta_1}\right) \cdots \Gamma\left(\frac{\gamma_n+h_n+1}{\delta_n}\right)}{\prod_{j=1}^{n}\left[\left((1-\beta) a_j\right)^{\frac{h_j}{\delta_j}}\right] \Gamma\left(1 + \frac{\eta}{1-\beta} + \frac{\gamma_1+h_1+1}{\delta_1} + \cdots + \frac{\gamma_n+h_n+1}{\delta_n}\right) \Gamma\left(\frac{\gamma_1+1}{\delta_1}\right) \cdots \Gamma\left(\frac{\gamma_n+1}{\delta_n}\right)}, \tag{35} $$
$$ \gamma_j + h_j + 1 > 0,\ \gamma_j + 1 > 0,\ a_j > 0,\ \beta < 1,\ \delta_j > 0,\ j = 1, 2, \ldots, n. $$
Letting $h_2 = \cdots = h_n = 0$ in (35), we get
$$ E\left(x_1^{h_1}\right) = \frac{\Gamma\left(1 + \frac{\eta}{1-\beta} + \frac{\gamma_1+1}{\delta_1} + \cdots + \frac{\gamma_n+1}{\delta_n}\right) \Gamma\left(\frac{\gamma_1+h_1+1}{\delta_1}\right)}{\left[(1-\beta) a_1\right]^{\frac{h_1}{\delta_1}} \Gamma\left(1 + \frac{\eta}{1-\beta} + \frac{\gamma_1+h_1+1}{\delta_1} + \frac{\gamma_2+1}{\delta_2} + \cdots + \frac{\gamma_n+1}{\delta_n}\right) \Gamma\left(\frac{\gamma_1+1}{\delta_1}\right)}, \tag{36} $$
$$ \gamma_1 + h_1 + 1 > 0,\ \gamma_j + 1 > 0,\ a_1 > 0,\ \beta < 1,\ \delta_j > 0,\ \eta > 0,\ j = 1, 2, \ldots, n. $$
Equation (36) is the $h_1$th moment of a random variable with density function
$$ g_1(x_1) = K_1\, x_1^{\gamma_1} \left[1 - (1-\beta) a_1 x_1^{\delta_1}\right]^{\frac{\eta}{1-\beta} + \frac{\gamma_2+1}{\delta_2} + \cdots + \frac{\gamma_n+1}{\delta_n}}, \tag{37} $$
where $K_1$ is the normalizing constant.
Letting $h_3 = \cdots = h_n = 0$ in (35), we get
$$ E\left(x_1^{h_1} x_2^{h_2}\right) = \frac{\Gamma\left(1 + \frac{\eta}{1-\beta} + \frac{\gamma_1+1}{\delta_1} + \cdots + \frac{\gamma_n+1}{\delta_n}\right) \Gamma\left(\frac{\gamma_1+h_1+1}{\delta_1}\right) \Gamma\left(\frac{\gamma_2+h_2+1}{\delta_2}\right)}{\prod_{j=1}^{2}\left[\left((1-\beta) a_j\right)^{\frac{h_j}{\delta_j}}\right] \Gamma\left(1 + \frac{\eta}{1-\beta} + \sum_{i=1}^{2} \frac{\gamma_i+h_i+1}{\delta_i} + \sum_{j=3}^{n} \frac{\gamma_j+1}{\delta_j}\right) \Gamma\left(\frac{\gamma_1+1}{\delta_1}\right) \Gamma\left(\frac{\gamma_2+1}{\delta_2}\right)}, \tag{38} $$
$$ \gamma_1 + h_1 + 1 > 0,\ \gamma_2 + h_2 + 1 > 0,\ \gamma_j + 1 > 0,\ a_1 > 0,\ a_2 > 0,\ \beta < 1,\ \delta_j > 0,\ j = 1, 2, \ldots, n. $$
Proceeding in a similar way as in Section 4.1, we can deduce the variance-covariance matrix of the multivariate extended gamma for $\beta < 1$.

7. Conclusions

Multivariate counterparts of the extended generalized gamma density are considered and some of their properties are discussed. Here the variables are not independently distributed, but when the pathway parameter $\beta \to 1$, the variables $X_1, X_2, \ldots, X_n$ become independently distributed generalized gamma variables. The joint product moment of the multivariate extended gamma is obtained and some of its properties are discussed. We can see that the limiting case of the conditional density of this multivariate extended gamma is a generalized gamma density. A graphical representation of the pathway is given in Figure 1, Figure 2, Figure 3 and Figure 4.

Acknowledgments

The author acknowledges gratefully the encouragements given by Professor A. M. Mathai, Department of Mathematics and Statistics, McGill University, Montreal, QC, Canada in this work.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Mathai, A.M. A Pathway to matrix-variate gamma and normal densities. Linear Algebra Appl. 2005, 396, 317–328.
  2. Joseph, D.P. Gamma distribution and extensions by using pathway idea. Stat. Pap. 2011, 52, 309–325.
  3. Beck, C.; Cohen, E.G.D. Superstatistics. Physica A 2003, 322, 267–275.
  4. Beck, C. Stretched exponentials from superstatistics. Physica A 2006, 365, 96–101.
  5. Tsallis, C. Possible generalizations of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487.
  6. Mathai, A.M.; Haubold, H.J. Pathway model, superstatistics, Tsallis statistics and a generalized measure of entropy. Physica A 2007, 375, 110–122.
  7. Kotz, S.; Balakrishnan, N.; Johnson, N.L. Continuous Multivariate Distributions; John Wiley & Sons, Inc.: New York, NY, USA, 2000.
  8. Mathai, A.M.; Moschopoulos, P.G. On a form of multivariate gamma distribution. Ann. Inst. Stat. Math. 1992, 44, 106.
  9. Furman, E. On a multivariate gamma distribution. Stat. Probab. Lett. 2008, 78, 2353–2360.
  10. Mathai, A.M.; Provost, S.B. On q-logistic and related models. IEEE Trans. Reliab. 2006, 55, 237–244.
  11. Haubold, H.J.; Mathai, A.M.; Thomas, S. An entropic pathway to multivariate Gaussian density. arXiv 2007, arXiv:0709.3820v.
  12. Tsallis, C. Nonextensive Statistical Mechanics and Thermodynamics. Braz. J. Phys. 1999, 29, 1–35.
  13. Thomas, S.; Jacob, J. A generalized Dirichlet model. Stat. Probab. Lett. 2006, 76, 1761–1767.
  14. Thomas, S.; Thannippara, A.; Mathai, A.M. On a matrix-variate generalized type-2 Dirichlet density. Adv. Appl. Stat. 2008, 8, 37–56.
  15. Mathai, A.M. A Handbook of Generalized Special Functions for Statistical and Physical Sciences; Oxford University Press: Oxford, UK, 1993; pp. 58–116.
Figure 1. The graph of $g_1(x)$ for $\gamma = 1$, $a = 1$, $\delta = 2$, $\eta = 1$ and various values of $\beta$.
Figure 2. $\beta = 1.2$.
Figure 3. $\beta = 1.5$.
Figure 4. $\beta = 2$.
Figure 5. The graph of the bivariate type-2 Dirichlet with $\gamma_1 = \gamma_2 = 1$, $\eta = 6$.
