2.2.4. Lifetime Analysis

The BS model is commonly used to explain survival and material resistance data. The survival and risk (or hazard) functions are important indicators in such fields. For the FBS model these functions are given next.

**Proposition 4.** *Let T* ∼ *FBS*(*α*, *β*, *δ*, *λ*) *with α*, *β* ∈ R<sup>+</sup> *and δ*, *λ* ∈ R*. Then*


$$r(t) = \begin{cases} \frac{c\_{\delta}\mu\_{t}^{\epsilon}\phi(|a\_{t}|+\delta)\Phi(\lambda a\_{t})}{1-c\_{\delta}\Phi\_{\text{BN}\_{\lambda}}\left(\frac{\lambda\delta}{\sqrt{1+\lambda^{2}}},a\_{t}-\delta\right)}, & \text{if } \ 0 < t < \beta\\ \frac{c\_{\delta}\mu\_{t}^{\epsilon}\phi(|a\_{t}|+\delta)\Phi(\lambda a\_{t})}{1-c\_{\delta}\left[\Phi\_{\text{BN}\_{\lambda}}\left(\frac{\lambda\delta}{\sqrt{1+\lambda^{2}}},-\delta\right)+\Phi\_{\text{BN}\_{\lambda}}\left(-\frac{\lambda\delta}{\sqrt{1+\lambda^{2}}},a\_{t}+\delta\right)-\Phi\_{\text{BN}\_{\lambda}}\left(-\frac{\lambda\delta}{\sqrt{1+\lambda^{2}}},\delta\right)\right]}, & \text{if } \ t \ge \beta \end{cases}$$

*with* $\Phi\_{\text{BN}\_{\lambda}}(\cdot,\cdot)$ *the cdf of the bivariate normal given in Proposition 1.*

In Figure 3, the hazard functions for the pdfs considered in Figures 1 and 2 are plotted. These graphs show that the hazard function of the FBS distribution admits a variety of shapes, which is interesting from an applied point of view.

**Figure 3.** Hazard function of the FBS distribution for plots corresponding to Figure 1 (a), (b): *α* = 0.75, *β* = 1 (both fixed), *δ* = 0.75 (red solid line), 1.5 (green dashed line), 2.25 (black dotted line) and 3.0 (blue dashed-dotted line), in (a) *λ* = 1 versus (b) *λ* = −1. For plots corresponding to Figure 2 (a), (b): *α* = 0.30, *β* = 0.75 (both fixed), *δ* = −0.75 (red solid line), −1.5 (green dashed line), −2.25 (black dotted line) and −3.0 (blue dashed-dotted line), in (a) *λ* = 0.5 versus (b) *λ* = −0.5.

**Remark 2.** *More complicated hazard functions than the traditional ones are obtained when dealing with models with a complex structure, as happens with the FBS. For instance, in Figure 3, we have two situations:*

*1. r*(*t*) *corresponding to Figure 1a,b. These are first quickly increasing and later decreasing more slowly, or even flat. This can be applied in practical situations in which the risk of failure increases quickly up to a certain point, after which its behaviour becomes flatter. As [23] points out, the flat region is very interesting in survival analysis and reliability contexts.*

*2. r*(*t*) *corresponding to Figure 2a,b are increasing-decreasing-increasing. This kind of hazard function has recently been introduced and discussed in the literature, due to its interest in the reliability of systems; see for instance [23] or [24] (and references therein). In the plot for Figure 2b, r*(*t*) *is quickly increasing and then quickly decreasing at first. For Figure 2a, on the other hand, this initial increasing-decreasing effect is less accentuated.*
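These hazard shapes can also be inspected numerically. As a sketch (not the authors' code), the hazard r(t) = f(t)/S(t) of the base BS sub-model is evaluated below using SciPy's `fatiguelife` distribution, which implements the Birnbaum-Saunders law; evaluating the full FBS hazard would instead require coding the expressions of Proposition 4.

```python
import numpy as np
from scipy.stats import fatiguelife  # Birnbaum-Saunders (fatigue-life) law

def bs_hazard(t, alpha, beta):
    """Hazard r(t) = f(t)/S(t) of the base BS(alpha, beta) model."""
    dist = fatiguelife(alpha, scale=beta)
    return dist.pdf(t) / dist.sf(t)

# grid of lifetimes; alpha and beta as in the first panels of Figure 3
t = np.linspace(0.05, 5.0, 200)
r = bs_hazard(t, alpha=0.75, beta=1.0)
print(float(r.min()), float(r.max()))
```

Plotting `r` against `t` for several parameter values reproduces the kind of shape comparison shown in Figure 3 for the base model.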

### **3. Moments and Maximum Likelihood Estimation**

Moments of the FBS model can be obtained from the moments of the flexible skew-normal model given in [2]. The following results present important properties relating those distributions, and the expressions for the first moment and variance in the FBS model.

**Theorem 2.** *Let T* ∼ *FBS*(*α*, *β*, *δ*, *λ*) *and Z* ∼ *FSN*(*δ*, *λ*)*. Then* E(*T*<sup>*r*</sup>)*, r* = 0, 1, ...*, always exists. Moreover*

$$\mathbb{E}(T^r) = \frac{\beta^r}{4^r} \sum\_{k=0}^{2r} \binom{2r}{k} \mathbb{E}\left[ (\alpha Z)^k \left( \alpha^2 Z^2 + 4 \right)^{\frac{2r-k}{2}} \right], \qquad r = 0, 1, \ldots \tag{15}$$

**Proof.** From (7), we can write

$$T = \frac{\beta}{4} \left\{ \alpha Z + \left( \alpha^2 Z^2 + 4 \right)^{1/2} \right\}^2$$

Taking the expectation of the *r*th power of *T*,

$$\mathbb{E}(T^r) = \frac{\beta^r}{4^r} \mathbb{E}\left[ \left\{ \alpha Z + \left( \alpha^2 Z^2 + 4 \right)^{1/2} \right\}^{2r} \right]. \tag{16}$$


From (16), note that for *r* = 0, 1, ..., E(*T*<sup>*r*</sup>) exists if and only if E(*Z*<sup>2*r*</sup>) exists. On the other hand, it can be seen in [2] that E(*Z*<sup>2*r*</sup>) always exists, and therefore E(*T*<sup>*r*</sup>) does too.

Finally note that (15) is the result of applying the binomial formula to (16).
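The representation used in this proof can also be checked by simulation. As a sketch, take Z standard normal (the symmetric special case, under which T follows the base BS law); then κ₁ = E[Z√(α²Z² + 4)]/2 = 0 by symmetry and E(Z²) = 1, so (15) with r = 1 reduces to E(T) = β(1 + α²/2):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 0.5, 2.0

# stochastic representation T = (beta/4) * {alpha*Z + sqrt(alpha^2 Z^2 + 4)}^2
z = rng.standard_normal(1_000_000)
t = (beta / 4.0) * (alpha * z + np.sqrt(alpha**2 * z**2 + 4.0)) ** 2

# for standard normal Z: E(T) = beta * (1 + alpha^2 / 2)
print(t.mean(), beta * (1.0 + alpha**2 / 2.0))
```

The Monte Carlo mean agrees with the closed form to within sampling error.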

Next, explicit expressions for the expected value and variance of *T* ∼ *FBS*(*α*, *β*, *δ*, *λ*) are given. In these expressions, $\kappa\_j = \mathbb{E}\_{\text{FSN}}\left[\frac{Z^j}{2}\sqrt{\alpha^2 Z^2 + 4}\right]$ with *Z* ∼ *FSN*(*δ*, *λ*).

**Theorem 3.** *Let T* ∼ *FBS*(*α*, *β*, *δ*, *λ*)*. Then*

$$\mathbb{E}(T) = \beta \left[ 1 + \alpha\kappa\_1 + \frac{\alpha^2}{2}c\_\delta \left\{ (1 + \delta^2)(1 - \Phi(\delta)) - \delta\phi(\delta) \right\} \right],\tag{17}$$

$$\begin{aligned} \mathbb{E}(T^2) &= \beta^2 \left[ \frac{7\alpha^4 c\_\delta}{16} \left( (3 + 6\delta^2 + \delta^4)(1 - \Phi(\delta)) - \delta(5 + \delta^2)\phi(\delta) \right) + \alpha^3 \kappa\_3 + 2\alpha\kappa\_1 + 1 \right] \\ &\quad + 2\alpha^2 \beta^2 c\_\delta \left( (1 + \delta^2)(1 - \Phi(\delta)) - \delta\phi(\delta) \right) \end{aligned}$$

*and*

$$\begin{split} Var(T) &= \beta^2 \left[ \frac{7\alpha^4 c\_\delta}{16} \left( (3 + 6\delta^2 + \delta^4)(1 - \Phi(\delta)) - \delta(5 + \delta^2)\phi(\delta) \right) + \alpha^3 \kappa\_3 - \alpha^2 \kappa\_1^2 \right] \\ &\quad - \frac{\alpha^2 \beta^2 c\_\delta}{4} \left( (1 + \delta^2)(1 - \Phi(\delta)) - \delta\phi(\delta) \right) \left( \alpha^2 c\_\delta \left( (1 + \delta^2)(1 - \Phi(\delta)) - \delta\phi(\delta) \right) + 4(\alpha\kappa\_1 - 1) \right) \end{split}$$

*where* $\kappa\_j = \mathbb{E}\_{\text{FSN}}\left[\frac{Z^j}{2}\sqrt{\alpha^2 Z^2 + 4}\right]$ *and Z* ∼ *FSN*(*δ*, *λ*)*.*

**Proof.** These results follow from Theorem 2 and the expressions for the *κ<sub>j</sub>*, which have been computed using the results for the moments of *Z* ∼ *FSN*(*δ*, *λ*) obtained in [2].

As an illustration, note that for the case *r* = 1, (15) reduces to

$$\begin{aligned} \mathbb{E}(T) &= \frac{\beta}{4} \left[ \mathbb{E}(\alpha^2 Z^2 + 4) + 2\mathbb{E}\left( (\alpha Z)\sqrt{\alpha^2 Z^2 + 4} \right) + \mathbb{E}(\alpha^2 Z^2) \right] \\ &= \beta \left[ 1 + \alpha\kappa\_1 + \frac{\alpha^2}{2} \mathbb{E}(Z^2) \right]. \end{aligned}$$

It can be seen in [2] that E(*Z*<sup>2</sup>) = *c<sub>δ</sub>*((1 + *δ*<sup>2</sup>)(1 − Φ(*δ*)) − *δφ*(*δ*)), and so (17) is obtained.

### *3.1. Maximum Likelihood Estimators*

Parameter estimation in the BS model has been a topic of interest in many papers; among others, we mention [25–27]. To estimate the parameters of the usual BS model, the modified moment method (MME) and maximum likelihood estimation (MLE) are commonly used. To initialize the maximum likelihood approach, the modified moment estimators are used, which are given by

$$
\beta\_M = \sqrt{s r}, \qquad \alpha\_M = \sqrt{2\left(\sqrt{\frac{s}{r}} - 1\right)},
$$

where $s = \frac{1}{n}\sum\_{i=1}^{n} t\_i$ and $r = \left(\frac{1}{n}\sum\_{i=1}^{n} t\_i^{-1}\right)^{-1}$ are the sample arithmetic and harmonic means, respectively. Relevant aspects of this distribution, such as its robustness with respect to parameter estimation and $O(n^{-1})$ bias corrections for the MLEs, can be seen in [25–27].
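A minimal sketch of these modified moment estimators, generating data from the base BS model via SciPy's `fatiguelife` distribution (shape α, scale β):

```python
import numpy as np
from scipy.stats import fatiguelife

rng = np.random.default_rng(42)
alpha, beta = 0.5, 1.5
t = fatiguelife.rvs(alpha, scale=beta, size=200_000, random_state=rng)

s = t.mean()                 # sample arithmetic mean
r = 1.0 / np.mean(1.0 / t)   # sample harmonic mean

beta_M = np.sqrt(s * r)
alpha_M = np.sqrt(2.0 * (np.sqrt(s / r) - 1.0))
print(alpha_M, beta_M)  # close to the true (0.5, 1.5)
```

With a large sample, both estimators land close to the true parameter values, which is what makes them good starting points for the likelihood iteration.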

In the following, we discuss MLE for the FBS model in depth. Given *n* independent and identically distributed observations *T*<sub>1</sub>, *T*<sub>2</sub>, ..., *T<sub>n</sub>*, with *T<sub>i</sub>* ∼ *FBS*(*α*, *β*, *δ*, *λ*), the log-likelihood function for the parameter vector *θ* = (*α*, *β*, *δ*, *λ*) is given by

$$\ell(\boldsymbol{\theta}) = -n\left(\log(\alpha) + \frac{1}{2}\log(\beta) + \log\left(1 - \Phi(\delta)\right)\right) - \frac{3}{2}\sum\_{i=1}^{n} \log(t\_i) + \sum\_{i=1}^{n} \log(t\_i + \beta)$$

$$-\frac{1}{2}\sum\_{i=1}^{n} (a\_{t\_i}^2 + 2\delta|a\_{t\_i}| + \delta^2) + \sum\_{i=1}^{n} \log\left(\Phi(\lambda a\_{t\_i})\right). \tag{18}$$

To maximize $\ell(\boldsymbol{\theta})$ in $\boldsymbol{\theta}$, consider the first derivatives of $\ell(\boldsymbol{\theta})$ with respect to *α*, *β*, *δ* and *λ*, denoted $\dot{\ell}\_{\alpha}$, $\dot{\ell}\_{\beta}$, $\dot{\ell}\_{\delta}$ and $\dot{\ell}\_{\lambda}$, respectively. Setting $\dot{\ell}\_{\alpha} = 0$, $\dot{\ell}\_{\beta} = 0$, $\dot{\ell}\_{\delta} = 0$ and $\dot{\ell}\_{\lambda} = 0$, the likelihood equations are given by

$$-n + \sum\_{i=1}^{n} a\_{t\_i}^2 + \delta \sum\_{i=1}^{n} |a\_{t\_i}| - \lambda \sum\_{i=1}^{n} a\_{t\_i} \frac{\phi(\lambda a\_{t\_i})}{\Phi(\lambda a\_{t\_i})} = 0 \tag{19}$$

$$-\frac{n}{2\beta} + \sum\_{i=1}^{n} \frac{1}{t\_i + \beta} + \frac{1}{2\alpha\beta^{3/2}} \sum\_{i=1}^{n} \left( \text{sgn}(a\_{t\_i}) \left( |a\_{t\_i}| + \delta \right) - \lambda \frac{\phi(\lambda a\_{t\_i})}{\Phi(\lambda a\_{t\_i})} \right) \frac{t\_i + \beta}{\sqrt{t\_i}} = 0 \tag{20}$$

$$\delta - \frac{\phi(\delta)}{1 - \Phi(\delta)} = -\frac{1}{n} \sum\_{i=1}^{n} |a\_{t\_i}| \tag{21}$$

$$\sum\_{i=1}^{n} a\_{t\_i} \frac{\phi(\lambda a\_{t\_i})}{\Phi(\lambda a\_{t\_i})} = 0 \tag{22}$$

in which *sgn*(·) denotes the *sign* function.
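Note that, given the $a\_{t\_i}$, Equation (21) involves *δ* alone, so it can be solved separately with a one-dimensional root finder; its left-hand side increases from −∞ towards 0, so a unique root exists for any positive sample mean of the $|a\_{t\_i}|$. A sketch (the value 0.8 for that mean is purely illustrative):

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def solve_delta(mean_abs_a):
    """Solve delta - phi(delta)/(1 - Phi(delta)) = -mean_abs_a, i.e. Equation (21)."""
    g = lambda d: d - norm.pdf(d) / norm.sf(d) + mean_abs_a
    # the bracket [-30, 10] keeps norm.sf away from floating-point underflow
    return brentq(g, -30.0, 10.0)

delta_hat = solve_delta(mean_abs_a=0.8)
print(delta_hat)
```
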

The solution to the previous system of equations must be obtained by iterative methods, such as the Newton–Raphson or quasi-Newton procedures, which can be implemented in the statistical software R [28].

As initial estimates of *α* and *β*, one can take the estimates of these parameters obtained in the basic BS model, denoted *α*<sup>0</sup> and *β*<sup>0</sup>. These estimates can be plugged into (21) and (22) to obtain preliminary estimates *δ*<sup>0</sup> and *λ*<sup>0</sup> of *δ* and *λ*, and so start the recursion.
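The optimization step itself can be sketched with SciPy's generic optimizer. The example below fits the base BS sub-model (*δ* = *λ* = 0), whose density is available as `fatiguelife`; maximizing the FBS log-likelihood (18) follows the same pattern with two extra parameters. Here `nll` is a hypothetical helper name, not a function from the paper:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import fatiguelife

rng = np.random.default_rng(1)
t = fatiguelife.rvs(0.5, scale=1.5, size=5_000, random_state=rng)

def nll(params):
    """Negative log-likelihood of the base BS(alpha, beta) model."""
    alpha, beta = params
    if alpha <= 0.0 or beta <= 0.0:
        return np.inf
    return -fatiguelife.logpdf(t, alpha, scale=beta).sum()

# modified moment estimators as starting values, as suggested in the text
s, r = t.mean(), 1.0 / np.mean(1.0 / t)
x0 = [np.sqrt(2.0 * (np.sqrt(s / r) - 1.0)), np.sqrt(s * r)]
res = minimize(nll, x0, method="Nelder-Mead")
print(res.x)  # close to the true (0.5, 1.5)
```

A derivative-free method is used here for simplicity; with the analytic derivatives above, a Newton-type method would also apply.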

### *3.2. Expected and Observed Information Matrices*

Recall that the Fisher information matrix is given by

$$I(\theta) \;=\; \left(j\_{ij}\right)\_{i,j=\alpha,\beta,\delta,\lambda}$$

whose entries are equal to minus the second partial derivatives of the log-likelihood function given in (18) with respect to the parameters in the model. They are denoted as $j\_{\alpha\alpha} = -\frac{\partial^2}{\partial \alpha^2}\ell(\boldsymbol{\theta})$, and so on. So we have

$$j\_{\alpha\alpha} = -\frac{n}{\alpha^2} + \frac{1}{\alpha^2} \sum\_{i=1}^n \left(3a\_{t\_i}^2 + 2\delta |a\_{t\_i}|\right) + \frac{\lambda}{\alpha^2} \sum\_{i=1}^n a\_{t\_i} w\_i \left(-2 + \lambda a\_{t\_i} B\_i\right),$$

$$j\_{\beta\alpha} = -\frac{1}{\alpha^3 \beta^2} \sum\_{i=1}^n \left(\frac{\beta^2 - t\_i^2}{t\_i}\right) + \frac{1}{2\alpha^2 \beta^{3/2}} \sum\_{i=1}^n \left(\delta\, \text{sgn}(a\_{t\_i}) + \lambda w\_i (-1 + \lambda a\_{t\_i} B\_i)\right) \left(\frac{t\_i + \beta}{\sqrt{t\_i}}\right),$$

$$\begin{split} j\_{\beta\beta} &= -\frac{n}{2\beta^2} + \sum\_{i=1}^n \frac{1}{(t\_i + \beta)^2} + \frac{1}{\alpha^2 \beta^3} \sum\_{i=1}^n t\_i + \frac{1}{4\alpha\beta^{5/2}} \sum\_{i=1}^n \left(\delta\, \text{sgn}(a\_{t\_i}) - \lambda w\_i\right) \left(\frac{3t\_i + \beta}{\sqrt{t\_i}}\right) \\ &\quad + \frac{\lambda^2}{4\alpha^2 \beta^3} \sum\_{i=1}^n w\_i B\_i \left(\frac{t\_i + \beta}{\sqrt{t\_i}}\right)^2, \\ j\_{\delta\alpha} &= -\frac{1}{\alpha} \sum\_{i=1}^n |a\_{t\_i}|, \qquad j\_{\delta\beta} = -\frac{1}{2\alpha\beta^{3/2}} \sum\_{i=1}^n \text{sgn}(a\_{t\_i}) \left(\frac{t\_i + \beta}{\sqrt{t\_i}}\right), \\ j\_{\lambda\alpha} &= \frac{1}{\alpha} \sum\_{i=1}^n a\_{t\_i} w\_i \left(1 - \lambda a\_{t\_i} B\_i\right), \qquad j\_{\lambda\beta} = \frac{1}{2\alpha\beta^{3/2}} \sum\_{i=1}^n w\_i \left(1 - \lambda a\_{t\_i} B\_i\right) \left(\frac{t\_i + \beta}{\sqrt{t\_i}}\right), \\ j\_{\delta\delta} &= n\left(w\_{\delta}(\delta - w\_{\delta}) + 1\right), \qquad j\_{\delta\lambda} = j\_{\lambda\delta} = 0, \qquad j\_{\lambda\lambda} = \sum\_{i=1}^{n} a\_{t\_i}^{2} w\_i B\_i, \end{split}$$

where $w\_i = \phi(\lambda a\_{t\_i})/\Phi(\lambda a\_{t\_i})$, $w\_{\delta} = \phi(\delta)/(1 - \Phi(\delta))$ and $B\_i = \lambda a\_{t\_i} + w\_i$.

The Fisher (expected) information matrix would be obtained by computing the expected values of the above second derivatives. Evaluating this matrix at *δ* = *λ* = 0, that is, for *T* ∼ *BS*(*α*, *β*), and using results in [21], we have

$$I(\theta) = \begin{pmatrix} \frac{2}{\alpha^2} & 0 & -\frac{1}{\alpha}\sqrt{\frac{2}{\pi}} & 0\\ 0 & \alpha^{-2}\beta^{-2}\left(1 + \frac{\alpha q(\alpha)}{\sqrt{2\pi}}\right) & 0 & \frac{1}{\sqrt{2\pi}}\frac{1}{\alpha\beta^{3/2}}A\_1(t)\\ -\frac{1}{\alpha}\sqrt{\frac{2}{\pi}} & 0 & 1 - \frac{2}{\pi} & 0\\ 0 & \frac{1}{\sqrt{2\pi}}\frac{1}{\alpha\beta^{3/2}}A\_1(t) & 0 & \frac{2}{\pi} \end{pmatrix},$$

where $A\_1(t) = \mathbb{E}\left[\frac{t\_i+\beta}{\sqrt{t\_i}}\right]$ and $q(\alpha) = \alpha\sqrt{2\pi} - \pi \exp(2/\alpha^2)\,\text{erfc}(\sqrt{2}/\alpha)$, with $\text{erfc}(\cdot)$ the complementary error function, i.e., $\text{erfc}(x) = \frac{2}{\sqrt{\pi}} \int\_x^{\infty} \exp(-t^2)\,dt$; see [29].

It can be shown that |*I*(*θ*)| ≠ 0, so the Fisher information matrix is nonsingular at *δ* = *λ* = 0. Hence, for large samples, the MLE $\widehat{\boldsymbol{\theta}}$ of $\boldsymbol{\theta}$ is asymptotically normal, that is,

$$\sqrt{n}\left(\widehat{\boldsymbol{\theta}}-\boldsymbol{\theta}\right) \stackrel{\mathcal{L}}{\longrightarrow} N\_4(\mathbf{0},\, I(\boldsymbol{\theta})^{-1}),$$

so the asymptotic variance of the MLE, $\widehat{\boldsymbol{\theta}}$, is given by the inverse of the Fisher information matrix $I(\boldsymbol{\theta})$. Since the parameters are unknown, the observed information matrix is usually considered instead, with the unknown parameters replaced by their ML estimates.

Asymptotic confidence intervals for the parameters in the FBS model can be obtained from these results.
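As a sketch of that last step, standard errors can be obtained by inverting a finite-difference approximation to the observed information (the Hessian of the negative log-likelihood at the MLE). The example again uses the base BS sub-model via `fatiguelife`; the FBS case is analogous with a 4 × 4 matrix, and `nll`/`hessian` are hypothetical helper names:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import fatiguelife, norm

rng = np.random.default_rng(3)
t = fatiguelife.rvs(0.5, scale=1.5, size=5_000, random_state=rng)

def nll(params):
    """Negative log-likelihood of the base BS(alpha, beta) model."""
    alpha, beta = params
    if alpha <= 0.0 or beta <= 0.0:
        return np.inf
    return -fatiguelife.logpdf(t, alpha, scale=beta).sum()

res = minimize(nll, x0=[0.4, 1.0], method="Nelder-Mead")

def hessian(f, x, h=1e-4):
    """Central finite-difference Hessian; observed information when f = nll."""
    n = len(x)
    H = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            ei, ej = np.eye(n)[i] * h, np.eye(n)[j] * h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4.0 * h * h)
    return H

cov = np.linalg.inv(hessian(nll, res.x))   # asymptotic covariance of the MLE
se = np.sqrt(np.diag(cov))
z = norm.ppf(0.975)
lo, hi = res.x - z * se, res.x + z * se    # 95% confidence intervals
print(np.column_stack([lo, hi]))
```

With analytic second derivatives such as those listed above for the FBS model, the finite-difference Hessian can of course be replaced by the exact observed information.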
