A New Effective Jackknifing Estimator in the Negative Binomial Regression Model
Abstract
1. Introduction
2. Existing Estimators
2.1. Negative Binomial Regression Model MLE Estimator
2.2. Liu-Type Negative Binomial Regression Estimator (LNBR)
2.3. Ridge Negative Binomial Regression Estimator (RNBR)
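Both existing adjustments in Section 2 act on the NBR MLE through the weighted cross-product matrix X'ŴX from the final iteratively reweighted least-squares step. The sketch below is illustrative only, not the paper's code: `XtWX`, `beta_mle`, `k`, and `d` are assumed names, and the parameterizations follow the standard ridge (Månsson, 2012) and Liu (Månsson, 2013) forms, which may differ in detail from the versions used here.

```python
import numpy as np

def ridge_nbr(XtWX, beta_mle, k):
    """Ridge-type adjustment of the NBR MLE:
    beta_R = (X'WX + kI)^(-1) X'WX beta_MLE."""
    p = XtWX.shape[0]
    return np.linalg.solve(XtWX + k * np.eye(p), XtWX @ beta_mle)

def liu_nbr(XtWX, beta_mle, d):
    """Liu-type adjustment of the NBR MLE:
    beta_L = (X'WX + I)^(-1) (X'WX + dI) beta_MLE."""
    p = XtWX.shape[0]
    return np.linalg.solve(XtWX + np.eye(p), (XtWX + d * np.eye(p)) @ beta_mle)
```

Both shrink the MLE toward the origin; with d = 1 the Liu form returns the MLE unchanged, and with k = 0 so does the ridge form.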
3. Proposed Estimator
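The proposed estimator combines a Kibria–Lukman-type shrinkage of the NBR coefficients with jackknifing. The paper derives a closed form; as a generic illustration of the jackknife mechanism itself (not the proposed estimator), the delete-one correction θ̂_J = nθ̂ − (n − 1) · (average of leave-one-out estimates) removes bias of order 1/n from any estimator:

```python
import numpy as np

def jackknife(estimator, data):
    """Generic delete-one jackknife bias correction.
    estimator: callable mapping an (n, ...) array to a parameter estimate."""
    n = len(data)
    full = estimator(data)
    loo = np.array([estimator(np.delete(data, i, axis=0)) for i in range(n)])
    return n * full - (n - 1) * loo.mean(axis=0)
```

For the plug-in variance, whose bias is exactly −σ²/n, this correction recovers the unbiased sample variance; for the sample mean, which is already unbiased, it changes nothing.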
4. Comparison of Efficiency
- i. The suggested estimator is more efficient than … to estimate … if and only if … for the NBR model.
- ii. The suggested estimator is more efficient than … to estimate … if and only if … under the NBR model.
- iii. The suggested estimator is more efficient than … to estimate … if and only if … under the NBR model.
- iv. The suggested estimator is more efficient than … to estimate … if and only if … under the NBR model.
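In canonical (spectral) form, matrix mean-squared-error comparisons of this kind reduce to scalar inequalities in the eigenvalues λ_j of X'ŴX and the transformed coefficients α_j. The following is a hedged numeric sketch of a generic ridge-versus-MLE comparison, not the paper's exact conditions:

```python
import numpy as np

def mse_mle(lam):
    """Scalar MSE of the MLE in canonical form: sum_j 1/lambda_j."""
    return np.sum(1.0 / lam)

def mse_ridge(lam, alpha, k):
    """Scalar MSE of a ridge-type estimator: variance term
    sum_j lambda_j/(lambda_j+k)^2 plus squared bias
    k^2 sum_j alpha_j^2/(lambda_j+k)^2."""
    denom = (lam + k) ** 2
    return np.sum(lam / denom) + k**2 * np.sum(alpha**2 / denom)

# One small eigenvalue mimics near-multicollinearity; for a suitable
# k > 0 the shrunken estimator then has smaller scalar MSE than the MLE.
lam = np.array([10.0, 0.01])
alpha = np.array([1.0, 1.0])
```

At k = 0 the two expressions coincide, so the "if and only if" conditions characterize exactly when a positive shrinkage parameter trades enough variance for the added bias.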
5. Application
5.1. Simulation Studies
5.1.1. Simulation Study 1
5.1.2. Simulation Study 2
5.2. Real Dataset Example
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
r | n | MLE | LNBR | RNBR | KLNBR | JKLNBR | MLE | LNBR | RNBR | KLNBR | JKLNBR |
---|---|---|---|---|---|---|---|---|---|---|---|
r = 0.90 | 20 | 0.82281 | 0.80917 | 0.73124 | 0.61829 | 0.59465 | 0.52961 | 0.51995 | 0.48750 | 0.41220 | 0.39651 |
25 | 0.57873 | 0.54248 | 0.51045 | 0.43776 | 0.40874 | 0.36949 | 0.34634 | 0.32589 | 0.27948 | 0.26096 | |
30 | 0.35849 | 0.33246 | 0.32299 | 0.28693 | 0.25846 | 0.22667 | 0.21021 | 0.20422 | 0.18147 | 0.16342 | |
50 | 0.19054 | 0.18237 | 0.18016 | 0.16644 | 0.14805 | 0.12097 | 0.11578 | 0.11438 | 0.10567 | 0.09399 | |
100 | 0.08354 | 0.08036 | 0.08084 | 0.07586 | 0.06927 | 0.05342 | 0.05139 | 0.05170 | 0.04851 | 0.04430 | |
300 | 0.02408 | 0.02368 | 0.02382 | 0.02324 | 0.02249 | 0.01541 | 0.01516 | 0.01525 | 0.01487 | 0.01439 | |
r = 0.95 | 20 | 0.98488 | 0.89712 | 0.73530 | 0.62715 | 0.61519 | 0.62685 | 0.57381 | 0.47031 | 0.40113 | 0.39343 |
25 | 0.76901 | 0.69927 | 0.60834 | 0.52946 | 0.49644 | 0.47459 | 0.43155 | 0.37543 | 0.32675 | 0.30638 | |
30 | 0.55637 | 0.50926 | 0.46567 | 0.40748 | 0.35669 | 0.35479 | 0.32475 | 0.29695 | 0.25984 | 0.22746 | |
50 | 0.34771 | 0.32462 | 0.30729 | 0.27100 | 0.22617 | 0.21601 | 0.21409 | 0.20167 | 0.16835 | 0.14050 | |
100 | 0.14216 | 0.13461 | 0.13320 | 0.12002 | 0.10488 | 0.09171 | 0.08684 | 0.08593 | 0.07743 | 0.06766 | |
300 | 0.03964 | 0.03872 | 0.03899 | 0.03761 | 0.03583 | 0.02518 | 0.02459 | 0.02476 | 0.02389 | 0.02276 | |
r = 0.99 | 20 | 6.78309 | 5.53115 | 2.25137 | 2.25111 | 1.72358 | 4.52296 | 3.68817 | 1.50121 | 1.50104 | 1.14928 |
25 | 4.85253 | 4.07813 | 1.97189 | 1.92584 | 1.43208 | 2.87613 | 2.41714 | 1.16875 | 1.14146 | 0.84880 | |
30 | 3.17543 | 2.75877 | 1.57596 | 1.49806 | 1.08658 | 1.92094 | 1.66888 | 0.95335 | 0.90623 | 0.65731 | |
50 | 1.75179 | 1.57307 | 1.06358 | 0.97317 | 0.69003 | 1.08999 | 0.97878 | 0.66177 | 0.60552 | 0.42935 | |
100 | 0.58161 | 0.52103 | 0.43475 | 0.35117 | 0.24148 | 0.37236 | 0.33365 | 0.27833 | 0.22482 | 0.15460 | |
300 | 0.13537 | 0.12723 | 0.12440 | 0.10977 | 0.09105 | 0.08632 | 0.08133 | 0.07932 | 0.07000 | 0.05809 | |
Average | 1.23209 | 1.06463 | 0.64223 | 0.59502 | 0.47787 | 0.77630 | 0.67125 | 0.40537 | 0.37493 | 0.30162 |
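Simulated MSE values such as those above are obtained by Monte Carlo: generate many datasets, estimate β in each, and average the squared estimation error. The sketch below is a simplified stand-in, with a linear-Gaussian surrogate replacing the full NBR fit and the usual x_ij = √(1 − r²) z_ij + r z_{i,p+1} scheme generating regressors with target pairwise correlation r; all function names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def correlated_X(n, p, r):
    """Regressors with pairwise correlation about r:
    x_ij = sqrt(1 - r^2) z_ij + r z_{i,p+1}."""
    Z = rng.standard_normal((n, p + 1))
    return np.sqrt(1 - r**2) * Z[:, :p] + r * Z[:, [p]]

def emse(estimator, beta, n, p, r, reps=200):
    """Estimated MSE: mean squared estimation error over replications."""
    err = 0.0
    for _ in range(reps):
        X = correlated_X(n, p, r)
        y = X @ beta + rng.standard_normal(n)
        err += np.sum((estimator(X, y) - beta) ** 2)
    return err / reps

ols = lambda X, y: np.linalg.lstsq(X, y, rcond=None)[0]

def ridge(X, y, k=0.5):
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
```

Under r = 0.99 the ridge fit should dominate the unpenalized fit in this metric, mirroring the pattern in the tables, where the gap between estimators widens as r grows and narrows as n grows.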
r | n | MLE | LNBR | RNBR | KLNBR | JKLNBR | MLE | LNBR | RNBR | KLNBR | JKLNBR |
---|---|---|---|---|---|---|---|---|---|---|---|
r = 0.90 | 20 | 1.76150 | 1.74362 | 1.71281 | 1.60892 | 1.46444 | 1.06560 | 1.05478 | 1.03615 | 0.97330 | 0.88590 |
25 | 1.15564 | 1.12731 | 1.12589 | 1.08779 | 0.90920 | 0.71065 | 0.69323 | 0.69235 | 0.66892 | 0.55910 | |
30 | 0.71259 | 0.67670 | 0.64511 | 0.59164 | 0.49423 | 0.44048 | 0.41830 | 0.39877 | 0.36572 | 0.30551 | |
50 | 0.29787 | 0.28684 | 0.28329 | 0.26057 | 0.21953 | 0.19296 | 0.18582 | 0.18352 | 0.16880 | 0.14221 | |
100 | 0.10262 | 0.09947 | 0.10043 | 0.09453 | 0.08510 | 0.06751 | 0.06544 | 0.06607 | 0.06219 | 0.05598 | |
300 | 0.03446 | 0.03408 | 0.03428 | 0.03369 | 0.03257 | 0.02144 | 0.02121 | 0.02133 | 0.02096 | 0.02027 | |
r = 0.95 | 20 | 3.76635 | 3.46502 | 2.99998 | 2.82094 | 2.28779 | 2.23235 | 2.05375 | 1.77811 | 1.67200 | 1.35599 |
25 | 2.27831 | 2.10201 | 1.85031 | 1.71650 | 1.37987 | 1.58131 | 1.45895 | 1.28425 | 1.19138 | 0.95773 | |
30 | 1.18608 | 1.10448 | 1.00059 | 0.90914 | 0.72249 | 0.78217 | 0.72836 | 0.65985 | 0.59954 | 0.47645 | |
50 | 0.49967 | 0.47245 | 0.45084 | 0.39886 | 0.31565 | 0.32582 | 0.30807 | 0.29397 | 0.26008 | 0.20582 | |
100 | 0.20908 | 0.20175 | 0.20106 | 0.18566 | 0.15935 | 0.13429 | 0.12958 | 0.12914 | 0.11925 | 0.10235 | |
300 | 0.06094 | 0.06010 | 0.06043 | 0.05890 | 0.05605 | 0.03609 | 0.03559 | 0.03579 | 0.03488 | 0.03320 | |
r = 0.99 | 20 | 19.08371 | 17.55004 | 11.24236 | 11.10921 | 10.01920 | 11.51473 | 10.58935 | 6.78342 | 6.70308 | 6.04539 |
25 | 11.17668 | 10.27346 | 6.67784 | 6.26168 | 5.55210 | 6.87297 | 6.31754 | 4.10646 | 3.85055 | 3.41420 | |
30 | 5.50593 | 5.06716 | 3.41039 | 3.49166 | 2.47234 | 3.40347 | 3.13225 | 2.10812 | 2.15836 | 1.52827 | |
50 | 2.08247 | 1.93114 | 1.44001 | 1.29915 | 0.77994 | 1.29572 | 1.20156 | 0.89598 | 0.80834 | 0.48528 | |
100 | 0.90630 | 0.86540 | 0.76670 | 0.68415 | 0.47490 | 0.58710 | 0.56061 | 0.49667 | 0.44319 | 0.30764 | |
300 | 0.24833 | 0.24247 | 0.24001 | 0.22519 | 0.19391 | 0.16335 | 0.15950 | 0.15788 | 0.14813 | 0.12756 | |
Average | 2.83714 | 2.62797 | 1.90235 | 1.82434 | 1.53437 | 1.74600 | 1.61744 | 1.17377 | 1.12493 | 0.94494 |
c | n | MLE | LNBR | RNBR | KLNBR | JKLNBR | MLE | LNBR | RNBR | KLNBR | JKLNBR |
---|---|---|---|---|---|---|---|---|---|---|---|
c = 0.3 | 20 | 1.25489 | 0.38792 | 0.29644 | 0.28027 | 0.16622 | 0.75913 | 0.23467 | 0.17933 | 0.16955 | 0.10055 |
25 | 0.89141 | 0.36293 | 0.30554 | 0.29336 | 0.16624 | 0.54816 | 0.22318 | 0.18789 | 0.18040 | 0.10223 | |
30 | 0.63844 | 0.33503 | 0.31864 | 0.26668 | 0.17622 | 0.39465 | 0.20710 | 0.19697 | 0.16485 | 0.10893 | |
50 | 0.47549 | 0.30699 | 0.33580 | 0.23874 | 0.18365 | 0.30803 | 0.19887 | 0.21753 | 0.15466 | 0.11897 | |
100 | 0.37725 | 0.27441 | 0.33559 | 0.25214 | 0.20106 | 0.24817 | 0.18052 | 0.22077 | 0.16587 | 0.13227 | |
300 | 0.33704 | 0.27174 | 0.32888 | 0.27919 | 0.23617 | 0.20971 | 0.16908 | 0.20463 | 0.17372 | 0.14695 | |
c = 0.6 | 20 | 1.22176 | 0.53134 | 0.38674 | 0.27531 | 0.16582 | 0.72415 | 0.31493 | 0.22922 | 0.16318 | 0.09828 |
25 | 0.83048 | 0.42347 | 0.35960 | 0.25951 | 0.17069 | 0.57641 | 0.29392 | 0.24959 | 0.18012 | 0.11847 | |
30 | 0.57322 | 0.33557 | 0.33807 | 0.24391 | 0.17606 | 0.37801 | 0.22129 | 0.22294 | 0.16085 | 0.11610 | |
50 | 0.42685 | 0.27269 | 0.32254 | 0.22820 | 0.18142 | 0.27833 | 0.17781 | 0.21032 | 0.14880 | 0.11830 | |
100 | 0.36879 | 0.27047 | 0.33222 | 0.24679 | 0.19795 | 0.23687 | 0.17372 | 0.21338 | 0.15851 | 0.12714 | |
300 | 0.33852 | 0.27097 | 0.33074 | 0.27906 | 0.23501 | 0.20049 | 0.16049 | 0.19588 | 0.16528 | 0.13919 | |
c = 0.9 | 20 | 0.86408 | 0.35249 | 0.39052 | 0.25198 | 0.16802 | 0.52137 | 0.21269 | 0.23563 | 0.15204 | 0.10138 |
25 | 0.70937 | 0.34565 | 0.37079 | 0.24876 | 0.17214 | 0.43622 | 0.21255 | 0.22801 | 0.15297 | 0.10586 | |
30 | 0.57578 | 0.33235 | 0.35001 | 0.24682 | 0.17845 | 0.35592 | 0.20544 | 0.21636 | 0.15257 | 0.11031 | |
50 | 0.46243 | 0.30905 | 0.35338 | 0.23802 | 0.18442 | 0.28773 | 0.19229 | 0.21987 | 0.14810 | 0.11475 | |
100 | 0.36648 | 0.26186 | 0.33526 | 0.24317 | 0.19546 | 0.23741 | 0.16963 | 0.21718 | 0.15753 | 0.12662 | |
300 | 0.33285 | 0.26179 | 0.32622 | 0.27294 | 0.22891 | 0.21895 | 0.17221 | 0.21459 | 0.17954 | 0.15058 | |
Average | 0.61362 | 0.32815 | 0.33983 | 0.25805 | 0.18800 | 0.38443 | 0.20669 | 0.21445 | 0.16270 | 0.11871 |
Regressor | Description |
---|---|
age | Age of the individual
agesq | Square of age
health | Health satisfaction score on the interval [0, 10]
handicap | A dummy variable (0–1) for handicap status
hdegree | Degree of handicap, in percent
married | A dummy variable (0–1) for marital status
schooling | Number of years of schooling
hhincome | Household income per month
self | A dummy variable (0–1) for self-employed status
civil | A dummy variable (0–1) for civil-servant status
bluec | A dummy variable (0–1) for blue-collar employees
employed | A dummy variable (0–1) for employment status
public | A dummy variable (0–1) for public health insurance
addon | A dummy variable (0–1) for add-on insurance
 | age | agesq | health | handicap | hdegree | married | schooling | hhincome | self | civil | bluec | employed | public | addon |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
age | 1 | |||||||||||||
agesq | 0.993 | 1 | ||||||||||||
health | −0.242 | −0.243 | 1 | |||||||||||
handicap | 0.290 | 0.303 | −0.313 | 1 | ||||||||||
hdegree | 0.274 | 0.290 | −0.325 | 0.913 | 1 | |||||||||
married | 0.447 | 0.414 | −0.067 | 0.084 | 0.056 | 1 | ||||||||
schooling | −0.078 | −0.092 | 0.097 | −0.154 | −0.162 | −0.055 | 1 | |||||||
hhincome | 0.039 | 0.020 | 0.121 | −0.113 | −0.123 | 0.132 | 0.310 | 1 | ||||||
self | 0.005 | −0.007 | 0.048 | −0.083 | −0.082 | 0.022 | 0.075 | 0.106 | 1 | |||||
civil | −0.004 | −0.016 | 0.057 | −0.043 | −0.049 | 0.014 | 0.263 | 0.146 | −0.106 | 1 | ||||
bluec | −0.114 | −0.122 | 0.021 | −0.069 | −0.094 | 0.040 | −0.351 | −0.124 | −0.210 | −0.229 | 1 | |||
employed | −0.207 | −0.254 | 0.200 | −0.311 | −0.364 | 0.082 | 0.112 | 0.229 | 0.144 | 0.157 | 0.311 | 1 | ||
public | 0.001 | 0.017 | −0.077 | 0.078 | 0.084 | 0.010 | −0.323 | −0.247 | −0.169 | −0.639 | 0.275 | −0.117 | 1 | |
addon | 0.006 | 0.003 | 0.002 | −0.006 | −0.017 | 0.024 | 0.051 | 0.022 | 0.028 | −0.013 | −0.037 | 0.049 | 0.056 | 1 |
Coefficient | MLE | LNBR | RNBR | KLNBR | JKLNBR |
---|---|---|---|---|---|
(Intercept) | 3.37579 | 2.08590 | 2.25278 | −0.71307 | −0.21224 |
age | −0.06361 | −0.00910 | −0.01601 | 0.10531 | 0.10237 |
agesq | 0.89263 | 0.27995 | 0.35745 | −0.99982 | −0.50932 |
health | −0.26084 | −0.24793 | −0.24971 | −0.21675 | −0.21561 |
handicap | 0.15441 | 0.12473 | 0.12958 | 0.03661 | 0.00472 |
hdegree | 0.00191 | 0.00305 | 0.00289 | 0.00606 | 0.00606 |
married | −0.11917 | −0.16474 | −0.15907 | −0.25701 | −0.23221 |
schooling | 0.00166 | 0.00670 | 0.00597 | 0.01976 | 0.01963 |
hhincome | 0.00665 | 0.00743 | 0.00735 | 0.00870 | 0.00867 |
self | −0.23726 | −0.22652 | −0.22812 | −0.19742 | −0.16125 |
civil | −0.05374 | −0.02211 | −0.02617 | 0.04492 | 0.03463 |
bluec | 0.19122 | 0.20849 | 0.20629 | 0.24496 | 0.21506 |
employed | 0.03120 | −0.01845 | −0.01260 | −0.11041 | −0.09170 |
public | 0.15695 | 0.21742 | 0.20944 | 0.35222 | 0.28912 |
addon | 0.42515 | 0.36128 | 0.37208 | 0.16040 | 0.03457 |
MSE | 6.71325 | 4.13946 | 3.41933 | 0.76437 | 0.25113 |
K = 1.255031
d = 7.794738 × 10⁻¹⁰
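The shrinkage parameters K and d reported above are data-driven; the paper's exact selectors are not reproduced here. As a hedged illustration, one widely used plug-in in the count-ridge literature sets k̂ = 1/max_j α̂_j², where α̂ = Q'β̂ and Q holds the eigenvectors of X'ŴX (`XtWX` and `beta_hat` are assumed names):

```python
import numpy as np

def plugin_k(XtWX, beta_hat):
    """A common plug-in ridge parameter: k_hat = 1 / max_j alpha_hat_j^2,
    with alpha_hat = Q' beta_hat from the eigendecomposition of X'WX."""
    _, Q = np.linalg.eigh(XtWX)
    alpha = Q.T @ beta_hat
    return 1.0 / np.max(alpha**2)
```

Larger coefficients in the ill-conditioned directions thus yield a smaller k, so the amount of shrinkage adapts to the data rather than being fixed in advance.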
Data | MLE | LNBR | RNBR | KLNBR | JKLNBR |
---|---|---|---|---|---|
Test | 30.6046 | 30.7806 | 30.7560 | 31.2358 | 26.5837 |
Train | 22.1274 | 22.3768 | 22.3426 | 23.0007 | 19.4763 |
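The Test/Train rows above follow the usual split-and-score recipe: fit each estimator on the training part, then report the mean squared prediction error on both parts. A generic sketch, with a linear predictor standing in for the fitted negative binomial mean and all names illustrative:

```python
import numpy as np

def train_test_mse(fit, X, y, train_frac=0.7, seed=0):
    """Randomly split the data, fit on the training part, and return
    (train_mse, test_mse) of the fitted linear predictor."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    cut = int(train_frac * len(y))
    tr, te = idx[:cut], idx[cut:]
    beta = fit(X[tr], y[tr])
    mse = lambda part: np.mean((y[part] - X[part] @ beta) ** 2)
    return mse(tr), mse(te)
```

A lower test MSE, as JKLNBR shows in the table, is the more telling comparison, since training error alone rewards overfitting.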
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Koç, T.; Koç, H. A New Effective Jackknifing Estimator in the Negative Binomial Regression Model. Symmetry 2023, 15, 2107. https://doi.org/10.3390/sym15122107