:: Search published articles ::
Showing 237 results for Type of Study: Research

Ebrahim Amini-Seresht, Ghobad Barmalzan, Ebrahim Nasiroleslami,
Volume 16, Issue 1 (9-2022)
Abstract

This paper deals with stochastic comparisons of convolutions of random variables from scale models. Sufficient conditions are established for the likelihood ratio ordering and the hazard rate ordering of these convolutions. The results generalize some known results in the literature, and several examples are presented as illustrations.


Eisa Mahmoudi, Soudabeh Sajjadipanah, Mohammad Sadegh Zamani,
Volume 16, Issue 1 (9-2022)
Abstract

In this paper, a modified two-stage procedure in the autoregressive model AR(1) is considered, which addresses point and interval estimation of the mean based on the least-squares estimator. The modified two-stage procedure is as effective as the best fixed-sample-size procedure. In this regard, the key properties of the procedure, including asymptotic risk efficiency, first-order efficiency, consistency, and the asymptotic distribution of the mean, are established. A Monte Carlo simulation study is then conducted to investigate the modified two-stage procedure, and the performance of the estimators and confidence intervals is evaluated. Finally, a real time-series data set is considered to illustrate the applicability of the modified two-stage procedure.
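As an editorial illustration of the generic two-stage idea behind such procedures (a pilot sample fixes the final sample size needed for a fixed-width confidence interval for the AR(1) mean), the following Python sketch may help. It is not the authors' modified procedure; the function names, the Stein-type stopping rule, and all parameter values are assumptions.

import numpy as np

def simulate_ar1(n, mu, phi, sigma, rng):
    # x_t - mu = phi * (x_{t-1} - mu) + eps_t, with a stationary start
    x = np.empty(n)
    x[0] = mu + rng.normal(0.0, sigma / np.sqrt(1.0 - phi**2))
    for t in range(1, n):
        x[t] = mu + phi * (x[t - 1] - mu) + rng.normal(0.0, sigma)
    return x

def two_stage_mean_ci(mu=0.0, phi=0.5, sigma=1.0, d=0.2, z=1.96, m=30, seed=1):
    # Stage 1: a pilot sample of size m estimates phi and the innovation variance
    rng = np.random.default_rng(seed)
    pilot = simulate_ar1(m, mu, phi, sigma, rng)
    y, x_lag = pilot[1:] - pilot.mean(), pilot[:-1] - pilot.mean()
    phi_hat = np.clip((x_lag @ y) / (x_lag @ x_lag), -0.95, 0.95)  # keep away from 1
    sigma2_hat = np.var(y - phi_hat * x_lag, ddof=1)
    # Long-run variance of the sample mean of an AR(1): sigma^2 / (1 - phi)^2
    lrv = sigma2_hat / (1.0 - phi_hat) ** 2
    # Stage 2: final sample size for a confidence interval of half-width d
    n_final = max(m, int(np.ceil(z**2 * lrv / d**2)))
    sample = simulate_ar1(n_final, mu, phi, sigma, rng)
    xbar = sample.mean()
    return n_final, xbar, (xbar - d, xbar + d)

print(two_stage_mean_ci())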

Farzad Eskandari, Hamid Haji Aghabozorgi,
Volume 16, Issue 1 (9-2022)
Abstract

Graphical mixture models provide a powerful tool to visually depict the conditional independence relationships in high-dimensional heterogeneous data. In the study of these models, the mixture components are usually assumed to be multivariate normal with different covariance matrices, and the resulting model is known as the Gaussian graphical mixture model. The nonparanormal graphical mixture model replaces this restrictive normality assumption with a semiparametric Gaussian copula, extending both the nonparanormal graphical model and mixture models. This study proposes clustering based on the nonparanormal graphical mixture model with two forms of the $\ell_1$ penalty function (conventional and unconventional) and compares its performance with clustering based on the Gaussian graphical mixture model. A simulation study on normal and nonparanormal data sets in ideal and noisy settings, together with an application to a breast cancer data set, shows that combining the nonparanormal graphical mixture model with a penalty term that depends on the mixing proportions is more accurate than the other model-based clustering methods, both in cluster reconstruction and in parameter estimation.

Abedin Haidari, Mostafa Sattari, Ghobad Barmalzan,
Volume 16, Issue 1 (9-2022)
Abstract

Consider two parallel systems whose component lifetimes follow a generalized exponential distribution. In this paper, we introduce a region based on the shape and scale parameters of the distribution of one of the systems. If the vector of scale parameters of the other parallel system lies in that region, then the likelihood ratio ordering between the two systems holds. An extension of this result to the case when the component lifetimes follow an exponentiated Weibull distribution is also presented.


Abouzar Bazyari, Morad Alizadeh,
Volume 16, Issue 1 (9-2022)
Abstract

In this paper, the collective risk model of an insurance company with constant initial surplus and premium rate is considered, where the claim sizes follow an exponential distribution and the number of claims follows a Poisson process. Reinsurance is assumed to be of the excess-of-loss type, so that in this insurance portfolio a part of the total premium is the reinsurer's share. A general formula for computing the infinite-time ruin probability in the excess-of-loss reinsurance risk model is presented based on the classical ruin probability. The random variable giving the total amount paid by the reinsurer in the excess-of-loss reinsurance risk model is investigated, and explicit formulas are proposed for calculating the infinite-time ruin probability in this model. Finally, the results are examined for the Lindley and exponential distributions with numerical data.
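Since the paper's formulas build on the classical ruin probability, the following Python sketch may be a useful reference point: it evaluates the standard Cramér–Lundberg infinite-time ruin probability for exponential claims and checks it with a crude Monte Carlo simulation. The excess-of-loss formulas of the paper are not reproduced, and the function names and parameter values are illustrative.

import numpy as np

def ruin_prob_exponential(u, lam, mu, c):
    # Classical Cramer-Lundberg ruin probability for exponential claim sizes
    # with mean mu, Poisson arrival rate lam, and premium rate c > lam * mu:
    #   psi(u) = (lam * mu / c) * exp(-(1/mu - lam/c) * u)
    if c <= lam * mu:
        return 1.0  # net profit condition violated: ruin is certain
    return (lam * mu / c) * np.exp(-(1.0 / mu - lam / c) * u)

def ruin_prob_monte_carlo(u, lam, mu, c, horizon=200.0, n_paths=5000, seed=0):
    # Crude Monte Carlo over a long finite horizon as a sanity check
    rng = np.random.default_rng(seed)
    ruined = 0
    for _ in range(n_paths):
        t, surplus = 0.0, u
        while t < horizon:
            w = rng.exponential(1.0 / lam)   # inter-arrival time
            t += w
            surplus += c * w                 # premium income up to the claim
            surplus -= rng.exponential(mu)   # claim payment
            if surplus < 0.0:
                ruined += 1
                break
    return ruined / n_paths

u, lam, mu, c = 5.0, 1.0, 1.0, 1.2           # safety loading of 20 percent
print(ruin_prob_exponential(u, lam, mu, c), ruin_prob_monte_carlo(u, lam, mu, c))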

Aliakbar Hosseinzadeh, Ghobad Barmalzan, Mostafa Sattari,
Volume 16, Issue 1 (9-2022)
Abstract

In this paper, we discuss the hazard rate order of (n-1)-out-of-n systems arising from two sets of independent multiple-outlier modified proportional hazard rates components. Under certain conditions on the parameters and the sub-majorization order between the sample size vectors, the hazard rate order between the (n-1)-out-of-n systems from multiple-outlier modified proportional hazard rates is established.

Masumeh Ghahramani, Maryam Sharafi, Reza Hashemi,
Volume 16, Issue 1 (9-2022)
Abstract

One of the most critical challenges in progressively Type-II censored data is determining the removal plan, which can be fixed or random; in the random case it is chosen according to a discrete probability distribution. This paper first introduces two discrete joint distributions for random removals when the lifetimes follow the two-parameter Weibull distribution. The proposed scenarios are based on the normalized spacings of exponential progressively Type-II censored order statistics, and the expected total test time is obtained under each of them. Parameter estimates are derived using several estimation procedures: maximum likelihood, maximum product of spacings, and least squares. The proposed random removal schemes are then compared with the discrete uniform, binomial, and fixed removal schemes via a Monte Carlo simulation study in terms of the biases and root mean squared errors of the estimators and the expected experiment times. The ratio of the expected experiment time under progressive Type-II censoring to that of the complete sampling plan is also discussed.
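To make the sampling mechanism concrete, here is a small Python sketch that generates a progressively Type-II censored sample under a given removal scheme by withdrawing survivors at random after each observed failure. The Weibull parameters, the fixed scheme, and the function name are illustrative assumptions, and the paper's spacing-based random removal scenarios are not reproduced.

import numpy as np

def progressive_type2_sample(lifetimes, removal_scheme, rng):
    # Sequential-removal construction: after the i-th observed failure,
    # removal_scheme[i] surviving units are withdrawn at random.
    # Requires len(lifetimes) == m + sum(removal_scheme), m = len(removal_scheme).
    alive = list(lifetimes)
    observed = []
    for r in removal_scheme:
        alive.sort()
        observed.append(alive.pop(0))                      # next observed failure
        drop = rng.choice(len(alive), size=int(r), replace=False)
        alive = [x for j, x in enumerate(alive) if j not in set(drop)]
    return np.array(observed)

rng = np.random.default_rng(0)
n, m = 20, 10
scheme = np.ones(m, dtype=int)                # n - m = 10 units removed in total
weibull_life = 2.0 * rng.weibull(1.5, size=n) # shape 1.5, scale 2.0 (illustrative)
print(progressive_type2_sample(weibull_life, scheme, rng))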


Parviz Nasiri, Raouf Obeidi,
Volume 16, Issue 1 (9-2022)
Abstract

This paper presents the inverse Weibull-Poisson distribution for fitting censored lifetime data. Estimation and hypothesis testing for the scale, shape, and failure-rate parameters are considered, and the parameters are estimated under Type-II censoring using maximum likelihood and Bayesian methods. In the Bayesian analysis, the parameters are estimated under different loss functions. The simulation section presents symmetric and HPD confidence intervals, and the estimators are compared using statistical criteria. Finally, the model's goodness of fit is evaluated using a real data set.

Mrs Elham Khaleghpanah Noughabi, Dr. Majid Chahkandi, Dr. Majid Rezaei,
Volume 16, Issue 2 (3-2023)
Abstract

In this paper, a new representation of the mean inactivity time of a coherent system with dependent identically distributed (DID) components is obtained. This representation is used to compare the mean inactivity times of two coherent systems. Sufficient conditions under which one coherent system dominates another with respect to the ageing-faster order in the reversed mean and the variance residual life order are also discussed. These results are derived from a representation of the system reliability function as a distorted function of the common component reliability function. Some examples are given to illustrate the results.
Jalal Etminan, Mohammad Khanjari Sadegh, Majid Chahkandi,
Volume 16, Issue 2 (3-2023)
Abstract

This paper considers series and parallel systems with independent and identically distributed component lifetimes. The reliability of these systems can be improved by the reduction method, in which system reliability is increased by reducing the failure rates of some of the components by a factor 0 < ρ < 1, called the reliability equivalence factor. Closed-form expressions are obtained for some reliability equivalence factors, which are helpful when comparing the performance of systems. We show that the reduction method can be viewed as a particular case of the proportional hazard rates (PHR) model. Sufficient conditions for the relative ageing comparison of the improved series and parallel systems under the PHR model and the reduction method are also developed.
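The following Python sketch is a hedged numerical illustration of the reduction method for a series system of iid exponential components: it computes the common reduction factor ρ that makes the reduced-rate system match the reliability obtained by hot duplication of one component, which is one common way such equivalence factors are defined. The exponential assumption, mission time, and function names are illustrative and not taken from the paper.

import numpy as np

def series_reliability(t, lam, n):
    # Reliability of a series system of n iid exponential(lam) components
    return np.exp(-n * lam * t)

def hot_duplication_reliability(t, lam, n):
    # Series system in which one component is hot-duplicated (parallel pair)
    comp = np.exp(-lam * t)
    return np.exp(-(n - 1) * lam * t) * (2.0 * comp - comp**2)

def reduction_factor(t, lam, n):
    # Solve exp(-n * rho * lam * t) = R_D(t) for rho:
    #   rho = -ln(R_D(t)) / (n * lam * t)
    target = hot_duplication_reliability(t, lam, n)
    return -np.log(target) / (n * lam * t)

t, lam, n = 1.0, 0.5, 3
rho = reduction_factor(t, lam, n)
print(rho, series_reliability(t, rho * lam, n), hot_duplication_reliability(t, lam, n))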

Ali Mohammadian Mosammam, Jorge Mateu,
Volume 16, Issue 2 (3-2023)
Abstract

Crime is an important issue in many cities, and a spatio-temporal Bayesian approach helps to identify crime patterns and hotspots. In the Bayesian analysis of spatio-temporal crime data, the posterior distribution has no closed form because of the non-Gaussian likelihood and the presence of latent variables. In this setting, applying MCMC methods raises challenges such as high-dimensional parameters, extensive simulation, and time-consuming computation. In this paper, we use the integrated nested Laplace approximation (INLA) to analyze crime data in Colombia. The advantages of this method include estimating criminal events at a specific time and location and exploring unusual patterns across places.


Mr. Ali Rostami, Dr. Mohammad Khanjari Sadegh, Dr. Mohammad Khorashadizadeh,
Volume 16, Issue 2 (3-2023)
Abstract

In this article, we consider the estimation of R_{r,k} = P(X_{r:n1} < Y_{k:n2}) when the stress X and strength Y are two independent random variables following inverse exponential distributions with different unknown scale parameters. R_{r,k} is estimated by the maximum likelihood method, and the asymptotic confidence interval is also obtained. Simulation studies and the performance of this model on two real data sets are presented.
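As a quick check of the quantity being estimated, the following Python sketch computes a Monte Carlo approximation of R_{r,k} = P(X_{r:n1} < Y_{k:n2}) for inverse exponential stress and strength variables. It is an editorial illustration rather than the paper's maximum likelihood procedure, and the parameter values are arbitrary.

import numpy as np

def mc_estimate_R(r, k, n1, n2, lam_x, lam_y, n_rep=100000, seed=0):
    # Monte Carlo estimate of R_{r,k} = P(X_{r:n1} < Y_{k:n2}) when X and Y
    # are inverse exponential with scales lam_x, lam_y:
    #   f(x) = (lam / x^2) * exp(-lam / x),  F(x) = exp(-lam / x)
    rng = np.random.default_rng(seed)
    # If E ~ Exp(rate = lam), then 1/E has the inverse exponential distribution
    x = 1.0 / rng.exponential(1.0 / lam_x, size=(n_rep, n1))
    y = 1.0 / rng.exponential(1.0 / lam_y, size=(n_rep, n2))
    x_r = np.sort(x, axis=1)[:, r - 1]   # r-th order statistic of the stress sample
    y_k = np.sort(y, axis=1)[:, k - 1]   # k-th order statistic of the strength sample
    return np.mean(x_r < y_k)

print(mc_estimate_R(r=2, k=3, n1=5, n2=5, lam_x=1.0, lam_y=2.0))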


 
Zahra Zandi, Hossein Bevrani,
Volume 16, Issue 2 (3-2023)
Abstract

This paper suggests Liu-type shrinkage estimators for the linear regression model in the presence of multicollinearity under subspace information. The performance of the proposed estimators is compared with the Liu-type estimator in terms of relative efficiency via a Monte Carlo simulation study and a real data set. The results reveal that the proposed estimators perform better than the Liu-type estimator.
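For reference, the sketch below implements the standard Liu (1993) estimator that such shrinkage estimators extend; the paper's subspace-information construction is not reproduced, and the simulated collinear design is purely illustrative.

import numpy as np

def liu_estimator(X, y, d):
    # Standard Liu (1993) estimator for multicollinear data:
    #   beta_d = (X'X + I)^(-1) (X'y + d * beta_OLS),  0 < d < 1
    # d = 1 recovers OLS; smaller d shrinks more.
    p = X.shape[1]
    XtX = X.T @ X
    beta_ols = np.linalg.solve(XtX, X.T @ y)
    return np.linalg.solve(XtX + np.eye(p), X.T @ y + d * beta_ols)

# Small demonstration on nearly collinear predictors
rng = np.random.default_rng(0)
n, p = 100, 4
Z = rng.normal(size=(n, p))
Z[:, 3] = Z[:, 2] + 0.01 * rng.normal(size=n)   # induce multicollinearity
beta_true = np.array([1.0, 0.5, -1.0, 2.0])
y = Z @ beta_true + rng.normal(scale=0.5, size=n)
print(liu_estimator(Z, y, d=0.7))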


Dr. Abouzar Bazyari,
Volume 16, Issue 2 (3-2023)
Abstract

In this paper, the individual risk model of an insurance company with dependent claims is considered, where the bivariate vectors of claim-size random variables are assumed to be independent with a common joint distribution function. A recursive formula for the infinite-time ruin probability is obtained in terms of the initial reserve and the joint probability density function of the claim sizes, using probability inequalities and induction. Numerical examples and simulation studies are presented to check the results for the light-tailed bivariate Poisson and the heavy-tailed log-normal and Pareto distributions. The results are compared for the Farlie-Gumbel-Morgenstern and bivariate Frank copula functions. The effect of heavy-tailed claims on the ruin probability is also investigated.
Dr Alireza Chaji,
Volume 16, Issue 2 (3-2023)
Abstract

The high interpretability and ease of understanding of decision trees have made them one of the most widely used machine learning algorithms. The key to building efficient and effective decision trees is to use a suitable splitting method. This paper proposes a new splitting approach that grows the tree using the T-entropy criterion. The proposed method is examined on three data sets using 11 evaluation criteria. The results show that the introduced method builds decision trees with more accurate performance than the well-known methods based on the Gini index and the Shannon, Tsallis, and Rényi entropies, and it can be used as an alternative method for producing decision trees.
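The benchmark criteria named above can be made concrete with a short Python sketch that scores a candidate split under the Gini index and the Shannon, Tsallis, and Rényi entropies. The T-entropy criterion itself is defined in the paper and is not reproduced here, and the toy data and parameter choices (q = α = 2) are assumptions.

import numpy as np

def class_proportions(labels):
    _, counts = np.unique(labels, return_counts=True)
    return counts / counts.sum()

def gini(p):            return 1.0 - np.sum(p**2)
def shannon(p):         return -np.sum(p * np.log2(p))
def tsallis(p, q=2.0):  return (1.0 - np.sum(p**q)) / (q - 1.0)
def renyi(p, a=2.0):    return np.log2(np.sum(p**a)) / (1.0 - a)

def split_gain(x, y, threshold, impurity):
    # Impurity reduction of the binary split x <= threshold (the quantity a
    # tree-growing algorithm maximizes); assumes both children are non-empty.
    left, right = y[x <= threshold], y[x > threshold]
    w_l, w_r = len(left) / len(y), len(right) / len(y)
    parent = impurity(class_proportions(y))
    children = w_l * impurity(class_proportions(left)) + \
               w_r * impurity(class_proportions(right))
    return parent - children

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([0, 0, 0, 1, 1, 1])
for name, f in [("gini", gini), ("shannon", shannon),
                ("tsallis", tsallis), ("renyi", renyi)]:
    print(name, round(split_gain(x, y, threshold=3.5, impurity=f), 4))
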
Dr. Robab Afshari,
Volume 16, Issue 2 (3-2023)
Abstract

Although the multiple dependent state (MDS) sampling plan is preferred over conditional plans because of the small sample size it requires, it cannot be used when the quality of manufactured products depends on more than one quality characteristic. In this study, to improve the performance of the mentioned method, an S^T_{pk}-based MDS plan is proposed that is applicable to inspecting products with independent, multivariate normally distributed characteristics. The principal component analysis technique is used to extend the proposed plan to the case of dependent variables. Moreover, optimal values of the plan parameters are obtained from a nonlinear optimization problem. Findings indicate that, compared to S^T_{pk}-based variable single sampling and repetitive group sampling plans, the proposed method is best in terms of the required sample size and the OC curve. Finally, an industrial example is given to explain how to use the proposed plan.
Alla Alhamidah, Mehran Naghizadeh,
Volume 16, Issue 2 (3-2023)
Abstract

This paper discusses Bayesian and E-Bayesian estimators in the Burr type-XII model. The estimators are obtained based on Type-II censored data under the bounded reflected gamma loss function. The relationship between the E-Bayesian estimators and their asymptotic properties is presented. The performance of the proposed estimators is evaluated using a Monte Carlo simulation.
Meisam Moghimbeygi,
Volume 16, Issue 2 (3-2023)
Abstract

This article introduces a semiparametric multinomial logistic regression model to classify labeled configurations. In the regression model, the explanatory variable is a kernel function obtained using the power-divergence criterion, and the response variable is categorical, indicating the class of each configuration. This semiparametric regression model is built on distances defined in the shape space, and for this reason its classification of shapes improves on previous methods. The performance of the model is investigated in a comprehensive simulation study, and two real data sets are analyzed as applications. Finally, the proposed method is compared with techniques from the literature, which shows its good performance in classifying configurations.


Mr Arta Roohi, Ms Fatemeh Jahadi, Dr Mahdi Roozbeh, Dr Saeed Zalzadeh,
Volume 17, Issue 1 (9-2023)
Abstract

Classical regression approaches are not applicable to high-dimensional data, and their results may not be sufficiently accurate. This study analyzes such data using powerful approaches such as support vector regression, functional regression, the LASSO, and ridge regression. By investigating two high-dimensional data sets (the riboflavin data set and a simulated data set) with the suggested approaches, the most efficient model is derived for each type of data based on three criteria: squared correlation, mean squared error, and mean absolute percentage error.
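A minimal Python sketch of this kind of comparison, using scikit-learn on a simulated high-dimensional data set, is given below; it does not reproduce the riboflavin analysis or the functional regression model, and all tuning choices are illustrative.

import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV, RidgeCV
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

# Simulated high-dimensional data: far more predictors than observations
X, y = make_regression(n_samples=70, n_features=500, n_informative=15,
                       noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "lasso": LassoCV(cv=5, max_iter=10000, random_state=0),
    "ridge": RidgeCV(alphas=np.logspace(-3, 3, 25)),
    "svr":   SVR(kernel="rbf", C=10.0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(name, round(mean_squared_error(y_te, pred), 2), round(r2_score(y_te, pred), 3))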


Mehdi Kiani,
Volume 17, Issue 1 (9-2023)
Abstract

In the 1980s, Genichi Taguchi, a Japanese quality consultant, claimed that most of the variability associated with the response could be attributed to the presence of uncontrollable (noise) factors. In some practical cases, his modeling proposition leads quality improvement efforts to many runs in a crossed array. Hence, several researchers have embraced noteworthy ideas from response surface methodology together with robust parameter design as alternatives to Taguchi's plan. These alternatives model the mean and variance of the response as functions of the combination of control and noise factors in a combined array to achieve a robust process or product. Indeed, applying response surface methods to robust parameter design minimizes the influence of noise factors on manufacturing processes or products. This paper develops further modeling of the predicted response and variance in the presence of noise factors based on unbiased and robust estimators. Another goal is to design the experiments according to optimal designs so as to improve the accuracy and precision of these estimators simultaneously.

Page 10 of 12

Journal of Statistical Sciences – Scientific Research Journal of the Iranian Statistical Society
