:: Search published articles ::
Showing 7 results for Hashemi

Reza Hashemi, Ghobad Barmalzan, Abedin Haidari,
Volume 3, Issue 2 (3-2010)
Abstract

A well-known property of the bivariate normal distribution is that uncorrelatedness of two random variables is equivalent to their independence. It is therefore natural to ask whether this property holds for other distributions, in other words, whether the multivariate normal distribution is the only distribution in which uncorrelatedness is equivalent to independence. This paper addresses this question by presenting some relevant concepts and introducing another family of distributions in which uncorrelatedness is equivalent to independence.
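
As an illustration of why uncorrelatedness need not imply independence outside the normal family, the short Python sketch below (not taken from the paper; the variables and sample size are arbitrary) simulates the classic counterexample X ~ N(0,1), Y = X^2 - 1, which has zero correlation but obvious dependence.

import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = x**2 - 1                                           # Y is a deterministic function of X

print("corr(X, Y)      ~", np.corrcoef(x, y)[0, 1])    # near 0: uncorrelated
print("E[Y]            ~", y.mean())                   # near 0 overall
print("E[Y | |X| > 2]  ~", y[np.abs(x) > 2].mean())    # far from 0: clearly dependent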

Ali Sharifi, Seyedreza Hashemi,
Volume 8, Issue 1 (9-2014)
Abstract

In this paper, a semiparametric additive-multiplicative intensity model is proposed for recurrent event data under two competing risks. The model contains an unknown baseline hazard function and defines a separate intensity function for the effect of each competing risk on subject failure. The model is based on regression parameters for the relevant covariates and a frailty variable that describes both the correlation between the terminal event and the recurrent events and the individual differences among the subjects under study. The model accommodates right-censored and informatively censored survival data. Numerical methods are used to estimate the unknown parameters, and the baseline hazard parameters are approximated by a Taylor series expansion. A simulation study and an application to bone marrow transplantation data illustrate the performance of the proposed model.
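
For concreteness, the sketch below evaluates one common additive-multiplicative frailty intensity of the form lambda_k(t | x, z, nu) = nu * (lambda0_k(t) * exp(beta_k' x) + gamma_k' z); the functional form, parameter names, and numbers are assumptions for illustration, not necessarily the exact model of the paper.

import numpy as np

def intensity(t, x, z, nu, beta_k, gamma_k, lambda0_k):
    """Assumed additive-multiplicative intensity for competing risk k with frailty nu."""
    multiplicative = lambda0_k(t) * np.exp(beta_k @ x)   # baseline hazard scaled by covariates x
    additive = gamma_k @ z                               # additive effect of covariates z
    return nu * (multiplicative + additive)              # the frailty scales the whole intensity

# toy evaluation with a Weibull-type baseline and made-up covariate values
value = intensity(t=2.0,
                  x=np.array([1.0, 0.5]), z=np.array([0.3]),
                  nu=1.2,
                  beta_k=np.array([0.4, -0.2]), gamma_k=np.array([0.1]),
                  lambda0_k=lambda s: 0.5 * s**0.5)
print("intensity at t = 2:", value)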

Shahrokh Hashemi-Bosra, Ebrahim Salehi,
Volume 11, Issue 1 (9-2017)
Abstract

The (n-k+1)-out-of-n systems are an important class of coherent systems with many applications in various areas of engineering. In this paper, the general inactivity time of the failed components of an (n-k+1)-out-of-n system is studied given that the system fails at time t>0. First, we consider a parallel system consisting of two exchangeable components and, using the Farlie-Gumbel-Morgenstern copula, investigate the behavior of the mean inactivity time of the failed components of the system. Next, (n-k+1)-out-of-n systems with exchangeable components are considered, and some stochastic ordering properties of the general inactivity time of these systems are presented based on one or two samples.
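
The Monte Carlo sketch below only illustrates the setting (two exchangeable component lifetimes linked by an FGM copula with standard exponential margins, and a parallel system observed to be down by time t); the dependence parameter, margins, and conditioning event are assumptions for illustration, not the paper's derivations.

import numpy as np

rng = np.random.default_rng(1)

def sample_fgm(theta, size):
    """Rejection sampling from the FGM copula density c(u,v) = 1 + theta*(1-2u)*(1-2v)."""
    out = np.empty((size, 2))
    filled = 0
    bound = 1.0 + abs(theta)                       # upper bound of the copula density
    while filled < size:
        u, v, w = rng.uniform(size=(3, size))
        dens = 1.0 + theta * (1.0 - 2.0 * u) * (1.0 - 2.0 * v)
        keep = w < dens / bound
        take = min(keep.sum(), size - filled)
        out[filled:filled + take] = np.column_stack((u[keep], v[keep]))[:take]
        filled += take
    return out

theta, t = 0.5, 2.0
u = sample_fgm(theta, 200_000)
T = -np.log(1.0 - u)                               # exponential(1) marginal lifetimes
failed_system = T.max(axis=1) <= t                 # parallel system already down by time t
inactivity = t - T[failed_system]                  # both components have failed by t
print("estimated mean inactivity time of failed components:", inactivity.mean())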


Masumeh Ghahramani, Maryam Sharafi, Reza Hashemi,
Volume 16, Issue 1 (9-2022)
Abstract

One of the most critical challenges in progressively Type-II censored data is determining the removal plan, which can be fixed or random, the latter chosen according to a discrete probability distribution. This paper first introduces two discrete joint distributions for random removals in which the lifetimes follow the two-parameter Weibull distribution. The proposed scenarios are based on the normalized spacings of exponential progressively Type-II censored order statistics. The expected total test time is obtained under the proposed approaches. Parameter estimates are derived using different estimation procedures: maximum likelihood, maximum product of spacings, and least squares. The proposed random removal schemes are then compared with the discrete uniform, binomial, and fixed removal schemes via a Monte Carlo simulation study, in terms of the biases and root mean squared errors of the estimators and the expected experiment times. The ratio of the expected experiment time under progressive Type-II censoring to that of the complete sampling plan is also discussed.
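
As a toy version of the sampling setup (assumed here: two-parameter Weibull lifetimes and binomial random removals, with all remaining units withdrawn at the last failure; this is illustrative and not the authors' proposed spacing-based schemes), the sketch below simulates one progressively Type-II censored sample and reports its total test time.

import numpy as np

rng = np.random.default_rng(2)

def progressive_type2(n, m, shape, scale, p):
    """Return the m observed failure times and the removal counts R_1..R_m."""
    alive = list(rng.weibull(shape, size=n) * scale)      # latent lifetimes of all n units
    times, removals = [], []
    for i in range(m):
        alive.sort()
        times.append(alive.pop(0))                        # next failure is the smallest remaining lifetime
        max_remove = len(alive) - (m - 1 - i)             # keep enough units for the remaining failures
        r = n - m - sum(removals) if i == m - 1 else rng.binomial(max_remove, p)
        removals.append(r)
        if r > 0:                                         # withdraw r surviving units at random
            for idx in sorted(rng.choice(len(alive), size=r, replace=False), reverse=True):
                alive.pop(idx)
    return np.array(times), removals

times, R = progressive_type2(n=30, m=15, shape=1.5, scale=2.0, p=0.3)
print("removals:", R, " total test time:", times[-1])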


Bahram Haji Joudaki, Reza Hashemi, Soliman Khazaei,
Volume 17, Issue 2 (2-2024)
Abstract

In this paper, a new Dirichlet process mixture model with the generalized inverse Weibull distribution as the kernel is proposed. After specifying the prior distributions of the parameters in the proposed model, Markov chain Monte Carlo methods are applied to generate a sample from the posterior distribution of the parameters. The performance of the model is illustrated by analyzing real and simulated data sets in which some observations are right-censored. The potential of the proposed model for data clustering is also demonstrated. The results obtained indicate the acceptable performance of the introduced model.
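
A minimal sketch of the prior side of such a model is given below, assuming a truncated stick-breaking representation of the Dirichlet process and substituting scipy's standard inverse Weibull for the generalized inverse Weibull kernel; the base measure and hyperparameters are made up for illustration.

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
K, alpha = 50, 2.0                                # truncation level and DP concentration parameter

v = rng.beta(1.0, alpha, size=K)                  # stick-breaking fractions
w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))   # mixture weights w_k
shape = rng.gamma(2.0, 1.0, size=K)               # kernel shape parameters drawn from a base measure
scale = rng.gamma(2.0, 1.0, size=K)               # kernel scale parameters drawn from a base measure

t = np.linspace(0.05, 10.0, 400)
density = sum(w[k] * stats.invweibull.pdf(t, c=shape[k], scale=scale[k]) for k in range(K))
print("mass of the sampled density on (0, 10]: ~", (density * (t[1] - t[0])).sum())
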
Farzane Hashemi,
Volume 18, Issue 2 (2-2025)
Abstract

Regression models are among the most widely used statistical tools in applied research. A basic assumption in these models is the normality of the errors, which in some cases fails because of asymmetry or break points in the data. Piecewise regression models have been used widely in various fields, and detecting the break point is essential: the break points in a piecewise regression model indicate when and how the pattern of the data structure changes. A further difficulty is that such data often have heavy tails, which has been addressed by using distributions that generalize the normal distribution. In this paper, the piecewise regression model is investigated based on scale mixtures of normal distributions, and this model is compared with the standard piecewise regression model with normal errors.
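
The toy sketch below illustrates the two ingredients (generating heavy-tailed errors through the scale-mixture-of-normals representation of the Student-t, and profiling a single break point over a grid); the data-generating values and the least-squares profiling step are assumptions for illustration, not the estimation method of the paper.

import numpy as np

rng = np.random.default_rng(4)

# simulate piecewise-linear data with heavy-tailed (t_3) errors via the scale-mixture representation
n, true_break = 200, 0.6
x = np.sort(rng.uniform(0, 1, n))
w = rng.gamma(1.5, scale=1 / 1.5, size=n)          # W ~ Gamma(nu/2, rate nu/2) with nu = 3
eps = rng.normal(0, 0.2 / np.sqrt(w))              # eps | W ~ N(0, sigma^2 / W)  =>  eps ~ scaled t_3
y = 1.0 + 2.0 * x + 3.0 * np.maximum(x - true_break, 0) + eps

def sse(tau):
    """Sum of squared errors of a hinge regression with break point tau."""
    X = np.column_stack((np.ones(n), x, np.maximum(x - tau, 0)))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2)

grid = np.linspace(0.2, 0.8, 61)                   # candidate break points
tau_hat = grid[np.argmin([sse(tau) for tau in grid])]
print("estimated break point:", tau_hat, " (true:", true_break, ")")
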
Bahram Haji Joudaki, Soliman Khazaei, Reza Hashemi,
Volume 19, Issue 1 (9-2025)
Abstract

Accelerated failure time models are used in survival analysis when the data are censored, especially when combined with auxiliary variables. When such models depend on an unknown parameter, one applicable approach is Bayesian nonparametric methods, which treat the parameter space as infinite-dimensional. In this framework, the Dirichlet process mixture model plays an important role. In this paper, a Dirichlet process mixture model with the Burr XII distribution as the kernel is considered for modeling the survival distribution in the accelerated failure time model. MCMC methods are then employed to generate samples from the posterior distribution. The performance of the proposed model is compared with that of Polya tree mixture models based on simulated and real data. The results obtained show that the proposed model performs better.
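
For orientation, the sketch below simulates right-censored data from an assumed log-linear accelerated failure time structure with a Burr XII baseline (using scipy's burr12); the covariate effects, the censoring mechanism, and the single parametric kernel standing in for the Dirichlet process mixture are all illustrative assumptions.

import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n, beta = 500, np.array([0.8, -0.5])

X = rng.normal(size=(n, 2))                                        # covariates
T0 = stats.burr12.rvs(c=2.0, d=3.0, size=n, random_state=rng)      # baseline Burr XII lifetimes
T = np.exp(X @ beta) * T0                                          # AFT: covariates rescale time
C = rng.exponential(scale=np.median(T) * 2, size=n)                # independent right-censoring times
time, event = np.minimum(T, C), (T <= C).astype(int)               # observed data
print("events observed:", event.sum(), "of", n)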


Journal of Statistical Sciences – Scientific Research Journal of the Iranian Statistical Society
