:: Search published articles ::
Showing 237 results for Type of Study: Research

Hossein Baghishani, Mohammad Mahdi Tabatabaei,
Volume 1, Issue 1 (9-2007)
Abstract

In parameter-driven models, the main problems are likelihood approximation and parameter estimation. One approach to this problem is to apply simpler likelihoods, such as the composite likelihood. In this paper, we first introduce parameter-driven models and the composite likelihood, and then define a new model selection criterion based on the composite likelihood. Finally, we demonstrate the capability of the composite likelihood for inference and accurate model selection in parameter-driven models through a simulation study.
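
As a much simpler illustration of the composite (pairwise) likelihood idea, not the authors' parameter-driven setting, the following sketch estimates the autocorrelation of a stationary Gaussian AR(1) series by maximizing a pairwise likelihood built from adjacent pairs; all names and settings are illustrative.

import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# Stationary Gaussian AR(1) with unit marginal variance
rho_true, n = 0.6, 500
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):
    x[t] = rho_true * x[t - 1] + rng.normal(scale=np.sqrt(1 - rho_true**2))

def neg_pairwise_loglik(rho):
    # Sum of bivariate-normal log-densities over adjacent pairs
    u, v = x[:-1], x[1:]
    q = (u**2 - 2 * rho * u * v + v**2) / (1 - rho**2)
    return 0.5 * np.sum(np.log(1 - rho**2) + q)  # additive constants dropped

res = minimize_scalar(neg_pairwise_loglik, bounds=(-0.99, 0.99), method="bounded")
print("pairwise composite-likelihood estimate of rho:", res.x)
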
Mostafa Razmkhah, Jafar Ahmadi, Bahareh Khatib Astaneh,
Volume 1, Issue 1 (9-2007)
Abstract

A sequence of observations in which only successive minimum (maximum) values are observed is called a sequence of record values. One sampling scheme for generating record values is the inverse sampling scheme, in which items are presented sequentially and sampling is terminated when the n-th record is observed. Under this plan the expected inter-record times are infinite, so in practice only a few records are observed. Under the assumption that the process of observing record values can be replicated, one may repeat the inverse sampling plan to achieve a specified number of records. In the latter scheme, we assume m independent samples are obtained sequentially from the parent distribution and only record data are observed. The two sampling plans (consecutive and repetition) are compared with respect to the Fisher information contained in the extracted record data, and general results are obtained. The proposed procedure is illustrated for several lifetime distributions, such as the exponential, Burr XII and Weibull.
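
A minimal simulation of the inverse sampling scheme for upper records described here (function name and settings are illustrative): draw observations one at a time and stop once n records have occurred, returning the record values and their occurrence times.

import numpy as np

rng = np.random.default_rng(1)

def inverse_sampling_records(n_records, draw=lambda rng: rng.exponential()):
    """Sample sequentially until the n-th upper record is observed."""
    records, times = [], []
    current_max, count = -np.inf, 0
    while len(records) < n_records:
        count += 1
        x = draw(rng)
        if x > current_max:  # a new upper record
            records.append(x)
            times.append(count)  # index at which the record occurred
            current_max = x
    return np.array(records), np.array(times)

recs, times = inverse_sampling_records(5)
print("record values:", recs)
print("occurrence times:", times)
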
Firouzeh Rivaz, Mohsen Mohammadzadeh, Majid Jafari Khaledi,
Volume 1, Issue 1 (9-2007)
Abstract

In Bayesian prediction for a Gaussian space-time model, the unknown parameters are treated as random variables with known prior distributions, and the posterior and Bayesian predictive distributions are then approximated by a discretization method. Since prior distributions are often unknown, in this paper parametric priors are considered, and the empirical Bayes approach is used to estimate them. Substituting these estimates into the Bayesian predictive distribution yields an empirical Bayes space-time predictor and its prediction variance. An environmental example is used to illustrate the application of the proposed method. Finally, the accuracy of the empirical Bayes space-time predictor is assessed by a cross-validation criterion.
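
To illustrate the empirical Bayes idea in a far simpler setting than the space-time model (a sketch with illustrative values, not the authors' method): in a normal-normal model, the prior hyperparameters are estimated from the marginal distribution of the data and then plugged into the posterior.

import numpy as np

rng = np.random.default_rng(2)

# Model: y_i | theta_i ~ N(theta_i, s2), theta_i ~ N(mu, tau2)
s2 = 1.0
theta = rng.normal(3.0, 2.0, size=200)  # true mu = 3, tau2 = 4 (unknown to us)
y = rng.normal(theta, np.sqrt(s2))

# Empirical Bayes: marginally y_i ~ N(mu, s2 + tau2), so estimate
# the hyperparameters by the method of moments on the marginal
mu_hat = y.mean()
tau2_hat = max(y.var(ddof=1) - s2, 0.0)

# Plug-in posterior for each theta_i: shrink y_i toward mu_hat
w = tau2_hat / (tau2_hat + s2)
theta_post_mean = mu_hat + w * (y - mu_hat)
theta_post_var = w * s2  # posterior variance with plugged-in hyperparameters

print(mu_hat, tau2_hat, theta_post_mean[:3])
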
Mohammad Reza Alavi, Rahim Chinipardaz,
Volume 1, Issue 1 (9-2007)
Abstract

Classical analysis is based on random samples. In many situations, however, observations are recorded with probability proportional to a nonnegative function of their values; this sampling mechanism is called weighted sampling. The usual statistical methods based on a weighted sample may not be valid and have to be adjusted. In this paper, adjusted methods under some particular weight functions for the normal distribution are studied, and a new distribution, called the double normal distribution, is introduced as a weighted normal distribution.
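
A small sketch of weighted sampling by sampling-importance-resampling, with an illustrative weight w(x) = |x| applied to a standard normal base density; the resulting bimodal sample only illustrates the weighting mechanism, not the authors' double normal distribution.

import numpy as np

rng = np.random.default_rng(3)

def weighted_sample(base_draws, w, size, rng):
    """Resample base draws with probability proportional to w(x),
    approximating the weighted density f_w(x) = w(x) f(x) / E[w(X)]."""
    weights = w(base_draws)
    probs = weights / weights.sum()
    return rng.choice(base_draws, size=size, replace=True, p=probs)

base = rng.normal(size=100_000)  # draws from the base N(0,1) density
wsample = weighted_sample(base, np.abs, 5_000, rng)
print("weighted-sample mean and variance:", wsample.mean(), wsample.var())
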
Mohammad Arashi, Mohammad Mahdi Tabatabaei,
Volume 1, Issue 2 (2-2008)
Abstract

In this paper, we obtain the generalized least squares, restricted generalized least squares and shrinkage estimators of the regression parameter vector, assuming that the errors have a multivariate t distribution. We also calculate their quadratic risks and propose a dominance ordering of the underlying estimators.
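
For concreteness, the generalized least squares estimator and its restricted version under a linear constraint R beta = r (a generic sketch with illustrative data; the paper's shrinkage estimators and t-distributed errors are beyond this snippet).

import numpy as np

rng = np.random.default_rng(12)

# Design, error covariance and data (illustrative)
n, p = 50, 3
X = rng.normal(size=(n, p))
Sigma = 0.5 ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))  # AR(1)-type
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + np.linalg.cholesky(Sigma) @ rng.normal(size=n)

Si = np.linalg.inv(Sigma)
XtSiX_inv = np.linalg.inv(X.T @ Si @ X)

# GLS: beta_hat = (X' Sigma^-1 X)^-1 X' Sigma^-1 y
beta_gls = XtSiX_inv @ X.T @ Si @ y

# Restricted GLS under R beta = r (here: beta_1 + beta_2 = -1)
R = np.array([[1.0, 1.0, 0.0]])
r = np.array([-1.0])
adj = XtSiX_inv @ R.T @ np.linalg.solve(R @ XtSiX_inv @ R.T, R @ beta_gls - r)
beta_rgls = beta_gls - adj

print(beta_gls, beta_rgls)
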
Arezoo Habibi Rad, Naser Reza Arghami,
Volume 1, Issue 2 (2-2008)
Abstract

The sample estimate of entropy was first introduced by Vasicek (1976). In this paper, we provide an estimate of the entropy of order statistics, which is an extension of the entropy estimate. We then present an application of the entropy estimate of order statistics as a test statistic for symmetry of a distribution against skewness. The proposed test is compared with some existing tests. A Monte Carlo simulation study shows that the proposed test is more powerful than Park's (1999) test.
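
For reference, a minimal implementation of Vasicek's (1976) spacing-based entropy estimator, on which such extensions are built; the window size m is a user choice.

import numpy as np

def vasicek_entropy(x, m):
    """Vasicek (1976) estimate of differential entropy from sample spacings."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    # Window endpoints, with order statistics repeated at the boundaries
    upper = x[np.minimum(np.arange(n) + m, n - 1)]
    lower = x[np.maximum(np.arange(n) - m, 0)]
    return np.mean(np.log(n * (upper - lower) / (2 * m)))

rng = np.random.default_rng(4)
x = rng.normal(size=200)
print(vasicek_entropy(x, m=5))  # true N(0,1) entropy is about 1.4189
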
Mohammad Reza Farid Rohani, Khalil Shafiei Holighi,
Volume 1, Issue 2 (2-2008)
Abstract

In recent years, statisticians have studied the signal detection problem using random field theory. In this paper we consider point estimation of the parameters of the Gaussian scale space random field in the Bayesian approach. Since the posterior distribution of the parameters of interest does not have a closed form, we introduce a Markov chain Monte Carlo (MCMC) algorithm to approximate the Bayes estimates. We also apply the proposed procedure to real fMRI data collected by the Montreal Neurological Institute.
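
When a posterior has no closed form, a generic random-walk Metropolis sampler of the kind alluded to here can approximate posterior summaries; the target below (a normal likelihood with a Cauchy prior on the mean) is purely illustrative.

import numpy as np

rng = np.random.default_rng(5)
data = rng.normal(1.0, 1.0, size=50)

def log_post(mu):
    # Normal likelihood (known variance) plus a standard Cauchy prior on mu
    return -0.5 * np.sum((data - mu) ** 2) - np.log(1 + mu**2)

draws, mu, step = [], 0.0, 0.5
for _ in range(10_000):
    prop = mu + step * rng.normal()  # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu = prop  # accept
    draws.append(mu)

print("posterior mean estimate:", np.mean(draws[2000:]))  # discard burn-in
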
Ahmad Parsian, Shahram Azizi Sazi,
Volume 2, Issue 1 (8-2008)
Abstract

In this paper, a new class of estimators, namely constrained Bayes estimators, is obtained under the balanced loss function (BLF) and the weighted balanced loss function (WBLF) using a "Bayesian solution". The constrained Bayes estimators are calculated for the natural parameter of one-parameter exponential families of distributions. A common approach to prior uncertainty in Bayesian analysis is to choose a class $\Gamma$ of prior distributions and look for an optimal decision within the class $\Gamma$; this is known as the robust Bayesian methodology. Among several methods of choosing the optimal rules in the context of the robust Bayes method, we discuss obtaining the posterior regret constrained Gamma-minimax (PRCGM) rule under squared error loss and then, employing the "Bayesian solution", we obtain the optimal rules under BLF and WBLF.


Mojtaba Khazaei,
Volume 2, Issue 1 (8-2008)
Abstract

One of the models that can be used to study the relationship between Boolean random sets and explanatory variables is the growth regression model, which is defined by generalizing the Boolean model and allowing its grain distribution to depend on the values of explanatory variables. This model can be used to study the behavior of Boolean random sets when variation in their coverage regions is associated with variation in grain size. In this paper we make it possible to identify and fit a suitable growth model using the information available in Boolean model realizations and the values of explanatory variables. A suitable method for fitting the growth regression model is presented, and the properties of the resulting estimators are studied through a simulation study.
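
A toy simulation of a Boolean disc model in which the mean grain radius depends on an explanatory variable through a log link (an illustrative setup, not the authors' growth regression model): germs form a Poisson process on the unit square and the random set is the union of discs.

import numpy as np

rng = np.random.default_rng(6)

def boolean_disc_coverage(intensity, covariate, beta, n_grid=200):
    """Simulate a Boolean disc model on the unit square and return the
    covered fraction. Mean radius = exp(beta[0] + beta[1] * covariate)."""
    n_germs = rng.poisson(intensity)
    centers = rng.uniform(0, 1, size=(n_germs, 2))
    mean_r = np.exp(beta[0] + beta[1] * covariate)
    radii = rng.exponential(mean_r, size=n_germs)
    # Coverage indicator on a regular grid (edge effects ignored)
    g = (np.arange(n_grid) + 0.5) / n_grid
    xx, yy = np.meshgrid(g, g)
    covered = np.zeros_like(xx, dtype=bool)
    for (cx, cy), r in zip(centers, radii):
        covered |= (xx - cx) ** 2 + (yy - cy) ** 2 <= r**2
    return covered.mean()

for z in (0.0, 0.5, 1.0):  # larger covariate -> larger grains -> more coverage
    print(z, boolean_disc_coverage(intensity=60, covariate=z, beta=(-3.0, 1.0)))
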

Mitra Rahimzadeh, Ebrahim Hajizadeh, Farzad Eskandari, Soleyman Kheiri,
Volume 2, Issue 1 (8-2008)
Abstract

In survival analysis, when there is a cure fraction and the occurrence times of events are correlated, a cure frailty model is used. The main objective is to propose a method of analysis for two types of correlated frailty in the non-mixture cure model, in order to separate the individual and shared heterogeneity between subjects. Cure models with correlated frailty and promotion time are considered. In both models, the likelihood function is based on a piecewise exponential distribution for the hazard function. To estimate the parameters, hierarchical Bayesian modeling is employed. Since the posteriors have no closed form, they are estimated by MCMC algorithms. The Cox correlated frailty model is used as a benchmark, and the models are compared by the DIC criterion. The results show the superiority of the cure models with correlated frailty.
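
As background for the non-mixture (promotion time) cure model mentioned here, the population survival function has the standard form S_pop(t) = exp(-theta * F(t)), with cure fraction exp(-theta); below is a minimal sketch with an exponential latent distribution F, an illustrative choice in place of the paper's piecewise exponential hazard.

import numpy as np

def promotion_time_survival(t, theta, rate=1.0):
    """Population survival of the non-mixture (promotion time) cure model:
    S_pop(t) = exp(-theta * F(t)), here with F exponential(rate).
    The cure fraction is S_pop(inf) = exp(-theta)."""
    F = 1.0 - np.exp(-rate * np.asarray(t))
    return np.exp(-theta * F)

t = np.linspace(0, 10, 6)
print(promotion_time_survival(t, theta=1.5))  # levels off at exp(-1.5) ~ 0.223
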

Hossein Naraghi, Ali Iranmanesh,
Volume 2, Issue 1 (8-2008)
Abstract

In this paper, we first define the commutativity of two fuzzy subgroups, and then compute the probability of commutativity of the group $Z_{p^n}$ whose support is exactly $Z_{p^m}$, for $m \le n$.
Nasim Ejlali, Hamid Pezeshk,
Volume 2, Issue 2 (2-2009)
Abstract

Hidden Markov models are widely used in bioinformatics, with applications to protein sequence alignment, protein family annotation and gene finding. Baum-Welch training is an expectation-maximization algorithm for estimating the emission and transition probabilities of a hidden Markov model. For very long training sequences, even the most efficient algorithms are memory-consuming. In this paper we discuss different approaches to decreasing memory use and compare the performance of the different algorithms. In addition, we propose a bidirectional algorithm with linear memory. We apply this algorithm to simulated protein profile data to analyze the strengths and weaknesses of the algorithm.
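
For context, a sketch of the scaled forward pass that sits at the heart of Baum-Welch; a full training iteration adds a backward pass and re-estimation, and the memory-reduction schemes discussed here trade recomputation for storage. All parameter values are illustrative.

import numpy as np

def forward_scaled(A, B, pi, obs):
    """Scaled forward algorithm for a discrete-emission HMM.
    A: (K,K) transitions, B: (K,M) emissions, pi: (K,) initial probs.
    Returns the log-likelihood of the observation sequence."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()  # rescale at every step to avoid underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        loglik += np.log(c)
        alpha /= c
    return loglik

A = np.array([[0.9, 0.1], [0.2, 0.8]])
B = np.array([[0.5, 0.5], [0.1, 0.9]])
pi = np.array([0.5, 0.5])
obs = np.array([0, 1, 1, 0, 1, 1, 1, 0])
print(forward_scaled(A, B, pi, obs))
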

Nabaz Esmaeilzadeh, Hooshang Talebi,
Volume 2, Issue 2 (2-2009)
Abstract

So far, Plackett-Burman (PB) designs have been considered as saturated non-regular fractional factorial designs for screening purposes. Since the introduction of the hidden projection property of PB designs by Wang and Wu (1995), the estimation capability of such projections onto a subset of factors has been investigated by many researchers. In this paper, considering both the search and the estimation capability of a design, we introduce post-stage search designs, using the sparsity principle for factorial effects. That is, by the post-stage property of a design we mean its capability to search for and estimate possible nonzero 3-factor interactions along with estimating the general mean, the main effects and the active 2-factor interactions identified in the pre-stage. We show that the 12-run PB projections onto 4 and 5 factors are post-stage search designs.
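
A sketch of how the 12-run PB design can be generated by cycling a generator row and appending a row of minus ones, then projected onto a subset of factors; the generator row below is the commonly tabulated one, quoted from memory.

import numpy as np

# Commonly tabulated generator row for the 12-run Plackett-Burman design
gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])

# Cycle the generator 11 times, then append the all-minus row
rows = [np.roll(gen, i) for i in range(11)]
pb12 = np.vstack(rows + [-np.ones(11, dtype=int)])

print(pb12.shape)  # (12, 11): 12 runs, 11 two-level factors
# Hidden projection onto, say, the first 4 factors (columns 0..3)
proj = pb12[:, :4]
print(proj)
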

Mehdi Tazhibi, Nasollah Bashardoost, Mahboubeh Ahmadi,
Volume 2, Issue 2 (2-2009)
Abstract

Receiver operating characteristic (ROC) curves are frequently used in biomedical informatics research to evaluate classification and prediction models for decision support, diagnosis and prognosis. ROC analysis investigates the accuracy of a model's ability to separate positive from negative cases. It is especially useful for evaluating predictive models and for comparing tests that produce output values over a continuous range. The empirical ROC curve is jagged, whereas the true ROC curve is smooth; kernel smoothing is therefore used. The area under the ROC curve (AUC) is frequently used as a measure of the effectiveness of diagnostic markers. In this study we compare estimates of this area based on normal assumptions and on kernel smoothing. The study used TSH measurements from patients and non-patients in congenital hypothyroidism screening in Isfahan province, and ROC curves for TSH were fitted for infants in Isfahan. To evaluate the accuracy of the test, the AUC and its standard error were calculated. The effectiveness of the kernel method is also compared with that of other methods.
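
A minimal comparison of two AUC estimates of the kind named here: the nonparametric (Mann-Whitney) estimate and the binormal estimate Phi((mu1 - mu0) / sqrt(s0^2 + s1^2)); the data are simulated for illustration.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
neg = rng.normal(0.0, 1.0, size=300)  # marker values, non-diseased
pos = rng.normal(1.2, 1.0, size=120)  # marker values, diseased

# Nonparametric AUC = Mann-Whitney estimate of P(pos > neg)
auc_emp = ((pos[:, None] > neg[None, :]).mean()
           + 0.5 * (pos[:, None] == neg[None, :]).mean())

# Binormal AUC under normality of both groups
auc_bin = norm.cdf((pos.mean() - neg.mean())
                   / np.hypot(pos.std(ddof=1), neg.std(ddof=1)))

print(auc_emp, auc_bin)  # true value here: Phi(1.2 / sqrt(2)) ~ 0.802
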

Ehsan Zamanzadeh, Naser Arghami,
Volume 2, Issue 2 (2-2009)
Abstract

In this paper, we first introduce two new entropy estimators. These estimators are obtained by correcting Correa's (1995) estimator at the extreme points and by assigning different weights to the end points. We then compare the proposed entropy estimators with those of Vasicek (1976), Ebrahimi et al. (1994) and Correa (1995). We also introduce goodness-of-fit tests for exponentiality and normality based on the proposed entropy estimators. Results of a simulation study show that the proposed estimators and goodness-of-fit tests perform well against the leading competitors.
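
For reference, a sketch of Correa's (1995) estimator as we recall it (treat the exact form as an assumption): a local linear-regression slope is computed on each window of 2m+1 order statistics, with boundary values repeated, and the estimator averages the negative log slopes.

import numpy as np

def correa_entropy(x, m):
    """Correa (1995) local-linear-regression entropy estimator (as recalled)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    h = 0.0
    for i in range(n):
        idx = np.arange(i - m, i + m + 1)
        xw = x[np.clip(idx, 0, n - 1)]  # repeat order statistics at the ends
        num = np.sum((xw - xw.mean()) * (idx - i))
        den = n * np.sum((xw - xw.mean()) ** 2)
        h -= np.log(num / den)
    return h / n

rng = np.random.default_rng(8)
print(correa_entropy(rng.normal(size=200), m=5))  # N(0,1) entropy ~ 1.4189
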

Maliheh Abbasnejad, Marzeiyeh Shakouri,
Volume 2, Issue 2 (2-2009)
Abstract

In this paper, we establish a goodness-of-fit test for exponentiality based on estimated Renyi information. We use an estimator of the Renyi distance constructed in the manner of Correa's entropy estimate. Critical values of the test are computed by Monte Carlo simulation. We also compute the power of the test under different alternatives and show that it compares favorably with the leading competitor.

Rahman Farnoosh, Afshin Fallah, Arezoo Hajrajabi,
Volume 2, Issue 2 (2-2009)
Abstract

The modified likelihood ratio test, which is based on a penalized likelihood function, is commonly used for testing homogeneity of mixture models. The efficiency of this test is seriously affected by the shape of the penalty function used in the penalized likelihood. The penalty function is usually chosen to avoid complexity and increase tractability, so the results may be far from optimal. In this paper, we consider a more general form of penalty function that depends on a shape parameter. This shape parameter and the parameters of the mixture model are then estimated within the Bayesian paradigm. It is shown that the proposed Bayesian approach is more efficient than the modified likelihood ratio test, especially in non-identifiable situations, where frequentist approaches typically fail.
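
To make the penalized-likelihood idea concrete, a sketch of the modified log-likelihood for a two-component normal location mixture with the penalty C log(4p(1-p)), a standard choice in the homogeneity-testing literature; the data and the constant C are illustrative.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(9)
x = np.concatenate([rng.normal(0, 1, 150), rng.normal(2, 1, 50)])

def neg_penalized_loglik(params, C=1.0):
    p, mu1, mu2 = params
    mix = p * norm.pdf(x, mu1, 1) + (1 - p) * norm.pdf(x, mu2, 1)
    # The penalty C*log(4p(1-p)) keeps p away from the boundary {0, 1}
    return -(np.sum(np.log(mix)) + C * np.log(4 * p * (1 - p)))

res = minimize(neg_penalized_loglik, x0=(0.5, -0.5, 0.5),
               bounds=[(1e-3, 1 - 1e-3), (None, None), (None, None)])
print("penalized MLE (p, mu1, mu2):", res.x)
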

Masoumeh Izanloo, Arezou Habibirad,
Volume 3, Issue 1 (9-2009)
Abstract

The unified hybrid censoring scheme is a mixture of the generalized Type-I and Type-II hybrid censoring schemes. In this paper, we consider the analysis of unified hybrid censored data when the lifetime distribution of the individual items is a two-parameter generalized exponential distribution. It is observed that the maximum likelihood estimators cannot be obtained in closed form; we obtain them using the Newton-Raphson algorithm. The Fisher information matrix is obtained and can be used for constructing asymptotic confidence intervals. We also obtain the Bayes estimates of the unknown parameters under the assumption of independent gamma priors, using an importance sampling procedure. Simulations are performed to compare the performance of the different schemes, and one data set is analyzed for illustrative purposes.
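
A sketch of maximum likelihood for the two-parameter generalized exponential distribution, F(x) = (1 - exp(-lam*x))^a, on a complete sample; the censored-data likelihood in the paper is more involved, and a quasi-Newton optimizer stands in here for Newton-Raphson.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(10)

def neg_loglik(params, x):
    a, lam = params
    # f(x) = a * lam * exp(-lam*x) * (1 - exp(-lam*x))**(a - 1)
    return -np.sum(np.log(a) + np.log(lam) - lam * x
                   + (a - 1) * np.log1p(-np.exp(-lam * x)))

# Simulate via the inverse CDF: x = -log(1 - u**(1/a)) / lam
a_true, lam_true = 2.0, 1.5
u = rng.uniform(size=300)
x = -np.log(1 - u ** (1 / a_true)) / lam_true

res = minimize(neg_loglik, x0=(1.0, 1.0), args=(x,),
               bounds=[(1e-6, None), (1e-6, None)])
print("MLE (alpha, lambda):", res.x)
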
Rahim Chinipardaz, Hoda Kamranfar,
Volume 3, Issue 1 (9-2009)
Abstract

This paper studies the effect of outliers in GARCH models. Four common types of outlier are considered: additive outliers, innovation outliers, level changes and temporary changes. Each type of outlier is embedded in a GARCH model, and its effect on the model is then studied. The residuals of the models are investigated in both cases: the usual GARCH model and the GARCH model in the presence of outliers.
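
A small simulation sketch: generate a GARCH(1,1) series, inject a single additive outlier, and recompute the variance recursion, so the spike shows up in the standardized residuals; the parameter values and outlier size are illustrative.

import numpy as np

rng = np.random.default_rng(11)
omega, alpha, beta = 0.1, 0.1, 0.8  # illustrative GARCH(1,1) parameters

def garch11_variance(y):
    """Conditional-variance recursion with the (known) parameters above."""
    sig2 = np.empty(len(y))
    sig2[0] = omega / (1 - alpha - beta)  # unconditional variance
    for t in range(1, len(y)):
        sig2[t] = omega + alpha * y[t - 1] ** 2 + beta * sig2[t - 1]
    return sig2

# Simulate a clean series: y_t = sigma_t * e_t
n = 500
y, sig2 = np.empty(n), np.empty(n)
sig2[0] = omega / (1 - alpha - beta)
y[0] = np.sqrt(sig2[0]) * rng.normal()
for t in range(1, n):
    sig2[t] = omega + alpha * y[t - 1] ** 2 + beta * sig2[t - 1]
    y[t] = np.sqrt(sig2[t]) * rng.normal()

# Inject an additive outlier and recompute the variance recursion
y_out = y.copy()
y_out[250] += 10.0  # additive outlier at t = 250
resid = y_out / np.sqrt(garch11_variance(y_out))
print("largest |standardized residual| at t =", int(np.abs(resid).argmax()))
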
Ameneh Kheradmandi, Nahid Sanjari Fasipour,
Volume 3, Issue 1 (9-2009)
Abstract

Gomez et al. (2007) introduced the skew t-normal distribution, showing that it is a good alternative for modelling heavy-tailed data with a strongly asymmetric nature, especially because it has a larger range of skewness than the skew-normal distribution. Gomez et al. (2007) and Lin et al. (2009) described some properties of this distribution. In this paper, we consider some further properties of the skew t-normal distribution and present four theorems for constructing it. We then give a numerical example, modelling the vanadium pollution data from the Shadegan Wetland with the skew t-normal distribution.
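
For reference, the skew t-normal density in this literature has the form f(x) = 2 t_nu(x) Phi(lambda x), a Student-t kernel with a normal skewing factor (location and scale omitted for simplicity; quoted from memory, so treat this as a sketch).

import numpy as np
from scipy.integrate import quad
from scipy.stats import t as student_t, norm

def skew_t_normal_pdf(x, nu, lam):
    """Skew t-normal density: 2 * t_nu(x) * Phi(lambda * x)."""
    return 2.0 * student_t.pdf(x, df=nu) * norm.cdf(lam * x)

x = np.linspace(-4, 4, 9)
print(skew_t_normal_pdf(x, nu=5, lam=2.0))
# Sanity check: the density integrates to one
print(quad(lambda z: skew_t_normal_pdf(z, 5, 2.0), -np.inf, np.inf)[0])
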

Page 1 of 12

Journal of Statistical Sciences - Scientific Research Journal of the Iranian Statistical Society
