Showing 17 results for Ahmadi
Mostafa Razmkhah, Jafar Ahmadi, Bahareh Khatib Astaneh, Volume 1, Issue 1 (9-2007)
Abstract
A sequence of observations in which only successive minimum (maximum) values are observed is called a sequence of record values. One sampling scheme for generating record values is inverse sampling: items are presented sequentially and sampling terminates when the n-th record is observed. Under this plan the expected inter-record times are infinite, so in practice only a few records are obtained. If the process of observing record values can be replicated, one may repeat the inverse sampling plan until a specified number of records is achieved. In the latter scheme, we assume m independent samples are drawn sequentially from the parent distribution and only record data are observed. The two sampling plans (consecutive and repetition) are compared with respect to the Fisher information contained in the extracted record data, and general results are obtained. The proposed procedure is illustrated for several lifetime distributions, including the exponential, Burr XII and Weibull.
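The inverse sampling scheme described above is easy to simulate. The following is a minimal sketch (not the authors' code) assuming upper records drawn from a standard exponential parent; the function name and defaults are ours.

```python
import random

def inverse_sampling_records(n, rng=None):
    """Draw from Exp(1) sequentially; stop when the n-th upper record appears.

    Illustrative sketch: observations arrive one at a time and only new
    maxima (upper records) are retained, as in the inverse sampling plan.
    """
    rng = rng or random.Random(0)
    records = []
    current_max = float("-inf")
    while len(records) < n:
        x = rng.expovariate(1.0)
        if x > current_max:          # a new upper record is observed
            current_max = x
            records.append(x)
    return records

recs = inverse_sampling_records(5)
```

Note that the while-loop may run for a long time before the n-th record arrives, which is the practical consequence of the infinite expected inter-record times mentioned in the abstract.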
Mehdi Tazhibi, Nasollah Bashardoost, Mahboubeh Ahmadi, Volume 2, Issue 2 (2-2009)
Abstract
Receiver operating characteristic (ROC) curves are frequently used in biomedical informatics research to evaluate classification and prediction models for decision support, diagnosis, and prognosis. ROC analysis investigates a model's accuracy and its ability to separate positive from negative cases, and is especially useful for evaluating predictive models and comparing tests that produce output on a continuous scale. The empirical ROC curve is jagged, whereas the true ROC curve is smooth; kernel smoothing is therefore used. The area under the ROC curve (AUC) is a common measure of the effectiveness of diagnostic markers. In this study we compare estimates of this area based on normality assumptions with those based on kernel smoothing. The data are TSH measurements from patients and non-patients in congenital hypothyroidism screening in Isfahan province. Using this method, TSH ROC curves for infants in Isfahan were fitted, and the AUC and its standard error were calculated to evaluate the accuracy of the test. The effectiveness of the kernel methods in comparison with other methods is also shown.
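To make the empirical-versus-kernel contrast concrete, here is a small illustrative sketch (not the study's code, and synthetic data rather than the TSH measurements): the empirical AUC is the Mann-Whitney proportion of concordant pairs, while a Gaussian-kernel version replaces the hard indicator with a smooth CDF. The bandwidth h and all names are our own assumptions.

```python
import math
import random

def auc_empirical(pos, neg):
    """Mann-Whitney estimate of P(X_pos > X_neg); ties count one half."""
    total = 0.0
    for x in pos:
        for y in neg:
            total += 1.0 if x > y else (0.5 if x == y else 0.0)
    return total / (len(pos) * len(neg))

def auc_kernel(pos, neg, h=0.5):
    """Smooth AUC: the indicator is replaced by a Gaussian kernel CDF."""
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    total = sum(phi((x - y) / h) for x in pos for y in neg)
    return total / (len(pos) * len(neg))

rng = random.Random(1)
neg = [rng.gauss(0.0, 1.0) for _ in range(200)]   # non-diseased scores
pos = [rng.gauss(1.0, 1.0) for _ in range(200)]   # diseased scores
```

For this binormal toy example the true AUC is Phi(1/sqrt(2)), roughly 0.76, and the two estimates should land close to it and to each other.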
Mohammadvali Ahmadi, Majid Sarmad, Volume 3, Issue 2 (3-2010)
Abstract
Because of the importance and popularity of the normal distribution, samples from this distribution are considered and outliers are identified using cut-off values that depend on the sample size. A decision problem is formulated to obtain the optimal cut-off value, and it is solved by a simulation study with a minimax rule.
Elham Zamanzadeh, Jafar Ahmadi, Volume 5, Issue 1 (9-2011)
Abstract
In this paper, a brief introduction to ranked set sampling is first presented. Then, confidence intervals for a quantile of the parent distribution are constructed based on an ordered ranked set sample. Because the corresponding confidence coefficient is a step function, the exact prescribed level may not be attainable. With this in mind, we introduce a new method and show that an optimal confidence interval can be obtained by applying the proposed approach. We also compare the proposed scheme with existing methods.
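Two ingredients of the abstract can be sketched in a few lines: generating one ranked set sampling cycle (assuming perfect judgment ranking), and the step-function confidence coefficient of a distribution-free quantile interval built from order statistics. This is an illustrative sketch under our own assumptions, not the paper's procedure.

```python
import random
from math import comb

def ranked_set_sample(draw, k, rng):
    """One RSS cycle with set size k: keep the i-th ranked unit of the i-th set."""
    sample = []
    for i in range(k):
        ranked = sorted(draw(rng) for _ in range(k))
        sample.append(ranked[i])      # judgment ranking assumed perfect
    return sample

def conf_coeff(n, r, s, p):
    """Confidence coefficient of (X_(r), X_(s)) for the p-th quantile, i.i.d. sample of size n.

    P(X_(r) <= x_p <= X_(s)) = sum_{j=r}^{s-1} C(n, j) p^j (1-p)^(n-j);
    as r, s vary this takes only finitely many values -- a step function.
    """
    return sum(comb(n, j) * p ** j * (1 - p) ** (n - j) for j in range(r, s))

rng = random.Random(2)
rss = ranked_set_sample(lambda r: r.expovariate(1.0), 5, rng)
```

For example, conf_coeff(10, 3, 8, 0.5) equals 912/1024, about 0.89, and no choice of r and s hits 0.95 exactly, which is the difficulty the paper's method addresses.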
Amal Saki Malehi, Ebrahim Hajizadeh, Kambiz Ahmadi, Volume 6, Issue 1 (8-2012)
Abstract
Survival analysis methods are usually based on the assumption that the population is homogeneous. In most cases, however, this assumption is unrealistic because of unobserved risk factors or subject-specific random effects. Disregarding this heterogeneity leads to biased results, so a frailty model, a type of mixed model, is used to adjust for the uncertainty that observed factors cannot explain. In this paper, the family of power variance function distributions, which includes the gamma and inverse Gaussian distributions, is introduced and evaluated for frailty effects. Finally, proportional hazards frailty models with a Weibull baseline hazard are used as parametric models to analyze survival data from colorectal cancer patients.
Mohamad Bayat, Jafar Ahmadi, Volume 6, Issue 2 (2-2013)
Abstract
Nowadays, various types of censoring plans are used in studies of lifetime engineering systems and industrial experiments. In this paper, using the idea of Cramer and Iliopoulos (2010), an adaptive progressive Type-I censoring scheme is introduced. It is assumed that the next censoring number is a random variable depending on the previous censoring numbers, failure times, and censoring times. General distributional results are obtained in explicit analytic form. It is shown that the maximum likelihood estimators coincide with those of deterministic progressive Type-I censoring. Finally, to illustrate and compare, a simulation study is carried out for the one-parameter exponential distribution.
Reza Alizadeh Noughabi, Jafar Ahmadi, Volume 6, Issue 2 (2-2013)
Abstract
In some practical problems, obtaining observations for the variable of interest is costly and time consuming, so sampling schemes that reduce cost and increase efficiency are worthwhile. In such cases, ranked set sampling is a suitable alternative to simple random sampling. In this paper, the problem of Bayes estimation of the parameter of the Pareto distribution under squared error and LINEX loss functions is studied. Using Monte Carlo simulation, the Bayes risks of the estimators are computed and compared for both sampling methods, namely simple random sampling and ranked set sampling. Finally, the efficiency of the obtained estimators is illustrated using a real data set. The results demonstrate the superiority of the ranked set sampling scheme; we therefore recommend ranked set sampling whenever possible.
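For the Pareto shape parameter with known scale, a gamma prior is conjugate, so both Bayes estimators in the abstract have closed forms. The sketch below is a hedged illustration under our own choices (gamma prior, simple random sample, hyperparameters a, b and LINEX constant c are all assumptions): the squared-error estimate is the posterior mean, and the LINEX estimate is -(1/c) log E[exp(-c*theta) | data].

```python
import math
import random

def pareto_bayes(data, x0, a=1.0, b=1.0, c=1.0):
    """Bayes estimates of the Pareto shape theta with known scale x0.

    Gamma(a, b) prior on theta is conjugate: posterior is Gamma(a+n, b+T)
    with T = sum log(x_i / x0).  Squared-error estimate = posterior mean;
    LINEX estimate uses the Gamma moment generating function.
    """
    n = len(data)
    T = sum(math.log(x / x0) for x in data)
    alpha, beta = a + n, b + T
    est_se = alpha / beta                                # posterior mean
    est_linex = (alpha / c) * math.log(1.0 + c / beta)   # -(1/c) log E[e^{-c theta}]
    return est_se, est_linex

rng = random.Random(3)
theta, x0 = 2.0, 1.0
# Inverse-CDF sampling from Pareto(theta, x0): X = x0 * U^{-1/theta}
data = [x0 * (1.0 - rng.random()) ** (-1.0 / theta) for _ in range(500)]
est_se, est_linex = pareto_bayes(data, x0)
```

With c > 0 the LINEX estimate is always below the posterior mean, reflecting its asymmetric penalty on overestimation.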
Fatemeh Safaei, Jafar Ahmadi, Volume 9, Issue 1 (9-2015)
Abstract
Consider a repairable system in which two types of failures occur with different rate functions, and the choice between minimal repair and replacement depends on the failure type. The length of the replacement cycle is optimized in terms of the cost function and the concept of discounted cost. In this paper, the optimal replacement cycles of two such repairable systems are compared based on their failure rate functions and the probability of minimal repair. Based on our results, one can choose the period between replacement times. Numerical examples and a simulation study illustrate the results.
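The trade-off behind an optimal replacement cycle can be illustrated with the classical periodic-replacement-with-minimal-repair policy (a simpler setting than the paper's two-system comparison, chosen only for illustration): the long-run cost rate is expected cost per cycle over cycle length, and for a Weibull failure rate the minimizer has a closed form. All parameter names below are ours.

```python
def cost_rate(T, cr, cm, beta, eta):
    """Long-run average cost per unit time for replacement every T.

    Minimal repairs occur at the failure rate between replacements, so the
    expected number of repairs per cycle equals the Weibull cumulative
    hazard H(T) = (T/eta)^beta; renewal-reward gives (cr + cm*H(T)) / T.
    """
    H = (T / eta) ** beta
    return (cr + cm * H) / T

def optimal_T(cr, cm, beta, eta):
    """Closed-form minimizer of cost_rate, valid for beta > 1 (wear-out)."""
    return eta * (cr / (cm * (beta - 1.0))) ** (1.0 / beta)

T_star = optimal_T(cr=10.0, cm=1.0, beta=2.0, eta=1.0)  # = sqrt(10)
```

Setting the derivative of the cost rate to zero gives cm*(beta-1)*(T/eta)^beta = cr, hence the closed form; for beta <= 1 replacement is never optimal, since the failure rate is not increasing.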
Fatemeh Hooti, Jafar Ahmadi, Volume 10, Issue 1 (8-2016)
Abstract
In this paper, the quantile function is recalled and some reliability measures are rewritten in terms of it. Next, the quantile-based dynamic cumulative residual entropy is obtained and some of its properties are presented. Then, characterization results for the uniform, exponential and Pareto distributions based on the quantile-based dynamic cumulative entropy are provided. A simple estimator is also proposed and its performance is studied for the exponential distribution. The paper closes with a discussion of the results.
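As a point of reference for the entropy being estimated, the (static) cumulative residual entropy -∫ S(x) log S(x) dx of an Exp(λ) distribution equals 1/λ, and a plug-in estimate using the empirical survival function is a few lines of code. This is our own illustrative estimator, not necessarily the one proposed in the paper.

```python
import math
import random

def cre_estimate(sample):
    """Plug-in cumulative residual entropy: -integral of S*log(S) dx.

    The empirical survival function is constant between consecutive order
    statistics, so the integral reduces to a finite sum over the gaps.
    """
    xs = sorted(sample)
    n = len(xs)
    total = 0.0
    for i in range(n - 1):
        s = (n - i - 1) / n           # empirical survival on (xs[i], xs[i+1])
        if s > 0:
            total -= s * math.log(s) * (xs[i + 1] - xs[i])
    return total

rng = random.Random(4)
lam = 2.0
sample = [rng.expovariate(lam) for _ in range(5000)]
est = cre_estimate(sample)            # should be near 1/lam = 0.5
```

The exponential case is convenient precisely because the target value 1/λ is known exactly, which is presumably why the paper uses it to study estimator performance.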
Jafar Ahmadi, Mansoureh Razmkhah, Volume 11, Issue 1 (9-2017)
Abstract
Consider a repairable system that starts operating at t=0. Once the system fails, it is immediately replaced by another of the same type, or it is repaired and returned to operation. In this paper, the system's activity is studied from a time t>0 for a fixed window of length w, under different replacement policies. In each case, the probability model and the likelihood function of the repair process, termed window-censored data, are obtained. The results depend on the lifetime distribution of the original system, so expressions for the maximum likelihood estimator and the Fisher information are derived under the assumption that the lifetime follows an exponential distribution.
Jafar Ahmadi, Fatemeh Hooti, Volume 13, Issue 2 (2-2020)
Abstract
In survival studies, frailty models are used to explain unobserved heterogeneity in hazards. They are usually formulated as the product of a function of the frailty random variable and the baseline hazard rate, which is convenient for right-censored data. In this paper, the frailty model is instead formulated as the product of the frailty random variable and the baseline reversed hazard rate, which can be used for left-censored data. The general reversed hazard rate frailty model is introduced, and the distributional properties of the proposed model and the lifetime random variables are studied. Some dependence properties between the lifetime and frailty random variables are investigated, and it is shown that certain stochastic orderings on frailty random variables are preserved by the lifetime variables. Several theorems are used to obtain numerical results. The application of the proposed model to the analysis of left-censored data is discussed, and the results are used to model lung cancer data.
Reza Ahmadi, Volume 14, Issue 1 (8-2020)
Abstract
We propose an integrated approach to decision making about the repair and maintenance of deteriorating systems whose failures are detected only by inspection. Periodic inspections reveal the true state of the system's components, and preventive and corrective maintenance actions are carried out in response to the observed state. Assuming a threshold-type policy, the paper aims to minimize the long-run average maintenance cost per unit time by determining appropriate inspection intervals and a maintenance threshold. Using the renewal reward theorem, the expected cost per cycle and the expected cycle length emerge as solutions of equations, and a recursive scheme is devised to solve them. We demonstrate the procedure, and its advantage over specific special cases, when the components' lifetimes follow a Weibull distribution. A sensitivity analysis is also performed to determine the impact of the model's parameters. Attention is restricted to perfect repair and inspection, but the structure allows different scenarios to be explored.
Seyede Toktam Hosseini, Jafar Ahmadi, Volume 14, Issue 2 (2-2021)
Abstract
In this paper, using the idea of the inaccuracy measure from information theory, residual and past inaccuracy measures in the bivariate case are defined based on copula functions. Under the assumption of radial symmetry, the equality of these two criteria is shown; conversely, radially symmetric models are characterized by this equality. A useful bound is provided by assuming proportional (reversed) hazard rate models for the marginal distributions. The bivariate proportional hazard rate model is also characterized by assuming proportionality between the introduced inaccuracy and its corresponding entropy. In addition, orthant orders are used to obtain inequalities. Examples and simulations illustrate the results.
Motahare Zaeamzadeh, Jafar Ahmadi, Bahareh Khatib Astaneh, Volume 15, Issue 2 (3-2022)
Abstract
In this paper, a lifetime model based on series systems with a random number of components from the family of power series distributions is considered. First, some basic theoretical results are obtained and used to optimize the number of components in the series system. The mean lifetime of the system, the cost function, and the total time on test are used as objective functions in the optimization. The problem is investigated in detail when the component lifetimes have a Weibull distribution and the number of components has a geometric, logarithmic, or zero-truncated Poisson distribution. The results are given analytically and numerically, and a real data set illustrates them.
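One configuration from the abstract, Weibull component lifetimes with a geometric number of components, is easy to simulate: the system lifetime is the minimum of N i.i.d. Weibull draws. The sketch below is an illustration under our own parameter choices, not the paper's analysis; more components (smaller geometric success probability p) should shorten the mean system lifetime.

```python
import random

def series_lifetime(p, shape, scale, rng):
    """Lifetime of a series system: T = min of N i.i.d. Weibull lifetimes.

    N follows a geometric distribution on {1, 2, ...}: the number of
    Bernoulli(p) trials up to and including the first success.
    """
    n = 1
    while rng.random() > p:
        n += 1
    return min(rng.weibullvariate(scale, shape) for _ in range(n))

rng = random.Random(5)
sims = 4000
# E[N] = 1/p: p = 0.8 gives few components, p = 0.2 gives many
mean_few = sum(series_lifetime(0.8, 1.5, 1.0, rng) for _ in range(sims)) / sims
mean_many = sum(series_lifetime(0.2, 1.5, 1.0, rng) for _ in range(sims)) / sims
```

Such simulations complement the closed-form results: the survival function of T is the probability generating function of N evaluated at the component survival function.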
Doctor Masoumeh Akbari, Mrs Arefeh Kasiri, Doctor Kambiz Ahmadi, Volume 17, Issue 1 (9-2023)
Abstract
In this paper, quantile-based dynamic cumulative residual and failure extropy measures are introduced. To present their applications, a simulation study is first used to select a suitable estimator of these measures from among several candidates. Then, based on the equality of two extropy measures in terms of order statistics, symmetric continuous distributions are characterized. In this regard, a measure of deviation from symmetry is introduced and its use is demonstrated on a real example. Among the common continuous distributions, the generalized Pareto distribution, and consequently the exponential distribution, are characterized, and based on these results a criterion for exponentiality of a distribution is proposed.
Ali Khosravi Tanak, M. Fashandi, J. Ahmadi, M. Najafi, Volume 17, Issue 2 (2-2024)
Abstract
Record values have many applications in reliability theory, such as shock models and minimal repair models, and much work has been done on records in the classical model. In this paper, records are studied under the geometric random record model. The concept of the mean residual life of records is defined in the random record model, and some of its properties are investigated in the geometric case. It is then shown that the parent distribution can be characterized by the sequence of mean residual lives of records in a geometric random model. Finally, an application of the characterization results to job search models in labor economics is mentioned.
Jalal Chachi, Mohammadreza Akhond, Shokoufeh Ahmadi, Volume 18, Issue 2 (2-2025)
Abstract
The Lee-Carter model is a useful dynamic stochastic model representing the evolution of central mortality rates over time. It considers only the uncertainty in the coefficient related to the mortality trend over time, not in the age-dependent coefficients. This paper proposes a fuzzy extension of the Lee-Carter model that quantifies the uncertainty of both kinds of parameters. The variability of the time-dependent index is modeled as a stochastic fuzzy time series, and the uncertainty of the age-dependent coefficients is quantified using triangular fuzzy numbers. This last hypothesis requires developing and solving a fuzzy regression model. Once the generalization of the desired fuzzy model is introduced, we show how to fit the logarithm of the central mortality rate in Khuzestan province using fuzzy number arithmetic for the years 1383-1401 and to produce random fuzzy forecasts for the years 1402-1406.