Showing 23 results for Entropy
Abdol Saeed Toomaj, Volume 18, Issue 1 (8-2024)
Abstract
In this paper, the entropy characteristics of the lifetime of coherent systems are investigated using the concept of the system signature. The results assume that the component lifetimes are independent and identically distributed. In particular, a formula for the Tsallis entropy of a coherent system's lifetime is derived and used to compare systems with the same characteristics. Bounds for the Tsallis entropy of the system lifetime are also given; these are especially useful when the system has many components or a complex structure. Finally, a criterion based on relative Tsallis entropy is proposed for selecting a preferred system among coherent systems.
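The abstract does not give the formula itself, but the standard signature-based route is to write the system lifetime density as a mixture of order-statistic densities, $f_T(t) = \sum_i s_i f_{i:n}(t)$, and evaluate the Tsallis entropy $H_q(T) = (1 - \int f_T(t)^q\,dt)/(q-1)$ numerically. The sketch below does this for a 2-out-of-3 system with standard exponential components; the system structure, component distribution, and entropic index $q$ are illustrative choices, not taken from the paper.

```python
import numpy as np
from math import comb
from scipy.integrate import quad

def order_stat_pdf(i, n, t, f, F):
    """Density of the i-th order statistic of n iid lifetimes with pdf f, cdf F."""
    return i * comb(n, i) * F(t) ** (i - 1) * (1.0 - F(t)) ** (n - i) * f(t)

def system_pdf(signature, t, f, F):
    """Mixture representation f_T(t) = sum_i s_i * f_{i:n}(t) via the signature."""
    n = len(signature)
    return sum(s * order_stat_pdf(i, n, t, f, F)
               for i, s in enumerate(signature, start=1))

def tsallis_entropy(signature, f, F, q, upper=60.0):
    """H_q(T) = (1 - integral of f_T(t)^q) / (q - 1), computed by quadrature."""
    integral, _ = quad(lambda t: system_pdf(signature, t, f, F) ** q, 0.0, upper)
    return (1.0 - integral) / (q - 1.0)

# Standard exponential components (illustrative choice)
f = lambda t: np.exp(-t)
F = lambda t: 1.0 - np.exp(-t)

sig_2_of_3 = [0.0, 2 / 3, 1 / 3]   # signature of a 2-out-of-3 system
H = tsallis_entropy(sig_2_of_3, f, F, q=2.0)
```

For a single exponential component ($q = 2$) the integral is $1/2$, so $H_2 = 1/2$, which gives a quick sanity check on the quadrature.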
Hadi Alizadeh Noughabi, Majid Chahkandi, Volume 19, Issue 2 (3-2026)
Abstract
In today’s industrial world, effective maintenance plays a key role in reducing costs and improving productivity. This paper introduces goodness-of-fit tests based on information measures, including entropy, extropy, and varentropy, for assessing the type of repair in repairable systems. Using system-age data recorded after each repair, the tests examine the adequacy of the arithmetic reduction of age model of order 1 (ARA1). The power of the proposed tests is compared with that of classical tests based on martingale residuals and the probability integral transform. Simulation results show that the proposed tests are better at identifying imperfect-repair models, and an application to real data on vehicle failures indicates that the ARA1 model provides a good fit.
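The abstract does not give the test statistics, but an entropy-based goodness-of-fit test of the kind described is typically built on the probability integral transform: under the hypothesized repair model, the transformed observations should be uniform on (0, 1), whose differential entropy (zero) is maximal among distributions on the unit interval, so an unusually small sample entropy signals misfit. The sketch below uses Vasicek's spacing estimator on PIT values; the estimator, window size, and simulated data are standard illustrative choices, not necessarily those of the paper.

```python
import numpy as np

def vasicek_entropy(x, m):
    """Vasicek's spacing-based entropy estimator with window m.

    H = (1/n) * sum_i log( n * (X_(i+m) - X_(i-m)) / (2m) ),
    with order statistics clamped to the sample range at the edges.
    """
    n = len(x)
    xs = np.sort(x)
    idx_hi = np.minimum(np.arange(n) + m, n - 1)
    idx_lo = np.maximum(np.arange(n) - m, 0)
    return np.mean(np.log(n * (xs[idx_hi] - xs[idx_lo]) / (2.0 * m)))

rng = np.random.default_rng(0)
# Under the null, the PIT of the fitted repair model applied to the observed
# ages is U(0,1); here a uniform sample stands in for those PIT values.
u_null = rng.uniform(size=500)
h = vasicek_entropy(u_null, m=15)
# Decision rule (sketch): reject the hypothesized model when h falls far
# below 0, calibrating the critical value by simulation under the null.
```

In practice the null distribution of the statistic depends on the fitted ARA1 parameters, so critical values would be obtained by parametric bootstrap rather than from a fixed table.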
Dr Alireza Pakgohar, Dr Soheil Shokri, Volume 20, Issue 1 (9-2026)
Abstract
This study investigates the wavelet energy distribution in high-frequency fractal systems and analyzes its characteristics using information-theoretic measures. The main innovation of this paper lies in modeling the wavelet energy distribution ($p_j$) using a truncated geometric distribution and incorporating the concept of extropy to quantify system complexity. It is demonstrated that this distribution is strongly influenced by the fractal parameter $\alpha$ and the number of decomposition levels $M$. By computing wavelet entropy and extropy as measures of disorder and information, respectively, the study provides a quantitative analysis of the complexity of these systems. The paper further examines key properties of this distribution, including its convergence to geometric, uniform, and degenerate distributions under limiting conditions (e.g., $M \to \infty$ or $\alpha \to 0$). Results indicate that entropy and extropy serve as complementary tools for a comprehensive description of system behavior: while entropy measures disorder, extropy reflects the degree of information and certainty. This approach establishes a novel framework for analyzing real-world signals with varying parameters and holds potential applications in the analysis of fractal signals and modeling of complex systems in fields such as finance and biology.
To validate the theoretical findings, synthetic fractal signals (fractional Brownian motion) with varying fractal parameters ($\alpha$) and decomposition levels ($M$) were simulated. Numerical results show that wavelet entropy increases significantly with the number of decomposition levels ($M$), whereas extropy exhibits slower growth and saturates at higher decomposition levels. These findings underscore the importance of selecting an appropriate decomposition level. The proposed combined framework offers a powerful tool for analyzing and modeling complex, non-stationary systems in domains such as finance and biology.
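A truncated geometric model with the limiting behaviour described above can be sketched by taking $p_j \propto r^j$ for $j = 1, \dots, M$ with ratio $r = 2^{-\alpha}$; this parameterization is an assumption consistent with the stated limits ($\alpha \to 0$ gives the uniform case, large $\alpha$ the degenerate case, $M \to \infty$ the geometric case), not necessarily the paper's exact form. The sketch computes the relative wavelet energies together with their Shannon entropy and discrete extropy, $J(p) = -\sum_j (1 - p_j)\log(1 - p_j)$.

```python
import numpy as np

def wavelet_energy_dist(alpha, M):
    """Assumed truncated-geometric energy model: p_j proportional to r**j,
    j = 1..M, with r = 2**(-alpha)."""
    r = 2.0 ** (-alpha)
    w = r ** np.arange(1, M + 1)
    return w / w.sum()

def shannon_entropy(p):
    """Shannon (wavelet) entropy H = -sum p_j log p_j, skipping zero terms."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def extropy(p):
    """Discrete extropy J = -sum (1 - p_j) log(1 - p_j), skipping zero terms."""
    q = 1.0 - p
    q = q[q > 0]
    return -np.sum(q * np.log(q))

p = wavelet_energy_dist(alpha=0.5, M=8)
H, J = shannon_entropy(p), extropy(p)
```

With this parameterization the near-uniform case ($\alpha \approx 0$) has entropy $\log M$, which grows without bound in $M$, while its extropy $-(M-1)\log(1 - 1/M)$ stays below 1 and saturates as $M$ grows — mirroring the contrast between entropy growth and extropy saturation reported in the simulations.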