:: Search published articles ::
Showing 2 results for High-Dimensional Data

Dariush Najarzadeh,
Volume 13, Issue 1 (9-2019)
Abstract

Testing the hypothesis of independence among the subvectors of a p-variate vector, as a pretest for many other related tests, is always of interest. When the sample size n is much larger than the dimension p, the likelihood ratio test (LRT) with a chi-square approximation has acceptable performance. However, for moderately high-dimensional data, where n is not much larger than p, the chi-square approximation to the null distribution of the LRT statistic is no longer usable. As a general case, a procedure for simultaneously testing subvector independence in all of k p-variate normal distributions is considered here. To test this hypothesis, a normal approximation to the null distribution of the LRT statistic was proposed. A simulation study was performed to show that the proposed normal approximation outperforms the chi-square approximation. Finally, the proposed testing procedure was applied to prostate cancer data.
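
A minimal illustrative sketch in Python follows (assumptions, not the paper's proposed normal approximation: the classical LRT statistic -n*log(|S| / prod |S_bb|) with chi-square degrees of freedom (p^2 - sum p_b^2)/2; the block sizes, sample size, and the function name lrt_independence are hypothetical choices for illustration). It estimates the null distribution of the statistic by Monte Carlo and reports the empirical size of the nominal 5% chi-square test when n is only moderately larger than p.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    def lrt_independence(X, blocks):
        # LRT statistic -n * log(|S| / prod |S_bb|) for independence of the subvectors
        n = X.shape[0]
        S = np.cov(X, rowvar=False)
        log_lambda = np.linalg.slogdet(S)[1]
        start = 0
        for b in blocks:
            log_lambda -= np.linalg.slogdet(S[start:start + b, start:start + b])[1]
            start += b
        return -n * log_lambda

    blocks = [4, 3, 3]                      # k = 3 subvectors, p = 10 (illustrative)
    p, n = sum(blocks), 30                  # n only moderately larger than p
    df = (p**2 - sum(b**2 for b in blocks)) // 2

    # Null distribution by simulation: identity covariance, so the blocks are independent
    null_stats = np.array([lrt_independence(rng.standard_normal((n, p)), blocks)
                           for _ in range(5000)])

    # Empirical size of the nominal 5% test based on the chi-square approximation
    crit = stats.chi2.ppf(0.95, df)
    print("empirical size with chi-square cutoff:", np.mean(null_stats > crit))

With n not much larger than p, the empirical size of this test typically drifts away from the nominal 5%, which is the motivation for replacing the chi-square approximation with a better one.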


Nasrin Noori, Hossein Bevrani,
Volume 17, Issue 2 (2-2024)
Abstract

The prevalence of high-dimensional datasets has driven increased use of penalized likelihood methods. However, when the number of observations is small relative to the number of covariates, a single observation can strongly influence model selection and inference. Identifying and assessing influential observations is therefore vital in penalized methods. This article reviews recently introduced influence measures for detecting influential observations in high-dimensional lasso regression. These measures are then investigated under the elastic net method, which combines the variable removal of the lasso with the coefficient shrinkage of ridge regression to improve model prediction. Simulations and real datasets illustrate that the introduced influence measures effectively identify influential observations and can help reveal otherwise hidden relationships in the data.
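
A minimal illustrative sketch in Python follows (a generic case-deletion diagnostic under the elastic net, not the specific influence measures reviewed in the article; the penalty values alpha and l1_ratio, the data sizes, and the function name fit_coefs are hypothetical). It measures how far the fitted coefficient vector moves when each observation is deleted in turn, with one deliberately planted influential point.

    import numpy as np
    from sklearn.linear_model import ElasticNet

    rng = np.random.default_rng(1)
    n, p = 40, 100                          # high-dimensional: p > n (illustrative)
    X = rng.standard_normal((n, p))
    beta = np.zeros(p)
    beta[:5] = 2.0                          # sparse true signal
    y = X @ beta + rng.standard_normal(n)
    y[0] += 15.0                            # plant one influential observation

    def fit_coefs(X, y):
        # elastic net: l1_ratio mixes the lasso (selection) and ridge (shrinkage) penalties
        return ElasticNet(alpha=0.5, l1_ratio=0.7, max_iter=10000).fit(X, y).coef_

    full = fit_coefs(X, y)
    influence = np.array([np.linalg.norm(full - fit_coefs(np.delete(X, i, axis=0),
                                                          np.delete(y, i)))
                          for i in range(n)])
    print("most influential observation:", int(influence.argmax()))   # expect 0

Observations whose removal shifts the fitted coefficients the most are candidates for closer inspection.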

