Residential College: false
Status: Published
Linear kernel tests via empirical likelihood for high-dimensional data
Ding, Lizhong1; Liu, Zhi3; Li, Yu2; Liao, Shizhong4; Liu, Yong5; Yang, Peng2; Yu, Ge6; Shao, Ling1; Gao, Xin2
2019
Conference Name: 33rd AAAI Conference on Artificial Intelligence / 31st Innovative Applications of Artificial Intelligence Conference / 9th AAAI Symposium on Educational Advances in Artificial Intelligence
Source Publication: 33rd AAAI Conference on Artificial Intelligence, AAAI 2019, 31st Innovative Applications of Artificial Intelligence Conference, IAAI 2019 and the 9th AAAI Symposium on Educational Advances in Artificial Intelligence, EAAI 2019
Pages: 3454-3461
Conference Date: January 27 - February 1, 2019
Conference Place: Honolulu, HI
Abstract

We propose a framework for analyzing and comparing distributions without imposing any parametric assumptions via empirical likelihood methods. Our framework is used to study two fundamental statistical test problems: the two-sample test and the goodness-of-fit test. For the two-sample test, we need to determine whether two groups of samples are from different distributions; for the goodness-of-fit test, we examine how likely it is that a set of samples is generated from a known target distribution. Specifically, we propose empirical likelihood ratio (ELR) statistics for the two-sample test and the goodness-of-fit test, both of which are of linear time complexity and show higher power (i.e., the probability of correctly rejecting the null hypothesis) than the existing linear statistics for high-dimensional data. We prove the nonparametric Wilks' theorems for the ELR statistics, which illustrate that the limiting distributions of the proposed ELR statistics are chi-square distributions. With these limiting distributions, we can avoid bootstraps or simulations to determine the threshold for rejecting the null hypothesis, which makes the ELR statistics more efficient than the recently proposed linear statistic, finite set Stein discrepancy (FSSD). We also prove the consistency of the ELR statistics, which guarantees that the test power goes to 1 as the number of samples goes to infinity. In addition, we experimentally demonstrate and theoretically analyze that FSSD has poor performance or even fails to test for high-dimensional data. Finally, we conduct a series of experiments to evaluate the performance of our ELR statistics as compared to state-of-the-art linear statistics.
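To make the role of the Wilks-type limiting distribution concrete, the sketch below is a hedged illustration only: it is not the paper's ELR construction, and the statistic value, degrees of freedom, and significance level are placeholder assumptions. It shows how a statistic whose null distribution converges to a chi-square lets the rejection threshold be read directly from a chi-square quantile, with no bootstrap or simulation.

# Hedged sketch (not the paper's ELR statistic): a generic test statistic with a
# chi-square limiting null distribution gets its rejection threshold from a
# single quantile evaluation.
from scipy.stats import chi2

def chi_square_rejection(statistic, df, alpha=0.05):
    """Return (reject, threshold): reject the null hypothesis when the observed
    statistic exceeds the (1 - alpha) quantile of the chi-square(df) distribution."""
    threshold = chi2.ppf(1.0 - alpha, df)
    return statistic > threshold, threshold

# Placeholder values for illustration: an observed statistic of 5.2 with one
# degree of freedom at the 5% level (the threshold is approximately 3.84).
reject, threshold = chi_square_rejection(statistic=5.2, df=1)
print(f"threshold = {threshold:.3f}, reject H0: {reject}")

With the limiting chi-square distribution established, setting the threshold reduces to one quantile computation, which is what allows the proposed tests to skip the bootstrap or simulation step mentioned above.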

Indexed By: CPCI-S
Language: English
WOS Research Area: Computer Science; Engineering
WOS Subject: Computer Science, Artificial Intelligence; Computer Science, Theory & Methods; Engineering, Electrical & Electronic
WOS ID: WOS:000485292603058
Scopus ID: 2-s2.0-85090175170
Document Type: Conference paper
Collection: Department of Mathematics
Corresponding Author: Ding, Lizhong; Gao, Xin
Affiliation:
1. Inception Institute of Artificial Intelligence (IIAI), Abu Dhabi, United Arab Emirates
2. King Abdullah University of Science and Technology (KAUST), Saudi Arabia
3. University of Macau, Macao
4. Tianjin University, China
5. Institute of Information Engineering, CAS, China
6. Technology and Engineering Center for Space Utilization, CAS, China
Recommended Citation
GB/T 7714:
Ding, Lizhong, Liu, Zhi, Li, Yu, et al. Linear kernel tests via empirical likelihood for high-dimensional data[C], 2019: 3454-3461.
APA:
Ding, Lizhong, Liu, Zhi, Li, Yu, Liao, Shizhong, Liu, Yong, Yang, Peng, Yu, Ge, Shao, Ling, & Gao, Xin. (2019). Linear kernel tests via empirical likelihood for high-dimensional data. 33rd AAAI Conference on Artificial Intelligence, AAAI 2019, 31st Innovative Applications of Artificial Intelligence Conference, IAAI 2019 and the 9th AAAI Symposium on Educational Advances in Artificial Intelligence, EAAI 2019, 3454-3461.
Files in This Item:
There are no files associated with this item.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.