1. Lederer J. Statistical guarantees for sparse deep learning. AStA Advances in Statistical Analysis 2023. DOI: 10.1007/s10182-022-00467-3
Abstract
Neural networks are becoming increasingly popular in applications, but our mathematical understanding of their potential and limitations is still limited. In this paper, we further this understanding by developing statistical guarantees for sparse deep learning. In contrast to previous work, we consider different types of sparsity, such as few active connections, few active nodes, and other norm-based types of sparsity. Moreover, our theories cover important aspects that previous theories have neglected, such as multiple outputs, regularization, and $$\ell_2$$-loss. The guarantees have a mild dependence on network widths and depths, which means that they support the application of sparse but wide and deep networks from a statistical perspective. Some of the concepts and tools that we use in our derivations are uncommon in deep learning and, hence, might be of additional interest.

2. Wang Y, Xu J, Wang Z. A simple tuning parameter selection method for high dimensional regression. Communications in Statistics - Theory and Methods 2022. DOI: 10.1080/03610926.2022.2117559
Affiliation(s)
- Yanxin Wang, Department of Applied Statistics, Ningbo University of Technology, Ningbo, China
- Jiaqing Xu, Department of Applied Statistics, Ningbo University of Technology, Ningbo, China
- Zhi Wang, Department of Applied Statistics, Ningbo University of Technology, Ningbo, China

3. Topology Adaptive Graph Estimation in High Dimensions. Mathematics 2022. DOI: 10.3390/math10081244
Abstract
We introduce Graphical TREX (GTREX), a novel method for graph estimation in high-dimensional Gaussian graphical models. By conducting neighborhood selection with TREX, GTREX avoids tuning parameters and is adaptive to the graph topology. We compared GTREX with standard methods on a new simulation setup that was designed to assess accurately the strengths and shortcomings of different methods. These simulations showed that a neighborhood selection scheme based on Lasso and an optimal (in practice unknown) tuning parameter outperformed other standard methods over a large spectrum of scenarios. Moreover, we show that GTREX can rival this scheme and, therefore, can provide competitive graph estimation without the need for tuning parameter calibration.
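For readers unfamiliar with the neighborhood-selection scheme this abstract refers to, the following is a minimal sketch of the Lasso-based baseline with a fixed tuning parameter, not of GTREX itself: the function name, the use of scikit-learn's Lasso, and the value of alpha are illustrative assumptions, and GTREX replaces the per-node Lasso with TREX precisely to avoid choosing alpha.

```python
import numpy as np
from sklearn.linear_model import Lasso

def lasso_neighborhood_selection(X, alpha=0.1):
    """Estimate an undirected graph by Lasso neighborhood selection.

    X     : (n, p) data matrix, one column per node of the graph.
    alpha : Lasso tuning parameter -- the quantity GTREX avoids having
            to calibrate by using TREX instead of the Lasso.
    Returns a (p, p) boolean adjacency matrix (OR-rule symmetrization).
    """
    n, p = X.shape
    adjacency = np.zeros((p, p), dtype=bool)
    for j in range(p):
        others = np.delete(np.arange(p), j)
        # Regress node j on all remaining nodes; nonzero coefficients
        # mark the estimated neighbors of node j.
        coef = Lasso(alpha=alpha, max_iter=10_000).fit(X[:, others], X[:, j]).coef_
        adjacency[j, others] = coef != 0
    # OR rule: keep an edge if either of its two node-wise regressions selects it.
    return adjacency | adjacency.T
```

The accuracy of this baseline hinges on the choice of alpha, which is the practical shortcoming that motivates the tuning-free approach described in the abstract.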

4. Huang ST, Xie F, Lederer J. Tuning-free ridge estimators for high-dimensional generalized linear models. Computational Statistics & Data Analysis 2021. DOI: 10.1016/j.csda.2021.107205

5. Jiménez-Cordero A, Maldonado S. Automatic feature scaling and selection for support vector machine classification with functional data. Applied Intelligence 2021. DOI: 10.1007/s10489-020-01765-6