Fast Excess Risk Rates via Offset Rademacher Complexity
Chenguang Duan, Yuling Jiao, Lican Kang, Xiliang Lu, Jerry Zhijian Yang
Based on the offset Rademacher complexity, this work develops a systematic framework for deriving sharp excess risk bounds in statistical learning without the Bernstein condition. In addition to recovering, in a unified way, fast rates for several parametric and nonparametric supervised learning models under minimal identifiability assumptions, we also obtain new and improved results for LAD (sparse) linear regression and for logistic regression with deep ReLU neural networks, respectively.
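For context, the central quantity named in the abstract is the offset Rademacher complexity. A commonly used form from the literature (the exact definition in this work may differ in constants or localization) is:

```latex
% Offset Rademacher complexity of a function class F with offset
% parameter c > 0, over a sample x_1, ..., x_n and i.i.d. Rademacher
% signs eps_1, ..., eps_n (standard form; details may vary by paper):
\mathfrak{R}_n^{\mathrm{off}}(\mathcal{F}, c)
  = \mathbb{E}_{\varepsilon}
    \sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n}
    \left[ \varepsilon_i f(x_i) - c\, f(x_i)^2 \right].
```

The negative quadratic offset term $-c f(x_i)^2$ penalizes functions with large empirical norm, which is what yields fast (localized) rates without invoking a Bernstein-type condition.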