Well-known regression trees use variance reduction as the measure for selecting attributes and splitting the data set when building a decision tree model. Conventional tree splitting, however, depletes the sample size rapidly after only a few levels of splitting, which leads to unreliable splitting decisions based on small samples. To overcome this sample-depletion problem, the Sample-Efficient Regression Tree (SERT) was proposed to avoid unnecessary splits. When a large number of interaction effects exist, however, the select-and-split construction of SERT is still not effective at preventing sample-size depletion. In this research, we propose an Enhanced Sample-Efficient Regression Tree (ESERT) that extends SERT with attribute combination selection and the MaxF selection criterion. We first show how to apply the MaxF selection criterion to the regression tree's attribute selection and to the stopping rule for tree construction. Based on the MaxF selection criterion, methodologies for attribute combination selection are introduced, and the complete select-and-split tree construction and model estimation procedure is described. ESERT procedures are developed for both binary and continuous attributes. Using three different simulation scenarios, we demonstrate the contributions of the MaxF selection criterion, the sample-efficient method, and attribute combination selection to tree construction. Two real cases, semiconductor bad-tool selection and differentially expressed gene selection, are also used to illustrate and validate the proposed ESERT.
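For reference, the variance-reduction split measure mentioned above is, in its standard CART-style form (this is a sketch of the conventional criterion only, not the MaxF criterion introduced in this work):

$$
\Delta = \operatorname{Var}(S) \;-\; \frac{|S_L|}{|S|}\operatorname{Var}(S_L) \;-\; \frac{|S_R|}{|S|}\operatorname{Var}(S_R),
$$

where $S$ is the sample reaching a node and $S_L$, $S_R$ are the subsets produced by a candidate split; the split maximizing $\Delta$ is selected. Because $|S_L|$ and $|S_R|$ shrink at every level of the tree, these variance estimates quickly become unreliable, which is the sample-depletion issue that SERT and ESERT address.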