We use the multivariate orthogonal greedy algorithm (MOGA) to select variables in high-dimensional time-dependent models. Under weak moment assumptions, we derive the optimal convergence rate of the prediction error and propose an information criterion whose penalty varies with the imposed order of moments. The resulting regression estimate is shown to attain the oracle property.
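To make the selection mechanism concrete, the following is a minimal sketch of a plain orthogonal greedy algorithm for a single response; it is an illustrative assumption, not the paper's MOGA (which handles multivariate responses and dependent data), and the function name `oga`, the stopping rule `m`, and the toy data are all hypothetical. At each step the column most correlated with the current residual is selected, and the residual is updated by orthogonally projecting the response onto all selected columns.

```python
import numpy as np

def oga(X, y, m):
    """Orthogonal greedy algorithm sketch: greedily select m columns of X.

    At each iteration, pick the column with the largest absolute inner
    product with the current residual, then recompute the residual via a
    least-squares projection of y onto the span of all selected columns.
    """
    selected = []
    residual = y.copy()
    coef = np.empty(0)
    for _ in range(m):
        # inner product of each column with the current residual
        scores = np.abs(X.T @ residual)
        scores[selected] = -np.inf           # skip already-chosen columns
        selected.append(int(np.argmax(scores)))
        # orthogonal projection of y onto the selected columns
        coef, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
        residual = y - X[:, selected] @ coef
    return selected, coef

# Toy example (hypothetical data): y depends only on columns 0 and 3.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.1 * rng.standard_normal(200)
sel, beta = oga(X, y, m=2)
```

In the paper's setting, the number of iterations `m` would be chosen via the proposed information criterion rather than fixed in advance.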