Learning from data with missing values has become a common challenge in real-world applications. This work focuses on a specific scenario in which a subset of the training feature dimensions becomes unavailable at prediction time. In this setting, most existing approaches struggle with the absence of the designated feature dimensions and thus fail to produce reliable predictions. We propose a novel neural-network-based learning framework that leverages knowledge obtained during training to mitigate the effect of features missing at prediction time. Our solution incorporates two knowledge-transfer strategies, allowing the model to learn from diminishing features as well as from a teacher network trained with full information. Experimental results demonstrate the effectiveness of our weight-diminishing algorithm and the superiority of our teacher-student learning framework over state-of-the-art imputation-based methods for handling missing data.
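To make the teacher-student idea mentioned above concrete, the following is a minimal illustrative sketch, not the paper's actual architecture or training procedure: a student network receives inputs whose designated feature dimensions are zeroed out, while a teacher network sees the full feature vector and supplies soft targets through a distillation term. The indices in `MISSING_IDX`, the weight `alpha`, the two-layer MLPs, and the KL-based distillation loss are all assumptions made only for this example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def mlp(in_dim, out_dim, hidden=64):
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, out_dim))

IN_DIM, NUM_CLASSES = 20, 3
MISSING_IDX = [3, 7, 12]            # feature dimensions assumed unavailable at prediction time

teacher = mlp(IN_DIM, NUM_CLASSES)  # assumed pre-trained on full features
student = mlp(IN_DIM, NUM_CLASSES)
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
alpha = 0.5                         # assumed weight balancing task loss and distillation loss

def mask_features(x):
    """Zero out the feature dimensions that will be missing at prediction time."""
    x = x.clone()
    x[:, MISSING_IDX] = 0.0
    return x

# One training step on a synthetic batch.
x = torch.randn(32, IN_DIM)
y = torch.randint(0, NUM_CLASSES, (32,))

with torch.no_grad():
    teacher_logits = teacher(x)             # teacher uses full information

student_logits = student(mask_features(x))  # student only sees the available features

task_loss = F.cross_entropy(student_logits, y)
distill_loss = F.kl_div(                    # match the student to the teacher's soft predictions
    F.log_softmax(student_logits, dim=1),
    F.softmax(teacher_logits, dim=1),
    reduction="batchmean",
)
loss = task_loss + alpha * distill_loss
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

At prediction time, only the student is used, so the model never relies on the feature dimensions that are absent; the teacher's role is confined to training.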