With the growing number of natural language processing tasks, the demand for better representations of words (word embeddings) and senses (sense embeddings) has risen in recent years. In this study, we first discuss the problem of abnormal dimensions in word embeddings, and then propose models that combine word embeddings with an ontology. The combination is explored in three ways: a direct combination approach, a support vector regression approach, and a retrofitting approach. For sense embeddings, we first propose a joint sense retrofitting model that learns better sense embeddings from contextual and ontological information, and then generalize the proposed model.
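As background for the retrofitting approach mentioned above, the standard retrofitting procedure (in the style of Faruqui et al., 2015) iteratively pulls each word vector toward the average of its original embedding and its neighbors in the ontology. The sketch below is a generic illustration of that idea, not the specific models proposed in this study; the function name and the `alpha`/`beta` weights are illustrative assumptions.

```python
import numpy as np

def retrofit(vectors, ontology, iterations=10, alpha=1.0, beta=1.0):
    """Nudge each embedding toward its ontology neighbors.

    vectors:  dict mapping word -> np.ndarray (the original embeddings)
    ontology: dict mapping word -> list of neighbor words (e.g. synonyms)
    Each update replaces a vector with a weighted average of its
    original embedding (weight alpha) and its neighbors' current
    embeddings (weight beta each).
    """
    new_vecs = {w: v.copy() for w, v in vectors.items()}
    for _ in range(iterations):
        for word, neighbors in ontology.items():
            nbrs = [n for n in neighbors if n in new_vecs]
            if word not in vectors or not nbrs:
                continue
            total = alpha * vectors[word] + beta * sum(new_vecs[n] for n in nbrs)
            new_vecs[word] = total / (alpha + beta * len(nbrs))
    return new_vecs
```

After retrofitting, ontology neighbors (e.g. synonym pairs) end up closer in the vector space while each vector stays anchored near its original position, which is the intuition the joint sense retrofitting model builds on.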