Deep learning is the state-of-the-art machine learning approach for many natural language processing (NLP) tasks. However, before deep learning can be applied to a language, its text must first be converted into numerical vectors. In deep learning NLP research, distributed semantic representations are the standard way to represent text as numeric vectors.
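As a minimal illustration of this conversion step, the sketch below maps tokens to dense numeric vectors via a lookup table. The vocabulary, the 4-dimensional vectors, and the `encode` helper are all hypothetical stand-ins; real distributed representations are learned from large corpora rather than written by hand.

```python
# Toy embedding table: each token maps to a dense numeric vector.
# The values here are made up for illustration only.
embeddings = {
    "deep":     [0.2, -0.1, 0.7, 0.05],
    "learning": [0.3,  0.4, -0.2, 0.10],
}

def encode(tokens, dim=4):
    """Replace each token with its vector (zero vector if unknown)."""
    return [embeddings.get(t, [0.0] * dim) for t in tokens]

vectors = encode(["deep", "learning"])
print(len(vectors), len(vectors[0]))  # → 2 4 (two tokens, each a 4-dim vector)
```

A neural model would consume such a sequence of vectors as its input, rather than the raw character strings.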
In this research, I will develop a model that combines both the compositional and the distributional perspectives on meaning. The model will encode the meanings of different parts of a text (morphemes, words, phrases, sentences, etc.) as numeric vectors, deriving each representation both from its constituent parts and from its context of use. Since the representation of meaning is vital to any deep-learning-based NLP approach, improvements in distributed semantic representations are likely to improve the performance of many downstream NLP applications (such as, for example, machine translation). The project is therefore expected to have a significant impact on the web technology industry.
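To make the compositional perspective concrete, the sketch below builds a phrase vector from its constituents' vectors by element-wise addition, one of the simplest composition functions from the compositional distributional semantics literature. The vectors and the `compose` function are illustrative assumptions; the proposed model would learn richer, context-sensitive composition functions rather than fixing addition in advance.

```python
# Simple additive composition: a phrase vector is the element-wise sum
# of its constituents' vectors. Purely illustrative; learned composition
# functions in the proposed model would be far more expressive.
def compose(*vectors):
    return [sum(components) for components in zip(*vectors)]

v_deep = [0.2, -0.1, 0.7]       # hypothetical vector for "deep"
v_learning = [0.3, 0.4, -0.2]   # hypothetical vector for "learning"

v_phrase = compose(v_deep, v_learning)
print([round(x, 2) for x in v_phrase])  # → [0.5, 0.3, 0.5]
```

The same idea extends recursively: composed phrase vectors can themselves be composed into sentence vectors, which is what lets a compositional model represent texts of arbitrary size.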