Search Results (1)
  • Open Access

    ARTICLE

    New Generation Model of Word Vector Representation Based on CBOW or Skip-Gram

    Zeyu Xiong1,*, Qiangqiang Shen1, Yueshan Xiong1, Yijie Wang1, Weizi Li2

    CMC-Computers, Materials & Continua, Vol.60, No.1, pp. 259-273, 2019, DOI:10.32604/cmc.2019.05155

    Abstract Word vector representation is widely used in natural language processing tasks. Most word vectors are generated from probability-based models whose bag-of-words features have two major weaknesses: they lose the ordering of the words, and they ignore the semantics of the words. Recently, the neural-network language models CBOW and Skip-Gram have been developed as continuous-space language models that represent words as high-dimensional real-valued vectors. These vector representations have demonstrated promising results in various NLP tasks because of their superiority in capturing syntactic and contextual regularities in language. In this paper, we propose a new strategy based on optimization in contiguous…
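    To make the CBOW idea mentioned in the abstract concrete, the following is a minimal illustrative sketch (not the paper's proposed method): CBOW predicts a centre word from the average of its context-word embeddings, trained here with a plain softmax and gradient descent on a toy corpus. All names, sizes, and the toy sentence are assumptions for illustration only.

    ```python
    import numpy as np

    # Toy corpus and vocabulary (illustrative assumption, not from the paper).
    corpus = ["the quick brown fox jumps over the lazy dog".split()]
    vocab = sorted({w for sent in corpus for w in sent})
    idx = {w: i for i, w in enumerate(vocab)}
    V, D, window, lr = len(vocab), 16, 2, 0.05

    rng = np.random.default_rng(0)
    W_in = rng.normal(scale=0.1, size=(V, D))   # input (context) embeddings
    W_out = rng.normal(scale=0.1, size=(D, V))  # output (centre-word) weights

    def softmax(z):
        e = np.exp(z - z.max())
        return e / e.sum()

    for epoch in range(200):
        for sent in corpus:
            for pos, centre in enumerate(sent):
                # Context indices within the window, excluding the centre word.
                ctx = [idx[sent[j]]
                       for j in range(max(0, pos - window),
                                      min(len(sent), pos + window + 1))
                       if j != pos]
                h = W_in[ctx].mean(axis=0)       # CBOW: average context vectors
                p = softmax(h @ W_out)           # predicted distribution over vocab
                grad = p.copy()
                grad[idx[centre]] -= 1.0         # cross-entropy gradient w.r.t. logits
                W_out -= lr * np.outer(h, grad)
                W_in[ctx] -= lr * (W_out @ grad) / len(ctx)

    vec = W_in[idx["fox"]]  # learned real-valued embedding for "fox"
    ```

    Skip-Gram inverts this objective, predicting each context word from the centre word; in practice both are trained with efficiency tricks such as negative sampling rather than a full softmax.
    
    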
