A low-dimensional vector representation for words using an extreme learning machine

2017 
Word embeddings are a low-dimensional vector representation of words that incorporates context. Two popular methods are word2vec and global vectors (GloVe). Word2vec is a single-hidden-layer feedforward neural network (SLFN) with an auto-encoder-like architecture that computes a word-context matrix, using backpropagation for training. GloVe computes the word-context matrix first and then performs matrix factorization on it to arrive at word embeddings. Backpropagation, the typical training method for SLFNs, is time-consuming and requires iterative tuning. Extreme learning machines (ELMs) retain the universal approximation capability of SLFNs while relying on a randomly generated hidden-layer weight matrix in lieu of backpropagation. In this research, we propose an efficient method for generating word embeddings that applies an ELM-based auto-encoder architecture to a word-context matrix. Word similarity is evaluated using the cosine similarity measure on a dozen varied words, and the results are reported.
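To make the pipeline concrete, below is a minimal sketch of an ELM auto-encoder applied to a word-context matrix, followed by a cosine-similarity comparison. It is an illustration under assumptions, not the paper's implementation: the co-occurrence window, the tanh activation, the ridge regularization constant, and the convention of projecting the data through the learned output weights are all choices made here for the example.

```python
# Hedged sketch of an ELM auto-encoder (ELM-AE) over a word-context matrix.
# Window size, activation, regularization, and the embedding convention
# are assumptions for illustration, not taken from the paper.
import numpy as np

def build_word_context_matrix(corpus, window=2):
    """Co-occurrence counts within a symmetric window (assumed scheme)."""
    vocab = sorted({w for sent in corpus for w in sent})
    idx = {w: i for i, w in enumerate(vocab)}
    X = np.zeros((len(vocab), len(vocab)))
    for sent in corpus:
        for i, w in enumerate(sent):
            for j in range(max(0, i - window), min(len(sent), i + window + 1)):
                if i != j:
                    X[idx[w], idx[sent[j]]] += 1.0
    return X, vocab, idx

def elm_ae_embeddings(X, dim=50, reg=1e-3, seed=0):
    """ELM-AE: random, untrained hidden layer; output weights solved in
    closed form to reconstruct X; rows of X projected through those
    weights serve as embeddings (one common ELM-AE convention)."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], dim))  # random input->hidden weights
    b = rng.standard_normal(dim)                # random hidden biases
    H = np.tanh(X @ W + b)                      # hidden activations, no backprop
    # Ridge-regularized least squares: beta = (H'H + reg*I)^-1 H'X
    beta = np.linalg.solve(H.T @ H + reg * np.eye(dim), H.T @ X)
    return X @ beta.T                           # project data into the embedding space

def cosine_similarity(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

corpus = [["the", "cat", "sat", "on", "the", "mat"],
          ["the", "dog", "sat", "on", "the", "rug"]]
X, vocab, idx = build_word_context_matrix(corpus)
E = elm_ae_embeddings(X, dim=4)
print(cosine_similarity(E[idx["cat"]], E[idx["dog"]]))
```

Note that the output weights `beta` are obtained in a single regularized least-squares solve rather than by iterative backpropagation; this closed-form step is the source of the efficiency advantage the abstract attributes to ELM.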