"Word2vec"의 두 판 사이의 차이

수학노트
imported>Pythagoras0
 
** https://rare-technologies.com/performance-shootout-of-nearest-neighbours-contestants/
 
** Using gensim’s memory-friendly streaming API I then converted these plain text tokens to TF-IDF vectors, ran Singular Value Decomposition (SVD) on this TF-IDF matrix to build a latent semantic analysis (LSA) model and finally stored each Wikipedia document as a 500-dimensional LSA vector to disk.
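The pipeline described in the bullet above (TF-IDF weighting of token counts, then truncated SVD to get latent semantic analysis vectors) can be sketched directly in NumPy. This is a toy illustration under stated assumptions: the four "documents" and the choice of 2 latent dimensions (instead of the 500 used in the linked article) are illustrative, and it skips gensim's streaming machinery entirely.

```python
import numpy as np

# Toy corpus: four "documents" as token lists (illustrative data only).
docs = [["cat", "sat", "mat"], ["cat", "mat"], ["dog", "ran"], ["dog", "sat"]]
vocab = sorted({t for d in docs for t in d})
idx = {t: i for i, t in enumerate(vocab)}

# Term-frequency matrix: rows = documents, columns = vocabulary terms.
tf = np.zeros((len(docs), len(vocab)))
for r, d in enumerate(docs):
    for t in d:
        tf[r, idx[t]] += 1

# IDF weighting, log(N / document-frequency), applied to the counts.
df = (tf > 0).sum(axis=0)
tfidf = tf * np.log(len(docs) / df)

# Truncated SVD of the TF-IDF matrix is the LSA step; keep k dimensions
# (the article keeps 500, a toy corpus only supports a few).
k = 2
U, S, Vt = np.linalg.svd(tfidf, full_matrices=False)
lsa_docs = U[:, :k] * S[:k]  # each row: one document as a k-dim LSA vector
print(lsa_docs.shape)        # -> (4, 2)
```

In gensim the same steps correspond to building a dictionary, a `TfidfModel`, and an `LsiModel` over a streamed corpus, which is what keeps the memory footprint small on Wikipedia-scale input.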
 
==pretrained korean word2vec==
* https://github.com/Kyubyong/wordvectors
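Pretrained vectors like these are typically queried for nearest neighbours by cosine similarity (the operation benchmarked in the performance-shootout link above). A minimal sketch with hand-made toy vectors; the words and values are illustrative, not taken from the Kyubyong/wordvectors release.

```python
import numpy as np

# Toy 3-dimensional "word vectors" standing in for a pretrained model
# (illustrative values only).
vectors = {
    "king":  np.array([0.9, 0.1, 0.0]),
    "queen": np.array([0.8, 0.2, 0.1]),
    "apple": np.array([0.0, 0.9, 0.4]),
}

def most_similar(word, topn=2):
    """Rank the other words by cosine similarity to `word`."""
    q = vectors[word]
    sims = {
        w: float(v @ q / (np.linalg.norm(v) * np.linalg.norm(q)))
        for w, v in vectors.items() if w != word
    }
    return sorted(sims.items(), key=lambda kv: -kv[1])[:topn]

print(most_similar("king")[0][0])  # -> queen
```

With a real download, gensim's `KeyedVectors` exposes the same query as `model.most_similar(word)` over the full vocabulary.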
  
  

Revision as of 00:24, 8 May 2018 (Tue)

==gensim==

==pretrained korean word2vec==

==memo==

==related items==

==computational resource==