Introduction: http://kexue.fm/archives/4402/

Here we release tf_word2vec, a fairly complete TensorFlow implementation of word2vec. It uses a new loss, which I call the random softmax loss, designed to reduce the training computation.
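The repository itself defines the exact loss, but the core idea of a "random" softmax can be sketched as follows: instead of normalizing over the whole vocabulary, compute the softmax cross-entropy over the target word plus a small random sample of the vocabulary. This is a minimal NumPy sketch under that assumption (the function name, signature, and sampling details are illustrative, not the repository's actual API):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_softmax_loss(center_vec, target_id, output_emb, num_sampled=64):
    """Cross-entropy over the target word plus a random subset of the
    vocabulary, instead of a full softmax over every word.

    This is an illustrative sketch, not the repository's implementation.
    """
    vocab_size = output_emb.shape[0]
    # Sample "negative" word ids uniformly. (A production version would
    # typically exclude the target and/or sample by word frequency.)
    neg_ids = rng.integers(0, vocab_size, size=num_sampled)
    ids = np.concatenate(([target_id], neg_ids))   # target sits at index 0
    logits = output_emb[ids] @ center_vec          # shape: (num_sampled + 1,)
    logits -= logits.max()                         # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum())
    return -log_probs[0]                           # NLL of the target word

vocab_size, dim = 1000, 16
output_emb = rng.standard_normal((vocab_size, dim))
center_vec = rng.standard_normal(dim)
loss = random_softmax_loss(center_vec, 42, output_emb)
```

Because each step touches only `num_sampled + 1` rows of the output embedding instead of all `vocab_size` rows, the per-step cost of the softmax drops from O(vocab_size) to O(num_sampled).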