To achieve this, we used t-distributed stochastic neighbor embedding (t-SNE) to reduce 300-dimensional word vectors to three dimensions. The word vectors were derived from the fastText ...
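A dimensionality reduction like the one described can be sketched with scikit-learn's `TSNE`. The vectors below are random placeholders standing in for the 300-dimensional fastText embeddings, which are not included here:

```python
# Sketch: reduce 300-dimensional word vectors to 3 dimensions with t-SNE.
# Random vectors stand in for the fastText embeddings (assumption: in the
# real pipeline these would be loaded from a fastText model).
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
vectors = rng.normal(size=(50, 300))  # 50 placeholder "word vectors"

# n_components=3 projects into three dimensions; perplexity must be
# smaller than the number of samples.
tsne = TSNE(n_components=3, perplexity=10, random_state=0)
embedded = tsne.fit_transform(vectors)

print(embedded.shape)  # (50, 3)
```

Each row of `embedded` is the 3-D position of one word, suitable for plotting in a 3-D scatter chart.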
In other words, word embeddings can be dreadfully sexist. This happens because any bias in the articles that make up the Word2vec training corpus is inevitably captured in the geometry of the vector space.
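One common way to make such bias visible is to project word vectors onto a "gender direction" (woman − man). The tiny vectors below are fabricated purely for illustration; with real Word2vec or fastText vectors the same projection exposes learned stereotypes:

```python
# Toy sketch of bias in embedding geometry: project occupation words onto
# the direction (woman - man). All vectors here are hand-made (assumption),
# with dimension 0 loosely standing in for a learned "gender" component.
import numpy as np

vecs = {
    "man":      np.array([ 1.0, 0.2, 0.5, 0.1]),
    "woman":    np.array([-1.0, 0.2, 0.5, 0.1]),
    "engineer": np.array([ 0.6, 0.9, 0.1, 0.3]),
    "nurse":    np.array([-0.6, 0.8, 0.2, 0.3]),
}

gender_direction = vecs["woman"] - vecs["man"]
gender_direction /= np.linalg.norm(gender_direction)

for word in ("engineer", "nurse"):
    score = float(vecs[word] @ gender_direction)
    # Positive scores lean toward the "woman" pole, negative toward "man".
    print(f"{word}: {score:+.2f}")
```

A biased corpus leaves occupation words sitting at different points along this direction, which is exactly the geometric signature the text describes.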
The idea is that specific words have the same relationship to each other regardless of language. For example, the vector arithmetic "king − man + woman ≈ queen" should hold true in every language.
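The analogy above can be demonstrated with nearest-neighbor search in vector space. The two-dimensional vectors here are hand-crafted toys (dimension 0 roughly "royalty", dimension 1 roughly "gender"); real embeddings would be loaded from a trained model:

```python
# Toy demonstration of "king - man + woman ≈ queen" via cosine similarity.
# Vectors are fabricated for illustration (assumption), not learned.
import numpy as np

vecs = {
    "king":  np.array([1.0,  1.0]),
    "queen": np.array([1.0, -1.0]),
    "man":   np.array([0.0,  1.0]),
    "woman": np.array([0.0, -1.0]),
    "apple": np.array([-1.0, 0.0]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Analogy arithmetic: king - man + woman
target = vecs["king"] - vecs["man"] + vecs["woman"]

# The nearest remaining word (query terms excluded, as is conventional)
# should be "queen".
best = max((w for w in vecs if w not in {"king", "man", "woman"}),
           key=lambda w: cosine(vecs[w], target))
print(best)  # queen
```

Excluding the query words from the search is the standard convention in analogy evaluation, since the raw result vector is usually closest to one of the inputs.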