Jun 28, 2023
Hi Valentina, I played around with Word2Vec using Python. (It's older than transformers, but the core idea carries over.) You can do things like "Italy - Rome + Paris" and the result is "France". Pretty cool way to do math with semantic vectors. Transformers now use an encoder/decoder setup, but the same principle applies: the encoder is really producing semantic vectors.
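Here's a minimal sketch of that arithmetic, assuming gensim with a set of pre-trained vectors (the model name is just one convenient choice; any Word2Vec-style embeddings would work the same way):

```python
import gensim.downloader as api

# Downloads pre-trained GloVe vectors on first run (~130 MB); swap in any
# Word2Vec-style embedding set you prefer.
vectors = api.load("glove-wiki-gigaword-100")

# "Italy - Rome + Paris" -> expect something close to "France"
result = vectors.most_similar(positive=["italy", "paris"], negative=["rome"], topn=3)
print(result)
```

The `most_similar` call adds the "positive" vectors, subtracts the "negative" ones, and returns the nearest words to the result, which is exactly the "Italy - Rome + Paris ≈ France" trick.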