Rob Vermiller
Jun 28, 2023

--

Hi Valentina, I played around with Word2Vec using Python. (It predates transformers, but the underlying idea is the same.) You can compute things like "Italy - Rome + Paris", and the nearest word to the resulting vector is "France". It's a pretty cool way to do arithmetic with semantic vectors. Transformers use an encoder/decoder architecture instead, but the encoder is likewise producing semantic vectors.
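Here's a minimal sketch of that vector arithmetic. The vectors below are hand-crafted toys (not real Word2Vec embeddings), chosen so the capital-to-country offset points the same direction for each pair; the nearest-neighbor lookup by cosine similarity is the same mechanism a real model uses.

```python
import numpy as np

# Toy "embeddings" (hand-crafted, NOT real Word2Vec output) where
# country = capital + a shared "country-ness" offset on the second axis.
vectors = {
    "Rome":    np.array([1.0, 0.0, 0.0]),
    "Italy":   np.array([1.0, 1.0, 0.0]),
    "Paris":   np.array([0.0, 0.0, 1.0]),
    "France":  np.array([0.0, 1.0, 1.0]),
    "Berlin":  np.array([1.0, 0.0, 1.0]),
    "Germany": np.array([1.0, 1.0, 1.0]),
}

def analogy(positive, negative):
    """Return the word closest (cosine similarity) to sum(positive) - sum(negative)."""
    query = sum(vectors[w] for w in positive) - sum(vectors[w] for w in negative)
    best, best_sim = None, -2.0
    for word, vec in vectors.items():
        if word in positive or word in negative:
            continue  # skip the input words themselves
        sim = np.dot(query, vec) / (np.linalg.norm(query) * np.linalg.norm(vec))
        if sim > best_sim:
            best, best_sim = word, sim
    return best

print(analogy(positive=["Italy", "Paris"], negative=["Rome"]))  # France
```

With a real pretrained model, gensim's `KeyedVectors.most_similar(positive=[...], negative=[...])` does the same lookup over actual learned embeddings.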
