vectoring/embedding words
Vectoring Words (Word Embeddings) – Computerphile – 17 min video – [https://www.youtube.com/watch?v=gQddtTdmG_8&t=604s]:
How do you represent a word in AI? Rob Miles reveals how words can be formed from multi-dimensional vectors – with some unexpected results.
more on rob miles: [https://www.youtube.com/playlist?list=PLUTypj9XuPp710VyE4y33JzV1BkrIAhnq]: Rob Miles from the University of Nottingham, courtesy of Computerphile.
via thomas o’brien after 23 convo:
on resonations of right to city p 1-16 with recent convo w thomas on tech to facil..
just reread (during 2023 reconnection w thomas on vectoring words and word embeddings [https://www.youtube.com/watch?v=gQddtTdmG_8&t=604s] and michael levin ness [https://twitter.com/drmichaellevin]) p 1-16 of right to the city.. huge to decision making is unmooring us law ness.. and people trying to protect (safety addiction) via planning/ordering et al.. and aziz let go law and carhart-harris entropy law.. et al
avi on decision making might be of interest.. avi is one from museum of care i was telling you about.. maria describing him ‘anthropologist of deplorable animals, Dr Khalil ‘Avi’ Betz-Heinemann’
there’s a legit use of tech (nonjudgmental expo labeling).. to facil a legit global detox leap.. for (blank)’s sake.. and we’re missing it
notes/quotes from video:
how do you rep a word to your network..
4 min – if looking to rep a word.. simplest is to just give it a number (its place in a dictionary).. but ie: 2 words next to each other in dictionary aren’t similar.. so the number itself carries no meaning..
5 min – what does it mean for a word to be similar to another word.. ie: used in similar contexts.. how to rep words as vectors such that 2 similar vectors are 2 similar words..
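a rough numpy sketch (toy vectors i made up, not from the video) of that idea.. a dictionary index says nothing about meaning.. but cosine similarity between two vectors can stand in for ‘used in similar contexts’..

```python
import numpy as np

# toy 3-d vectors i made up just to illustrate -- real embeddings
# have hundreds of dimensions and are learned from text, not hand-set
cat = np.array([0.9, 0.1, 0.0])
dog = np.array([0.8, 0.2, 0.1])
car = np.array([0.0, 0.1, 0.9])

def cosine(a, b):
    # cosine similarity: 1.0 = pointing the same way, near 0 = unrelated
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine(cat, dog))  # high -> used in similar contexts
print(cosine(cat, car))  # low  -> different contexts
```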
6 min – if good language model able to predict next word.. has to be compressing info down..
say for simplicity.. taking word and trying to guess next word
8 min – have to pull info out.. kind of like an egg drop competition.. need to get egg safely to ground.. and if can do that.. have learned things about engineering and team work.. it’s the friends you make along the way.. ie: the prediction task is just the excuse.. the embeddings learned along the way are the real prize
so rather than trying to predict the next word.. although that will work.. that will give you word embeddings.. but they’re not that good.. you look around the word.. sample randomly from the neighborhood of that word.. and train on that.. when fully trained.. give it any word and it will give you the likelihood of other words appearing near it..
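a hedged sketch of that ‘sample from the neighborhood’ training.. using gensim’s skip-gram word2vec (sg=1).. the tiny corpus and the parameters here are placeholders i chose (assuming gensim 4.x).. real training needs a much bigger corpus..

```python
from gensim.models import Word2Vec

# placeholder corpus -- real word2vec training uses millions of sentences
sentences = [
    ["the", "king", "ruled", "the", "kingdom"],
    ["the", "queen", "ruled", "the", "kingdom"],
    ["the", "man", "walked", "the", "dog"],
    ["the", "woman", "walked", "the", "dog"],
]

# sg=1 -> skip-gram: for each word, predict words sampled from the
# window around it (the 'neighborhood' sampling described above)
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=200)

print(model.wv["king"][:5])           # first few dimensions of the learned vector
print(model.wv.most_similar("king"))  # neighbors ranked by cosine similarity
```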
11 min – the near by ness of those vectors expresses something meaningful about how similar contexts are.. so we assume similar words.. surprising how much info that’s able to extract..
(from image generation) we’re training it to produce good images from random noise.. in the process it creates a mapping from latent space to images.. doing basic arithmetic in that latent space produces meaningful changes in the image.. so we end up w that same principle for words
12 min – ie: if take vector for king.. subtract vector for man.. add vector for woman.. get queen
ie’s of this adding/subtracting of vectors
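one way to try that adding/subtracting yourself.. assuming gensim and its downloadable glove-wiki-gigaword-50 vectors (not the exact model from the video)..

```python
import gensim.downloader as api

# pretrained 50-d glove vectors (downloads ~66 MB the first time)
wv = api.load("glove-wiki-gigaword-50")

# king - man + woman  ->  nearest remaining vector should be queen
print(wv.most_similar(positive=["king", "woman"], negative=["man"], topn=3))

# same trick with another relation: paris - france + germany -> berlin
print(wv.most_similar(positive=["paris", "germany"], negative=["france"], topn=3))
```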
_______
- words
_________________
__________
_______
from way back when: word2vec via bernd nurnberger
open source code for better word recognition.. ai..
Junto, Venessa, Monica.. wit (dot) ai.. wear able ness
_______


