db0@lemmy.dbzer0.com to TechTakes@awful.systems · English · 6 months ago
The Google AI isn’t hallucinating about glue in pizza, it’s just over indexing an 11 year old Reddit post by a dude named fucksmith.
milicent_bystandr@lemm.ee · English · 6 months ago
Yes, nicely put! I suppose ‘hallucinating’ is a description of when, to the reader, it appears to state a fact but that fact doesn’t at all represent any fact from the training data.