dvtt@lemmings.world to News@lemmy.worldEnglish · 9 months agoStudy Reveals Gender Bias in ChatGPT Translationsresearchinenglish.comexternal-linkmessage-square31fedilinkarrow-up171arrow-down116
arrow-up155arrow-down1external-linkStudy Reveals Gender Bias in ChatGPT Translationsresearchinenglish.comdvtt@lemmings.world to News@lemmy.worldEnglish · 9 months agomessage-square31fedilink
minus-squareknightly the Sneptaur@pawb.sociallinkfedilinkarrow-up11arrow-down2·9 months agoSo, what? You think women need their own LLMs or something? You go ahead and get started on that, the rest of us can work on making the existing ones less sexist.
minus-squareAndOfTheSevenSeas@lemmy.worldlinkfedilinkarrow-up5arrow-down17·9 months agoComputers do not have the sentience required to be sexist.
minus-squareknightly the Sneptaur@pawb.sociallinkfedilinkarrow-up14arrow-down1·9 months agoThey don’t need sentience to be sexist. Algorithmic sexism comes from the people writing the algorithms.
minus-squareAndOfTheSevenSeas@lemmy.worldlinkfedilinkarrow-up1arrow-down1·edit-29 months agodeleted by creator
minus-squareAndOfTheSevenSeas@lemmy.worldlinkfedilinkarrow-up1arrow-down17·9 months agoInteresting then that you chose to describe the LLM as sexist and not the programmers, regardless of the fact that you know nothing about them.
minus-squarelolcatnip@reddthat.comlinkfedilinkEnglisharrow-up17·edit-29 months agoProgrammers don’t program sexism into machine learning models. What happens is that people who may or may not be programmers provide them with biased training data, because getting unbiased data is really, really hard.
So, what? You think women need their own LLMs or something?
You go ahead and get started on that, the rest of us can work on making the existing ones less sexist.
Computers do not have the sentience required to be sexist.
They don’t need sentience to be sexist. Algorithmic sexism comes from the people writing the algorithms.
deleted by creator
Interesting then that you chose to describe the LLM as sexist and not the programmers, regardless of the fact that you know nothing about them.
Programmers don’t program sexism into machine learning models. What happens is that people who may or may not be programmers provide them with biased training data, because getting unbiased data is really, really hard.
This is a nothing argument.
They’re nuts. Easy block, IMO.