… nor own.
Somewhere between Linux woes, gaming, open source, 3D printing, recreational coding, and occasional ranting.
🇬🇧 / 🇩🇪
Choose whatever fits you
And stick to it! Make sure the other participants adhere to it as well. Optionally, configure a linter to enforce it.
“Not feeling like it” is of course not a valid reason when we’re talking about an obligation.
An obligation is coercion, and people who are forced to work will surely do it very well and thoroughly.
/s
We’ll vote for the CDU again anyway.
My daily commute takes me along the narrow bike path next to a road with four lanes in each direction. I never see the cars there actually driving, morning or evening; they’re just always standing still. Sometimes some of them honk.
Is this that “freedom”?
Oh come on, there’s also a fruit basket and free water.
I never looked into Flatseal and I don’t have any issues with Steam. But I wonder whether Flatseal can allow a Flatpak Java application to run `systemctl poweroff`.
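For what it’s worth, a minimal sketch of how that could look from inside the sandbox, assuming the app is allowed to talk to `org.freedesktop.Flatpak` on the session bus (a permission Flatseal can toggle); whether the poweroff actually goes through then depends on the host’s polkit rules, not on the sandbox:

```java
// Hedged sketch: a Flatpak app cannot call host binaries directly, so the
// usual escape hatch is flatpak-spawn --host, which itself only works if the
// sandbox may talk to org.freedesktop.Flatpak on the session bus.
public class HostPoweroff {
    public static void main(String[] args) throws Exception {
        Process p = new ProcessBuilder(
                "flatpak-spawn", "--host", "systemctl", "poweroff")
            .inheritIO()   // pass through any prompt/output from the host command
            .start();
        System.exit(p.waitFor());
    }
}
```

Without that session-bus permission, `flatpak-spawn --host` simply fails, which is exactly the kind of hole Flatseal would have to punch.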
No thank you, no kids.
I like my freedom and the peace and quiet.
It was NEVER about protecting children.
Absolutely.
Bullshitting, as in giving a confident answer without regard for actual reality.
So you’re saying the same word can have different meanings? Like “bullshitting” or “hallucination”?
LOL, okay.
LLMs bullshit all the time
Bullshitting, to me, means making intentionally wrong statements. LLMs do not generate intentionally wrong statements; saying they do implies intelligence.
LLMs know nothing, nor are they intelligent. They are also neither right nor wrong; they generate output based on statistics.
“Hallucination” as a term for “AIs” making things up has been used since the early 2000s (even if its meaning has changed since then).
Of course the term is stupid. An LLM is not an AI, nor is any AI in its current state intelligent. In the end it all boils down to answer machines. Complex ones, but still far away from anything even remotely being an AI.
So you’re saying a technical term should not be coined by the people who actually develop the technology it is used for?
It’s not a “fancy word” here, but a technical term. An AI making things up is actually called hallucination.
never trust AI
Statements from LLMs are to be seen as hallucinations unless proven otherwise by classic research.
Yes, and not exactly a small amount! Anyone who puts in the effort can easily earn 18.36 euros per month.