• 9 Posts
  • 47 Comments
Joined 1 year ago
Cake day: July 19th, 2023

  • Even better, we can say that it’s the actual hard prompt: this is real text written by real OpenAI employees. GPTs are well known to quote verbatim from their context, and OpenAI trains theirs to do exactly that by teaching them to break word problems down into pieces which are manipulated and regurgitated. This is clownshoes prompt engineering driven by manager-first principles like “not knowing what we want” and “being able to quickly change the behavior of our products with millions of customers in unpredictable ways”.


  • That’s the standard response from last decade. However, we now have a theory of soft prompting: start with a textual prompt, embed it, and then optimize the embedding with a round of fine-tuning. It would be obvious if OpenAI were using this technique, because leaking the prompt would then recover only similar texts rather than verbatim texts (except perhaps at zero temperature). This is a good example of how OpenAI’s offerings are behind the state of the art.
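    Sketched concretely (a toy, pure-Python stand-in: the “model” here is a frozen dot product rather than a transformer, and the vocabulary, weights, and target are all invented for illustration), soft prompting means embedding the hard prompt and then gradient-descending on the embedding itself while the model stays frozen:

```python
import random

# Toy sketch of soft prompting. Assumption: the real technique operates on
# a transformer's token-embedding matrix; a frozen dot-product "model"
# stands in here, just to show that the *embedding* is what gets tuned.
random.seed(0)
DIM = 8
vocab = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(50)]
model_w = [random.gauss(0, 1) for _ in range(DIM)]   # frozen model weights
target = 3.0                                         # behavior to elicit

def forward(prompt):
    # Frozen model: score the mean of the prompt's embedding vectors.
    mean = [sum(col) / len(prompt) for col in zip(*prompt)]
    return sum(m * w for m, w in zip(mean, model_w))

def loss(prompt):
    return (forward(prompt) - target) ** 2

# Step 1: embed a textual ("hard") prompt -- here, three token ids.
soft = [list(vocab[t]) for t in (3, 17, 42)]
initial = loss(soft)

# Step 2: optimize the embedding directly; the token ids never change,
# so the tuned vectors generally no longer correspond to any real token.
lr = 0.01
for _ in range(500):
    g = 2 * (forward(soft) - target) / len(soft)   # analytic gradient
    soft = [[e - lr * g * w for e, w in zip(row, model_w)]
            for row in soft]
final = loss(soft)
```

    That last comment is the point of the leak argument above: a tuned embedding usually sits between tokens, so decoding it back to text gives you something merely similar, never the verbatim prompt.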


  • corbin@awful.systems to TechTakes@awful.systems · ChatGPT spills its prompt
    14 hours ago

    Not with this framing. By adopting the first- and second-person pronouns immediately, the simulation is collapsed into a simple Turing-test scenario, and the computer’s only personality objective (in terms of what was optimized during RLHF) is to excel at that Turing test. The given personalities are all roles performed by a single underlying actor.

    As the saying goes, the best evidence for the shape-rotator/wordcel dichotomy is that techbros are terrible at words.


    The way to fix this is to embed the entire conversation into the simulation with third-person framing, as if it were a story, log, or transcript. This means that a personality would be simulated not by an actor in a Turing test, but directly by the token-predictor. In terms of narrative, it means strictly defining and enforcing a fourth wall. We can see elements of this in fine-tuning of many GPTs for RAG or conversation, but such fine-tuning only defines formatted acting rather than personality simulation.
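    As a sketch of the framing difference (the helper, the persona, and the format below are all hypothetical, not drawn from any real system prompt): instead of addressing the model as “you”, render the whole conversation as a third-person transcript and let the token-predictor continue it.

```python
def transcript_prompt(persona: str, turns: list[tuple[str, str]]) -> str:
    """Frame a conversation as a third-person log: the model continues
    a story *about* a character instead of playing 'you' in a chat."""
    lines = [f"What follows is a transcript of a conversation with {persona}."]
    lines += [f"{speaker}: {text}" for speaker, text in turns]
    return "\n".join(lines)

# The fourth wall: the narrator line at the top is never spoken by any
# participant, so the simulated personality lives in the text, not in a
# Turing-test role the underlying actor performs.
prompt = transcript_prompt(
    "Ada, a terse systems programmer",   # hypothetical persona
    [("User", "Why did the build fail?"), ("Ada", "")],
)
```

    The trailing empty turn leaves the predictor to complete the character’s next line directly, which is the “directly by the token-predictor” behavior described above.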




  • In enterprise computing, “smart contracts” are called “database triggers” or “stored procedures.” They’re a nightmare, because they’re very hard to reason about or maintain, and they’re prone to unexpected and spooky effects.

    It occurs to me that the situation’s even more dire than this single-node description suggests. If everything’s in one database, then yes, a smart contract is effectively a stored procedure. But it can be worse! Imagine e.g. an MMORPG where city centers or dungeons are disconnected from the regional map to prevent overload. A smart contract might need to synchronize data between two databases, e.g. a dungeon and a surrounding region, to maintain correctness.
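    The single-node case is easy to demonstrate with SQLite (a toy schema, invented for illustration): the trigger below runs implicitly on every matching write, invisible to whoever issues the UPDATE, which is exactly what makes this style hard to reason about.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER);
CREATE TABLE audit (account_id INTEGER, delta INTEGER);
-- Like a smart contract, this fires on every matching write; the code
-- performing the UPDATE never sees it happen.
CREATE TRIGGER log_change AFTER UPDATE OF balance ON accounts
BEGIN
    INSERT INTO audit VALUES (NEW.id, NEW.balance - OLD.balance);
END;
""")
con.execute("INSERT INTO accounts VALUES (1, 100)")
con.execute("UPDATE accounts SET balance = 150 WHERE id = 1")
rows = con.execute("SELECT * FROM audit").fetchall()  # the spooky side effect
```

    And this is the benign case: a SQLite trigger at least runs inside the same transaction as the write that fired it, whereas the cross-database scenario above leaves the contract to do its own synchronization.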






  • corbin@awful.systems to Linux@lemmy.ml · open letter to the NixOS foundation
    2 months ago

    The original signers include members of the infrastructure and moderation teams. You can find about half of them on Mastodon. They’re all well-established community members who hold real responsibility and roles within the NixOS Foundation ecosystem.

    Also note that Eelco isn’t “a maintainer” but the original author and designer of Nix, as well as a founder of Determinate Systems. He’s a BDFL. Look at this like the other dethronings of former BDFLs in the D, Python, Perl, Rails, or Scala communities; there’s going to be lots of drama and possibly a fork.






  • This is some of the most corporate-brained reasoning I’ve ever seen. To recap:

    • NYC elects a cop as mayor
    • Cop-mayor decrees that NYC will be great again, because of businesses
    • Cops and other oinkers get extra cash even though they aren’t businesses
    • Commercial real estate is still cratering and cops can’t find anybody to stop/frisk/arrest/blame for it
    • Folks over in New Jersey are giggling at the cop-mayor, something must be done
    • NYC invites folks to become small-business owners, landlords, realtors, etc.
    • Cop-mayor doesn’t understand how to fund it (whaddaya mean, I can’t hire cops to give accounting advice!?)
    • Cop-mayor’s CTO (yes, the city has corporate officers) suggests a fancy chatbot instead of hiring people

    It’s a fucking pattern, ain’t it.