nitter archive

just in case you haven’t done your daily eye stretches yet, here’s a workout challenge! remember to count your reps, and to take a break between paragraphs! duet your score!

oh and, uh… you may want to hide any loose keyboards before you read this. because you may find yourself wanting to throw something.

  • self@awful.systems · 5 points · 1 year ago

    the reason why I don’t get imposter syndrome is because I will never spew stupid bullshit as loud or as rapid as the folks my industry thinks are knowledgeable

    > Currently we have single-threaded execution running at ~10Hz (tok/s) and enjoy looking at the assembly-level execution traces stream by.

    starting to feel a lot better about the FPGA work I’ve been doing, if a 10Hz single-threaded processor is considered anywhere near a success

    • froztbyte@awful.systems (OP) · 3 points · edited · 1 year ago

      tfw your drunken friday night shellscripts are someone else’s whole startup model[0]

      it is astounding just how convoluted and self-twisted these people manage to make their arguments though. like, this thing. literally billions of investment over time, multiple decamillions (if not more) in actual in-the-rack compute, however fucking much else you have in supporting infra (power, IP, routing, and all the other cumulative layer cake of goog’s investment in that over time), all as actual fucking computers that are probably already running virtual machines to run the containers for running your model code… and then you get this fucking kind of remark out of it

      it takes some incredibly creative point-dodging

      [0] - unironically I’ve had this feel a non-zero amount of times

      • self@awful.systems · 3 points · 1 year ago

        it’s fucking bizarre to see this many supposed computer science experts forget how much of the field is built on complexity theory and optimization, since things get fucking weird (simulate an entire alternate universe weird, like we’ve seen a lot of these AI fools turn to) when you stop analyzing complexity

        of course my FPGA obsession lately has been designing a lambda calculus reducer that runs on hardware, a ridiculously inefficient (in terms of both cycles and bytes) way to do general computation. I don’t claim that work’s practical though, unlike these breathless AI fuckers pretending an LLM doing inefficient computation (on top of a multitude of layers of efficient computation) is revolutionary

  • bitofhope@awful.systems · 3 points · edited · 1 year ago

    I did five eyerolls. Then I got to the ten fucking hertz and realized either I or the writer had misunderstood what they wanted to express.

    • froztbyte@awful.systems (OP) · 3 points · 1 year ago

      ngl, the 10hz was a hell of a turn. I certainly didn’t see it coming. guess I was running at over 9000hz and had a phase issue

      • bitofhope@awful.systems · 2 points · 1 year ago

        It’s wild. This thing running on computers, abstracted dozens of layers upwards, can do things a computer can, but a couple hundred million times slower.

        The next stage of transportation isn’t faster, greener, more efficient, more affordable, or more pleasant cars, buses, trains, or planes; it’s crawling.
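        The “couple hundred million” is easy napkin math. A quick Python check, with the clock rates as rough assumptions on my part (nothing here is measured):

```python
# Back-of-envelope: a ~3 GHz core vs. a ~10 token/s "processor",
# treating one token as one instruction. Both figures are rough guesses.
cpu_hz = 3e9   # commodity CPU core, ~3 GHz
llm_hz = 10    # ~10 tokens per second

slowdown = cpu_hz / llm_hz
print(f"~{slowdown:.0e}x slower")  # ~3e+08x slower
```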

        • Amoeba_Girl@awful.systems · 4 points · 1 year ago

          hey come on now, that’s ten times faster than a redstone computer and only consumes a handful of magnitudes more resources!

  • sc_griffith@awful.systems · 3 points · 1 year ago

    With many 🧩 dropping recently, a more complete picture is emerging of crypto not as a currency, but the kernel process of a new Operating System. E.g. today it orchestrates:

    • Input & Output across modalities (text, audio, vision)
    • Code interpreter, ability to write & run programs
    • Browser / internet access
    • Embeddings database for files and internal memory storage & retrieval

    A lot of computing concepts carry over. Currently we have single-threaded execution running at ~10Hz (tok/s) and enjoy looking at the assembly-level execution traces stream by. Concepts from computer security carry over, with attacks, defenses and emerging vulnerabilities.

    I also like the nearest neighbor analogy of “Operating System” because the industry is starting to shape up similar: Windows, OS X, and Linux <-> Ethereum, Urbit, Solana, and Terra/Luna (?). An OS comes with default apps but has an app store. Most apps can be adapted to multiple platforms.

    TLDR looking at crypto as currency is the same as looking at early computers as calculators. We’re seeing an emergence of a whole new computing paradigm, and it is very early.

    • Steve@awful.systems · 1 point · 1 year ago

      that’s an Electron app. It consumes all your memory and doesn’t do anything it’s supposed to do

  • sinedpick@awful.systems · 2 points · 1 year ago

    This is hard for me to digest because I have a lot of respect for Karpathy. His famous article on the potential of LSTMs as language models absolutely blew my mind. The innovations that led to LLMs stand on his shoulders, so how can he have such a poor take here?

    • corbin@awful.systems · 3 points · 1 year ago

      He’s too young to grok why operating systems exist. (So am I…) To be fair, we’re outgrowing the concept in several areas; folks say that Web browsers, unikernels, blockchains, and now apparently LLMs are all replacements for the operating system, because they resemble each other in functionality sometimes.

  • swlabr@awful.systems · 2 points · 1 year ago

    With the ever increasing hype for AI/LLMs, I imagine there are a lot of young people sponging up this kind of garbage “insight”, going to university, and incessantly asking all their lecturers how AI factors into whatever CS fundamentals course they’re taking. After receiving enough answers that don’t confirm their biases about AI meaningfully sinking its tentacles into every computing related topic, they’ll quit CS out of frustration, probably.

    • self@awful.systems · 3 points · 1 year ago

      oh absolutely, I saw exactly that play out but for crypto before that bubble burst

      “uhm professor what if we did the NP-complete algorithm on an ASIC, theoretically wouldn’t that let us increase our hashrate uhm I mean”

      and of course the classic: getting kicked out of school because you used one of the GPU servers to mine ethereum and one of the grad students noticed

  • Soyweiser@awful.systems · 1 point · edited · 1 year ago

    One of the first CS courses I ever had was one of the profs explaining that, according to the definition, a rock is also a computer, so thankfully I’m immune to smashing stuff after reading this. (This was also the era before we called programs ‘apps’, which almost causes me to smash things.)

    Also, that isn’t what the words ‘computing paradigm’ mean. If all those concepts etc. carry over, they are still Turing-machine-bound general-purpose computing machines.

    That makes me wonder: one of these days somebody should write an article on how the dotcom era has led to the fetishization of ‘radical innovations’ over merely incremental ones.

    • froztbyte@awful.systems (OP) · 1 point · 1 year ago

      > Also, that isn’t what the words ‘computing paradigm’ mean

      I meant for that to be dripping with sarcasm (which is why they’re capitalised), but I’m not sure if that came through

      • Soyweiser@awful.systems · 1 point · 1 year ago

        Oh yeah, I meant their usage of the word, not yours. But not knowing more than surface-level things about things is the way to become huge in tech. See for example how Musk can release a chatbot, say ‘it is based on the Hitchhiker’s Guide to the Galaxy’, and almost nobody remarks that if you read the books, it is revealed that one of those guidebooks is actually a malicious AI who led everybody to their deaths (to a place numbered 42, even, so the whole ‘Answer to the Ultimate Question of Life, the Universe, and Everything’ can be read more as a ‘pocket universe’ thing where they are led to the number 42 because that is where everything ends; if they knew this they would obviously avoid the number 42, so everything would become more complex). But this requires people to have actually read Mostly Harmless, or at least to have looked at its Wikipedia page. (I have not read And Another Thing… btw.)

        • self@awful.systems · 2 points · 1 year ago

          this is my reminder that it’s been about 20 years since I last read all the Hitchhiker’s Guide books and I should give them another read soon

        • sc_griffith@awful.systems · 1 point · 1 year ago

          heard an audio clip recently of musk trying to explain the hitchhiker’s guide to the galaxy and it was painful. he described the basic plot elements but as if they were without irony, essentially inverting their meanings

    • 200fifty@awful.systems · 1 point · 1 year ago

      I would blame the rise of smartphones for that too. Somehow everyone became convinced that if tech isn’t radically upending everyone’s lives every 10 years then something is wrong. But hey, maybe our lives don’t need tech companies to disrupt them anymore actually?

  • Steve@awful.systems · 1 point · 1 year ago

    > A lot of computing concepts carry over.

    I feel like an honest person would write this and decide the concept needs more thought.