• 2 Posts
  • 330 Comments
Joined 1 year ago
Cake day: July 10th, 2023

  • The old thread I posted this in was deleted, but I wrote this:

    Okay, so hear me out. I have this pet theory that might explain some of the divide between genders, and between political parties too; a divide that causes paralysis and might ultimately lead to humanity's extinction. Forgive me if I'm stating the obvious.

    I’m going to set up two axioms to arrive at an extrapolated conclusion.

    One: Human psychology tends to ascribe more weight to negative things than to positive things in the short term. In the long term this generally balances out, but in the short term it's more prudent, in a biological sense, to pay attention to the rustling in the bushes than to the berries you might pick from them. This is known as the negativity bias.

    Two: The modern gatekeepers of social interaction, Big Tech, employ blind algorithms that steer your attention towards spending more time on their platforms. These companies are the arbiters of the content we experience daily, and what we do and don't see is mostly at their discretion. The techniques they employ are, in simple terms, designed to provoke what they call 'engagement'. They do this because, at the end of the day, FAANG have not just a financial interest but a fiduciary duty to their shareholders, and that duty is discharged by selling advertisements. The more they can engage you, the more ads they can sell. They run live A/B tests, divide people into cohorts, and poke and prod each cohort with psychological techniques to glue eyeballs to ads (there's a toy sketch of this loop at the end of the post).

    Extrapolated conclusion: These companies have a financial, and arguably legally binding, interest in dividing the population against itself, obstructing politics and social interaction to the point where we might no longer be able to coordinate on the goals we need to reach to prevent oblivion.

    Thank you for coming to my TED Talk.
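
    To make the mechanism concrete, here's a purely hypothetical toy ranker (not any platform's actual code) showing how the two axioms compose. Every name, weight and number below is made up for illustration.

    ```python
    # Toy illustration only: how an engagement-maximizing ranker can end up
    # amplifying divisive content. All names, weights and numbers are made up.
    import random
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        predicted_dwell_seconds: float  # the model's guess at how long you'll stare
        outrage_score: float            # 0..1, how negative/divisive it reads

    def engagement_score(post: Post, negativity_weight: float) -> float:
        # Axiom one: negative content grabs short-term attention, so an
        # engagement-only objective ends up rewarding it.
        return post.predicted_dwell_seconds * (1 + negativity_weight * post.outrage_score)

    def rank_feed(posts: list[Post], cohort: str) -> list[Post]:
        # Axiom two: live A/B testing per cohort; whichever weight keeps
        # eyeballs on ads longer is the variant that ships.
        negativity_weight = {"A": 0.5, "B": 2.0}[cohort]
        return sorted(posts, key=lambda p: engagement_score(p, negativity_weight),
                      reverse=True)

    if __name__ == "__main__":
        feed = [
            Post("Berry-picking tips for beginners", 30.0, 0.1),
            Post("THEY are coming for your berries", 20.0, 0.9),
        ]
        cohort = random.choice(["A", "B"])
        for post in rank_feed(feed, cohort):
            print(f"cohort {cohort}: {post.text}")
    ```

    In cohort B the outrage post takes the top slot even though the model predicts people would actually linger longer on the berry tips. Nobody chose that outcome; the objective function did.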


  • In 15 words: deep learning worked, got predictably better with scale, and we dedicated increasing resources to it.

    Are you sure about that, Sam? Because one, you're the snake oil salesman writing this and I wouldn't trust you as far as I can throw you, and two, maybe it does scale predictably, but the prediction is that training the next generation for a marginal improvement will cost an exponentially larger 100 billion dollars (and that's already taking your Microsoft compute discount into account). You're hitting a wall hard and the profits are still nowhere in sight. This avenue of progress is a dead end and Sam knows it, because OpenAI is selling PPUs instead of stock and courting Saudi investment. Don't get stuck holding the bag, folks; the few thousand days Sam claims to need aren't survivable.
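
    For anyone who wants the arithmetic behind "scales predictably, but the bill grows exponentially", here's a rough sketch assuming a Chinchilla-style power law between compute and loss. The exponent and dollar figures below are illustrative assumptions, not OpenAI's actual numbers.

    ```python
    # Back-of-the-envelope arithmetic, not OpenAI's actual figures.
    # Assume a power-law scaling relationship: loss(C) = a * C**(-alpha).
    # Cutting the loss by a fixed fraction then needs a fixed *multiplier*
    # of compute: roughly linear gains for exponentially growing bills.

    alpha = 0.05            # illustrative scaling exponent (assumption)
    training_cost = 0.1e9   # $100M baseline for the current generation (made up)

    # Compute multiplier needed to shave 10% off the loss each generation:
    multiplier = (1 / 0.9) ** (1 / alpha)   # roughly 8x more compute per step

    for gen in range(1, 5):
        training_cost *= multiplier
        print(f"generation +{gen}: ~${training_cost / 1e9:.0f}B "
              f"for another ~10% loss reduction")
    ```

    Plug in friendlier numbers if you like; the shape of the curve is the point, not the exact figures.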