You’re choosing between “lots of people being killed” vs “LOOOOOOTTTTTSSSS of people being killed”
Based on the morality you yourself have outlined, ethically you would choose to vote for Kamala then, as under her far, far fewer people will die.
Weird take.
People can care about and discuss more than one thing at a time.
Maybe, but the intersection of this group and certain plane trips to a certain island is suspicious.
One vehicle delivering 2 people’s food is better than 2 people driving out to get food, tbh.
Overall delivery drivers substantially reduce traffic.
For more deliverable stuff like packages, 1 delivery truck delivering 40 people’s packages in one trip is so much better than 40 individual households all driving to Walmart or whatever.
I am fine with the majority of traffic just being delivery vehicles and public transit, those are the two actually effective uses for vehicles at the public level.
I’ve been looking for this for a while too.
Not a locally run tool, but a self-hosted web app (that I wire up to my self-hosted db) with a web portal I log in to, where I can manage my db with a nice slick UI to define tables, relations, etc.
There have been some I’ve found, but they lacked basic features and were clearly in very early beta.
I use Hugo, it’s not super complicated.
You basically just define templates in pseudo-HTML for common content (header, nav panel, footer, etc), and then you write your articles in markdown and Hugo combines the two and outputs actual HTML files.
You also have a static folder for js, css, and images, which gets copied into the output as-is.
That’s about all there is to it, it’s a pretty minimalist static site generator.
Hosting-wise, you can just put it on GitHub Pages for free.
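If it helps to see the idea, here’s a toy sketch (in Python, just for illustration — Hugo itself uses Go templates) of what a static site generator is conceptually doing: read markdown from a content folder, wrap it in a shared template, write HTML to an output folder. The folder names and the bare-bones markdown handling here are placeholder assumptions, not how Hugo actually works internally.

```python
# Toy sketch of what a static site generator does conceptually:
# combine a shared template with per-page markdown content, emit HTML files.
from pathlib import Path

TEMPLATE = """<html>
  <head><title>{title}</title></head>
  <body>
    <nav>My Site</nav>
    <main>{body}</main>
    <footer>(c) me</footer>
  </body>
</html>"""

def naive_markdown(text: str) -> str:
    # Real generators use a proper markdown parser; this only handles
    # "# " headings and plain paragraphs, just to show the idea.
    html_lines = []
    for block in text.strip().split("\n\n"):
        if block.startswith("# "):
            html_lines.append(f"<h1>{block[2:]}</h1>")
        else:
            html_lines.append(f"<p>{block}</p>")
    return "\n".join(html_lines)

def build(content_dir: str = "content", out_dir: str = "public") -> None:
    Path(out_dir).mkdir(exist_ok=True)
    for md_file in Path(content_dir).glob("*.md"):
        body = naive_markdown(md_file.read_text())
        page = TEMPLATE.format(title=md_file.stem, body=body)
        Path(out_dir, md_file.stem + ".html").write_text(page)

if __name__ == "__main__":
    build()  # writes one .html per .md into public/
```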
Well yeah, I’d hope so, that’s the entire point.
Captchas’ data collection was always intended for training AI on these skills. That’s “the point” of them.
It’s reasonable to expect that the older version of captchas can now be beaten by modern ai, because they’re often literally trained on that exact data to beat it.
Captchas are effectively free for websites to use as a tool because the data collection is the “payment”; they then license that data out to companies like OpenAI to train with for stuff like image recognition.
It’s why ai is progressing so fast, captchas are one of humanity’s long term collected data silos that are very full now.
We are going to have to keep progressing the complexity of captchas, as that will be the only way to catch modern AIs, and in turn they will collect more data to improve the AIs.
Not quite.
It’s mostly wisdom of the crowd, as it always has been.
As long as you mostly click the same squares most other people click, you pass.
You often randomly get 2-3 images because 2 of them are actual checks, but the third is a new image that you auto-pass; they’re using it to gather data on what the average clicks on it are.
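Just to illustrate the idea (a minimal sketch of the wisdom-of-the-crowd concept described above, not anything like the actual reCAPTCHA implementation — the grid numbering and thresholds are made up):

```python
# Sketch of "pass if you mostly agree with what the crowd clicked".
from collections import Counter

def crowd_consensus(past_answers: list[frozenset[int]]) -> set[int]:
    """Squares that a majority of previous humans clicked for this image."""
    counts = Counter(square for answer in past_answers for square in answer)
    threshold = len(past_answers) / 2
    return {square for square, n in counts.items() if n > threshold}

def passes(user_clicks: set[int], consensus: set[int], tolerance: int = 1) -> bool:
    """Pass if the user's clicks mostly match the crowd consensus."""
    mismatches = len(user_clicks ^ consensus)  # symmetric difference
    return mismatches <= tolerance

# Example: squares numbered 0-8 in a 3x3 grid.
history = [frozenset({0, 1, 4}), frozenset({0, 1, 4}), frozenset({0, 4})]
consensus = crowd_consensus(history)  # {0, 1, 4}
print(passes({0, 1, 4}, consensus))   # True  (agrees with the crowd)
print(passes({2, 5, 8}, consensus))   # False (doesn't agree)
```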
If you want to win an election, Google might be arguably the worst possible company to directly threaten, not gonna lie.
Pretty sure if they wanted to, they could end your political career in any manner of ways.
I bet trump’s Google search history would be devastating if they threatened him with releasing it to the public, lol.
This continues to boil down to that tired argument that an amalgamation of human behavior is distinct from how humans actually behave; but since no one can actually prove how humans produce thoughts, you can’t actually prove that an LLM works any differently.
So I don’t really dig into that argument.
To be honest, the one thing that LLMs actually are good at is summarizing bodies of text.
Producing a critique of a manuscript isn’t actually too far out for an LLM; it’s sorta what it’s always doing, all the time.
I wouldn’t classify it as something to use as a concrete review, and one must also keep in mind that context windows on LLMs are usually limited to only thousands of tokens, so they can’t even remember anything from more than like 5 pages ago. If your story is bigger than that, they’ll struggle to comment on anything before the last 5 or so pages, give or take.
Asking an LLM to critique a manuscript is a great way to get constructive feedback on specific details, catch potential issues, maybe even catch plot holes, etc.
I’d absolutely endorse it as step 1 before giving it to an actual human; you can likely substantially improve your manuscript by iterating over it 3-4 times with an LLM, covering basic issues and improvements, then letting an actual human focus on the more nuanced stuff an AI would miss/ignore.
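On the context window point, here’s a rough sketch of how you might sanity-check whether a manuscript even fits before asking for a critique. The 4-characters-per-token ratio and the 8,000-token window are assumptions, and manuscript.txt is a hypothetical file; check your actual model’s limits.

```python
# Rough estimate of token count and paragraph-boundary chunking so each
# chunk fits comfortably inside an assumed context window.
CHARS_PER_TOKEN = 4            # crude heuristic, not exact
CONTEXT_WINDOW_TOKENS = 8_000  # assumed limit; varies by model

def estimate_tokens(text: str) -> int:
    return len(text) // CHARS_PER_TOKEN

def split_into_chunks(text: str, max_tokens: int = CONTEXT_WINDOW_TOKENS // 2):
    """Split on paragraph boundaries, leaving room in the window for the reply."""
    chunks, current, current_tokens = [], [], 0
    for paragraph in text.split("\n\n"):
        p_tokens = estimate_tokens(paragraph)
        if current and current_tokens + p_tokens > max_tokens:
            chunks.append("\n\n".join(current))
            current, current_tokens = [], 0
        current.append(paragraph)
        current_tokens += p_tokens
    if current:
        chunks.append("\n\n".join(current))
    return chunks

manuscript = open("manuscript.txt").read()  # hypothetical file
print(f"~{estimate_tokens(manuscript)} tokens total")
for i, chunk in enumerate(split_into_chunks(manuscript), 1):
    print(f"Chunk {i}: ~{estimate_tokens(chunk)} tokens -> critique separately")
```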
Because having people download static map data for the entire planet just to play a game is untenable.
You shouldn’t have to download the entire planet though.
The game 100% should support installing specific local areas you wanna fly around, that anyone could then keep a copy of.
If a user wanted to cache the entire 8 TB of the world on a drive, they should be able to just do that (and thus have support forever without worrying about internet services staying online).
At least, as a snapshot of what the world looked like in 2024.
I don’t see why users shouldn’t have the option to save the data locally to their own drive if they want to, to avoid maxing out their internet bandwidth in one sitting.
“Move Fast and Break Things” is Zuckerberg/Facebook’s motto, not Musk’s, just to note.
She has so much more sway than The Chicks had at the time, by such a huge amount.
The Chicks were already controversial, and had always been so.
But Dolly is the fucking queen of country, people revere her.
How many Dixie Chicks lookalike competitions were you seeing right before their career nosedived?
Yeah, this is just noticeable because most products weren’t even resealable; they just expected you to seal em yourself with a clip, twist em, put em in a container, etc.
Now they are adding cheap resealable zips to the bag, which is nice in theory but the bag material has to be strong enough to support it.
Actual ziplock baggies themselves are made of thick plastic that can take a bit of abuse.
But the cheap paper-plastic hybrid material a chip bag is made of can’t handle that sort of load, so it becomes the failure point.
Regardless of budget, I have found the following setup has afforded me all the comfort upsides of mobility and console gaming, with none of the performance downsides.
1. Build a standard desktop gaming pc to your budget, setting aside ~$150, give or take.
2. Make sure it’s wired into your network and not using wifi. Set up Steam on it as usual.
3a. (Console experience) Buy a Google TV with Chromecast, or whatever it’s called now. Install the Steam Link app on it and connect it to your gaming pc. Get a Bluetooth-compatible Xbox controller and connect it to the Chromecast. Enjoy a console experience with your gaming pc. If you have the Chromecast on a wired ethernet line you’ll have maybe 1ms of input lag, very playable.
3b. (Laptop experience) Buy a dirt cheap laptop, install Steam on it, and use Steam’s streaming functionality to stream from the gaming pc to the laptop. If you plug the laptop into ethernet you should have sub-1ms input lag.
This lets you get all the horsepower of a gaming pc, at gaming pc hardware prices, but with the portability of a laptop and/or the couch gaming comfort of a console.
And since it’s all centralized to your 1 “server” machine, if you make changes in setup 3a (i.e. change an in-game setting or whatever), they’ll persist even if you swap over.
I.e. if I change my settings or preferences on the console, they’ll persist over on my laptop and I won’t have to change them again.
Furthermore, no network save game syncing is needed, no waiting for a game to download a second time, no need to update the game multiple times, etc.
It’s all centralized to your own core machine and everything else is just a thin client.
PS: this works with the Steam Deck too, you can stream from gaming pc to steam deck and use it as a thin client 👍
It’s heavily because you call out to your SO a lot, and their full name is a mouthful.
Typically words like “babe”, “hun”, etc are the lowest-effort pet names. The “b” plosive is one of the easiest sounds to pronounce.
Usually this is simply to make communication faster and easier, “hun” is way faster to say than whatever their full name is.
This becomes so commonplace that after being together for many years, their full name is reserved for emergencies.
Like if you cut yourself or are hurt or whatever, you instinctually use their full name to grab their attention and alert them. (People alert to their full name way easier and can hear it better)
This results in an alarmed “wtf?” response when you use it casually; it makes them whip their head up and their brain goes “is something wrong?”
Then when they realize the situation is fine, it becomes a sort of “you spooked me for nothing! Don’t!” result.
You effectively reserve the full name for when you are trying to get their attention.
Dunno why ppl are downvoting you, this is 100% the way.
Architecture as code is amazing; being able to completely wipe your server, reinstall fresh, turn it on, and have it go right back to how it was is awesome.
GitOps version controlled architecture is easy to maintain, easy to rollback, and easy to modify.
I use k8s for my entire homelab. It has some initial learning curve, but once you “get it” and have working configs on github, it becomes trivial to add more stuff to it, scale it up, etc.
This… actually might be my new daily driver for note taking.
I am using neorg atm, but the LSP integration is phenomenal.
Re-using symbol navigation, code actions, hover hints, etc from the LSP API for markdown is chef’s kiss.
That’s so clever, I gotta take this for a spin.
That’s the natural end game of capitalism, yes.
Not much you can do about it, it’s human nature.