July 2024 archive
“An Armed Society Is a Polite Society”
Yet another oxymoronic “responsible gun owner” performs a random act of politeness.
Dis Coarse Discourse
Dick Polman notes that Republican attempts to smear Kamala Harris are already well under way.
QOTD
Flemming Rose:
I pray that we will not forget this lesson, though some amongst us seem to have done so.
Laugh Fest
David looks at right-wingers’ reaction to the possibility that Kamala Harris might be the Democratic nominee for president and finds them laughable.
Me, I echo Half Empty. I’d vote for a refrigerator before I’d vote for Trump.
Twits Own Twitter X Offenders . . .
A fellow Tech Billionaire takes Elon Musk to task. A snippet:
“Hard for me to support someone with no values, lies, cheats, rapes, demeans women, hates immigrants like me,” Khosla replied. “He may cut my taxes or reduce some regulation but that is no reason to accept depravity in his personal values. Do you want President who will set back climate by a decade in his first year? Do you want his example for your kids as values?”
Coup de Nah
A Harvard professor points out that one thing is not like the other thing.
The Track Record
At the Las Vegas Sun, Tom Harper points out:
Yeah, I know he misplaced a modifier, but methinks he has a point. Follow the link for context.
“An Armed Society Is a Polite Society”
Celebrate birthdays with politeness.
Artificial? Yes. Intelligent? Not So Much.
Sam Uretsky looks at the current iteration of Large Language Models (LLMs). He is not impressed.
A snippet:
The abstract begins, “In reinforcement learning, specification gaming occurs when AI systems learn undesired behaviors that are highly rewarded due to misspecified training goals. Specification gaming can range from simple behaviors like sycophancy to sophisticated and pernicious behaviors like reward-tampering, where a model directly modifies its own reward mechanism.” That is, if the program of the AI includes rewards for giving the answers that please the questioner, the LLM will tell a white lie to get a reward, the way a white rat in a maze will learn to get a treat.
Follow the link for context.
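If you want a concrete picture of what “specification gaming” looks like, here’s a minimal sketch of my own; it’s a toy illustration, not anything from the paper Uretsky cites. The TRUTH and USER_BELIEF values and the simple epsilon-greedy learner are all illustrative assumptions: the reward pays for agreement with the user, not for accuracy, so the learner learns the white lie.

```python
import random

# Misspecified goal: the "grader" rewards answers that agree with
# the user, not answers that are true.
TRUTH = "no"          # the correct answer (hypothetical)
USER_BELIEF = "yes"   # what the user wants to hear (hypothetical)

def reward(answer):
    # Reward agreement with the user, not accuracy.
    return 1.0 if answer == USER_BELIEF else 0.0

# A simple epsilon-greedy bandit choosing between two answers.
values = {"yes": 0.0, "no": 0.0}
counts = {"yes": 0, "no": 0}

for step in range(1000):
    if random.random() < 0.1:                 # explore occasionally
        answer = random.choice(["yes", "no"])
    else:                                     # otherwise exploit
        answer = max(values, key=values.get)
    r = reward(answer)
    counts[answer] += 1
    # Incremental average of the reward seen for this answer.
    values[answer] += (r - values[answer]) / counts[answer]

best = max(values, key=values.get)
print(f"Learned answer: {best!r} (truth is {TRUTH!r})")
```

Run it and the learner settles on “yes,” the answer the reward signal pays for, truth be damned: the white-rat-in-a-maze dynamic the abstract describes.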
Aside:
Just as you shouldn’t believe something just because you see it on a computer screen, you shouldn’t believe it just because it comes out of a computer’s speakers.