From Pine View Farm

Geek Stuff category archive

Artificial? Yes. Intelligent? Not So Much.

Sarah Silverman, among others, is suing the makers of AI bots for copyright infringement. She has the unmitigated gall to think that Tech Bros shouldn’t just vacuum up the work of others so as to line their own pockets.

A snippet:

The suits, filed July 7, accuse the two firms of using Silverman’s and two other plaintiffs’ work to enrich their AI systems, OpenAI’s ChatGPT and Meta’s LLaMA, in violation of book copyright law. It’s the latest in a stream of legal challenges to the training and output of AI systems, suits that have often accused the tech companies of lifting work without permission or pay.

“AI needs to be fair and ethical for everyone,” Matthew Butterick, one of the suit’s lawyers, said in a statement. “But Meta is leveraging the work of thousands of authors with no consent, no credit, and no compensation.”

Artificial? Yes. Intelligent? Not So Much.

Bruce Schneier points out that ChatGPT and similar “AI” bots work by mining data produced by others, then spitting it back out.

He proposes that those whose data is mined deserve to be reimbursed for their contributions. Follow the link for his reasoning.

All That Was Old Is New Again, Reprise

Thom reminds us that, as Mark Twain once observed, history doesn’t repeat itself, but it often rhymes.

Legends in Their Own Minds

Methinks Atrios is onto something.

Geeking Out

Mageia v. 8 with the Fluxbox window manager. The wallpaper is from my collection.

Screenshot

If I had to pick one thing that keeps me using Fluxbox, it’s the right-click menu.

Wherever the mouse pointer is on the screen, as long as it's not over an application window, a right-click brings up the menu.
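If you're curious where those menu entries come from, Fluxbox builds the root menu from a plain-text file, usually ~/.fluxbox/menu. Here's a minimal sketch of what such a file looks like; the programs and the wallpaper directory are illustrative examples, not my actual setup:

  # ~/.fluxbox/menu -- example only
  [begin] (Fluxbox)
    [exec] (Terminal) {xterm}
    [exec] (Web Browser) {firefox}
    [submenu] (Wallpaper)
      [exec] (Random from collection) {fbsetbg -r ~/wallpapers}
    [end]
    [workspaces] (Workspaces)
    [config] (Configure)
    [restart] (Restart)
    [exit] (Exit)
  [end]

Each [exec] line becomes a clickable item on that right-click menu, [submenu] nests a folder of items, and the built-in entries ([workspaces], [config], and so on) round out the window manager's own controls.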

Artificial? Yes. Intelligent? Not So Much.

At Psychology Today Blogs, Matt Grawitch argues that one of the effects of the growing use of artificial “intelligence” has been, paradoxically, to highlight the importance of human expertise and research. A snippet:

But the internet also brought challenges in determining the credibility and accuracy of information. Those with insufficient expertise to discriminate what was credible from what was not could easily research their way into questionable conclusions (e.g., the Earth is flat, vaccines cause autism). And we saw a rise in people claiming they had “done their own research” to justify their views, regardless of the legitimacy of the views they endorsed.

(snip)

Fast-forward to today and the increasing availability of AI-driven tools for research and decision-making. While many of these tools are very confined in terms of the scope of their capabilities, the introduction of broader AI, like Bard and ChatGPT, makes it possible for people to by-pass the process of researching a topic and building an argument and head straight to the conclusion or decision. The dangers of this, though, have been on full display recently, such as when a professor incorrectly flunked all his students for cheating (because ChatGPT told him they had) or the lawyer who used ChatGPT for legal research only to find that the cases he had cited didn’t exist (he was subsequently sanctioned).

Twits on Twitter

An ersatz twit.

Who woulda thunk?

Droning On

The marvels of modern technology . . . .

Investigators say Stephanie Merola, 32, was preparing to take a shower in her Cranston residence Wednesday evening when she heard a buzzing sound outside the bathroom window.

When Merola went outside to investigate, cops say, she spotted a drone hovering near the window. As she approached the drone, it began to fly away, but struck a tree branch and fell to the ground. Merola then grabbed the drone (pictured below) and dunked it in her pool, disabling the quadcopter’s electronics.

According to the report, the pilot has been–er–grounded.

The Bullies’ Pulpit

At Psychology Today Blogs, Mark Travers discusses a study of why some persons turn into cyberbullies. The findings were not what you might expect. An excerpt:

However, contrary to previous research linking cyberbullying to factors such as low self-esteem, the perceived anonymity of the online world, anger, and a desire for revenge, Soares’s study involving 359 Canadian young adults revealed distinct motivations for cyberbullying — primarily driven by what researchers have termed ‘recreation’ and ‘reward.’

“Recreation pertains to impulsive antisocial acts, whereas reward relates to more calculated and premeditated acts that may evolve over time,” said Soares. “Young individuals who partake in antisocial behavior online may be driven by a desire for excitement and the pursuit of positive emotions or social status among their peers.”

The Disinformation Superhighway

People worry about the “singularity.”

They should be worrying about the stupid-larity.

Artificial? Yes. Intelligent? Not So Much.

Emma and the crew look behind the curtain and call out the chicanery.

Behind That (AI) Curtain, Reprise

While we’re on the subject . . . .

The stupid. It burns.

Behind That (AI) Curtain

At Psychology Today Blogs, Gleb Tsipursky offers some pointers for detecting the deception. He makes three main points:

  • AI-generated misinformation blurs truth, making it hard to discern fact from fiction.
  • People can unmask AI content by scrutinizing it for inconsistencies and a lack of human touch.
  • AI content detection tools can spot and neutralize misinformation, protecting against its spread.

Follow the link for a detailed discussion of each one.

Aside:

I will add one bit of advice:

Don’t believe stuff just because you see it on a computer screen.

Artificial? Yes. Intelligent? Not So Much.

Bruce Schneier asks whether “we really want to entrust this revolutionary technology (AI–ed.) solely to a small group of US tech companies?”

He goes on to remind us that

Silicon Valley has produced no small number of moral disappointments. Google retired its “don’t be evil” pledge before firing its star ethicist. Self-proclaimed “free speech absolutist” Elon Musk bought Twitter in order to censor political speech, retaliate against journalists, and ease access to the platform for Russian and Chinese propagandists. Facebook lied about how it enabled Russian interference in the 2016 US presidential election and paid a public relations firm to blame Google and George Soros instead.

Follow the link for the rest of his thoughts.

Twits Own Twitter

Elon Musk apparently has the courage of his evictions.

A Boulder landlord succeeded in obtaining an order to evict Twitter over unpaid rent, according to recent court documents.

Artificial? Yes. Intelligent? Not So Much.

At Above the Law, Ethan Beberness reports that OpenAI and ChatGPT just might be getting their day in court–as defendants.

No Place To Hide

At the Washington Monthly, Karina Montoya has a long and detailed article about how advertising strategy is changing once again. In the past decade, advertising moved to “social” media, with the disturbing side effect of eroding the business models of legitimate news organizations. Now, she argues, retailers are selling to advertisers the personal information they gather through loyalty programs, credit card purchases, and the like. The entire piece is worth a read, but this particular bit caught my eye:

Recently, I got curious about what CVS might know about me through my participation in its ExtraCare loyalty program. After accessing the CVS website, I requested the records it had about me. CVS—a conglomerate whose mergers and acquisitions include Aetna insurance, Caremark, and physician practices like Oak Street Health—knows a lot about me. In addition to my shopping history of the last 12 months, CVS knows my ethnicity, country of origin, household location, income level, types of credit cards, homeowner status, and interest in weight loss, vitamins, and natural foods. And there are little or no regulatory barriers to CVS using this data to sell me stuff or letting others use it to target me with ads.

Corporations, not the government, are your “surveillance state.” And we walk nekkid through its streets every day.

It’s All about the Algorithm

Said algorithm engages those eyeballs and sucks them right down into a vortex of vile.

“Social” media isn’t.

Artificial? Yes. Intelligent? Not So Much.

In Georgia, Mark Walters, who is apparently a local radio personality of some sort, has sued OpenAI for libel based on falsehoods propagated by ChatGPT, and Techdirt wonders whether the suit has a prayer. Here’s a bit from their article; follow the link for context.

Of course, all of this raises a bunch of questions: Is this actually defamatory? Is there actual malice? If so, who is legally liable?

And I’m not sure there are really good answers. First off, only one person actually saw this information, and there’s no indication that he actually believed any of it (indeed, it sounds like he was aware that it was hallucinating), which would push towards it not being defamation and even if it was, there was no harm at all.

Second, even if you could argue that the content was defamatory and created harm, is there actual malice by Open AI? First off, Watson is easily a public figure, so he’d need to show actual malice by OpenAI . . . .

Aside:

Whether Walters wins or loses, I doubt he’ll be the last to want a day in court with ChatGPT.

Full-Face and Profiled

The EFF reports on a court’s finding about farcical recognition. A snippet (emphasis added):

In a victory for transparency in police use of facial recognition, a New Jersey appellate court today ruled that state prosecutors—who charged a man for armed robbery after the technology showed he was a “possible match” for the suspect—must turn over to the defendant detailed information about the face scanning software used, including how it works, source code, and its error rate.

Follow the link for details.

Privacy Policy

This website does not track you.

It contains no private information. It does not drop persistent cookies, does not collect data other than incoming IP addresses and page views (the internet is a public place), and certainly does not collect and sell your information to others.

Some sites that I link to may try to track you, but that's between you and them, not you and me.

I do collect statistics, but I use a simple stand-alone WordPress plugin, not third-party services such as Google Analytics over which I have no control.

Finally, this website is a hobby. It's a hobby in which I am deeply invested, about which I care deeply, and which has enabled me to learn a lot about computers and computing, but it is still ultimately an avocation, not a vocation; it is certainly not a money-making enterprise (unless you click the "Donate" button--go ahead, you can be the first!).

I appreciate your visiting this site, and I desire not to violate your trust.