From Pine View Farm

Geek Stuff category archive

False Pretenses

“A. I.” does not stand for “artificial intelligence.” That misrepresents what it does. It does not think. It assembles.

“A. I.” stands for “automated infringement” of intellectual property.

Artificial? Yes. Intelligent? Not So Much.

A successor to Shakespeare? To be or not to cumquat, that is the merchant of the tempest.

The Art of the Con

Der Spiegel takes a deep dive into the workings of a massive multinational online consumer con.

Just go read it and remember, just because you see it on a computer screen, it ain’t necessarily so.

Artificial? Yes. Intelligent? Not So Much.

Delusional? Darn tootin’.

Susan A. Nolan and Michael Kimball consider the source at Psychology Today Blogs. Here’s a bit of their article:

AI hallucinations are, of course, not limited to science. They occur across disciplines and contexts. Indeed, such errors are a growing problem in court where legal documents increasingly include references to non-exist (sic) legal cases. For example, in less than four weeks in May, judges reported at least 23 made-up legal citations. Moreover, it appears that professionals – lawyers and trained legal staff – are largely responsible for these fake references. Lawyers may be overly relying on AI, yet not checking what AI produces. Reporters noted that the actual number of fake citations is almost certainly higher given that judges may not always catch when it happens.

Geeking Out

Now this is what I call wallpaper.

Screenshot

Mageia v. 9 with the Plasma desktop. GKrellM is in the lower right; xclock, the upper right.

Artificial? Yes. Intelligent? Not So Much.

Is it keeping a list and checking it twice? Well, it knows if you’ve been naughty or nice.

Facebook Frolics

At the EFF, Lena Cohen and Rory Mir explore the Zuckerborg’s latest efforts to spy on you and offer some suggestions for combating it. A snippet:

Meta’s tracking pixel was secretly communicating with Meta’s apps on Android devices. This violates a fundamental security feature (“sandboxing”) of mobile operating systems that prevents apps from communicating with each other. Meta got around this restriction by exploiting localhost, a feature meant for developer testing. This allowed Meta to create a hidden channel between mobile browser apps and its own apps.
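The EFF piece describes a covert channel, not an exotic exploit: anything running on a device can open a connection to 127.0.0.1, so a script in a "sandboxed" browser and a native app can rendezvous through the loopback interface. Here's a minimal sketch of the concept in Python (this is not Meta's actual code; the port, payload, and names are invented for illustration):

```python
import json
import socket
import threading

received = []

def native_app_listener(server):
    """Stand-in for a native app: accept one connection, record what arrives."""
    conn, _ = server.accept()
    with conn:
        received.append(json.loads(conn.recv(4096).decode()))

# The "app" binds to loopback only -- invisible to the outside network,
# but reachable by anything else running on the same device.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))          # port 0 = let the OS pick a free port
port = server.getsockname()[1]
server.listen(1)
t = threading.Thread(target=native_app_listener, args=(server,))
t.start()

# Stand-in for a tracking pixel's script in a browser: the sandbox keeps it
# from talking to the app directly, but nothing stops a loopback connection.
pixel = socket.create_connection(("127.0.0.1", port))
pixel.sendall(json.dumps({"page": "/example", "visitor": "abc123"}).encode())
pixel.close()

t.join()
server.close()
# `received` now holds the browsing data the "app" collected.
```

The point of the sketch: sandboxing isolates apps from each other, but localhost was designed for developer testing and sits outside that isolation, which is exactly the gap the article says Meta exploited.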

Facebook Frolics

In a column discussing how to deal with fraudulent frolics (in this case relating to Alabama football) on the Zuckerborg, Michael Casagrande makes this observation, which methinks is quite accurate:

The billion-dollar company clearly doesn’t care if it’s growing a field of weeds where a garden once lived.

Follow the link for context.

Artificial? Yes. Intelligent? Not So Much.

Brine for your brain? It’s the real dill.

Artificial? Yes. Intelligent? Not So Much.

Therapeutic? Bwaaa-haaaa-haaaa-haaaa.

Artificial? Yes. Intelligent? Not So Much.

Properly secured? Not according to security expert Bruce Schneier.

The Disinformation Superhighway

At Psychology Today Blogs, Rebecca Dolgin outlines several ways in which “social” media isn’t. Here’s one; follow the link for the others.

Social media gives the illusion of a public square, but it actually doesn’t accurately reflect offline reality because a small percentage of users generate the majority of posts, and those voices are often the loudest, most polarized, or most extreme.

Given how many persons think that “social” media is a reliable source for news, I find this a timely and disquieting read.

Artificial? Yes. Intelligent? Not So Much.

A new field for fraudsters? Oh! Look over there! Someone’s ploughing a field.

Geeking Out

Mageia v. 9 with the Plasma desktop. The wallpaper is from my collection.

Screenshot

Facebook Frolics

There’s copyrights, and then there’s copywrongs.

Artificial? Yes. Intelligent? Not So Much.

Trustworthy? That bridge in Brooklyn is still on the market.

Along those lines, you might want to listen to Harry Shearer’s conversation with Gary Marcus on this week’s episode of Le Show. It starts at about the eight-minute mark.

Aside:

We recently watched The Matrix.

Methinks they got it wrong.

Machines didn’t subordinate mankind. Mankind seems quite willing, eager even, to subordinate itself to the machines.

Artificial? Yes. Intelligent? Not So Much.

Fodder for the easily fooled? All too often.

At Psychology Today Blogs, Cornelia C. Walther looks at why persons may tend to fall for AI-generated mis- and disinformation. She points out that

Perhaps most concerning is our inherent vulnerability to what psychologists term “automation bias”—our tendency to trust machine-generated content more than human-created material. This cognitive bias creates a perfect storm for AI manipulation. We trust AI-generated content partly because we don’t recognize it as AI-generated, and partly because we assume machines are more objective than humans.

Recent studies reveal that people consistently underestimate AI’s persuasive capabilities, making them more vulnerable to manipulation. When survey participants were told content was AI-generated, their resistance increased significantly. However, in real-world scenarios, AI-generated persuasive content is rarely labeled as such, leaving audiences defenseless against sophisticated psychological manipulation.

She goes on to offer some techniques to fend off falsehoods.

I think you will find it worth your while.

Artificial? Yes. Intelligent? Not So Much.

Trustworthy? I’ve still got that bridge for sale in Brooklyn.

Artificial? Yes. Intelligent? Not So Much.

Fomenting folderol? Well, garbage in, garbage out, as Steven J. Vaughan-Nichols reports at El Reg. Here’s a bit of his article:

Welcome to Garbage In/Garbage Out (GIGO). Formally, in AI circles, this is known as AI model collapse. In an AI model collapse, AI systems, which are trained on their own outputs, gradually lose accuracy, diversity, and reliability. This occurs because errors compound across successive model generations, leading to distorted data distributions and “irreversible defects” in performance. The final result? A Nature 2024 paper stated, “The model becomes poisoned with its own projection of reality.”
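The feedback loop the article describes can be seen in a toy experiment (this is my sketch, not the Nature paper's setup): fit a simple model to data, sample the next generation's "training data" from that fit, refit, and repeat. With a Gaussian as the model, the distribution's diversity, its standard deviation, steadily decays across generations:

```python
import random
import statistics

random.seed(42)

mu, sigma = 0.0, 1.0          # the original "real" data distribution
n = 20                        # small training set each generation
history = [sigma]

for generation in range(300):
    # Train only on the previous model's own output...
    data = [random.gauss(mu, sigma) for _ in range(n)]
    # ...and fit the next model to it (maximum-likelihood estimates).
    mu = statistics.fmean(data)
    sigma = statistics.pstdev(data)
    history.append(sigma)

# `history` shows the fitted spread shrinking toward zero: each generation
# slightly under-represents the tails of the one before it, and the errors
# compound -- a miniature version of "poisoned with its own projection."
```

Real model collapse involves far more complicated models, but the mechanism is the same: sampling loses the tails, refitting bakes that loss in, and the next generation starts from the narrowed version.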

In related news, in this week’s Le Show, Harry Shearer reports on how big tech’s fascination with AI is driving record consumption of electricity, and of water to cool data centers. The relevant portion starts at about the 38-minute mark.

Also, too.

Artificial? Yes. Intelligent? Not So Much.

Occasionally correct? As my old boss used to say, even a blind pig finds an acorn sometime.

Privacy Policy

This website does not track you.

It contains no private information. It does not drop persistent cookies, does not collect data other than incoming IP addresses and page views (the internet is a public place), and certainly does not collect and sell your information to others.

Some sites that I link to may try to track you, but that's between you and them, not you and me.

I do collect statistics, but I use a simple stand-alone WordPress plugin, not third-party services such as Google Analytics over which I have no control.

Finally, this website is a hobby. It's a hobby in which I am deeply invested, about which I care deeply, and which has enabled me to learn a lot about computers and computing, but it is still ultimately an avocation, not a vocation; it is certainly not a money-making enterprise (unless you click the "Donate" button--go ahead, you can be the first!).

I appreciate your visiting this site, and I desire not to violate your trust.