From Pine View Farm

Geek Stuff category archive

The Myth of Multitasking

At Psychology Today Blogs, Joyce Marter debunks the bunk. A snippet:

While multitasking may seem like a productivity booster, it can also lead to decreased focus, poorer work quality, and increased stress levels. Multitasking has been proven to reduce productivity and job performance . . . .

Follow the link for context.

Artificial? Yes. Intelligent? Not So Much.

The Register reports on a New York law firm that tried to use ChatGPT to justify a ginormous billing.

The judge was not impressed.

Geeking Out

Screenshot: Mageia v. 9 with the Plasma desktop. Firefox is shaded near the top of the screen under the Plasma menu, Xclock is in the upper right, and GKrellM is in the lower right. The wallpaper is from my collection.

Recently, I ran an online upgrade from v. 8 to v. 9. The online upgrade from v. 7 to v. 8 went smooth as glass, and this one seemed to as well, but, when it was done, I was unable to run updates or install new software from the repos. I poked at the problem for a while but was unable to resolve it, so last night I installed v. 9 from optical media while listening to a BBC Lord Peter Wimsey mystery at the Old Time Radio Theater.

The installation went quickly and easily, and, as I have a separate /home partition, when I fired it up, all my configuration files were still in place without my having to restore anything from backup (and, no, you can’t do that on Windows). I’m currently cleaning up the dust bunnies, such as installing the few applications, like Xclock and GKrellM, that are not part of the standard Mageia installation.
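
For anyone doing the same sort of post-reinstall cleanup, a quick Python sketch along these lines can report which favorite extras are still missing; the program list below is just an example based on the ones mentioned above:

    #!/usr/bin/env python3
    """Rough sketch: report which favorite extras are missing after a fresh
    install. The program list below is only an example."""

    import shutil

    # Extras mentioned above that are not part of the default install.
    WANTED = ["xclock", "gkrellm"]

    def missing(programs):
        """Return the programs that are not found on the PATH."""
        return [p for p in programs if shutil.which(p) is None]

    if __name__ == "__main__":
        gone = missing(WANTED)
        if gone:
            print("Still to install:", ", ".join(gone))
        else:
            print("All the usual extras are already installed.")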

Meta: Purged Plugin

Thanks to the most excellent detective work of my hosting provider’s tech support staff, it has been deemed necessary to remove the NOAA Weather plugin that sat on the sidebar over there —–> for years. The plugin has not been updated in several years and is no longer compatible with the most recent versions of PHP, the scripting language that powers WordPress, which in turn powers this geyser of genius. (Or is it a drenching of drivel? Inquiring minds want to know.)

As one who wore a headset for over half a decade, I must say that my hosting provider’s tech support staff is superb.

I know, because I’ve been there.

Artificial? Yes. Intelligent? Not So Much. Dangerous? Certainly.

At Psychology Today Blogs, Dr. Marlynn Wei takes a look at the psychological implications of the spread of deepfakes and “AI” clones and lists half a dozen dangers. Here’s one (emphasis in the original):

4. Creation of false memories

Research in deepfakes shows that people’s opinions can be swayed by interactions with a digital replica, even when they know it is not the real person. This can create “false” memories of someone. Negative false memories could harm the reputation of the portrayed person. Positive false memories can have complicated and unexpected interpersonal effects as well. Interacting with one’s own AI clone could also result in false memories.

Her article is a worthwhile read, so be on your guard.

Artificial? Yes. Intelligent? Not So Much.

Via Bruce Schneier, here’s a study that demonstrates that AI can be made more human-like.

That is, it can be “trained” to deceive.

As the song* says:

    The things that you’re liable
    To see in your large language model,
    They ain’t necessarily so.

_________________

*With apologies to George Gershwin.

Deepfaked

You can read the news story here.

Extra-Special Bonus QOTD

Kerry Greenwood:

Email is a trap; it feels private, but it isn’t.

Greenwood, Kerry, Trick or Treat (Scottsdale, AZ: Poisoned Pen Press, 2010), p. 161

Suckered by the Algorithm

At Psychology Today Blogs, Bill Sullivan offers yet more evidence that “social” media isn’t, particularly for the young. Here’s a bit of his article:

Studies have shown that social media use, especially among younger people, increases materialism—a drive to accumulate and flaunt money and possessions. Such a mindset implies that external goods like Italian sports cars, Cartier watches, or gourmet dining are signs of a person’s status. Materialists gauge their own worth—and the worth of other people—by how much money and bling they’ve acquired, rather than judging internal goods like character. The materialistic mindset fosters attitudes of competition, envy, and greed.

Madonna’s claim that we are living in a material world is backed by convincing data.

Follow the link, then be sure to post it to your Zuckerborg or Muskrat page.

Facebook Frolics

Under-valuation frolics.

Devolution

Image of man evolving from apes, learning to walk upright, then once again walking hunched over, glued to a cell phone.

Via C&L.

Artificial? Yes. Intelligent? Not So Much.

Just because you see–er–hear it on a computer–er–device, it ain’t necessarily so.

The Second Circuit has referred attorney Jae S. Lee to a grievance panel for citing a fake case made up by ChatGPT in a complaint and never checking to see if the case spit out by the AI was real.

Details at the link.

Facebook Frolics

A Consumer Reports study details the extent to which you have been assimilated by the Zuckerborg and its enablers. Indeed, perhaps the most astounding bit in the report is the number of Zuckerborg enablers who “share” your data with Facebook.

Follow the link for the article.

Using a panel of 709 volunteers who shared archives of their Facebook data, Consumer Reports found that a total of 186,892 companies sent data about them to the social network. On average, each participant in the study had their data sent to Facebook by 2,230 companies. That number varied significantly, with some panelists’ data listing over 7,000 companies providing their data.
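
If you’re curious how your own archive stacks up, a Python sketch along these lines could tally the companies in a downloaded copy of your Facebook data. The file path and JSON layout below are assumptions (Facebook’s export format varies), so adjust them to whatever your archive actually contains:

    #!/usr/bin/env python3
    """Rough sketch: count the distinct companies in the off-Facebook
    activity file of a Facebook data export. The path and JSON layout are
    assumptions; adjust them to match your own archive."""

    import json
    from pathlib import Path

    # Hypothetical location inside an unzipped "Download Your Information" archive.
    EXPORT_FILE = Path("facebook-export/your_off-facebook_activity.json")

    def count_companies(path):
        """Return the distinct company names found in the export, sorted."""
        data = json.loads(path.read_text(encoding="utf-8"))
        # Assumed layout: {"off_facebook_activity": [{"name": "Some Company", ...}, ...]}
        entries = data.get("off_facebook_activity", [])
        return sorted({entry["name"] for entry in entries if "name" in entry})

    if __name__ == "__main__":
        companies = count_companies(EXPORT_FILE)
        print(f"{len(companies)} companies sent data about you:")
        for name in companies:
            print(" -", name)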

One more time, “social” media isn’t.

You don’t use “social” media. It uses you.

H/T Bruce Schneier for the heads up.

Artificial? Yes. Intelligent? Not So Much.

The New York Times reports on an internet user who used “AI” to compose a false and misleading obituary just to get clicks (and advertising revenue), spreading lies and drowning truth along the way.

Just go read it. The “intelligence” may be “artificial,” but the stupid is real.

The Bullies’ Pulpit

One more time, “social” media isn’t.

Artificial? Yes. Intelligent? Not So Much. (Updated)

I find it ironic that what used to be called “data scraping” somehow morphs into being “training” when the scraper is labeled “AI.”

Aside:

Can this be? Is “AI” the new blow-up doll?

Addendum:

Bruce Schneier offers a hint as to how to out “AI” bots on “social” media.

Frozen

It turns out that Teslas don’t seem to like really cold weather. Here’s a bit of the report from The Register:

In the Oak Brook suburb of Chicago, Illinois, where temperatures have routinely dipped way below freezing, local media reported public charging stations turning into “car graveyards” because motorists were unable to power their vehicles.

“Nothing. No juice. Still on zero percent, and this is like three hours being out here after being out here three hours yesterday,” Tesla owner Tyler Beard told Fox 32.

He wasn’t alone. Dozens of cars were reportedly lined up and abandoned at the Tesla supercharging station in Oak Brook along with multiple charging stations around Chicago.

Artificial? Yes. Intelligent? Not So Much.

It turns out that, when persons ask “AI” questions about case law, “AI” tends to just make stuff up, or “hallucinate,” to use the term from the article at Above the Law.

The Surveillance State Society

The EFF reports on a victory for privacy. A snippet:

Phone app location data brokers are a growing menace to our privacy and safety. All you did was click a box while downloading an app. Now the app tracks your every move and sends it to a broker, which then sells your location data to the highest bidder, from advertisers to police.

So it is welcome news that the Federal Trade Commission has brought a successful enforcement action against X-Mode Social (and its successor Outlogic).

The FTC’s complaint illustrates the dangers created by this industry. The company collects our location data through software development kits (SDKs) incorporated into third-party apps, through the company’s own apps, and through buying data from other brokers. The complaint alleged that the company then sells this raw location data, which can easily be correlated to specific individuals.

More at the link.

Aside:

I find it ironic that persons sweat bullets about limited and regulated “government surveillance” while willingly and heedlessly running nekkid before corporate collectors of confidentia–oh, never mind.

Deceptive by Design

At Psychology Today Blogs, Penn State professor Patrick L. Plaisance looks at the hazards of designing chatbots and similar “AI” mechanisms (after all, that’s what they are: mechanisms) to interact with users (i.e., people) as if said mechanisms were people. For example, he mentions programming them so that they appear to be typing or speaking a response at a human-like speed when, in actuality, they formed their complete response in nanoseconds.

He makes three main points; follow the link for a detailed discussion of each.

  • Anthropomorphic design can be useful, but unethical when it leads us to think the tool is something it’s not.
  • Chatbot design can exploit our “heuristic processing,” inviting us to wrongly assign moral responsibility.
  • Dishonest human-like features compound the problems of chatbot misinformation and discrimination.
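
To make the fake-typing trick concrete, here is a toy Python sketch of the effect he describes: the reply is fully computed before the first character appears, and the pauses are pure theater (the timings are invented for illustration):

    #!/usr/bin/env python3
    """Toy sketch of the 'fake typing' effect described above: the reply is
    computed up front, then dribbled out at a human-ish pace."""

    import random
    import sys
    import time

    def fake_typing(reply, min_delay=0.02, max_delay=0.08):
        """Print a precomputed reply one character at a time, with random
        pauses, so a machine-made answer looks like someone typing it."""
        for ch in reply:
            sys.stdout.write(ch)
            sys.stdout.flush()
            # The "thinking" is already done; only the presentation is slow.
            time.sleep(random.uniform(min_delay, max_delay))
        sys.stdout.write("\n")

    if __name__ == "__main__":
        answer = "Of course! I'm happy to help with that."
        fake_typing(answer)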
