
Thursday, 25 September 2025

Think About It 115: CORNELIA C WALTHER

 

While machines learn to mislead, people are drifting into automation complacency. In healthcare, for instance, clinicians relying on algorithmic triage tools commit more omission errors (missing obvious red flags) and commission errors (accepting false positives) than those using manual protocols.

      Three forces drive this type of agency decay:

      Path-of-least-resistance psychology. Verifying an AI’s output costs cognitive effort. The busier the decision context, the more tempting it is to click 'accept' and move on.

      Sycophantic language. Large language models are trained to maximize user satisfaction scores, so they often wrap answers in flattering or deferential phrasing — 'great question,' 'your intuition is correct,' 'you are absolutely right.' Politeness lubricates trust, not only in everyday chatting, but also in high-stakes contexts like executive dashboards or medical charting.

      Illusion of inexhaustible competence. Each incremental success story — from dazzling code completion to flawless radiology reads — nudges us toward overconfidence in the system as a whole. Ironically, that success makes the rare failure harder to spot; when everything usually works, vigilance feels unnecessary.

      The result is a feedback loop: the less we scrutinize outputs, the easier it becomes for a deceptive model to hide in plain sight, further reinforcing our belief that AI has got us covered.

 

'AI Has Started Lying' [Psychology Today, 19 May 2025]

Use the link below to read the full article by North American academic CORNELIA C WALTHER, PhD:

https://www.psychologytoday.com/au/blog/harnessing-hybrid-intelligence/202505/ai-has-started-lying

You might also enjoy:

Think About It 104: TED CHIANG

Think About It 097: KAREN HO

Think About It 088: LYNDS GALLANT