BIAS & PROJECTION - Human thoughts about Nonhuman Intelligence #1

To psychologists, projection is a defence mechanism. They define it as "unconsciously attributing your own disowned feelings, traits, or impulses to another person."

It's relatively common, varies in degree, and essentially protects one's self-image by externalising an internal conflict.

Or so my shrink told me, har har.

It can be frustrating to be on the receiving end. The projection is often more obvious to the receiver than to the giver, but at least it tends to occur at manageable levels: one to one, as it were.

Whilst some legacy media personalities may (allegedly) indulge at broadcast scale, projection is largely contained within, and mitigated by, the normal and reasonable opinions we form about other people: whether we trust their opinions and judgement, and so on. People are fallible. That's our default.

Let us consider Artificial Intelligence.

Examples of AI bias are common. COMPAS, Microsoft Tay and iTutor spring to mind, and we're only just beginning to create them. Recently, Craig Guildford, Chief Constable of West Midlands Police, landed in very hot water, having to admit misleading MPs after a Google AI-augmented search was used to justify banning Maccabi Tel Aviv fans from a match. An entirely fictitious match was cited - probably an AI hallucination.

Already the predominant first result - the "Answer" in your Google search, much to the Chief Constable's chagrin - is not a web link to a third-party publisher. It is published by Google itself, having originated with their Gemini AI.

For many, Google is already the absolute authority on objective truth (unlike the assumed fallibility of humans). It gained this reputation by linking to the documents of others, which could be reviewed with scepticism, with critical thinking, and across a field of opposing viewpoints.

In moving away from searching the web of third-party documents, as was previously the case, Google and others have chosen to present a single "answer". They call it AI Overview.

This is not the same.

This is a seismic and systemic change, leveraging as it does Google's authority and perceived trustworthiness to answer our questions. It transfers the perceived source of the answer away from the web of collected, projected human musings, abstracting it up to an artificial intelligence perceived as free from human bias or guile.

Human and Nonhuman intelligence have some learning about each other to do, it seems.

By Phil Blything