The human touch in AI: Why we anthropomorphise machines


Tuesday, 04 February 2025 | Biju Dharmapalan


As AI continues to evolve, it is crucial to maintain clear boundaries between machine intelligence and human cognition

In an era increasingly defined by artificial intelligence (AI), a curious psychological phenomenon has emerged—humans are anthropomorphising AI tools. From virtual assistants like Siri and Alexa to large language models like ChatGPT, people often ascribe human-like emotions, intentions and even personalities to these systems. While this tendency may seem benign or even endearing, it has profound implications for society, ethics, and the way we perceive technology.

By giving AI tools human-like names, voices and personalities, developers create a sense of familiarity and comfort. For instance, when Alexa responds with a cheerful “Hello!” or when ChatGPT uses conversational language to explain complex topics, users are more likely to feel at ease. This human-like interaction fosters trust, making people more willing to rely on AI for tasks ranging from scheduling appointments to providing emotional support.

A lot of evidence points to psychological causes for this behaviour. As a species, we have an innate propensity to attribute human characteristics to non-human entities, perceiving feelings and intentions even in animals and inanimate objects.

The advent of AI has amplified this tendency to an unprecedented degree by creating tools that can engage in decision-making, interactive communication and natural language processing in ways that are reminiscent of humans. Incorporating human-like characteristics into AI systems takes advantage of this natural propensity, making them appear more relatable and trustworthy. 

While the benefits of anthropomorphising AI are evident, the practice is not without its pitfalls. When AI tools are designed to mimic human behaviour too closely, users may forget they are interacting with a machine.

This blurring of lines can lead to over-reliance, misplaced trust, and even exploitation. For instance, a user might confide deeply personal information to a chatbot, unaware that their data could be stored, analysed, or misused.

Deploying AI tools in delicate situations raises even more serious ethical concerns. Consider the use of AI in counselling and therapy. A chatbot designed to mimic human behaviour can offer some assistance, but it cannot replace a human expert when it comes to empathy and moral reasoning.

Users’ potential misunderstanding of AI as a substitute for human therapists highlights the need for transparency and clear boundaries. The ethical ramifications of anthropomorphisation need to be considered by regulators and lawmakers as AI becomes more integrated into everyday life. AI tools do not yet possess awareness, emotions, or free will; companies must therefore be forthright in informing consumers of these limitations.

Another concern is that AI could be used deceptively. Companies and governments might use human-like AI to forge convincing, emotionally engaging connections. AI chatbots deployed for customer service or political campaigns might quietly sway people’s views and choices, raising questions about ethics and whether users have consented to such influence.

While AI can enhance productivity, creativity and convenience, it is essential to maintain a clear distinction between human intelligence and machine intelligence. Designers should focus on developing efficient and useful AI without encouraging unrealistic emotional attachments. Individuals, for their part, should cultivate a healthy scepticism toward AI interactions, appreciating these systems’ sophisticated abilities while remembering that they are fundamentally non-human. The moment humans abandon genuine social interaction and leave it to machines, it will mark the end of humanity as we know it.

(The writer is the Dean of Academic Affairs at Garden City University; views are personal)
