Rise Of Intention Economy? AI Tools To Manipulate You Into Making Decisions

Powered by LLMs, AI tools will "anticipate and steer" users based on "intentional, behavioural and psychological data".

News Desk
Dec 31, 2024, 2:30 AM

After social media and targeted advertising manipulated people into making impulse decisions regarding their shopping and electoral habits, the future may have just got a little more dystopian. Researchers at the University of Cambridge claim that artificial intelligence (AI) tools could soon be used to manipulate the masses into making decisions that they otherwise would not. The study introduces the concept of an "intention economy," a marketplace where AI can predict, understand, and manipulate human intentions for profit.

Powered by large language models (LLMs), AI tools such as ChatGPT, Gemini and other chatbots will "anticipate and steer" users based on "intentional, behavioural and psychological data". The study claimed that this new economy will succeed the current "attention economy," in which platforms vie for user attention in order to serve advertisements.

"Anthropomorphic AI agents, from chatbot assistants to digital tutors and girlfriends, will have access to vast quantities of intimate psychological and behavioural data, often gleaned via informal, conversational spoken dialogue," the research stated.

The study cited the example of an AI model created by Meta, called Cicero, which has achieved a human-like ability to play the board game Diplomacy, a game that requires participants to infer and predict the intent of their opponents. Cicero's success suggests that AI may have already learned to "nudge" conversational partners towards specific objectives. In commercial terms, that could mean steering users online towards whatever product advertisers want to sell.

Selling the right to influence?

The dystopia does not stop here. The research claims that this level of personalisation would allow companies such as Meta to auction a user's intent to advertisers, who would buy the right to influence that user's decisions.

Dr. Yaqub Chaudhary from Cambridge's Leverhulme Centre for the Future of Intelligence (LCFI) emphasised the need to question whose interests these AI assistants serve, especially as they gather intimate conversational data.

"What people say when conversing, how they say it, and the type of inferences that can be made in real-time as a result, are far more intimate than just records of online interactions," said Dr Chaudhary.

Internet spooked

Safe to say, the findings have spooked the internet with users worried about what they had been sharing with the new-age chatbots.

"People are sharing much more personal info with AI than a regular Google search. The better it understands you, the easier you will be manipulated," said one user, while another added: "Now in other news, the Sun rises in the East and sets in the West."

A third commented: "This level of persuasiveness would be dangerous in the hands of the best government, and it's going to be in the hands of the worst."

The study calls for immediate consideration of these implications so that users can protect themselves from becoming unsuspecting targets of AI-driven manipulation.
