Study finds that including "AI" in product descriptions makes them less appealing to consumers

midian182

Facepalm: Companies love to shoehorn the term AI into their product descriptions, even if doing so seems weird or, at times, just stupid. They believe the inclusion of the initialism will appeal to consumers who want the latest cutting-edge tech. The reality, though, is that many people are put off when a product reveals its AI smarts.

A study by Washington State University, published in the Journal of Hospitality Marketing & Management, surveyed 1,000 adults to evaluate the link between AI disclosure and consumer behavior.

Participants were shown descriptions of a diverse range of products and services. Some of the descriptions included the term "artificial intelligence" to see whether its presence affected purchase decisions.

One group of participants proved much less likely to buy a smart television when "AI" appeared in its description, while another group that saw an identical description without the AI mention was much more likely to buy the TV.

The negative impact of using the term artificial intelligence was more pronounced in "high-risk" purchases such as expensive TVs, medical devices, or financial services. Mesut Cicek, the study's lead author and a clinical assistant professor of marketing at Washington State University, suggested this was because consumers are more wary of losing money or risking their physical safety.

Low-risk products such as vacuum cleaners and service delivery robots weren't perceived quite as badly when their descriptions mentioned AI, but people still preferred the non-AI alternatives.

"When AI is mentioned, it tends to lower emotional trust, which in turn decreases purchase intentions," said Cicek. "We found emotional trust plays a critical role in how consumers perceive AI-powered products."

The same aversion to the use of AI in descriptions was present across eight different product and service categories.

Cicek concluded that marketers should be careful about how they present artificial intelligence in their product descriptions, and that emphasizing the term might not be the best approach.

Earlier this month, a poll asked if PC fans would be willing to pay extra money for hardware with AI capabilities and features. Over 22,000 people, a massive 84% of the overall vote, said no, they would not pay more. More than 2,200 participants said they didn't know, while just under 2,000 voters said yes.

Placing AI as the highlight of a product has become commonplace these days – you just have to look at AMD's Strix Point mobile chips, which carry the Ryzen AI 300 branding. There are also AI PCs, apps, sales, services, and pretty much everything else you can think of. And this is despite warnings that companies in the industry such as OpenAI could face bankruptcy if they don't start seeing better returns from the massive investments and running costs associated with the technology.


 
I had to do a quick crop on a photo in Android for my wife using the Photoshop Express app. When I sent the cropped photo to my wife, Photoshop added a separate message to state that Photoshop AI was used. What the *#&#*( does AI have to do with a simple crop? I'd be willing to bet that most of what these "AI" tools do could have been done just fine without it.
 
I'm glad to see it's not just tech-savvy people being turned off by "AI everywhere". I've told my parents "treat it like I told you with 'crypto' and 'blockchain' - at best you don't need it, and at worst it is a scam."
 
More generally, it is because AI has a negative connotation to it. Through media such as TV shows and movies, we have become accustomed to AI going rogue or doing things that are unintended, like spying on us or taking control away from us.

When I see AI on a product, two things come to mind:
1) Buzzword that means you want to charge me for something with little benefit, or actually to my detriment. Does a vacuum cleaner need AI, seriously?
2) You are trying to steal my personal data in some way. This is mostly associated with software. I mean, the core way "AI" works is by learning from data. That data usually comes from the consumer, without their explicit knowledge. It comes off as creepy and wrong.
 