[Interview] William Ammerman, Author Of ‘The Invisible Brand: Marketing In The Age Of Automation, Big Data, And Machine Learning’

How you market your business is undergoing dramatic change. Sophisticated computer algorithms can test billions of targeted messages, measure the results, and rework how you attract and retain customers ― all in fractions of a second.

In his new book “The Invisible Brand: Marketing in the Age of Automation, Big Data, and Machine Learning,” thought leader and renowned technologist William Ammerman provides an in-depth exploration of the risks and rewards of this epochal shift, all while delivering the information and insight you need to stay ahead of the game.

Ammerman recently sat down with Young Upstarts to share his insights about the intersection of AI, marketing, and business.

Here is some of our conversation:

There’s a fascinating story behind the title of your book, “The Invisible Brand”. How did it come about?

My goal in writing the book is to help people see the hidden influence of artificial intelligence in their lives and to reveal the inner workings of digital marketing technology. The original book title was “The God Algorithm”, but when McGraw-Hill expressed interest in being my publisher, we decided to tweak it to capture the attention of professionals in business and education.

As I was pondering new titles, a friend remarked that artificial intelligence operates like “the invisible hand,” a reference to the unseen forces that shape free markets in Adam Smith’s book “The Wealth of Nations.” I thought this was an interesting observation, and I wrestled with variations like “invisible intelligence” and “invisible marketing” before settling on “invisible brand.”

I’ve stretched the term “brand” to encompass a wide range of people and interests — from politicians to institutions and even governments — who are all intent on changing how we think and what we do. Artificial intelligence (AI) is complex, and while the public has a general sense that they are being watched by their laptops and mobile phones, I wanted to explain exactly how it works. I want to help people see the Invisible Brand already at work in their lives.

Why should entrepreneurs and businesses prioritize AI? What consequences might they face if they don’t?

AI is impacting commerce in a variety of ways, including marketing automation, content personalization, dynamic pricing, and voicebots.

Consider the example of voice assistants like Siri and Alexa. Powered by a branch of AI known as natural language processing (NLP), voice assistants are proliferating exponentially, and for good reason. Children as young as two years old can engage with Alexa’s voice user interface, long before they learn to read. We’ve been living in the age of the graphical user interface (as our computer and mobile screens attest), but increasingly we are talking to our cars, our TVs, and our mobile devices.

Juniper predicts voice commerce will grow to an $80 billion industry by 2023, so it should be apparent that entrepreneurs and businesses will gain an advantage if they ensure their products and services can be found and purchased using voice.

When Google came on the scene in the late 90s, some businesses quickly understood the importance of being found by the search engine, and they invested resources in mastering SEO. Just as companies had to invest in SEO when it was new, they need to make a similar commitment to AI today. Not doing so means lost opportunity and profit.

In your book, you introduce the concept of psychotechnology. What does this look like? 

Psychotechnology is a portmanteau of “psychological” and “technology.” I use it to describe technology that influences people psychologically by deploying artificial intelligence through digital media. Informally, it can be shortened to “psychotech.” The four components of psychotechnology are the personalization of information, persuasion as a science, machine learning, and natural language processing.

By personalization, I’m referring to the fact that people see content and advertising on digital media that’s customized to their likes and interests. Your Facebook feed is different from my Facebook feed, and the ads you see on the home page of CNN.com are different from the ads I see. As opposed to broadcast media, which delivers the same information to everyone at the same time, digital media allows for mass customization, all tailored to the individual. Artificial intelligence allows marketers to combine personalized information with machine learning to learn how best to persuade us.
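The interview doesn’t get into specific algorithms, but the “deliver a personalized message, measure the response, adjust” loop Ammerman describes can be illustrated with something as simple as an epsilon-greedy bandit run per audience segment. The sketch below is purely illustrative and not drawn from the book; the segment name, message variants, and simulated click rates are all made up.

import random
from collections import defaultdict

# Hypothetical message variants a marketer might test for each audience segment.
MESSAGES = ["save-money headline", "save-time headline", "social-proof headline"]

class EpsilonGreedyPersonalizer:
    """Learns, per audience segment, which message variant performs best."""

    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon
        # counts[segment][message] -> how often each message was shown
        self.counts = defaultdict(lambda: defaultdict(int))
        # rewards[segment][message] -> accumulated clicks/conversions
        self.rewards = defaultdict(lambda: defaultdict(float))

    def choose(self, segment):
        # Explore occasionally; otherwise exploit the best-performing message so far.
        if random.random() < self.epsilon or not self.counts[segment]:
            return random.choice(MESSAGES)
        return max(MESSAGES,
                   key=lambda m: self.rewards[segment][m] / max(self.counts[segment][m], 1))

    def record(self, segment, message, clicked):
        # Feed the measured result back in, so the next choice is better informed.
        self.counts[segment][message] += 1
        self.rewards[segment][message] += 1.0 if clicked else 0.0

# Simulate serving messages to one (made-up) segment with made-up response rates.
personalizer = EpsilonGreedyPersonalizer()
for _ in range(1000):
    msg = personalizer.choose("young-parents")
    clicked = random.random() < (0.12 if msg == "save-time headline" else 0.05)
    personalizer.record("young-parents", msg, clicked)

Even this toy version converges on the message variant that gets the most clicks from a given segment, which is the essence of combining personalization with machine learning to learn what persuades.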

Then, add natural language processing to the mix. We are now speaking with machines that are designed to learn how to persuade us with personalized information. The implications are staggering — at once both awesome and terrifying. Psychotechnology has applications for media, marketing, finance, education, and even the arts. In healthcare, psychotechnology holds the promise of helping us overcome challenges, like addiction, by persuading us to make healthier lifestyle decisions.

By giving psychotechnology a name, I hope to help people see it at work in their own lives and to start a conversation about the opportunities and dangers it represents for all of us.

How does AI-driven personalization play out for consumers?

The internet allows brands to personalize messaging based on data about what we like, where we go, and whom we know. Digital advertising technologies (AdTech) first applied personalization to deliver narrowly targeted advertising to specific audience segments based on their online behaviors. Marketing technology (MarTech) broadened the approach to deliver personalized messaging to online and offline audiences, based on information gathered from online and offline sources. Native advertising, which mimics the look of online news content, delivers customized messaging to target audiences with tailored headlines, images, and content. Social media delivers unique content to individuals based on their interests and associations.

In short: mass media has been replaced by mass personalization.

AI algorithms reward us by telling us what we want to hear. A social media “like” triggers a small release of dopamine, which produces pleasure in our brains and keeps us addicted to our social media feeds. Video game developers use similar triggers to reward us and keep us addicted to their games. Researchers, including Clifford Nass and BJ Fogg, have transformed the study of persuasion into a science while simultaneously demonstrating that humans can develop an empathetic relationship with their computers. They have shown that the more human-like computers seem, the more empathy humans display toward them. As computers gain more human-like qualities, such as speech, they become more persuasive.

AI is exciting, but it has a dark side. What should business leaders be wary of? How can they ensure AI is used ethically?

Consumers are increasingly wary of business technology that crosses the creepy line. They want a wall of separation between their personal lives and the corporations, politicians, institutions, and governments that want to influence them. Business leaders and marketers need to respect that boundary — or cross it at their own peril.

For instance, if you’re running a ride sharing app, you legitimately need to know where someone is and where he or she is going. The consumer accepts this as a logical tradeoff for the convenience of being picked up and dropped off. But that same consumer might be less tolerant of a music app that tracks movements and locations. Similarly, consumers seem willing to trade the convenience of talking to Alexa or Siri in exchange for not having to type their questions into a screen. But they are much less tolerant of apps that “listen” to conversations specifically to gather data and deliver targeted advertising.

The lesson for businesses and marketers is that consumer attitudes toward AI technologies are changing; consumers may be willing to trade privacy for certain conveniences but not for others. The more aware consumers become of surveillance from their devices, the more they demand corporate responsibility in the use of that capability, and they are willing to proactively punish companies that cross the line.

To learn more about William Ammerman and his new book, visit his website.