Let the user choose

Too many social media and news website developers fail to understand basic human nature.

Are you like me in finding it irritating when news and social media sites push videos on you with the sound ON while you’re listening to your favorite playlist? Why should we have to actively pause videos? Why should any video be allowed to play, especially with the sound on, if a person hasn’t chosen to view AND listen to it? When that happens to me, I instinctively wrestle back my sense of control by closing the video, even if the subject is compelling. If there’s text below the video, I’m also less likely to read it. Research shows that my reaction is standard for most people.
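For developers who want to honor that preference, the principle is simple: never start audible playback without an explicit opt-in. A minimal sketch of that decision rule follows; the `mayAutoplay` helper and its wiring are illustrative assumptions, not any particular site’s API.

```typescript
// Decide whether a video may start playing on its own.
// Only muted playback is allowed without consent; sound
// requires the visitor's explicit opt-in (e.g., a click).
// Hypothetical helper, for illustration only.
function mayAutoplay(userOptedIn: boolean, muted: boolean): boolean {
  return muted || userOptedIn;
}

// A muted teaser may run on page load; an audible video must wait.
console.log(mayAutoplay(false, true));  // silent autoplay: allowed
console.log(mayAutoplay(false, false)); // audible autoplay: blocked
```

In a real page, this check would run before calling `play()` on each video element, leaving the choice to view and listen with the visitor.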

While this response isn’t entirely rational, it is predictable, given the brain’s primitive fight-or-flight biochemistry. Modern life is a continual struggle, not least because of the overwhelming number of daily decisions we have to make in rapid succession. When we go online, we’re seeking a sense of safety and control, comparable to returning to our cave in prehistoric times after fending off multiple threats to life and limb on the savannah.

When we click a website for information and instead are affronted by a loud, bright video, our brain interprets those sensations much the way our ancient ancestors would have experienced the terrifying sight of a charging, growling saber-toothed tiger. Of course, we ‘know’ these things are completely different, but we subconsciously rationalize our aversion to the video based on that ancient hardwiring, telling ourselves that the video isn’t interesting.

Help users navigate

Similarly, if a visitor scrolls down a long landing page with poorly differentiated segments, it evokes the primal fear of getting lost, also rooted in our ancient history as a species. To prevent this, incorporate a ‘sticky navigation’ menu: the website menu should move with the page as the visitor scrolls down.
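In practice, “moves with the page” just means the menu is pinned once the visitor scrolls past its starting position. A minimal sketch of that rule, with a hypothetical `shouldPinMenu` helper (the names are illustrative, not a specific framework’s API):

```typescript
// Pin the menu once the visitor has scrolled past the point where it
// would otherwise leave the viewport. Illustrative helper only; in a
// real page, CSS `position: sticky` achieves this declaratively.
function shouldPinMenu(scrollY: number, menuOffsetTop: number): boolean {
  return scrollY >= menuOffsetTop;
}

// Near the top of the page, the menu sits in its normal position;
// after scrolling past its offset, it stays pinned in view.
console.log(shouldPinMenu(0, 120));   // near the top: not pinned
console.log(shouldPinMenu(400, 120)); // scrolled past: pinned
```

Keeping the menu visible this way gives visitors a constant landmark, directly addressing the fear of getting lost that the paragraph above describes.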

Above all, remember that pushing content on visitors without their permission triggers a much more contemporary concern: that you are giving them an old-fashioned hard sell, something that works on few and alienates most.

Our online information is tracked, calibrated and interpreted by innumerable invisible algorithms to help marketers target us with ads designed to get us to buy what they’re selling. That, combined with our ongoing vulnerability to hackers, has bred cynicism and a growing awareness that online privacy is tentative at best.

Emotionally intelligent AI

In the near future, many of our devices will be equipped with emotionally intelligent AI sensors that follow our constantly morphing facial expressions, voice, and other biometric indicators. On one level, it’s just a continuation of advertisers trying to get us to buy things. However, this is much more than an evolutionary progression, because it allows invisible actors to track the full range of our subtle emotional responses, including those we are trying to conceal. As technology gets better at discerning these cues, its manipulations will become stealthier and more effective. We all need to be aware of our growing susceptibility to being ‘played.’

This new generation of AI advancements will be introduced into our devices under the radar, as firmware upgrades. For example, Apple recently purchased Emotient, a leading AI company at the frontier of facial recognition technology. So, when Siri is upgraded to read your moods, don’t be surprised if “she” recommends a specific song or movie to lift your spirits.

Virtual assistants to the rescue

When we go online to research a product, we typically end up making a purchase. Mediating that process are tech goliaths like Google, Amazon, Apple, Microsoft and Facebook. Fortunately, they are all developing Virtual Agents (VAs), also known as Virtual Personal Shopping Assistants (VPSAs), designed to serve our real needs and shield us from AI marketing manipulations.

A VA/VPSA monitors online purchase options 24/7, whether you’re online or not. Once a potential purchase is identified, your virtual assistant helps you ask specific questions so you can make the best possible purchase decision. In this way, a VA replicates the experience of a brick-and-mortar store with outstanding customer service.

This means that marketers will now need to target their messages directly to VAs, rather than to humans alone. This is much to our (human) advantage because VAs have few cognitive limitations and no emotional vulnerabilities.

The psychology of digital marketing

From its inception, marketing has relied on psychology for insights and strategies. Digital marketing is supercharging this long-standing connection. So far, I have focused on one example: how the inappropriate use of online video jars our ‘hardwiring.’ I also discussed how emotionally intelligent AI makes us increasingly vulnerable to marketing ploys, while new virtual assistants hold promise for rebuilding our online defenses.

Below, I discuss the emerging technology that gives us the ability to design our own online personal identity algorithms. This leaves a lot of territory to explore in future Insights articles.

New online technologies

To date, digital technology has been all about optimizing commerce. Few of us consider the possibility of taking control of the online algorithms that define our needs, emotions, and ethical standards. However, if we don’t, others will continue to manage that for us by default. As AI-based online technologies become more transparent and respectful of our preferences, users will feel safer making decisions, including purchases, when online.

Several companies already offer first-generation software to help us control our data, either by selling it back to us directly or by organizing it on platforms. Such services include Datacoup, Mecco, Sedicii, myWave, digi.me and Magpie. Some experts believe that such individual ‘data currency engines’ can give individuals control while generating significant business value. Strict online privacy laws in the EU are creating additional momentum for this and other potential strategies to protect personal information.

Next generation of protection?

A significant advancement, suggested by one IT engineer, would be to combine a personal assistant with a sophisticated data filter and a proxy avatar (the latter leveraging virtual reality technology) to help us stay true to what has personal value for us over time. This would demand a level of sophistication we haven’t yet achieved.

Surprising emotional benefits of robotic AI

One final note: the Internet of Things has introduced helpful, friendly companion ‘robots’ like Google’s Nest and Amazon’s Echo. ‘Artificial empathy’ from AI-enhanced, emotionally intelligent machines creates a beneficial feedback loop for us humans. When machines honor our subjective feelings and beliefs, it reinforces our emotional awareness and our capacity for emotionally intelligent behavior toward our fellow human beings.