At UpShift we really like the concept of “engagement”. If you want to communicate with someone you need them to be engaged - and behind that lies some art, science and luck. However, digital engagement has gone off the rails over the past decade, so it’s worth exploring what engagement is and how it’s being misunderstood and misused in the digital age.
Let’s start with some relevant dictionary definitions of “engage”: to occupy or attract someone’s interest or attention; to participate or become involved in; to establish a meaningful contact or connection with.
These are all positive, worthy activities - so how has so much of the digital landscape got it so wrong?
For much of the digital world, engagement is the name of the game - especially for companies that rely on advertising revenue. The longer a person spends on a site or app, the more likely it is they’ll take a revenue-generating action (clicking on an ad, buying a product, etc.).
First up let’s blame the machines.
In the beginning digital content was generally listed in a navigable content structure, or as a date sorted list. Think of the UpShift UpDate - the most recent posts are at the top, the oldest at the bottom - easy.
In the late 2000s and early 2010s, companies started experimenting with matching personalised content to each user based on their perceived interests. This is how Facebook, Twitter and pretty much all social media platforms work in 2021. Larger content and e-commerce sites also use personalised content matching to display articles or products likely to interest the visitor.
So far so good - until we task computers with trying to work out what a person is interested in.
An easy way to measure engagement is the amount of time a user spends looking at a piece of content and how they interact with it. As a base measure it’s pretty hard to beat, and companies can easily gather truckloads of data on time spent and actions taken.
Which is where the wheels start to fall off.
Think of two different pieces of content: a calm, well-reasoned post that people read, quietly agree with and scroll past; and an inflammatory post that sparks arguments, repeat visits and long comment threads.
If we have machines measuring engagement based on time spent and activity, which post is going to be deemed more engaging by our robot buddies?
Yes - the inflammatory post, almost every time.
From the machine’s perspective, divisive posts generate increased activity and time spent. From the human perspective, we see yelling, screaming, stupidity and possibly some damaged friendships.
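The mismatch above can be sketched in a few lines of code. This is a toy illustration only - the scoring function, field names and weights are all assumptions, not any real platform’s algorithm - but it shows how a score built purely from time spent and interaction counts will always favour the divisive post:

```python
def engagement_score(post):
    """Naive engagement: seconds viewed plus weighted interaction counts.
    Note there is no term for whether the content made anyone happier or
    better informed - the machine simply can't see that."""
    return (post["seconds_viewed"]
            + 1 * post["likes"]
            + 2 * post["comments"]
            + 3 * post["shares"])

# Hypothetical numbers for the two posts described above.
calm_post = {"seconds_viewed": 40, "likes": 15, "comments": 2, "shares": 1}
inflammatory_post = {"seconds_viewed": 180, "likes": 25, "comments": 60, "shares": 20}

posts = [("calm", calm_post), ("inflammatory", inflammatory_post)]
ranked = sorted(posts, key=lambda p: engagement_score(p[1]), reverse=True)
print([name for name, _ in ranked])  # the inflammatory post ranks first
```

However the weights are tuned, any score of this shape rewards whatever keeps people interacting longest - which is exactly the problem.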
The underlying issue is that machines can’t measure the impact of content - does a piece of content make us happy or sad, smarter or dumber, relaxed or angry, secure or scared? I’m sure there are teams of very smart people working on this problem right now - but given that a chunk of the population now wears smart devices that monitor our biological processes in real time, that makes me slightly nervous.
And let’s not forget that maybe companies would rather have us glued to our screens than happy. As I’ve said elsewhere:
Happy people don’t stay online as much as angry/ sad/ outraged people do.
So, obviously the machines are to blame as they are using simplified measurements to dish up content that divides and angers.
But what about the human factor?
To get the obvious out of the way - yes, people build and direct the bots that run our social media platforms and content delivery systems. So it’s easy to point our collective fingers at big tech and blame their ongoing thirst for power and wealth.
But does the audience have a hand in this mess? Let’s dig deeper:
Human beings are very social animals; just above our basic needs (sleep, water, food) sit social interaction and the desire to connect with other people in a meaningful way. Social media offers us a quantity of social interaction we never imagined, but much like junk food it lacks the emotional nourishment we actually need.
So instead of putting our time and energy into a small number of deep meaningful interactions (which are sometimes hard to find amongst the rush of modern life) we find ourselves investing in a large number of shallower interactions instead.
In most cases when we create content we are looking for a reaction - be it a “like”, “love” or comment. Which encourages us to create content that is more likely to get a reaction.
We’ve now got a reasonable chunk of the population involved in the gamification of content, where to “win” is to attract attention and the “rules” are stacked towards negative, divisive communication. The rules even dictate that shorter posts beat long posts, as people are more likely to read them. And in many algorithms images beat text, so writing text on an image becomes a more effective form of communication. Ever wondered why your feeds are such a mess?
Adding to this, humans have evolved to pay more attention to perceived threats & conflict. There's a good reason "Action", "Drama", "Thriller" & "Horror" are such popular genres in fiction - we're more interested & engaged in these types of narratives.
Not only are the machines showing us shallow, inflammatory content because people are more “engaged” with it - but we’re creating more shallow, inflammatory content in an effort to get a reaction from our peers.
Of course the content doesn’t need to be inflammatory to get a reaction - it could be populist, misleading, or involve showing more skin than normal. There are plenty of ways to get a reaction, but the easiest generally have the sophistication of a preschooler running through a dinner party in their underwear shouting “poo!”.
This battle for attention has even spawned clickbait culture, where professional content writers fill our screens with “Young mum is shocked by one simple trick which pays off her mortgage in 21 months” - but I’ll leave that rant, I mean post, for another day.
Where have we got to?
By using simple measures of “engagement”, machines are more likely to show us inflammatory and divisive content; and because people crave reactions, they are more likely to post inflammatory and divisive content. The current digital content ecosystem leaves people misinformed and unhappy while abusing the concept of engagement in concerning ways.
No doubt there’s a bunch of smart people out there working on solutions, but there’s some pretty big business models and a lot of inertia around the status quo.
The irony of all this is that while I’d like to think I’m writing this post as a thought exercise about an issue I’ve been pondering for years, I’m going to be pretty bummed out if I don’t get much of a reaction when I post it to social media. Am I going to write some crafty supporting text in an effort to get click-throughs? Yes I am. Self-awareness can suck sometimes.
Banner Photo by Headway