Years before Google was a thing, I worked on an AI project that felt more like science fiction than science. I was in charge of teaching the computer about television; Charlie’s Angels is a TV show about sexy, female detectives, not a radical religious organization. I was told that if I fed enough information into the computer, the machine would eventually learn the difference between CHiPs the cop drama and chips that you eat. I left the program before that ever happened, so I can’t say for sure if that programmer’s dream ever came true, but someone cracked the code, because today’s operating systems and search engines are pretty darned smart when it comes to sorting out this from that. But they’re not perfect, and that’s where a company can run into trouble.
I began thinking about my old gig when Google search wisely offered me a link to a story in VentureBeat called “Careful what you write: How text mining could hurt your business”. The author, Jeff Catlin of Lexalytics, says that today’s AIs are “still lacking the contextual knowledge that a nine-year-old is carrying around.” For example, they don’t understand that stories posted on April 1 are likely to be false and that a musician who is “sick” isn’t ill but very good at what he does.
The problem stems from the fact that companies are being forced to use automated scanning and filtering systems if they want to stay ahead of the pack. Stock traders use programs that scan the news and press releases for any hint of activity that might cause a stock to rise or fall. Companies routinely set up alerts to find out what the competition is up to but that’s nothing compared to this:
“More than 100 nations convened under the auspices of the United Nations in Geneva last week to discuss ‘Lethal Autonomous Weapons Systems’ (aka ‘killer robots’). Ethical questions aside, these are systems that are heavily dependent on machine learning and AI in order to make targeting and ‘fire’ decisions.”
When it comes to the car I’m riding in, I trust a computer’s reaction more than the reaction of a human behind the wheel. But when we’re talking about guided missiles . . . I’m not completely sold.
Your company may not be involved in anything that heavy but a dependence on machine learning can lead to an embarrassing social media moment or a bad brand decision.
I once worked with the owner of a local gym who thought he’d shortcut the Twitter follower system by using a program that automatically followed anyone who mentioned the word ball or balls. He was expecting base, basket and foot, but that’s not all he got. Luckily he wasn’t using a tool to auto-retweet those comments, or that would have been the end of his after-school program for kids.
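The gym owner’s tool was almost certainly doing something like the sketch below: a bare substring match with no sense of context. The tweets and the `matches` helper are invented for illustration, not taken from any real tool.

```python
# A minimal sketch of naive substring keyword matching, the way a crude
# auto-follow tool might work. All tweet text here is invented.

KEYWORDS = ["ball", "balls"]

def matches(tweet: str) -> bool:
    """Return True if any keyword appears anywhere in the tweet."""
    text = tweet.lower()
    return any(kw in text for kw in KEYWORDS)

tweets = [
    "Great basketball game last night!",     # intended: sports
    "Who's watching the football draft?",    # intended: sports
    "The gala was a masked ball downtown",   # unintended: a formal dance
    "My cat chases yarn balls all day",      # unintended: not sports at all
]

followed = [t for t in tweets if matches(t)]
# All four tweets match -- the matcher can't tell a jump ball
# from a masked ball, so the bot follows them all.
```

The substring check is exactly why he got more than base, basket and foot: to a matcher this simple, every “ball” looks the same.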
The more sensitive the nature of your business, the more you need to be wary of AIs bearing gifts. If you’re a trendy clothing company, your PR people can probably spin that inappropriate machine-posted message into an “I meant to do that” crowd-pleaser. But if you’re running a business that deals with kids, health care, education or finance, that wild post is going to be harder to brush off.
So how do you save time and your reputation? By using discovery tools to find data but only authorizing humans to post. I tested this theory last week when I reviewed a new online tool called DrumUp. You hook up your social media accounts, pop in some keywords and DrumUp finds relevant content for your feeds. A real time-saver, for sure, and it also surfaced some interesting posts that I didn’t find in my own daily search. But after 24 hours, I felt compelled to cut the connection. None of the offered items were inappropriate, but many missed the mark because of semantics. When I say “fan,” I mean people who love TV, but DrumUp prefers to highlight sports fans even after I listed a dozen sports in the filter-out box.
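To see why a filter-out box can’t fix a semantics problem, consider this sketch of how such a filter plausibly works. The keyword, the blocked terms and the posts are all invented for illustration; I have no knowledge of DrumUp’s actual internals.

```python
# A keyword filter with an exclusion list, roughly how a content-discovery
# tool might apply a "filter out" box. All post text is invented.

KEYWORD = "fan"
BLOCKED = ["baseball", "basketball", "football", "hockey", "soccer"]

def passes_filter(post: str) -> bool:
    """Surface a post if it mentions the keyword and no blocked term."""
    text = post.lower()
    if KEYWORD not in text:
        return False
    # Exclusion only works when a blocked word literally appears.
    return not any(term in text for term in BLOCKED)

posts = [
    "TV fans are buzzing about the season finale",    # wanted
    "Basketball fans pack the arena for game seven",  # blocked, good
    "Fans stormed the pitch after the final whistle", # slips through:
                                                      # no blocked word,
                                                      # but clearly sports
]

surfaced = [p for p in posts if passes_filter(p)]
# The third post survives even though it's about sports -- keyword
# exclusion can't resolve which sense of "fan" the writer meant.
```

Listing a dozen sports only blocks posts that name those sports; the moment a sports fan skips the magic words, the filter waves the post through.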
Machine learning is a wonderful thing. It’s the reason we can find what we need on Google, network with the right people on LinkedIn and read our Gmail without all those pesky promotional emails getting in the way. But if you become too dependent on those systems, you might miss the emails that get misfiled and the seemingly unrelated people you’d really like to meet.
This isn’t just about protecting yourself from an overzealous bot; it’s about being in control of your business.
I’ll leave you with this quote from artificial intelligence researcher Eliezer Yudkowsky:
“By far the greatest danger of Artificial Intelligence is that people conclude too early that they understand it.”
Who is running your business? You or the machines?