Enjoy the craziness of election season while you can: In the future, computers could become so adept at molding your behavior that you won't really choose who you vote for anymore.
At least, that's what one robotics expert at Carnegie Mellon University fears.
Professor Illah Nourbakhsh recently appeared on a panel called "Life in 2025" during a conference at MIT. He expressed concern that today's germs of artificial intelligence technology could evolve into a full-blown disease that infects our political system.
Nourbakhsh envisioned a future where politicians control voters based on their data -- the information they create when posting online, using a smartphone and so on. And he said this scenario might be little more than two election cycles away.
"What does it mean when the winner of the political system has the best machine learning?" Nourbakhsh asked at the conference.
Machine learning exists all around us today. Basically, the term refers to a program's ability to study data, recognize patterns and then make predictions based on what it finds. Facebook, for example, uses machine learning to determine what you think is interesting and serve you similar content. You should never forget that anything you post online -- be it a link, status or photograph -- is really just data for an algorithm to read and learn from.
Nourbakhsh fears that this type of machine learning could soon become so advanced that politicians will be able to tap into programs that perfectly understand voters. Those programs, which Nourbakhsh said will likely be available only to the richest groups at first, could help shape how voters behave, based on what those people have done in the past.
In a sense, you could think of it like subliminal advertising on steroids. Rather than targeting voters with a single television ad that may or may not resonate, campaigns of the future could target individuals online with incredible precision, because their messaging would be informed by the data those voters created.
"It leads you to question whether we still have free will," Nourbakhsh told The Huffington Post in a follow-up interview last week.
"If you're getting exposed to a set of directions just right to get you to vote a certain way, that echo chamber you're being built into has high walls," he continued. "It can stop you from seeing the whole landscape. And in doing so, it could make you less informed."
Some researchers and academics already fear that Facebook creates an echo chamber effect. The argument goes that, since the platform delivers content that it thinks you'll find interesting, you're likely to see a bunch of stuff you agree with, no matter how diverse your network of friends is.
That may not worry you, since Facebook has yet to become ingrained in every aspect of our lives, but the world is quickly becoming more connected and more intelligent. For example, Clear Channel Outdoor is introducing advertising billboards across the United States that can collect data from your mobile phone in hopes of learning about your behavior -- which then leads to more effective advertising.
"With the data and analytics, Clear Channel Outdoor could determine the average age and gender of the people who are seeing a particular billboard in, say, Boston at a certain time and whether they subsequently visit a store," The New York Times reported Monday.
In other words, while it's worth being concerned about the data you consciously create, like an Instagram post, there's an entirely different layer of data you unthinkingly generate just by walking around with a smartphone.
Perhaps most worrisome is the notion that younger people are starting to generate this data. Many adults understand the responsibility of signing up for a Twitter account or enabling location services on their smartphones, but children get smartphones when they're in middle school and use Google Chromebooks in their classrooms. All of it creates personal behavioral data.
Google, responding to criticisms about how its Chromebooks might collect data on minors, said "we keep students' data private and secure," though one might still worry over the fundamental fact that said data is held by one of the world's most powerful tech and advertising companies.
This would all be less concerning, Nourbakhsh said, if the U.S. government were more adept at considering these issues. The European Union has fought to protect the privacy of its citizens from tech giants, but U.S. law seems to trail behind these advancements instead of anticipating them.
"Europe is great because they're creating an expectation of privacy. They're staying ahead of the technology curve," Nourbakhsh told HuffPost.
"They've been thinking about innovation, and what they're doing at the regulatory level is including technologists so they can have some foresight in what's coming down the pike. Here, we don't do that," he added.
Neither do we educate our computer science majors on these ethical issues, Nourbakhsh said.
"People get their engineering degree and never take an ethics class," he said. "They don't even study the kind of inequities we have in society. It's remarkable."
Of course, keep in mind that we can't say for sure how artificial intelligence will actually be deployed. That's why experts say we need to have these conversations now, before the genie is out of the bottle.