Does Reliance on Artificial Intelligence Compromise Our Ability to Think?

By Carole Mahoney

It poses a risk, agreed University of Maryland University College (UMUC) cybersecurity faculty members Emma Garrison-Alexander, Tamie Santiago and Candice Smith, who explored the influence of AI bias on society in a recent Facebook Live panel discussion.

“Often times we tell ourselves that by using [AI] technology we are freeing up our mind to use on other projects,” said Smith, associate professor in the UMUC graduate cybersecurity technology program. But in the process, she suggested, we gradually sacrifice cognitive functioning. “We let AI do all these executive functions for us. It almost allows us to dumb ourselves down and not take responsibility for our own learning and mental capacity.”

We need strong executive functions to achieve our goals effectively. Working memory, flexible thinking, and self-control, among other high-level functions, help us organize and plan, initiate and stay focused on tasks, assess situations, and revise strategies when circumstances change.

Today, smartphones and other devices often function less as useful tools and more as personal assistants—or even appendages, panel members said. Our phones are the keepers of our most valuable data. They manage our time and resources. They direct many aspects of our daily lives. Our phones are no longer just for communication, said Santiago, UMUC collegiate associate professor of cybersecurity policy. “The phone has become our attachment to the world.” But she warned that “the enemy of our executive function is convenience,” and wondered, “How much of our executive function are we willing to let go of to get it?”

Garrison-Alexander, vice dean of the UMUC graduate cybersecurity program, said we make assumptions about AI technology—that the information it feeds us is true and accurate. So, over time, an unintended consequence of our reliance on technology may be that we come to doubt our own memory and thought processes, she said. For instance, do you ever find yourself asking Siri or Alexa for information you already know . . . just to double-check?

Panelists also discussed machine learning bias, which occurs when an algorithm produces results that are systematically prejudiced. To grasp its potential influence across the sociopolitical fabric of our society—including its effects on racial bias in hiring, policing, judicial sentencing and healthcare—it is important to understand AI bias in context, Santiago said.

There are more than seven billion people worldwide, yet only 10,000 people in seven countries are writing all the code, she added. “So, we pause in that because we have to think of AI from that perspective. The pathway of AI algorithms is being compromised because of the limited [number of] hands involved in creating them.” By 2020, the AI market is expected to reach $47 billion, with the global big data analytics industry at $203 billion, and the majority of that AI development is being conducted by a handful of tech giants—Twitter, IBM, Amazon, Google, Facebook, Microsoft and Apple, Santiago said.

About the Author:
Carole Mahoney is a former broadcast and print journalist and recipient of the 1992 North Dakota Newspaper Association’s Best News-Feature Series award. In the mid-1990s, she served as a manager of constituent communications in the United States Senate, and for the past 21 years she has been the senior writer and editor for education-related organizations in greater Washington, D.C.