Academia’s Role in the AI-Amplified Age
How HSB is navigating research and teaching in a future-forward paradigm
By Eleanor Hunt
As artificial intelligence (AI) advances, new opportunities are emerging while some careers are receding. Although some fear the technology will displace jobs, others laud AI for boosting efficiency. Academia plays an important role in this new AI paradigm.
“AI must be incorporated in higher education,” said Joseph Aoun, president of Northeastern University. “If not, institutions will become obsolete.”
Aoun wrote the book “Robot-Proof: Higher Education in the Age of Artificial Intelligence” eight years ago, long before AI was featured in daily headlines. A second edition was released in August. In both editions, Aoun proposes that higher education prepare students for an AI-infused workplace by helping them master three literacies: data, technology and human. Human literacy encompasses teamwork, ingenuity, critical thinking, entrepreneurship and cultural agility.
“Students need to understand how AI technologies function and the data they generate,” he said. “They also need to understand what humans can do that the digital and AI world cannot duplicate.”
At Northeastern University, classroom studies emphasize the three literacies, and experiential learning programs reinforce that knowledge by blending classroom theory with real-world experience.
The role of AI in teaching
Hankamer School of Business faculty leverage AI and large language models (LLMs) in teaching and research. Mitchell Neubert, senior associate dean for Research and Faculty Development, said the technology can help faculty prepare courses by collecting, summarizing and illustrating information. It can facilitate classroom engagement by inviting students to generate prompts or evaluate output, and faculty can use it for initial grading or for summarizing student feedback.
“Outside of class, I have used Copilot AI to summarize comments about an event,” Neubert said. “It quickly helped make sense of feedback from several hundred students by sorting through the data, creating main themes and summarizing the students’ feedback.”
Lane Wakefield, clinical assistant professor of Marketing, employs AI to create mock role plays for his students. Using different prompts, students take the customer’s position and practice responding to the AI.
Zachary Ward, associate professor of Economics, applies AI in his Econometrics class, where students learn the theory of using data and statistical models in economic forecasting. The statistical software Ward uses has a steep learning curve, but the AI-powered language model ChatGPT provides programming code and assists with regression model estimations.
“ChatGPT is helping my students get to the frontier of learning how to do things,” Ward said.
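To illustrate the kind of code such a tool can generate, below is a minimal sketch of an ordinary least squares regression in Python using the statsmodels library; the data file and variable names are hypothetical and are not drawn from Ward’s course materials.

    # Minimal sketch: an ordinary least squares regression with statsmodels.
    # The file "wages.csv" and the columns education, hours_worked and wage
    # are hypothetical placeholders, not material from the course described above.
    import pandas as pd
    import statsmodels.api as sm

    data = pd.read_csv("wages.csv")

    # Explanatory variables plus an intercept term
    X = sm.add_constant(data[["education", "hours_worked"]])
    y = data["wage"]

    # Fit the model and report coefficients, standard errors and R-squared
    model = sm.OLS(y, X).fit()
    print(model.summary())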
The role of AI in research
Governing bodies for academic journals require researchers to disclose how they use AI technologies in their work. That is because an LLM generates output from existing data, so large portions of a manuscript can be produced from a single prompt entered into the software.
“AI-generated manuscripts are not original research works,” Neubert said. “AI can sometimes be speculative because it is predictive and can deliver ‘hallucinations,’ which are misleading results. The danger is not doing the diligence in research to verify information and create original knowledge.”
Carlos Torres, assistant professor of Information Systems and Business Analytics, uses an algorithm trained on scientific literature databases to identify potential keywords, find papers and read them to determine their relevance to a research question. It has been a useful complement to the research assistants who help him.
“It is more efficient in conducting literature reviews,” he said. “After selecting the articles based on the summaries provided by the AI, I pull the articles to ensure I have a high level of accuracy and we can go through them. We use real intelligence to determine information accuracy.”
Betty Xing, assistant professor of Accounting, is researching company financial statements and disclosures and how they affected investors and the capital market during COVID-19. To analyze the text of company disclosures, she uses ChatGPT alongside the traditional keyword-based textual analysis approach. She finds that ChatGPT classifies sentences into categories more accurately because it understands the contextual features surrounding the text. It also can identify textual features unique to the COVID-19 setting. Xing has papers under review in several highly reputable Accounting journals.
“In addition to gaining a more complete understanding of textual disclosures for academic research, we can train students to use AI technology to understand both qualitative and quantitative aspects of financial statements in making management decisions,” Xing said. “As educators, we are training our workforce not to do things that AI can do, but to understand them.”
Ethics in an AI-driven world
To maintain ethical standards in academics, HSB faculty set clear guidelines about AI-generated content in their curricula and require students to document the steps they take on certain assignments. In research, HSB faculty give AI and LLMs well-defined, supervised roles.
Meanwhile, governments, companies and academia continue to debate AI ethics, from plagiarism in essays to machines usurping jobs, intellectual property and human likenesses. A common misconception is that AI can do what humans can do. Not so, according to Aoun.
“We are overhyping the sophistication of these systems,” he said. “Humans can transpose their knowledge from one domain to another, which is something machines cannot duplicate easily.”
Xing said Accounting professionals still must use their professional judgment, interpersonal experience and knowledge of company behavior to make sense of the data relationships in digital outputs.
“In the past, we trained students to do bookkeeping,” she said. “Now, we train them to understand the meaning behind bookkeeping to be better decision makers.”
At this stage, Ward has few qualms about workers being displaced in economic analysis.
“An Economics education forces you to think and learn the theory of how to adapt to marketplace changes, which is always a useful skill to have,” he said.
Yet some career fallout is expected as digitization and workplace automation surge ahead. The key to being impervious, or robot-proof, lies in lifelong learning and reinvention, according to Aoun.
“We are going to constantly reskill ourselves,” he said. “But this process of reinvention is not a call for individuals to reinvent themselves. It is for higher education to prepare people for a life of reinvention, which is going to be a key attribute. Incidentally, that may define us in a distinctive way from machines, because machines are not going to reinvent themselves.”