Embrace the Change: AI is Ours to Own

As, respectively, the head of school at de Toledo High School in West Hills, California, and a member of that school’s board of directors, we have been ruminating about the impact of AI and the role of educational institutions in addressing new technologies. We have drawn on our different professional and personal experiences to examine this still-nascent technology and what it may mean, viewed both from the ground and from the balcony.

 

Meet the Challenge Head On

In 1494, Abbot Johannes Trithemius, a scholar of some renown, declared the “mass” printed Gutenberg Bible “the Devil’s work.” (Ironically, he had this concern copied and distributed by the printing press.) H.M. Warner (one of the Warner brothers) famously declared in 1927, “Who the hell wants to hear actors talk?” Needless to say, both gentlemen misread the value of innovation. History is littered with examples of “experts” like these who dismissed or feared the latest technological innovations.

In recent times, we have dealt with successive technological opportunities and challenges presented by the advent of personal computers, search engines and the seemingly endless amount of internet content ripe for plagiarism. We believe that schools cannot shy away from their responsibility to embrace technological and pedagogical advances and offer students access and understanding, coupled with ethical guidelines and other guardrails.

 

The Successful Embrace of Technology to Date

Secondary schools have embraced technology since the early days of personal computers in a way that is pedagogically sound and grounded in ethical behaviors. Secondary school curricula have evolved along with these technologies to encompass educational best practices on conducting research, evaluating the quality of sources, and developing the critical thinking required to parse masses of information and disinformation. 

Not so long ago, with the advent of online “term-paper mills,” many in our midst predicted that student research and writing would fall by the wayside, as responses to assignments could no longer be trusted. Websites like turnitin.com quickly sprang up, allowing educators to feed papers into a database that could determine whether the work was the student’s own or “borrowed” from another source. These sites have become increasingly sophisticated. Importantly, because students know that their teachers use these systems, most avoid falling into that trap. Plagiarism today isn’t any more prevalent than it was when students lifted pages from encyclopedias and National Geographic magazines in the 1950s.

 

Today’s Challenge

The situation today is not appreciably different from when any new technology emerged. Let us not be like the learned minds at Western Union who, in an internal memo in 1876, declared, “This ‘telephone’ has too many shortcomings to be seriously considered as a means of communication. The device is inherently of no value to us.”

We can tell students to unplug and to avoid ChatGPT and other large language models, but that’s a fool’s errand. Instead, as educational leaders, it is our job to find ways to incorporate this technology into our lessons and to teach the ethics, values and morals of how best to use it. As Carl Sagan noted, “The world-altering powers that technology has delivered into our hands now require a degree of consideration and foresight that has never before been asked of us.”

The best example of the impact of technology on education in Sagan’s time was the invention and introduction of the calculator. The response was quick and definitive. We could no longer “see the work,” so all of a sudden, the impetus to learn multiplication tables and formulas had to take a back seat to learning to use these new contraptions. At first, teachers bristled at the use of the calculator, but over time, they learned how to incorporate the tool so that their students could have a deeper and broader understanding of mathematics. Today, the calculator’s successor, the Excel spreadsheet, is the primary teaching tool in business schools.

Sagan was writing late in the last century, but his observation is even truer of today’s technologies. We seem to have resolved the challenge of search engines and plagiarism, but AI poses even greater challenges. ChatGPT can produce writing in mere seconds, writing that often seems, on the surface, quite human and well researched. It truly is a great leap, and it poses serious ethical and educational questions. But instead of trying to reject, ban or block access to this and other emerging AI technologies (which our “digital native” students will quickly learn to circumvent), we need to learn how to use them not only for instructional purposes but to better prepare students for the world and workplaces they eventually will inherit.

 

What This Is…and What It Is Not

It might be worth a sidebar here to acknowledge what ChatGPT is and what it is not. It is a large language model (LLM). It is not a sentient being capable of thought, analysis, novel insights or humor. An LLM stitches together a kind of collage from the enormous body of prior writing, much of it drawn from the Internet, on which it was trained. Think of it as “Google on steroids.” How it “writes” an essay has as much to do with word frequency and word-order patterns as it does with the actual subject matter. What it lacks is perspective, context and human emotion. Both of us write a lot, and one can ask ChatGPT for an essay “written in our voice.” What one receives in response has some narrative flow and organization, even incorporating a writer’s verbal “tics” and typical usage. But it lacks any sense that there is a human behind the writing: real analysis of disparate texts, ideas or events, and conclusions that go beyond the mechanical. In addition, there are already a multitude of examples where the information ChatGPT spits out is simply wrong or incomplete (see https://tinyurl.com/mvcjs4m9).
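For readers who would like a concrete feel for what “word frequency and word order” means in practice, the sketch below is a deliberately tiny toy in Python. It is emphatically not how ChatGPT works internally (real LLMs rely on vastly larger neural networks trained on billions of words), but it shows how fluent-sounding text can emerge from nothing more than statistics about which words tend to follow which. The “training” sentence and every value in it are invented for illustration.

```python
# A toy "word order" generator, for illustration only. It is not ChatGPT,
# but it captures the basic idea: text produced purely from statistics
# about which word tends to follow which, with no understanding involved.
import random
from collections import defaultdict

# A made-up scrap of "training" text.
training_text = (
    "students learn to write essays students learn to think "
    "teachers help students learn to evaluate sources"
).split()

# Count which word follows which (a simple bigram table).
followers = defaultdict(list)
for current_word, next_word in zip(training_text, training_text[1:]):
    followers[current_word].append(next_word)

# "Write" by repeatedly picking a plausible next word.
word = "students"
output = [word]
for _ in range(8):
    word = random.choice(followers.get(word, ["learn"]))
    output.append(word)

print(" ".join(output))  # e.g. "students learn to think teachers help students learn to write"
```

Scale that idea up by many orders of magnitude and add far more sophisticated pattern-matching, and you begin to see both why the output reads so smoothly and why there is no perspective, context or emotion behind it.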

 

A Prescription for Success

So what can be done? First, we all must get smarter about what this new technology is and what it could become. Schools need to bring their teachers the best information about it and provide them with the tools to use it, along with an understanding of its promise and its limitations. Second, we need to create lesson plans that educate students on the use of these models and the ethical issues their use presents. Third, we need to continue down the road started by Google and concentrate our students’ focus on critical thinking and analytical writing, not the mere recitation of facts. Fourth, we need to identify the support mechanisms that undoubtedly will arise hand in hand with the emerging technology. Before too long, turnitin.com and its ilk will be able to flag problematic text by drawing on the same algorithms and methodologies that ChatGPT and others employ. The technology will be harnessed to police the very abuses it may encourage.
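As a hedged illustration of how that policing might work, here is one idea that AI-writing detectors have publicly described: machine-generated prose tends to be statistically more uniform, or less “bursty,” than human prose. The Python sketch below measures only the variation in sentence length, and its threshold is invented for this example; real tools are far more sophisticated, and this is not Turnitin’s actual method.

```python
# A deliberately crude sketch of one detection heuristic: human writing tends
# to mix long and short sentences, while machine writing is often more uniform.
# The threshold below is invented for illustration, not any real product's rule.
import statistics

def burstiness(text: str) -> float:
    """Standard deviation of sentence lengths, measured in words."""
    cleaned = text.replace("!", ".").replace("?", ".")
    sentences = [s.strip() for s in cleaned.split(".") if s.strip()]
    lengths = [len(s.split()) for s in sentences]
    return statistics.pstdev(lengths) if len(lengths) > 1 else 0.0

def flag_for_review(text: str, threshold: float = 3.0) -> bool:
    # Hypothetical rule of thumb: very even sentence lengths earn a closer human look.
    return burstiness(text) < threshold

sample = "The calculator changed math class. Teachers adapted, slowly at first. So did students."
print(burstiness(sample), flag_for_review(sample))
```

No single heuristic like this is reliable on its own; the point is simply that the statistical habits that make AI text fluent also leave patterns that software, and an attentive teacher, can learn to notice.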

From a practical point of view, we must remember that we are in the early stages, the overture, of AI and LLMs. Just as it took years for education to incorporate the calculator into its pedagogy, so it will take time for our approaches to this new reality to iterate and improve. But already, in the few short months since its introduction, there are numerous resources for educators to draw upon. Most importantly, teachers should open their classes by discussing with students how AI can and cannot be used in class. Especially in our mission-driven independent schools, that ethical conversation is essential.

 

 
Great Promise

ChatGPT can be used to help students create and revise their thesis statements and outlines. It can serve as a study guide to help students prepare for exams. LLMs can help students imagine different ways of approaching assignments, allowing them to focus more on the process and the learning and less on the end product. In an era of “fake news” and questionable sources, ChatGPT can help students learn to evaluate sources and information. And for the significant percentage of the population that deals with learning differences, AI and LLMs open up a host of new resources and approaches to help navigate those differences.

Human interaction and understanding between educator and student will be forced to improve. Increasingly, teachers will need to better “know” their students, their capabilities and their writing styles. They will learn to be better consumers of their students’ work product. And they will need to learn how to better differentiate between machine writing and the writing of their students. Combating AI misuse in high school essays might be as simple as noticing the absence of personal anecdotes, the inability to compare and contrast texts from different eras, or the lack of empathy or perspective in an analysis.

 

Conclusion

In the words of Pogo, “we have met the enemy, and he is us.” The machine is not the adversary, any more than a hammer, a calculator or a personal computer is. The machine is, in the end, merely a tool. The real challenge is twofold: first, embracing and understanding how this new technology can benefit humankind; second, anticipating how humans may misuse these miraculous new tools and working to redirect their use for mind- and education-expanding purposes.

AI can’t feel love, can’t experience loss, can’t apply ideas to lived experience, can’t get angry at injustice and can’t formulate ideas to improve the world. Research skills are important, and AI no doubt will help refine those skills. But it is these very human emotions that we should teach our students to feel and express through their writing. We should demand proficiency in humanity over recitation of facts and critical thinking over mere research. Isn’t this, after all, what our job always has been?
