(Originally published in Inside Higher Ed’s “Admissions Insider” on January 9, 2023)

“I am not a robot.” How many of us have been asked by Google or other websites to prove that? It is tempting to say that being forced to check the “I am not a robot” box is dehumanizing, but it’s actually humanizing. 


The “I am not a robot” checkbox is an example of a CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart), a tool designed to filter out spam and bots. I prefer to think that its real aim is to inconvenience me. Having to find the kitty pictures among a mass of images is annoying, although it’s still an improvement on having to type distorted text into a box. Does the fact that I have a hard time determining whether that’s a 0 (zero) or a capital O mean that I’m not human or that I might need glasses?


Will “Are you a robot?” soon be a required question on college applications? That question is raised by the recent introduction of ChatGPT, an artificial intelligence app that interacts conversationally, giving it the ability to “write.” A New York Times article describes ChatGPT (the GPT stands for “generative pre-trained transformer”) as “the best artificial intelligence chatbot ever released to the general public.” More than one million people signed up to test it in the first five days after its release.


To borrow from the title of a recent ECA column, it’s “too early to tell” what this means. Is this another example of technological advances making our lives both simpler and simultaneously more complicated? Another instance of science fiction turning into non-fiction? Another chapter in the age-old philosophical debate about what qualities distinguish us as human? Or the next step down the road leading to servitude to our smarter and hopefully benevolent machine overlords?


ChatGPT poses particular challenges for those of us who love the written word and those of us who work in education.  The New York Times columnist Frank Bruni asked in his most recent column whether ChatGPT will make him irrelevant. (I’d like to think not.) What happens to take-home essay assignments when you can’t be sure that the essay was written by Johnny and not his AI app? In higher education the humanities are already under threat. What happens to the humanities when the human component is removed? “Machinities,” anyone?


That brings those of us in the college admissions and counseling worlds to consider the college application essay. Does ChatGPT signify the end of the application essay?


I was interviewed for a Forbes article with the title “A Computer Can Now Write Your College Essay–Maybe Better Than You Can.” Forbes fed ChatGPT two college essay prompts, one the 650-word Common Application prompt, “Some students have a background, identity, interest, or talent that is so meaningful they believe their application would be incomplete without it. If this sounds like you, then please share your story,” and the other the “Why Wisconsin?” essay from the University of Wisconsin-Madison supplement. According to the article, each essay took ChatGPT less than ten minutes to complete. That is both far less time than we hope students would spend composing essays and far more time than most admissions officers spend reading essays.


I was asked to weigh in on whether the AI-produced essays were convincing, whether they looked similar to essays from actual high school seniors, and whether anything in the essays suggested that they were written by AI rather than a human being. My answer was that I probably couldn’t detect the AI authorship, but that I also wouldn’t label the essays as convincing.


I found both essays clichéd, with neither answering the prompt in a convincing way. Nor did they sound like essays a teenager would write, but rather like essays a teenager might write with major assistance and editing from an adult.


The Forbes reporter, Emma Whitford, had provided ChatGPT with the following factoids for use in the “identity” essay–competitive swimmer who broke his shoulder in tenth grade, interested in majoring in business, parents from Bangalore, India, who now own a restaurant in Newton, Massachusetts. ChatGPT threw all of that at the wall in formulating the essay, with some interesting creative embellishments. The writer began swimming competitively at the age of nine, the broken shoulder came in a swimming accident, and the interest in business came from working in the family restaurant, where he helped his parents with “inventory management, staff scheduling, and customer relations,” as well as marketing and advertising and developing new menu items.


The “identity” essay did exactly what many student essays do, throwing out lots of things in hopes that something will stick. But it didn’t really address the prompt. The weakest part of the essay, in fact, is the part dealing with the student’s Indian heritage. It consists of vague generalities about “a deep appreciation for Indian culture” and “the challenges and opportunities that come with being a first-generation immigrant,” but there is nothing in that paragraph showing how coming from an Indian background has influenced the student’s experience or world view. Can I imagine a student writing such an essay? Yes. Are my standards for what makes an essay compelling too high? Possibly.


The “Why Wisconsin?” essay had similar characteristics. The information provided to ChatGPT included an intended major in Business Administration and Marketing, part-time work at the family restaurant, and a love for Badger football. Again, the bot showed some creativity in expanding on those themes. It referenced the student’s starting as a dishwasher and progressing to researching the restaurant’s competition and identifying its “unique selling points,” and included a Camp Randall Stadium reference. But, like many student first drafts of the “Why…?” essay, there is nothing that shows any real familiarity with the university or that would prevent one from inserting any other university’s name into the essay.


Nevertheless, the quality of these essays is either impressive or scary, depending upon your perspective. This seems like a major leap beyond learning that a computer could defeat a human world champion in chess.


So what are the ethical implications? That, after all, is the focus of ECA.


The low-hanging fruit answer is that it is clearly unethical for a student to submit an essay written by ChatGPT. The more complicated question is whether it is unethical for a college to require an application essay or make the essay a significant factor in evaluating a student’s application. How can you use an application essay to help make admission decisions when you can’t tell whether the student actually wrote the essay?


Then again, in how many cases is an essay determinative for an admissions decision? I think essays, like test scores, are overrated by the public. Personal statements and essays are important for some students at some colleges. Most colleges are not selective enough to give attention to a student’s essay unless it contains some kind of red flag. It is only at the very highly selective/rejective colleges and universities, where the vast majority of applicants have superb transcripts and scores, that the voice piece of the application, including essays, becomes important and differentiating.


It is already clear that ChatGPT is capable of composing a passable essay, and that may be enough to augur the end of the personal essay as an admissions factor. Just how good an essay AI can produce may depend on the quality of the information given to it. My father was a pioneer in the computer field, and I learned early the concept of GIGO–Garbage In, Garbage Out.


I’m far from convinced that ChatGPT can produce great college essays. Great essays have a spark to them that is not about the ability to write but rather the ability to think. Great personal essays are clever and insightful, with an authenticity and a sincerity that’s–well, personal. As Roger Ailes once said about public speaking, you either have to be sincere or fake sincerity, and it’s very hard to fake sincerity. 


That skepticism toward ChatGPT’s writing abilities may label me as either a dinosaur or a dreamer. It wouldn’t be the first time. But I’ll take either over being a robot.