I was driving home from Palo Alto to San Francisco, a journey I’d done dozens upon dozens of times before. Only this time, I faced a problem: a phone without power; a journey without GPS. I missed my exit and became hopelessly lost in streets less than a mile from my home. How embarrassing: I claim to love this city, and yet in that moment I felt I barely knew it. Suddenly deprived of my tech, I was unable to find my way, because I had never needed to actually learn it.
I’m not arguing against the use of GPS. But I bring it up to demonstrate that efficient technology can be an impediment to learning. Only through effort and repetition, without shortcuts, can we truly retain useful knowledge.
Much has been written about GPT-3, one of the world’s most advanced artificial intelligence systems. It can do things that would have been considered science fiction just a few years ago, such as generate realistic-sounding articles, or translate between languages it has never seen before. It does so by learning from a vast amount of text, and then making predictions based on that data.
(It also wrote that last paragraph, using just the prompt “much has been written about GPT-3”. I’d like to think I would never stoop to using that writing cliché, “like science fiction”.)
This kind of AI-generated text is creating waves in academia. It’s an inflection point from which we should be careful in how we proceed. A recent Vice article detailed how a community of students was using GPT-3 (and other similar AI text programs) to do the grunt work in writing essays, filling in context and saving time. Because the AI-generated text was “unique”, it allowed students to evade anti-plagiarism detection software. “I just use AI to handle the things I don’t want to do or find meaningless,” said one student.
Is the student cheating? You could argue convincingly in either direction. It may be simpler to ask whether the student is cheating themselves, to which the answer is surely yes. Those things students don’t want to do are what underpin retention. Writing, rethinking, retaining, over and over.
Practice makes perfect. We’ve all heard of the “10,000 hours rule” — the amount of intensive practice supposedly needed to master something — but we have many ways to make the same point: repetition means remembering. Remembering means learning and mastering.
Hermann Ebbinghaus, a psychologist who studied the benefits of repetition, illustrated this with his “forgetting curve” — demonstrating how knowledge escapes over time if not consciously remembered — and “spaced learning”, repetition over regular intervals. His work has influenced how we learn for more than a century. It’s the difference between becoming an expert and merely passing a test. Does a student deserve an “A” grade if the algorithm does the legwork? He or she becomes no more aware of the subject than I was of my direction home.
Besides, experts in the capabilities of today’s AI warn against it in a blunter sense. Nathan Baschez, creator of Lex.Page, a word processing system that can be used to summon GPT-3 to bulk out your sentences, told me it should be used with great caution in “high stakes” environments like journalism or academia.
“GPT-3 can just make up facts that aren’t true and say other things that are nonsense,” he said. But it’ll only get better. It’s always learning. Are we?
Dave Lee is a San Francisco-based correspondent for the FT. Follow @FTMag on Twitter to find out about our latest stories first