In case you haven’t heard, artificial intelligence (AI) is taking over the world. Maybe not in the Skynet/apocalypse sense, but the topic has been everywhere since the release of ChatGPT in November of 2022. 

As the leader of an EdTech company serving more than 10,000 schools each year, I quickly realized AI would affect our industry, the broader world of education, and the millions of teachers and students we support. But when we started asking teachers how they felt about it, their reactions were mixed at best.

Of the more than 400 teachers who responded to our survey in April, only 18% felt “in the loop” about recent advancements in AI and large language models like ChatGPT. Only 20% felt strongly that they would be using AI or AI-powered tools in some form during the ‘23-’24 school year, and only 28% reported feeling “excited about the potential applications for AI in the classroom.”

Students working on eSpark during a site visit!

At eSpark, we’ve been working closely with teachers and students for the past several months to develop our own AI-powered reading activity, Choice Texts. It’s been fascinating to see some of that early skepticism melt away when teachers observe the impact of the technology. Frankly, the response from our testing with reluctant readers, in particular, has been emotional in a way we haven’t experienced with previous work. 

No, AI will not be this wonderful cure-all that solves all of our problems overnight. But it will also not be some soulless, evil force that teachers spend all their (nonexistent) free time trying to fight. The answer, as usual, is somewhere in between. Many of the school and district leaders we’ve spoken to already feature AI in their professional development plans for next year. EdTech companies also have some responsibility to educate. Together, we can lay the groundwork for AI standards and best practices while clearing the fog of fear, uncertainty, and doubt. 

Here’s what to consider when evaluating AI-powered apps for the classroom:

1. Is it safe, secure, and age-appropriate?

All three of these are non-negotiable, as they should be. We know students will find a way to push the boundaries of any program you put in front of them. AI opens up a world of possibilities; not all are good. Fortunately, some excellent content moderation tools are available for our developers to catch bad inputs before they result in bad outputs. Still, be sure to ask questions and put any new tool through rigorous testing before you put it in front of students.

Behind the curtain: For Choice Texts, we reinforced existing moderation tools with our own in-house safeguards and are continuing to test, monitor, and make updates to ensure we’re catching everything. We also knew from the start that it would be crucial to give teachers a way to monitor what their students see in any AI-powered tool and provide them with the power to intervene in a way computers cannot.
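A layered setup like this can be sketched in a few lines. Everything below is purely illustrative, not eSpark's actual code: the blocklist is a toy stand-in, and a real deployment would call a hosted moderation model rather than a stub.

```python
import re

# Illustrative in-house blocklist; a real one would be far more extensive
# and maintained alongside human review.
BLOCKED_PATTERNS = [r"\bweapon\b", r"\bviolence\b"]

def external_moderation_flagged(text: str) -> bool:
    """Stand-in for a third-party moderation check (e.g. a hosted API)."""
    return False  # assume the external layer passed for this sketch

def in_house_flagged(text: str) -> bool:
    """Second layer: simple patterns maintained in-house."""
    return any(re.search(p, text, re.IGNORECASE) for p in BLOCKED_PATTERNS)

def is_input_safe(student_input: str) -> bool:
    # A prompt must clear BOTH layers before it ever reaches the model.
    return not external_moderation_flagged(student_input) and not in_house_flagged(student_input)

print(is_input_safe("a story about a friendly dragon"))  # True
print(is_input_safe("a story about a weapon"))           # False
```

Catching bad inputs before generation is only half the job; the same checks can be run on model outputs before anything is shown to students.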

Security

Security has been another top-of-mind concern for many. Information sent to ChatGPT is not private or secure by default, as several companies have learned the hard way. If you’re using an app that includes any personally identifiable information in its prompts, it is likely not compliant with federal or state student privacy laws.

Behind the curtain: For Choice Texts, we made sure to strip away all identifying information from student inputs, so there is no way to tie any prompts back to the student, teacher, or school. 
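De-identification of this kind usually amounts to swapping known identifiers for neutral placeholders before a prompt ever leaves your servers. Here is a minimal sketch; the helper name and placeholders are hypothetical, not eSpark's actual code.

```python
import re

def strip_identifiers(prompt: str, student_name: str, school: str) -> str:
    """Replace known identifiers with placeholders so nothing in the
    outgoing prompt can be tied back to a student, teacher, or school."""
    # Scrub email-like strings first so the name substitution can't
    # leave partial addresses behind.
    cleaned = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[EMAIL]", prompt)
    for value, placeholder in [(student_name, "[STUDENT]"), (school, "[SCHOOL]")]:
        cleaned = re.sub(re.escape(value), placeholder, cleaned, flags=re.IGNORECASE)
    return cleaned

print(strip_identifiers(
    "Write a story for Maya at Lincoln Elementary (maya@school.org).",
    "Maya", "Lincoln Elementary"))
# Write a story for [STUDENT] at [SCHOOL] ([EMAIL]).
```

Pattern-based scrubbing is a floor, not a ceiling; production systems typically pair it with strict data-handling agreements and regular audits.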

Age-appropriateness is possibly the most difficult of these three to get right. A text passage that’s interesting and on-level for a kindergartener will fall flat for a fifth grader. Things to look for when evaluating whether an AI app is right for your classroom include the following: 

  • Is the reading level accurate for the grade you are teaching and/or the level your students are working at?
  • Do your students have the background knowledge to relate to and understand the content? 
  • Is the content engaging for your students, or does it feel “too young” or “too old” for them?
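For the first bullet, readability formulas offer a quick (if imperfect) sanity check. The sketch below uses the standard Flesch-Kincaid grade-level formula with a rough vowel-group syllable heuristic; dedicated libraries such as textstat handle the edge cases more carefully.

```python
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: count runs of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text: str) -> float:
    """Flesch-Kincaid grade level:
    0.39 * (words/sentence) + 11.8 * (syllables/word) - 15.59"""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

simple = "The cat sat on the mat. It was a big red mat."
hard = "Photosynthesis transforms electromagnetic radiation into chemical potential energy."
print(fk_grade(simple) < fk_grade(hard))  # True
```

A score alone can't judge background knowledge or engagement; those still require a teacher's eye.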

Behind the curtain: We’re still working to find the sweet spot for Choice Texts, and students have been a tremendous source of feedback for us. In one early round of testing, some fifth graders remarked that their stories felt “too childish” and even requested an option to make the text harder. That was a good indication that we needed to go back and look at our prompts again to deliver the same magical experience across all the grade levels we serve.

2. Is it accurate and reliable?

As “intelligent” as ChatGPT appears to be, the technology is still very much prone to “hallucinations,” which is a fancy way of saying it makes things up. That’s a dealbreaker in an educational setting. We can’t afford mistakes when those mistakes spread misinformation and confusion among students.

Currently, we can’t rely on AI to perform the role of instructor. We’ve already seen real-life examples: AI-generated court filings citing fake cases, Google’s Bard chatbot sharing false information about the James Webb Space Telescope in a widely circulated promotional video, and, the first time I tested a math prompt, ChatGPT telling me the hour hand on a clock was the big hand.

Behind the curtain: This issue drove much of our work within the Reading Informational domain. Reading Literature was easier to work with: in fictional stories, factual accuracy matters far less than it does in passages about science or social studies topics. We ultimately settled on building a much stronger fence around our RI lessons at launch, keeping the playful personalization intact while ensuring everything students see is factually accurate.

The most significant role for AI early on will be as a facilitator—we’ve already seen several chatbots designed to provide feedback, scaffolding, and remediation for students in real time. Personalization will also be popular, given AI’s ability to generate text and images based on student inputs or data points.

3. Is it instructionally sound?

Teachers who’ve been around since the Wild West days of early EdTech have seen the rise and fall of thousands of flashy products that never delivered on their promise to improve student outcomes. ESSA’s evidence-based requirements helped rein things in a bit, and any AI tools will need to meet those requirements to prove they are more than just window dressing.

Behind the curtain: This has been a top priority for our learning design team from the start. Nobody would be happy releasing a product that wasn’t rooted in the principles of high-quality instruction. The old saying “garbage in, garbage out” holds true with any AI tool. We spent weeks researching, testing, and improving the rubrics we use to evaluate Choice Texts outputs. We adjusted our prompts continuously (and continue to change them as needed) until we could consistently meet the high standards we set for text quality. 

Eric presenting Choice Texts to a local classroom.

Trust, but Verify.

The AI revolution has already happened, and the technology is here to stay. I’m confident teachers will eventually receive the training and resources they need to make informed decisions about what their students use in the classroom. In the meantime, the three pillars above offer a good starting point for any evaluation.

As with any new tech, there will be growing pains, but it’s been encouraging to see most schools and districts shift rapidly from blanket bans driven by cheating fears to a “we can’t fight it, so we need to prepare our students for it” mentality. AI has the potential to redefine personalized learning on a level we previously didn’t think possible, but it will have to be done purposefully and with great care.

On behalf of eSpark, I can promise that we will do our part to anticipate these challenges and stay ahead of them so you don’t have to worry. I hope other EdTech companies will do the same.

Teachers can try Choice Texts for free here!


Eric Dahlberg is the CEO of eSpark Learning and a leading proponent of “playful personalization” in the classroom. Eric has been on the front lines of AI in education since the release of ChatGPT in November 2022, when he realized that our understanding of what was possible with personalized learning was about to change forever.
