Practically every week another article is published about how AI will benefit classroom teachers. It’s always something to the effect of: 

“By automating grading, planning, and administrative work, artificial intelligence systems can free up educators' time and energy for increased student contact.” 

While this statement may hold some truth, it misses the larger point. Teachers shouldn’t be thought of as passive beneficiaries of AI’s ability to automate tasks (some of which many teachers enjoy). Instead, they should be recognized for having the ideal skill set to shape the direction of AI for themselves, their peers, and their students.

I’ve been using the term “teacher,” but this applies equally well to all educators, from curriculum developers and instructional designers to researchers, instructors, academics, and librarians. These professionals can leverage their expertise not just to use AI but to guide the development of new AI systems.


Isn’t the Current State of AI Similar to That of a Young Mind?

AI is not a database of knowledge. The neural networks underlying the large language models (LLMs) that power AI are designed to mimic the function of neurons in the human brain. They form and strengthen connections between concepts, applying weights and biases to arrive at nuanced associations between them. Just like students, AI needs to be guided, challenged, tested, and interacted with in a supportive, positive way to achieve the best outcomes. That’s what teachers do.
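To make the “weights and biases” idea concrete, here is a purely illustrative Python sketch of a single artificial neuron; the function name and numbers are invented for this example, but the weighted-sum-plus-bias computation is the basic building block that LLMs stack by the billions.

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: a weighted sum of inputs plus a bias,
    passed through a sigmoid to give an activation between 0 and 1."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))

# Larger weights model stronger learned associations between concepts:
# the same input produces a stronger activation.
weak_link = neuron([1.0, 1.0], [0.1, 0.1], 0.0)
strong_link = neuron([1.0, 1.0], [2.0, 2.0], 0.0)
```

Training is, loosely, the process of nudging those weights until the network’s associations line up with good examples, which is what makes the student analogy more than a metaphor.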

Educators are in the business not just of transferring knowledge, but of guiding young minds in how to think and arrive at their own conclusions. When the same approaches are applied to AI, the practice is called prompt engineering. Researchers in prompt engineering experiment with pedagogical principles to understand how to guide AI interactions toward desired outcomes. The most effective strategies for interacting with AI, such as question framing, problem structuring, and guided exploration, are exactly what teachers do every day with their students.
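As an illustration of how those classroom moves can map directly onto a prompt, here is a hedged sketch; the helper function and template wording are invented for this example, not drawn from any established prompt-engineering framework.

```python
def scaffolded_prompt(topic, question):
    """Wrap a question in classroom-style scaffolding:
    frame the topic, structure the problem, invite guided exploration."""
    return (
        f"We are exploring the topic of {topic}.\n"                           # question framing
        "Break the problem into smaller parts before answering.\n"            # problem structuring
        f"Question: {question}\n"
        "Work through your reasoning step by step, then state your answer."   # guided exploration
    )

prompt = scaffolded_prompt("photosynthesis", "Why are most leaves green?")
```

The same framing a teacher uses to structure a student’s thinking becomes the structure of the prompt itself.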

Educators are not merely overburdened laborers in need of AI to relieve them of work. They are highly skilled professionals with unique expertise in communication, context, and clarity that sets them up to be the most insightful practitioners with AI. They are the original prompt engineers! Who better to guide and grow AI into an effective, adaptable educational co-creator?


What Might Educator-Trained AI Look Like?

Every day, new AI tools appear that focus on automating administrative tasks, and lately there’s a trend toward applications in lesson planning and even student tutoring. Some of these will be real lifesavers for educators, but educators have deeper and more nuanced needs that deserve focus. In today’s political landscape, the need for locally developed, customized educational materials is more pressing than ever: political shifts and policy changes often require curriculum adjustments that reflect local values, needs, and prohibitions. Educators are well positioned, and in fact are already trying, to use AI in a thoughtful, informed way to adapt materials to meet these demands while ensuring the content remains relevant and responsive to their classrooms.

The coming generation of AI solutions will be quick learners; as such, they have the potential to be vital partners. They might be a source of professional development. They might discuss and explore, or compare and contrast methodologies. They might help with research tasks, identifying novel source materials that connect topics with their classroom. They might brainstorm on how to provide varied perspectives on subjects so every student can understand them. They might serve as a source of constructive feedback and constant encouragement in an increasingly challenging and polarized world.


Keep Humans at the Heart of Education

We are just at the beginning of seeing AI’s impact on education. Some of it will be good, some less so. It’s important to recognize that AI’s transformative power lies in enhancing educators’ practices, not in replacing them. AI in education should be in the hands of educators, who ensure that the human element remains central. Technology shouldn’t reduce the humanity of education; it should amplify it. In the whirlwind of the current hype cycle, let’s not miss the obvious and incredible reality before us: the best-prepared AI trainers amongst us are in the classroom.


Thanks to Christine Zanchi at iCivics and Dr. Carly Muetterties, my colleague at CommonGood, for their insightful contributions.




If you were to read a random sample of articles on generative AI’s potential impact on teaching & learning, you’d come away with a fairly bleak view on the state of classrooms. The prevailing thinking seems to be that teachers are inundated with thousands of short, often unrelated tasks. And that the most humane thing that we can do as developers is to take some of those off their plates; find a teacher's pain point and smash it. Repeat.


At CommonGood we have a very different view of teaching. We believe that most teachers are deeply interested in big ideas, in the ways those ideas connect to their students’ lives, and in planning learning experiences that make them come alive. We think that, given the option, teachers will dedicate a large portion of their attention to this process. Rather than casting teachers as taskmasters barely keeping chaos at bay, could we behave as though teachers are called to think deeply and plan expertly as a fundamental part of their practice?


In a recent interview, OpenAI CEO Sam Altman suggested that we will soon transition from using AI to solve small, five-minute problems to solving more complex ones. “Someday they’ll do 10-minute tasks, and then they’ll do an hour-long task,” Altman said. “But you’ll still have to think about, ‘How is this all going to fit together? What do I want to build?’” That sounds more like it. At CommonGood, we don’t see supporting teachers’ core work as a five-minute problem - and we have an opinion on how things fit together.


For around a decade, our founders have gathered educators in cohorts to consider deeply their students’ communities, identify the learning that will be most compelling to them, and flex our collective learning-science muscles to design learning experiences that lead to good outcomes. These processes are time-consuming, intellectually rigorous, and sometimes personally demanding. They often challenge our assumptions, shine new light on the communities we serve, and stretch our capacities. Educators also love them - this work can build new mindsets and communities of practice. Some have even said that it reinvigorated their love of teaching.


That’s good stuff. Perhaps even better is what these approaches yield for students. In a recent paper, we lay out how similar approaches don’t suggest incremental improvements to student outcomes. They suggest that transformation is possible. 


But these approaches require expertise and dozens of hours of work from participants. Few communities have the bandwidth to take this on. Enter Collaborative AI. CommonGood co-founder Kyle Morton designed a workflow-management platform with AI-powered supports for each step in the design process. As in the cohort-based approach, teachers are presented with the underlying theory for each step, along with recommendations and ideas on how to execute it. This educative process both builds fluency with evidence-based learning-design techniques and maintains the centrality of the teacher.


Even while maintaining healthy skepticism and high standards for outputs, the technology showed promise early on. “I’m a self-described design snob,” said co-founder Dr. Carly Muetterties. “I think that because the design processes are so well defined (leaning on peer-reviewed research and frameworks, with lots of examples to reference), the technology is very good at following those guidelines and making contextualized recommendations. The tools added value to my own design process very early on.”


Well-defined processes with lots of reference material are key to the shift being proposed. Using generative AI (or really any technology) to identify a small bug or teacher pain point and smash it doesn’t require much fluency with education theory or learning science. It presumes that once a product has been built to smash one bug, those involved will use the audience they’ve built to find and smash bigger bugs. But it can be difficult to make the transition from peripheral problems to more core problems of practice. By starting with well-established, evidence-based models, there is already a tremendous amount of information on what works, how pieces fit together, and what good outputs look like.


We tested and iterated on the design process over the course of years, working with and gathering feedback from many educators across a variety of contexts. We’ve compiled a huge amount of data on how design steps fit together, how to coach people through them, and what predictably leads to usable outputs. We’ve also done a lot of peer-reviewed writing on these processes. An evidence-based approach is, by definition, well defined.


And generative AI seems well suited to facilitating defined processes, particularly when there are lots of examples to reference. “When we set out on this build,” shared Dr. Carly Muetterties, “we’d hoped to make the process more efficient for ourselves and our clients. We have been successful there - in our own use we realize a 70-85% efficiency gain. What we didn’t entirely expect is that the platform can also make these approaches more effective. In my own practice, it generates well-reasoned recommendations for me to consider (which pushes my thinking) and surfaces sources that I wouldn’t likely have found. It’s humbling sometimes, but the technology has made me a better designer.”


Suggesting that technology be used to facilitate established models has a few important implications. First, we should stop casting teachers as an ‘in over their heads’ class of quasi-professionals constantly combating burnout. Rather, we should work on the assumption that they are learning-science practitioners, constantly using, testing, and informing the evidence base on what works in classrooms. Second, we should build tools in close partnership with people who have practice applying that evidence base to core problems. We should expect to be able to see which experts, which models, and which evidence are informing the technologies being developed for classrooms.


Our platform is being tested by partners on curriculum teams at both school districts / operators and solutions providers. These partners are among the most discerning and demanding users we could find, most with decades at the forefront of evidence-based R&D. We are excited to share more about what we learn from these partners in the months to come.


In sum, we think the best application of something as powerful as generative AI is to make proven models more accessible. Improving curriculum and instruction is not mysterious, just very difficult. Thankfully, we have 100+ years of learning science that holds unrealized potential. Our experience so far is that generative AI can make complex, difficult tasks simpler and time-consuming processes dramatically faster. We want to use this new technology to allow teachers to take on more - more inspiring, deeply impactful, even transformative work - not to do less.


We're excited to announce that the CommonGood team is growing in an important direction! Earlier this year, Kyle Morton joined us as a co-founder, bringing his expertise in tech and product development. With a solid background in building SaaS companies and a deep understanding of search, AI, and educational technologies, Kyle will help CommonGood reach more educators with better products.


“The AI revolution is in full swing, and there are many solutions for students and teachers already—some good, some not so much. What drew me to CommonGood is our shared vision to keep humans at the heart of education while finding ways to save time and money without cutting corners. It’s the kind of challenging problem I love, and I’ve found great partners in Carly and Evan,” Kyle shares.


Kyle’s career highlights include founding HapYak Interactive Video, a leader in video-based training, which Newsela acquired in 2021. Before HapYak, he led product strategy at RAMP, launching the first scalable video search solution for media companies. He also co-developed one of the first mobile advertising solutions at Third Screen Media, later acquired by AOL.


“The co-design approach at CommonGood is exactly what we need right now—creating educational resources that are culturally sustaining, locally relevant and highly effective. The big challenge is scaling it. This is a unique moment where technology can really help solve this problem. I’m excited to lead our efforts to use Ethical, Collaborative AI to make life easier for curriculum developers, teachers, and students,” Kyle adds.


CommonGood focuses on co-designing curriculum resources with and for diverse communities. By bringing together educators, community leaders, experts, schools, and districts, we collaboratively envision and create curricula that are both better and more reflective of the students they serve.


“Collaborative AI is about applying AI to real problems in real contexts, combining what algorithms do well with what humans do best. I believe this approach to curriculum development and customization will drastically reduce costs, exactly when institutions need to save money, while radically increasing the number of students who benefit from high quality educational materials.”


Kyle will be working closely with the CommonGood team, educators, and curriculum designers to develop solutions that enhance and amplify their work. We’re looking forward to the innovative strides we’ll make together.
