AI With a Conscience: What We Build Reflects What We Believe
Every community has a few people who quietly raise the bar for everyone else. In the AI Salon, Stacie Lech is one of them. She leads our Education Hub and is the founder of AI Teacher School, where she builds AI-powered tools and curriculum innovations for K–12 districts after fifteen years in the classroom. When her students worried about the water cost of AI, she didn’t hand-wave the concern. She dug in, traced the footprint across everything from TikTok to Bitcoin, and gave them the full picture instead of an easy answer.

Her work reminds us that better technology begins with better stewardship, and that honest teaching does more than transfer knowledge. It strengthens public trust. Stacie’s report is a model of what it looks like to pair technical skill with moral clarity, and it stands as a reminder that excellence means little unless it’s shared with generosity.

Stacie’s approach is simple: meet concern with evidence, and meet fear with context. She doesn’t shield her students from the harder truths about infrastructure, extraction, or cost. She gives them the whole map so they can make choices rooted in understanding rather than reaction. That alone sets her apart in a moment when most debates about technology bounce between hype and panic with nothing in between.

What struck me most about her investigation is how firmly it stays anchored in real communities. She shows how the same systems that power our lessons, our entertainment, and our digital lives are pulling from aquifers and municipal supplies in places that can’t afford strain. She refuses to frame this as a problem belonging only to tech. She treats it as a shared responsibility, because that’s what it is.

And she doesn’t stop at the warning. She points to the models that are already working: data centers that run on recycled water, facilities cooled by northern climates, and municipal partnerships designed around long-term balance rather than short-term gain. Her message is clear. We already…