Adrian Perkins is a sophomore at ʻIolani School in Honolulu.
The consensus was not that artificial intelligence should be banned forever or embraced, but rather critically understood.
In December, I sat in a crowded ballroom in Houston with over 200 students from 39 schools across 19 states. We had been brought together by the Close Up Foundation and Stanford University’s Deliberative Democracy Lab to do something different: deliberate, as peers, about the future of artificial intelligence in schools and society.
I was interested in this program because I am someone who values the human voice, history, ethics, and the humanities. We heard from experts in education, technology, ethics, and public policy.
Through this process, we talked and disagreed, though ultimately, we worked toward a consensus.
Ideas showcases stories, opinion and analysis about Hawaiʻi, from the state’s sharpest thinkers, to stretch our collective thinking about a problem or an issue. Email news@civilbeat.org to submit an idea or an essay.
I walked in thinking AI was mostly used for academic dishonesty and students shortcutting the education process. I walked out understanding it as something far more complex. It was a question about cognitive development, educational equity, and what we actually want schools to accomplish.
Perhaps most striking is that the students in that room, from different schools and states, all arrived at remarkably similar conclusions. The consensus was not that AI should be banned forever or embraced unilaterally. It was the realization that most schools are skipping an important step.
They are handing students powerful AI tools, like ChatGPT and Gemini, before those students have built the foundational critical thinking skills that make AI an accelerant to learning rather than a hindrance.
This is a real concern for me and for many of the other students I spoke with. Students’ brains are still developing. I have watched students (and have been tempted myself) immediately paste a chemistry problem set, algebra problems, or English discussion questions into AI.
The habits we build now at school — how we read, reason, and struggle through a problem before arriving at understanding — will inevitably shape the thinkers we become. When a student turns to ChatGPT the moment a task feels difficult, that struggle gets bypassed.
The learning and development never happen. Students become dependent on AI rather than using it to fuel their development.
Despite their best efforts to respond responsibly to AI, many schools have created policies that are too strict, too permissive, inconsistent across teachers or disciplines, or unevenly enforced. For many students, the result is a growing culture of quiet reliance on tools that think for us.
When I returned from Houston, I worked with my classmates to propose new policy recommendations for our school. As an alternative to large language models (like ChatGPT or Google’s Gemini), we encouraged the use of education-specific AI tools — like Flint AI, where teachers set the parameters and the AI tutors students rather than completing the work for them.
We shared our belief in the importance of delaying AI exposure — perhaps even until ninth grade — to give younger students time to develop core academic skills, critical thinking, and nuanced personal opinions first. We also recommended the kind of AI education we received in Houston, so students could understand the cognitive, ethical, and environmental implications of the technology they use every day.
These recommendations came from students who use AI and who want to be prepared for a world where it is everywhere. We are not asking to be shielded from AI; we want to be taught how to use it in a way that makes us better thinkers.
The workplace is rapidly changing with new jobs being eliminated and created every day. We want to be ready to adapt our minds and skills to whatever comes our way.
What I found most enlightening was how effective the deliberation process was at fostering nuanced thinking among students. When students are given real information and data, structured time to discuss it, and the expectation that they will engage seriously with those who disagree, the conversation evolves. It stops being about emotionally driven arguments and starts being about compromises and the middle ground.
Students need a seat at the table when schools make AI decisions.
One student in my group was initially hesitant about AI use in schools. His perspective shifted after learning about a Washington, D.C., school that uses a color-coded system to indicate appropriate levels of AI use based on how it supports learning, thinking, and creation. Teachers were able to label each assignment with these indicators, making expectations clear. This framework helped him see how AI could be used responsibly in classrooms.
That experience convinced me that students need a permanent seat at the table when schools make decisions about AI. Schools should create structured spaces, whether in deliberation forums or student roundtables, where young people can engage with policy questions that directly affect their education.
When students are trusted to think seriously about these issues, they rise to the occasion. I saw it happen in Houston, and I know it can happen at our schools around the island.
Hawaiʻi’s schools, public and private, are all navigating the same uncertain issues. The decisions being made right now about AI access, policy, and education will shape a generation of students. Those students deserve to be part of the conversation and contribute to it.
Community Voices aims to encourage broad discussion on many topics of community interest. It’s kind of a cross between Letters to the Editor and op-eds. This is your space to talk about important issues or interesting people who are making a difference in our world. Column lengths should be no more than 800 words and we need a photo of the author and a bio. We welcome video commentary and other multimedia formats. Send to news@civilbeat.org. The opinions and information expressed in Community Voices are solely those of the authors and not Civil Beat.
It's cool to see that it's not just my school that's in the middle of figuring out how to address AI use in academics. I'm a senior in engineering, and I'm now hearing conversations about using AI to process data and clear introductory programming classes. Students are likely to be familiar with the matrix algebra necessary to perform a given task, but implementing that in MATLAB is another challenge which AI could assist with, so instructors are evaluating incorporating that into the curriculum. Something that's worrying me is that incoming freshmen seem to have pretty bad problem-solving skills and endurance. That's not really a problem in the classroom, but for a profession like engineering, which is about weaving learned fundamentals with reasoning, it's pretty worrying.
IantheEngineer·
2 months ago
Wow, Adrian, what a great report. I'm 88 years old, and AI is of another dimension for me. AI is not good or bad; it's about how it helps each individual using it to survive. My, what a big difference in the learning methods from when I was a sophomore and today. Make the best of it.
kealoha1938·
3 months ago
Excellent article. I hope all schools in Hawaii will take your advice to include students in their discussions of AI use in education. I suspect young people are more aware of the advantages and disadvantages of AI than are most of their teachers or administrators.