Implications of AI for K-12 Schools

On May 30, 2024, over 60 representatives of Consortium districts came together for a 3-hour in-person event to discuss implications of AI for their schools, building on their initial experiences as well as preliminary results from an NSF-funded study (award 2333764) involving interviews and surveys of K-12 leaders in the region on this topic. To share key lessons learned from this event, for each of the seven themes discussed we have captured recordings of the brief presentations that framed the small-group conversation on that theme and of the reporting back from that conversation.

Themes Addressed:

A. Realizing the potential of AI to improve instruction/learning 

Many educators have claimed that AI has the potential to radically transform how students may learn and be taught in the future – yet there has also been considerable debate about what this transformation may involve, as well as the risks that may be associated with it.

Framing questions:

  • What AI tools are currently used, and how, to support instruction? What are potential benefits and risks?
  • What are possible strategies to realize specific benefits, while minimizing related risks?

[Based on notes taken by Kristen Love / Summary generated by ChatGPT – With appropriate safeguards taken]

Key Points Discussed:

1. Current AI Tools and Support for Instruction:

  • Introduction of Microsoft Co-Pilot for report card comments, focusing on starting the process without overwhelming teachers.
  • Use of various AI tools like Gemini, ChatGPT, and others to support differentiated instruction, generate writing prompts, and assist with curriculum development.
  • Magic School was explored but found less effective due to imperfect prompting, leading to teacher discouragement.
  • Mention of other tools like Skillstruck and Tutor AI for enhancing interaction with students and keeping them on topic.

2. Benefits of AI in Education:

  • AI can serve as a 1:1 tutor for grammar and other subjects.
  • Helps generate initial drafts and frames for various educational materials, which can then be polished by educators.
  • Facilitates small group lessons and generates writing prompts.
  • Assists in creating curricula, such as “Communication in digital media,” and writing lesson plans.

3. Challenges and Risks:

  • Risk of errors and the need for careful verification of AI-generated content.
  • Potential for AI tools to not meet teachers’ expectations, leading to frustration and lack of adoption.
  • Concerns about students cheating with AI tools and the need to teach them responsible usage.
  • Biases inherent in AI tools, as they lack human emotions and may reflect underlying biases.

4. Teacher Training and Professional Development:

  • Importance of teacher familiarity with AI tools to effectively integrate them into instruction.
  • Creation of Professional Learning Communities (PLCs) focused on using AI to ease administrative tasks and improve teaching efficiency.
  • Emphasis on the need for pedagogical expertise and readiness when introducing AI tools.
  • Training on prompt engineering to ensure effective use of AI tools and achieving desired outcomes.

5. Student Interaction with AI:

  • Debate on allowing students to access and use AI tools.
  • Importance of teaching students how to use AI responsibly and critically evaluate AI outputs.
  • Exploration of AI tools to make learning more interactive and personalized for students.

6. Administrative and Cultural Considerations:

  • Necessity of administrators understanding and supporting the integration of AI in education.
  • Acknowledgment that AI adoption is influenced by the existing educational culture and instructional expertise.
  • Balancing the benefits of AI with concerns about cheating and ensuring ethical usage.

7. Bias and Ethical Considerations:

  • Awareness of biases in AI tools and the importance of teaching students to recognize and question these biases.
  • Reference to resources like the book “Unmasking AI” to understand the implications of AI in education.

8. Future Directions:

  • Continued development of PLCs on AI topics, including differentiating instruction with AI and student use of AI.
  • Focus on elective courses as a way to introduce AI without overwhelming teachers with content demands.
  • Emphasis on building efficiency in AI integration and ongoing professional development for teachers.
Framing Remarks – Rich Colosi (04:45)
Reporting Back – Yu Jung Han (02:17)

B. Addressing concerns about plagiarism/academic dishonesty

Many teachers have expressed serious concerns about allowing their students to use AI tools, for fear that this will lead to “cheating” and learning losses – yet the increased use of AI in the workplace makes us question what students really need to learn and how that learning should be assessed.

Framing questions

  • What are still common concerns and actual incidents related to “cheating”?
  • How can we encourage students’ academic honesty?
  • How can we rethink assessment so as to avoid/minimize these risks?

[Based on notes taken by Adma Gama-Krummel / Summary generated by ChatGPT – With appropriate safeguards taken]

Major Concerns and Actual Incidents:

1. Focus on Academic Honesty:

  • Shift from viewing AI use as academic dishonesty to fostering academic honesty and transparency.
  • Emphasis on preparing students to leverage AI appropriately rather than ignoring the potential for misuse.

2. Curriculum Adaptation:

  • Need for curriculum changes to incorporate AI use and address potential cheating.
  • Encouraging students to use AI as a tool while understanding the ethical implications.

3. Student Perspective on Cheating:

  • Understanding students’ views on what constitutes cheating with AI.
  • Development of policies to support appropriate AI use and engage students in discussions about AI.

4. Facilitating AI Interaction:

  • Importance of teaching students how to use AI responsibly before they leave school.
  • Changing assessment designs to reflect the capabilities and appropriate use of AI.

5. Consistency Among Teachers:

  • Need for consistent expectations and directions across teachers regarding AI use.

Encouraging Academic Honesty:

1. Maintaining Student Voice:

  • Concerns that AI might diminish students’ voices, even with tutor guidance.
  • Need for structured AI-student interaction tracks to reflect on learning and appropriate AI use.

2. AI as a Democratizing Tool:

  • AI can level the playing field by providing support that some students might otherwise lack.
  • AI is embedded in various platforms, making it accessible without visiting specific websites.

3. Changing Mindsets:

  • Historical context: technologies like spellcheck were once considered cheating.
  • Engaging students in conversations about what is permissible and why.

4. Digital Literacy and Equity:

  • Addressing the skill gap between students familiar with AI and those who are not.
  • Explaining the importance of developing critical skills without over-relying on AI.

Rethinking Assessment:

1. Encouraging Proper AI Use:

  • Designing questions and assignments that encourage students to use AI correctly.
  • Helping students understand the purpose of tasks and the skills they are meant to develop.

2. Philosophical Shift in Assessments:

  • Rethinking what it means to demonstrate skills and proficiency in the age of AI.
  • Moving beyond tool usage to focus on desired outcomes and how AI can aid in achieving them.

3. Bridging the Knowledge Gap:

  • Addressing the gap between administrators, leaders, and teachers in understanding AI tools.
  • Focusing on outcomes, tools to help students, and designing effective guidance and assignments.

4. Overcoming Fear and Resistance:

  • Addressing fears and resistance to change in mindset regarding AI use.
  • Encouraging a philosophical shift towards integrating AI in a meaningful and ethical manner.
Framing Remarks – Jim Czadzeck (08:33)
Reporting Back – April Luehmann (05:08)

C. Realizing the potential of AI to increase K-12 educators’ productivity

While much of the debate about AI in K-12 education has focused on how students may be using it, AI tools may also significantly affect K-12 educators’ future work practices and conditions – so it is also important to explore AI’s potential to help K-12 teachers and staff save time and provide better services to students, along with the associated risks.  

Framing questions

  • What AI tools are currently used, and how, to support K-12 educators’ own work? What are potential benefits and risks?
  • What are possible strategies to realize specific benefits, while minimizing related risks?

[Based on notes taken by Md Mamunur Rashid / Summary generated by ChatGPT – With appropriate safeguards taken]

Key Points Discussed:

1. Teacher Buy-In and Value Recognition:

  • Importance of teachers seeing the value of AI.
  • Overcoming resistance from experienced teachers who may lack trust in technology.

2. Administrative Work Reduction:

  • AI tools to minimize administrative tasks.
  • Examples include using AI for email management, professional learning packaging, and personalized progress reports.

3. AI Applications in Education:

  • Translation and transcription tools (e.g., GPT-4o, Propio, Bosis) for document translation and video transcription.
  • Magic School for streamlining tasks like email management and professional learning.
  • ChatGPT for creating transcripts, question generation, and grading assistance.

4. Curriculum and Lesson Planning:

  • Using AI for consistent curriculum development and efficient lesson planning.
  • Prompt engineering and execution of lesson plans to enhance student engagement.

5. Professional Development and Teacher Support:

  • Aligning AI use with PD teams and PLC time.
  • Providing guidance documents and examples of successful AI implementation.

6. Student Interaction with AI:

  • Concerns about students creating ChatGPT accounts and using AI tools outside school guidelines.
  • Sidekick app and AI assessment guides for managing student interactions with AI.

7. AI Tools and Applications:

  • Various AI tools mentioned include Microsoft RK8, Canva, and different apps for saving time.
  • Exploration of AI’s potential in creating interview questions, writing prompts, and grading assistance.

8. Challenges and Resistance:

  • Addressing the fear of new technology and the time required for teachers to adapt and play with AI tools.
  • Concerns about AI making teachers appear lazy and the need for leadership buy-in and understanding of AI.
Framing Remarks – Mike Newman (08:27)
Reporting Back – Raffaella Borasi (04:22)

D. Realizing the potential of AI to improve school/back office operations

AI tools also have the potential to help K-12 schools address current limitations in back-office operations (such as scheduling and cybersecurity efforts) and result in better systems and services – although some privacy and cybersecurity issues may need to be resolved before this potential is realized.

Framing questions

  • What AI tools are currently used, and how, to improve school/back-office operations? What are potential benefits and risks?
  • What are possible strategies to realize specific benefits, while minimizing associated risks?

[Based on notes taken by Dave Miller / Summary generated by ChatGPT – With appropriate safeguards taken]

Key Points Discussed:

1. Current AI Tools and Use Cases:

  • Introduction of AI tools for various administrative tasks such as scheduling, transportation, and communication.
  • Examples include Microsoft Co-Pilot for report card comments and INNIVE AI for data dashboards in Greece CSD.
  • AI tools are being explored for their potential to improve efficiency in master scheduling and transportation logistics.

2. Challenges and Concerns:

  • Security and privacy concerns related to student data, emphasizing the need for proper data “scrubbing” and training.
  • The risk of biases in AI outputs, necessitating careful oversight and human checks to ensure appropriateness and inclusiveness.
  • Potential for over-automation, which could overlook the human elements essential for certain tasks.

3. Supporting Administrators and Staff:

  • Importance of providing professional learning and support for administrators to effectively integrate AI tools.
  • Need for an administrator track in AI offerings to ensure they are equipped to support teachers and staff.
  • Training principals and administrators on how to leverage AI tools within existing workflows and district systems.

4. Vendor Relationships and Product Integration:

  • Importance of communication with vendors to ensure AI tools meet school policies, safety, and data requirements.
  • Vendors like SchoolTool integrating AI into their systems, highlighting the need for alignment with school needs and compliance standards.
  • Understanding how vendor applications are created and built to better utilize AI tools.

5. Operational Efficiencies and Improvements:

  • Examples of AI tools improving operations include automated scheduling with MS Bookings and creating scripts for Google Sheets.
  • Emphasis on transforming operational tasks to increase efficiency while maintaining a human-centric approach.
  • Exploration of how AI can streamline calendaring, minimize manual work, and provide actionable insights from aggregated data.

6. Professional Development and Training:

  • Creation of Professional Learning Communities (PLCs) focused on using AI for administrative tasks and improving job efficiency.
  • Training on prompt engineering and understanding AI capabilities to maximize benefits and minimize risks.
  • Need for ongoing education to build a basic vocabulary and understanding of business processes to effectively use AI.

7. Future Directions and Considerations:

  • Encouraging a mindset shift towards adopting AI tools and understanding their potential.
  • Balancing the desire for efficiency with the need to address human-level concerns and ensure ethical usage of AI.
  • Awareness of cost risks associated with AI applications, especially from major vendors like Microsoft and Google, and the need for competitive, sustainable solutions from smaller vendors.
Framing Remarks – Greg Baker (05:21)
Reporting Back – Gordy Baxter (05:01)

E. Professional learning on AI for K-12 educators

As AI is expected to play such a prominent role in future K-12 education, professional learning for all school personnel (from leaders to teachers to all staff), as well as students and their families, is needed to ensure that they can leverage AI in effective and ethical ways.  Yet the rapid advances in AI technology present unique challenges in providing the needed PD.

Framing questions

  • What is the current status of AI-related PL for K-12 educators?
  • What kind of knowledge about AI is most needed and by whom?
  • What kind of PL and other resources can be most helpful, and why?
  • What PLs and resources are already available, and could be leveraged?

[Based on notes taken by Hairong Shang-Butler / Summary generated by ChatGPT – With appropriate safeguards taken]

Key Points Discussed:

1. Teacher Engagement and Buy-In:

  • Importance of using AI tools personally and focusing on specific areas to gain teacher buy-in and save time for student interaction.
  • Addressing the challenge of getting teachers to learn new tools by starting small and experiencing success.
  • Embedding AI learning in faculty meetings to generate interest and discussions.

2. AI Tools and Resources:

  • Many options available, including consortium grants, AI-embedded lesson plans, and asynchronous professional development offerings.
  • Utilizing tools like MagicSchool for curriculum development, Co-Pilot for planning or brainstorming, and ChatGPT for various tasks.

3. Professional Development Strategies:

  • Offering new teacher orientation to increase awareness and elective PDs on district initiatives using AI.
  • Providing internal district sessions to help staff understand how to use AI as a time-saver.
  • Piloting programs and holding meetings with administrators, followed by differentiated and resource creation sessions.

4. Challenges and Solutions:

  • Struggles with getting teachers to attend and engage in AI learning sessions.
  • Encouraging teachers to experiment with AI tools by trying small tasks and experiencing success.
  • Overcoming fear and resistance by modeling AI use and showing practical applications.

5. Student Interaction and Ethics:

  • Exploring how to teach students about AI and its ethical implications.
  • Managing student use of AI tools and addressing concerns about cheating.
  • Encouraging student engagement with AI through projects and differentiated learning experiences.

6. Leadership and Systematic Implementation:

  • Presenting AI to leadership teams and exploring its applications for teachers, administrators, and operations.
  • Building capacity for AI training among teachers and framing AI as a time-saver.
  • Forming AI sub-committees to work with board members and administrators on PD initiatives.

7. Personalized Learning and Equity:

  • Highlighting AI’s role in personalized learning and addressing teachers’ fears about AI.
  • Emphasizing the need for equity by ensuring all students have access to AI tools and guidance.
  • Comparing AI’s impact on education to the internet revolution, stressing the importance of integration and comfort with AI.

8. Practical Applications and Examples:

  • Using AI for practical tasks such as changing reading levels for differentiation, data-crunching assessments, and facilitating creative ways to teach.
  • Highlighting successful examples and quotes from teachers who have benefited from AI integration.
  • Addressing high-level ethical issues and ensuring teachers can handle AI-relevant questions.
Framing Remarks – Lynn Girolamo (05:20)
Reporting Back – Zenon Borys (04:21)

F. Addressing the challenge of making AI-related policies 

There is currently a tension in K-12 schools about whether an AI policy is needed or even desirable, and how such a policy could be created given that AI technology is changing so rapidly.  Yet we cannot wait to make some important decisions about how AI tools could be used by students, teachers, staff and leaders.

Framing questions

  • Which policies have been created/modified – or need to be created/modified – to reflect the possible use of AI in K-12 schools?
  • What processes and guidance could support the needed policies?

[Based on notes taken by Karen DeAngelis & Patricia Vaughan Brogan / Summary generated by ChatGPT – With appropriate safeguards taken]

Key Points Discussed:

1. Creating Flexible AI Policies:

  • Aim to develop policies that are broad enough to adapt to changes while still providing clear guidance.
  • Encourage the use of AI without restricting its application, ensuring policies protect student data and privacy.

2. Policy Development and Guidance:

  • Need for guidance principles that integrate AI into existing district policies without limiting its use.
  • Developing policy recommendations for the Board that balance specificity with flexibility.
  • Consider whether a separate AI policy is necessary or if existing policies (e.g., internet usage, Code of Conduct) are sufficient.

3. Professional Learning and Training:

  • Emphasis on the need for professional learning to help teachers and administrators understand and effectively use AI.
  • Training on responsible use and privacy issues related to AI.
  • Ensuring staff are clear about acceptable uses of AI and providing specific guidance.

4. Stakeholder Involvement:

  • Importance of including a broad range of voices in policy discussions, including tech experts, legal advisors, staff, families, and community members.
  • Ensuring those involved in the conversation have a deep understanding of AI and its implications.

5. Integration with Mission and Vision:

  • Policies should align with the district’s mission, vision, and strategic plans.
  • Highlight the role of AI in preparing students for future careers and enhancing teaching productivity.

6. Balancing Benefits and Risks:

  • AI can improve operational efficiency and student learning but requires careful consideration of data privacy and ethical use.
  • Monitoring and evaluating AI tools to ensure they meet safety and privacy standards.

7. Practical Considerations and Implementation:

  • Existing policies such as the Code of Conduct and data privacy policies can be adapted to include AI tools.
  • Development of regulations or guidelines that allow for flexibility and responsible use of AI.
  • Providing AI literacy training for staff and students to ensure effective and safe use.

8. Future Directions and State Guidance:

  • Awaiting further guidance from state education departments to inform district-level policies.
  • Federal guidance exists and can provide a framework for developing local policies.
  • Ongoing discussions and updates to policies as AI tools and their applications evolve.
Framing Remarks – Christine Osadciw (08:24)
Reporting Back – Patricia Vaughan Brogan (03:42)

G. Addressing AI-related privacy & cybersecurity issues 

The use of AI tools in K-12 schools is presenting new questions and challenges with respect to ensuring the privacy of protected data and the school’s safety from cyberattacks. Finding solutions for these issues may indeed be the biggest barrier to providing access to AI tools in schools.

Framing questions

  • What key privacy/cybersecurity issues have been encountered/are expected? 
  • How can we address privacy issues and compatibility with Ed Law 2d?
  • How can we address potential cybersecurity threats?  

[Based on notes taken by Sharon Mason / Summary generated by ChatGPT – With appropriate safeguards taken]

Overarching Issues:

1. Training of Users:

  • Emphasis on continuous user training to mitigate risks associated with AI and cybersecurity.
  • Addressing the challenge of educating users on new threats and best practices.

2. Navigating the Political Landscape:

  • Understanding the climates and cultures of districts and groups to effectively implement AI policies.
  • Balancing political decisions with best practices in cybersecurity.

Theme Questions:

1. Key Issues Encountered/Expected:

  • Sophisticated phishing attacks and impersonation.
  • Concerns about users clicking on malicious links, especially at the start of the school year.

2. Compatibility with Ed Law 2d:

  • Some AI products fall into a grey area regarding data handling and compliance.
  • Need for clarity on where data is being harvested and stored.

3. Addressing Potential Threats:

  • Importance of user training and keeping systems up-to-date.
  • Understanding how existing systems protect against threats and the associated costs.

Key Concerns and Considerations:

1. Phishing and Impersonation:

  • Phishing campaigns targeting specific roles (e.g., teachers vs. administrators).
  • AI tools being used for remediation but struggling with outdated systems.

2. Training and Awareness:

  • Continuous need for user training to keep up with evolving threats.
  • Training challenges due to time constraints and lack of direct impact on users.

3. Ed Law 2d Compliance:

  • Concerns about AI systems pulling personally identifiable information (PII).
  • Specific tools turned off for students due to compliance uncertainties.

4. Human Factors and Political Influence:

  • Most attacks stem from human error, emphasizing the need for robust training programs.
  • Political decisions often influence tech policies, sometimes conflicting with best practices.

5. System and Network Security:

  • Use of AI in network infrastructure for threat detection.
  • Additional protections needed to secure networks and prevent impersonation.

6. Product Specificity and Cost:

  • Need for domain, law, and role-specific AI products.
  • Affordability of AI products for districts remains a concern.

7. Incident Response and Preparedness:

  • Importance of having a well-defined incident response plan.
  • Immediate actions, such as disabling certain services, can have significant implications.

8. Student Use and Education:

  • Students using AI to impersonate teachers and other malicious activities.
  • Importance of educating students on responsible AI use and digital literacy.

Noteworthy Observations:

1. User Training is Crucial:

  • Frequent clickers and human error are major concerns.
  • Structured and continuous training programs are necessary.

2. Balancing Security and Usability:

  • Strong security measures can backfire if not well-received by users.
  • Need for a balanced approach that considers user needs and best practices.

3. Ongoing Political and Cultural Challenges:

  • Political decisions can drive policy, sometimes conflicting with cybersecurity best practices.
  • Climate and culture of districts influence the effectiveness of AI-related policies.
Framing Remarks – Kelli Eckdahl (06:06)
Reporting Back – Sharon Mason & Kelli Eckdahl (05:37)