During a House Education Committee hearing on Tuesday, Rep. Ryan Mackenzie (R-PA) questioned Dr. Julia Rafal-Baer, the CEO of the ILO Group, about examining artificial intelligence systems used in the classroom.

Transcript
00:02 The representative from Pennsylvania, Mr. Mackenzie, is recognized for five minutes.
00:06 Thank you to the chairman, and I appreciate all of the testimony being provided today.
00:11 Very important topic. Integrating AI into our education system has huge potential benefit
00:17 for our students.
00:18 I think we're all aware of, even right now in its early stages, what it can do to increase
00:25 educational opportunity and customization for students.
00:29 So a lot of benefit there.
00:30 We are also talking about some of the elements that we need to be aware of to make sure that
00:36 we protect privacy and security for our students. Great topics.
00:41 But earlier we heard from a number of testifiers about bias that can be built into the AI systems
00:48 that are being incorporated into the classroom.
00:51 And I think we're all aware that depending on who nurtures or trains the AI, these systems are
00:56 going to end up putting out different results at the end of the day.
01:00 Ultimately, I think what we want for our students is a maximally truth-seeking AI.
01:06 We want the truth to be provided to our students.
01:10 So it shouldn't be a corporation, it shouldn't be a foreign entity that is maybe weighing
01:14 in on the results that are coming from AI.
01:17 It shouldn't even be a politician on this panel saying that we need to reweight results
01:22 so that we get the accurate outcome that we want for our students.
01:27 So how do we, and how should everybody, be thinking about what AI we are selecting?
01:34 What actual critiques or reviews of that AI are being made available publicly when we see
01:41 that bias or misinformation is being included in results?
01:44 And how does that get communicated to the decision makers in school districts or local
01:49 entities that are actually picking the AI systems?
01:52 And I'll go first to Dr. Rafal-Baer.
01:58 Thank you so much.
01:59 Incredibly thoughtful.
02:00 I think there are a couple of layers to that answer.
02:02 From our perspective, we think states having AI assurance labs is one place to be
02:08 able to examine this.
02:09 Having one place where the public has a level of transparency about what the model, on the
02:14 front end before it is approved for use, has gone through to be rigorously
02:19 tested.
02:20 But then, in an ongoing way, looking at the outputs of the model, knowing that these models
02:23 evolve over time as people are using them.
02:27 But I think you're also hitting on a larger topic that we've talked about a few times:
02:30 AGI.
02:32 What happens when we get to a place where we are working alongside models that are doing
02:37 things at the same level as humans, knowing that there are levels of bias within that?
02:42 And I think that's where the importance of critical thinking, and of helping our students
02:47 really understand ethics and ethical considerations, is such a critical
02:52 role of our K-12 system.
02:55 And to be able to make sure that we're bringing those conversations often into these community
02:59 engagement sessions and thinking about this.
03:02 Because we'll never get rid of bias, but we can help to make sure we're shaping a generation
03:07 of learners who deeply understand the power and importance of human relationships, and when
03:12 and where to use AI, under what conditions, and how to evaluate those outputs consistently.
03:18 I'll go to anybody else on the panel.
03:20 Do you guys have any thoughts, or are there resources available where people can go and
03:27 see which AI systems are, on the whole, actually providing maximally truth-seeking
03:36 information?
03:37 Because I think this is going to be a real challenge for everybody that is adopting this,
03:42 to have this kind of decision made in a classroom.
03:46 It's akin to selecting your curriculum or your textbooks.
03:50 But at the same time, there is a lot less information out there right now about these
03:54 AI systems.
03:55 So how do you as educators see that?
04:00 I think that's a fantastic question, and I want to reinforce what Dr. Rafal-Baer has
04:05 said.
04:07 Let me take a step back in answering that question.
04:08 I think making the analogy to textbooks is very important, because we're going to have
04:13 to think about adoption in the classroom of various platforms in the same way we think
04:17 about curricular adoptions of textbooks, both in finance models and in content models.
04:22 But in answer to your bigger question, this is where the importance of demystifying
04:27 these tools becomes critical.
04:29 Because if we're going to move into the critical-thinking position of having students be
04:34 able to evaluate the information that's put before them, they first have to understand
04:38 how those systems work, what an LLM does, where the data is pulled from, so that they
04:44 can begin those evaluative processes.
04:46 In other words, demystifying the function of a generative AI tool allows the student
04:53 to see the working model and understand: these are now the questions I have to ask.
04:59 As to resources for identifying which platforms are more viable than others, the first thing
05:05 we have to acknowledge is that what we need is more research and development for platforms
05:10 specifically designed for education.
05:13 We do a lot in education of retrofitting to technologies that already exist and have been
05:18 put in play in industry and other places.
05:21 We actually need platforms that are designed for education specifically.
05:25 This is why Mr. Chisholm's approach of having his own independent server and his own model
05:31 is a very forward-thinking model.
05:33 So I think the demystification process is critical.
05:36 Well, I will wrap up.
05:38 We are over time.
05:39 But I want to thank all of you again for your very thoughtful input today.
05:42 I think this is, again, an incredibly important topic as we look to advance education in our
05:47 country.
05:48 And I appreciate particularly that last point about demystifying the systems, because I'll
05:53 date myself very quickly here.
05:56 When I was growing up, scientific calculators were a new invention.
06:01 But at the same time, we were always trained to actually understand the fundamentals and
06:06 the processes before we used the scientific calculators to get the answer in a quicker
06:11 fashion.
06:12 So thank you again, and I yield back.
