Episode 50: AI in Education
On this Episode of Under the Hat, we celebrate episode 50 with Mark Henderson. Mark serves as the Director of Digital Initiatives at the Missouri School Boards’ Association, where he leads their AI Advisory Group and spearheaded the creation of The AI Toolkit for K-12 Education. Mark has presented on AI's impact on education at major conferences across the country, including in Chicago, Dallas, Kansas City, and Minneapolis. With a background as a high school English teacher and extensive experience in educational and corporate technology, Mark brings a unique blend of classroom insight and tech expertise to his work in advancing AI in schools.
Want to learn more from Mark? Here are the links:
00:00:00.816 --> 00:00:03.277
Welcome to the Under the Hat podcast,
00:00:03.738 --> 00:00:04.937
the show where we talk
00:00:04.958 --> 00:00:06.359
about everything and
00:00:06.698 --> 00:00:08.060
anything that is top of
00:00:08.119 --> 00:00:09.240
mind in education.
00:00:09.961 --> 00:00:11.141
The goal is to provide
00:00:11.201 --> 00:00:12.823
meaningful conversations to
00:00:12.843 --> 00:00:14.103
what is currently happening
00:00:14.183 --> 00:00:16.045
in our classrooms.
00:00:16.105 --> 00:00:19.567
Stories, pains, moments of success.
00:00:20.126 --> 00:00:21.128
We want to hear it all.
00:00:24.605 --> 00:00:26.385
Friends, welcome back to
00:00:26.666 --> 00:00:28.367
the Under the Hat podcast, and
00:00:28.407 --> 00:00:30.748
here we are at episode five
00:00:30.807 --> 00:00:32.908
zero, fifty. Sophie and I
00:00:32.927 --> 00:00:33.969
have been talking a lot
00:00:34.009 --> 00:00:35.869
about this really
00:00:35.950 --> 00:00:37.649
precious, special milestone
00:00:37.689 --> 00:00:38.710
for the show so we
00:00:39.091 --> 00:00:40.631
appreciate everybody that
00:00:40.692 --> 00:00:43.313
has been listening, watching,
00:00:43.713 --> 00:00:46.073
contributing to the show in some way.
00:00:48.015 --> 00:00:49.034
And I'm really excited.
00:00:49.215 --> 00:00:51.136
It's a beautiful day out
00:00:51.195 --> 00:00:52.597
here in California.
00:00:52.616 --> 00:00:53.536
It's very sunny.
00:00:53.737 --> 00:00:55.497
And Sophie actually earlier this week,
00:00:55.557 --> 00:00:57.899
I think, made a joke about, Steve,
00:00:57.918 --> 00:00:59.058
you cannot survive
00:00:59.378 --> 00:01:00.700
somewhere else in the
00:01:00.740 --> 00:01:03.140
country because I get cold quite easily.
00:01:03.720 --> 00:01:04.921
We're kind of talking about the snow.
00:01:05.302 --> 00:01:06.102
Sophie,
00:01:06.382 --> 00:01:08.522
how is the weather out in West Virginia?
00:01:08.543 --> 00:01:08.623
Yeah.
00:01:09.132 --> 00:01:11.493
It's actually in the upper fifties.
00:01:11.554 --> 00:01:12.314
Thank goodness.
00:01:12.394 --> 00:01:13.915
My oldest is already outside
00:01:13.954 --> 00:01:15.555
playing with a light jacket
00:01:15.715 --> 00:01:17.516
and I am thrilled that
00:01:17.537 --> 00:01:19.078
things are starting to get warmer,
00:01:19.197 --> 00:01:22.179
but snow is forecasted in the future.
00:01:22.259 --> 00:01:24.159
So this is like our false winter,
00:01:24.219 --> 00:01:26.001
false hope window we got
00:01:26.040 --> 00:01:26.682
going on right now.
00:01:28.501 --> 00:01:29.402
Nice.
00:01:29.701 --> 00:01:30.563
Nice.
00:01:30.602 --> 00:01:30.703
Yeah.
00:01:30.742 --> 00:01:34.043
Yeah.
00:01:34.063 --> 00:01:35.263
And maybe Mark will have a
00:01:35.343 --> 00:01:37.305
different aspect of what
00:01:37.364 --> 00:01:38.064
the weather looks like
00:01:38.084 --> 00:01:39.025
where he is because he's
00:01:39.064 --> 00:01:40.445
between Steve and me.
00:01:41.006 --> 00:01:43.066
Today's guest is Mark Henderson,
00:01:43.227 --> 00:01:44.206
who serves as the Director of
00:01:44.266 --> 00:01:45.567
digital initiatives at the
00:01:46.188 --> 00:01:48.268
Missouri's School Boards Association,
00:01:48.608 --> 00:01:50.129
where he leads their AI
00:01:50.149 --> 00:01:51.248
Advisory Group and
00:01:51.328 --> 00:01:54.049
spearheaded the creation of the AI
00:01:54.109 --> 00:01:55.790
Toolkit for K-12 Education.
00:01:56.471 --> 00:01:57.891
Mark has presented on AI's
00:01:57.971 --> 00:01:59.251
impact in education at
00:01:59.292 --> 00:02:01.132
major conferences across the country,
00:02:01.192 --> 00:02:03.194
including Chicago, Dallas, Kansas City,
00:02:03.394 --> 00:02:04.073
and Minneapolis.
00:02:04.313 --> 00:02:06.094
No, yeah, Minneapolis.
00:02:06.114 --> 00:02:07.956
That didn't make sense in my
00:02:07.975 --> 00:02:08.575
head for some reason.
00:02:09.036 --> 00:02:10.056
And with a background as a
00:02:10.097 --> 00:02:11.236
high school English teacher
00:02:11.417 --> 00:02:13.198
and extensive experience in
00:02:13.258 --> 00:02:15.538
educational and corporate technology,
00:02:15.618 --> 00:02:17.400
Mark brings a unique blend
00:02:17.439 --> 00:02:19.020
of classroom insight and
00:02:19.039 --> 00:02:20.760
tech experience to his work
00:02:20.800 --> 00:02:23.201
in advancing AI in schools.
00:02:24.901 --> 00:02:26.662
Welcome to the show, Mark.
00:02:28.302 --> 00:02:30.383
Thank you, that was a great
00:02:30.424 --> 00:02:32.185
introduction that ChatGPT
00:02:32.224 --> 00:02:32.865
helped me write about
00:02:32.905 --> 00:02:34.225
myself a few months ago.
00:02:34.246 --> 00:02:36.067
Right, I forgot about that. I
00:02:36.086 --> 00:02:37.426
was like, oh, I sound really
00:02:37.447 --> 00:02:41.468
impressive. If you hadn't
00:02:41.509 --> 00:02:42.449
provided that I would have
00:02:42.590 --> 00:02:44.350
used another AI to do
00:02:44.371 --> 00:02:45.290
that as well. I've done that
00:02:45.311 --> 00:02:46.551
with other guests. I've just
00:02:46.572 --> 00:02:47.992
plugged in stuff I've
00:02:48.032 --> 00:02:50.133
copied and pasted off of
00:02:50.213 --> 00:02:51.735
to use AI.
00:02:52.768 --> 00:02:54.951
It's stuff that's public on the web anyway,
00:02:54.991 --> 00:02:55.793
so I don't feel bad about
00:02:55.812 --> 00:02:57.495
putting it in AI.
00:02:57.895 --> 00:02:58.176
Sure.
00:02:58.197 --> 00:02:59.157
But then, yeah.
00:02:59.639 --> 00:03:01.921
So what is the weather like there?
00:03:02.562 --> 00:03:04.806
You want the weather report from Missouri,
00:03:04.847 --> 00:03:05.146
right?
00:03:05.568 --> 00:03:05.828
Yes.
00:03:07.236 --> 00:03:07.497
Yeah.
00:03:07.576 --> 00:03:10.860
So we have this saying in Missouri:
00:03:10.979 --> 00:03:11.700
if you don't like the weather,
00:03:11.741 --> 00:03:12.540
wait five minutes.
00:03:13.062 --> 00:03:14.502
I'm sure other states use that, too.
00:03:14.582 --> 00:03:17.466
But so it was a week ago it
00:03:17.485 --> 00:03:19.046
was it was below zero.
00:03:19.568 --> 00:03:20.588
And then we had like seventy
00:03:20.609 --> 00:03:21.710
five degrees this week and
00:03:21.750 --> 00:03:23.330
now it's down to like thirty today.
00:03:23.431 --> 00:03:25.573
So you just roll with it.
00:03:25.592 --> 00:03:27.514
Your body never knows what to expect.
00:03:27.835 --> 00:03:28.496
So, Steve,
00:03:28.536 --> 00:03:29.676
you would you would freeze here.
00:03:31.991 --> 00:03:33.171
That's why I can't survive.
00:03:33.671 --> 00:03:34.552
We have a friend that lives
00:03:34.592 --> 00:03:35.733
in Erie, and they literally
00:03:35.752 --> 00:03:37.552
had feet of snow in one
00:03:37.592 --> 00:03:39.473
snowstorm. And, well, Steve
00:03:39.493 --> 00:03:41.054
is, like, cold in sixty-
00:03:41.074 --> 00:03:41.854
degree weather, and we're
00:03:41.913 --> 00:03:42.774
like, Steve, you would just
00:03:42.854 --> 00:03:46.854
die in Erie. But anyway,
00:03:47.055 --> 00:03:48.634
Mark, we did just
00:03:48.655 --> 00:03:50.536
talk about how you are
00:03:50.695 --> 00:03:52.395
speaking, you're on an AI
00:03:52.415 --> 00:03:54.355
advisory group, you created
00:03:54.437 --> 00:03:56.616
the AI Toolkit for K-12 Education.
00:03:57.037 --> 00:03:58.500
But that's where you are now.
00:03:58.561 --> 00:03:59.401
How about you talk to us a
00:03:59.442 --> 00:04:01.326
little bit about your education journey,
00:04:01.366 --> 00:04:02.248
how you got here?
00:04:05.137 --> 00:04:06.057
It's funny, I have two
00:04:06.078 --> 00:04:07.618
teenage daughters and
00:04:07.639 --> 00:04:08.580
they're both at that age
00:04:08.620 --> 00:04:10.100
where at school they're
00:04:10.121 --> 00:04:10.842
like you got to kind of
00:04:10.861 --> 00:04:12.342
figure out a career path, a
00:04:12.402 --> 00:04:13.563
direction you want to go
00:04:13.623 --> 00:04:14.604
and it stresses them out.
00:04:15.004 --> 00:04:16.305
I'm like, I could have never
00:04:16.386 --> 00:04:17.346
guessed when I was in high
00:04:17.406 --> 00:04:19.788
school twenty years ago
00:04:20.189 --> 00:04:23.011
that I'd be going around
00:04:23.031 --> 00:04:24.012
the country talking about
00:04:24.132 --> 00:04:26.973
AI. That was not a thing. So
00:04:28.154 --> 00:04:29.154
I try to say you never know
00:04:29.194 --> 00:04:29.716
where you'll end up.
00:04:29.755 --> 00:04:30.675
Just choose something that
00:04:30.735 --> 00:04:32.355
interests you and go that direction,
00:04:32.396 --> 00:04:33.857
because I never would have
00:04:33.877 --> 00:04:34.877
thought this would be my life.
00:04:34.956 --> 00:04:38.057
So, Steve, I heard you say on a podcast,
00:04:38.077 --> 00:04:39.519
I can't remember which one,
00:04:39.538 --> 00:04:40.399
but one of your episodes
00:04:40.439 --> 00:04:41.418
that there's kind of two
00:04:41.439 --> 00:04:42.178
kinds of teachers,
00:04:42.519 --> 00:04:43.100
the ones that were like
00:04:43.139 --> 00:04:44.139
really good in school and
00:04:44.160 --> 00:04:45.641
the ones that didn't like school.
00:04:46.581 --> 00:04:47.380
I was certainly the
00:04:47.600 --> 00:04:48.701
latter as a student.
00:04:50.341 --> 00:04:50.942
I was smart,
00:04:50.963 --> 00:04:53.084
but I was just bored most of the time.
00:04:53.665 --> 00:04:53.904
You know,
00:04:53.925 --> 00:04:54.646
I was the kid who would raise
00:04:54.665 --> 00:04:55.266
their hand and go like,
00:04:55.485 --> 00:04:56.586
why do we have to know this?
00:04:56.627 --> 00:04:57.728
Why do we have to learn this?
00:04:57.887 --> 00:04:59.550
You know, teachers love that.
00:05:00.730 --> 00:05:02.492
I got that as a teacher.
00:05:02.552 --> 00:05:03.932
I got the karma that,
00:05:04.213 --> 00:05:05.093
that I put in the world.
00:05:05.353 --> 00:05:06.956
You know, I wasn't a bad kid.
00:05:06.975 --> 00:05:08.536
I didn't get in trouble, but I just wasn't,
00:05:08.617 --> 00:05:09.557
I didn't feel challenged
00:05:09.617 --> 00:05:10.298
most of the time.
00:05:12.319 --> 00:05:13.440
And that kind of motivated
00:05:13.480 --> 00:05:14.901
me when I was in college,
00:05:15.021 --> 00:05:17.064
when I had to choose a career, like,
00:05:17.767 --> 00:05:18.689
someone pointed out, like,
00:05:20.009 --> 00:05:21.350
you're good at explaining things.
00:05:21.370 --> 00:05:22.531
You should try being a teacher.
00:05:22.572 --> 00:05:23.252
You should consider it.
00:05:23.293 --> 00:05:24.233
And I thought, well,
00:05:24.254 --> 00:05:25.595
I know there are a lot of
00:05:25.634 --> 00:05:26.314
things not to do.
00:05:26.355 --> 00:05:27.216
I would do it differently.
00:05:27.456 --> 00:05:29.057
So that kind of motivated me
00:05:29.117 --> 00:05:30.158
to go in that direction.
00:05:31.800 --> 00:05:33.961
So I became a high school English teacher.
00:05:37.024 --> 00:05:40.168
And I did that for
00:05:41.916 --> 00:05:44.158
a few years, good experiences,
00:05:45.418 --> 00:05:46.540
that side of things.
00:05:47.540 --> 00:05:48.560
So I really got into looking
00:05:48.620 --> 00:05:50.302
at ed tech and things like that.
00:05:50.862 --> 00:05:51.663
And while I was teaching,
00:05:51.702 --> 00:05:53.103
I got my master's degree in
00:05:53.564 --> 00:05:55.026
instructional design,
00:05:56.447 --> 00:05:57.848
worked at a college for a while,
00:05:57.867 --> 00:05:59.728
went into online learning, that sort of thing,
00:05:59.749 --> 00:06:00.910
putting classes together.
00:06:01.730 --> 00:06:02.569
And then I had my,
00:06:02.930 --> 00:06:03.471
I think they call it
00:06:03.531 --> 00:06:05.350
Rumspringa in the Amish world,
00:06:05.370 --> 00:06:06.490
where you kind of, you know,
00:06:06.512 --> 00:06:07.471
when the Amish people get
00:06:07.512 --> 00:06:08.632
to go out in the world and
00:06:09.492 --> 00:06:10.273
see if they like it,
00:06:10.293 --> 00:06:11.233
and then they have to decide.
00:06:11.252 --> 00:06:13.494
That was kind of my career Rumspringa:
00:06:13.994 --> 00:06:14.374
I was like,
00:06:14.454 --> 00:06:16.454
I'm going to try the corporate world.
00:06:16.475 --> 00:06:18.035
I'm going to leave education for a while.
00:06:18.475 --> 00:06:19.596
And I got a job at a bank
00:06:20.115 --> 00:06:21.375
working and doing
00:06:21.396 --> 00:06:23.216
instructional design and training there.
00:06:24.076 --> 00:06:25.317
And I did it for about five years.
00:06:25.396 --> 00:06:27.658
I learned, no, I don't want to do this.
00:06:27.778 --> 00:06:28.677
I miss education.
00:06:29.678 --> 00:06:30.319
I learned a lot.
00:06:32.305 --> 00:06:34.807
And it applies now to what I do.
00:06:36.189 --> 00:06:38.610
Just the corporate culture just wasn't,
00:06:39.091 --> 00:06:41.254
I just never really blended in.
00:06:41.713 --> 00:06:45.096
So back in twenty twenty one,
00:06:45.838 --> 00:06:47.238
I just kind of found the
00:06:47.278 --> 00:06:49.920
perfect job for me right here in Columbia,
00:06:49.940 --> 00:06:50.262
Missouri,
00:06:50.281 --> 00:06:51.201
with the Missouri School Boards
00:06:51.221 --> 00:06:51.983
Association.
00:06:52.923 --> 00:06:54.485
I started out and I still do
00:06:54.545 --> 00:06:55.805
instructional design for them.
00:06:56.666 --> 00:06:59.048
But in the last couple of years,
00:06:59.088 --> 00:07:01.151
a lot of my job has been about AI,
00:07:01.971 --> 00:07:03.593
reading about it, learning about it,
00:07:03.913 --> 00:07:05.053
talking to people about it,
00:07:05.353 --> 00:07:07.315
going to conferences, presenting.
00:07:08.776 --> 00:07:08.896
Yeah,
00:07:08.937 --> 00:07:10.177
I never thought this would be my life.
00:07:10.218 --> 00:07:11.358
I don't like it when people
00:07:11.399 --> 00:07:12.259
call me an AI expert
00:07:12.279 --> 00:07:13.321
because I don't know that
00:07:13.341 --> 00:07:14.442
anybody's an AI expert,
00:07:14.461 --> 00:07:15.762
but I'm certainly not.
00:07:16.362 --> 00:07:17.964
I might know more than the average person,
00:07:18.024 --> 00:07:20.247
but yeah, it's cool.
00:07:20.387 --> 00:07:21.807
It's great to be where I am now.
00:07:24.887 --> 00:07:27.995
I find it fascinating just
00:07:28.074 --> 00:07:29.738
listening to your story of
00:07:30.680 --> 00:07:31.863
being in education
00:07:33.007 --> 00:07:35.007
and then going and taking
00:07:35.047 --> 00:07:36.327
that knowledge and applying
00:07:36.387 --> 00:07:38.028
it to industry in some way
00:07:38.668 --> 00:07:40.228
even if it wasn't like your
00:07:40.288 --> 00:07:42.968
cup of tea. Like,
00:07:43.009 --> 00:07:43.968
these types of
00:07:44.069 --> 00:07:45.750
stories I don't hear often.
00:07:46.089 --> 00:07:47.170
I've heard it maybe
00:07:47.230 --> 00:07:48.410
three times over the last
00:07:48.470 --> 00:07:49.810
few years, but it's
00:07:49.829 --> 00:07:50.971
something that happens
00:07:51.050 --> 00:07:52.511
right? And there's a
00:07:52.571 --> 00:07:53.971
market for that, like when
00:07:54.011 --> 00:07:55.350
I'm on LinkedIn and kind of
00:07:55.391 --> 00:07:57.711
seeing those types of opportunities
00:07:58.959 --> 00:08:01.180
Absolutely. You know, I
00:08:01.220 --> 00:08:02.440
worked with a lot of people
00:08:02.942 --> 00:08:03.961
when I worked for the bank,
00:08:04.281 --> 00:08:05.262
very smart, very
00:08:05.302 --> 00:08:06.423
knowledgeable about banking
00:08:06.463 --> 00:08:07.343
and customer service and
00:08:07.363 --> 00:08:08.785
all these things. But they
00:08:08.805 --> 00:08:09.646
couldn't teach. They
00:08:09.665 --> 00:08:10.985
couldn't, you know, they
00:08:11.026 --> 00:08:12.086
didn't have those skills
00:08:12.107 --> 00:08:13.887
that background. So in
00:08:13.908 --> 00:08:15.870
addition to creating online
00:08:16.009 --> 00:08:17.750
materials for bankers, like,
00:08:18.290 --> 00:08:20.052
here's how you use this program,
00:08:20.093 --> 00:08:20.892
or here's how you talk to
00:08:20.932 --> 00:08:22.053
customers about this.
00:08:22.595 --> 00:08:24.435
A lot of it was working with
00:08:24.495 --> 00:08:25.476
the other leaders, like,
00:08:25.718 --> 00:08:27.059
here's how you give a good presentation.
00:08:27.098 --> 00:08:28.800
Here's how you stay engaged
00:08:28.860 --> 00:08:30.322
and keep your audience's
00:08:30.362 --> 00:08:31.702
attention and things like that,
00:08:31.723 --> 00:08:33.464
because they just came from
00:08:33.484 --> 00:08:34.544
a different world than I did.
00:08:35.986 --> 00:08:36.326
Yeah.
00:08:36.427 --> 00:08:37.748
Don't we all have to take
00:08:37.788 --> 00:08:42.192
that speech class they just call Speech?
00:08:42.251 --> 00:08:43.692
For some people it sticks more than others,
00:08:43.732 --> 00:08:44.053
I guess.
00:08:45.611 --> 00:08:46.152
Yeah.
00:08:46.192 --> 00:08:46.351
Yeah.
00:08:46.432 --> 00:08:48.092
So in terms of content,
00:08:48.192 --> 00:08:49.273
it was like I'd imagine
00:08:49.312 --> 00:08:51.214
like a wide range of things
00:08:51.234 --> 00:08:54.575
that you needed to design.
00:08:54.634 --> 00:08:55.835
It was totally foreign to me.
00:08:55.875 --> 00:08:57.716
It was like if I was, you know,
00:08:57.956 --> 00:08:59.876
teaching a French class or
00:08:59.897 --> 00:09:01.437
something and I don't speak French,
00:09:01.496 --> 00:09:03.437
like I learned a lot about
00:09:03.798 --> 00:09:05.859
banking and regulations and
00:09:05.979 --> 00:09:07.239
what you can do and can't do
00:09:07.259 --> 00:09:08.719
and what you say and don't say.
00:09:08.798 --> 00:09:11.039
And it was interesting.
00:09:11.100 --> 00:09:11.720
I think a lot of it's
00:09:11.740 --> 00:09:13.240
probably left my brain by this point.
00:09:15.111 --> 00:09:17.413
because it's been replaced by AI stuff.
00:09:18.355 --> 00:09:18.895
But it was really
00:09:18.916 --> 00:09:21.038
fascinating just to try
00:09:21.077 --> 00:09:22.139
something totally different
00:09:22.200 --> 00:09:22.980
where you're just like,
00:09:23.140 --> 00:09:25.222
I don't know what any of this means.
00:09:26.063 --> 00:09:27.024
I know how to make a course.
00:09:27.205 --> 00:09:28.326
I know how to make a presentation.
00:09:28.366 --> 00:09:29.888
But when it comes to the content,
00:09:30.649 --> 00:09:31.409
I have no clue.
00:09:32.441 --> 00:09:33.900
Yeah, yeah.
00:09:34.000 --> 00:09:35.481
No, that's certainly fascinating.
00:09:36.981 --> 00:09:39.062
We can dive more into the
00:09:39.222 --> 00:09:42.082
AI components of this pod,
00:09:42.163 --> 00:09:42.984
of this episode.
00:09:44.403 --> 00:09:46.683
One thing that I know that, Mark,
00:09:46.724 --> 00:09:47.705
like something that you
00:09:47.804 --> 00:09:50.325
speak on often and kind of
00:09:50.345 --> 00:09:52.245
address is like just the
00:09:52.285 --> 00:09:54.066
misconceptions that are out
00:09:54.105 --> 00:09:55.746
there around AI.
00:09:56.067 --> 00:09:58.126
I'm really keen to hear what
00:09:58.187 --> 00:10:00.187
those are and what you have
00:10:00.227 --> 00:10:01.048
to say about them.
00:10:03.736 --> 00:10:03.996
Yeah.
00:10:04.096 --> 00:10:05.197
So, you know,
00:10:05.238 --> 00:10:06.298
David Letterman was famous
00:10:06.318 --> 00:10:07.340
for his top ten list.
00:10:07.419 --> 00:10:08.921
I just have a top five list
00:10:08.941 --> 00:10:11.263
because that's enough.
00:10:11.322 --> 00:10:11.982
But, you know,
00:10:12.323 --> 00:10:13.124
the last couple of years
00:10:13.183 --> 00:10:14.465
I've talked to so many
00:10:14.504 --> 00:10:16.167
people and read so many
00:10:16.226 --> 00:10:17.408
things and there just seem
00:10:17.427 --> 00:10:18.567
to be some common
00:10:18.607 --> 00:10:21.250
misconceptions that I hear a lot.
00:10:22.464 --> 00:10:24.125
And I always try to put
00:10:24.166 --> 00:10:25.006
those out there to help
00:10:25.047 --> 00:10:26.028
people understand better.
00:10:26.087 --> 00:10:29.211
So number one is that people
00:10:29.251 --> 00:10:30.452
think that AI is new.
00:10:31.053 --> 00:10:33.176
They say, oh, it's this shiny new thing.
00:10:33.196 --> 00:10:34.697
But AI has been around since
00:10:34.717 --> 00:10:35.639
the nineteen fifties.
00:10:35.719 --> 00:10:37.019
You know, it's not new.
00:10:38.182 --> 00:10:39.643
It just seems new because of
00:10:39.702 --> 00:10:41.985
generative AI programs like ChatGPT,
00:10:42.005 --> 00:10:42.506
which are new.
00:10:43.006 --> 00:10:44.368
But AI has been around forever,
00:10:44.408 --> 00:10:45.729
and we use it all the time, right?
00:10:45.769 --> 00:10:48.611
Siri and Alexa and GPS and
00:10:48.652 --> 00:10:50.873
predictive text and things like that.
00:10:51.333 --> 00:10:53.315
We have a little quiz here for you guys.
00:10:54.096 --> 00:10:55.177
What year do you think the
00:10:55.236 --> 00:10:59.740
first chatbot went on?
00:10:59.760 --> 00:11:04.384
The first chatbot went live?
00:11:04.403 --> 00:11:05.764
I'm going to say...
00:11:05.926 --> 00:11:06.625
I'm guessing.
00:11:08.067 --> 00:11:09.368
Nineteen seventy one.
00:11:09.548 --> 00:11:09.628
OK.
00:11:14.567 --> 00:11:16.908
Sophie, the first AI chatbot, um...
00:11:18.070 --> 00:11:19.169
I'm thinking back of when, like,
00:11:19.210 --> 00:11:20.551
the Internet was even given
00:11:20.591 --> 00:11:21.611
to me in school.
00:11:21.711 --> 00:11:26.515
So I'm gonna say later than Steve,
00:11:26.615 --> 00:11:29.857
but so, like, two thousand six.
00:11:31.979 --> 00:11:33.961
Yeah.
00:11:37.360 --> 00:11:41.004
Well, Steve's closer.
00:11:41.245 --> 00:11:42.886
Nineteen sixty-six, actually, was the first AI
00:11:42.907 --> 00:11:44.187
chatbot, if you can believe
00:11:44.227 --> 00:11:45.870
that. And the first thing,
00:11:45.909 --> 00:11:46.750
like Sophie, you're thinking,
00:11:46.770 --> 00:11:47.530
well, the internet didn't
00:11:47.551 --> 00:11:48.613
really exist yet, how is
00:11:48.633 --> 00:11:50.033
there an AI chatbot? It's
00:11:50.394 --> 00:11:51.755
because it was only on one computer
00:11:52.996 --> 00:11:55.518
You couldn't access it from a network.
00:11:56.337 --> 00:11:57.418
It was called ELIZA,
00:11:57.597 --> 00:11:58.097
which I think is
00:11:58.138 --> 00:11:59.038
interesting because we have
00:11:59.078 --> 00:12:00.879
this personification now,
00:12:01.158 --> 00:12:03.039
Siri and Alexa giving human
00:12:03.080 --> 00:12:05.200
names to these robots.
00:12:05.919 --> 00:12:07.321
It was called ELIZA and it
00:12:07.360 --> 00:12:09.020
was supposed to be a psychotherapist.
00:12:09.081 --> 00:12:10.621
It didn't do a very good job,
00:12:11.022 --> 00:12:11.942
but you could make an
00:12:11.981 --> 00:12:13.201
appointment and you could
00:12:13.261 --> 00:12:15.802
come type and talk to ELIZA
00:12:16.602 --> 00:12:17.562
and she would give you
00:12:17.582 --> 00:12:18.703
advice and listen to you.
00:12:20.456 --> 00:12:21.557
So yeah, AI is not new.
00:12:21.618 --> 00:12:23.039
That's my misconception number one.
00:12:23.059 --> 00:12:24.581
Number two is that there's
00:12:24.600 --> 00:12:26.302
this kind of debate about is AI good?
00:12:26.342 --> 00:12:27.303
Is AI bad?
00:12:27.543 --> 00:12:28.904
Sometimes is AI evil?
00:12:30.547 --> 00:12:32.869
Neither is true.
00:12:33.028 --> 00:12:34.730
AI is just a tool.
00:12:36.590 --> 00:12:38.630
I love to compare it to a hammer.
00:12:38.650 --> 00:12:39.611
It's a tool.
00:12:40.032 --> 00:12:41.653
People can do good things
00:12:41.673 --> 00:12:42.833
and bad things with a hammer.
00:12:42.874 --> 00:12:43.914
People can do good things
00:12:43.955 --> 00:12:45.174
and bad things with AI.
00:12:45.676 --> 00:12:47.777
But AI is just a tool.
00:12:47.897 --> 00:12:52.299
So what you do with it is what matters.
00:12:52.340 --> 00:12:54.201
Number three is this fear
00:12:54.221 --> 00:12:55.902
that robots are taking over the world.
00:12:57.875 --> 00:12:58.134
You know,
00:12:58.154 --> 00:12:59.916
a lot of the early knowledge that
00:13:00.196 --> 00:13:01.937
people get about AI is from
00:13:02.057 --> 00:13:02.817
science fiction.
00:13:03.498 --> 00:13:05.599
The key word I would say is fiction,
00:13:05.958 --> 00:13:06.279
right?
00:13:06.720 --> 00:13:09.441
So The Terminator and I, Robot
00:13:09.500 --> 00:13:11.423
and AI and movies like that,
00:13:11.783 --> 00:13:12.403
they're not real.
00:13:12.923 --> 00:13:13.703
And the stuff that happens
00:13:13.744 --> 00:13:14.504
in those movies isn't
00:13:14.524 --> 00:13:15.445
happening in real life.
00:13:15.904 --> 00:13:17.365
Humans are still in charge
00:13:17.966 --> 00:13:19.466
of the AI right now.
00:13:19.787 --> 00:13:21.528
So anything that it does,
00:13:21.587 --> 00:13:22.889
it's because a human taught
00:13:22.928 --> 00:13:24.429
it to do that or told it to do that.
00:13:24.809 --> 00:13:26.610
These robots, these AI programs,
00:13:26.630 --> 00:13:30.293
they can't think and act on their own.
00:13:30.352 --> 00:13:30.933
Number three,
00:13:31.254 --> 00:13:31.974
you guys, feel free to
00:13:31.994 --> 00:13:32.934
interrupt at any time if
00:13:32.975 --> 00:13:34.576
you want to say anything,
00:13:34.596 --> 00:13:35.076
I'm just going to.
00:13:35.355 --> 00:13:35.917
I got two more.
00:13:37.418 --> 00:13:38.077
Or sorry.
00:13:38.118 --> 00:13:38.618
Yeah, two.
00:13:38.878 --> 00:13:40.318
Number four is this fear
00:13:40.339 --> 00:13:40.860
that we're all going to
00:13:40.879 --> 00:13:42.841
lose our jobs to AI.
00:13:42.860 --> 00:13:43.620
That's a real fear.
00:13:45.602 --> 00:13:47.823
Not true, the experts say.
00:13:47.865 --> 00:13:48.884
The University of
00:13:48.924 --> 00:13:50.826
Pennsylvania put out a report that said,
00:13:50.926 --> 00:13:52.126
eighty percent of American
00:13:52.167 --> 00:13:54.328
jobs will be affected by AI
00:13:54.548 --> 00:13:55.809
in the next five years.
00:13:55.970 --> 00:13:56.509
Affected.
00:13:56.610 --> 00:13:58.230
So some jobs will go away.
00:13:58.691 --> 00:13:59.932
There'll be some new jobs.
00:14:00.172 --> 00:14:00.932
But a lot of us,
00:14:01.493 --> 00:14:03.374
what we do and how we
00:14:03.394 --> 00:14:04.955
interact with technology is
00:14:04.995 --> 00:14:05.875
going to change.
00:14:06.615 --> 00:14:07.777
And certainly if you want to
00:14:07.797 --> 00:14:08.717
be one of those people who
00:14:08.758 --> 00:14:09.879
gets to keep your job,
00:14:10.339 --> 00:14:11.620
you need to embrace change.
00:14:11.940 --> 00:14:14.380
You need to be open to AI,
00:14:14.581 --> 00:14:15.241
things like that.
00:14:17.389 --> 00:14:19.610
There's an author I love named Kevin Roose,
00:14:20.412 --> 00:14:21.611
R-O-O-S-E.
00:14:22.212 --> 00:14:23.552
He writes a lot of good stuff online,
00:14:23.572 --> 00:14:24.153
but he also,
00:14:24.333 --> 00:14:26.014
he wrote this book called Futureproof:
00:14:26.595 --> 00:14:27.596
Nine Rules for Humans in
00:14:27.635 --> 00:14:28.775
the Age of Automation.
00:14:29.756 --> 00:14:30.317
It's a great book.
00:14:30.336 --> 00:14:31.337
I don't know if you can see it there.
00:14:32.337 --> 00:14:33.479
Just a free plug.
00:14:35.620 --> 00:14:37.302
And he has just such
00:14:37.341 --> 00:14:38.481
great practical things in
00:14:38.501 --> 00:14:40.123
this book about technology
00:14:40.163 --> 00:14:41.903
and where we're going and
00:14:41.964 --> 00:14:43.443
he tells the story about
00:14:43.484 --> 00:14:45.044
when electricity became
00:14:45.325 --> 00:14:46.846
common. They used to have
00:14:46.885 --> 00:14:48.105
these people who were oil
00:14:48.125 --> 00:14:49.287
lamplighters, and when the
00:14:49.326 --> 00:14:50.927
sun went down, their job was
00:14:50.967 --> 00:14:52.648
to go around town and light
00:14:52.687 --> 00:14:54.308
these oil lamps so that
00:14:54.328 --> 00:14:56.110
people could see. Well, then
00:14:56.149 --> 00:14:57.951
electricity comes along and
00:14:57.971 --> 00:14:59.431
they don't need those jobs
00:14:59.471 --> 00:15:01.532
anymore, because, you know,
00:15:01.552 --> 00:15:02.793
the lights come on by themselves
00:15:03.513 --> 00:15:05.813
So what happened to those guys?
00:15:06.075 --> 00:15:08.096
They got jobs at the power company.
00:15:08.456 --> 00:15:09.537
They got different jobs,
00:15:10.157 --> 00:15:12.577
the jobs went away, but the people changed.
00:15:12.597 --> 00:15:13.538
So it's not like they were
00:15:13.999 --> 00:15:14.740
out on the street.
00:15:15.080 --> 00:15:16.660
They just found a new use for them.
00:15:16.821 --> 00:15:18.542
So you kind of think about
00:15:18.642 --> 00:15:19.642
AI in the same way.
00:15:20.003 --> 00:15:20.883
Some of the jobs that might
00:15:20.903 --> 00:15:22.624
be going away will be
00:15:22.683 --> 00:15:25.645
replaced by some new ones.
00:15:25.785 --> 00:15:27.746
And number five is- Sorry,
00:15:27.767 --> 00:15:28.768
I just want to stop you there.
00:15:28.927 --> 00:15:31.109
There's actually like a cartoon where like
00:15:32.393 --> 00:15:33.994
It was from my childhood and
00:15:34.033 --> 00:15:34.894
it was a factory.
00:15:35.313 --> 00:15:37.794
And this man was putting the
00:15:38.035 --> 00:15:40.996
lids on tubes of toothpaste, and
00:15:41.136 --> 00:15:42.057
then they brought in a
00:15:42.136 --> 00:15:43.398
robot to replace him.
00:15:43.918 --> 00:15:45.099
But then he just ended up
00:15:45.139 --> 00:15:46.379
being the person working on
00:15:46.578 --> 00:15:49.259
and fixing the robot.
00:15:49.360 --> 00:15:50.860
It's not that it would replace him.
00:15:50.900 --> 00:15:52.701
He just ended up with a different job.
00:15:52.822 --> 00:15:54.322
Yeah, exactly.
00:15:54.623 --> 00:15:55.602
And that's a lot of what
00:15:55.623 --> 00:15:57.403
we're going to see with AI as well.
00:15:58.798 --> 00:16:00.259
So misconception five is
00:16:00.299 --> 00:16:02.000
that it's easy to spot when
00:16:02.041 --> 00:16:03.302
something is AI generated,
00:16:03.381 --> 00:16:05.903
either writing or images or video.
00:16:07.524 --> 00:16:09.306
Most people think, I'm smart
00:16:09.346 --> 00:16:10.326
enough to tell the difference.
00:16:10.427 --> 00:16:12.107
I could tell if something was fake or not.
00:16:12.668 --> 00:16:14.568
But they've done studies and
00:16:14.629 --> 00:16:15.610
the majority of Americans
00:16:15.649 --> 00:16:16.590
can't tell the difference
00:16:16.629 --> 00:16:18.812
between real human content
00:16:18.871 --> 00:16:20.332
and AI generated content.
00:16:21.133 --> 00:16:22.293
And I got to tell you,
00:16:22.312 --> 00:16:23.994
in the two years I've been doing this,
00:16:24.754 --> 00:16:25.854
it's gotten so much harder
00:16:25.894 --> 00:16:26.914
because I used to be able
00:16:26.934 --> 00:16:28.294
to talk about like, you know,
00:16:28.335 --> 00:16:29.174
count the fingers.
00:16:29.195 --> 00:16:30.395
Do they have six fingers?
00:16:30.456 --> 00:16:31.696
Oh, it's probably not real.
00:16:32.375 --> 00:16:33.817
Are there like blurry spots
00:16:33.856 --> 00:16:34.517
in the background?
00:16:34.557 --> 00:16:35.256
Stuff like that.
00:16:35.677 --> 00:16:36.577
You can't really rely on
00:16:36.618 --> 00:16:37.378
that stuff anymore.
00:16:38.118 --> 00:16:40.400
It's gotten so good and so convincing,
00:16:40.721 --> 00:16:41.881
and it's only going to get harder.
00:16:42.503 --> 00:16:43.744
So really,
00:16:43.984 --> 00:16:45.745
we have to educate ourselves and
00:16:45.826 --> 00:16:49.190
others about manipulation of AI.
00:16:49.750 --> 00:16:52.373
And the key is really critical thinking.
00:16:52.472 --> 00:16:53.153
And I think, Sophie,
00:16:53.173 --> 00:16:53.955
you mentioned before we
00:16:53.975 --> 00:16:55.616
went live is the younger
00:16:55.657 --> 00:16:57.278
people are used to these kinds of things.
00:16:57.318 --> 00:16:58.220
They grew up with it.
00:16:59.062 --> 00:17:00.263
But the older you get,
00:17:00.302 --> 00:17:01.144
you haven't really been
00:17:01.303 --> 00:17:02.684
exposed to this sort of thing.
00:17:02.784 --> 00:17:05.007
So it's important to talk to our parents,
00:17:05.027 --> 00:17:06.867
our grandparents about these things,
00:17:07.288 --> 00:17:09.109
because they can get scammed.
00:17:09.150 --> 00:17:10.611
They can fall for, you know,
00:17:10.711 --> 00:17:12.813
fake information, things like that.
00:17:13.853 --> 00:17:14.574
They're not getting the
00:17:14.614 --> 00:17:15.994
exposure that the
00:17:16.015 --> 00:17:16.756
young people are getting.
00:17:16.955 --> 00:17:18.017
My kids are really cynical
00:17:18.037 --> 00:17:18.856
about stuff online.
00:17:18.876 --> 00:17:20.719
They're like, that's probably not real.
00:17:21.179 --> 00:17:22.759
Like they're just that's just their brain.
00:17:24.121 --> 00:17:25.021
But the older you go,
00:17:25.041 --> 00:17:25.903
you're not used
00:17:25.942 --> 00:17:27.544
to that way because you
00:17:27.564 --> 00:17:28.625
didn't grow up in that world.
00:17:30.530 --> 00:17:31.211
so like what you're
00:17:31.230 --> 00:17:32.551
describing here like to me
00:17:32.571 --> 00:17:33.712
for me personally this is
00:17:33.772 --> 00:17:35.913
like the scariest part
00:17:35.933 --> 00:17:38.796
about it right is like how
00:17:38.935 --> 00:17:40.896
good it's getting and the
00:17:40.977 --> 00:17:41.978
fact that it can be
00:17:42.198 --> 00:17:44.338
anywhere referencing
00:17:44.539 --> 00:17:49.563
anything yeah um and for a world that has
00:17:50.421 --> 00:17:51.381
a lot of, like, what you mentioned
00:17:51.421 --> 00:17:52.682
like we're kind of in
00:17:52.741 --> 00:17:54.541
this point and I guess like
00:17:54.761 --> 00:17:55.982
human history where like
00:17:56.103 --> 00:17:58.103
some of us remember what
00:17:58.163 --> 00:18:00.423
it's like to get dialed up
00:18:00.604 --> 00:18:02.525
internet for the first time
00:18:02.545 --> 00:18:05.185
right well some of us like
00:18:05.266 --> 00:18:07.665
we're like, the iPhone,
00:18:07.705 --> 00:18:09.067
the smartphone is just it's
00:18:09.207 --> 00:18:11.027
always been there so like
00:18:11.086 --> 00:18:12.907
we really are kind of at an
00:18:14.607 --> 00:18:16.308
interesting place with our
00:18:16.489 --> 00:18:18.049
exposure and experience
00:18:18.150 --> 00:18:19.631
with this type of technology,
00:18:20.030 --> 00:18:20.872
generally speaking.
00:18:21.893 --> 00:18:22.073
You know,
00:18:22.093 --> 00:18:23.433
it used to be the joke was like
00:18:23.493 --> 00:18:25.234
your parents, your grandparents would say,
00:18:25.595 --> 00:18:26.295
when I was your age,
00:18:26.315 --> 00:18:28.596
I had to walk to school, you know,
00:18:28.656 --> 00:18:30.718
ten feet of snow and all that stuff.
00:18:30.738 --> 00:18:31.638
I think the modern version
00:18:31.659 --> 00:18:33.200
of that is me telling my kids, like,
00:18:33.240 --> 00:18:33.839
when I was your age,
00:18:33.880 --> 00:18:35.441
I had to use dial-up
00:18:35.500 --> 00:18:36.961
internet and wait for a
00:18:37.182 --> 00:18:38.363
single image to download
00:18:38.403 --> 00:18:39.182
for five minutes.
00:18:39.262 --> 00:18:40.523
And, you know,
00:18:40.584 --> 00:18:41.964
it blows their mind how much
00:18:42.305 --> 00:18:43.145
stuff has changed.
00:18:43.922 --> 00:18:45.364
We had these giant books
00:18:45.723 --> 00:18:47.326
called photo albums.
00:18:48.987 --> 00:18:50.128
Encyclopedias.
00:18:50.929 --> 00:18:51.609
Yeah, right.
00:18:52.770 --> 00:18:53.291
I know.
00:18:53.311 --> 00:18:54.373
I told my kids once that I
00:18:54.393 --> 00:18:55.273
was older than Google,
00:18:55.314 --> 00:18:56.295
and it just blew their mind.
00:18:57.559 --> 00:18:57.880
what?
00:18:57.980 --> 00:18:59.422
You would have thought I said, you know,
00:18:59.521 --> 00:19:00.823
I was alive before
00:19:01.423 --> 00:19:02.984
electricity was invented or something.
00:19:07.949 --> 00:19:08.269
Wow.
00:19:08.529 --> 00:19:08.690
Yeah.
00:19:08.769 --> 00:19:10.090
No, those are, those are all,
00:19:10.131 --> 00:19:11.031
those are really good.
00:19:11.092 --> 00:19:13.273
Like, I mean,
00:19:13.314 --> 00:19:15.316
the five misconceptions I
00:19:15.395 --> 00:19:17.416
feel like at least I hear
00:19:17.678 --> 00:19:19.318
somewhat on a regular basis
00:19:20.059 --> 00:19:21.881
or have some concern over,
00:19:21.980 --> 00:19:22.902
over those topics.
00:19:23.142 --> 00:19:23.282
And,
00:19:23.923 --> 00:19:26.545
in some way um I do know
00:19:26.585 --> 00:19:28.467
that you've also done work
00:19:28.747 --> 00:19:31.970
around policy is that is
00:19:31.990 --> 00:19:33.131
that correct in terms of
00:19:33.191 --> 00:19:34.731
like what schools and
00:19:34.791 --> 00:19:35.893
districts should be doing
00:19:36.913 --> 00:19:38.895
around that and sophie and
00:19:38.915 --> 00:19:39.896
I have talked about this
00:19:40.057 --> 00:19:41.018
offline but like kind of
00:19:41.038 --> 00:19:42.138
like you know you kind of
00:19:42.159 --> 00:19:43.380
have some folks that are
00:19:43.400 --> 00:19:47.222
like no policy cut off
00:19:47.262 --> 00:19:48.584
while others are being
00:19:48.624 --> 00:19:50.326
really purposeful um
00:19:51.257 --> 00:19:53.258
What are the, what are the ramifications?
00:19:53.337 --> 00:19:55.878
Like what, what are the effects of, of,
00:19:56.079 --> 00:19:58.681
of either and kind of what is your,
00:19:59.300 --> 00:20:00.622
what is your mindset with all that?
00:20:01.903 --> 00:20:03.523
So this is something I hear a lot is like,
00:20:03.903 --> 00:20:05.865
do we need an AI policy or not?
00:20:07.066 --> 00:20:08.406
And because I work a lot
00:20:08.426 --> 00:20:09.548
with school boards,
00:20:09.587 --> 00:20:10.688
policy is a big part of
00:20:10.708 --> 00:20:12.588
what they do is adopting policy.
00:20:13.970 --> 00:20:15.411
And the message I give them
00:20:15.611 --> 00:20:17.551
is whether you have a
00:20:17.731 --> 00:20:20.473
standalone AI policy or you
00:20:20.493 --> 00:20:21.654
weave it in other places,
00:20:22.055 --> 00:20:23.215
you need to spell out
00:20:23.516 --> 00:20:25.777
artificial intelligence in your policies.
00:20:27.377 --> 00:20:28.439
There's different ways to go
00:20:28.459 --> 00:20:29.279
about doing it,
00:20:29.881 --> 00:20:31.502
and we haven't really
00:20:31.542 --> 00:20:33.164
talked about my AI toolkit yet,
00:20:33.184 --> 00:20:33.885
but there's a couple of
00:20:33.945 --> 00:20:36.068
examples of AI policies in
00:20:36.088 --> 00:20:38.391
there as well that people can look at.
00:20:39.997 --> 00:20:41.518
But the story I like to tell
00:20:41.637 --> 00:20:43.038
is from back in October,
00:20:43.578 --> 00:20:45.019
there was a school,
00:20:45.059 --> 00:20:46.259
you may have heard about this in,
00:20:46.661 --> 00:20:47.800
I believe it was in Boston,
00:20:48.401 --> 00:20:49.741
where a history teacher
00:20:50.321 --> 00:20:52.544
gave a project to students
00:20:53.144 --> 00:20:54.325
and this boy turned in his
00:20:54.365 --> 00:20:55.365
project and the history
00:20:55.384 --> 00:20:57.766
teacher could tell that he used ChatGPT.
00:20:57.806 --> 00:20:59.988
He's like, psh, F, you cheated.
00:21:00.788 --> 00:21:02.790
The parents sued the school
00:21:03.011 --> 00:21:04.413
because they said it
00:21:04.433 --> 00:21:05.615
doesn't say in that assignment,
00:21:05.675 --> 00:21:06.938
it doesn't say in your syllabus,
00:21:06.978 --> 00:21:07.618
it doesn't say in your
00:21:07.659 --> 00:21:09.742
policies that he can't use it.
00:21:10.364 --> 00:21:10.564
Now,
00:21:10.644 --> 00:21:11.685
I don't know how that's going to turn
00:21:11.806 --> 00:21:12.547
out in the courts.
00:21:13.481 --> 00:21:14.481
but you guys don't want to
00:21:14.521 --> 00:21:15.762
go to court for this, right?
00:21:15.863 --> 00:21:18.344
You got another gift.
00:21:18.364 --> 00:21:20.325
So my advice is always,
00:21:20.424 --> 00:21:21.045
whether you have a
00:21:21.204 --> 00:21:22.806
standalone AI policy or you
00:21:23.105 --> 00:21:24.346
address it other places,
00:21:24.747 --> 00:21:26.387
like your plagiarism policy,
00:21:26.768 --> 00:21:28.769
your technology use agreement,
00:21:29.249 --> 00:21:30.308
you need to spell it out.
00:21:30.409 --> 00:21:31.490
Because just saying like,
00:21:32.210 --> 00:21:33.431
Students won't use
00:21:33.671 --> 00:21:36.172
electronic tools while
00:21:36.211 --> 00:21:37.133
claiming it as their own
00:21:37.192 --> 00:21:38.053
work or whatever.
00:21:38.613 --> 00:21:40.534
That's too vague in the age of AI.
00:21:40.554 --> 00:21:41.974
You really need to spell out
00:21:42.634 --> 00:21:44.256
what's acceptable and what's not.
00:21:44.997 --> 00:21:45.957
And I think a lot of times
00:21:45.997 --> 00:21:46.737
the best thing to put in
00:21:46.777 --> 00:21:49.259
policy is the teacher will
00:21:49.398 --> 00:21:51.480
determine what level of AI
00:21:51.500 --> 00:21:53.060
can be used and then make
00:21:53.080 --> 00:21:53.881
sure your teacher's in
00:21:53.901 --> 00:21:55.162
their syllabi and their
00:21:55.201 --> 00:21:56.741
assignments are spelling it out.
00:21:56.781 --> 00:21:57.663
There's kind of this common,
00:21:57.682 --> 00:21:59.163
you've probably seen it, the stoplight
00:21:59.997 --> 00:22:01.297
technique that people use to
00:22:01.356 --> 00:22:03.637
teach what level of AI is
00:22:03.698 --> 00:22:05.298
okay on this project: red
00:22:05.338 --> 00:22:07.038
means none, yellow means
00:22:07.499 --> 00:22:08.500
under the conditions I give
00:22:08.519 --> 00:22:10.619
you, and green means go wild.
00:22:11.119 --> 00:22:12.340
so that's kind of a basic
00:22:12.520 --> 00:22:14.441
version but yeah I can't
00:22:14.500 --> 00:22:16.422
stress enough: if
00:22:16.442 --> 00:22:17.281
you're at the leadership
00:22:17.321 --> 00:22:19.002
level you need to put in
00:22:19.022 --> 00:22:19.942
your policies if you're a
00:22:19.982 --> 00:22:21.762
teacher you need to address
00:22:21.803 --> 00:22:23.304
it in your assignments
00:22:25.730 --> 00:22:29.373
yeah yeah there was sophie
00:22:29.393 --> 00:22:30.053
this may have been an
00:22:30.153 --> 00:22:31.874
episode that that you were
00:22:32.273 --> 00:22:32.993
maybe it was when you were
00:22:33.074 --> 00:22:34.654
at fetc because I remember
00:22:34.694 --> 00:22:37.415
doing this solo um but the
00:22:37.455 --> 00:22:39.196
guests for for an episode a
00:22:39.217 --> 00:22:40.657
while back, like, we had
00:22:40.698 --> 00:22:42.678
that conversation of like
00:22:43.159 --> 00:22:45.299
he was just like I want if
00:22:45.319 --> 00:22:46.859
they're going to use AI
00:22:47.881 --> 00:22:50.162
they just need to disclose it and how
00:22:51.327 --> 00:22:53.308
because some of the stuff he
00:22:53.328 --> 00:22:54.809
was doing it's not always
00:22:54.930 --> 00:22:57.250
essay writing, right? It's
00:22:57.510 --> 00:23:00.833
project-based in nature, and
00:23:00.952 --> 00:23:01.913
one can make an argument
00:23:01.932 --> 00:23:05.255
that, like, with AI there's a
00:23:05.315 --> 00:23:06.755
place for that right in
00:23:06.875 --> 00:23:07.736
that kind of learning and
00:23:07.756 --> 00:23:10.237
that kind of project um I
00:23:10.257 --> 00:23:11.897
also think it helps bridge like
00:23:12.833 --> 00:23:14.336
better relationships and
00:23:14.415 --> 00:23:16.579
better discussions between
00:23:16.960 --> 00:23:17.862
educators and their
00:23:17.942 --> 00:23:22.409
students around this topic and around AI,
00:23:22.670 --> 00:23:23.632
which hopefully will be
00:23:23.791 --> 00:23:25.434
enlightening and insightful
00:23:25.515 --> 00:23:26.717
for everyone involved.
00:23:28.839 --> 00:23:30.661
And, you know, none of this is new.
00:23:30.781 --> 00:23:33.163
I mean, AI, generative AI is new,
00:23:33.564 --> 00:23:35.505
but the idea of plagiarism, I mean,
00:23:35.525 --> 00:23:37.686
I'm sure back in the, you know,
00:23:37.747 --> 00:23:38.527
caveman days,
00:23:38.788 --> 00:23:39.647
people were probably copying
00:23:39.667 --> 00:23:41.990
each other's drawings off the cave wall,
00:23:42.069 --> 00:23:42.369
right?
00:23:42.430 --> 00:23:43.810
Like plagiarism is not new.
00:23:45.172 --> 00:23:48.194
I love to show this article
00:23:48.255 --> 00:23:49.115
about teachers in the
00:23:49.194 --> 00:23:50.395
eighties protesting
00:23:50.435 --> 00:23:52.017
calculator use in schools
00:23:52.057 --> 00:23:53.578
saying calculators are cheating.
00:23:53.959 --> 00:23:55.460
Sophie feels very passionate about that.
00:23:57.494 --> 00:23:58.256
So I say,
00:23:58.776 --> 00:24:00.518
so when I have a room full of people,
00:24:00.538 --> 00:24:01.157
I like to say,
00:24:01.198 --> 00:24:02.038
is it cheating to use a
00:24:02.078 --> 00:24:04.361
calculator on your math homework?
00:24:04.941 --> 00:24:05.761
And what's the answer?
00:24:06.521 --> 00:24:08.344
Well, it depends, right?
00:24:08.364 --> 00:24:10.204
It depends on what you're assessing.
00:24:10.246 --> 00:24:10.726
Exactly.
00:24:11.445 --> 00:24:12.287
AI is the same way.
00:24:12.467 --> 00:24:14.108
Like, what's the ultimate goal?
00:24:14.449 --> 00:24:15.170
What are they learning?
00:24:15.349 --> 00:24:16.871
What are they, what's being assessed?
00:24:17.471 --> 00:24:18.051
So again,
00:24:18.092 --> 00:24:20.273
it comes back to that whole AI as a tool,
00:24:20.814 --> 00:24:21.114
right?
00:24:21.535 --> 00:24:22.955
When's it right to use it?
00:24:22.996 --> 00:24:24.237
When is it not right to use it?
00:24:25.451 --> 00:24:27.512
And because you mentioned that,
00:24:27.614 --> 00:24:28.634
I just want to say this,
00:24:28.693 --> 00:24:32.237
that like we've had AI in
00:24:32.277 --> 00:24:33.637
the classrooms for a while
00:24:33.657 --> 00:24:34.438
because they would like
00:24:34.498 --> 00:24:35.919
take pictures of their math
00:24:35.959 --> 00:24:37.180
work and it would tell them
00:24:37.220 --> 00:24:38.060
how to solve it.
00:24:39.201 --> 00:24:40.643
But no one was like it
00:24:40.682 --> 00:24:42.644
wasn't a national uprising
00:24:42.864 --> 00:24:44.144
and there weren't articles
00:24:44.184 --> 00:24:45.605
and news and constant
00:24:45.786 --> 00:24:50.950
podcasts about that app on their phone.
00:24:51.029 --> 00:24:52.290
I'm a math teacher, by the way.
00:24:53.868 --> 00:24:54.910
on their phone that was
00:24:54.970 --> 00:24:56.471
telling them how to do the
00:24:56.510 --> 00:24:57.332
work and the kids weren't
00:24:57.372 --> 00:24:58.413
actually using it the way
00:24:58.432 --> 00:24:59.334
it was designed so they
00:24:59.354 --> 00:25:00.253
could actually learn what
00:25:00.294 --> 00:25:00.954
they're supposed to do.
00:25:00.994 --> 00:25:02.076
They were just getting the answer,
00:25:02.115 --> 00:25:03.416
copying it down and moving on.
00:25:04.538 --> 00:25:05.077
And then,
00:25:05.919 --> 00:25:08.260
so like this uprising about calculators,
00:25:08.280 --> 00:25:10.383
there wasn't really one about
00:25:10.871 --> 00:25:12.892
that app, um, for math. But
00:25:13.071 --> 00:25:15.053
kids are gonna go
00:25:15.252 --> 00:25:17.314
use generative AI to write an
00:25:17.394 --> 00:25:19.855
essay and oh my gosh it's
00:25:19.914 --> 00:25:21.355
national news and people
00:25:21.415 --> 00:25:22.375
are freaking out about them
00:25:22.435 --> 00:25:23.696
cheating on essays and I'm
00:25:23.737 --> 00:25:27.878
like use it I'm all about
00:25:28.019 --> 00:25:28.999
using it as the tool to
00:25:29.038 --> 00:25:30.359
teach what the actual thing
00:25:30.380 --> 00:25:32.121
that you're about to do I
00:25:32.161 --> 00:25:32.980
taught middle school math
00:25:33.381 --> 00:25:34.481
if my kids were struggling
00:25:34.501 --> 00:25:35.362
with their multiplication
00:25:35.402 --> 00:25:36.722
tables I was going to give
00:25:36.742 --> 00:25:37.782
them the calculator so they
00:25:37.803 --> 00:25:39.644
could learn how to find the slope
00:25:40.268 --> 00:25:42.888
and do proportions in my math class,
00:25:42.909 --> 00:25:43.689
because that's what I was
00:25:43.709 --> 00:25:44.848
supposed to be teaching them,
00:25:45.269 --> 00:25:46.750
not trying to get them to
00:25:46.769 --> 00:25:47.630
memorize something that
00:25:47.650 --> 00:25:49.049
they were supposed to have
00:25:49.109 --> 00:25:50.730
learned earlier and before me.
00:25:50.810 --> 00:25:52.631
So I'm like, okay,
00:25:52.931 --> 00:25:54.230
let's use Grammarly to work
00:25:54.250 --> 00:25:55.912
with your grammar while
00:25:55.951 --> 00:25:58.152
you're focusing on genre
00:25:58.392 --> 00:25:59.732
and alliteration and
00:25:59.752 --> 00:26:00.913
descriptive language.
00:26:00.952 --> 00:26:01.792
Yes, let's do that.
00:26:01.833 --> 00:26:02.834
That's my opinion,
00:26:02.894 --> 00:26:04.054
but I know other people
00:26:04.074 --> 00:26:04.993
have other opinions.
00:26:05.473 --> 00:26:06.354
Use it as a tool.
00:26:06.674 --> 00:26:07.894
I agree.
00:26:07.914 --> 00:26:08.474
It's not going away.
00:26:09.638 --> 00:26:11.220
I like to cite this study, too,
00:26:11.240 --> 00:26:12.759
that Education Week did
00:26:13.359 --> 00:26:15.361
that the number of kids who
00:26:15.921 --> 00:26:17.622
are cheating has not gone
00:26:17.701 --> 00:26:20.002
up since ChatGPT came around.
00:26:20.343 --> 00:26:20.702
Basically,
00:26:20.722 --> 00:26:22.222
the kids who used to cheat are
00:26:22.262 --> 00:26:22.943
still doing it.
00:26:23.163 --> 00:26:24.463
They're probably using AI now.
00:26:25.104 --> 00:26:25.703
The kids who weren't
00:26:25.723 --> 00:26:26.845
cheating aren't cheating now.
00:26:27.204 --> 00:26:30.026
So it really hasn't changed.
00:26:30.165 --> 00:26:30.925
I love that.
00:26:31.046 --> 00:26:31.306
Yeah.
00:26:31.365 --> 00:26:32.826
The kids that are going to
00:26:32.866 --> 00:26:33.866
find a loophole are going
00:26:33.886 --> 00:26:35.087
to find a loophole no matter what.
00:26:35.107 --> 00:26:35.188
Yes.
00:26:35.567 --> 00:26:38.388
Exactly.
00:26:38.429 --> 00:26:38.989
That's amazing.
00:26:40.105 --> 00:26:41.567
I see,
00:26:41.586 --> 00:26:44.469
because you mentioned Grammarly just now.
00:26:46.210 --> 00:26:47.050
I came from a school
00:26:47.090 --> 00:26:50.212
district that they blocked Grammarly,
00:26:50.893 --> 00:26:52.534
not because it was AI, but because,
00:26:52.654 --> 00:26:54.516
at least at the time, it wasn't,
00:26:54.816 --> 00:26:56.737
at least in the eyes of the district,
00:26:56.856 --> 00:26:59.159
it wasn't as compliant as
00:26:59.239 --> 00:27:01.480
others with privacy and security.
00:27:03.162 --> 00:27:03.862
At least that's what was
00:27:03.922 --> 00:27:05.282
told to us teachers, right?
00:27:05.303 --> 00:27:07.084
And like, OK, cool, checks out.
00:27:09.488 --> 00:27:09.827
But Mark,
00:27:09.948 --> 00:27:11.729
I do know part of what you bring
00:27:11.769 --> 00:27:12.808
to the table into this
00:27:12.868 --> 00:27:14.170
conversation is the
00:27:14.210 --> 00:27:15.789
importance of vetting these
00:27:15.910 --> 00:27:20.392
AI tools and how to protect student data.
00:27:22.613 --> 00:27:25.153
What does that look like in your mind?
00:27:27.354 --> 00:27:28.974
What are the things to identify?
00:27:29.835 --> 00:27:31.655
What does that work involve?
00:27:31.695 --> 00:27:34.757
And is there a rubric in
00:27:34.777 --> 00:27:37.117
your view of what to look for?
00:27:38.496 --> 00:27:40.838
This is one of the hardest parts, right,
00:27:40.878 --> 00:27:42.480
is to make sure that before
00:27:42.500 --> 00:27:43.381
you use a tool,
00:27:43.421 --> 00:27:44.261
especially before you let
00:27:44.281 --> 00:27:45.182
your students use a tool,
00:27:45.202 --> 00:27:46.464
that it's safe for them.
00:27:47.085 --> 00:27:48.006
It's going to protect their
00:27:48.046 --> 00:27:50.788
personal data and that sort of thing.
00:27:51.009 --> 00:27:52.431
It's not going to show them
00:27:52.451 --> 00:27:53.332
anything crazy they
00:27:53.352 --> 00:27:54.913
shouldn't see at their age, too,
00:27:54.953 --> 00:27:55.795
is an important part.
00:27:57.500 --> 00:28:00.103
So we recommend that you vet
00:28:00.163 --> 00:28:01.243
all your tools before you
00:28:01.284 --> 00:28:02.523
let your students use them.
00:28:03.045 --> 00:28:04.705
But what does that mean?
00:28:05.886 --> 00:28:09.048
If you go to pretty much any AI website,
00:28:09.588 --> 00:28:10.450
it's going to say, oh,
00:28:10.670 --> 00:28:11.851
we don't sell your data.
00:28:11.951 --> 00:28:13.332
Your data is protected.
00:28:13.751 --> 00:28:15.032
But that doesn't mean it's true.
00:28:16.253 --> 00:28:17.273
One of the most fascinating
00:28:17.694 --> 00:28:19.536
and awkward conversations I
00:28:19.556 --> 00:28:22.178
ever had was we had someone
00:28:22.218 --> 00:28:23.838
demo an AI tool for us.
00:28:24.419 --> 00:28:26.681
And we had one of our lawyers on the call.
00:28:27.661 --> 00:28:28.622
And our lawyer was going
00:28:28.642 --> 00:28:29.883
through their website and like, oh, well,
00:28:29.923 --> 00:28:30.883
it says here that you
00:28:30.923 --> 00:28:31.884
protect the information.
00:28:31.923 --> 00:28:32.825
Well, how do you do that?
00:28:32.964 --> 00:28:34.566
And they know what to say,
00:28:34.925 --> 00:28:36.767
but they can't really prove it.
00:28:37.248 --> 00:28:38.448
AI is a goldmine right now.
00:28:38.508 --> 00:28:39.709
Everybody wants to sell you
00:28:39.788 --> 00:28:41.750
some cool product or something,
00:28:42.250 --> 00:28:43.271
or they offer something free.
00:28:43.332 --> 00:28:44.092
It's going to take all your
00:28:44.112 --> 00:28:45.173
information from you.
00:28:46.973 --> 00:28:48.476
So you really have to be careful.
00:28:49.217 --> 00:28:50.598
You got to read the fine print.
00:28:50.999 --> 00:28:52.260
You got to ask questions.
00:28:52.441 --> 00:28:53.261
You know what you could do?
00:28:53.301 --> 00:28:55.023
You could take their
00:28:55.044 --> 00:28:55.904
disclaimer and you could
00:28:55.944 --> 00:28:57.547
put it in ChatGPT and say,
00:28:58.367 --> 00:28:59.588
are there any red flags here?
00:29:00.009 --> 00:29:00.871
Things like that.
00:29:01.371 --> 00:29:01.612
Perfect.
00:29:02.051 --> 00:29:03.413
Have the AI check the AI.
00:29:03.433 --> 00:29:03.733
I love that.
00:29:03.753 --> 00:29:06.036
Fight fire with fire, right?
00:29:06.517 --> 00:29:06.876
Yeah.
00:29:06.916 --> 00:29:09.420
So a couple of really good
00:29:09.640 --> 00:29:11.020
places that can help you.
00:29:11.602 --> 00:29:13.943
Common Sense Media has tons
00:29:13.983 --> 00:29:15.025
of great stuff on AI,
00:29:15.085 --> 00:29:15.625
and they've started
00:29:15.665 --> 00:29:18.087
publishing reviews of AI tools.
00:29:18.788 --> 00:29:19.690
And they will tell you what
00:29:19.829 --> 00:29:21.750
grade level they're appropriate for.
00:29:22.431 --> 00:29:23.872
And also if there's any
00:29:24.172 --> 00:29:25.573
collection of personal data,
00:29:25.633 --> 00:29:26.473
things like that.
00:29:26.753 --> 00:29:28.174
So that's a really great place to go.
00:29:28.654 --> 00:29:29.516
There's also something
00:29:29.536 --> 00:29:31.176
called the Student Privacy Pledge.
00:29:31.896 --> 00:29:32.497
And these are,
00:29:32.636 --> 00:29:34.558
if you just Google Student Privacy Pledge,
00:29:34.898 --> 00:29:36.240
these are organizations
00:29:36.279 --> 00:29:37.359
that have pledged that
00:29:37.380 --> 00:29:38.621
they're not going to
00:29:38.721 --> 00:29:40.402
collect student data.
00:29:40.842 --> 00:29:42.522
And the big ones are on there, you know,
00:29:42.623 --> 00:29:44.104
Apple, Microsoft, Google,
00:29:44.163 --> 00:29:45.105
things like that.
00:29:45.664 --> 00:29:46.826
Now, it's not, of course,
00:29:46.945 --> 00:29:48.007
that's not a legally
00:29:48.752 --> 00:29:50.865
binding promise, but
00:29:51.673 --> 00:29:53.813
at least it's a pretty good indicator.
00:29:53.913 --> 00:29:55.054
And I guess they probably
00:29:55.074 --> 00:29:55.913
could have a lawsuit on
00:29:55.933 --> 00:29:57.314
their hands if they claim
00:29:57.354 --> 00:29:59.173
that and it turned out to not be true.
00:30:00.674 --> 00:30:02.994
So do your research the best you can.
00:30:03.335 --> 00:30:04.535
Use tools like that,
00:30:04.734 --> 00:30:06.455
like Common Sense Media and
00:30:06.476 --> 00:30:07.855
the Privacy Pledge.
00:30:08.895 --> 00:30:09.915
Don't be afraid to ask your
00:30:10.016 --> 00:30:12.057
IT people or maybe your
00:30:12.076 --> 00:30:14.676
librarians to investigate it too.
00:30:15.896 --> 00:30:17.178
You just got to be careful.
00:30:17.198 --> 00:30:18.678
And of course, we got to teach our kids
00:30:19.417 --> 00:30:20.838
what is safe and not safe
00:30:20.878 --> 00:30:23.340
to put into AI. You know, we
00:30:23.361 --> 00:30:24.241
do a good job teaching
00:30:24.301 --> 00:30:25.722
stranger danger online
00:30:25.942 --> 00:30:27.564
right don't tell a stranger
00:30:27.604 --> 00:30:28.744
where you live things like
00:30:28.805 --> 00:30:30.605
that but sometimes kids let
00:30:30.625 --> 00:30:31.487
their guard down with AI
00:30:31.507 --> 00:30:32.587
because they're like it's
00:30:32.627 --> 00:30:34.789
not a person so I can say
00:30:34.829 --> 00:30:36.049
whatever it doesn't matter
00:30:36.631 --> 00:30:37.531
it's like yeah but people
00:30:37.551 --> 00:30:38.332
can still get that
00:30:38.352 --> 00:30:40.534
information so on the other
00:30:40.973 --> 00:30:43.016
side of that computer screen yeah
00:30:45.555 --> 00:30:48.756
and it's for me the like at
00:30:48.776 --> 00:30:49.636
least in the classroom
00:30:49.717 --> 00:30:52.718
setting it's all the
00:30:52.877 --> 00:30:54.519
student-facing stuff right
00:30:54.878 --> 00:30:58.040
like um like a lot of these
00:30:58.080 --> 00:30:59.161
companies are like oh
00:30:59.221 --> 00:31:00.521
here's our
00:31:00.582 --> 00:31:02.682
product here's our platform
00:31:03.202 --> 00:31:04.222
and now there's like the
00:31:04.343 --> 00:31:06.903
student version of that um
00:31:06.963 --> 00:31:07.924
and instructionally that
00:31:09.056 --> 00:31:09.635
That's cool.
00:31:09.896 --> 00:31:10.737
Like what,
00:31:10.916 --> 00:31:12.979
depending on what problem it's
00:31:13.159 --> 00:31:14.500
addressing or how it's
00:31:14.539 --> 00:31:16.501
moving the needle instructionally.
00:31:18.423 --> 00:31:19.144
But just like,
00:31:21.285 --> 00:31:23.426
what are we putting in the
00:31:23.467 --> 00:31:24.647
hands of our children?
00:31:25.167 --> 00:31:25.528
And,
00:31:26.028 --> 00:31:27.529
cause kind of what I'm hearing Mark is
00:31:27.569 --> 00:31:32.032
like, nothing is bulletproof in this talk,
00:31:32.354 --> 00:31:32.634
right?
00:31:32.713 --> 00:31:34.934
Like there is some sort of
00:31:34.996 --> 00:31:36.737
risk and gamble to an extent.
00:31:38.311 --> 00:31:38.992
If you're a leader,
00:31:39.053 --> 00:31:40.273
if you're an ed tech leader
00:31:40.314 --> 00:31:41.695
right now for a school district,
00:31:42.176 --> 00:31:44.219
that's a lot of pressure on
00:31:44.239 --> 00:31:47.564
your shoulders to, you know,
00:31:47.923 --> 00:31:48.984
what's the best gamble?
00:31:49.967 --> 00:31:51.028
I guess that's kind of how
00:31:51.107 --> 00:31:51.769
I'm thinking of it.
00:31:53.099 --> 00:31:53.861
Well, you know,
00:31:53.901 --> 00:31:55.321
even to take it away from AI,
00:31:55.762 --> 00:31:57.104
there was a huge data
00:31:57.124 --> 00:31:58.986
breach with PowerSchool recently.
00:31:59.226 --> 00:31:59.707
You know,
00:31:59.747 --> 00:32:02.849
millions of students' data was exposed.
00:32:02.890 --> 00:32:03.750
Now, PowerSchool,
00:32:04.070 --> 00:32:05.692
they have protections in place.
00:32:06.173 --> 00:32:07.515
I'm sure if you read their contract,
00:32:07.535 --> 00:32:08.015
they would say,
00:32:08.055 --> 00:32:09.076
we don't give out any
00:32:09.115 --> 00:32:10.337
personal data to anybody.
00:32:10.738 --> 00:32:11.318
But, you know,
00:32:11.338 --> 00:32:12.819
they got hacked or whatever.
00:32:13.840 --> 00:32:16.022
So, yeah, you're never totally safe.
00:32:17.384 --> 00:32:18.285
But I don't think we can
00:32:18.325 --> 00:32:20.507
just try to turn it all off either.
00:32:20.926 --> 00:32:21.287
You know,
00:32:21.307 --> 00:32:23.749
there's a famous case where the
00:32:23.769 --> 00:32:24.710
entire state of New York
00:32:25.526 --> 00:32:26.125
their Department of
00:32:26.185 --> 00:32:28.047
Education about a year and
00:32:28.067 --> 00:32:31.008
a half ago said, no AI for students.
00:32:31.167 --> 00:32:32.828
We're going to block everything.
00:32:33.548 --> 00:32:34.128
No AI.
00:32:34.388 --> 00:32:35.630
It's too much of a risk.
00:32:36.289 --> 00:32:37.890
And a few months later, they're like,
00:32:38.190 --> 00:32:38.711
it didn't work.
00:32:38.830 --> 00:32:39.490
We couldn't do it.
00:32:39.911 --> 00:32:40.971
So they changed their minds.
00:32:40.990 --> 00:32:41.412
And now they're like,
00:32:41.432 --> 00:32:42.011
we're going to teach
00:32:42.092 --> 00:32:43.852
responsible use instead,
00:32:44.153 --> 00:32:45.133
which I think is the smart
00:32:45.153 --> 00:32:45.992
thing to do anyway.
00:32:47.294 --> 00:32:48.294
But you can't run from it.
00:32:48.314 --> 00:32:49.173
You can't ignore it.
00:32:49.233 --> 00:32:49.653
It's here.
00:32:49.693 --> 00:32:50.795
It's not going anywhere.
00:32:51.454 --> 00:32:53.436
you know, our kids are using it already.
00:32:53.916 --> 00:32:55.397
They're going to use it in college,
00:32:55.538 --> 00:32:57.039
in the workforce, in the military,
00:32:57.059 --> 00:32:58.401
whatever, whatever they're going to do,
00:32:58.461 --> 00:32:59.781
it's going to be a part of their lives.
00:33:00.041 --> 00:33:01.623
So we can't,
00:33:01.923 --> 00:33:04.566
we can't let the threats scare
00:33:04.605 --> 00:33:05.266
us too much.
00:33:05.625 --> 00:33:07.587
We have to use it responsibly.
00:33:08.409 --> 00:33:08.528
We,
00:33:08.709 --> 00:33:10.250
a term we like to use here is cautious
00:33:10.309 --> 00:33:11.611
embrace of AI.
00:33:11.871 --> 00:33:15.453
So we have to cautiously embrace it.
00:33:16.494 --> 00:33:17.236
Yeah.
00:33:17.276 --> 00:33:17.695
I love that.
00:33:17.976 --> 00:33:18.415
I love that.
00:33:18.936 --> 00:33:20.198
I, and kind of like what you,
00:33:21.080 --> 00:33:22.082
touched on a bit,
00:33:22.162 --> 00:33:23.803
which I think is a good segue for,
00:33:24.403 --> 00:33:27.924
what we're about to talk about, um, is this,
00:33:28.545 --> 00:33:29.685
this thing of like the
00:33:29.746 --> 00:33:32.428
skills that I guess our,
00:33:32.548 --> 00:33:34.008
our students need, right?
00:33:34.048 --> 00:33:36.630
Like these, these skills that,
00:33:37.009 --> 00:33:38.391
that are really future ready.
00:33:39.271 --> 00:33:42.633
Um, how important is that?
00:33:43.114 --> 00:33:45.075
And also what are these
00:33:45.095 --> 00:33:46.496
skills that are going to be
00:33:46.516 --> 00:33:50.317
needed over the next decade or decades?
00:33:51.650 --> 00:33:53.432
So I think when educators
00:33:53.471 --> 00:33:55.733
hear AI is not going anywhere,
00:33:55.773 --> 00:33:56.654
it's a huge deal,
00:33:56.714 --> 00:33:57.435
everybody's going to be
00:33:57.476 --> 00:33:58.977
using it on the job, they think,
00:33:59.657 --> 00:34:00.238
is it all we're going to
00:34:00.258 --> 00:34:03.682
teach is how to use AI, how to talk to AI,
00:34:03.821 --> 00:34:04.583
tech skills?
00:34:05.143 --> 00:34:06.144
But that's not true.
00:34:07.685 --> 00:34:09.246
What I like to cite is the
00:34:09.266 --> 00:34:10.648
World Economic Forum puts
00:34:10.728 --> 00:34:12.409
out this future of jobs report.
00:34:13.150 --> 00:34:14.532
They just put it out in January.
00:34:15.443 --> 00:34:17.445
and they survey all the big
00:34:17.485 --> 00:34:19.226
employers around the world.
00:34:19.467 --> 00:34:20.047
And they say,
00:34:20.507 --> 00:34:22.349
what skills are you looking
00:34:22.389 --> 00:34:24.251
for when you hire somebody
00:34:24.612 --> 00:34:25.693
in twenty twenty five?
00:34:26.572 --> 00:34:28.014
And the top ten list here
00:34:29.021 --> 00:34:30.242
I'll just read them to you quickly.
00:34:31.402 --> 00:34:32.182
But what's interesting is
00:34:32.344 --> 00:34:33.403
only one of them is about
00:34:33.443 --> 00:34:34.985
technology out of the top
00:34:35.005 --> 00:34:36.224
ten skills they're looking for.
00:34:36.945 --> 00:34:38.365
So while I read this,
00:34:38.405 --> 00:34:39.186
I'm
00:34:39.226 --> 00:34:39.887
putting you on the spot here.
00:34:39.907 --> 00:34:40.567
I want you to think about
00:34:40.626 --> 00:34:41.728
why you think that is.
00:34:41.748 --> 00:34:45.429
OK, if AI is such a big deal, it's so big,
00:34:45.469 --> 00:34:46.349
it's going to get bigger.
00:34:46.510 --> 00:34:48.811
Why are people not that
00:34:48.851 --> 00:34:50.112
concerned with it in terms
00:34:50.152 --> 00:34:51.092
of what they're hiring for?
00:34:51.112 --> 00:34:53.413
OK, so the number one skill,
00:34:53.512 --> 00:34:54.414
analytical thinking.
00:34:55.429 --> 00:34:57.471
Number two, resilience, flexibility,
00:34:57.510 --> 00:34:58.150
and agility.
00:34:58.891 --> 00:34:59.331
Number three,
00:34:59.371 --> 00:35:01.333
leadership and social influence.
00:35:02.072 --> 00:35:03.494
Number four, creative thinking.
00:35:04.393 --> 00:35:04.875
Number five,
00:35:04.934 --> 00:35:06.534
motivation and self-awareness.
00:35:07.195 --> 00:35:09.137
Number six is technological literacy.
00:35:09.737 --> 00:35:10.536
That's the tech one.
00:35:11.398 --> 00:35:11.958
Number seven,
00:35:12.097 --> 00:35:13.619
empathy and active listening.
00:35:14.672 --> 00:35:15.132
Number eight,
00:35:15.152 --> 00:35:17.235
curiosity and lifelong learning.
00:35:17.996 --> 00:35:19.617
Number nine, talent management,
00:35:19.677 --> 00:35:21.018
which is basically a fancy
00:35:21.059 --> 00:35:22.239
word for like HR skills,
00:35:22.280 --> 00:35:23.541
being able to manage other people.
00:35:24.282 --> 00:35:26.003
And number ten is service
00:35:26.164 --> 00:35:28.025
orientation and customer service.
00:35:29.106 --> 00:35:30.847
So out of the top ten skills
00:35:30.887 --> 00:35:32.909
that people were looking
00:35:32.949 --> 00:35:37.333
for to hire right now,
00:35:37.353 --> 00:35:38.956
only one of them is about technology.
00:35:38.996 --> 00:35:39.896
Why do you guys think that is?
00:35:44.266 --> 00:35:45.226
You want to go first, Sophie?
00:35:45.427 --> 00:35:46.168
Yes, I do.
00:35:46.608 --> 00:35:47.309
I would like to think that
00:35:47.369 --> 00:35:50.471
it's more that all those
00:35:50.490 --> 00:35:51.311
skills are how you're going
00:35:51.331 --> 00:35:52.612
to use the technology.
00:35:53.052 --> 00:35:54.914
Your creativity,
00:35:54.954 --> 00:35:56.556
your analytical thinking is
00:35:56.576 --> 00:35:57.757
like how you're going to use it.
00:35:57.777 --> 00:35:58.757
And it's not just about that.
00:35:58.797 --> 00:35:59.978
It's about interacting with
00:35:59.998 --> 00:36:00.940
your peers as well.
00:36:00.980 --> 00:36:01.780
So your colleagues,
00:36:01.800 --> 00:36:03.742
your managers and things.
00:36:03.782 --> 00:36:09.407
And so I've been in jobs
00:36:09.427 --> 00:36:10.148
where they're like,
00:36:10.347 --> 00:36:10.967
you don't have to know
00:36:11.028 --> 00:36:12.630
anything about what
00:36:13.686 --> 00:36:16.288
you're selling,
00:36:16.327 --> 00:36:17.228
but you have to be willing
00:36:17.248 --> 00:36:18.228
to learn and you have to be
00:36:18.248 --> 00:36:19.949
a good person type of thing.
00:36:20.030 --> 00:36:21.090
So it's like, it's not,
00:36:21.210 --> 00:36:24.233
it's all of these people skills.
00:36:24.313 --> 00:36:25.594
Analytical thinking was one
00:36:25.614 --> 00:36:26.534
of the first ones that you said.
00:36:26.554 --> 00:36:27.375
And I was like, yes,
00:36:27.414 --> 00:36:28.456
that critical
00:36:28.476 --> 00:36:29.155
thinking, that problem-
00:36:29.175 --> 00:36:30.617
solving, skills that
00:36:30.657 --> 00:36:31.777
we as math teachers
00:36:31.797 --> 00:36:33.759
are always trying to teach,
00:36:33.798 --> 00:36:35.940
but that's also how you
00:36:35.960 --> 00:36:36.840
need to think when you're
00:36:36.860 --> 00:36:38.302
interacting with AI,
00:36:38.461 --> 00:36:39.422
because if you don't
00:36:39.463 --> 00:36:40.523
understand why it keeps
00:36:40.543 --> 00:36:42.485
giving you not what you're looking for,
00:36:42.963 --> 00:36:45.125
then you gotta learn from your mistakes.
00:36:45.144 --> 00:36:46.126
You gotta move on from that.
00:36:46.826 --> 00:36:48.166
And just using the
00:36:48.206 --> 00:36:50.047
technology that can be taught,
00:36:50.108 --> 00:36:51.528
like how to use something
00:36:51.568 --> 00:36:52.369
that can be taught.
00:36:52.730 --> 00:36:53.809
But there are other skills
00:36:53.849 --> 00:36:56.831
that are the soft skills
00:36:56.851 --> 00:36:58.353
that you're gonna use more every day.
00:37:01.257 --> 00:37:03.277
That's great. And I guess
00:37:03.358 --> 00:37:04.780
my answer is like I'm
00:37:04.840 --> 00:37:06.780
removing the technology out
00:37:06.800 --> 00:37:07.521
of the equation altogether
00:37:07.561 --> 00:37:08.922
with my answer,
00:37:09.543 --> 00:37:11.844
because I think my answer
00:37:11.864 --> 00:37:13.806
is more grounded about
00:37:14.027 --> 00:37:17.188
knowledge. So the more we go
00:37:17.228 --> 00:37:20.532
along as humans the more we
00:37:20.592 --> 00:37:22.574
learn how stuff works, right,
00:37:23.481 --> 00:37:26.423
on a social level, with science,
00:37:26.744 --> 00:37:28.505
with just all of it, right?
00:37:28.945 --> 00:37:32.469
And so that kind of unpacked
00:37:33.630 --> 00:37:35.190
the nuance and the layers
00:37:35.210 --> 00:37:39.614
that exist vertically and horizontally.
00:37:39.815 --> 00:37:44.157
And so as an example, you may many,
00:37:44.197 --> 00:37:46.780
many years ago be like, oh, that worker,
00:37:48.141 --> 00:37:49.242
they never contribute,
00:37:49.541 --> 00:37:52.204
or it takes them a while to get going.
00:37:54.039 --> 00:37:55.798
But now that might be a person of like,
00:37:56.019 --> 00:37:56.460
you know what,
00:37:56.639 --> 00:37:57.440
that's a person that's
00:37:57.480 --> 00:37:59.280
being really thoughtful or
00:37:59.340 --> 00:38:01.320
maybe they're more of an introvert,
00:38:01.380 --> 00:38:03.360
but they do have something to say.
00:38:03.842 --> 00:38:05.121
We just need to give them
00:38:05.141 --> 00:38:07.581
the tools and the means to do so.
00:38:08.583 --> 00:38:09.623
So unpacking and
00:38:09.762 --> 00:38:13.143
understanding more of how we are.
00:38:14.146 --> 00:38:16.469
within interpersonal communication,
00:38:17.108 --> 00:38:19.090
how we are in terms of how
00:38:19.271 --> 00:38:20.931
business and various
00:38:21.052 --> 00:38:22.492
organizations and school
00:38:22.532 --> 00:38:25.235
structures are built out
00:38:25.494 --> 00:38:27.597
and how they all work together.
00:38:28.336 --> 00:38:29.257
We're learning more about
00:38:29.378 --> 00:38:30.318
all of this stuff.
00:38:30.358 --> 00:38:32.460
So we do need things like
00:38:32.519 --> 00:38:33.780
creative solutions.
00:38:33.820 --> 00:38:36.262
We do need more empathy in the workplace.
00:38:36.302 --> 00:38:41.246
We do need this ability to not just...
00:38:42.405 --> 00:38:45.226
communicate X, Y, or Z, but do
00:38:45.286 --> 00:38:47.128
it in a way that is
00:38:47.188 --> 00:38:49.568
effective for everybody
00:38:49.648 --> 00:38:51.809
involved, right? And
00:38:51.889 --> 00:38:53.670
so to me, that's my answer:
00:38:53.791 --> 00:38:55.172
we know more.
00:38:56.434 --> 00:38:57.835
And once you lift up that
00:38:57.896 --> 00:38:59.115
rock and you see
00:38:59.197 --> 00:39:00.376
what's underneath that rock,
00:39:02.358 --> 00:39:03.137
you're going to have to do
00:39:03.197 --> 00:39:04.778
something different and do
00:39:04.818 --> 00:39:07.079
something more as to how
00:39:07.099 --> 00:39:08.061
we're preparing our
00:39:08.121 --> 00:39:09.442
students and just all of us
00:39:09.481 --> 00:39:11.463
in general of how to live
00:39:11.503 --> 00:39:12.342
with that knowledge.
00:39:13.003 --> 00:39:14.123
Like we're not living blind
00:39:14.784 --> 00:39:16.264
or as blind as we used to.
00:39:18.097 --> 00:39:19.538
I totally agree with both of you.
00:39:19.557 --> 00:39:21.559
There's multiple interpretations here.
00:39:22.739 --> 00:39:23.780
The one other thing I would
00:39:23.840 --> 00:39:26.461
add is that AI is going to
00:39:26.521 --> 00:39:28.402
do a lot of the things that
00:39:29.003 --> 00:39:29.744
we don't like.
00:39:29.784 --> 00:39:30.764
There's a lot, as you mentioned,
00:39:30.804 --> 00:39:32.045
I think one of you said soft skills.
00:39:32.065 --> 00:39:33.626
There's a lot of things that
00:39:33.666 --> 00:39:34.545
we can focus on.
00:39:34.965 --> 00:39:37.288
What AI can't do for us is
00:39:37.327 --> 00:39:39.108
what we still need humans to do.
00:39:40.750 --> 00:39:41.690
And the question that I
00:39:41.771 --> 00:39:43.753
often ask when I put this
00:39:43.793 --> 00:39:45.936
top ten list up with teachers is,
00:39:46.797 --> 00:39:47.619
are you teaching these
00:39:47.659 --> 00:39:49.722
skills now in your classroom?
00:39:50.804 --> 00:39:53.427
And if not, how would you teach them?
00:39:54.418 --> 00:39:55.659
And the other part of that is,
00:39:55.739 --> 00:39:56.539
are there things that you
00:39:56.559 --> 00:39:57.360
have to teach now that
00:39:57.380 --> 00:39:58.260
maybe aren't relevant
00:39:58.340 --> 00:40:00.163
anymore that you could make
00:40:00.643 --> 00:40:02.003
space for these things?
00:40:02.563 --> 00:40:05.666
So I'm curious about your own experiences.
00:40:05.706 --> 00:40:06.106
If you think,
00:40:06.347 --> 00:40:06.987
I know you don't have the
00:40:07.027 --> 00:40:07.768
list in front of you.
00:40:07.809 --> 00:40:08.768
I can put it in the chat for
00:40:08.829 --> 00:40:09.489
you guys to see.
00:40:09.869 --> 00:40:13.253
But are you teaching these things?
00:40:13.432 --> 00:40:14.574
And do you think they're being taught?
00:40:17.635 --> 00:40:19.637
So I just did a professional
00:40:19.657 --> 00:40:22.260
learning session this week on...
00:40:24.795 --> 00:40:26.958
We touched most of the things on here,
00:40:27.818 --> 00:40:28.900
and it was grounded on the
00:40:28.940 --> 00:40:31.403
topic of design thinking, right?
00:40:31.423 --> 00:40:34.347
So getting students to solve
00:40:34.407 --> 00:40:35.789
real-world issues by
00:40:36.048 --> 00:40:37.670
empathizing with the end
00:40:37.811 --> 00:40:40.655
user or with the stakeholders.
00:40:41.797 --> 00:40:43.719
How do we redefine what that
00:40:43.760 --> 00:40:46.242
problem is, the ideation,
00:40:46.282 --> 00:40:47.744
prototype, testing out in
00:40:47.764 --> 00:40:50.586
the community, boom. I
00:40:50.706 --> 00:40:51.887
used that a lot in the
00:40:51.927 --> 00:40:52.869
classroom when I was
00:40:53.088 --> 00:40:53.650
in the high school
00:40:53.670 --> 00:40:56.152
classroom. But I will say
00:40:56.211 --> 00:40:59.195
that I don't believe that
00:40:59.235 --> 00:41:00.536
these things are being
00:41:00.617 --> 00:41:03.380
taught consistently and
00:41:03.420 --> 00:41:05.402
purposefully across like
00:41:06.487 --> 00:41:08.748
all of it. Yeah, because
00:41:08.768 --> 00:41:11.568
at my school, I
00:41:11.590 --> 00:41:12.929
was the only oddball that
00:41:12.969 --> 00:41:14.650
was using this thing that
00:41:14.809 --> 00:41:16.771
people were like, ew, no, that
00:41:16.811 --> 00:41:19.811
sounds like work. It was
00:41:19.871 --> 00:41:21.192
work but it was fun work
00:41:21.992 --> 00:41:23.233
and you know there's all
00:41:23.253 --> 00:41:24.393
these different reasons why
00:41:24.672 --> 00:41:27.134
I taught with it,
00:41:27.173 --> 00:41:29.454
that I won't mention now,
00:41:29.514 --> 00:41:30.494
but that's kind of like my
00:41:30.875 --> 00:41:32.516
answer: is
00:41:32.536 --> 00:41:33.635
this work being done
00:41:35.427 --> 00:41:36.427
Maybe to an extent.
00:41:37.128 --> 00:41:38.128
But I don't know
00:41:38.228 --> 00:41:41.648
if Sophie has a different take on that.
00:41:41.929 --> 00:41:43.688
Um, I would agree with what you said,
00:41:44.670 --> 00:41:45.110
Steve.
00:41:45.210 --> 00:41:47.349
And I also, I'm looking at this list now,
00:41:47.369 --> 00:41:49.650
cause I'm a very visual, um, person.
00:41:49.670 --> 00:41:50.650
I'm looking at this list now.
00:41:50.710 --> 00:41:52.492
And a lot of this aligns
00:41:52.592 --> 00:41:56.492
with the mathematical habits of mind:
00:41:56.552 --> 00:42:00.313
resilience, persevering, and
00:42:00.393 --> 00:42:01.753
analytical thinking, and
00:42:02.481 --> 00:42:06.543
self-awareness so you can understand,
00:42:06.603 --> 00:42:08.784
like, speak to why you did it,
00:42:08.844 --> 00:42:10.125
speak to why you solved the
00:42:10.144 --> 00:42:11.045
problem in such a way.
00:42:12.005 --> 00:42:12.967
So I would say that if
00:42:13.106 --> 00:42:15.268
teachers are teaching in a
00:42:15.327 --> 00:42:17.409
more student-centered way
00:42:17.449 --> 00:42:18.889
to get them to do more of
00:42:18.909 --> 00:42:20.170
the thinking rather than,
00:42:20.590 --> 00:42:21.851
and this is for every classroom,
00:42:22.092 --> 00:42:23.733
don't just teach them, do this, this,
00:42:23.813 --> 00:42:25.594
this, this, this is the process,
00:42:26.434 --> 00:42:28.054
do this again and again and again,
00:42:28.114 --> 00:42:29.335
and you will always have success.
00:42:29.376 --> 00:42:31.257
But teaching the why behind it,
00:42:31.721 --> 00:42:34.262
is going to move into more
00:42:34.302 --> 00:42:38.423
of these curiosity and
00:42:39.864 --> 00:42:41.905
creative thinking so that
00:42:41.945 --> 00:42:43.106
they can do that themselves.
00:42:43.266 --> 00:42:44.427
And that is something that
00:42:45.088 --> 00:42:46.108
the traditional way of
00:42:46.168 --> 00:42:50.251
teaching is going out as
00:42:50.371 --> 00:42:53.032
more and more teachers retire, hopefully,
00:42:54.253 --> 00:42:55.713
and that we are moving our
00:42:55.753 --> 00:42:56.693
classrooms to be more of a
00:42:56.753 --> 00:42:57.594
student-centered,
00:42:58.394 --> 00:43:00.576
whether it's problem-based learning or
00:43:01.327 --> 00:43:02.628
group work and collaboration,
00:43:04.309 --> 00:43:05.391
however it is that you do it,
00:43:05.431 --> 00:43:07.731
but where the teacher is
00:43:07.791 --> 00:43:08.952
not the only one exhausted
00:43:08.992 --> 00:43:09.693
at the end of the day.
00:43:11.273 --> 00:43:13.255
That is the goal.
00:43:13.275 --> 00:43:14.675
That's the goal for sure.
00:43:15.556 --> 00:43:17.458
And if those classrooms are
00:43:17.498 --> 00:43:18.759
doing more of those things
00:43:18.818 --> 00:43:19.518
and they are going to be
00:43:19.539 --> 00:43:20.260
teaching more of this,
00:43:20.300 --> 00:43:22.802
maybe not specifically,
00:43:22.961 --> 00:43:25.722
but through those projects
00:43:25.742 --> 00:43:27.143
and those activities that you do,
00:43:27.483 --> 00:43:28.204
you'll be doing that.
00:43:32.864 --> 00:43:33.324
Yeah.
00:43:33.403 --> 00:43:34.545
Yeah.
00:43:34.565 --> 00:43:34.644
Well,
00:43:34.704 --> 00:43:36.125
and then because Sophie made the
00:43:36.164 --> 00:43:37.365
comment of like, you know,
00:43:37.405 --> 00:43:38.885
it shouldn't be the teacher being the one,
00:43:38.925 --> 00:43:40.206
the only one that's tired.
00:43:41.067 --> 00:43:42.628
I would also say like the teacher,
00:43:42.648 --> 00:43:44.289
because teachers love their
00:43:44.329 --> 00:43:45.528
content for the most part.
00:43:45.608 --> 00:43:45.809
Right.
00:43:45.869 --> 00:43:47.650
Like also I would say like
00:43:48.250 --> 00:43:49.670
that it shouldn't be just
00:43:49.690 --> 00:43:50.670
the teacher that's having
00:43:50.771 --> 00:43:52.092
fun in that classroom.
00:43:52.391 --> 00:43:53.311
Right.
00:43:54.132 --> 00:43:54.992
They shouldn't be the only
00:43:55.213 --> 00:43:57.333
ones that are curious
00:43:57.793 --> 00:43:58.873
within their classrooms.
00:43:59.375 --> 00:44:01.936
And so that's the other piece too,
00:44:02.016 --> 00:44:02.496
is like,
00:44:03.518 --> 00:44:06.362
I've heard over the years, where they're like, oh,
00:44:06.402 --> 00:44:08.724
my kids are just so disengaged.
00:44:08.744 --> 00:44:10.224
I'm like, oh, so what's going on?
00:44:11.186 --> 00:44:12.367
They just, you know,
00:44:13.447 --> 00:44:14.108
they're not asking
00:44:14.148 --> 00:44:15.971
questions while I'm lecturing at them.
00:44:15.990 --> 00:44:18.632
I'm like, yeah.
00:44:19.313 --> 00:44:20.494
You've lost it.
00:44:20.514 --> 00:44:21.516
You answered your own question.
00:44:22.137 --> 00:44:22.396
Yeah.
00:44:22.416 --> 00:44:23.538
Why are they disengaged?
00:44:24.402 --> 00:44:25.463
Why are they disengaged?
00:44:25.523 --> 00:44:25.744
Yeah.
00:44:26.945 --> 00:44:27.144
Mark,
00:44:27.625 --> 00:44:30.166
just one quick question before we hit
00:44:30.387 --> 00:44:33.809
the tail end of this episode.
00:44:34.449 --> 00:44:36.092
You did mention this a
00:44:36.132 --> 00:44:37.632
little bit in several
00:44:37.652 --> 00:44:38.572
different spots here,
00:44:38.773 --> 00:44:41.856
but teachers' roles.
00:44:42.817 --> 00:44:44.217
What's the teacher role with all this?
00:44:44.458 --> 00:44:45.898
And then that concern,
00:44:45.958 --> 00:44:47.300
because I heard this last
00:44:47.340 --> 00:44:48.780
weekend at a conference.
00:44:48.860 --> 00:44:49.942
I heard an educator be like,
00:44:50.628 --> 00:44:51.849
You know AI is going to take
00:44:52.110 --> 00:44:54.532
over all of our jobs as teachers.
00:44:56.572 --> 00:44:58.454
And so just curious,
00:44:58.554 --> 00:44:59.916
maybe get a little bit
00:44:59.976 --> 00:45:03.677
deeper into why that's not true.
00:45:03.717 --> 00:45:06.019
It sounds like from your opinion,
00:45:06.179 --> 00:45:09.262
from your perspective, that it's not.
00:45:09.322 --> 00:45:09.661
Yeah.
00:45:10.583 --> 00:45:11.443
And not just my opinion,
00:45:11.483 --> 00:45:13.545
but the data seems to show that as well.
00:45:13.804 --> 00:45:14.686
Teachers are worried about
00:45:14.726 --> 00:45:15.467
losing their jobs.
00:45:16.474 --> 00:45:18.135
and just like everybody I think is,
00:45:19.195 --> 00:45:19.855
but teachers really
00:45:19.875 --> 00:45:21.335
shouldn't be worried about it.
00:45:21.956 --> 00:45:23.115
The data shows that there's
00:45:23.175 --> 00:45:23.836
probably actually going to
00:45:23.856 --> 00:45:25.976
be growth in the need for
00:45:26.016 --> 00:45:27.717
teachers in the coming years.
00:45:28.577 --> 00:45:29.378
There's a really interesting
00:45:29.438 --> 00:45:32.420
website called WillRobotsTakeMyJob.com.
00:45:32.820 --> 00:45:33.840
It's kind of a silly name,
00:45:34.340 --> 00:45:35.880
but it's a cool site.
00:45:36.440 --> 00:45:37.280
You can go there and you can
00:45:37.320 --> 00:45:38.342
search your job
00:45:39.284 --> 00:45:41.065
and it'll tell you what the
00:45:41.184 --> 00:45:42.166
odds are of you being
00:45:42.206 --> 00:45:43.226
replaced by a robot.
00:45:44.186 --> 00:45:45.487
Teaching is very low and
00:45:45.788 --> 00:45:46.809
it actually says that
00:45:46.909 --> 00:45:49.411
it's a growth field.
00:45:49.871 --> 00:45:51.572
And it's funny when you go to that website,
00:45:52.172 --> 00:45:53.293
it shows you like the most
00:45:53.572 --> 00:45:55.795
searched jobs and teaching
00:45:55.835 --> 00:45:57.115
is one of them.
00:45:57.255 --> 00:45:58.195
People are worried about it,
00:45:58.215 --> 00:45:59.277
but they really shouldn't be.
00:46:00.358 --> 00:46:01.478
The numbers show that
00:46:01.538 --> 00:46:02.338
there's probably going to
00:46:02.378 --> 00:46:03.139
be growth there.
00:46:03.739 --> 00:46:06.280
And I always think, we need more teachers,
00:46:07.641 --> 00:46:08.641
I don't know about in your states,
00:46:08.681 --> 00:46:09.282
but in Missouri,
00:46:09.443 --> 00:46:10.503
we have a teacher shortage.
00:46:11.563 --> 00:46:12.943
And I think about if you're
00:46:12.963 --> 00:46:14.244
in college right now and you're like,
00:46:14.384 --> 00:46:15.384
what kind of career should
00:46:15.403 --> 00:46:16.664
I go into where I'm not
00:46:16.684 --> 00:46:18.105
going to lose my job to a robot?
00:46:18.125 --> 00:46:21.125
I'm not going to be out of work in a few years?
00:46:21.164 --> 00:46:22.045
Sell them on teaching.
00:46:22.806 --> 00:46:24.025
And there's so many things
00:46:24.065 --> 00:46:26.547
like if you want human connection,
00:46:26.927 --> 00:46:27.527
you want to make a
00:46:27.567 --> 00:46:28.686
difference in the world,
00:46:29.306 --> 00:46:31.108
and you want a job where
00:46:31.128 --> 00:46:33.887
you're not going to be replaced by AI,
00:46:34.588 --> 00:46:35.289
look to teaching.
00:46:36.528 --> 00:46:37.409
It really should be a
00:46:37.449 --> 00:46:38.891
selling point because
00:46:39.030 --> 00:46:39.972
humans can do so many
00:46:40.012 --> 00:46:41.672
things that AI can't do. AI
00:46:41.693 --> 00:46:43.815
is great; there's so much it
00:46:43.835 --> 00:46:45.996
can do for education, but
00:46:46.036 --> 00:46:46.956
there's so many things that
00:46:47.097 --> 00:46:48.737
we still need human
00:46:48.777 --> 00:46:50.179
beings for. We need that human
00:46:50.259 --> 00:46:51.559
interaction with our kids
00:46:51.619 --> 00:46:54.202
in the classroom.
00:46:54.222 --> 00:46:55.762
There have been a few cases where
00:46:56.744 --> 00:46:57.804
small schools, private
00:46:57.844 --> 00:46:59.264
schools, have said, we're
00:46:59.284 --> 00:47:00.005
going to get rid of our
00:47:00.025 --> 00:47:00.905
teachers and we're going to
00:47:01.085 --> 00:47:03.485
use AI instead. And it
00:47:03.525 --> 00:47:05.726
hasn't worked because maybe
00:47:06.047 --> 00:47:08.027
kids are learning content
00:47:08.706 --> 00:47:10.027
but they don't have those
00:47:10.228 --> 00:47:11.427
other kinds of skills, those
00:47:11.467 --> 00:47:12.527
other kinds of benefits that
00:47:12.547 --> 00:47:13.909
they get from having an
00:47:14.009 --> 00:47:16.789
actual human teacher. So if
00:47:16.809 --> 00:47:18.150
you're a teacher, you
00:47:18.170 --> 00:47:18.829
shouldn't worry about
00:47:18.869 --> 00:47:19.989
losing your job. There's
00:47:20.070 --> 00:47:22.231
plenty of jobs for us.
00:47:25.483 --> 00:47:26.425
And you made the important
00:47:26.465 --> 00:47:28.407
point of there's so many
00:47:28.467 --> 00:47:32.969
things that we could do that AI cannot.
00:47:33.891 --> 00:47:35.733
To what degree will that always be true?
00:47:35.793 --> 00:47:37.054
And I'm kind of just pushing
00:47:37.094 --> 00:47:39.375
back just because this kind
00:47:39.414 --> 00:47:40.436
of counter arguments that
00:47:40.615 --> 00:47:42.297
I've heard is like, well,
00:47:42.617 --> 00:47:43.838
the AI has grown,
00:47:44.675 --> 00:47:46.637
has gotten more
00:47:46.898 --> 00:47:48.460
sophisticated over the last
00:47:48.500 --> 00:47:50.583
few years, right? At least
00:47:50.603 --> 00:47:52.284
to our measure. Will it
00:47:52.344 --> 00:47:53.686
continue to grow in that
00:47:53.726 --> 00:47:55.708
way to where this
00:47:55.768 --> 00:47:57.471
conversation changes a little bit?
00:47:59.012 --> 00:48:00.815
I think it could to a degree,
00:48:01.355 --> 00:48:04.358
but it's never going to be
00:48:04.518 --> 00:48:05.818
able to do everything that
00:48:05.918 --> 00:48:06.880
a human can do.
00:48:07.481 --> 00:48:09.342
And I'm trying to pull up
00:48:09.382 --> 00:48:10.623
something that I really – okay,
00:48:10.643 --> 00:48:11.063
here it is.
00:48:11.943 --> 00:48:13.306
There's this guy named Doan
00:48:13.346 --> 00:48:16.007
Winkle on LinkedIn that I follow.
00:48:16.748 --> 00:48:18.690
He's a big advocate for AI and teaching.
00:48:19.650 --> 00:48:21.311
And I got to Zoom
00:48:21.331 --> 00:48:22.592
with him once too. But he
00:48:22.632 --> 00:48:23.833
had this great post a
00:48:23.853 --> 00:48:25.135
couple weeks ago that I
00:48:25.155 --> 00:48:25.894
just want to read to you. He
00:48:25.914 --> 00:48:27.396
said: AI won't replace great
00:48:27.436 --> 00:48:29.077
teachers; it will replace
00:48:29.137 --> 00:48:31.378
those who resist change. AI
00:48:31.398 --> 00:48:33.119
shares info, but you shape
00:48:33.159 --> 00:48:34.940
character. AI can analyze
00:48:34.981 --> 00:48:37.663
data, but you build trust. AI
00:48:37.682 --> 00:48:39.083
assists learning, but you
00:48:39.123 --> 00:48:40.405
guide growth. AI gives
00:48:40.485 --> 00:48:42.666
answers, but you spark curiosity.
00:48:42.925 --> 00:48:44.786
AI tracks progress, but you
00:48:44.847 --> 00:48:46.628
celebrate wins.
00:48:47.509 --> 00:48:48.530
So I really love that.
00:48:48.710 --> 00:48:50.192
Because I feel like no
00:48:50.211 --> 00:48:52.213
matter how good our technology gets,
00:48:52.653 --> 00:48:53.655
it's never going to be able
00:48:53.695 --> 00:48:55.036
to do those things, at least
00:48:55.097 --> 00:48:57.518
not in a way that feels authentic, right?
00:48:57.938 --> 00:48:59.900
Like, so that's,
00:49:00.001 --> 00:49:00.862
that's why we're going to
00:49:00.902 --> 00:49:02.302
need human teachers,
00:49:02.744 --> 00:49:04.646
no matter how good the technology gets,
00:49:05.106 --> 00:49:05.887
we're still going to need
00:49:05.947 --> 00:49:07.789
humans to guide humans.
00:49:09.126 --> 00:49:10.347
Wow.
00:49:10.768 --> 00:49:12.751
Mark, I know you cited someone else,
00:49:12.971 --> 00:49:15.092
but that was a really good answer.
00:49:15.233 --> 00:49:16.175
Powerful, yes.
00:49:16.235 --> 00:49:18.157
Yeah, I don't get credit for those words,
00:49:18.197 --> 00:49:19.538
but I feel like that
00:49:19.617 --> 00:49:25.844
summarizes best why we need teachers.
00:49:25.985 --> 00:49:26.626
Amazing.
00:49:27.164 --> 00:49:27.445
Yeah.
00:49:28.266 --> 00:49:28.646
Awesome.
00:49:28.806 --> 00:49:29.027
Well,
00:49:29.367 --> 00:49:31.331
we're getting towards the end of this
00:49:31.510 --> 00:49:32.311
episode.
00:49:32.853 --> 00:49:33.793
And Mark,
00:49:33.853 --> 00:49:35.876
I know you've listened to the show before,
00:49:35.976 --> 00:49:37.278
so I'm sure you're ready.
00:49:38.079 --> 00:49:39.121
It could be something that
00:49:39.181 --> 00:49:41.123
we kind of touched on or it
00:49:41.143 --> 00:49:42.005
can be something different.
00:49:43.661 --> 00:49:44.762
But really curious,
00:49:45.182 --> 00:49:47.425
and we're all waiting on, you know,
00:49:48.226 --> 00:49:49.927
what is top of mind for you right now?
00:49:50.007 --> 00:49:51.068
What is under the hat?
00:49:51.588 --> 00:49:52.190
What is something in
00:49:52.250 --> 00:49:53.271
education that we need to
00:49:53.291 --> 00:49:54.652
be talking more about?
00:49:55.052 --> 00:49:56.733
Or perhaps what is something
00:49:56.753 --> 00:49:58.476
that you feel like really
00:49:58.516 --> 00:49:59.556
like you need to let out?
00:50:00.498 --> 00:50:01.978
Really curious as to hear
00:50:02.018 --> 00:50:02.920
what's under the hat for you.
00:50:04.260 --> 00:50:05.061
Well, I'm disappointed.
00:50:05.201 --> 00:50:05.762
I thought we were going to
00:50:05.782 --> 00:50:06.541
talk about hats.
00:50:06.621 --> 00:50:08.143
I thought this podcast was about hats,
00:50:08.182 --> 00:50:08.882
and we didn't get to talk
00:50:08.902 --> 00:50:09.784
about hats today.
00:50:09.884 --> 00:50:11.545
But it's funny.
00:50:11.565 --> 00:50:12.445
I had a bunch of hats on my
00:50:12.485 --> 00:50:13.545
desk yesterday because I
00:50:13.585 --> 00:50:15.086
did this little demo about
00:50:15.106 --> 00:50:16.527
how I wear a lot of hats at work,
00:50:16.606 --> 00:50:17.987
and I took them home.
00:50:21.670 --> 00:50:23.170
Top of mind in education,
00:50:23.530 --> 00:50:25.052
it kind of ties into AI,
00:50:25.492 --> 00:50:26.952
but I just wonder about...
00:50:28.512 --> 00:50:29.952
they say kids more and more
00:50:30.012 --> 00:50:31.552
don't see the relevance of
00:50:31.592 --> 00:50:32.873
what they're learning at school.
00:50:34.215 --> 00:50:36.315
And I know in my state,
00:50:36.635 --> 00:50:37.815
there's a big debate right
00:50:37.856 --> 00:50:39.277
now about cursive writing
00:50:39.717 --> 00:50:40.336
and whether we should
00:50:40.376 --> 00:50:41.657
mandate that kids have to
00:50:41.697 --> 00:50:43.557
learn cursive writing in school or not.
00:50:44.398 --> 00:50:44.639
And
00:50:45.597 --> 00:50:46.818
I'm not saying it's bad to
00:50:46.878 --> 00:50:48.181
learn cursive writing,
00:50:48.840 --> 00:50:50.202
but when I ask people,
00:50:50.402 --> 00:50:53.385
why do you think it's so important?
00:50:53.766 --> 00:50:54.586
Their answers are, well,
00:50:54.606 --> 00:50:55.547
because I had to learn it
00:50:56.708 --> 00:50:57.329
because they're going to
00:50:57.369 --> 00:50:59.271
have to write checks.
00:50:59.472 --> 00:50:59.692
I'm like,
00:50:59.731 --> 00:51:00.833
when's the last time you wrote a check?
00:51:00.853 --> 00:51:02.594
They're going to have to
00:51:02.655 --> 00:51:04.996
sign things like that.
00:51:05.016 --> 00:51:07.699
I just think we need to
00:51:07.719 --> 00:51:09.121
think about the world we're living in.
00:51:10.048 --> 00:51:12.610
and what our kids need from it now,
00:51:13.110 --> 00:51:14.170
because it's relevant,
00:51:14.210 --> 00:51:15.731
because it's what they're going into.
00:51:16.092 --> 00:51:18.213
This idea of, well, I learned,
00:51:18.333 --> 00:51:19.253
I had to memorize the
00:51:19.293 --> 00:51:21.235
states and capitals when I was a kid.
00:51:22.315 --> 00:51:24.876
but now you can Google it, right?
00:51:24.916 --> 00:51:26.635
So why are there, you know,
00:51:26.655 --> 00:51:27.836
we talked about the skills
00:51:27.856 --> 00:51:30.197
we need to be teaching, but I think we,
00:51:30.536 --> 00:51:31.717
and by we, I mean
00:51:31.818 --> 00:51:34.297
the leaders of the districts,
00:51:34.338 --> 00:51:35.438
the people who put the
00:51:35.478 --> 00:51:37.338
curriculum together, the state standards,
00:51:37.438 --> 00:51:38.199
they need to be thinking
00:51:38.219 --> 00:51:39.860
about what don't we need to
00:51:39.900 --> 00:51:40.599
teach anymore?
00:51:41.119 --> 00:51:41.960
What's outdated?
00:51:42.041 --> 00:51:42.900
What can we just,
00:51:43.867 --> 00:51:45.728
Google in two seconds and
00:51:45.748 --> 00:51:46.670
get an answer for.
00:51:48.030 --> 00:51:49.271
So this is kind of my,
00:51:49.472 --> 00:51:51.672
this is not representing my employer.
00:51:51.693 --> 00:51:53.153
This is just my personal opinion here,
00:51:53.193 --> 00:51:55.335
but we really need to look at, you know,
00:51:55.715 --> 00:51:57.097
what don't we need to teach anymore?
00:51:57.617 --> 00:51:58.597
What's not relevant?
00:51:58.918 --> 00:52:02.280
Things like that.
00:52:02.300 --> 00:52:02.480
Yeah.
00:52:02.519 --> 00:52:03.701
I always hear like what
00:52:03.760 --> 00:52:04.882
needs to be included.
00:52:06.463 --> 00:52:07.704
It's interesting of like,
00:52:08.873 --> 00:52:10.074
what shouldn't be there anymore.
00:52:11.554 --> 00:52:14.594
There's only so much time to teach.
00:52:15.315 --> 00:52:16.175
We never get done.
00:52:16.195 --> 00:52:18.036
We never address everything we want to.
00:52:18.615 --> 00:52:19.795
But there are some things where we're like,
00:52:19.815 --> 00:52:19.936
well,
00:52:19.956 --> 00:52:21.757
we just have to do this because the
00:52:21.777 --> 00:52:22.757
state says so or the
00:52:22.797 --> 00:52:24.697
district says so or whatever.
00:52:25.577 --> 00:52:27.557
You made me think back to
00:52:27.637 --> 00:52:29.257
when I was a student in
00:52:29.297 --> 00:52:30.759
high school and then when I
00:52:30.958 --> 00:52:32.619
went and became a math teacher.
00:52:32.798 --> 00:52:35.679
And in math class, I remember
00:52:38.005 --> 00:52:39.065
a teacher saying,
00:52:39.465 --> 00:52:40.226
you're not going to have a
00:52:40.266 --> 00:52:43.226
calculator in your pocket.
00:52:43.306 --> 00:52:44.527
But we do.
00:52:45.086 --> 00:52:45.746
But we do.
00:52:45.806 --> 00:52:46.967
And then I remember being
00:52:47.027 --> 00:52:48.307
like thinking that in my
00:52:48.347 --> 00:52:49.969
head to say it to my kids,
00:52:50.028 --> 00:52:51.548
my first like three years
00:52:51.949 --> 00:52:53.289
of teaching and being like,
00:52:53.309 --> 00:52:53.949
wait a second.
00:52:55.184 --> 00:52:57.626
we do have a calculator in our pocket.
00:52:57.646 --> 00:52:58.206
Great.
00:52:58.266 --> 00:53:02.248
Joke's on them.
00:53:02.288 --> 00:53:03.168
Joke's on them.
00:53:03.248 --> 00:53:04.568
I didn't have to memorize that.
00:53:04.869 --> 00:53:05.289
But yeah,
00:53:05.349 --> 00:53:06.769
so it's one of those things where
00:53:06.789 --> 00:53:07.230
it's like,
00:53:07.429 --> 00:53:10.190
what is needed now in today's world?
00:53:10.490 --> 00:53:13.012
And just to go back to the
00:53:13.253 --> 00:53:13.932
cursive writing,
00:53:14.420 --> 00:53:15.840
Our historical documents are
00:53:15.880 --> 00:53:16.681
in cursive writing.
00:53:16.701 --> 00:53:17.742
We need to know how to read them.
00:53:18.001 --> 00:53:18.902
We need to be able to.
00:53:19.862 --> 00:53:21.202
But I guess I could do that.
00:53:21.302 --> 00:53:22.423
Maybe a picture of it.
00:53:22.523 --> 00:53:23.423
It'll translate it for you.
00:53:23.443 --> 00:53:23.744
Yeah.
00:53:23.764 --> 00:53:25.164
Yeah.
00:53:25.184 --> 00:53:27.005
I'm not anti-cursive writing by any means.
00:53:27.045 --> 00:53:28.585
That's just a hot topic
00:53:28.626 --> 00:53:30.007
right now in our state is
00:53:30.027 --> 00:53:31.088
if they're going to pass a
00:53:31.128 --> 00:53:33.148
law that schools have to teach that.
00:53:33.228 --> 00:53:33.568
I'm like,
00:53:34.208 --> 00:53:36.710
of all things that kids need to
00:53:36.730 --> 00:53:37.309
know right now.
00:53:37.329 --> 00:53:40.612
Well, my state just brought it back.
00:53:42.465 --> 00:53:43.646
in the elementary grade.
00:53:43.686 --> 00:53:46.126
So Riley's going to learn cursive, I think,
00:53:46.987 --> 00:53:47.646
next year.
00:53:48.947 --> 00:53:50.027
And she's kind of excited
00:53:50.068 --> 00:53:52.108
about it because she kind of sees it.
00:53:53.009 --> 00:53:53.969
And that's cool from a
00:53:54.030 --> 00:53:55.030
curiosity standpoint.
00:53:55.050 --> 00:53:57.311
Be like, oh, yeah.
00:53:57.510 --> 00:54:00.472
I'm going to have to brush up, I think.
00:54:00.731 --> 00:54:03.092
There are a few letters that I'm like,
00:54:03.132 --> 00:54:04.552
how do you do a Q?
00:54:04.612 --> 00:54:06.094
I haven't done a Q in so long.
00:54:06.173 --> 00:54:06.733
You know what I mean?
00:54:06.773 --> 00:54:07.514
It's stuff like that.
00:54:09.588 --> 00:54:10.809
Now, it's super fascinating.
00:54:11.210 --> 00:54:11.811
Mark,
00:54:12.072 --> 00:54:15.115
one thing I'm going to put it up for
00:54:15.155 --> 00:54:17.699
those that are watching and
00:54:17.719 --> 00:54:19.922
perhaps not listening.
00:54:19.943 --> 00:54:21.766
The AI toolkit.
00:54:24.969 --> 00:54:25.871
Tell us a little bit about this.
00:54:26.891 --> 00:54:29.092
Yeah, so last year,
00:54:30.273 --> 00:54:31.175
here at Missouri School
00:54:31.195 --> 00:54:32.074
Boards Association,
00:54:32.534 --> 00:54:34.376
I'm sort of the head AI guy,
00:54:34.396 --> 00:54:35.277
but we have a committee.
00:54:36.297 --> 00:54:37.398
We just saw that there was
00:54:37.438 --> 00:54:39.139
really a need for something like this.
00:54:39.179 --> 00:54:39.960
It's called a toolkit,
00:54:39.980 --> 00:54:41.221
but it's basically an e-book.
00:54:42.360 --> 00:54:43.521
It's totally free.
00:54:43.661 --> 00:54:44.902
You don't have to sign up for it.
00:54:44.943 --> 00:54:45.884
You don't have to.
00:54:46.423 --> 00:54:47.864
There's no ads or anything.
00:54:48.885 --> 00:54:49.746
Some of the things you'll see,
00:54:49.766 --> 00:54:50.766
there's Eliza that I told
00:54:50.786 --> 00:54:51.447
you about earlier.
00:54:53.467 --> 00:54:55.009
So this is just a place for
00:54:55.048 --> 00:54:56.210
us to house all of our
00:54:56.289 --> 00:54:58.090
information. So the first
00:54:58.150 --> 00:54:59.211
part is really just
00:54:59.351 --> 00:55:02.012
understanding AI, for
00:55:02.032 --> 00:55:02.693
people who don't know
00:55:02.713 --> 00:55:03.954
anything about it. I am not
00:55:04.014 --> 00:55:05.956
like a techie guy. I mean,
00:55:05.976 --> 00:55:07.255
I like technology, but I
00:55:07.476 --> 00:55:09.057
don't talk in that way, you
00:55:09.077 --> 00:55:10.659
know, I speak plain English
00:55:11.099 --> 00:55:12.980
the best I can. So we try to
00:55:13.019 --> 00:55:14.280
make it so that people can
00:55:14.340 --> 00:55:15.842
understand these things.
00:55:17.083 --> 00:55:19.164
So there's understanding AI.
00:55:19.423 --> 00:55:21.146
There's a whole section on policy,
00:55:21.545 --> 00:55:23.507
section on safety, ethics.
00:55:24.628 --> 00:55:26.309
There's videos in there.
00:55:26.369 --> 00:55:28.871
There's references, links,
00:55:28.911 --> 00:55:29.793
things like that.
00:55:29.813 --> 00:55:30.693
There's a whole section
00:55:30.733 --> 00:55:32.735
about the ways that we can
00:55:32.875 --> 00:55:34.655
use AI for good and the
00:55:34.695 --> 00:55:36.237
concerns in schools.
00:55:36.257 --> 00:55:37.778
I always try to present a
00:55:37.838 --> 00:55:38.659
balanced approach.
00:55:40.521 --> 00:55:41.382
If you want to check it out,
00:55:41.442 --> 00:55:45.844
it's just mosba.org slash AI,
00:55:46.485 --> 00:55:48.505
mosba.org slash AI.
00:55:48.965 --> 00:55:50.847
It's a totally free resource.
00:55:51.867 --> 00:55:53.168
We add to it all the time.
00:55:53.688 --> 00:55:55.170
Love to hear your thoughts on it.
00:55:55.431 --> 00:55:57.751
If there's something we haven't covered,
00:55:58.112 --> 00:55:59.532
my email address is in there.
00:55:59.632 --> 00:56:00.554
But yeah.
00:56:02.570 --> 00:56:03.190
This is so cool.
00:56:04.132 --> 00:56:05.932
I will include this in the show notes.
00:56:06.472 --> 00:56:07.233
I'll have it linked up.
00:56:07.853 --> 00:56:07.974
Yeah,
00:56:07.994 --> 00:56:09.195
I'm sure a lot of folks like I'm
00:56:09.215 --> 00:56:10.974
going to dig into this a little bit more.
00:56:11.815 --> 00:56:12.036
Mark,
00:56:12.115 --> 00:56:13.317
I know this is part of how we got
00:56:13.356 --> 00:56:16.679
connected originally online via LinkedIn.
00:56:16.778 --> 00:56:17.838
I think you had done a post
00:56:17.978 --> 00:56:19.460
on it and I was like, oh, yeah,
00:56:19.480 --> 00:56:20.260
this is fascinating.
00:56:22.362 --> 00:56:23.481
So really appreciate that
00:56:23.641 --> 00:56:24.842
and how we got connected.
00:56:27.199 --> 00:56:28.119
I have to say, you know,
00:56:28.159 --> 00:56:29.378
LinkedIn has become a great
00:56:29.418 --> 00:56:32.599
place to talk about AI and education.
00:56:32.820 --> 00:56:34.139
And I used to just think of
00:56:34.179 --> 00:56:35.039
LinkedIn as where you went
00:56:35.059 --> 00:56:35.721
when you're looking for a
00:56:35.780 --> 00:56:37.681
job or you're looking to hire somebody,
00:56:37.740 --> 00:56:39.900
but there's so much great
00:56:40.221 --> 00:56:41.922
conversation on there now.
00:56:42.001 --> 00:56:43.461
I spent a lot of time on there.
00:56:44.262 --> 00:56:45.842
I posted about AI and I
00:56:45.862 --> 00:56:46.862
would just encourage folks
00:56:46.902 --> 00:56:47.782
to take a look.
00:56:48.302 --> 00:56:49.222
Kind of depends on who you
00:56:49.242 --> 00:56:50.422
follow and what you, you know,
00:56:50.483 --> 00:56:51.543
the hashtags you click on,
00:56:51.563 --> 00:56:52.284
but there's a lot of great
00:56:52.324 --> 00:56:53.744
content on LinkedIn these days.
00:56:54.605 --> 00:56:57.045
Well, they've expanded their,
00:56:57.525 --> 00:56:58.527
maybe I'm correct on this.
00:56:59.086 --> 00:57:01.608
They've expanded like the courses, right?
00:57:02.047 --> 00:57:04.048
And the things you can learn, like,
00:57:04.528 --> 00:57:06.630
and they have a whole course build.
00:57:07.050 --> 00:57:07.869
Sophie knows a little bit
00:57:07.929 --> 00:57:10.510
more about that than I do, I believe.
00:57:11.992 --> 00:57:12.271
But yeah,
00:57:12.311 --> 00:57:13.873
like you go there for a wide
00:57:13.932 --> 00:57:15.313
range of reasons.
00:57:15.452 --> 00:57:16.313
Yeah.
00:57:16.373 --> 00:57:16.574
Yeah.
00:57:17.601 --> 00:57:19.663
Very cool. Yeah, I
00:57:19.702 --> 00:57:20.784
learned from others, I get
00:57:20.804 --> 00:57:22.565
connected with others,
00:57:22.704 --> 00:57:24.385
people learn from each
00:57:24.445 --> 00:57:25.507
other there. It's sort of
00:57:25.586 --> 00:57:26.588
like a more professional
00:57:26.648 --> 00:57:27.527
TikTok, where you go on
00:57:27.568 --> 00:57:28.469
TikTok to try and learn how
00:57:28.509 --> 00:57:29.389
to do something, or a more
00:57:29.409 --> 00:57:30.530
professional YouTube. You go
00:57:30.550 --> 00:57:31.530
to YouTube to learn how to
00:57:31.990 --> 00:57:34.192
do this mechanical thing
00:57:34.211 --> 00:57:35.052
that you don't know how to
00:57:35.092 --> 00:57:36.614
do, but you go to LinkedIn
00:57:37.014 --> 00:57:38.434
to build your
00:57:38.474 --> 00:57:39.635
knowledge base, your
00:57:39.876 --> 00:57:42.177
professional
00:57:42.197 --> 00:57:44.297
toolkit, there now, and it's
00:57:44.358 --> 00:57:46.780
not just about jobs and
00:57:47.199 --> 00:57:51.523
that. It is an interesting place to be now.
00:57:51.643 --> 00:57:54.146
And I'm thrilled to see
00:57:54.166 --> 00:57:55.809
things that my connections
00:57:55.829 --> 00:57:56.809
are posting and to learn
00:57:56.849 --> 00:58:01.615
from them and to see common
00:58:02.076 --> 00:58:04.139
minds combined and see what
00:58:04.159 --> 00:58:05.079
they can produce there.
00:58:05.320 --> 00:58:05.760
So it's...
00:58:07.516 --> 00:58:08.358
I like it.
00:58:08.378 --> 00:58:08.818
I agree with you.
00:58:08.838 --> 00:58:11.679
Absolutely.
00:58:11.699 --> 00:58:12.619
I have to say one more thing.
00:58:12.639 --> 00:58:14.201
I know we're almost out of time,
00:58:14.240 --> 00:58:15.222
but last night I was
00:58:15.262 --> 00:58:16.242
talking to my grandma,
00:58:16.382 --> 00:58:17.003
my eighty-three-year-old
00:58:17.023 --> 00:58:17.822
grandma on the phone.
00:58:18.302 --> 00:58:18.884
She asked what I was going
00:58:18.903 --> 00:58:19.563
to do this weekend.
00:58:19.583 --> 00:58:20.844
I said, I'm going to be on a podcast.
00:58:20.864 --> 00:58:21.965
Do you know what a podcast is?
00:58:21.985 --> 00:58:22.905
She's like, oh, yeah,
00:58:22.925 --> 00:58:23.987
I know what a podcast is.
00:58:24.487 --> 00:58:24.706
She's like,
00:58:24.746 --> 00:58:25.327
are you going to be on the
00:58:25.347 --> 00:58:26.809
Kelce brothers podcast?
00:58:27.648 --> 00:58:28.208
You know, Travis
00:58:28.228 --> 00:58:29.809
Kelce's a big deal in Missouri.
00:58:29.829 --> 00:58:30.751
And she thought I was going
00:58:30.771 --> 00:58:31.791
to be on his podcast.
00:58:31.811 --> 00:58:32.192
I was like.
00:58:32.867 --> 00:58:33.349
No.
00:58:33.989 --> 00:58:35.130
This one's about education,
00:58:35.190 --> 00:58:37.092
but I'm sure it's just as popular.
00:58:38.032 --> 00:58:40.355
It's not as cool.
00:58:40.414 --> 00:58:42.215
My grandma knows what podcasts are.
00:58:45.759 --> 00:58:48.282
I kind of hear from that
00:58:48.342 --> 00:58:49.922
generation of like, oh,
00:58:50.903 --> 00:58:54.086
it's your version of radio.
00:58:54.567 --> 00:58:56.429
It's the radio thing, but now it's like...
00:58:57.425 --> 00:58:58.726
Just expand it a little bit.
00:59:00.807 --> 00:59:01.188
On demand.
00:59:01.208 --> 00:59:03.188
On demand, your topic selection radio.
00:59:03.248 --> 00:59:04.409
That's really what it is.
00:59:04.429 --> 00:59:06.190
Yeah.
00:59:06.251 --> 00:59:06.831
Very cool.
00:59:06.871 --> 00:59:07.172
Very cool.
00:59:07.632 --> 00:59:09.172
Well, Mark, this has been a ton of fun.
00:59:10.313 --> 00:59:11.134
I've learned a lot.
00:59:11.853 --> 00:59:13.394
We really appreciate you being here.
00:59:13.534 --> 00:59:15.797
So big shout out.
00:59:15.817 --> 00:59:16.197
Thank you.
00:59:16.257 --> 00:59:17.557
It's been great talking to both of you.
00:59:18.914 --> 00:59:19.273
Awesome.
00:59:19.594 --> 00:59:19.893
Awesome.
00:59:21.315 --> 00:59:21.594
Sophie,
00:59:21.614 --> 00:59:25.255
we do have like a kind of a big
00:59:25.295 --> 00:59:25.797
announcement,
00:59:25.896 --> 00:59:26.556
something that's already
00:59:26.577 --> 00:59:28.378
been out there on the interwebs.
00:59:29.557 --> 00:59:30.898
Let me see if I can pull it
00:59:30.938 --> 00:59:31.498
up really quick.
00:59:31.539 --> 00:59:32.298
But while I do that,
00:59:32.458 --> 00:59:33.260
you want to talk about what
00:59:33.300 --> 00:59:34.360
we're doing next weekend?
00:59:37.204 --> 00:59:37.945
Is it next weekend?
00:59:37.965 --> 00:59:38.744
Is it already next?
00:59:38.885 --> 00:59:39.726
It is next weekend.
00:59:39.826 --> 00:59:41.126
It's already next weekend.
00:59:41.166 --> 00:59:43.146
On March eighth.
00:59:43.186 --> 00:59:43.726
Holy cow.
00:59:43.846 --> 00:59:44.786
I am not ready for this.
00:59:45.306 --> 00:59:45.927
On March eighth,
00:59:45.947 --> 00:59:47.606
we are going to do an
00:59:47.666 --> 00:59:50.487
EduProtocols showdown, which, Mark,
00:59:50.568 --> 00:59:52.088
is an Under the Hat podcast
00:59:52.128 --> 00:59:53.447
version of March Madness,
00:59:53.827 --> 00:59:54.889
where we are featuring
00:59:55.028 --> 00:59:57.469
sixteen EduProtocols, and
00:59:57.489 --> 00:59:58.849
they're going to face off.
00:59:58.989 --> 01:00:01.909
So and next Saturday is
01:00:01.949 --> 01:00:02.730
going to be packed.
01:00:02.769 --> 01:00:06.030
That hour is going to be slam packed with.
01:00:07.590 --> 01:00:11.494
Two rounds of a March Madness bracket.
01:00:11.554 --> 01:00:12.635
So you're going to see all
01:00:12.675 --> 01:00:13.695
sixteen of the
01:00:13.715 --> 01:00:15.056
EduProtocols face off, and then
01:00:15.097 --> 01:00:16.197
you're going to see them again.
01:00:17.579 --> 01:00:19.059
Those winners of that first round,
01:00:19.099 --> 01:00:20.481
you'll see them again in a second round.
01:00:20.760 --> 01:00:22.101
So then you will know who is
01:00:22.121 --> 01:00:24.164
going to face off on March fifteenth.
01:00:24.577 --> 01:00:25.418
And on March,
01:00:25.438 --> 01:00:27.298
we will decide who is the
01:00:27.378 --> 01:00:29.460
EduProtocol of all EduProtocols,
01:00:30.780 --> 01:00:31.501
which one is.
01:00:31.721 --> 01:00:35.003
And I'm super excited about that.
01:00:35.083 --> 01:00:36.603
I'm super excited about
01:00:37.644 --> 01:00:39.545
collaborating with EduProtocols,
01:00:39.686 --> 01:00:42.967
John and his entire team and
01:00:44.251 --> 01:00:46.713
I'm excited to have sixteen
01:00:46.773 --> 01:00:48.675
people on the show all at one time.
01:00:48.715 --> 01:00:50.277
So that's going to be an interesting,
01:00:50.516 --> 01:00:50.956
it's going to be an
01:00:51.036 --> 01:00:52.057
interesting adventure that
01:00:52.177 --> 01:00:55.880
I am so excited to take on.
01:00:56.460 --> 01:00:57.641
And Steve, have you found it yet?
01:00:59.804 --> 01:01:00.744
I'm about to share it.
01:01:00.764 --> 01:01:02.286
Hold on.
01:01:03.927 --> 01:01:04.067
Yeah,
01:01:04.086 --> 01:01:05.969
we've never had this many people on
01:01:06.009 --> 01:01:06.628
the show.
01:01:06.949 --> 01:01:09.231
And as Sophie pointed out,
01:01:10.012 --> 01:01:10.911
I'm a little nervous.
01:01:10.992 --> 01:01:11.413
I'm like, wait,
01:01:11.512 --> 01:01:12.833
are we going to break StreamYard?
01:01:16.289 --> 01:01:18.672
So yeah, so that's the posting.
01:01:19.132 --> 01:01:20.855
This is the graphic that we've used.
01:01:21.295 --> 01:01:22.215
Big shout out to Sophie.
01:01:23.597 --> 01:01:25.920
Anything that remotely looks
01:01:26.079 --> 01:01:28.221
good from a graphic design standpoint,
01:01:28.240 --> 01:01:29.782
just assume that Sophie's
01:01:29.802 --> 01:01:31.382
the one that did it and not me.
01:01:32.443 --> 01:01:34.965
But yeah, really excited.
01:01:35.085 --> 01:01:37.807
So this is going to go over two Saturdays.
01:01:37.887 --> 01:01:39.228
So next Saturday and then
01:01:39.248 --> 01:01:40.409
the following Saturday.
01:01:41.550 --> 01:01:44.271
We have a lot of good speakers lined up.
01:01:44.833 --> 01:01:46.034
They're very excited to be
01:01:46.313 --> 01:01:48.335
on sharing their knowledge
01:01:48.355 --> 01:01:49.195
of EduProtocols.
01:01:50.313 --> 01:01:52.594
And yeah, so there will be, and Sophie,
01:01:52.855 --> 01:01:53.775
I was pulling this up.
01:01:53.815 --> 01:01:54.416
So, you know,
01:01:54.456 --> 01:01:56.637
my multitasking skills or lack thereof.
01:01:57.898 --> 01:01:58.840
She may have mentioned that
01:01:58.880 --> 01:02:00.380
there is going to be a live vote.
01:02:00.541 --> 01:02:01.822
So we're trying to get as
01:02:01.902 --> 01:02:05.224
many folks as possible to vote for.
01:02:05.304 --> 01:02:07.085
So we have two protocols stacked up.
01:02:07.467 --> 01:02:09.307
We do want that vote to come
01:02:09.568 --> 01:02:11.329
in to see who's advancing
01:02:11.349 --> 01:02:12.331
to the next weekend.
01:02:13.592 --> 01:02:15.952
But yeah, we'll expect more postings,
01:02:16.514 --> 01:02:17.974
more stuff on social media.
01:02:18.375 --> 01:02:19.376
And again,
01:02:19.396 --> 01:02:20.697
it's going to go live the way
01:02:20.737 --> 01:02:24.139
that our stuff normally goes live.
01:02:24.199 --> 01:02:27.003
So super excited.
01:02:27.123 --> 01:02:27.382
Yes.
01:02:27.443 --> 01:02:29.465
So I do want to stress again.
01:02:29.981 --> 01:02:31.942
live voting with the live
01:02:31.981 --> 01:02:33.503
showings of what all these
01:02:33.563 --> 01:02:35.463
EduProtocols do, from
01:02:35.503 --> 01:02:37.144
EduProtocol users in the
01:02:37.164 --> 01:02:38.025
classroom and how they're
01:02:38.106 --> 01:02:39.306
actually using them.
01:02:40.166 --> 01:02:41.867
So I'm super excited about it.
01:02:42.367 --> 01:02:44.628
We will be launching next week.
01:02:44.688 --> 01:02:46.429
You'll see out on all the socials,
01:02:47.010 --> 01:02:48.711
the sixteen people who are
01:02:49.572 --> 01:02:50.612
representing an
01:02:50.652 --> 01:02:52.313
EduProtocol will start posting
01:02:52.474 --> 01:02:56.016
their amazing little
01:02:56.335 --> 01:02:57.976
graphics that we've designed for them.
01:02:58.615 --> 01:03:00.577
And I'm so excited to have
01:03:00.677 --> 01:03:02.338
everybody on. I'm actually
01:03:02.358 --> 01:03:03.759
going to be fangirling just
01:03:03.778 --> 01:03:04.998
a little bit, because one of
01:03:05.079 --> 01:03:06.260
our presenters is someone
01:03:06.280 --> 01:03:07.820
that I follow on TikTok
01:03:07.960 --> 01:03:09.581
and I think is fabulous. So
01:03:09.601 --> 01:03:13.043
I'm excited. Yeah, no, it's
01:03:13.063 --> 01:03:13.884
gonna be a good time. It's
01:03:13.903 --> 01:03:14.543
gonna be a good time.
01:03:15.364 --> 01:03:16.565
Awesome. So a lot of
01:03:16.625 --> 01:03:17.764
exciting things coming in
01:03:17.804 --> 01:03:20.166
March. March is a super
01:03:20.206 --> 01:03:21.726
busy month for me, probably
01:03:21.746 --> 01:03:23.288
for all of us. Oh my
01:03:23.307 --> 01:03:26.289
goodness, it's crazy. But yeah, so
01:03:27.340 --> 01:03:28.521
be on the lookout for that,
01:03:28.561 --> 01:03:30.603
friends. But we're going
01:03:30.643 --> 01:03:32.125
to wrap up this episode.
01:03:32.644 --> 01:03:33.885
Again, big shout out for all
01:03:33.945 --> 01:03:35.606
of those that are kind
01:03:35.626 --> 01:03:36.628
of seeing you all on
01:03:36.827 --> 01:03:39.010
Instagram, on YouTube, in the
01:03:39.050 --> 01:03:41.070
chat. You know, big
01:03:41.090 --> 01:03:42.913
shout, big thanks, another
01:03:42.952 --> 01:03:44.213
big thanks to Mark
01:03:44.253 --> 01:03:45.835
Henderson for just being on and
01:03:45.894 --> 01:03:47.275
sharing his wisdom and
01:03:47.295 --> 01:03:49.418
guidance. But we're gonna
01:03:49.657 --> 01:03:51.458
call it a day. Do remember to
01:03:51.478 --> 01:03:53.260
keep your hats on but your
01:03:53.300 --> 01:03:55.561
minds open. Y'all enjoy the
01:03:55.581 --> 01:03:56.643
rest of your day. Bye!

Mark Henderson
Director of Digital Initiatives, Missouri School Boards' Association
Mark Henderson serves as the Director of Digital Initiatives at the Missouri School Boards’ Association, where he leads their AI Advisory Group and spearheaded the creation of The AI Toolkit for K-12 Education. Mark has presented on AI's impact on education at major conferences across the country, including in Chicago, Dallas, Kansas City, and Minneapolis. With a background as a high school English teacher and extensive experience in educational and corporate technology, Mark brings a unique blend of classroom insight and tech expertise to his work in advancing AI in schools.