AI Resources - Ozonian, Lynn (16 Oct 2020 19:08 EDT)
  Re: [AI4K12] AI Resources - Pat Langley (17 Oct 2020 00:54 EDT)
  Re: [AI4K12] AI Resources - Dave Touretzky (17 Oct 2020 01:48 EDT)
  Re: [AI4K12] AI Resources - VLADIMIR TOSIC (17 Oct 2020 06:45 EDT)
  Re: [AI4K12] AI Resources - Leftwich, Anne Todd (17 Oct 2020 09:40 EDT)
  Re: [AI4K12] AI Resources - ADEL KASSAH (17 Oct 2020 11:04 EDT)
  Re: [AI4K12] AI Resources - Haobo Lai (17 Oct 2020 11:22 EDT)
  Re: [AI4K12] AI Resources - Randi Williams (17 Oct 2020 11:48 EDT)
  Re: [AI4K12] AI Resources - Dr. Marlo Barnett (17 Oct 2020 11:55 EDT)
  Re: [AI4K12] AI Resources - DAVID CRANDALL (17 Oct 2020 13:17 EDT)
  Re: [AI4K12] AI Resources - Tracie Yorke (17 Oct 2020 08:01 EDT)
  RE: AI Resources - Gina DeAngelo (19 Oct 2020 10:17 EDT)
Lynn,

> I am preparing a presentation for our K-12 teachers about AI and
> Algorithms and bias.

I often hear people talk about "algorithmic bias," but I don't know of any examples of that. However, because machine learning relies on training data to induce models, biased samples can produce biased predictors. Still, the underlying classification technique, whether it operates over neural networks, decision trees, or probabilistic summaries, is not itself biased; only the models it uses to make decisions can be. This is really no different from the problems that arise with older, more traditional statistical models (e.g., logistic regressors) when they are given nonrepresentative training samples. I think it's important that you convey this idea to your students and, ideally, avoid the term "algorithmic bias," which could lead to deep confusion about the source of the problem.

Best wishes,
Pat Langley
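
A minimal sketch of the point above, not taken from the thread: it assumes synthetic data and scikit-learn's LogisticRegression, and fits the same algorithm with identical settings on a representative sample and on a nonrepresentative one. Only the model trained on the skewed sample ends up making systematically worse predictions for the underrepresented group.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    def make_population(n):
        # Two groups (0 and 1) share the same true relationship between x and y.
        group = rng.integers(0, 2, size=n)
        x = rng.normal(size=n)
        y = (x + rng.normal(scale=0.5, size=n) > 0).astype(int)
        return np.column_stack([x, group]), y, group

    # Representative training sample: both groups appear in their true proportions.
    X_rep, y_rep, _ = make_population(5000)

    # Nonrepresentative training sample: group 1 is mostly dropped, and the few
    # group-1 rows that remain carry only negative labels.
    X_all, y_all, g_all = make_population(5000)
    keep = (g_all == 0) | ((g_all == 1) & (y_all == 0) & (rng.random(len(y_all)) < 0.2))
    X_skew, y_skew = X_all[keep], y_all[keep]

    # Same algorithm, same hyperparameters; only the training data differs.
    clf_rep = LogisticRegression().fit(X_rep, y_rep)
    clf_skew = LogisticRegression().fit(X_skew, y_skew)

    # Evaluate both models on a fresh, representative test set, broken out by group.
    X_test, y_test, g_test = make_population(5000)
    for label, clf in [("representative", clf_rep), ("skewed", clf_skew)]:
        for g in (0, 1):
            m = g_test == g
            print(f"trained on {label} data, group {g}: "
                  f"accuracy {clf.score(X_test[m], y_test[m]):.2f}")

With the representative sample, accuracy is roughly equal for both groups; with the skewed sample, the learned model underpredicts positives for group 1, even though the algorithm itself never changed.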