Content provided by MIT OpenCourseWare. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and presented directly by MIT OpenCourseWare or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://ar.player.fm/legal.

The Human Element in Machine Learning with Prof. Catherine D’Ignazio, Prof. Jacob Andreas & Harini Suresh

16:03

When computer science was in its infancy, programmers quickly realized that though computers are astonishingly powerful tools, the results they achieve are only as good as the data you feed into them. (This principle was quickly formalized as GIGO: “Garbage In, Garbage Out.”) What was true in the era of the UNIVAC has proved still to be true in the era of machine learning: among other well-publicized AI fiascos, chatbots that have interacted with bigots have learned to spew racist invective, while facial-recognition software trained solely on images of white people sometimes fails to recognize people of color as human. In this episode, we meet Prof. Catherine D’Ignazio of MIT’s Department of Urban Studies and Planning (DUSP) and Prof. Jacob Andreas and Harini Suresh of the Department of Electrical Engineering and Computer Science. In 2021, D’Ignazio, Andreas, and Suresh collaborated as part of the Social and Ethical Responsibilities of Computing initiative from the Schwarzman College of Computing in a project to teach computer science students in 6.864 Natural Language Processing to recognize how deep learning systems can replicate and magnify the biases inherent in the data sets that are used to train them.

Relevant Resources:

MIT OpenCourseWare

The OCW Educator Portal

Share your teaching insights

Social and Ethical Responsibilities of Computing (SERC) resource on OpenCourseWare

Case Studies in Social and Ethical Responsibilities of Computing

SERC website

Professor D’Ignazio’s faculty page

Professor Andreas’s faculty page

Harini Suresh’s personal website

Desmond Patton’s paper on analysis of communications on Twitter

Music in this episode by Blue Dot Sessions

Connect with Us

If you have a suggestion for a new episode or have used OCW to change your life or those of others, tell us your story. We’d love to hear from you!

Call us at 617-715-2517

On our site

On Facebook

On Twitter

On Instagram

Stay Current

Subscribe to the free monthly "MIT OpenCourseWare Update" e-newsletter.

Support OCW

If you like Chalk Radio and OpenCourseWare, donate to help keep those programs going!

Credits

Sarah Hansen, host and producer

Brett Paci, producer

Dave Lishansky, producer

Script writing assistance by Aubrey Calaway

Show notes by Peter Chipman

