Science Opinion

Preventing an Artificial-Intelligence-Fueled Dystopia, One Student at a Time

By Tess Posner — December 11, 2017

AI is the Terminator. AI is coming for your job. AI is taking over the world. If we compiled all the headlines about artificial intelligence from the last year, we’d have a picture of a dystopian world where jobs are scarce and AI and automation rule everything we do. In this scenario, millions of people are impacted by AI and autonomous systems created with little regard for their consequences: They are deployed in unethical ways, riddled with errors and bias, and discriminatory. The obscurity of how AI works and where it’s used results in fear and confusion. And the few who still have jobs are far wealthier and more powerful than everyone else.

But there’s another story that doesn’t often get catchy headlines. Imagine a world where AI supports and improves our human capabilities. Algorithms and the data sets used to train them are open and transparent, giving users control and democratizing access. AI and automation work alongside humans to help teachers personalize curricula for students who learn in different ways, enable doctors to make more specific medical diagnoses, and give the elderly and people with disabilities mobility through assistive robotics. And forward-thinking and proactive policies address the potentially problematic effects these new technologies may have on our lives.

Commentary Collection

In this special collection of Commentary essays, professors, advocates, and futurists challenge us all to deeply consider how schooling must change—and change soon—to meet the needs of a future we cannot yet envision.

This special section is supported by a grant from the Noyce Foundation. Education Week retained sole editorial control over the content of this package; however, the opinions expressed are the authors’ own.

This second future is not the path we’re on now—at least not yet. To get there, we need to take direct, urgent, and collective action to address how AI is developed, tested, and deployed.

A relatively homogeneous group of people makes up the current creators, researchers, and builders of AI. One consequence of this homogeneity is that AI systems unintentionally reflect or amplify unconscious societal biases. We see this already in racially biased risk-assessment software in prisons (which rates the likelihood of black people committing future crimes as higher than that of white people), language-translation software with gender bias (such as translating doctors as men and teachers as women), and face-recognition software that has trouble reading nonwhite faces. As we incorporate AI into our daily lives through health care, transportation, and financial and social services, opaque algorithms have the potential to perpetuate the power and wealth inequalities we already face.

Ninth-grade students in the Stanford AI4ALL program for girls learn how to fly drones with a PID controller, with a graduate student’s help, at Stanford University this past summer.

This is precisely why AI consumers—that means all of us—must be involved in creating and shaping AI for the future. And that starts with developing students’ interest and experience in AI at a young age. By bringing diverse experiences and viewpoints into software and project creation, we’ll see more innovative and creative outputs that better meet the changing needs of our country. At this point, only about 40 percent of U.S. schools report offering computer science courses. African-American and Hispanic students made up only around 4 percent and 9 percent, respectively, of Advanced Placement computer science test-takers in 2014. And girls made up only 27 percent of students who took computer science tests in 2017.

But education, mentorship, and outreach programs for underrepresented youths can make a big difference. At the San Francisco Bay Area-based organization AI4ALL (which I lead), we partner with AI labs at universities such as Carnegie Mellon, Princeton, Stanford, and the University of California, Berkeley, to introduce high school students to computer science concepts and skills. It takes only a few weeks of basic training before students are ready to apply those concepts to humanitarian projects with help from professors. Some students have worked on projects to make hospitals safer by using computer vision to identify hand germs, used natural-language processing on Twitter to find people in need of natural-disaster relief, and made driving safer and more accessible by designing autonomous cars.

Students in the Stanford AI4ALL program for girls visit with Professor Oussama Khatib in the robotics lab at Stanford University.

Since the organization began in March, at least one-third of our more than 100 alumni have gone on to create their own AI programs in their communities, including teaching AI and computer science to middle schoolers from backgrounds traditionally underrepresented in STEM, running girls’ AI clubs, and hosting AI art workshops. Alumni of the program will also have access to ongoing support from peer and mentor networks as they continue their education and move into careers. For students of color and girls, having diverse mentors in the field plays an important role in showing them what is possible. Ours is not the only effort being made. Other education-based approaches—Code2040, the CSforAll Consortium, TEALS, and the College Board’s new AP computer science principles course—are also working to bridge gender and diversity gaps in coding and computer science through funding, after-school programs, and summer programs.

There isn’t just one inevitable future for us. Making the latest developments in disruptive technology accessible to all is critical for taking control of our 21st-century lives. As Erik Brynjolfsson and Andrew McAfee write in The Second Machine Age, “technology is not destiny.” Let’s not wait until AI’s problems become even more expensive and difficult to address. As educators, policymakers, technologists, and technology users, we can all play an active role in ensuring that a future full of artificial intelligence is actually intelligent. Let’s act while there’s still time.

Coverage of science learning and career pathways is supported in part by a grant from The Noyce Foundation, at www.noycefdn.org. Education Week retains sole editorial control over the content of this coverage.
A version of this article appeared in the December 13, 2017 edition of Education Week as AI’s Future Is in the Hands of Those Who Create It
