Future of Work Opinion

10 Really Hard Decisions Coming Our Way

By Tom Vander Ark — October 18, 2017 9 min read

Things are about to get interesting. You’ve likely heard that Google’s DeepMind recently beat the world’s best Go player. But in far more practical and pervasive ways, artificial intelligence (AI) is creeping into every aspect of life--every screen you view, every search, every purchase, and every customer service contact.

What’s happening? It’s the confluence of several technologies: Moore’s law has made storage, computing, and access devices almost free.

This Venn diagram illustrates how deep learning is a subset of AI and how, combined with big data, it can power enabling technologies in many sectors. For example, to AI and big data add:

  • Robotics, and you have Industry 4.0.
  • Cameras and a sensor package, and you have self-driving cars.
  • Sensors and bioinformatic maps, and you have precision medicine.

There is lots of good news here: diseases will be eradicated and clean energy will be produced. But we have a problem: this stuff is moving faster than civic infrastructure can handle. Innovation is outpacing public policy on all fronts. The following are 10 examples of issues coming at us fast that we (in the US in particular) are not ready to deal with.

    1. Unemployment. We may be near “full employment” now, but it doesn’t feel that way. “Workers who have steadily lost access to the economy as digital processes replace them have a sense of things falling apart, and a quiet anger about immigration, inequality, and arrogant elites,” said Brian Arthur, an external professor at the Santa Fe Institute.

    Things get worse from here. A PwC report predicts that more than a third of jobs could be at risk by 2030. New jobs will be created, but job creation is even harder to predict than displacement, which will vary by sector and geography.

    Arthur notes that we’re at the end of the old production economy, driven by free-market economics and old measures of growth, and entering a distribution economy, where it’s all about who has access to what’s produced. That’s a political problem.

    2. Income inequality. If you think people were ticked about income inequality last year, just wait. The folks who develop, finance, and own the robots are winning in the automation economy. Arthur predicts: “Jobs will be fewer, and work weeks shorter, and many jobs will be shared.”

    Income inequality will accelerate and, combined with massive job dislocation, will require some kind of income protection--a universal basic income or tax credit and safety net that serves the same function. We’re likely to see an increase in paid voluntary activities like looking after the elderly or mentoring young people.

    With their socialist roots, European countries, Scandinavia in particular, will adjust more easily and quickly than America to income stratification. Finland is already using design thinking to inform basic-income policy experiments.

    3. Privacy. There will be an estimated 50 billion connected devices by 2020, including a billion cameras, all feeding data to artificial intelligence platforms. Perhaps you’ve noticed the marked improvement in facial recognition on Facebook in the last few months. Chip maker Nvidia wants cities to use its new Metropolis artificial intelligence platform to tap into all of those cameras. Very soon, wherever you are, it will be best to assume you’re on Candid Camera.

    We are approaching radical transparency where every search, every move, every test informs a merchant, authority, or insurer. Want to preserve any shred of privacy? That will take some new policies.

    4. Algorithmic bias. AI gets smarter the more data you feed it, but it quickly learns the biases embedded in that data and in our society. For example, cameras have missed the mark on racial sensitivity, and software used to predict future criminals has shown bias against black people. Increasingly, AI determines who gets a loan, who is insured, and who gets hired.

    How do we get algorithmic transparency? It’s not easy when bots are inventing new ways to communicate with each other. Bias prevention will require creativity and diligence.
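    The mechanism behind biased predictions is easy to demonstrate. Here is a toy sketch, with entirely hypothetical data, of how a system trained on biased historical decisions reproduces that bias; real systems are far subtler, but the principle is the same: biased labels in, biased predictions out.

```python
# Toy illustration of algorithmic bias (hypothetical data).
# The "model" simply memorizes approval rates per group from
# past decisions, then approves anyone whose group's learned
# rate exceeds 50% -- faithfully replaying historical bias.

from collections import defaultdict

# Hypothetical past loan decisions: (group, approved)
history = ([("A", 1)] * 80 + [("A", 0)] * 20 +
           [("B", 1)] * 40 + [("B", 0)] * 60)

def train(history):
    counts = defaultdict(lambda: [0, 0])  # group -> [approvals, total]
    for group, approved in history:
        counts[group][0] += approved
        counts[group][1] += 1
    return {g: a / n for g, (a, n) in counts.items()}

def predict(model, group):
    # Approve whenever the learned approval rate exceeds 50%.
    return model[group] > 0.5

model = train(history)
print(model)                     # {'A': 0.8, 'B': 0.4}
print(predict(model, "A"), predict(model, "B"))  # True False
```

    Nothing in the code mentions race, gender, or any protected attribute, yet group B is systematically denied because it was denied in the past. That is why transparency about training data matters as much as transparency about the algorithm itself.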

    5. Access. The most powerful tools the world has ever known have been created, and they are getting smarter every day. But who will have access to AI tools? Google open-sourced TensorFlow, and last month Microsoft open-sourced some of its tools, but both require technical sophistication to use. OpenAI is a nonprofit AI research company created by Elon Musk, Sam Altman, and others to develop open-source AI beneficial to humanity. All good news, but access to the tools and the chops to use them will be an endless challenge.

    6. Machine ethics. John Giannandrea, AI chief at Google, is concerned that bias is being built into many of the machine-learning algorithms by which machines make decisions: “The real safety question, if you want to call it that, is that if we give these systems biased data, they will be biased.”

    In Moral Machines: Teaching Robots Right from Wrong, Wendell Wallach and Colin Allen suggest that teaching robots right from wrong will advance human ethics by providing a platform for experimental investigation. It’s a great thought, but one requiring sophisticated public-private partnerships to enact.

    Take autonomous vehicle policies as an example. AVs are on the road today and municipalities are scrambling to figure out if and how to regulate them. As Wallach and Allen predict, AVs surface moral dilemmas (e.g., kill the driver or the pedestrians?) and allow debate, but do we want 10,000 municipalities trying to figure this out on their own and building a patchwork of unique laws?

    7. Weaponization. Former President Obama kicked drone strikes into high gear--an opening salvo in modern mechanized warfare. Autonomous killer robots aren’t far behind the drones--and a global AI-powered arms race is inevitable. While the US walks away from global trade and climate treaties, do you see a new Geneva Convention for robo-war?

    8. Humanity. How do machines affect our behavior and interactions? AI bots are getting better and better at modeling human conversation and relationships. Paired with better calibration and gamification, that is making video and mobile games more addictive. Will tech addiction be the next addiction wave after opioids?

    If not an addiction crisis, will AI simply breed alienation and resentment? Will it threaten human dignity? The answers will be a mixture of practice and policy.

    9. Genome editing. Machines are learning to recognize tumors and edit genomes. This is good news if you think cancer sucks, but it raises a bunch of tough questions about who can edit genes, and for what purpose. And which of the soon-to-be 8 billion people on earth will have access to precision medicine?

    10. Bad AI. Elon Musk thinks AI is more worrisome than North Korea. His startup Neuralink is building a brain interface so that we’re smart enough to keep up with super AI--what Nick Bostrom thinks may be the last invention humans ever need to make.

    This is a few years out, but tech progress will continue to accelerate, resulting in very powerful computers, advanced weaponry, space travel, human longevity (for some), realistic VR, and fine-tuned emotional and motivational controls. There are a bunch of ways this could go badly, very badly. Musk wants us to start considering limitations; Zuckerberg thinks he’s an alarmist. It’s worth having the conversation.

    Bonus. Robot rights. Will robots gain consciousness as we know it? As they get smarter, are there moral obligations to smart machines? Do they deserve some form of human or animal rights?

    Crazy, right? Robot rights didn’t make the top-10 list, but the question illustrates how quickly the issues are becoming new and unfamiliar, falling outside the moral, ethical, economic, and political frameworks that have guided life on earth for hundreds of years.

    Conclusions

    Code that learns is the most important invention in history--and it’s being supercharged by an explosion of sensors and supercomputing. It is changing the employment landscape and swamping our civic infrastructure with issues inconceivable just a few years ago. We’ve come to 10 conclusions:

    1. It’s time to #AskAboutAI. We launched a small effort a year ago to host community conversations about the implications of artificial intelligence. We’ve co-hosted conversations in a dozen communities. Every community should be discussing the enormous potential benefits and emerging challenges associated with AI.

    2. It’s time to teach AI-awareness. We’ve been teaching digital literacy for two decades but what’s new is that we all need to appreciate that algorithms curate every screen we see. As smart machines augment our capabilities, they will increasingly influence our perceptions, opportunities and decisions. To self and social awareness, add AI-awareness (maybe SEL becomes SEAIL).

    3. It’s time to update career education. Every high school student should study the ways AI could influence their life and livelihood. Don’t let kids graduate from high school without being AI-aware. Read Stanford’s AI100 report for an overview of the eight sectors likely to be most impacted (we summarized most chapters in the #AskAboutAI series).

    4. It’s time to teach data science. Every field is becoming computational. Every big problem has a big data set associated with it that will be part of game-changing solutions (we call it cause + code). Understanding how to wrangle and analyze data will equip young people for careers and civic contribution. In five or 10 years AI may take over much of the data wrangling, but it will be an important skill for today’s secondary students.
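    The wrangle-and-analyze workflow above need not be intimidating. A minimal sketch, using only Python’s standard library and a hypothetical air-quality data set, shows the kind of exercise a secondary student could tackle: load tabular records, group them, and summarize.

```python
# A minimal data-wrangling sketch using only the standard library.
# The ozone readings below are hypothetical, for illustration only.

import csv
import io
import statistics

raw = """city,year,ozone_ppb
Denver,2016,48
Denver,2017,52
Seattle,2016,31
Seattle,2017,33
"""

# Parse the CSV into a list of dicts, one per row.
rows = list(csv.DictReader(io.StringIO(raw)))

# Group readings by city.
by_city = {}
for row in rows:
    by_city.setdefault(row["city"], []).append(float(row["ozone_ppb"]))

# Compute the mean reading per city.
means = {city: statistics.mean(vals) for city, vals in by_city.items()}
print(means)  # {'Denver': 50.0, 'Seattle': 32.0}
```

    Swap in a real civic data set (air quality, transit, school enrollment) and the same load-group-summarize pattern becomes a cause + code project.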

    5. It’s time to open source AI tools. Google and Microsoft have set a good example with TensorFlow and Azure Machine Learning, but both take a good deal of judgment and technical expertise to use. Getting every high school and college student exposure to open AI tools will require better guidance on how to put them to use.

    6. It’s time to update graduate profiles. Social and emotional learning and an innovation mindset should be central; it’s clear that they are even more important than traditional measures of academic success.

    7. It’s time to build a stronger social safety net. At a minimum there will be more dislocation and transition. Vulnerable populations will be at risk more frequently.

    8. It’s time for a new civic infrastructure. Finland may be iterating on basic income schemes, but despite advocacy from tech leaders including Elon Musk and Mark Zuckerberg, those schemes won’t fly in the gridlocked US Congress. Like Seattle’s leadership on the minimum wage, cities and states will be the laboratories of the new social compact in the US.

    9. It’s time for a new ethical infrastructure. Given the speed and technicality of the subject, society needs a new ethical infrastructure to predict threats and opportunities, and to recommend policy, investment, and learning responses.

    The new infrastructure will include nonprofit groups like Future of Life Institute, OpenAI, Foresight Institute, and Ethics and Governance of Artificial Intelligence Fund; industry consortia like Partnership on Artificial Intelligence; and university study groups like Stanford’s One Hundred Year Study on Artificial Intelligence.

    10. It’s time to build smart cities. Every region needs to develop a learning ecosystem that helps people skill up fast around distinctive capabilities. As we noted in Smart Cities That Work for Everyone, learning ecosystems include innovation leadership, public-private partnerships, aligned investment, a talent pipeline, and multiple affordable learning entry points that recognize prior knowledge and certify new skills.

    The potential to make people’s lives better is enormous but the threats are equally daunting. We have work to do.

    For more, see the #AskAboutAI series.

    The opinions expressed in Vander Ark on Innovation are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.

