Written by Dale Lehman, Associate Director, Center for Business Analytics, Loras College
It was recently revealed that the Georgia Institute of Technology has begun using computerized teaching assistants (TAs) to respond to student questions in an online course. The digital TA was named Jill Watson in honor of IBM’s Watson, the system that defeated human champions on Jeopardy! Jill responds more quickly than humans and is considered to operate at an “expert” level, answering only questions for which she has a confidence level of at least 97%. Within a year, she is expected to be able to handle at least 40% of student questions. Combined with massive open online courses (MOOCs), we may be looking at the future of education.
Such developments are no longer surprising. Algorithms are rapidly improving and capable of learning quickly. There are increasing claims that algorithms may be superior to expert judgment in a wide variety of tasks, including medical diagnosis and the detection of police misconduct. And virtually all smartphone manufacturers now pride themselves on their “digital assistants,” ready to help you even before you know you need help. I have little doubt that these trends will continue and that such smart systems, based on big data and predictive modeling tools, will play important roles in every aspect of our lives. Nor do I doubt that they will work very well – in many cases, better than humans. What caught my eye in this Georgia Tech story is a sense of déjà vu – that we’ve been here before and have not learned from the past.
There are two aspects of this story that I want to comment on. The first was the reaction of two well-known economists, Alex Tabarrok and Tyler Cowen:
“Feedback from interactive systems will be more immediate and more informative. Adaptive tutoring systems are already nearly as effective as human tutors in many circumstances and much cheaper to scale.”
More than 40 years ago, Laurence Tribe wrote an essay, “Ways Not to Think About Plastic Trees,” in which he referred to the planting of plastic trees in the median of a Los Angeles freeway:
“Consider again the plastic trees planted along a freeway’s median strip by Los Angeles county officials. If the most sophisticated application of the techniques of policy analysis could unearth no human need which would, after appropriate ‘education,’ be better served by natural trees, then the environmental inquiry would be at an end. The natural trees, more costly and vulnerable than those made of plastic, would offer no increment of satisfaction to justify the added effort of planting and maintaining them. To insist on the superiority of natural trees in the teeth of a convincing demonstration that plastic ones would equally well serve human purposes may seem irrational. Yet the tendency to balk at the result of the analysis remains. There is a suspicion that some crucial perspective has been omitted from consideration, that the conclusion is as much a product of myopia as of logic.”
If you find plastic trees to be disturbing in some fashion, and you find digital TAs to strike a similar chord, then you probably want to hear what Tribe concluded – and I cannot improve upon his words:
“What has been omitted is, at base, an appreciation of an ancient and inescapable paradox: We can be truly free to pursue our ends only if we act out of obligation, the seeming antithesis of freedom. To be free is not simply to follow our ever-changing wants wherever they might lead. To be free is to choose what we shall want, what we shall value, and therefore what we shall be. But to make such choices without losing the thread of continuity that integrates us over time and imparts a sense of our wholeness in history, we must be able to reason about what to choose – to choose in terms of commitments we have made to bodies of principle which we perceive as external to our choices and by which we feel bound, bodies of principle that can define a coherent and integrative system even as they evolve with our changing selves.”
If we conceive of education as purely the transmittal of knowledge from teacher to student, then digital TAs offer clear efficiency gains over labor-intensive human-based systems. Ms. Watson is quicker, in many cases more accurate, and may even be more sensitive to culture, ethnicity, and language than human TAs. Eventually, even teachers may be similarly replaced by algorithms. But after more than 30 years of teaching, I have found that the relationship between teacher and student is more than transmission of knowledge. It is a relationship – one in which learning is both individualized and flows in both directions. A student question is not only an opportunity to provide an answer. It is also a way to see how a student conceives of the world, to appreciate another person’s unique perspective, and to help develop that perspective – and along with that, to develop my own individual understanding.
This relates to the second aspect of this story that I want to point out. Intelligent systems, such as Ms. Watson, work best when humans are more predictable. The 97% confidence level embodied in Jill’s responses will cover more potential student questions if those questions are more consistent in their content and structure. That is, if we make ourselves more predictable, then these systems will perform better. This is true in many areas of our lives. Automated customer support systems (whether internet based or telephone based) work better when we are more consistent. It is the individualized use of language and expressions that renders these systems unreliable and error-prone. Online search works best when our search terms and strategies conform to expected norms. Computers can easily learn these norms and continue learning and adapting.
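The threshold behavior described above can be sketched in a few lines of Python. This is purely an illustrative toy (the function names and the canned answers are hypothetical, not Georgia Tech's actual system): the assistant answers only when its confidence clears the 97% bar, and defers everything else – which is exactly why it covers more questions when students phrase them predictably.

```python
# Illustrative sketch of a confidence-thresholded digital TA.
# All names here are hypothetical; only the 97% threshold comes from the story.
CONFIDENCE_THRESHOLD = 0.97

def respond(question, model):
    """Return the model's answer if it is confident enough, else defer."""
    answer, confidence = model(question)
    if confidence >= CONFIDENCE_THRESHOLD:
        return answer
    return "Deferred to human TA"

def toy_model(question):
    """Toy 'model': high confidence only on routine, predictably phrased questions."""
    canned = {"When is homework 1 due?": ("Friday at 5pm", 0.99)}
    return canned.get(question, ("", 0.10))

print(respond("When is homework 1 due?", toy_model))        # answered directly
print(respond("Why does this proof feel wrong?", toy_model))  # deferred
```

The idiosyncratic question falls below the threshold and is handed back to a human, while the routine one is answered instantly – a miniature version of the dynamic the paragraph above describes.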
It is when we behave individually – irrationally, idiosyncratically, unpredictably – that computers find it difficult to help us. So, organizations – businesses, government, and nonprofit institutions alike – invest in training us to be more predictable. Not surprisingly, they ask schools to assist in the process. Thus we have training in how to better conduct searches, how to learn course material by proceeding in the “correct” order, how to properly develop and use web-based learning systems, etc. In an ironic twist, these systems are often described as “individualized” or “customized.” Of course, customization only works if we are not too unique in how we learn.
One consequence of this mass customization is that it produces large economies of scale. Technology can handle large numbers of individual requests (orders, support questions, search queries, etc.) much more cheaply than humans can, provided there is sufficient scale to justify the initial expense of setting up the system. In other words, big businesses are able to operate more inexpensively (and more effectively) than smaller ones. Indeed, there is growing evidence that the average size of firms is increasing across a variety of industries. An additional effect is growing income disparity, part of which has been linked to the growth in the size of firms.
My intent is not to bemoan or derail the rise of algorithms. This is not a plea to resurrect the Luddites. Rather, I want us to consciously choose when and how to use technology. We should question whether we are automating a relationship, and if so, whether something is lost when we do so; and if there is, whether it can be preserved while gaining the benefits of technology. We should question whether technology is really releasing our individuality or quashing it. The march of data, algorithms, and digital technologies may well be inexorable, but our uses of them are not. We can choose while honoring our commitment to what it means to be human and to learn. Digital TAs and plastic trees are both choices – let us not forget that.
Cited on their blog, Marginal Revolution, on May 9, 2016. The quote above appeared in their paper about online education, https://www.aeaweb.org/articles?id=10.1257/aer.104.5.519.
L. Tribe, 1974, “Ways Not to Think About Plastic Trees: New Foundations for Environmental Law,” The Yale Law Journal, Vol. 83, pages 1315-1348.
For example, see http://www.economist.com/blogs/graphicdetail/2016/03/daily-chart-13.
H. M. Mueller, P. Ouimet, and E. Simintzi, 2015, “Wage Inequality and Firm Growth,” Centre for Economic Policy Research, http://cepr.org/active/publications/discussion_papers/dp.php?dpno=10365.
Nor do I intend to complain about the more general rise of the digital economy, as discussed in E. Brynjolfsson and A. McAfee, 2014, The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies, W.W. Norton & Company.
The recent book by Todd Rose, The End of Average, HarperCollins, 2015, traces the history of the concept of average and how destructive it can be to individuality.