Thomas Davenport of MIT and Julia Kirby of Harvard University Press, who have worked on the implications of automation, argue that there have been three broad eras of automation: first the industrial revolutions, then the computer age, and now the age of artificial intelligence that we are living through.
The first of these automated tasks that were dirty and dangerous. As the industrial revolution urbanized the United States and Europe, people were able to move from uncertain, low-productivity agricultural work to stable, high-productivity factory employment. Innovations in agriculture enabled more food to be produced with less labor, much of which had been extremely strenuous. Those looking for work went to urban factories, where, thanks to steam engines, increased iron production, and power looms, unskilled laborers could produce textiles at fantastic speeds rather than slowly by the hands of skilled artisans. While this adaptation caused social unrest among the artisanal classes who lost their high status, it enabled people to afford luxuries that were once out of reach and to enter lines of work they could not previously have dreamed of.
As cities grew with the migration of people seeking quality work, ideas also bounced around. Edward Glaeser, a Harvard economist, notes that when cities double in size, productivity per capita goes up 15%. People who spend more time together, and work in close proximity, share more ideas and can more easily turn them into reality. This is why the First Industrial Revolution led, after a brief pause of adaptation, to the second, or technological, revolution, which introduced railroads, petroleum products, mass-produced steel, electricity, and the car, among a whole host of other significant inventions. It is hard to imagine this explosion in innovation being possible in 1790, when 90% of the labor force was made up of farmers.
Automation of this kind is known as skill-unbiased: new technologies make goods easier to produce regardless of the worker's training. It characterized the era of the industrial revolutions, enabling otherwise low-productivity workers to earn a living and enjoy leisure, which in turn spurred massive social movements. The high school movement, which pushed for greater investment in human capital at the turn of the 20th century, is hard to imagine coming about without the reduced need for demanding physical work.
The impacts of these new technologies were impossible to predict as they were introduced, but human ingenuity harnessed their creative potential. Take, for example, the automobile, whose introduction into all spheres of life threatened the jobs of those who made a living off the massive horse industry. In 1890 there were 13,800 companies that built horse-drawn carriages. Combined with the industries surrounding the raising of horses, their maintenance and food, cleaning the streets of their urine and feces, and all the other associated tasks, the horse seemed vital to American commerce. Henry Ford's assembly line in 1913 did not, however, destroy the American economy by reducing the demand for horses. It instead gave rise to a sprawling automotive industry that eventually produced some of the world's largest corporations and became central to American life. It also made travel faster and cheaper, which spurred the growth of cities, raised people's productivity, and allowed freer movement than ever before.
By the 1950s, the second age of automation began, with machines taking away the dull tasks of life. Routine clerical work began to shrink with the introduction of improved telecommunications networks, punch cards, and airline kiosks. Information technology advanced rapidly, driving down the cost of computing and increasing the use of software to expedite rote work. When the Internet was born, the cost of information fell dramatically, reducing the time-consuming and laborious work of research. The implications of the "knowledge economy" that this birthed are so profound that we have yet to fully understand them.
Unlike the industrial revolutions, the computer age is characterized by skill-biased technologies. Instead of making once difficult tasks easier for people to do, they make once boring and repetitive tasks more knowledge-based. The introduction of the ATM took away the routine parts of a teller's job, counting money and updating the books, and replaced them with the more cognitive tasks of understanding customer needs and selling services. This has greatly improved the returns for the capable, while reducing opportunities for those who lack the newly in-demand skills.
The current age of automation, driven by artificial intelligence technologies that reduce the need for human prediction, is similarly skill-biased. By reducing the need to process information to reach speedy conclusions, as in translating speech to text in a foreign language, these technologies increase the value of tasks that involve judgment and social skills. Fears over this skill bias, however, neglect the means by which people have always harnessed technology.
Personal computers and smartphones are skill-biased technologies that have greatly improved people's day-to-day lives by making their tasks more efficient and their world more interconnected. They have enabled people to become lifelong learners and to adapt more readily to changes in their social surroundings. Artificial intelligence is an extension of these innovations, and it magnifies the benefits that information technologies have brought. Transitioning to a world in which different skills are valued might be difficult, but in the long run, everybody benefits.
These are the sort of benefits—the gains from innovation—that we here at the Competitive Enterprise Institute are celebrating this week in the run-up to this year’s Human Achievement Hour. Read more about the celebration here, and see the rest of our HAH content here.