Sunday, December 21, 2014

The Unnecessary Inequality

As I worked my way through The Second Machine Age by Erik Brynjolfsson and Andrew McAfee, a familiar argument came up: widening inequality is a technological problem, not a social or political one. In other words, the digital technologies that power the new, information-based economy create the economic inequality; the fraud of Wall Street, the bad policies, and so on should not be blamed for such a macro-scale problem. This argument is nothing new. I first encountered it in The Black Swan by Nassim Taleb, who argued forcefully that the mega-rich are the black swans of our economies, and that the current world (post-industrial, information-based) makes it easier and easier for black swans (including mega-rich people) to happen. What a beautiful argument! It resembles original sin, religion-is-the-root-of-all-violence, 9/11, and various other "explanations" of inescapable problems. For one, they acknowledge the problems (inequality, hell, the bloodshed between the 15th and 20th centuries, a bad economy). However, instead of finding a human cause to resolve, they blame a nonhuman force (technology, our nature, the jerk-face bishops, the first attack on American soil in half a century). Thus, nobody is to blame, so no fixes are needed. And all we can do is work around the problem or pray for serenity.

Now, I don't have enough space to discuss original sin or religion or 9/11 (wait, the last one is easy, but it's not the topic here). However, the link between technology and inequality is, at best, a coincidence. In fact, this link reminds me of the story of the Russian Tsar and the doctors. The story goes like this: the Tsar noticed from the statistics that plague-stricken areas had many doctors; to reduce the plague, therefore, he ordered all the doctors executed. Oh Tsar, how clever you were! Similarly, the rise of inequality, which is a very recent phenomenon (since the late 1970s), coincides with the rise of computers and digital technologies. And no one should be blamed for the rise of inequality, right? So technology gets the scapegoat role.

However, look at the general trend of history and technology, and you quickly realize this argument does not make sense at all. After all, technologies exist to reduce labor and increase productivity. Thus, by the same logic applied to digital technologies, every technology should have increased inequality. But history worked out the reverse: each and every technology (so far, except the digital ones) reduced inequality. Don't believe me? Look at history again through the lens of inequality, and compare each era against its predecessor. Let's do it here.

First, we may think our time is terribly unequal. However, the difference between an average Joe, or even a relatively poor Joe, and Bill Gates pales against the difference between an average citizen and his or her king (or emperor) on the eve of World War I. Back then, most people lived under a monarch whose wealth dwarfed the meager lot of the common folk. Yet this difference, again, paled against the "enlightened absolutism" of the earlier age. I mean, the name alone should tell you the difference in wealth, status, and rights. Move further back into the medieval era, and we start dealing with serfs and lords; the complaints of free citizens in the 17th century must have sounded like those of spoiled brats to the poor land-bound souls of earlier ages. Move past the medieval era into classical times, and slavery comes into focus. Or, at the very least, look at the difference between a Pharaoh, who could build fabulous tombs filled with precious goods, and the poor souls who labored to build those tombs. Move outside the historical time frame, and inequality is magnified into genocide: whenever a stronger people moved into an area, they wiped the natives from the face of the Earth. Example: the Indo-European arrival in Europe. Move yet further back, and we see how Homo sapiens cleared off every last one of the other Homo species and came to dominate the Earth.

As we can see, each age, wielding its signature technologies, improved the lives of ordinary people and reduced the inequality of the previous age. Agriculture, city building, writing, empires, the industrial age, electricity: all of them ushered in eras where common men and women could live better lives, enjoy more rights, and hold more political power. This great progress culminated in the so-called middle class of the 1970s: a mass of people all equally empowered to make a better life for themselves, their families, and their world. In fact, this law of technology-induced equality is so powerful that a broad and prosperous middle class has become the sign and the jewel of a developed country.

Until the late 1970s and early 1980s, that is.

Remember that all previous technological advances had the same supposedly inequality-inducing qualities as digital technologies. All of them increased productivity (thus reducing the labor needed to produce the same amount of goods) and sped up communication (concentration of population, written language, better roads, printing, better modes of transportation, the telegraph). Most crucially, each of them required a different skill set from the last. Farmers work differently from hunters and gatherers; city dwellers live differently from rural and forest dwellers; industrial workers need skills and discipline that farmers don't. And somehow we assert that digital technologies are simply different from the rest? That is just absurd.

Furthermore, a close look at the inequality data reveals some inconsistencies with our beautiful argument. Two main issues stand out. One, not every country experienced the widening. For example, Japan, an early leader in transistor-based electronics, barely experienced any significant widening. Neither did France or the Netherlands. Germany, which the book cites explicitly, is a mixed case: its inequality rose from the mid-1980s to the early 1990s, but then fell from the mid-1990s to 2000 (interestingly, the income share of the top 1% crashed in the early 2000s; the effect of the dot-com bust, perhaps?). Two, the clear cases of widening (the US, the UK, Australia) did not coincide with the rise of the computer. They started in the late 1970s in the UK, the early 1980s in the US, and the mid-1980s in Australia. Yet computer technologies did not hit their stride until the 1990s! What happened between the late 1970s and the mid-1980s, you ask? Well, Thatcher rose to power in 1979, Reagan won in 1980, and Bob Hawke won in 1983. Political events match the inequality data much, much better than technologies do.

One last important point on this argument: digital technologies don't necessarily increase inequality. How?

First, a superstar actually requires a lot of supporting people, many more than our culture usually portrays. Take Instagram, the example The Second Machine Age holds up against industrial-age Kodak. Maybe it is true that building Instagram itself did not take many people. However, unlike old Kodak, Instagram requires a working wireless internet to function. That means countless hours of labor to build and maintain the wireless networks, plus countless more for the smartphones themselves. Take all of this into account, and Instagram in fact requires quite a few human hands to succeed; we just don't think of them. Similarly, for J.K. Rowling to succeed as a writer, she needs editors, graphic designers, translators, distribution channels, and so on. For a movie to succeed and a few movie stars to earn big bucks, the movie needs countless experts in lighting, camera work, music, and scriptwriting, as well as advertising and distribution. The real problem is this: we don't think of these invisible people. We think only of the stars, and thus they alone gain the wealth and wield the power.

Second, being a superstar may not always profit the star. As David Zweig points out in Invisibles, a hit or two on social media does not guarantee any profit. Every enduring income stream requires a lot of work, and thus should provide jobs and share the gains. But of course, superstars don't share.

Lastly, digital advances should have increased the number of "superstars," and thus driven down the actual benefit of being one. After all, in the age of endless personalization, there is no reason why I should read the same book, listen to the same music, or play the same game as you do. If the hype of the digital age had come true, each small producer would have been able to reach its small niche easily, allowing for a large number of small producers and more competition. And what happens when competition is high? Lower profits for producers, of course. But this did not happen. Instead, we increasingly idolize a few and willfully ignore the rest.

Let's face it, the digital age has not been the paradise it promised to be. No age has been, frankly. Still, to blame our inequality on digital technologies is absurd. We have the tools and the resources to educate each and every young person to pursue his or her dream, but we don't. We have the resources to help all seniors retire in dignity, but we don't. We can bring good health care to all, but we don't. Our inequality feeds on our social and political systems, not our technologies. So let's fix the world rather than praying for serenity.

Wednesday, December 3, 2014

The Regression of Technologies

You know, when my Dell Venue 8 decided to shut down Firefox out of nowhere for the umpteenth time, I really needed to vent my frustration somewhere. The state of our technology is shameful, I tell you, shameful.

On paper, my tablet has a 2.1 GHz dual-core CPU with 1 GB of RAM. Think about how marvelous this hardware is. It is above Windows Vista's recommended specification. Less than 10 years ago, on such a system, I could expect a smooth, fully graphical operating system on which I could work with office software, play some small games, and read the news. In parallel. And if you are willing to go back just a bit further, this specification is something like five times the recommended hardware for Windows XP, which, again, is a modern OS with a graphical interface. On such a powerful specification, you should be able to run a suite of heavy applications (think Mozilla + Word + Outlook + games) at the same time, switching back and forth as you see fit.

However, Android can barely load itself on such a capable system. And what do I run? A web browser, a manga reader, and that's it. And let's not pick on Dell alone; my phone does similar things. Once, I was on the road with Google Maps navigation and Audible running concurrently. At one point, the navigation told me I would need to turn in about a mile. So I waited for the cue to turn, and waited, and waited. After about five minutes (which should have covered at least two miles), I was confused about why navigation had not yet told me which road to turn onto. The mystery was solved once I looked at the phone: apparently, Android had decided that it was out of memory or resources or whatever, and had booted my navigation service out. No warning, no choice (seriously, given the choice, Audible should go; I need the navigation, no?), nothing. It just kicked it out.
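For the curious, the mechanism behind this is Android's low-memory killer: when RAM runs short, the OS reclaims whatever processes it considers expendable, and plain background services sit near the top of that list. An app can push back by promoting itself to a "foreground" service tied to a visible notification, which the system treats as nearly untouchable. Below is a minimal sketch of that idea, assuming 2014-era (pre-Oreo) Android APIs; the NavigationService class is a hypothetical stand-in for illustration, not code from any real navigation app.

    import android.app.Notification
    import android.app.Service
    import android.content.Intent
    import android.os.IBinder

    // Hypothetical service showing how an app keeps itself alive under
    // memory pressure by running in the foreground.
    class NavigationService : Service() {

        override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
            // Foreground services must show a persistent notification to the user.
            val notification = Notification.Builder(this)
                .setContentTitle("Navigation running")
                .setContentText("Turn-by-turn guidance is active")
                .setSmallIcon(android.R.drawable.ic_menu_mylocation)
                .build()

            // Promote this service to foreground priority; the low-memory
            // killer will now prefer to reclaim ordinary background processes.
            startForeground(1, notification)

            // If the system kills the service anyway, ask it to restart it later.
            return START_STICKY
        }

        override fun onBind(intent: Intent?): IBinder? = null
    }

Of course, none of this helps the user when two apps both deserve to stay alive and the OS simply picks one to sacrifice without asking, which is exactly my complaint.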

I mean, guys, it's 2014. We know how to swap RAM out to storage to emulate more memory. We have various web-based solutions to divide the load between server and client. We have beautiful hardware that was unaffordable merely 10 years ago and unthinkable 15 years ago (sounds like a long time, but everyone using a smartphone today lived through it; it's within living memory, for God's sake!). And the damned OS cannot keep two services running at the same time. What gives?

And I don't mean that Android is the only bad system, either. Have you heard of NodeJS? I still don't understand why people use it. It basically throws away every practical and theoretical advance of computer science from the last 20 or so years (no type checking, no multi-threading, no proper concurrent programming, no proper programming paradigm, etc.) and bills itself as, um, the latest and greatest. People talk about how fast NodeJS is. Seriously, guys. If you want to program without any modern features for the sake of speed, go use assembly. I am about 200% sure it's faster. Hey, you can finally use more than one core!

The list of these "latest and greatest" technologies goes on and on. How about IDEs? Emacs was once ridiculed as "Eight Megabytes And Constant Swapping." How much memory does your IDE take? Or music: once upon a time, people listened to vinyl. Then it was CDs. Then MP3s. Then YouTube. Wanna talk about quality?

We keep talking about the advance of technology. Yet there is this drive to reimplement everything on a new platform, and the new implementation ends up slower, buggier, and less capable than the last one. I still wonder why.