Google uses an interesting ranking system to determine, along with other measures, how “good” a search result is. This ranking system is called PageRank, named not after webpages but after Larry Page, one of the founders of the Silicon Valley giant. Informally, PageRank measures how well a site is linked to by other pages that are themselves well linked to by other pages that are well linked to, and so on.
Similarly, Google Scholar takes into account an author’s h-index, a measure of a researcher’s prestige. Someone with an h-index of five has published at least five papers that have each been cited at least five times. Scholar Citations conveniently calculates this for us. Vincenzo De Florio, for example, has an h-index of fourteen at the time of writing.
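The definition is simple enough to compute directly. Here is a minimal sketch in Python; the citation counts are made up purely for illustration:

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least
    h papers have at least h citations each."""
    h = 0
    # Walk the papers from most-cited to least-cited; the i-th
    # paper (1-indexed) counts toward h only if it has >= i citations.
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Five papers cited [9, 7, 6, 2, 1] times: three of them
# have at least three citations each.
print(h_index([9, 7, 6, 2, 1]))  # 3
```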
This ranking system has gained popularity with the seemingly exponential advancement of science, being extended even to measuring one’s reputation on Twitter—someone with a T-index of five has posted at least five tweets that have each been retweeted at least five times.
These three, PageRank, h-index, and T-index, share at least one essential feature.
Progress can be really, really hard.
Consider, for example, the task of increasing one’s T-index from five to six. This requires the tweeter either to post six brand-new tweets and have each retweeted six times, for a total of thirty-six retweets; to post one new tweet and have it and the five previous qualifying tweets brought to six retweets each, for a total of eleven retweets; or, most likely, some combination in between.
Once this task is done, imagine now increasing the T-index to seven—requiring between thirteen and forty-nine retweets. To then reach eight, between fifteen and sixty-four. Nine, between seventeen and eighty-one. And so on, with an ever-increasing requisite for hard work.
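The arithmetic above generalizes: raising a T-index from t to t+1 takes between 2t + 1 retweets (in the best case) and (t + 1)² retweets (in the worst). A small sketch of that calculation:

```python
def retweet_bounds(t):
    """Retweets needed to raise a T-index from t to t + 1.

    Best case: the t existing qualifying tweets each sit at exactly
    t retweets, so each needs one more, plus one new tweet retweeted
    t + 1 times: t + (t + 1) = 2t + 1.
    Worst case: t + 1 brand-new tweets, each retweeted t + 1 times:
    (t + 1) ** 2.
    """
    return 2 * t + 1, (t + 1) ** 2

for t in range(5, 9):
    best, worst = retweet_bounds(t)
    print(f"{t} -> {t + 1}: between {best} and {worst} retweets")
```

Running this reproduces the figures in the text: eleven to thirty-six retweets for five to six, thirteen to forty-nine for six to seven, and so on.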
Gros asks a seemingly innocent question about the existence of “complexity barriers” in science and technology. For example, consider the increasing human life expectancy. Certainly, the maximum age should level off as we reach the limits of our biology, but, contrary to each generation’s predictions, it steadily grows at a rate of roughly two and a half years every decade.
What forces are at play here? Medicine advances at an exponential rate, yet the human life span does so only linearly. Do science and technology have the same essential property as PageRank and the h-index? That is, just as it takes more work to move from an h-index of six to seven than it does from five to six, does it take more work to move from a life expectancy of sixty to seventy than it does from fifty to sixty?
I do not intend to say that the causes of these two “complexity barriers” are the same, only that the two barriers are analogous in nature.
Gros states that progress is not necessarily difficult, only that it takes ever more resources. These can be computing power, as is the case in forecasting weather conditions or other chaotic systems; or marketing prowess, as is the case in going viral on social media; or simply more money, as is the case in—well, as is the case in a lot of things.
What does this essence of progress, of complex systems, really mean? Trivially, it means we will face complexity barriers when we are no longer willing to invest ever-higher orders of resources. But, as Gros goes on to speculate, even when we are able and willing to spend more, what other limits exist? Is there, so to say, a “human factor” at play? Is the brain wired with an inherent maximum capacity for information or advancement? And if so, what effect does it have on the rate of scientific progress?
According to Gros, one simple consequence of a law of diminishing returns in science, driven by human factors and ever-increasing requisites, is that firms should invest not primarily in a few large projects but rather in a large number of small-to-medium ones.
What other effects of complexity barriers are there? ∎
Follow me on Twitter. Let’s chat sometime.