Part of what the Nobel Prize in economics does is create a narrative of what economics is. It is the first place outsiders look to learn what it is that economists do, and which things are important and unimportant. Unfortunately, most winners of the Nobel Prize are either very old or dead, and in any case are being rewarded for work done twenty or more years earlier. It is very difficult to get a sense of who is on the cutting edge.
I hope to help those who are curious. I cannot pretend to have a comprehensive knowledge of the field; nevertheless, as an active consumer of research, there are some people whose work has really impressed me. I have kept this to people under the age of 45. People older than that still have active careers, but they have had sufficient time to become better known to the public – the only three whom I simply must mention are Michael Kremer, Nicholas Bloom, and Chad Jones. Some of the figures on the list are grouped together for ease of exposition, and I exclude anyone who has won a major award – sorry, Nakamura and Steinsson! The order is arbitrary, and is not a ranking.
Without any further preamble, here is the list:
I do not intend to give each author here an all-encompassing treatment. I will not make reference to papers I have not read; the focus will be on what work has impressed me.
Martin Rotemberg:
I first came across him while learning about the dispersion of firm productivity. There is a story that firms in India and China have much greater variance in productivity than firms in the United States. There is a long tail of unproductive, poorly managed firms which drags down aggregate productivity so much that, according to Hsieh and Klenow (2009), simply moving to the distribution of the United States would increase total factor productivity by roughly 50%. Looking at India certainly gives the impression that there are masses of unproductive firms.
I was rather shocked, then, by Rotemberg's paper on misallocation with Hang Kim and T. Kirk White. India's statistical bureau does not clean its data at all: it sends out the forms, enters whatever the firms report, and does nothing to guard against misreporting and error. Since the United States does clean its data, the two countries' misallocation figures are not comparable. Applying the same cleaning procedure to both datasets largely removes the supposed difference in productivity dispersion.
In other work, he has a paper with Richard Hornbeck on the effect of the railroads on American economic growth. They are currently digitizing the Census of Manufactures, an effort that will doubtless help many other economists; using it, they are able to show that the American economy had massive allocative inefficiency, and that reducing this inefficiency was responsible for perhaps a quarter of American economic growth.
I can also report that he responded to my email in the politest possible way.
Shengwu Li and Mohammad Akbarpour:
I can thank Akbarpour for my interest in auction theory. He posted the syllabus for an early-summer class he was teaching with Hartline, which I read through. From there, I became more and more interested in market design, in industrial organization, and in nearly everything that interests me today. I owe him quite a lot.
One of the papers on it was co-authored with Shengwu Li (who happens to be, by the way, a grandson of Lee Kuan Yew. Talented family, they). They show that auctions face a trilemma: an auction can take place in one round, it can be strategy-proof, and it can be credible, but it can satisfy at most two of the three. (Credibility is resistance to cheating by the auctioneer itself – misreporting the bids it receives, or planting confederates to bid up the price.) The optimal auction for each situation depends on which criterion you can or must relax. An auction for advertising slots on the internet must conclude instantaneously, for example, which precludes multi-round bidding.
Another really cool paper by the two of them (with Shayan Oveis Gharan) examines when an algorithm should match people. As time goes on, the number of possible matches grows, but at the same time, people can leave the market. They show that the key parameter is the planner's ability to tell who is about to leave the market. If the planner can tell, then patience is optimal – otherwise, a "greedy" matching algorithm is better. And in large markets, accurately reporting your own urgency is incentive compatible.
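To make the greedy-versus-patient distinction concrete, here is a toy simulation in the spirit of their model – my own sketch with illustrative parameters, not the paper's calibration. Agents arrive over time, any pair is compatible with probability p, each agent perishes after a random lifetime, and the patient planner is assumed to observe departure times:

```python
import random

# A rough sketch (not the paper's exact model) of greedy vs. patient
# matching in a dynamic market: agents arrive one per period, any pair
# is compatible with probability p, and an unmatched agent perishes
# after an exponentially distributed lifetime. The patient planner is
# assumed to see departure times and match agents at the last moment.

def simulate(policy, n=4000, p=0.02, mean_life=30, seed=1):
    rng = random.Random(seed)
    depart = {}   # agent -> time it perishes if unmatched
    edges = {}    # agent -> set of compatible agents still waiting
    matched = 0

    def match(a, b):
        nonlocal matched
        matched += 2
        for x in (a, b):                       # remove both from the pool
            for nbr in edges.pop(x):
                edges[nbr].discard(x)
            depart.pop(x)

    for t in range(n):                         # one arrival per period
        nbrs = {b for b in depart if rng.random() < p}
        depart[t] = t + rng.expovariate(1 / mean_life)
        edges[t] = nbrs
        for b in nbrs:
            edges[b].add(t)
        if policy == "greedy" and nbrs:        # match on arrival if possible
            match(t, rng.choice(sorted(nbrs)))
        # agents about to perish: the patient planner matches them now
        for a in [a for a, d in depart.items() if d <= t + 1]:
            if a not in depart:                # already matched this period
                continue
            if policy == "patient" and edges[a]:
                match(a, rng.choice(sorted(edges[a])))
            else:                              # perishes unmatched
                for nbr in edges.pop(a):
                    edges[nbr].discard(a)
                depart.pop(a)

    return matched / n                         # fraction of agents matched

for policy in ("greedy", "patient"):
    print(policy, round(simulate(policy), 3))
```

In runs like this, the patient policy matches a noticeably larger share of agents – and its advantage evaporates if you take away its knowledge of departure times, which is the paper's point.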
Richard Hornbeck:
I am consistently impressed with how big his work is. He works with big datasets to answer big questions in economic history. How important were the railroads for economic development? How important was barbed wire in the settlement of the American West? How did the Great Mississippi Flood of 1927 affect black migration and agricultural development? How much does built infrastructure hold back economic development? The answers blow me away every time.
We have already seen his work with Rotemberg on the railroads' effect on industry; before that, he had a paper with Dave Donaldson on the same question. Lacking access to manufacturing data, they use modern trade theory to measure the railroads' effect on agricultural production and settlement. There he finds estimates much smaller than in the later paper, in line with the prior literature. (I have covered this!)
Anthony Lee Zhang:
You might already follow him on Twitter; I did not make the connection between his account and the actual person for some time. I first came across him while learning about cryptocurrency. He had written a paper, along with others, on loss-versus-rebalancing, which gives us equations for figuring out how much an automated market maker loses to adverse selection. It's a very practical paper – too practical, in fact, for me, since I am not engaged in any way in the creation of cryptocurrency exchanges.
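If I am remembering the headline result correctly (do check the original before leaning on this), it says that for an automated market maker holding $x^*(P)$ units of a risky asset whose price follows geometric Brownian motion with volatility $\sigma$, losses to arbitrageurs accrue at the instantaneous rate

$$\ell(\sigma, P) \;=\; \frac{\sigma^2 P^2}{2}\,\left|\frac{dx^*}{dP}\right|,$$

which for a constant-product pool works out to $\sigma^2 V / 8$ per unit time, where $V$ is the value of the pool. So a pool of assets with 80% annualized volatility would bleed roughly $0.8^2/8 \approx 8\%$ of its value per year to adverse selection, before fees – exactly the kind of number an exchange designer needs.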
But I poked around his CV more, and was struck by many of the papers. He has a paper which shook my view of what the NFT boom was about – NFTs were not simply speculative assets pursued by risk-loving buyers. The reselling happened because demand in the initial auction was sensitive to other people's demand for the product, and artists would far rather underprice than overprice. An NFT would have demand even if resale were not possible.
He has some other papers I quite like, but they will be covered under Bradley Larsen next. (It is my impression that Larsen gathered the data, and Zhang did the math.) My personal impression of him is that he is quite unusually smart – his brain moves much faster than other people's. He is also supremely confident, which I suspect some will find arrogant; I cannot penalize anyone for arrogance that they amply back up. Personally, I have found him friendly, helpful, and willing to give uncompromising advice.
Bradley Larsen:
Bradley Larsen is doing really impressive work on transactions and auctions. In several papers with Zhang, he used wholesale car auctions to estimate the bargaining power of buyers and sellers, and to show the value that intermediaries can bring by making transactions incentive compatible. In another, he is able to put a real value on the losses due to information constraints in one-on-one bargaining. The Myerson-Satterthwaite theorem states that if people have private information and there is no outside money subsidizing trades, no mechanism can guarantee that all mutually beneficial trades occur; Larsen is able to say that the losses come to between 12 and 23% of the possible gains from trade. With Freyberger, he uses data from eBay to find that fully 35% of trades are stymied by informational constraints.
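To get a feel for where such losses come from, here is the textbook benchmark – a toy calculation of mine, not Larsen's empirical model. With a buyer value and a seller cost drawn uniformly from [0, 1], the classic Chatterjee-Samuelson double auction has a linear equilibrium in which trade occurs only when the value exceeds the cost by at least 1/4:

```python
import random

# A toy illustration of Myerson-Satterthwaite-style losses (not Larsen's
# model): buyer value v and seller cost c are i.i.d. uniform on [0, 1].
# First best trades whenever v > c. In the linear equilibrium of the
# k = 1/2 double auction (Chatterjee-Samuelson), trade happens only
# when v >= c + 1/4, so some mutually beneficial trades are lost.

def gains(n=1_000_000, seed=0):
    rng = random.Random(seed)
    first_best = 0.0
    double_auction = 0.0
    for _ in range(n):
        v, c = rng.random(), rng.random()
        if v > c:
            first_best += v - c
        if v >= c + 0.25:                 # equilibrium trading region
            double_auction += v - c
    return first_best / n, double_auction / n

fb, da = gains()
print(f"first-best gains per meeting: {fb:.4f}")        # ~1/6  = 0.1667
print(f"double-auction gains:         {da:.4f}")        # ~9/64 = 0.1406
print(f"share of gains lost:          {1 - da/fb:.1%}") # ~15.6%
```

The stylized model gives up about 16% of the first-best gains from trade – comfortably inside Larsen's empirical 12-23% range, which is part of why I find the estimate believable.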
I can also report that when I emailed him to inquire about his data, he was unfailingly polite, and was willing to walk me through what it had taken to acquire it. (A lot of work, and quite a lot of luck too.)
Shoshana Vasserman:
Another economist working in industrial organization, she has done some incredible work measuring the risk aversion of firms. With Valentin Bolotnyy, she uses bidding on Massachusetts bridge maintenance contracts to quantify exactly how risk averse firms are. Knowing how risk averse firms are tells us the optimal mechanism by which to hold the auction (if firms are risk neutral, of course, revenue equivalence holds and it really doesn't matter how we hold it). When costs are uncertain, firms bid above what they expect the project to actually cost, because they fear overruns more than they value bidding precisely. The government has to choose between incentivizing innovation and reducing the cost of a given project, and the parameters to compare are firm risk aversion and expected innovation; this paper pins down half of that comparison.
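The risk-neutral benchmark she deviates from is easy to check numerically. Here is a quick sketch of mine (the textbook setup, nothing from the paper): with n bidders whose values are i.i.d. uniform on [0, 1], a first-price and a second-price auction raise the same expected revenue, (n-1)/(n+1).

```python
import random

# Checking revenue equivalence under risk neutrality (a textbook
# benchmark, not Bolotnyy and Vasserman's model): n bidders with values
# i.i.d. uniform on [0, 1]. In a second-price auction bidders bid their
# values; in a first-price auction the symmetric equilibrium bid is
# b(v) = v * (n - 1) / n. Both formats should raise (n - 1) / (n + 1).

def expected_revenue(fmt, n_bidders=4, n_auctions=500_000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_auctions):
        values = sorted(rng.random() for _ in range(n_bidders))
        if fmt == "second-price":
            total += values[-2]                  # pay second-highest value
        else:
            total += values[-1] * (n_bidders - 1) / n_bidders  # winner's bid
    return total / n_auctions

for fmt in ("first-price", "second-price"):
    print(fmt, round(expected_revenue(fmt), 4))  # both ~ 3/5 with n = 4
```

With risk-averse bidders the equivalence breaks – first-price bidders shade their bids less, since losing hurts more than slightly overpaying – and wedges like that are what let bids reveal risk aversion in the first place.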
That risk-aversion paper alone would be sufficient to land her here. I also thought her paper with John Horton and Mitchell Watt, on reducing "congestion" in labor markets, was quite important. Especially with generative AI coming, employers are going to have a hard time screening applicants: because applicants do not pay to have their applications reviewed, a fall in the cost of sending an application will deluge employers in worse and worse possible matches. They look at two possible solutions, and find that while capping the number of applications reduced their number without changing the quality of matches, introducing a fee reduced the quality of applicants. The fees were very small, which suggests a discontinuity in how people treat spending any amount of money at all.
Matt Backus:
He is here on the strength of his dissertation. It is very good. Concrete is very heavy (citation needed) and does not ship well. Because of this, each plant can serve only a very small area, which means market power can vary greatly from place to place. We economists think that a lack of competition means not only markups, but also firms choosing inefficient production techniques. He finds exactly this – changes in competition lead to changes in productivity. What's more, he is able to rule out selection as the explanation – it is not that inefficient firms are dropping out, but that firms are changing what they do.
This is enormously important! The gains from trade can be unboundedly large once we allow for changes in what goods are produced or which techniques are used. We can thereby harmonize the apparently large harm of trade restrictions with the small losses implied by conventional welfare analysis. And of course, the paper is incredible methodologically – pure wizardry.
Mark Koyama:
Prof. Koyama teaches in my department, and I admire him immensely. His book, "How the World Became Rich" (written with Jared Rubin), is the best book on the Industrial Revolution – and certainly the most up-to-date one. If you want to get to the frontier of our understanding of the Industrial Revolution, read it. It is short, too – I read it in one frantic night. I could not put it down.
He also has several papers on the Jews in medieval Europe which I thought were cool, and an important one on the fractured-land hypothesis. The idea is that China's fertile central region led to it being repeatedly unified, while the rougher geography of Europe prevented the reformation of the Roman Empire. (The hypothesis is given book-length treatment in Scheidel's "Escape from Rome".) He and his co-authors find that topography was sufficient, but not necessary, to explain the difference: the gap in agricultural productivity would also suffice.
He also gave me his old television. Hell yeah.
These are some of my favorite young economists. I hasten to point out that not being included is simply a sign that I haven’t read your papers, and should. There is always yet more to learn. If you want more things to read, and enjoyed this article, I wrote something in a similar style on my favorite papers. Happy reading, and happy holidays. :)