This counterpart to the popular and long-running Becker Brown Bag Series was created for undergraduates, offering them an opportunity for informal discussion with prominent economists. These casual and flexible talks will highlight economic analysis as a powerful tool for understanding a wide range of real-world issues and problems. Questions are welcomed. Lunch is provided.
Agenda
Using Text to Quantify Policy Uncertainty
Democratic decision-making is often messy and fraught with uncertainty about outcomes and economic consequences. Autocratic regimes also take actions and pursue policies that create economic uncertainty. In this talk, Steven J. Davis considered simple text-based methods for quantifying policy uncertainty and assessing its relationship to economic outcomes.
Davis and coauthors Nicholas Bloom of Stanford University and Scott R. Baker of Northwestern University developed a novel index of policy uncertainty, using automated analysis of key terms in news media stories and other sources to measure fluctuating levels of uncertainty about future economic conditions. Quantifying the level of uncertainty was a key first step to analyzing its impact on economic behavior, and the index is now widely used to assess the costs of policy uncertainty.
Building on that work, Davis has now developed an index of global policy uncertainty and tracked the level over time.
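A rough sense of the term-based approach can be conveyed in a short sketch. The snippet below is a minimal illustration, not the authors' code: it assumes a hypothetical list of (month, article text) pairs and placeholder word lists, and it omits the newspaper-level scaling and standardization behind the published index.

```python
from collections import defaultdict

# Illustrative term sets only; the published index relies on specific word
# lists and additional normalization that are omitted in this sketch.
ECONOMY = {"economy", "economic"}
POLICY = {"congress", "legislation", "white house", "regulation",
          "federal reserve", "deficit"}
UNCERTAINTY = {"uncertain", "uncertainty"}

def mentions(text, terms):
    """True if the article text contains at least one term from the set."""
    text = text.lower()
    return any(term in text for term in terms)

def monthly_epu_share(articles):
    """articles: iterable of (month, text) pairs from one newspaper.
    Returns, per month, the share of articles mentioning all three
    categories: economy, policy, and uncertainty."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for month, text in articles:
        totals[month] += 1
        if all(mentions(text, terms) for terms in (ECONOMY, POLICY, UNCERTAINTY)):
            hits[month] += 1
    return {month: hits[month] / totals[month] for month in totals}

# Hypothetical example articles
sample = [
    ("2017-01", "Uncertainty over trade policy in Congress weighed on the economy."),
    ("2017-01", "Local team wins the championship."),
    ("2017-02", "The deficit debate leaves the economic outlook uncertain, Congress says."),
]
print(monthly_epu_share(sample))  # {'2017-01': 0.5, '2017-02': 1.0}
```

A rising share of such articles marks months in which news coverage ties the economy to policy-related uncertainty more heavily, which is the raw signal the published index builds on.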
Automated Economic Reasoning: Friedman Forum with Casey Mulligan
The social sciences have been profoundly affected by progress in information technology, which has facilitated the collection and processing of vast amounts of data on human activity. So far, information technology has assisted far less with theoretical reasoning of the sort done by Gary Becker or Milton Friedman. Automatic algebraic simplifiers exist, but simplicity is often in the eye of the beholder, and social science theorists use such tools sparingly. Computers have already been used to generate numerical examples, but approximation quality is a concern, and further thought is always needed to judge how far the results of particular examples generalize.
A process for automated reasoning has emerged from real algebraic geometry and is readily applied to economics, sociology, and political science. It can, among other things, arrive at purely qualitative conclusions about human behavior. For example, we may not be ready to assume how much a price increase reduces quantity demanded, only that the relationship between price and quantity demanded is negative. Computers can tell us how markets would operate under that assumption and others.
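To make the idea concrete, the sketch below shows what a purely qualitative conclusion looks like when a computer derives it. This is a minimal illustration, not the quantifier-elimination machinery from real algebraic geometry that the talk concerns: it assumes a hypothetical linear demand and supply model in sympy, imposes only sign restrictions, and lets the software confirm that a positive demand shift must raise the equilibrium price.

```python
import sympy as sp

# Only signs are assumed: demand slopes down (it enters with -b, b > 0),
# supply slopes up (d > 0), and the demand shift is positive.
a, c = sp.symbols("a c", positive=True)          # hypothetical intercepts
b, d, shift = sp.symbols("b d shift", positive=True)
p = sp.symbols("p")

# Demand: q = a - b*p   Supply: q = c + d*p
p_star = sp.solve(sp.Eq(a - b * p, c + d * p), p)[0]          # baseline price
p_new = sp.solve(sp.Eq(a + shift - b * p, c + d * p), p)[0]   # after demand shift

change = sp.simplify(p_new - p_star)
print(change)              # shift/(b + d)
print(change.is_positive)  # True: price must rise, with no magnitudes assumed
```

The conclusion that the price change is positive follows from the sign assumptions alone; no numerical slopes or intercepts are ever specified.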
New technology like this will transform reasoning in the social sciences the way it transformed data processing in the recent past. Much pencil-and-paper reasoning may someday be as archaic as the mechanics of, say, arithmetic already are.
In this talk, Casey Mulligan, professor in the UChicago Department of Economics, reviewed how technology assists with theoretical reasoning. He discussed the size and complexity of the problems that can be processed this way, and how to check whether “a machine made a mistake.”
Why Are Some Companies Efficient While Others Are Not?
Economists have shown that large and persistent differences in productivity across businesses are ubiquitous. In this talk, Chad Syverson reviewed what the research tells us about the reasons for these differences. He showed that the causes of productivity differences are manifold: they include elements rooted in production practices, over which producers have some direct control, at least in theory, as well as elements of producers’ external operating environments.
After reviewing current knowledge, Syverson, the J. Baum Harris Professor of Economics at Chicago Booth, outlined the major questions that remain in understanding variation in firms’ efficiency.
His talk was based on this paper.