The Becker Friedman Institute (BFI) launched four new research initiatives earlier this year to support collaborative research and learning around several key issues: the study of Big Data; Development Economics; International Finance, Macro and Trade; and Labor. These initiatives will supplement the work already underway through ongoing research initiatives on Health Economics; Macro Financial Research; Price Theory; and the Chicago Experiments initiative, which studies economic behavior in field settings.
We are pleased to spotlight the Big Data initiative, which has launched under the leadership of three scholars with strong expertise in this area: Stéphane Bonhomme, Professor in Economics and the College; Ali Hortaçsu, The Ralph & Mary Otis Isham Professor in Economics and the College; and Azeem Shaikh, Professor in Economics and the College and Thornber Research Fellow.
Professor Bonhomme participated in a brief Q&A with BFI to discuss the team’s vision for this new research initiative:
What are the goals of the Big Data Initiative?
Broadly, we are enhancing the use of large data sets, which are becoming increasingly available. “Large” can mean many observations (firms, web searches, trades, etc.), but it also refers to how much is known about the agents or events that the data codify. For example, firm-level information may include the workers employed, their occupations and the tasks they perform, and information on their backgrounds, but also financial information, balance sheets, and links to other firms. Similarly, one may have a rich dataset on web browsing by a large set of people, making it possible to map the network of internet purchases across buyers and sellers.
Our research is focused on two main directions: studying substantive questions related to big data, and developing tools to better analyze and interpret this type of data. On the substantive side, we are investigating the ability of rich data to reduce informational frictions and improve market efficiency, as well as privacy issues and the role of government and regulation. On the methods side, we are exploring new econometric and statistical methods that can handle high-dimensional objects (complex data structures, large numbers of parameters to estimate and draw inferences about, etc.), and the use of computing power and computer-science tools to improve data storage, access, and computation.
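To give a flavor of what “a large number of parameters” means in practice, here is a minimal illustrative sketch, not a method or dataset from the initiative itself: all variable names, parameter values, and the simulated data below are invented for the example. It shows lasso regression fit by coordinate descent, a standard high-dimensional technique that can single out a handful of relevant predictors even when there are more parameters than observations.

```python
import numpy as np

def soft_threshold(x, lam):
    """Shrink x toward zero by lam; the lasso's key operation."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def lasso(X, y, lam, n_sweeps=100):
    """Coordinate descent for min_b (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    for _ in range(n_sweeps):
        for j in range(p):
            # partial residual that excludes coordinate j's contribution
            r_j = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r_j / n
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
    return beta

# Simulated high-dimensional setting: 200 candidate predictors,
# only 3 of which matter, and just 100 observations (all made up).
rng = np.random.default_rng(0)
n, p = 100, 200
X = rng.standard_normal((n, p))
true_beta = np.zeros(p)
true_beta[:3] = [3.0, -2.0, 1.5]
y = X @ true_beta + 0.1 * rng.standard_normal(n)

beta_hat = lasso(X, y, lam=0.1)
support = np.flatnonzero(np.abs(beta_hat) > 0.5)
print(support)  # should recover the three truly relevant predictors
```

The soft-thresholding step sets most coefficients exactly to zero, which is what keeps estimation and inference tractable when the parameter count exceeds the sample size.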
What types of research do you hope to support, and how could that research help advance the use of big data to solve key economic and social questions?
We would like to support work that combines economic thinking and data analysis tools. Such research would take advantage of the new wealth of information that data provide to improve the measurement of key quantities (e.g., aggregate quantities such as GDP or inflation, and their disaggregation across sectors or geographies). This research would use and develop new methods to analyze large models of rich data sets that combine multiple dimensions of economic activity, such as the interaction between labor markets and product markets.
For example, I have worked with Thibaut Lamadon (UChicago) and Elena Manresa (NYU) on the sources of wage dispersion. In particular, we assess to what extent this dispersion arises from firms paying similar workers differently. This requires documenting what happens to a worker’s wage when she moves to a different firm. A first look at data on worker flows between firms is daunting: individual records of wages and employment are now available for hundreds of thousands of firms and millions of workers. To analyze the data (in fact, even to “plot” it to make sense of the key patterns), we need tools to reduce its dimensionality. This is what we do in our joint work, using simple but highly effective classification techniques from machine learning.
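A rough sketch of this kind of dimension reduction might look like the following. This is a toy illustration only, not the authors’ actual procedure or data: firms are simulated with latent type-specific wage distributions, each firm is summarized by its empirical wage CDF on a grid, and a plain k-means pass classifies firms into a small number of groups. All parameter values below are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate 3 latent firm types; each firm draws 50 log wages from
# its type-specific distribution (all values are made up).
type_means = [2.0, 2.5, 3.0]
firms_per_type, workers_per_firm = 30, 50
wages, true_type = [], []
for t, mu in enumerate(type_means):
    for _ in range(firms_per_type):
        wages.append(rng.normal(mu, 0.3, workers_per_firm))
        true_type.append(t)

# Summarize each firm by its empirical wage CDF on a common grid,
# reducing each firm to a short, comparable feature vector.
grid = np.linspace(1.0, 4.0, 20)
features = np.array([[np.mean(w <= g) for g in grid] for w in wages])

def kmeans(X, k, n_iter=50):
    """Lloyd's algorithm with a deterministic farthest-point init."""
    centers = [X[0]]
    for _ in range(k - 1):
        # next center: the point farthest from all existing centers
        d = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(n_iter):
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = X[labels == c].mean(axis=0)
    return labels

labels = kmeans(features, k=3)
# Each firm now carries a discrete class label instead of a full wage
# distribution, making worker flows between classes easy to tabulate.
```

The payoff is that worker transitions can be tabulated between a handful of firm classes rather than between hundreds of thousands of individual firms.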
How is the University of Chicago uniquely suited for research on the study of big data and which disciplines will be part of your initiative’s work?
UChicago is an obvious place to launch this initiative. The Department of Economics has a long tradition of data analysis and innovation in methods to answer core economic questions, exemplified by the work of UChicago economists Lars Peter Hansen and James Heckman. The analytical depth of BFI’s partners, which include the Booth School, the Harris School, and the Law School, makes UChicago a unique place to combine advances in measurement with conceptual breakthroughs on issues such as the efficiency of markets or the role of regulation. The Department of Statistics and the Department of Computer Science are also well placed to contribute to the success of the Big Data Initiative, by helping to provide and disseminate tools that make large data sets easier to manage and analyze.