Economists have long relied on advanced mathematics and statistics to investigate questions critical to their field. Now, as those questions grow more complex, the Becker Friedman Institute is helping young economists expand their analytical toolbox with computational and numerical techniques borrowed from engineering, applied mathematics, and physics.
The 2015 Computational Economics Colloquium features talks by five experts at the forefront of applying sophisticated techniques in scientific computation, software engineering, and numerical methods to analyze vast sets of data. As a result, graduate students in economics are gaining new ways to tackle dynamic problems and achieve more realistic answers.
“Many studies in economics are based on stylized models with simplifying assumptions to obtain clean intuition or insights, which may not hold under more realistic or complex economic models,” explains Che-Lin Su, an associate professor at the University of Chicago Booth School of Business who organized the colloquium with Philipp Eisenhauer of the University of Bonn. But when economists want to examine more complex problems, such as the implications of a set of tax-policy changes during an inflationary period, they need more sophisticated tools that can accommodate many variables and large data sets. “Numerical or computational methods will be the only way to thoroughly examine those models,” notes Su.
Su and Eisenhauer designed the colloquium to complement graduate courses in economics they were each teaching on campus. Su’s course, “Numerical Methods in Economics,” taught in Winter 2015, introduced computational approaches and techniques from numerical analysis to help students solve economic models. He covered standard methods like numerical optimization, interpolation and approximation techniques, and approaches to differential equations that are useful for analyzing economic problems such as estimating empirical models.
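To give readers a flavor of these techniques, the sketch below, written in Python with an invented function and parameters rather than drawn from Su’s syllabus, uses Chebyshev polynomials, a workhorse approximation tool in computational economics, to represent a nonlinear function on a grid:

    import numpy as np
    from numpy.polynomial import chebyshev as cheb

    # Approximate a utility-like function u(c) = log(c) on [0.1, 2.0]
    # with a degree-8 Chebyshev polynomial fitted at Chebyshev nodes.
    a, b = 0.1, 2.0
    degree = 8

    # Chebyshev nodes on [-1, 1], then mapped into [a, b]
    k = np.arange(degree + 1)
    nodes = np.cos((2 * k + 1) * np.pi / (2 * (degree + 1)))
    x = 0.5 * (b - a) * (nodes + 1) + a

    coeffs = cheb.chebfit(nodes, np.log(x), degree)

    # Evaluate the approximation on a fine grid and check the error.
    grid = np.linspace(a, b, 201)
    z = 2 * (grid - a) / (b - a) - 1      # map the grid back to [-1, 1]
    print("max abs error:", np.max(np.abs(cheb.chebval(z, coeffs) - np.log(grid))))

Because Chebyshev nodes cluster near the endpoints, a fit like this avoids the oscillation that plagues interpolation on evenly spaced points, which is one reason the method appears so often in economic applications.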
In Eisenhauer’s course, “Software Engineering for Economists,” graduate students learned how to harness basic software engineering tools to implement complex economic models. Thanks to a grant from Microsoft Research, students have access to the Azure cloud-computing infrastructure, which allows them to explore the scalability of the cloud for computation- and data-intensive research projects.
By bringing other experts to campus for the colloquium, the institute broadened and deepened the learning experiences that begin in such courses. Through exposure to these scientists, young economists get to see the potential for high-level applications of the tools they are learning in class.
“Numerical or computational methods will be the only way to thoroughly examine [more realistic and complex] models.”
They also gain insight that helps shape their own work. Jeremy Bejarano, a second-year Ph.D. student in economics who has been attending the colloquium in advance of forming his research questions, appreciates the opportunity to hear these cutting-edge perspectives.
“What makes the computation colloquium really great is that it brings into perspective the larger scientific community, not just economic ideas,” said Bejarano. He especially appreciated a presentation by Todd Munson, a computer scientist from Argonne National Laboratory. Munson linked economics, physics, and game theory, explaining how the processing of data captured by Argonne’s Advanced Photon Source can be cast as an optimization problem in a two-player game.
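Munson’s actual formulation is more elaborate than anything that fits in a magazine piece, but the generic idea of solving a two-player game as an optimization problem can be sketched in a few lines of Python; the payoff matrix here is invented for illustration:

    import numpy as np
    from scipy.optimize import linprog

    # Solve a small two-player zero-sum game by linear programming:
    # the row player picks a mixed strategy x maximizing the game
    # value v, subject to (A^T x)_j >= v for every column j.
    A = np.array([[3.0, -1.0],
                  [-2.0, 4.0]])          # row player's payoffs

    m, n = A.shape
    c = np.zeros(m + 1); c[-1] = -1.0    # linprog minimizes, so minimize -v
    A_ub = np.hstack([-A.T, np.ones((n, 1))])        # v - (A^T x)_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.zeros((1, m + 1)); A_eq[0, :m] = 1.0   # strategy sums to 1
    b_eq = np.array([1.0])
    bounds = [(0, None)] * m + [(None, None)]        # x >= 0, v free

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    print("strategy:", res.x[:m], "game value:", res.x[-1])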
“To be in a workshop with someone like Todd Munson and to gain insight into what he does and how it can be applied to economics is huge and very unique. It is great to have these ideas in the back of my mind about how to solve problems in the future,” Bejarano said.
Chiara Fratto, a third-year Ph.D. student in economics, shared this viewpoint and acknowledged the opportunities and challenges she and her fellow graduate students face. As the landscape of these sophisticated analytical tools expands for them, so do the challenges of choosing, learning, and applying the tools to their own research questions.
“Todd (Munson) was talking about something that was very foreign to me, about waves and physics,” said Fratto. “What was interesting about it was how to apply the software and methods he was describing to analyze the model that I am studying. That is the challenge we are going to face. It is a problem of language. How do I translate my question, my problem into a problem that a person who studies numerical methods can understand and solve?”
Fratto’s research focuses on modeling migration, a topic naturally full of complexities and thus ripe for these kinds of methods.
“This problem can become very complicated,” she explains. “It takes into account people moving across locations in a dynamic sense and expecting future dynamics of the main variables. Then, there are people who have to consider where to go, which location to choose. This is the numerical challenge that I am facing.”
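Her description maps onto a classic dynamic-programming structure. As a loose illustration, and not Fratto’s actual model, the following Python sketch solves a toy dynamic location-choice problem by value function iteration, with wages, moving costs, and a discount factor invented for the example:

    import numpy as np

    # Toy dynamic location choice: each period, an agent in one of
    # three locations chooses where to live next, trading off the
    # destination's wage against the cost of moving there.
    wages = np.array([1.0, 1.5, 2.0])    # per-period payoff by location
    move_cost = 0.8 * (1 - np.eye(3))    # zero cost of staying put
    beta = 0.95                          # discount factor

    V = np.zeros(3)                      # value of being in each location
    for _ in range(1000):
        # value of choosing destination j when starting from origin i
        choice_values = wages[None, :] - move_cost + beta * V[None, :]
        V_new = choice_values.max(axis=1)
        if np.max(np.abs(V_new - V)) < 1e-10:
            break
        V = V_new

    policy = choice_values.argmax(axis=1)   # best destination per origin
    print("values:", V, "policy:", policy)

Adding expectations over future wages, as Fratto describes, would replace the fixed wage vector with a stochastic process, which is exactly where the dimensionality, and the need for serious numerical machinery, starts to grow.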
Fratto believes she has to acquire more numerical tools to help her finalize her approach, but even selecting from among the array of sophisticated options in front of her presents new challenges.
“You want to have the tools so that you know what kinds of questions you can answer, and, at the same time, if you have an interesting question, you have to look around and see what kind of tools can actually help you with that,” she said.
To be clear, many experienced economists have been using these scientific methods for years, but the early adopters still represent a small percentage of the profession. Skeptics of using these tools for economic analysis often cite concerns about a potential lack of transparency. Because the calculations and techniques are frequently embedded in code, assessing the research and replicating its results can be difficult. Eisenhauer believes these challenges are surmountable.
“This is a valid criticism,” he notes. “I hope we can make progress by using tools from computational science, which are widely adopted in other fields such as biology and chemistry to address these problems. They can lead to more transparent and reproducible implementations, even of complex models.”
Victor Zhorin, a senior researcher at UChicago’s Computation Institute and one of the colloquium speakers, concurs on the benefits of moving economic analysis in this direction. Trained as a theoretical physicist, Zhorin has an extensive background in applying these techniques to economic analysis. He argues that economists have long known that economic relationships can be extremely complex, and that scientific computing methods make it possible to accommodate far more of that complexity.
“For example,” said Zhorin, who is speaking on tensor-based computing, “my talk is about how economists can create the multidimensional structures necessary to solve these complex problems very efficiently and use existing computer architecture to do so. These would be impossible to solve by standard methods.”
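Zhorin’s own methods are not spelled out here, but a small generic example conveys the idea of exploiting multidimensional structure: rather than building one enormous transition matrix over a combined state space, each dimension is kept as its own small matrix and contracted in a single tensor operation (the grids and probabilities below are random placeholders):

    import numpy as np

    # Expected payoffs over a 3-D state space (productivity x wealth x
    # location), using independent per-dimension transition matrices.
    rng = np.random.default_rng(0)
    payoff = rng.random((10, 20, 5))     # payoff at each grid point

    def stochastic(k):                   # random row-stochastic matrix
        P = rng.random((k, k))
        return P / P.sum(axis=1, keepdims=True)

    P_prod, P_wealth, P_loc = stochastic(10), stochastic(20), stochastic(5)

    # E[payoff | today's state]: sum over tomorrow's states (a, b, c)
    # of the product of the three transition probabilities and payoff.
    expected = np.einsum('ia,jb,kc,abc->ijk', P_prod, P_wealth, P_loc, payoff)
    print(expected.shape)                # (10, 20, 5)

A full transition matrix over the combined 10 × 20 × 5 grid would have a million entries; the tensor contraction touches only the three small factors, which is the kind of efficiency Zhorin has in mind.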
Skeptics notwithstanding, initiatives like the Computational Economics Colloquium appear to be drawing graduate students toward these frontiers of economics research.
“There are a lot of hard problems within economics,” notes Bejarano. “To have someone like Jesús Fernández-Villaverde, Todd Munson, or Felix Kübler give us a sense of how we might solve problems with computational methods and techniques is exciting. And, if you can be the first one to successfully apply these to previously unsolvable problems, I believe the profession will reward that.”
—Jennifer Roche