Fordham University | The Jesuit University of New York
 



Psychology Professor’s Research Aims to Perfect Predictions


David Budescu, Ph.D., is working to harness the “wisdom of the crowd” in a new joint research project.

Photo courtesy David Budescu

By Patrick Verel


Can you predict the future? If so, David Budescu, Ph.D., would like to talk to you.

Budescu, the Anne Anastasi Professor of Psychometrics and Quantitative Psychology, is working with a team of researchers to improve the ways crowdsourcing can be used to predict future events. The team is led by Applied Research Associates, a private research company, and includes researchers from Fordham University, the University of Maryland, the University of California, Irvine, the University of Michigan, Wake Forest University, the University of Missouri, Ohio State University, and Global Cognition, another private research firm.

The Aggregative Contingent Estimation (ACE) Program is a four-year, continuously updating study that started last May. Anyone with online access can join; in fact, the more volunteers the study gets, the better, because a diversity of opinions helps the team see who can best predict urgent questions of the day related to business, national security, science, and technology.

The study is being funded by the Intelligence Advanced Research Projects Activity (IARPA), under the federal Office of the Director of National Intelligence. Because intelligence analysts are often asked to forecast significant events on the basis of limited quantitative data, IARPA’s goal in the ACE Program is to dramatically enhance the accuracy, precision and timeliness of forecasts through the development of advanced elicitation, aggregation and communication techniques.

Team members are currently addressing three challenges:

• How to best elicit information from participants;

• How to develop statistical models to best combine the estimates given by participants;

• How to present the resulting information to users in the most efficient and informative ways.
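The second of these challenges, combining participants' estimates, can be sketched with a toy example. The simple mean and median below are stand-ins for the team's far more sophisticated statistical models, and the forecast values are invented:

```python
# Toy aggregation of individual probability forecasts for one event.
# The mean and median are simple stand-ins for the far more
# sophisticated combination models the ACE team is developing.
from statistics import mean, median

def aggregate_forecasts(probs):
    """Combine individual probability estimates, each in [0, 1]."""
    return {"mean": mean(probs), "median": median(probs)}

# Five hypothetical answers to a yes/no forecasting question:
forecasts = [0.30, 0.45, 0.50, 0.20, 0.80]
print(aggregate_forecasts(forecasts))
```

The median is often favored in simple settings because it is robust to a few extreme forecasters, but the point of the research is precisely to find combination rules that outperform such naive aggregates.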

Budescu’s group at Fordham is focusing on two issues in particular: how to present findings to decision makers, and how to ask people to make conditional forecasts.

As an example of the first issue, 200 volunteers might answer a question along the lines of “Will there be a major change in the government of Cuba during the next 12 months?”

Eventually, said Budescu, the event will materialize, and researchers will know the true answer.

“But people often need to make decisions before the deadline,” he said.

Therefore, the project will take the 200 forecasts for the event and present the distribution of opinions, along with summaries of those forecasts.

“We’re presenting forecasting data collected on our site in a variety of graphical displays to participants,” he said. “We test the degree to which they can correctly extract statistical information, summarize the information, and get a sense of how much consensus or disagreement there is between the various forecasters about various events.”

The eventual goal, he said, is to select the graphical displays that are easiest to understand and that provide the most accurate information to decision makers.
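As a rough illustration of the statistics such a display might convey, one can compute a central forecast along with a spread that captures consensus versus disagreement; the sample responses here are invented, not the project's data:

```python
# Summary statistics a forecast display might convey: the average
# forecast plus the standard deviation as a rough measure of how much
# the forecasters agree. All sample responses are hypothetical.
from statistics import mean, stdev

forecasts = [0.20, 0.35, 0.40, 0.40, 0.60, 0.75]
print(f"average forecast: {mean(forecasts):.2f}")
print(f"disagreement (std. dev.): {stdev(forecasts):.2f}")
```

A tight standard deviation signals consensus among forecasters; a wide one signals the kind of disagreement the displays are meant to make visible at a glance.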

The goal of Budescu’s second focus, conditional forecasting, is to make simultaneous judgments about two events whose outcomes are not yet known.

“Typically we ask questions of the type, ‘What’s the probability that the Dow Jones will go to 13,500 by the end of July?’” Budescu said.

But the researchers are also interested in predicting events under various contingencies. So for example, researchers might pose the question, “What is the likelihood of a certain event in Afghanistan if the United States decides to withdraw the majority of its armed forces by a particular date?”
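One standard way such contingent judgments can be combined (an illustration, not necessarily the project's method) is the law of total probability: weight the forecast under each contingency by the probability of that contingency. Every number below is invented:

```python
# Combining conditional forecasts via the law of total probability.
# All probabilities here are invented for illustration only.
p_withdraw = 0.6            # P(US withdraws most forces by the date)
p_event_if_withdraw = 0.7   # P(event | withdrawal)
p_event_if_stay = 0.3       # P(event | no withdrawal)

# P(event) = P(event|withdraw)P(withdraw) + P(event|stay)P(stay)
p_event = (p_event_if_withdraw * p_withdraw
           + p_event_if_stay * (1 - p_withdraw))
print(round(p_event, 2))  # 0.54
```

Eliciting the pieces separately lets researchers check whether participants' conditional judgments are internally consistent with their unconditional ones.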

The questions are a mix of those generated by the research team and those posed by IARPA. Since participants can choose which questions they want to answer, the researchers are not looking for participants with specific levels of education or areas of expertise.

“The key to success in crowdsourcing is to maximize diversity of opinion and diversity of information and backgrounds,” he said, “so we’re always interested in adding more participants to increase the diversity and variety of information that contributes to our group effort.”

The idea of relying on the wisdom of the masses is not new, of course—polling has been used in politics for decades. But technological advancements and the popularity of rating websites like Yelp, Trip Advisor, and Rotten Tomatoes have made projects like Budescu’s more feasible. There is value in information not only from experts, but from regular people, he said.

And finally, he emphasized that there’s fun to be had too.

“This research represents one of the rare cases where you make forecasts, and within weeks or months, the events that you’re forecasting are being resolved, and you can evaluate how well you did,” he said.

“We have a variety of ways of scoring performance,” he said. “There’s a leader board where people are ranked, so you know how well you’re doing compared to the rest of the team.

“It gives you a sense of how well you understand how much you know, and it gives you a sense of whether you are over- or under-confident in your predictions. It’s a learning experience.”
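The article does not name the project's scoring rules, but a common way to grade probability forecasts once events resolve is the Brier score: the squared gap between the stated probability and the outcome (1 if the event happened, 0 if not), with lower scores being better:

```python
# Brier score: a common rule for grading probability forecasts.
# (An assumption for illustration; the project's scoring rules
# are not specified in the article.)
def brier_score(prob, outcome):
    """Squared error of a probability forecast; outcome is 0 or 1."""
    return (prob - outcome) ** 2

# A confident, correct forecast scores close to 0:
print(round(brier_score(0.9, 1), 2))  # 0.01
# The same confidence on a wrong call is penalized heavily:
print(round(brier_score(0.9, 0), 2))  # 0.81
```

Averaging such scores over many resolved questions yields the kind of ranking a leaderboard can display, and comparing scores across confidence levels is one way a forecaster learns whether they are over- or under-confident.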

For more information and to join the team, visit http://www.forecastingace.com

 

 


Copyright © 2012, Fordham University.

