Researchers use science to predict success

We all want to know the secret to success, and physicists are no different. Like the rest of the academic community, physicists rely on various quantitative factors to determine whether a researcher will enjoy long-term success. These factors help determine everything from grant approvals to hiring decisions. The problem with these metrics, according to Distinguished University Professor of Physics Albert-László Barabási, is their well-known lack of predictive power.

Impact factor, for example, measures a scholarly journal's influence on its field over time, while the Hirsch index quantifies an individual researcher's success. Although these metrics do a good job of representing past accomplishments, they cannot predict the future for young researchers and new papers.

In a paper released Thursday in the journal Science, Barabási, a world-renowned network scientist who has joint appointments in the College of Science and the College of Computer and Information Science, and his team at Northeastern's Center for Complex Network Research offer a new mathematical model for quantifying impact that goes a step further in its ability to forecast long-term success.

“Novelty and importance depend on so many intangible and subjective dimensions that it is impossible to objectively quantify them all,” write the study’s authors. “Here, we bypass the need to evaluate a paper’s intrinsic value.”

The team examined the citation histories of hundreds of thousands of scholarly physics articles published between 1893 and 2010, hoping to find patterns. “At first what we saw was true chaos,” explained Barabási. Some articles attracted plenty of attention in the first year after publication but saw interest fall quickly thereafter; others took four or five years before nose-diving; still others never experienced a spike at all.

To sort through this apparent disorder, the team identified three mechanisms that seemed fundamental to the way a paper generates citations: its originality, its age, and the number of citations it has already accrued.

The team translated each of these concepts into a mathematical equation and then combined the results to create a new model for representing citation patterns over the course of a paper’s lifetime. The new model successfully matched the citation history of every one of the 463,348 papers they examined.
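To make the three mechanisms concrete, here is a minimal sketch in Python of how a paper's fitness, its aging, and its accumulated citations could be folded into a single cumulative-citation curve. The function name cumulative_citations, the parameter names lam (fitness), mu and sigma (the timing and breadth of the aging curve), and the scale constant m are illustrative assumptions, not terms taken from the paper, and the published model's exact functional form may differ.

```python
import numpy as np
from scipy.stats import norm


def cumulative_citations(t, lam, mu, sigma, m=30.0):
    """Illustrative sketch of a citation curve combining the three mechanisms.

    Assumed, hypothetical form (not necessarily the paper's exact equation):
    - lam: a "fitness" term for the paper's appeal relative to its peers,
    - mu, sigma: a log-normal aging curve, so interest rises, peaks, and decays,
    - the exponential/multiplicative shape mimics preferential attachment,
      where already-cited papers tend to attract further citations.

    t is time since publication in years (t > 0); m is a global scale constant.
    """
    return m * (np.exp(lam * norm.cdf((np.log(t) - mu) / sigma)) - 1.0)
```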

Unlike existing impact measures, Barabási’s new model can also predict long-term citation histories based on just the first few years of data. The findings, which the team validated in fields beyond physics, including biology, chemistry, and the social sciences, provide a new, and arguably more effective, tool for quantifying academics’ success.
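As a rough illustration of that forecasting step, the snippet below fits the sketch's three paper-specific parameters to an invented five-year citation history and then extrapolates to year 30. The data are made up purely for demonstration, and the fitting procedure (scipy's curve_fit) is our choice for the sketch, not the authors' method.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical cumulative citation counts at the end of years 1-5 (invented data).
years = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
observed = np.array([3.0, 9.0, 18.0, 26.0, 31.0])

# Fit the paper-specific parameters (fitness and aging) from the early years only.
(lam, mu, sigma), _ = curve_fit(
    cumulative_citations, years, observed,
    p0=[1.0, 1.0, 1.0],
    bounds=([0.01, -2.0, 0.1], [10.0, 5.0, 5.0]),
)

# Extrapolate the fitted curve to forecast long-term impact.
print(f"Forecast after 30 years: {cumulative_citations(30.0, lam, mu, sigma):.0f} citations")
```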

The research continues Northeastern’s groundbreaking work in network science. For instance, Barabási is also working to build the human diseasome, a network of cellular and genetic interactions that will help scientists better understand the causes of all kinds of illnesses and ailments. Researchers are also using network science to study politics, social media, and the spread of epidemic contagions. And this summer, Northeastern launched the nation’s first doctoral program in network science.

