Judith Burns, BBC News, 4 Sep 09
Google's algorithm for ranking web pages can be adapted to determine which species are critical for sustaining ecosystems, say researchers.
According to a paper in PLoS Computational Biology, "PageRank" can be applied to the study of food webs.
These are the complex networks of who eats whom in an ecosystem.
The scientists say their version of PageRank could be a simple way of working out which extinctions would lead to ecosystem collapse.
Every species is embedded in a complex network of relationships with others. So a single extinction can cascade into the loss of seemingly unrelated species.
Investigating when this might happen using conventional methods is difficult: even in simple ecosystems, the number of possible combinations exceeds the number of atoms in the universe, so it would be impossible to try them all.
Co-author Dr Stefano Allesina realised he could apply PageRank to the problem when he stumbled across an article in a journal of applied mathematics describing the Google algorithm.
The researchers say they had to make minor changes to it to adapt it for ecology.
Dr Allesina, of the University of Chicago's Department of Ecology and Evolution, told BBC News: "First of all we had to reverse the definition of the algorithm.
"In PageRank, a web page is important if important pages point to it. In our approach a species is important if it points to important species."
Cyclical element
They also had to build a cyclical element into the food web in order to make the algorithm applicable.
They did this by including what Dr Allesina terms the "detritus pool". He said: "When an organism dies it goes into the detritus pool and in turn gets cycled back into the food web through the primary producers, the plants.
"Each species points to the detritus and the detritus points only to the plants. This makes the web circular and therefore leads to the application of the algorithm."
Dr Allesina and co-author Dr Mercedes Pascual of the University of Michigan have tested their method against published food webs, using it to rank species according to the damage they would cause if removed from the ecosystem.
They also tested algorithms already in use in computational biology to find a solution to the same problem.
They found that PageRank gave them exactly the same solution as these much more complicated algorithms.
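That removal test can be mimicked crudely by brute force: delete one species, let any consumer whose resources have all vanished die off, and count the knock-on losses. The sketch below does this for the same toy web; it illustrates the general idea of ranking by damage rather than the paper's actual procedure.

    # Crude brute-force ranking (illustration only): remove a species, then
    # repeatedly drop any consumer with no surviving resources, and count the
    # secondary extinctions that follow.
    def secondary_extinctions(web, producers, removed):
        species = {n for e in web for n in e}
        alive = species - {removed}
        changed = True
        while changed:
            changed = False
            for sp in sorted(alive):
                if sp in producers:
                    continue                       # producers need no prey
                if not any(b == sp and a in alive for a, b in web):
                    alive.discard(sp)              # no resources left: sp dies too
                    changed = True
        return len(species) - 1 - len(alive)       # losses beyond the removed species

    toy_web = [("grass", "gazelle"), ("gazelle", "lion")]    # same hypothetical web as above
    damage = {sp: secondary_extinctions(toy_web, {"grass"}, sp)
              for sp in {n for e in toy_web for n in e}}
    print(sorted(damage.items(), key=lambda kv: -kv[1]))     # most damaging removals first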
Dr Glyn Davies, director of programmes at WWF-UK, welcomed the work. He said: "As the rate of species extinction increases, conservation organisations strive to build political support for maintaining healthy and productive ecosystems which hold a full complement of species.
"Any research that strengthens our understanding of the complex web of ecological processes that bind us all is welcome."
Google’s Internet Techniques Inspire Studies of Food Webs
Henry Fountain, The New York Times, 4 Sep 09
A major reason Google’s search engine is so successful is its PageRank algorithm, which assigns a pecking order to Web pages based on the pages that point to them. A page is important, according to Google, if other important pages link to it.
But the Internet is not the only web around. In ecology, for instance, there are food webs — the often complex networks of who eats whom.
Inspired by PageRank, Stefano Allesina of the University of Chicago and Mercedes Pascual of the University of Michigan have devised an algorithm of their own for the relationships in a food web. As described in the online open-access journal PLoS Computational Biology, the algorithm uses the links between species to determine their relative importance in the food web: that is, which species would have the most impact if they became extinct.
Dr. Allesina, who studies network theory and biology, was reading a paper about Google’s algorithm one day while at the University of California, Santa Barbara. “I said, ‘This reminds me of something,’ ” he recalled.
One key to PageRank’s success is that its developers introduced a small probability that a Web user would jump from one page to any other. This in effect makes the Web circular, and makes the algorithm solvable. But in food webs, Dr. Allesina said, “you can’t go from the grass to the lion — the grass has to go through the gazelle first.
“We could not use the same trick to make food webs circular,” he went on.
So they used another trick, he said. Since all organisms die and decompose, they created a “detritus pool” that all species link to. The pool also links to primary producers in a food web, which make use of the decomposed matter.
Their algorithm also differs in that it determines the relative importance of species through reverse engineering: by seeing which species make the food web collapse fastest when they are removed. The researchers found that the algorithm produced results as accurate as those of much more complex (and computationally costly) software that builds webs from the ground up, simulating evolution.
The next step, Dr. Allesina said, is to refine the algorithm so that it will work with more complex webs. There are many other factors that affect extinctions, including pollution and habitat loss. The goal is to create an algorithm that can take these and other elements into account as well.