Big Networks, Small Worlds: Sharing http://blog.sciencenet.cn/u/yangfanman When I am in the minority, it tests my courage; when I am in the majority, it tests my tolerance.


Stepping Away From the Trees For a Look at the Forest

Viewed 4862 times | 2010-12-22 15:43 | Category: Complex Networks | Section: Paper Discussion

It seems complex networks are still getting a lot of attention!

 

Stepping Away From the Trees For a Look at the Forest

By the News Staff

Ten years ago, Karl Deisseroth was stuck. A psychiatrist and neuroscientist, he wanted to learn how different brain circuits affect behavior—and what went awry in the brains of his patients with schizophrenia and depression. But the tools of his trade were too crude: Electrodes inserted into the brain would stimulate too many cells in their vicinity. So in 2004, Deisseroth and his students invented a new tool. They inserted a gene for a light-activated algal protein into mouse brains, where it entered nerve cells. By stimulating those cells with a laser, the researchers could control the activity of specific nerve circuits with millisecond precision and study the effects.

The technique, called optogenetics, took the field by storm. Today, thousands of scientists in hundreds of labs are using optogenetics to probe how brains work. They have examined which cells in the brain's reward pathway get hijacked by cocaine, and how deep brain stimulation relieves the symptoms of Parkinson's disease. The list of questions that can now be addressed is endless.

Deisseroth's story repeats itself everywhere in science: An ingenious new tool triggers a cascade of new insights. In this special section, Science's news reporters and editors mark the end of the current decade by stepping back from weekly reporting to take a broader look at 10 insights that have changed science since the dawn of the new millennium. This survey covers only a small fraction of the decade's scientific advances, of course; many others could have filled these pages.

First, however, a shout-out to some of the tools that made those insights possible. In the past 10 years, new ways of gathering, analyzing, storing, and disseminating information have transformed science. Researchers generate more observations, more models, and more automated experimentation than ever before, creating a data-saturated world. The Internet has changed how science is communicated and given nonscientists new opportunities to take part in research. Whole new fields, such as network science, are arising, and science itself is becoming more of a network—more collaborative, more multidisciplinary—as researchers recognize that it takes many minds and varied expertise to tackle complex questions about life, land, and the universe.

Seeing and sensing

Many of the decade's most useful new tools, like optogenetics, were advances in sensing and imaging. Cryoelectron tomography brought into focus the cell's components, allowing scientists to see molecular-level detail of whole-cell organization. The technique builds up a three-dimensional image from a series of two-dimensional tilt views of a flash-frozen cell, a computer-intensive process.
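The reconstruction step can be sketched in a few lines of Python (a toy two-dimensional analogue with an invented test object, not actual tomography software): each tilt view is smeared back through the volume along its viewing direction, and the smears add up to an estimate of the original density.

    # Toy unfiltered back-projection: recover a 2-D density map from 1-D
    # projections taken at many tilt angles (all data here is made up).
    import numpy as np
    from scipy.ndimage import rotate

    def project(image, angle_deg):
        # One simulated tilt view: rotate, then sum along the beam direction.
        return rotate(image, angle_deg, reshape=False, order=1).sum(axis=0)

    def back_project(projections, angles_deg, size):
        # Smear each 1-D view back across the plane and accumulate.
        recon = np.zeros((size, size))
        for p, angle in zip(projections, angles_deg):
            smear = np.tile(p, (size, 1))
            recon += rotate(smear, -angle, reshape=False, order=1)
        return recon / len(angles_deg)

    # A fake "cell": one dense blob on an empty background.
    size = 64
    y, x = np.mgrid[:size, :size]
    phantom = ((x - 40) ** 2 + (y - 24) ** 2 < 36).astype(float)

    angles = np.linspace(-60, 60, 41)   # tilt range is limited, as in cryo-ET
    views = [project(phantom, a) for a in angles]
    estimate = back_project(views, angles, size)
    print("reconstruction peaks near the true blob:",
          np.unravel_index(estimate.argmax(), estimate.shape))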

In 2002, intravital scanning two-photon confocal microscopy revolutionized immunology when it was applied to lymph nodes, showing immune cells in intact tissue in real time. The work opened the door to understanding the interactions among immune cells and the body and changed how researchers thought about immune responses.

Developmental biology has also made huge leaps since 2000 thanks to new microscopy techniques. Researchers can now keep samples alive longer and use longer-lasting fluorescent tags to track specific cells. They can follow whole-organ and whole-animal development in movies that track cells as they divide and move around. Other microscopy techniques have been able to sidestep a fundamental limit of optics to look at proteins and the fine structure of cells smaller than the diffraction limit—half the wavelength of the light being used.
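The scale of that limit is easy to check (a back-of-the-envelope sketch; the half-wavelength figure is the approximation used above):

    # Rough scale of the diffraction limit versus a protein.
    wavelength_nm = 500                  # visible (green) light
    limit_nm = wavelength_nm / 2         # the half-wavelength approximation
    protein_nm = 10                      # a large protein, order of magnitude
    print(f"resolution limit ~ {limit_nm:.0f} nm")
    print(f"a {protein_nm} nm protein is {limit_nm / protein_nm:.0f}x smaller than that")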

Not only can we see better, but we can also send our eyes to hard-to-reach places. On Mars, for example, the Spirit and Opportunity rovers marked a big step up from earlier spacecraft: able to roam for kilometers instead of meters and carrying more capable instruments to analyze the chemical and physical properties of rock. Their observations rewrote the history of water on Mars. Closer to home, torpedolike robots with no connection to the surface searched for oil in the Gulf of Mexico and explored the water under glacial ice shelves in Antarctica. Remotely operated planes, some the size of model airplanes and some the size of military Predators and Global Hawks, routinely monitor sunlight passing through clouds and fly over hurricanes. Thousands of oceangoing floats send back data on water properties and, therefore, currents. These mobile sensing devices, along with stationary counterparts on land and in the sea, will soon monitor the state of the planet around the clock, making ecology and environmental science almost as data-rich as astronomy and particle physics are.

The indispensable machine

Key to handling such unprecedented torrents of data, of course, have been ever more powerful and more affordable computers. No field has benefited more than genomics. A decade ago, sequencing a human genome took years, hundreds of people, hundreds of machines, and endless hours of sample preparation to generate the pieces of DNA to be deciphered, one at a time. Some researchers favored a shotgun approach—cutting up whole genomes, then sequencing and assembling them all at once—but available computers weren't up to the task. Proponents had to build a massive supercomputer and laboriously program it for the job. Now whole-genome shotgun is de rigueur, and efficient software abounds. In the past 5 years, new “next-generation” sequencing technologies have streamlined the work even more. Today, a single machine can decipher three human genomes in little more than a week.
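The assembly half of the shotgun approach can be caricatured in a few lines (a toy sketch with invented reads; real assemblers use overlap or de Bruijn graphs and must cope with sequencing errors): greedily merge whichever two fragments overlap the most until a single sequence remains.

    # Greedy overlap assembly: merge the most-overlapping reads until one
    # sequence remains (reads here are invented for illustration).
    def overlap(a, b):
        # Length of the longest suffix of a that is a prefix of b.
        for n in range(min(len(a), len(b)), 0, -1):
            if a.endswith(b[:n]):
                return n
        return 0

    def assemble(reads):
        reads = list(reads)
        while len(reads) > 1:
            n, a, b = max((overlap(a, b), a, b)
                          for a in reads for b in reads if a is not b)
            reads.remove(a)
            reads.remove(b)
            reads.append(a + b[n:])     # merge, keeping the overlap once
        return reads[0]

    reads = ["GGCTAT", "TATCCA", "CCAGGA", "ATGGCT"]
    print(assemble(reads))              # -> ATGGCTATCCAGGA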

Sequencing is getting so cheap that researchers are using it to study gene expression and protein-DNA interactions on an unprecedented scale. Geneticists now depend on sequencing to track down the genetic causes of rare diseases. In 2008, 10 countries set up the International Cancer Genome Consortium, which aims to catalog mutations and other DNA and epigenetic changes for about 50 types of cancer by sequencing part or all of 25,000 genomes.

Computers aren't limited to piecing genomes back together. With their help, genomics researchers predict gene locations and compare, say, chimp and human genomes to identify sequences of evolutionary importance—deriving new insights about how genomes work. The Web has served as a vital link between researchers and publicly accessible databases of genome information.

In biochemistry, computer technology has led to huge strides in understanding proteins. To help with the calculations needed to model the jumps and squiggles proteins make as they fold into their “final” shape, scientists beefed up their hardware with graphics processing units used to render 3D images in video games.
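What those GPUs grind through is, at heart, a force evaluation and an integration step repeated billions of times. A deliberately minimal sketch (two beads on a harmonic spring standing in for a solvated protein; none of a real force field's terms):

    # One thousand velocity-Verlet steps for a toy "molecule": two beads
    # joined by a harmonic spring. Real codes evaluate millions of force
    # terms per step; that inner loop is what GPUs parallelize.
    import numpy as np

    k, r0, dt, mass = 100.0, 1.0, 0.001, 1.0  # spring constant, rest length, time step, bead mass

    def forces(pos):
        d = pos[1] - pos[0]                   # bond vector
        r = np.linalg.norm(d)
        f1 = -k * (r - r0) * d / r            # Hooke's law, force on bead 1
        return np.array([-f1, f1])

    pos = np.array([[0.0, 0.0, 0.0], [1.3, 0.0, 0.0]])  # stretched bond
    vel = np.zeros_like(pos)
    f = forces(pos)
    for step in range(1000):
        vel += 0.5 * dt * f / mass            # half-kick
        pos += dt * vel                       # drift
        f = forces(pos)
        vel += 0.5 * dt * f / mass            # half-kick with new forces
    print("bond length after 1000 steps:", np.linalg.norm(pos[1] - pos[0]))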

Computational chemistry also got a boost from field-programmable gate arrays: chips that allow users to essentially design their own hardware to streamline their simulations. And one research team built a supercomputer with customized integrated circuits that dramatically speed protein-folding calculations, allowing simulations on the time scale of milliseconds.

In some cases, computing power is transforming the basic way researchers work together. In astronomy, for example, the Sloan Digital Sky Survey is cataloging everything that can be seen in a fifth of the sky using a 2.5-meter telescope at the Apache Point Observatory in New Mexico. Researchers will tap into the masses of data over the Web. Similar “data utilities” are business as usual in particle physics and seismology, but they are a far cry from the way small teams of observers have divided up telescope time in the past. Europe's CERN laboratory has gone even further. To handle the petabytes of data its Large Hadron Collider (LHC) will generate every year, CERN has set up a computing “grid” system—a virtual organization that pools and shares the computer processing power of each member institution. The grid also makes it possible for thousands of scientists to access LHC data and work together in unprecedented ways.

Other organizations are harnessing the power of networking through crowd-sourcing, in which large numbers of researchers (even nonscientists) can contribute to solving problems, setting policy, or forecasting the future. An Internet company called InnoCentive, for example, posts problems online and offers rewards of up to $1 million for their solution. It boasts that 200,000 people have weighed in on more than 1000 problems in fields as wide-ranging as drug synthesis and brick manufacture and have solved two-thirds of them.

Networking programs also enable volunteers around the world to donate idle time on home computers for protein-folding calculations, to search for comets in images from the SOHO satellite, and to classify galaxy types in data from the Sloan Digital Sky Survey.

To get a handle on all this interconnectedness and grasp the ways information travels through complex systems, theorists have spawned a new field called network science. The field took off about 10 years ago, after physicists developed mathematical models to explain some of the network phenomena sociologists had observed. Now, thanks to technologies that make possible measurements on thousands of genes or proteins at once and to computers that can track and analyze the movements, voting habits, or shopping preferences of millions of people, the network approach is bursting into full flower. A host of new insights are bound to lie just ahead.
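One of those early models, the small-world network of Watts and Strogatz, is easy to reproduce (a sketch using the networkx library): rewiring even a small fraction of a ring lattice's edges collapses the average path length while leaving local clustering largely intact.

    # A little random rewiring makes paths short while clustering stays high.
    import networkx as nx

    n, k = 1000, 10                     # nodes; ring-lattice neighbours per node
    for p in [0.0, 0.01, 0.1]:          # fraction of edges rewired at random
        g = nx.connected_watts_strogatz_graph(n, k, p, seed=42)
        print(f"p={p:<5} clustering={nx.average_clustering(g):.3f}",
              f"mean path length={nx.average_shortest_path_length(g):.2f}")

At p=0 the lattice is highly clustered but paths are long; by p=0.01 the mean path length has already collapsed, the signature of the small-world regime.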

But that is a tale for another decade.
