ScienceNet (科学网)


Tag: control

Related Forum Posts

Forum | Author | Replies/Views | Last Post

No related content

Related Blog Posts

[Repost] ... the keys to being a "skinny" person
Popularity 2 | zuojun | 2012-10-17 09:58
Exercise and portion control are the keys to being a skinny New Yorker. Read more here: http://abcnews.go.com/blogs/lifestyle/2012/03/skinny-secrets-in-manhattan-dieting/

P.S. I believe in portion control, something that I can do most of the time but not all the time...
Category: From the U.S. | 1794 reads | 8 comments
Flowering time control: another window into the connection between antisense RNA and chromatin
bioysy 2012-8-15 09:28
Flowering time control: another window into the connection between antisense RNA and chromatin.

Points of interest:
1. Antisense RNA can act as a regulatory element, and its regulatory target is chromatin.
2. Antisense RNA regulation is tied to flowering time control, which is what interests me most.
3. More broadly, RNA itself (rather than the products RNA encodes, such as proteins) can carry important functions.

Attachment: Flowering time control another window to the connection between antisense RNA an.pdf
Category: flowering | 3868 reads | 0 comments
Bringing energy into Engineering Cybernetics (control)
josh 2012-8-6 22:36
The "cybernetics" of Wiener ("Cybernetics, or Control and Communication in the Animal and the Machine," John Wiley & Sons, Inc., New York, 1948) is the science of organization of mechanical and electrical components for stability and purposeful actions. A distinguishing feature of this new science is the total absence of considerations of energy, heat, and efficiency, which are so important in other natural sciences. In fact, the primary concern of cybernetics is on the qualitative aspects of the interrelations among the various components of a system and the synthetic behavior of the complete mechanism.

The purpose of "Engineering Cybernetics" is then to study those parts of the broad science of cybernetics which have direct engineering applications in designing controlled or guided systems.

The paragraphs above are excerpted from Tsien Hsue-Shen's Engineering Cybernetics, which was published over half a century ago. Back then, the observation of "the total absence of considerations of energy, heat, and efficiency" was certainly valid; indeed, it was sharp and insightful. Ever since, Engineering Cybernetics (control engineering) has been categorized by many researchers as an information science, or even as a branch of applied mathematics.

But things have changed. Bringing energy, heat, and efficiency into Engineering Cybernetics (control) is now a major trend, and it is attracting more and more attention. Along this trend, one may predict that physics experiments, rather than mathematical derivations and numerical simulations, will become more and more important for control researchers. And hopefully, this change will rescue Engineering Cybernetics (control) from being absorbed into information science or applied mathematics, considering the essay "Control is Dead?" ( http://blog.sciencenet.cn/blog-1565-344686.html ) by Professor Ho Yu-Chi ( http://blog.sciencenet.cn/home.php?mod=space&uid=1565 ).
Category: Engineering Cybernetics | 4406 reads | 0 comments
David M. Koenig on white noise and control
josh 2012-7-27 18:07
David M. Koenig, who had a 27-year career in process control and analysis for Corning, Inc. before he retired, distilled his experience in this field into his book "Practical Control Engineering: Guide for Engineers, Managers, and Practitioners" (2009), making it very different from other books on this topic. At the end of chapter eight, after introducing the basic concepts of stochastic process disturbances, he added a very insightful section, "Comments on Stochastic Disturbances and Difficulties of Control":

"How well can a process be controlled when it is subject to white noise only? This is an interesting question because many statisticians will immediately throw up their hands and make some condescending comment suggesting that control engineers should keep their hands off processes subject to white noise because any attempt to control such a process only causes trouble. Part of that answer, minus the condescension, is correct, at least in my opinion. Consider the case where you are the controller and you observe samples of the process output whose average has been satisfactorily close to set point and that suffers only from white noise disturbances. Should you make an adjustment to the control output upon observing a sample of the process output that is not on set point? If the average of the process output is indeed nearly at the set point then any deviation, if it is really white or unautocorrelated, will be completely independent of the previous value of the control output and it will have no impact on subsequent disturbances. Therefore, if you should react to such a deviation, you would be wasting your time because the next observation will contain another deviation that has nothing to do with the previous deviation on which you acted. You, in fact, may make things worse.

[Figure 1: White noise with standard deviation 0.103]

... Consider Figure 1 above, where the process output is subject to white noise (whose standard deviation is 0.103), the process is first-order with a time constant of 10.0 time units, and a conservatively tuned PI controller is active. Note the activity of the control output as the noise on the process output feeds through the controller. At 50 time units, the set point is stepped and the controller satisfactorily drives the average value of the process output to the new set point. However, the standard deviation of the process output about the set point is 0.115. So, the controller has amplified the noise. Ha! The statistician is smirking. A feedback controller cannot decrease the standard deviation of the white noise riding on the process output. At best it can keep the average on set point. The catch is that in most industrial situations one needs the controller actively watching and controlling the process in case there are set-point changes and in case some non-white-noise disturbance appears. To quote a famous control engineer, 'life is not white noise.' ..."

In chapter eleven of his book, Koenig once again touches upon the topic of "Control of White Noise":

"... Statisticians consistently claim that processes subject to white noise should not be controlled because the act of control amplifies the white noise riding on the process variable. The logic (which we have already touched on in earlier chapters) goes something like this. Consider the case where you are the controller and you are responsible for making control adjustments based on a stream of samples coming at you at the rate of, say, one per minute. Assume that you know that a sample is deviating from the target solely because of white noise. Therefore, the deviation of the ith sample is completely unautocorrelated with the deviation of the (i-1)th sample and will be completely unautocorrelated with the (i+1)th sample.
Consequently, it would be useless to make a control adjustment. If you did make an adjustment based on the ith sample's deviation, it would likely make subsequent deviations larger. On the other hand, if you knew the deviation of the ith sample was the result of a sudden offset that would persist if you did nothing, then you would likely make an adjustment. I certainly agree with this logic but there are some realities on the industrial manufacturing floor where automatic feedback control of process variables subject to white noise is unfortunately necessary, especially when a load disturbance comes through the process or when there is a need to change the set point.”
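Koenig's thought experiment is easy to reproduce numerically. The sketch below is mine, not code from the book: the plant discretization, controller gains, noise level, and random seed are illustrative assumptions. It runs a first-order process under a velocity-form PI controller while the measured output carries white noise:

```python
import math
import random
import statistics

def simulate(n_steps, setpoint, kp, ki, noise_sd, tau=10.0, seed=1):
    """First-order process under PI feedback; the controller only ever sees
    the true output corrupted by additive white measurement noise."""
    rng = random.Random(seed)
    a = math.exp(-1.0 / tau)               # discrete plant pole, unit sample time
    y = u = e_prev = 0.0
    noise_trace, measured = [], []
    for _ in range(n_steps):
        n = rng.gauss(0.0, noise_sd)
        m = y + n                           # what the controller actually sees
        noise_trace.append(n)
        measured.append(m)
        e = setpoint - m
        u += kp * (e - e_prev) + ki * e     # PI controller in velocity form
        e_prev = e
        y = a * y + (1.0 - a) * u           # process responds to the control move
    return noise_trace, measured

noise, meas = simulate(20000, 0.0, kp=1.0, ki=0.3, noise_sd=0.103)
print(statistics.pstdev(noise), statistics.pstdev(meas))
# the second number comes out larger: reacting to the white noise amplifies it
```

With the set point held at zero, the spread of the measured output exceeds that of the noise alone, which is exactly the amplification Koenig describes; the same loop nevertheless pulls the average onto a stepped set point, which is why the controller is kept in service.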
Category: Engineering Cybernetics | 3301 reads | 0 comments
Restoring eco-balance
蒋高明 2012-7-11 23:14
Jiang Gaoming | July 06, 2012

A 1970s botanist and his theory of ecological balance are all but forgotten in China. But the country needs them today more than ever, writes Jiang Gaoming.

In the late 1970s, China was swept by a wave of economic growth, and with it a wholesale attack on nature. Grain was planted on grasslands and profits extracted from rivers. Land was reclaimed from lakes and seas and forests were felled for arable land. Seeing those drastic and potentially disastrous steps, an ecologist named Hou Xueyu spoke out. Humanity needed to respect nature's rules, he said, and safeguard its ecological balance.

Today, Hou Xueyu, who was a member of the Academic Divisions of the Chinese Academy of Sciences and a researcher at the academy's Institute of Botany, is all but forgotten. But his theory of ecological balance remains as relevant as ever. It still has a role to play in guiding social and economic development.

Put simply, ecological balance is a state of adaptation, harmony and unity between organisms and their environment. When an ecosystem is in balance, different parts of the system maintain certain ratios to each other and can maintain that balance even in the face of external interference. But if balance is lost, the system will head towards collapse.

To allay worries that maintaining "balance" meant halting development, Hou used a vivid metaphor: ecological balance is like riding a bike, he said. A bike must be in motion to be stable; if it's not moving, you fall off.
If one component fails – if the handlebars fall off, the brakes fail or the tire springs a leak – the balance is lost and the rider won't move forward.

Three decades on, nobody talks of ecological balance. Instead, the buzzwords are "ecological construction" or "ecological development". "Pollute and destroy first, clean up and develop later." "Better to die of pollution than poverty." These are the mottos by which China's officials live. Development trumps all, nothing is more important than GDP, and Hou's warnings – indeed Hou himself – have vanished into history.

Virtually all of China's ecosystems – both natural and artificial – are in various states of crisis. Grasslands are degraded by overgrazing and mining. Efforts to exterminate locusts on the grasslands have killed off birds, beneficial insects and other predators of locusts. The natural wetlands of the rural north have been drained and vast swaths of surface water turned filthy. Polluted coastal wetlands have been struck by red tides and natural woodlands swallowed up by development, while village trees are uprooted, trussed up and carted off to the cities for "urban greening" projects. Invasive species create constant crisis, and are found even in nature reserves.

And look at our agricultural ecosystems. An overreliance on human interference has led to excessive use of chemical feeds, fertiliser, pesticides, plastic membranes, herbicides, additives and GM technology, gravely upsetting the ecological balance of our farms. Fertiliser use has increased 100-fold since the early 1950s, and has long focused on providing nitrogen, phosphorus and potassium – not on returning organic material to the soil. The carbon-nitrogen ratio is badly out of balance, resulting in hardpan soil, acidification, and less fertile land than ever before. Overreliance on pesticides has killed off the natural predators of pests, while the pests themselves continue to develop new resistance, driving use of ever more toxic chemicals.
Herbicides temporarily control weeds, but the weeds return the next year, and so does the herbicide – in greater quantities. Plastic agricultural membranes are everywhere; fields are full of this "white terror". The old days of Chinese villages blessed with clean air, water and fresh food are gone. Pests and weeds are growing in number, while extended exposure to chemicals is causing illness – particularly cancer – among villagers. The levels of pesticides, herbicides and growth hormones in food are rising and affecting the health of urban consumers. These are the harsh lessons seen when human arrogance disrupts the ecological balance.

More worryingly, humans are failing to look for the root of the problem. When we see more pests and weeds, we just step up the fight. The pest-killing Bt gene is transferred into crops, turning plant cells into "pesticide factories", while complementary pesticides are used in a pincer attack. The more highly toxic glyphosate weed killers are used in tandem with crops genetically engineered to resist the chemical, while everything else green dies off. But the dangers of glyphosate when it enters the environment, our food and our bodies have not been explained. GM technology adds insult to the injury of an already unbalanced ecosystem.

After a decade of genetically modified crop cultivation, US fields are now plagued by "superweeds" and "superpests". America is the largest planter of such crops, and the water and air of its agricultural areas are already widely polluted with genetically modified material. And American farmers are afflicted by higher planting and pesticide costs. Early this year, 300,000 organic farmers took GM giant Monsanto to federal court, claiming the company was infringing their rights to plant traditional crops and damaging the agricultural foundations of their industry. The problems are not going unrecognised.
Last year, the UN's Food and Agriculture Organisation issued a call for agriculture to return to nature. With Save and Grow, a publication of the FAO's plant production and protection division, the organisation launched an initiative "to produce food for a growing world population in an environmentally sustainable way."

"The present paradigm of intensive crop production cannot meet the challenges of the new millennium. In order to grow, agriculture must learn to save," the FAO said. "The Save and Grow model incorporates an ecosystem approach that draws on nature's contribution to crop growth – soil organic matter, water flow regulation, pollination and natural predation of pests."

UN expert Olivier de Schutter has gone so far as to say that small-scale farmers could double food production in 10 years in critical regions by adopting eco-farming methods – plenty to meet the additional needs created through population growth. Meanwhile, Europe has remained wary of the GM project: in January, German chemical company BASF announced plans to stop producing genetically modified crops for the European market and move its plant-science headquarters to the United States due to "lack of acceptance for this technology in many parts of Europe from the majority of consumers, farmers and politicians."

And, in both Europe and America, public and private efforts are bolstering eco-agriculture and trade, a trend neatly symbolised by Michelle Obama, America's first lady, when she planted organic vegetables in the White House garden. Unfortunately, however, what the developed nations increasingly see as trash technology is now moving to the developing world, and in particular to China.

If humanity wants to survive, it must better manage its relationship with nature. We must recognise that the productive forces of science and technology can also destroy. If we do not protect ecosystems, human society is doomed.
We must learn from the fall of Mayan civilisation and from the failings of the old development model. And we must listen to Hou Xueyu; it's time to get back on our bikes and restore ecological balance.

Jiang Gaoming is chief researcher at the Chinese Academy of Sciences' Institute of Botany and deputy secretary of the Ecological Society of China. Homepage image by Greenpeace.
Category: Environmental advocacy | 3709 reads | 0 comments
What kind of logic is this?
Popularity 5 | cosismine | 2012-7-2 13:08
I wanted to find one of my own blog posts online, so I searched Google for my name, 刘玉仙, and was then told: "We've observed that searching for in mainland China may temporarily break your connection to Google. This interruption is outside Google's control." What kind of logic is this?
3137 reads | 5 comments
50+ years of control research at Harvard in 5+ minutes
Popularity 2 | 何毓琦 | 2012-6-1 22:16
For new readers and those who request to be "好友 (good friends)", please read my 公告栏 (announcement board) first.

My colleague Roger Brockett retired from the Harvard faculty on May 16, 2012. This brought the history of control research at the Harvard School of Engineering and Applied Sciences for the past half century to a close. I was asked to say a few words at his retirement dinner. Since I was physically in China, I used the technology of video chat to give my congratulatory remarks with the above title remotely. The period 1959-2012 is a remarkable time for many of us in systems and control.

The website to download and/or watch this short video is https://dl.dropbox.com/u/82540297/Larry_Ho.mp4

(Note added 4/4/2013: The link above no longer works, but the link/instruction below does. The video is also available at the Tsinghua CFINS website under "activities".)

You may or may not need the very useful FREE software Dropbox, which can be downloaded from dropbox.com if you don't already have it. I am also trying to get this video mounted on my personal Harvard webpage in the meantime. Let me know if you have any problem watching this.

Note added 6/17/2012: You can now also download this video from my own webpage, hrl.harvard.edu/~ho, under the button "video".
Category: Bits of daily life | 11595 reads | 4 comments
Some thoughts on reads that map to multiple locations in high-throughput sequencing
TripleW 2012-1-8 16:57
This phenomenon is hard to avoid: in high-throughput sequencing data, some reads map to multiple genomic locations with equally good mapping quality, usually because of repetitive regions in the genome. Researchers typically just discard these reads. But if we think carefully about the consequences of that practice, perhaps we should be more lenient, and consider more carefully what fallback strategy to adopt for these sequences.

Some background: the fraction of such reads varies with the genome being sequenced, and for genomes with many repetitive regions the proportion of discarded reads can be very high. I have not looked up what fraction of the Arabidopsis genome is repetitive (if you know, please tell me), but when I mapped a ChIP-seq dataset to the Arabidopsis genome, I found that 20% of the reads had multiple equally good mapping locations, and all of these were discarded. Adding the reads that did not map to the genome at all, nearly 50% of the reads were thrown away. Research done with the remaining 50% may well improve sensitivity, but it also ignores a large amount of information and may even introduce false positives.

Why might false positives arise? Consider a study with two samples, a treatment group A and a control group B, producing two datasets. When each is aligned to the genome, the phenomenon described above occurs in each. Suppose each discards 20% of its reads. If the discarded sequences in the two groups are very similar to each other, discarding them does little harm. But what if they are not? Of the 20% of reads discarded from group A, suppose 10% have only a single best mapping location in group B; those reads are retained in B rather than discarded. And vice versa: if 10% of B's discarded reads have a single best mapping location in A, they will be retained there. This alone produces a 20% difference between the groups (and if this situation really occurs, probably more).

So whether reads that map to multiple genomic locations should be discarded deserves further thought...
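The bookkeeping above is easy to make concrete. The sketch below is a toy illustration, not tied to any particular aligner; the read ids and hit counts are invented. It partitions reads by their number of equally good hits and extracts exactly the asymmetric set that can create spurious A-versus-B differences:

```python
def partition_reads(alignments):
    """Split reads into unique / multi-mapped / unmapped by best-hit count.

    `alignments` maps read id -> number of equally best-scoring genomic hits.
    Multi-mapped reads (n > 1) are the ones pipelines typically discard.
    """
    unique = {r for r, n in alignments.items() if n == 1}
    multi = {r for r, n in alignments.items() if n > 1}
    unmapped = {r for r, n in alignments.items() if n == 0}
    return unique, multi, unmapped

def discard_asymmetry(sample_a, sample_b):
    """Reads dropped as multi-mapped in one sample but kept as unique in the
    other -- the reads that can masquerade as treatment-vs-control signal."""
    uniq_a, multi_a, _ = partition_reads(sample_a)
    uniq_b, multi_b, _ = partition_reads(sample_b)
    return (multi_a & uniq_b) | (multi_b & uniq_a)

# Toy example: read ids with their number of best-quality hits in each sample
a = {"r1": 1, "r2": 3, "r3": 0, "r4": 2}
b = {"r1": 1, "r2": 1, "r3": 2, "r4": 2}
print(sorted(discard_asymmetry(a, b)))  # ['r2']: dropped in A, unique in B
```

In a real pipeline the hit counts would come from the aligner's records rather than a hand-written dictionary, but the set arithmetic is the same: the larger `discard_asymmetry` is relative to the library size, the more the "just drop multi-mapped reads" policy biases the A-versus-B comparison.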
Category: ChIP/RNA-seq | 5140 reads | 0 comments
Could an expert provide a translation?
Popularity 3 | histly | 2012-1-4 16:26
Yet, all things considered, I remain sympathetic to Merton's view that there exists 'a distinctive pattern of institutional control of a wide range of motives which characterizes the behavior of scientists' and I am happy to live with the 'noble fiction' – Plato's idea of the useful lie – since the alternative to accepting that there are commonly held imperatives which guide the actions of scientists and scholars is rampant relativism and, ultimately, epistemological anarchy.

Which word is the subject of the 'is' in the last line?
3315 reads | 8 comments
[Repost] Controlling yourself is very important
jiaguangjia 2011-12-9 14:22
Faced with groundless slander, there is no need to harbor resentment; hearing others' praise, there is no need to be smug. Better to enlarge our tolerance a little: our brief lives are not lived for the sake of anger or applause. Praise and blame have never been worth heeding; learn to let go of attachment to external circumstances and guard a natural, unaffected heart. To live is to be able to see yourself clearly and to master your own emotions. So listen to your heart and love yourself!
2272 reads | 0 comments
[Repost] BP Medications More Effective When Given at Night
xuxiaxx 2011-10-27 09:04
Among patients with chronic kidney disease (CKD) and hypertension, taking at least 1 antihypertensive medication at bedtime significantly improves blood pressure (BP) control, with an associated decrease in risk for cardiovascular events, according to new research.

Ramón C. Hermida, PhD, and colleagues from the Bioengineering and Chronobiology Laboratories at the University of Vigo, Campus Universitario, Spain, published their findings online October 24 in the Journal of the American Society of Nephrology.

According to the researchers, the beneficial effect of taking BP medication at night has been previously documented, but "the potential reduction in risk associated with specifically reducing sleep-time BP is still a matter of debate." The current prospective study sought to investigate in hypertensive patients with CKD whether bedtime treatment with hypertension medications better controls BP and reduces CVD risk compared with treatment on waking.

The study included 661 patients with CKD who were randomly assigned either to take all prescribed hypertension medications on awakening or to take at least 1 of them at bedtime. Ambulatory BP at 48 hours was measured at least once a year and/or at 3 months after any adjustment in treatment. The composite measure of cardiovascular events used included death, myocardial infarction, angina pectoris, revascularization, heart failure, arterial occlusion of lower extremities, occlusion of the retinal artery, and stroke. The investigators controlled their results for sex, age, and diabetes.

Patients were followed for a median of 5.4 years; during that time, patients who took at least 1 BP-lowering medication at bedtime had approximately one third of the CVD risk compared with those who took all medications on awakening (adjusted hazard ratio [HR], 0.31; 95% confidence interval [CI], 0.21 - 0.46; P < .001).
A similar significant reduction in risk with bedtime dosing was noted when the composite CVD outcome included only cardiovascular death, myocardial infarction, and stroke (adjusted HR, 0.28; 95% CI, 0.13 - 0.61; P < .001). Patients taking their medications at bedtime also had a significantly lower mean BP while sleeping, and a greater proportion of these patients had ambulatory BP control (56% vs 45%; P = .003). The researchers estimate that for each 5-mm-Hg decrease in mean sleep-time systolic BP, there was a 14% reduction in the risk for cardiovascular events during follow-up (P < .001).

According to Dr. Hermida and colleagues, "treatment at bedtime is the most cost-effective and simplest strategy of successfully achieving the therapeutic goals of adequate asleep BP reduction and preserving or re-establishing the normal 24-hour BP dipping pattern."

The authors suggest that a potential explanation for the benefit of nighttime treatment may be associated with the effect of nighttime treatment on urinary albumin excretion levels. "We previously demonstrated that urinary albumin excretion was significantly reduced after bedtime, but not morning, treatment with valsartan," they note. In addition, this reduction was independent of 24-hour changes of BP, but correlated with a decline in BP during sleep.

Source: http://www.medscape.com/viewarticle/752348
1364 reads | 0 comments
[Repost] NEWS & VIEWS: Degrees of control
Popularity 1 | Fangjinqin | 2011-5-13 09:13
Attachment: Degrees of control.pdf

NEWS & VIEWS: COMPLEX NETWORKS
Degrees of control

One might expect that social networks would generally be harder to control than naturally occurring systems such as biological networks. But this is not so, according to a new study. See Article p.167

MAGNUS EGERSTEDT

Networks can be found all around us. Examples include social networks (both online and offline), mobile sensor networks and gene regulatory networks. Such constructs can be represented by nodes and by edges (connections) between the nodes. The nodes are individual decision makers, for instance people on the social-networking website Facebook or DNA segments in a cell. The edges are the means by which information flows and is shared between nodes. But how hard is it to control the behaviour of such complex networks? On page 167 of this issue, Liu et al. [1] show that the answer to this question is anything but intuitive.

The flow of information in a network is what enables the nodes to make decisions or to update internal states or beliefs — for example, an individual's political affiliation or the proteins being expressed in a cell. The result is a dynamic network, in which the nodes' states evolve over time. The overall behaviour of such a dynamic network depends on several factors: how the nodes make their decisions and update their states; what information is shared between the edges; and what the network itself looks like — that is, which nodes are connected by edges.

Imagine that you want to start a trend by influencing certain individuals in a social network, or that you want to propagate a drug through a biological system by injecting the drug at particular locations. Two obvious questions are: which nodes should you pick, and how effective are these nodes when it comes to achieving the desired overall behaviour?
If the only important factor is the overall spread of information, these questions are related to the question of finding and characterizing effective decision-makers. However, the nodes' dynamics (how information is used for updating the internal states) and the information flow (what information is actually shared) must also be taken into account. In their study, Liu and co-workers [1] do just this by combining the principles of network science with tools found traditionally in the domain of control theory [2,3].

Central to the question of how information, injected at certain key locations, can be used to steer the overall system towards some desired performance is the notion of controllability — a measure of what states can be achieved from a given set of initial states. Different dynamical systems have different levels of controllability. For example, a car without a steering wheel cannot reach the same set of states as a car with one, and, as a consequence, is less controllable.

Liu and colleagues [1] found that, for several types of network, controllability is connected to a network's underlying structure [4-6]. The authors identified what driver nodes — those into which control inputs are injected — can direct the network to a given behaviour. The surprising result is that driver nodes tend to avoid the network hubs. In other words, centrally located nodes are not necessarily the best ones for influencing a network's performance. So for social networks, for example, the most influential members may not be those with the most friends.

The result of this type of analysis [1,4] is that it is possible to determine how many driver nodes are needed for complete control over a network. Liu et al. do this for several real networks, including gene regulatory networks for controlling cellular processes, large-scale data networks such as the World Wide Web, and social networks. We have a certain intuition about how hard it might be to control such networks.
For instance, one would expect cellular processes to be designed to make them amenable to control so that they can respond swiftly to external stimuli, whereas one would expect social networks to be more likely to resist being controlled by a small number of driver nodes. It turns out that this intuition is entirely wrong. Social networks are much easier to control than biological regulatory networks, in the sense that fewer driver nodes are needed to fully control them — that is, to take the networks from a given configuration to any desired configuration. Liu and colleagues find that, to fully control a gene regulatory network, roughly 80% of the nodes should be driver nodes. By contrast, for some social networks only 20% of the nodes are required to be driver nodes.

What's more, the authors show that engineered networks such as power grids and electronic circuits are overall much easier to control than social networks and those involving gene regulation. This is due to both the increased density of the interconnections (edges) and the homogeneous nature of the network structure.

These startling findings [1] significantly further our understanding of the fundamental properties of complex networks. One implication of the study is that both social networks and naturally occurring networks (Fig. 1), such as those involving gene regulation, are surprisingly hard to control. To a certain extent this is reassuring, because it means that such networks are fairly immune to hostile takeovers: a large fraction of the network's nodes must be directly controlled for the whole of it to change.

[Figure 1 | Tough job. Liu et al. [1] show that complex networks such as biological networks, metaphorically depicted by this locust swarm, are not at all easy to control.]
By contrast, engineered networks are generally much easier to control, which may or may not be a good thing, depending on who is trying to control the network. ■

Magnus Egerstedt is in the School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, Georgia 30332, USA. e-mail: magnus@gatech.edu

1. Liu, Y.-Y., Slotine, J.-J. & Barabási, A.-L. Nature 473, 167-173 (2011).
2. Mesbahi, M. & Egerstedt, M. Graph Theoretic Methods in Multiagent Networks (Princeton Univ. Press, 2010).
3. Kalman, R. E. J. Soc. Indus. Appl. Math. Ser. A 1, 152-192 (1963).
4. Rahmani, A., Ji, M., Mesbahi, M. & Egerstedt, M. SIAM J. Contr. Optim. 48, 162-186 (2009).
5. Tanner, H. G. 43rd IEEE Conf. Decision Contr. 3, 2467-2472 (2004).
6. Lin, C.-T. IEEE Trans. Automat. Contr. 19, 201-208 (1974).
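The driver-node count discussed above rests on a maximum-matching argument: in Liu et al.'s structural-controllability framework, every node left unmatched by a maximum matching must be driven directly, so N_D = max(N − |M*|, 1). A small sketch of that recipe follows; it is my own illustration, using a plain augmenting-path matching (the scale of the networks in the paper would call for something faster such as Hopcroft-Karp), and the example graphs are invented:

```python
def driver_node_count(n, edges):
    """Minimum number of driver nodes for a directed network of n nodes,
    per the maximum-matching criterion: N_D = max(n - |M*|, 1), where M*
    is a maximum matching pairing nodes' outgoing sides with incoming sides."""
    succ = [[] for _ in range(n)]
    for u, v in edges:
        succ[u].append(v)
    match_in = [None] * n   # match_in[v] = u means edge u->v is in the matching

    def augment(u, seen):
        # try to match u's outgoing side, rerouting earlier matches if needed
        for v in succ[u]:
            if v in seen:
                continue
            seen.add(v)
            if match_in[v] is None or augment(match_in[v], seen):
                match_in[v] = u
                return True
        return False

    matched = sum(1 for u in range(n) if augment(u, set()))
    return max(n - matched, 1)

# Directed chain 0 -> 1 -> 2: perfectly matched, one driver node suffices
print(driver_node_count(3, [(0, 1), (1, 2)]))          # 1
# Star 0 -> {1, 2, 3}: only one outgoing edge can be matched, so 3 drivers
print(driver_node_count(4, [(0, 1), (0, 2), (0, 3)]))  # 3
```

The two toy graphs show the hub-avoidance point in miniature: the chain is fully controllable from one end, while the star, despite its well-connected hub, needs most of its nodes driven.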
Category: Academic articles | 2718 reads | 0 comments
[Repost] Special Issue on: "Sliding Mode Control Theory and Applications"
zhaody 2010-12-24 08:29
International Journal of Modelling, Identification and Control (IJMIC)

Call For Papers: Special Issue on "Sliding Mode Control Theory and Applications"

Subject Coverage
Suitable topics include, but are not limited to, the following:
- Sliding mode control in power electronics
- Sliding mode control applications in robotics and complex motion steering systems
- Sliding mode observers and their applications
- Higher-order sliding modes
- Output feedback sliding mode controllers
- Discrete-time sliding mode controllers
- Combinations of artificial intelligence and sliding mode controllers
- Chattering analysis

Notes for Prospective Authors
Submitted papers should not have been previously published nor be currently under consideration for publication elsewhere. (N.B. Conference papers may only be submitted if the paper was not originally copyrighted and if it has been completely re-written.) All papers are refereed through a peer-review process. A guide for authors, sample copies and other relevant information for submitting papers are available on the Author Guidelines page.

Important Dates
Submission deadline: August 31, 2011

Editors and Notes
You may send one copy in the form of an MS Word file attached to an e-mail (details in Author Guidelines) to the following:

Prof. Shaocheng Qu, Department of Information and Technology, Central China Normal University, Wuhan, Hubei 430079, P.R. China. Email: qushaocheng@mail.ccnu.edu.cn

Prof. Feng Qiao, Faculty of Information and Control Engineering, Shenyang Jianzhu University, Shenyang, Liaoning 110168, P.R. China. Email: fengqiao@sjzu.edu.cn

Dr. Dongya Zhao, College of Mechanical and Electronic Engineering, China University of Petroleum, Changping, Beijing 102249, P.R. China. Email: dongyazhao@gmail.com

Please include in your submission the title of the Special Issue, the title of the Journal and the name of the Guest Editor.
Please visit: http://www.inderscience.com/browse/callpaper.php?callID=1510

Variable structure systems with sliding modes have recently become one of the most exciting research topics in a variety of fields, mainly because of their inherent robustness with respect to plant model uncertainty, parameter perturbations and external disturbances, which are unavoidable in any industrial environment. Furthermore, numerous very successful engineering applications of such systems have recently been reported. This special issue therefore focuses on state-of-the-art solutions in the field of sliding mode control system design and their industrial applications. We welcome original research papers that describe new ideas and approaches in theory and applications.

Guest Editors:
Prof. Shaocheng Qu, Central China Normal University, P.R. China
Prof. Feng Qiao, Shenyang Jianzhu University, P.R. China
Dr. Dongya Zhao, China University of Petroleum, P.R. China
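The robustness described in the call can be seen in a minimal sketch: a sliding mode controller for a double integrator under a bounded matched disturbance, simulated with forward Euler. The gains (c = 1, K = 2) and the disturbance are illustrative choices, not anything from the call for papers:

```python
import math

def simulate_smc(x1=1.0, x2=0.0, c=1.0, K=2.0, dt=1e-3, T=10.0):
    """Sliding mode control of a double integrator: x1' = x2, x2' = u + d.

    Sliding variable s = c*x1 + x2; control u = -c*x2 - K*sign(s).
    Then s' = -K*sign(s) + d, so with K larger than the disturbance
    bound (|d| <= 0.5 here), s reaches zero in finite time; on the
    surface, x1 decays like exp(-c*t) regardless of d.
    """
    t = 0.0
    while t < T:
        d = 0.5 * math.sin(2.0 * t)                 # bounded matched disturbance
        s = c * x1 + x2                             # sliding variable
        u = -c * x2 - K * math.copysign(1.0, s)     # equivalent + switching term
        x1 += dt * x2
        x2 += dt * (u + d)
        t += dt
    return x1, x2

x1f, x2f = simulate_smc()
```

Despite the persistent disturbance, the state ends up in a small chattering band around the origin, which is the robustness property the issue is about.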
Category: Uncategorized | 5337 reads | 0 comments
Learning Mercurial
weihuayi 2010-12-17 12:22
Mercurial is distributed revision control software.

1. Create the Mercurial configuration file .hgrc in your home directory:

   # This is a Mercurial configuration file.
   [ui]
   username = Huayi Wei <huayiwei1984@gmail.com>

2. Common commands:

   hg clone hello hello-push
   hg log -v | -p | -r
   hg commit
   hg incoming | hg pull ../my-hello
   hg outgoing | hg push

3. It is possible, and often useful, to have the default location for hg push and hg outgoing be different from those for hg pull and hg incoming. We can do this by adding a default-push entry to the [paths] section of the .hg/hgrc file, as follows:

   [paths]
   default = http://www.selenic.com/repo/hg
   default-push = http://hg.example.com/hg

Starting a new project:

   hg init myproject

Simplifying the pull-merge-commit sequence:

   hg pull -u
   hg merge
   hg commit -m 'Merged remote changes'
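The .hg/hgrc entries above are ordinary INI syntax. As an illustration (not a Mercurial API), Python's configparser can read them, which is handy for a script that wants to inspect a repository's pull and push locations; the file content here mirrors the example above:

```python
import configparser
import io

# Hypothetical .hg/hgrc with separate pull and push locations,
# mirroring the example in the post.
HGRC = """\
[ui]
username = Huayi Wei <huayiwei1984@gmail.com>

[paths]
default = http://www.selenic.com/repo/hg
default-push = http://hg.example.com/hg
"""

cfg = configparser.ConfigParser()
cfg.read_file(io.StringIO(HGRC))

pull_url = cfg["paths"]["default"]                      # used by hg pull / hg incoming
push_url = cfg["paths"].get("default-push", pull_url)   # hg push falls back to default
```

Note how the fallback mimics Mercurial's own behaviour: if default-push is absent, push goes to default.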
Category: Mercurial: distributed revision control | 1 read | 0 comments
Using Biomarkers: Problems & Control (Case-Cohort Study)
molecularepi 2010-10-6 19:54
Category: Biomarkers | 5114 reads | 0 comments
The ASIS&T 2010 Annual Meeting Pays More Attention to Controlled Vocabularies and Taxonomies
xuechunxiang 2010-8-30 16:59
The ASIS&T 2010 Annual Meeting will be held in October at the University of Pittsburgh (http://www.asis.org/asist2010/index.html). The theme of this year's meeting is "Navigating Streams in an Information Ecosystem". Judging from the seminars, workshops and invited keynotes, controlled vocabularies and taxonomies remain this year's focus, along with information retrieval and human-computer interaction.

There are five seminars: the first two are training sessions on SKOS-based controlled-vocabulary construction and related software; the third and fourth concern taxonomy tools and taxonomy design; the fifth is about sense-making and touches on human-computer interaction:

(1) SKOS-2-HIVE: Part 1: Introductory Session: SKOS-2-HIVE
(2) SKOS-2-HIVE: Part 2: Implementing HIVE
(3) Taxonomy Tools: Requirements and Capabilities
(4) Getting Started with Business Taxonomy Design
(5) Sense-Making Methodology Interviewing Tutorial: Approaches for Research, Practice, Pedagogy, and Design

In addition, the workshops also cover taxonomy research and practice:

* What in the World are We Talking About? Info Needs, Seeking And Use: Key Concepts: Convergences And Divergences - The Differences That Definitions Make in Core Concepts (SIGs SI & USE). Organizers: Brenda Dervin, Howard Rosenbaum and Christine Urquhart
* Defining the Limits of Classification Research Practice (SIG CR). Speakers: Diane Neal, Megan Winget, Molly Tighe, Grant Campbell, Joe Tennis, Abby Blachly
* Assessing Research Data Needs at Your Institution (SIG DL). Speakers: Barrie Hayes, Susan Wells Parham, Jake Carlson, Brian Westra
* Current Research and Thinking in Image Analysis, Descriptions, and Systems (SIG VIS). Speakers: Abby Goodrum, Samantha Hastings, Corinne Jorgensen, Krystyna Matusiak, Elaine Menard, Diane Neal, Brian O'Connor, Abebe Rorissa, Jaime Snyder
* Intercultural Information Ethics in the Global Information Ecosystem: Opportunities and Challenges (SIG III). Speakers: Dennis Ocholla, Soraj Hongladarom, Rafael Capurro, Mohammed Aman, Thomas Froehlich, Toni Carbo (moderator)
* Educating Information Professionals Around the World (SIGs III & ED). Speakers: Gwen Alexander, Alexander Arbuthnot, Gail Bonath, Pascal Calarco, Anne Caputo, Rumi Graham, Guoqiu Li, Ann Prentice, Yukiko Sakai, Nancy Roderer (moderator)

Does all this suggest that controlled and normalized vocabularies remain the indispensable path for current development in information organization and retrieval?
Category: Sailing the Sea of Learning | 5335 reads | 3 comments
Stochastic Control and Optimal Stopping
huguixin2002 2010-5-15 08:45
Stochastic control can be roughly divided into four classes: open-loop control, feedback control, partially observed control (the class of control problems related to filtering), and Markov control. Markov control is a particularly important class: although it has its special features, many real-world control problems fall into this class, and it is comparatively convenient to work with. The HJB equation plays a very important role in this class of control. For optimal stopping problems, the crucial issue is the existence of an optimal stopping time; in many cases an optimal control exists but an optimal stopping time does not.
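For reference, the HJB equation mentioned above takes the following standard textbook form for a one-dimensional Markov (controlled-diffusion) problem with infinite-horizon discounted cost; this is a generic illustration, not part of the original post:

```latex
% Controlled diffusion: dX_t = b(X_t, u_t)\, dt + \sigma(X_t, u_t)\, dW_t
% Objective: minimize E \int_0^\infty e^{-\rho t} f(X_t, u_t)\, dt over Markov controls u = u(x)
\rho\, V(x) \;=\; \min_{u \in U} \Big\{ \, f(x, u) \;+\; b(x, u)\, V'(x)
\;+\; \tfrac{1}{2}\, \sigma^2(x, u)\, V''(x) \, \Big\}
```

A Markov control is then recovered pointwise as the minimizing u in the braces, which is one reason this class is convenient to work with.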
Category: Research experiences | 7 reads | 0 comments
What is FMO (Fluorescence Minus One)?
FlowJo 2010-1-13 06:33
Category: Using FlowJo | 6825 reads | 0 comments
Joseph Juran
fmchen 2010-1-3 12:01
Joseph Juran was born in Braila, Romania. His family came to the United States in nineteen twelve, when he was eight, and settled in Minneapolis, Minnesota. He studied electrical engineering at the University of Minnesota, where he was also the school champion at the game of chess.

After college, the Western Electric Company put him to work on mathematical methods of quality control. He became interested in the idea he termed the "vital few and trivial many", popularly known as the eighty-twenty rule. It could mean, for example, that eighty percent of manufacturing problems result from twenty percent of the causes. He named it the Pareto principle, for the Italian economist Vilfredo Pareto. A century ago, Pareto observed that eighty percent of the wealth in Italy went to twenty percent of the population. But Joseph Juran came to recognize that he had misnamed this principle. He knew that unequal distribution had long been observed in other areas, not just wealth. Yet he gave Pareto credit for identifying it as universal when, it seemed, he could have taken the credit himself. He could have called it, he said, the Juran principle.

In nineteen fifty-one, he published his "Quality Control Handbook". This influential book especially interested the Japanese. He was invited to teach in Japan, and he advised some of its largest companies. The Japanese also had help from another American, William Edwards Deming. The two experts helped Japan become a world leader in quality control.

In nineteen sixty-four, Joseph Juran published "Managerial Breakthrough". This book formed the basis of several other strategies to reduce manufacturing mistakes and cut waste, among them the methods known as Six Sigma and lean management.

In nineteen seventy-nine, Joseph Juran established the Juran Institute in Connecticut. It works with organizations that want to improve quality. But the main purpose of the institute, he said, is to improve society.
Joseph Juran died on February twenty-eighth in Rye, New York. That was where he lived with Sadie Juran, his wife of eighty-one years.
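Juran's "vital few and trivial many" can be made concrete with a small Pareto computation; the defect counts below are invented purely for illustration:

```python
# Hypothetical defect counts per cause (illustrative data, not from Juran).
defects = {
    "solder bridging": 50, "misalignment": 30, "cold joints": 5,
    "missing parts": 4, "scratches": 3, "wrong polarity": 2,
    "bent leads": 2, "contamination": 2, "label errors": 1, "other": 1,
}

total = sum(defects.values())
ranked = sorted(defects.items(), key=lambda kv: kv[1], reverse=True)

# Share of all defects explained by the top 20% of causes.
vital_few = ranked[: len(ranked) // 5]          # top 2 of 10 causes
share = sum(count for _, count in vital_few) / total
print(f"top 20% of causes explain {share:.0%} of defects")  # → 80% with this data
```

With this (deliberately Pareto-shaped) data, two of the ten causes account for eighty percent of the defects, which is exactly the eighty-twenty pattern the post describes.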
3851 reads | 2 comments
[Repost] At the Gates of the Millennium: Are We in Control?
josh 2009-7-31 10:34
Proceedings of the 38th Conference on Decision & Control, Phoenix, Arizona, USA - December 1999

Panos Antsaklis: The 20th century has been full of marvelous advancements in science and technology that have changed dramatically the way we live and work, the most recent example being the role the internet has assumed in our everyday lives. Our area of Systems and Control is based on firm mathematical foundations, and significant theoretical contributions to the area have been made in the past half century. However, it sometimes appears that we have not been taking full advantage of the incredible advances in sensor, actuator and microprocessor technologies that are taking place. If that is the case, what do we plan to do in the future to meet the challenges of the 21st century?

There are challenges in designing highly complex engineering systems to meet very ambitious goals in manufacturing and process industries, and in transportation and communications, to mention but a few. In addition, all these systems are expected to perform well with minimum human supervision, that is, with higher autonomy. This presents considerable challenges but also wonderful opportunities, as advances in sensors, actuators, microprocessors and computer networks offer unique opportunities to implement ambitious control and decision strategies. To meet the challenge we will need to develop new methodologies and new ways of addressing control problems, and will also need to adjust the way we teach control to students. Changes in control education, together with adjustments in research directions and improvement of the public's awareness of our role and contributions, may provide the necessary foundations and tools to meet this challenge in the 21st century. It was decided to organize this panel discussion to hear the opinions of a number of experts who agreed to comment on these issues.
The panelists come from universities and industry, and their contributions to the field of systems and control collectively span many decades. They bring to the discussion their considerable expertise and experience; they review past notable research results and make recommendations on how to meet the future challenges.

Timothy L. Johnson:

(1) Five most notable research results in Systems and Control Theory in the last century:

- Bode (1920s) - Stability of operational amplifiers and the supporting approach of the Bode plot. Applications too numerous to mention.
- Caratheodory (1935) - Variational calculus leading to the Maximum Principle. Applications to orbital dynamics and spacecraft.
- R. E. Bellman (1957) - Dynamic Programming. Applications to social sciences and business.
- R. E. Kalman (1960) - Kalman Filter. Numerous applications to tracking, estimation, and signal processing.
- G. Zames (et al., 1970s) - Internal model principle and conic sector theory for nonlinear control. Basis for stability analysis of numerous nonlinear and adaptive control schemes.
- Robust Linear Multivariable Control (1980s) - Contributors too numerous to mention. Applications just beginning.

(2) Five research milestones for the next decade:

- A practical and general theory of discrete dynamical systems. Petri nets, queueing theory, and CSP notations are all of limited application; a tool with the generality of difference or differential equations is lacking, and this has impeded progress in practical applications. These should be presented as special cases of a more general theory which is computationally tractable.
- A method for the analysis of qualitative properties of hybrid systems. A full theory of these systems probably will not be developed within the decade. However, useful methods for the analysis of qualitative properties, and stability in particular, would open the door to further progress.
- Formal verification methods for control systems.
Given a control system (software) implementation and a model of the plant, it should be possible to prove that the implementation is correct, to the extent that it achieves the performance specifications. These methods should apply to complex control algorithms involving rule bases, AI, neural nets, and other more modern methods. (This is an analogue of the formal verification of computer programs.)

- Results which are central to the synthesis of control and communications systems; in particular, results which lead to a practical solution to problems of decentralized control in the presence of communications bandwidth limitations.
- Quantum control systems: application of feedback theory to quantum mechanical systems, e.g., to understand quantum phenomena in particle physics, biochemistry or astronomy where feedback is present.

(3) Education issues: Students should have a broad engineering (or science) background, detailed practical expertise in at least one applied field, and (in the case of Ph.D. students) either leading-edge theoretical knowledge in one area or patentable inventions in a leading-edge technology. A top priority continues to be closing the widening gap between theory and practice in control engineering.

(4) Technology issues: Control is becoming a specialty of applied mathematics and embedded software engineering. The field must either recognize and pursue excellence in these fields or make major changes to reintroduce its linkage to physical systems and system-design engineering.

(5) Computational tools: The historical trend for new control methods to prove themselves first through physical applications may be changing. An alternative path will be to introduce new theories in the form of design software accessible via the Web, and then to let market demand pick the winners.
Currently popular control algorithms (e.g., linear time-invariant compensation) are far too restricted (as classes of computations), and will likely give way to more general algorithms which synthesize many approaches and/or use on-line adaptation and design.

John C. Doyle: I'll interpret "systems" very liberally and broadly, perhaps too broadly, and include both notable research results and the subsequent larger programs that followed, with an emphasis on classical results from mid-century. I'll particularly highlight the fundamental tradeoffs in feedback systems that were first articulated in Bode's integral formula and later in various interpolation results by Zames and others, which I would, mostly for the sake of an interesting argument, rank at the top of my systems top 5:

1) Feedback, dynamics, and causality (Bode, Zames, ...).
2) Undecidability and computational complexity.
3) Chaos and dynamical systems (Poincare, Lorenz, ...).
4) Information (Shannon, Kolmogorov, ...).
5) Optimal control (Pontryagin, Bellman, ...).

The twentieth century may be viewed as bringing near-closure to the first scientific revolution, which aimed for a simple, certain, reproducible view of nature, in part by a radical denial of the complex and uncertain. Quantum mechanics, relativity, the nature of the chemical bond, and the role of DNA in genetics were among the highlights of this reductionist program, which could presumably be placed in some similar top 5. Mainstream science has focused overwhelmingly on characterizing the fundamental material and device properties of natural systems, and in contrast has provided few rigorous and predictive tools for dealing with the complexity and uncertainty of the real world outside the laboratory. Unfortunately, current mainstream advocates of a new science of complexity have further abandoned rigor and predictability, in favor of vague notions of emergence and self-organization.
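Bode's integral formula, which Doyle singles out, can be stated in its standard textbook form (added here for the reader's convenience): for a loop transfer function L(s) with at least two more poles than zeros,

```latex
\int_0^{\infty} \ln \lvert S(j\omega) \rvert \, d\omega
\;=\; \pi \sum_{k} \operatorname{Re}\, p_k ,
\qquad S(s) \;=\; \frac{1}{1 + L(s)} ,
```

where the p_k are the open-loop poles of L in the right half-plane (the sum is zero for a stable loop). Pushing the sensitivity |S| below one in some frequency band necessarily pushes it above one elsewhere, which is the "robustness tradeoff imposed by causality" referred to in the discussion.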
Hopefully this collection of systems results will form the basis for a truly new science of complex systems, which, despite the recent hype, does not yet exist. The existing theory is far too disconnected and fragmented, and creating a more unified picture of computation, dynamics, feedback and information is the great challenge of the next decade and next century. Of course, this has been the aim of many researchers at least since Wiener, and the accomplishments so far have not been at all encouraging.

It is natural that Bode's integral formula should have a central place in any theory of complex systems, as it was the first result to focus completely on robustness tradeoffs, in this case imposed by causality. The part count in complex systems, from biology to engineering, is dominated by the need to provide robustness to uncertain environments and components. Indeed, most systems could be built in the laboratory under idealized circumstances with orders of magnitude less complexity than is required in their natural environment. Thus robustness tradeoffs must be at the heart of any theory of complexity, with limitations due to computation, dynamics, nonlinearity, and information playing important supporting roles.

Yu-Chi Ho: The test of time and the rules of history rule out mentioning anything developed in the past 25 years or involving living persons. Furthermore, scientific discovery is often a matter of standing on the shoulders of others; to single out specific results does not seem fair to those who laid the foundation. Instead, I propose to list a couple of ideas that seem to me to have influenced the development of our field in a major way:

- The fundamental role and the myriad ways of probability and stochastic processes in systems work.
- The concept of what constitutes a solution to a problem, e.g., that which can be reduced to a routinely solved problem such as numerical integration, and how technology influences it.
- The notion of dynamics and feedback in all their ramifications.

The first item represents how knowledge from outside the field influenced our research, while the third states what specific concepts our field contributed to other fields. The second item deals with how practices in science and mathematics are changed by technology. These notions are generic and have parallels in other topics and fields.

Scientific crystal-balling has a notorious record. The dust heap of past predictions is filled with gross miscalculations and estimations by noted scientists with the best of intentions. Let me try to approach the question "what's next in control systems in the 21st century?" in a somewhat different way. During my travels and lectures, I am often asked by young scientists and engineers starting out in their careers what are profitable avenues of research to pursue. One is often tempted to point to one's own current research topic, which by definition must be the most interesting thing to do. However, this is selfish and dangerous advice. My considered reply, which I myself have followed, is this: go find a real-world problem that a group of people is eager to solve, that happens to interest you for whatever reason, and that you don't know much about. Make a commitment to solve it, but not a commitment to use tools with which you happen to be familiar.

Such an approach has several immediate advantages. First, if you are successful then you have some free built-in PR; unsolicited testimonials by others are the best kind of publicity for your work. Second, most probably you have discovered something new or have found a new application of existing knowledge; in either case, you can try to generalize such a discovery later into a fruitful research area for which you will be credited with its founding. Third, in a new problem area there is generally less legacy literature you will have to learn and reference. Fourth, a new problem area is like a newly discovered mine.
For the same effort you can pick up more nuggets lying near the surface than by digging deep in a well-worked-out mineshaft. By the same reasoning, the probability of serendipity at work is also, by definition, higher in a new area. Lastly, even if you are unsuccessful in solving the original problem, you will at the very minimum have learned something new and broadening, which will increase the chance of your success in future tries. My own personal experience, whether in differential games, manufacturing automation, perturbation analysis in discrete-event simulation, or ordinal optimization, reinforces the above belief. Above all, faith in the ability of the future generation of scientists and engineers makes me an optimist in saying: the best is yet to be; you ain't seen nothing yet. It is fine to make predictions and to look forward, but there is no need to get too obsessed with divining the future. As far as technology and computational issues are concerned, I believe I have already given my answer in the recent op-ed piece published in the June 1999 issue of the IEEE Control Systems Magazine, "The No Free Lunch Theorem and the Human-Machine Interface". I shall simply add, by repeating what I said at my Bellman Award acceptance, that the subject of control, which is based on mathematics and enabled by computers, is about to have "a new birth of freedom" under computational intelligence.

John S. Baras: The five most notable research results in systems and control are, from my perspective:

- The maximum principle
- Dynamic programming
- System realization theory (both linear and nonlinear)
- Nonlinear filtering theory and the general separation theorem in partially observed stochastic control
- Robust control synthesis (in the sense of linear and nonlinear H-infinity theory)

An important limitation of current theories is that they do not take explicitly into account hardware implementation limitations.
An important such limitation is limited bandwidth in feedback loops, and the limited complexity and computational capability of the controller. Developing a methodology for the systematic design of single and networked controllers under severe bandwidth constraints in the feedback loop is an important challenge for the next 5-10 years. As implementations with MEMS and microsystems become more attractive, this challenge will translate into many benefits and applications.

We still do not have a satisfactory and quantitative way to characterize the intelligence of a controller or of a system. The late George Zames had initiated an effort to define such an index, roughly a measure of the tasks and satisfactory performances an intelligent controller could achieve versus the tasks and satisfactory performances of a classical controller. George focused on adaptive controllers in notes and discussions I had with him. The challenge involves the characterization of performance in unknown environments, learning, controller and task complexity, and the associated trade-offs. At the conservative end we have robust control. What lies at the other end? Can one develop a theory to start building meaningful and useful such indices for interesting classes of systems?

MEMS, nanoelectronics, nanosensors and nanoactuators bring sensors and actuators into much closer coupling than before. At these scales the physics is quite different and our traditional models need to be rethought. More specifically, one should think of the combined design of sensors and actuators without early decisions on system architecture. How can we develop systematic theories for such designs? To what extent do these new systems, at these extreme scales, touch upon quantum systems and quantum computation? The recent excitement in quantum computation and its related physical implementations involves some fundamental questions on measurement/sensing and actuation.
Systems and control theorists can make significant contributions here. Networks of systems, each equipped with sensors and actuators, are a fundamental paradigm of recent technological and other systems. In such networked systems, subsystems interact through local interactions, and an important challenge is to develop modeling and control theories that explain coordination, and emerging global behavior, from these local interactions. This is a difficult challenge but a promising one: from sensor webs, to microrobots, to biological systems, it is a central problem.

On the educational side, we should be promoting systems and control education as part of the fundamental education every engineering undergraduate receives; we must accomplish this goal within the next decade. At both the undergraduate and graduate levels we should emphasize a more balanced view between system modeling and control, not just control. In addition, it would be important to arrange for both undergraduate and graduate students specializing in systems and control to spend some time in industry internships targeted at industrial-strength design projects.

The ability to miniaturize sensors and actuators, and to produce materials, sensors and actuators essentially made to order, will change the metrics we currently use to evaluate control and system implementations. Handling efficiently the enormous amounts of information needed to describe such systems, controls, and performance criteria, via appropriate new abstractions, will require fundamentally new developments.
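Several panelists rank the Kalman filter among the century's most notable results. As a miniature illustration (a standard textbook sketch with invented numbers, not anything from the panel itself), here is a scalar Kalman filter estimating a constant from noisy measurements:

```python
import random

def kalman_constant(measurements, r=1.0, q=1e-5):
    """Scalar Kalman filter for the model x_{k+1} = x_k + w (Var w = q),
    z_k = x_k + v (Var v = r). Returns the final state estimate."""
    x_hat, p = 0.0, 1e3           # diffuse initial estimate and variance
    for z in measurements:
        p += q                    # predict: variance grows by process noise
        k = p / (p + r)           # Kalman gain
        x_hat += k * (z - x_hat)  # update with the innovation z - x_hat
        p *= (1.0 - k)            # posterior variance shrinks
    return x_hat

random.seed(42)
true_value = 5.0
zs = [true_value + random.gauss(0.0, 1.0) for _ in range(200)]
estimate = kalman_constant(zs)
```

With a nearly static model (small q), the filter behaves like a recursive averager, and the estimate settles close to the true value despite unit-variance measurement noise.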
Category: Engineering Cybernetics | 321 reads | 0 comments

Archiver | Mobile version | ScienceNet ( 京ICP备07017567号-12 )

GMT+8, 2024-4-26 19:04

Powered by ScienceNet.cn

Copyright © 2007- China Science Daily (中国科学报社)
