Thomson Reuters yesterday released its list of the authors of the world's most-cited papers for 2016 (the names are final; institutional affiliations will be finalized on 31 October). According to the Thomson Reuters press release, the most highly cited researchers of 2016 represent, to some extent, the most influential scientific minds in the world.

Since no explanation of how the list was compiled has been published, it is unclear whether the ranking is based on each author's total annual output or on individual papers; the same person appears twice on the list, which suggests it is calculated per paper. Corrections are welcome.

Judging from the names and institutions listed, these highly cited papers span many fields, including clinical medicine, psychiatry, physiology, chemistry, physics, space science, ecology, biology and environmental science; chemistry and clinical medicine account for two papers each.

By country, apart from one paper by an Austrian scholar and one by a German scholar, every author on the list comes from an English-speaking country, chiefly the United States, the United Kingdom and Australia. The USA holds 6 of the slots, and A. Paul Alivisatos of the University of California, Berkeley appears twice, once in physics and once in chemistry, which is striking. Readers who are interested could analyse the list further and see whether the riddle of highly cited papers can be solved.

Besides interpreting the Thomson Reuters announcement, this post also raises the topic of citation counting itself. The top-10 highly cited results are based on citations within the Web of Science database. For non-English-speaking countries, especially one like China where Chinese is the main language of publication, citations to papers published in non-English or non-SCI-indexed journals cannot be counted, so the citation counts of scholars from those countries are heavily discounted.

Name | Field | First institution | Second institution
A. Reimer | Space Science | Univ Innsbruck, Austria
A. Harvey Millar | Plant & Animal Science | Univ Western Australia, Australia
A. John Rush | Psychiatry/Psychology | Univ Texas SW Med Ctr Dallas, USA
A. John Camm | Clinical Medicine | St Georges Univ London, UK
A. Keith Dunker | Biology & Biochemistry | Indiana Univ, USA
A. Michael Lincoff | Clinical Medicine | Cleveland Clin Fdn, USA
A. Paul Alivisatos | Physics | Univ Calif Berkeley, USA | Lawrence Berkeley Natl Lab, USA
A. Paul Alivisatos | Chemistry | Univ Calif Berkeley, USA | Lawrence Berkeley Natl Lab, USA
A. Stephen K. Hashmi | Chemistry | Ruprecht Karl Univ Heidelberg, Germany | King Abdulaziz Univ, Saudi Arabia
A. Townsend Peterson | Environment/Ecology | Univ Kansas, USA
More and more researchers now understand the importance of citation counts. The motivation is often grant applications or award nominations, but compared with looking only at journal impact factors, this is undoubtedly a good development.

The usual way to retrieve one's own published papers is to build a search query from author identity information such as name, institution and mailing address. This demands considerable experience from the searcher and easily produces false hits and missed records. The worst headache is that Chinese names have many duplicates, which become even harder to distinguish once abbreviated; for a researcher with a long career and many papers, just writing the query takes a great deal of time. At work I have heard of a prominent scientist whose papers and citations were so numerous that a student worked on them for two weeks without finishing.

Because of these retrieval difficulties, most researchers, despite a general need to keep track of their own citation counts, only learn them when a search agency prepares a citation report for them. In fact, it is very easy to track the citations of your own SCI papers yourself, using the SCI accession number. Every publication indexed by SCI has an accession number, which can be found as follows:

After retrieving the desired records in the Web of Science database, select them, and in the output area at the bottom of the page choose "Full Record" when exporting, saving in tab-delimited (Win) format, as in the screenshot (not reproduced here).

Next, import the resulting text file into Excel; the UT field is the accession number.

Copy the accession numbers into a text editor and build a query of the following form, joining the numbers with "OR":

UT=(000172783900007 or 000179514300007 or 000220405600004 or 000238139000001 or 000242007100005 or 000243505400012 or 000246836000004 or 000257849800001 or 000258529400005 or 000263370200104 or 000263690600010 or 000266138600009 or 000270640300016 or 000273754800007 or 000274839200002 or 000281107500010 or 000289263900009 or 000291942900006)

Go to "Advanced Search" in Web of Science, paste the query and run it; the result appears at the bottom of the page under "Search History". Click the result to open the record list, then use Web of Science's analysis features: click "Create Citation Report" on the right of the page to see the citation details of these papers.

Searching by SCI accession number is precise, because the numbers are unique, and the query is easy to maintain: a newly published paper only requires adding one more "OR" term. It is therefore well suited to tracking a particular author's papers. I hope this helps.
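Assembling and maintaining that query by hand gets tedious as the list grows, so a small script can help. A minimal sketch in Python, assuming the accession numbers have already been copied out of the exported UT column (the helper name `ut_query` and the chunking limit are my own assumptions, not part of Web of Science):

```python
def ut_query(accession_numbers, chunk_size=50):
    """Build Web of Science advanced-search queries of the form
    UT=(NNN or NNN or ...), splitting long lists into chunks in case
    the search form limits the query length."""
    queries = []
    for i in range(0, len(accession_numbers), chunk_size):
        chunk = accession_numbers[i:i + chunk_size]
        queries.append("UT=(" + " or ".join(chunk) + ")")
    return queries

numbers = ["000172783900007", "000179514300007", "000220405600004"]
print(ut_query(numbers))
# -> ['UT=(000172783900007 or 000179514300007 or 000220405600004)']
```

Adding a newly published paper then means appending one accession number to the list and regenerating the query, rather than editing the long query string by hand.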
Want your papers to be cited more? Then cite more papers.
Source: http://songshuhui.net/archives/41975.html

Research shows that scientists who cite more references in their papers also get cited more. The researchers surveyed the 53,894 papers published in the journal Science between 1901 and 2000 and found an extremely strong relationship between the number of references an article cites and the number of citations that same article receives. Gregory Webster, the professor at the University of Florida in Gainesville who led the study, put it this way: if you want your articles to be cited more, cite more articles.

The study also found that the strength of the relationship between reference counts and citation counts has more than tripled over the past hundred years. Notably, the study ruled out the influence of review articles. Compared with original research papers, reviews cite more references and are in turn cited by many other articles; yet Webster found the relationship between reference counts and citation counts to be stronger for original papers than for reviews.

Webster has now begun analysing papers in Nature, another famous scientific journal, and is also talking with scientists to see whether the link between references and citations is causal. Although evidence is still lacking, Webster suspects that scientists handle their reference lists with a tit-for-tat mentality.

Some scholars read the study differently, however. The bibliometrician Jonathan Adams argues that since the world's output of papers keeps growing, the number of references grows too, and gross citation rates rise accordingly. (A small quibble here: more papers naturally mean a larger total number of citations, but why would that imply a higher citation rate, that is, more citations per paper?)

An easy way to boost a paper's citations
An analysis of over 50,000 Science papers suggests that it could pay to include more references.
Source: http://www.nature.com/news/2010/100813/full/news.2010.406.html
Zoë Corbyn

A long reference list at the end of a research paper may be the key to ensuring that it is well cited, according to an analysis of 100 years' worth of papers published in the journal Science. The research suggests that scientists who reference the work of their peers are more likely to find their own work referenced in turn, and the effect is on the rise, with a single extra reference in an article now producing, on average, a whole additional citation for the referencing paper.

"There is a ridiculously strong relationship between the number of citations a paper receives and its number of references," Gregory Webster, the psychologist at the University of Florida in Gainesville who conducted the research, told Nature. "If you want to get more cited, the answer could be to cite more people."

Although previous research has suggested or shown a relationship, Webster says, he believes that his study is the first to investigate the phenomenon comprehensively: he has looked at different journals and a large number of articles over a long timescale.
Webster has also found the effect, although to a lesser extent, in the Journal of Consulting and Clinical Psychology and in Evolution and Human Behavior, both important journals in their fields, with the results for the latter published last year [1].

His latest study, presented at the International Society for the Psychology of Science & Technology conference in Berkeley, California, on 7 August, gathered data from the Thomson Reuters Web of Science database for all 53,894 articles and review articles published in the journal Science between 1901 and 2000. A plot of the number of references listed in each article against the number of citations it eventually received reveals that almost half of the variation in citation rates among the Science papers can be attributed to the number of references that they include. And, contrary to what people might predict, the relationship is not driven by review articles, which could be expected, on average, to be heavier on references and to garner more citations than standard papers.

The study also looked at how the relationship has changed over time, finding that it had strengthened more than threefold over the 100-year period studied. "By most metrics it is considered a pretty big effect," says Webster. "There was a small difference with review articles but, in fact, it was in the wrong direction." On average, review articles actually showed less of a relationship than standard articles.

Webster, who now wants to extend the analysis to include Nature papers as well as to interview scientists about their behaviour, says there is not yet enough evidence to say for sure that the relationship is causal. But he thinks that the psychology of working scientists may see them behave in an almost tit-for-tat way that boosts their citation counts. "Relationships based on reciprocal altruism may bloom and fade, but over time they might be driving the effect," Webster says. "Scientists are subject to social forces as much as anyone in any other profession."
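The "almost half of the variation" figure is the r-squared of a simple correlation between each paper's reference count and its citation count. A minimal sketch of that kind of analysis in Python, run here on synthetic data (the numbers below are illustrative, not Webster's dataset):

```python
# Correlate each paper's reference count with its citation count
# and report r and r^2. The data are synthetic, for illustration only.
import math
import random

random.seed(0)
references = [random.randint(5, 60) for _ in range(1000)]
# Simulate citations loosely coupled to reference counts, plus noise.
citations = [r + random.gauss(0, 10) for r in references]

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(references, citations)
print(f"r = {r:.2f}, r^2 = {r*r:.2f}")
```

An r-squared near 0.5 would correspond to the study's claim that about half the variation in citations is accounted for by reference counts; whether that reflects causation is, as the article notes, a separate question.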
But others urge caution in interpreting the results. Jonathan Adams, a bibliometrics expert and director of research evaluation at Thomson Reuters, says that although the findings are intriguing, they are not surprising. At a global level, he adds, there are increasing levels of output and therefore referencing, which would, of course, increase gross citation rates. Different subjects also have very different citation patterns, and lumping them together as one doesn't tell you very much. A basic citation count must be contextualised against typical rates for the field, says Adams.

Webster says it is true that he didn't look at different disciplines, but that was not what he was interested in. "This study is just looking at the entire pattern," says Webster. "The research question I was really interested in is what do things look like, in general, on average, in Science." He says that while he agrees that the scientific enterprise has expanded over time, this shouldn't necessarily affect the relationship between citations and references. They might both increase in tandem, but the effect appears to be independent of that trend.

References
1. Webster, G. D., Jonason, P. K. & Schember, T. O. Evol. Psychol. 7, 348-362 (2009).