ScienceNet


Tag: Information Fusion

Related posts

Forum | Author | Replies/Views | Last post

No related content

Related blog posts

Efficient, Distributed, Joint Sensor Localization and Multi-Target Tracking Based on Arithmetic-Average Consensus
Popularity 1 | JRoy | 2019-11-25 11:26
A Computationally Efficient Approach for Distributed Sensor Localization and Multitarget Tracking
Publisher: IEEE
Authors: Kai Da; Tiancheng Li (corresponding author); Yongfeng Zhu; Qiang Fu

Abstract: In the context of distributed target tracking based on a mobile peer-to-peer sensor network, the relative locations of the sensors are critical for inter-node information exchange and fusion. For accurate coordinate calibration between neighboring sensors, namely sensor localization, we propose a computationally efficient approach that minimizes the mismatch error between the position estimates of the common targets produced at neighboring sensors. This mismatch error is given by a Wasserstein-like distance: the mean square error between two sets of position estimates that are associated efficiently via Hungarian assignment. Simulations on the testbed of an arithmetic-average-fusion-based probability hypothesis density (PHD) filter demonstrate that our approach performs similarly to the cutting-edge approach based on loopy belief propagation, while computing much faster and at much lower communication cost.

Published in: IEEE Communications Letters (Early Access), pp. 1-1. Date of publication: 21 November 2019.

In short, the method has a clear physical interpretation, is computationally simple yet stable, and requires little communication; on the arithmetic-average-fusion PHD testbed it performs comparably to the existing posterior-matching approach while running faster and communicating less.

This paper takes the author's series on arithmetic-average consensus fusion a step further and is the first to consider sensor-position uncertainty in a dynamic sensor network. The filter used is still the most basic PHD filter, so the method can later be extended to other filters.

Earlier posts in this series:
Distributed Bernoulli-Filter Joint Target Detection and Tracking Based on Arithmetic-Average Consensus
The Multi-Target Information Fusion Problem
Parallel Consensus: Network Communication Runs in Step with Local Filtering!
Distributed Multi-Target Tracking over Range-Limited Sensor Networks
Distributed Network Information Sharing: Many Could Be Better Than All
The Distributed Multi-Target Tracker with Minimal Communication
A Robust Multi-Sensor PHD Filter Based on Multi-Sensor Measurement Clustering
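The mismatch error described above is easy to prototype. Below is a minimal sketch (my own illustration, not the authors' code) of the assignment-based mean-square mismatch and a brute-force search for the translation offset between two sensors' coordinate frames; the function names and the grid search are assumptions made only for this example.

import numpy as np
from scipy.optimize import linear_sum_assignment

def mismatch_error(est_a, est_b):
    # Wasserstein-like mismatch: mean squared error between two sets of
    # position estimates after optimal (Hungarian) association.
    cost = np.sum((est_a[:, None, :] - est_b[None, :, :]) ** 2, axis=-1)
    rows, cols = linear_sum_assignment(cost)
    return cost[rows, cols].mean()

def calibrate_offset(est_a, est_b, search=np.linspace(-5.0, 5.0, 101)):
    # Brute-force search for the 2-D offset of sensor B's frame that minimizes
    # the mismatch error (a stand-in for the paper's actual minimization).
    best, best_err = np.zeros(2), np.inf
    for dx in search:
        for dy in search:
            err = mismatch_error(est_a, est_b + np.array([dx, dy]))
            if err < best_err:
                best, best_err = np.array([dx, dy]), err
    return best, best_err

# Toy usage: sensor B reports the same 5 targets shifted by (2, -1) plus noise.
rng = np.random.default_rng(0)
truth = rng.uniform(0.0, 50.0, size=(5, 2))
est_a = truth + 0.3 * rng.standard_normal((5, 2))
est_b = truth - np.array([2.0, -1.0]) + 0.3 * rng.standard_normal((5, 2))
offset, err = calibrate_offset(est_a, est_b)
print(offset, err)   # offset should come out close to (2, -1)

In practice the minimization over the offset would be done more cleverly (for example in closed form or by gradient descent); the grid search only keeps the example short.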
Category: Research Notes | 4379 reads | 2 comments
The Multi-Target Information Fusion Problem
Popularity 4 | JRoy | 2019-2-9 16:48
In information fusion there is a hugely popular concept, Covariance Intersection (CI). It was devised by Uhlmann and Julier, the same pair who created the UKF, arguably the most successful variant of the Kalman filter apart from the EKF, and it appeared within roughly the same two years (the earliest publications date to 1995-1996, and both ideas appear in their doctoral theses; Cambridge doctorates evidently carry real weight). The UKF and CI each solve a more general problem that the KF does not solve, or does not even consider: the UKF handles nonlinear filtering, and CI handles fusion under unknown correlation. The problems are real and substantial, both methods are about as simple as a method can get, and yet both are persuasive and practical (their citation counts say as much: how many filters ever reach citations in the tens of thousands?). If one had to pick a few geniuses of filtering and estimation after Kalman, these two would qualify. Looking back, both peaked right out of the gate; they have largely lived off these two results ever since, but that is enough, because very few works reach that level. J. K. Uhlmann in particular is an enigmatic figure: to this day he has published relatively little (mostly conference papers), yet every piece carries an idea, and he is also a professional filmmaker and sound engineer!

Key point: the standard CI only considers fusion in the single-target case, with ordinary Bayesian posteriors; it does not account for missed detections or false alarms.

Perhaps the only body of work that rivals the UKF/CI is finite set statistics (FISST), the new theory for multi-target tracking proposed by Mahler (he likewise published quietly at conferences from 1993 until around 2000, after which the influential papers kept coming). Compared with the extremely simple ideas behind the UKF and CI, FISST is grand and sweeping; the early work even seems tangled and hard to follow. Two kinds of genius, each with its own beauty.

PS: In 2000 Mahler extended CI to the multi-target random-set framework, obtaining GCI (generalized CI). It has now (some thirteen years after being proposed) also become popular, thanks to FISST and the rapid development of (wireless) sensor networks; a relay between two grand masters, as it were.

My view, however, is that going from a single-target Bayesian posterior to multiple targets with false alarms and missed detections cannot be a simple extension! It is not merely a mathematical derivation from a single-target PDF/density to a multitarget density. There are missed detections and false alarms, and there are interactions between different targets; these are genuinely new problems. A very simple question: what is the result of fusing information about target A with information about target B? What do you get when you fuse a target's information with a clutter point? And so on. Even if CI itself is beyond reproach, GCI leaves a gap in physical meaning that has not been filled; it has been glossed over by assumption. Going from 0 to 1 is hard and admirable, but going from 1 to N is not necessarily simple either, and may even raise harder problems.

Simply completing the mathematical derivation and extension, "treating the multi-target posterior as a single-target posterior" and only enlarging the domain of definition, does not make the step from single-target CI to multi-target GCI reliable. It produces things with no physical meaning (what does fusing the information of two different targets yield?) or phenomena that cannot be explained physically (closely spaced targets get fused into one big target). The theoretical starting point is not wrong, and neither are the mathematical derivation and computation, but one still has to explain what the computed quantity is. Engineering needs mathematical decoration, but it also needs physical support. GCI may indeed work, and may even perform excellently in some scenarios; that does not mean there is no problem, because in other scenarios it simply does not work and the fused result is worse than no fusion at all, which confirms that the problem is real. Yet this has not stopped those who love playing with mathematics from playing on, stubbornly ignoring these questions of physical mechanism and meaning. Perhaps the proposers of CI would say: we never considered multiple targets, missed detections or false alarms, so the problem is not ours. And the proposer of GCI would say: it was just a short paper to promote my FISST, riding the hot topic of the time and extending CI a bit. God naps sometimes, and knowledge passed along the human chain easily drifts and changes flavor (although sometimes it changes for the better, and collective effort achieves more than originally envisioned).

The paper below studies the questions above through intuitive, concrete case studies and basic, targeted statistical analysis.

Second Order Statistics Analysis and Comparison between Arithmetic and Geometric Average Fusion
https://www.sciencedirect.com/science/article/pii/S1566253518308303
Tiancheng Li, Hongqi Fan, Jesús G. García, Juan M. Corchado (submitted on 23 Jan 2019)

Abstract: Two fundamental approaches to information averaging are based on linear and logarithmic combination, yielding the arithmetic average (AA) and geometric average (GA) of the fusing initials, respectively. In the context of multi-sensor target tracking, the two most common formats of data to be fused are random variables and probability density functions, namely v-fusion and f-fusion, respectively. In this work, we analyze and compare the second order statistics (including variance and mean square error) of AA and GA in terms of both v-fusion and f-fusion. The case of weighted Gaussian mixtures representing multitarget densities in the presence of false alarms and missed detections (whose weight sums are not necessarily unit) is also considered, the result of which turns out to be significantly different from that of a single target. In addition to exact derivation, exemplifying analyses and illustrations are also provided.
https://doi.org/10.1016/j.inffus.2019.02.009

Keywords: Multisensor fusion, Average consensus, Distributed tracking, Covariance intersection, Arithmetic mean, Geometric mean, Linear pool, Log-linear pool, Aggregation operator

Highlights:
Arithmetic averaging (AA) and geometric averaging (GA) are compared.
AA performs better in fusing variables while GA performs better in fusing PDFs.
Multitarget density fusion in the presence of false alarms and missed detections is studied.
A hybrid fusion rule combining AA and GA is proposed for multitarget density fusion.
GA is comparatively more accurate but less robust than AA.
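For readers unfamiliar with the two averaging rules discussed above, here is a minimal numerical sketch (my own, not taken from the paper): for two Gaussian estimates, geometric averaging of the densities reduces to the classic covariance-intersection rule, while arithmetic averaging of the densities yields a two-component mixture, summarized below by its moment-matched mean and covariance. The weight-selection rule (minimizing the trace of the fused covariance) is one common convention, not necessarily the paper's.

import numpy as np

def ci_fuse(x_a, P_a, x_b, P_b, w):
    # Geometric average of two Gaussian densities = covariance intersection (CI).
    info_a, info_b = np.linalg.inv(P_a), np.linalg.inv(P_b)
    P = np.linalg.inv(w * info_a + (1.0 - w) * info_b)
    x = P @ (w * info_a @ x_a + (1.0 - w) * info_b @ x_b)
    return x, P

def aa_fuse(x_a, P_a, x_b, P_b, w):
    # Arithmetic average of the two densities: a mixture, summarized by its
    # moment-matched mean and covariance; the spread terms keep it conservative.
    x = w * x_a + (1.0 - w) * x_b
    P = (w * (P_a + np.outer(x_a - x, x_a - x))
         + (1.0 - w) * (P_b + np.outer(x_b - x, x_b - x)))
    return x, P

def ci_best_weight(P_a, P_b, grid=np.linspace(0.01, 0.99, 99)):
    # One common convention: pick the CI weight minimizing the fused covariance trace.
    dim = P_a.shape[0]
    traces = [np.trace(ci_fuse(np.zeros(dim), P_a, np.zeros(dim), P_b, w)[1]) for w in grid]
    return grid[int(np.argmin(traces))]

x_a, P_a = np.array([1.0, 0.0]), np.diag([1.0, 4.0])
x_b, P_b = np.array([1.5, 0.2]), np.diag([3.0, 1.0])
w = ci_best_weight(P_a, P_b)
print("GA/CI:", ci_fuse(x_a, P_a, x_b, P_b, w))
print("AA:   ", aa_fuse(x_a, P_a, x_b, P_b, 0.5))

Note how the AA covariance contains a spread term that grows when the two estimates disagree, which is often cited as one source of its robustness relative to GA.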
Category: Research Notes | 9070 reads | 16 comments
A Robust Multi-Sensor PHD Filter Based on Multi-Sensor Measurement Clustering
Popularity 2 | JRoy | 2018-8-14 16:43
Paper: A Robust Multi-Sensor PHD Filter Based on Multi-Sensor Measurement Clustering
Authors: Tiancheng Li; Javier Prieto; Hongqi Fan; Juan M. Corchado
Published in: IEEE Communications Letters (Volume 22, Issue 10, Oct. 2018), pp. 2064-2067
Link: https://ieeexplore.ieee.org/document/8425712/

Abstract: This letter presents a novel multi-sensor probability hypothesis density (PHD) filter for multi-target tracking by means of multiple, or even massive numbers of, sensors that are linked by a fusion center or by a peer-to-peer network. The challenge we confront is that little is known about the statistical properties of the sensors in terms of their measurement noise, clutter, target detection probability and even potential cross-correlation. Our approach converts the collection of measurements from the different sensors into a set of proxy, homologous measurements. These synthetic measurements overcome the problems of false and missing data and of unknown statistics, and facilitate a linear PHD update that amounts to standard PHD filtering with no false or missing data. Simulations demonstrate the advantages and limitations of our approach in comparison with the cutting-edge multi-sensor/distributed PHD filters.

Citation: T. Li, J. Prieto, H. Fan and J. M. Corchado, "A Robust Multi-Sensor PHD Filter Based on Multi-Sensor Measurement Clustering," IEEE Communications Letters, vol. 22, no. 10, pp. 2064-2067, Oct. 2018. doi: 10.1109/LCOMM.2018.2863387
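To make the proxy-measurement idea concrete, here is a rough sketch (my own reading of the description above, not the authors' algorithm): pool the position measurements of all sensors, group those that fall close together, discard groups supported by too few sensors (clutter rarely repeats across sensors), and average each surviving group into one synthetic measurement for a standard PHD update. The clustering method (single-linkage with a distance gate) and the thresholds are assumptions made only for illustration.

import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def proxy_measurements(per_sensor_meas, gate=2.0, min_sensors=2):
    # per_sensor_meas: list over sensors, each an (n_s, 2) array of 2-D position
    # measurements (clutter included, detections possibly missing).
    pts, owner = [], []
    for s, Z in enumerate(per_sensor_meas):
        for z in np.atleast_2d(np.asarray(Z, dtype=float)):
            if z.size == 2:                      # skip empty measurement sets
                pts.append(z)
                owner.append(s)
    if len(pts) < 2:
        return np.empty((0, 2))
    pts, owner = np.asarray(pts), np.asarray(owner)
    # Single-linkage clustering with a distance gate groups nearby measurements.
    labels = fcluster(linkage(pts, method="single"), t=gate, criterion="distance")
    proxies = []
    for lab in np.unique(labels):
        idx = labels == lab
        if len(np.unique(owner[idx])) >= min_sensors:   # clutter rarely repeats across sensors
            proxies.append(pts[idx].mean(axis=0))        # one synthetic "proxy" measurement
    return np.asarray(proxies)

# Toy usage: three sensors observe two targets at (0, 0) and (10, 10); each
# sensor also returns one clutter point of its own.
rng = np.random.default_rng(1)
targets = np.array([[0.0, 0.0], [10.0, 10.0]])
sensors = [targets + 0.3 * rng.standard_normal((2, 2)) for _ in range(3)]
sensors = [np.vstack([Z, rng.uniform(20, 60, size=(1, 2))]) for Z in sensors]
print(proxy_measurements(sensors))   # roughly two rows, near (0, 0) and (10, 10)

A real implementation would also have to handle missed detections and sensors with different fields of view, which this toy version ignores.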
Category: Research Notes | 3136 reads | 2 comments
Cambridge, UK: FUSION 2018 Special Session on "Intelligent Signal Processing and Data Mining" for Tracking
Popularity 1 | JRoy | 2018-3-10 16:20
Submissions are welcome to the FUSION 2018 special session "Intelligent Signal Processing and Data Mining for Target Tracking". The conference will be held in beautiful Cambridge, UK, on 10-13 July 2018! A frontier at the intersection of information fusion, data mining and signal processing.

SS11 - Intelligent Information Fusion and Data Mining for Tracking

Research on intelligent systems for information fusion and data mining has matured during the past years, and many effective applications of this technology are now deployed, such as wearable computing, intelligent surveillance, smart city/home care, smart grid, web tracking, and network management. The rapid development of modern sensors and their application to distributed networks provide a foundation for new paradigms to combat the challenges that arise in target detection, tracking, trajectory forecasting and sensor fusion in harsh environments with poor prior information. For example, the advent of large-scale/massive sensor systems provides very informative observation, which facilitates novel perspectives based on data clustering and model learning to deal with false alarms and misdetection, given little knowledge about the objects, sensors and the background. Sensor data fitting and regression analysis provide another unlimited means to utilize unstructured context information, such as "the trajectory is smooth", for continuous-time trajectory estimation and forecasting. As such, the sensor community has an interest in novel information fusion and data mining methods coupled with traditional statistical techniques for substantial performance enhancement, especially for challenging problems that make traditional approaches inappropriate. This special session aims to assemble and disseminate information on recent, novel advances in intelligent systems and in information fusion and sensor data mining techniques and approaches, and to provide a forum for continued discussion of future developments. Both theoretical and practical approaches to the problems in this area are welcome.

IMPORTANT DEADLINES
Full paper submission deadline extended to 15 March 2018
Category: S&T News | 4796 reads | 4 comments
Distributed Network Information Sharing: Many Could Be Better Than All
Popularity 2 | JRoy | 2017-12-20 00:05
(Complex) networks involve a fundamental information-sharing problem: network nodes share and fuse information until they reach "consensus", i.e., network consensus. In particular, compared with a centralized network built around a central node, a distributed network communicates only through node-to-node links (connected nodes are called neighbors) and has no central node. Its structure is therefore more robust (the failure of a single node cannot paralyze the network) and easier to extend (all nodes are of the same nature, so any node can acquire new neighbors), and this is in fact the essential character of many physical networks, such as surveillance sensor networks and social networks.

In multi-sensor information fusion for multi-target tracking, however, there is an interesting finding: sharing more information among neighboring sensor nodes is not necessarily better for everyone, where "better" specifically means improving the estimation accuracy at the sensor nodes. At first sight this contradicts intuition, since in general more information should help.

So why? Physical sensors typically suffer from two kinds of problems: missed detections and false alarms. The former means the sensor fails to obtain measurements of a target, i.e., the missing-data problem. The latter means the sensor is subject to interference and obtains measurements that belong to no target, i.e., false signals, the false-data problem. Under these conditions, sharing more is not necessarily better, because some of the shared signals may be related to false data, which bring no benefit to the neighbors and may even mislead them.

This phenomenon may be called "Many Could Be Better Than All", or "less is more". It is in fact not rare; see, for example, cognitive science (Gigerenzer, G., Brighton, H., 2009. Homo heuristicus: Why biased minds make better inferences. Topics in Cognitive Science, 1(1):107-143) and neural networks (Zhi-Hua Zhou, Jianxin Wu, Wei Tang, Ensembling neural networks: Many could be better than all, Artificial Intelligence, vol. 137, issues 1-2, 2002, pp. 239-263).

Appropriately limiting the amount of information shared (more broadly: sharing only beneficial information while minimizing misleading or interfering information) therefore not only reduces communication cost, which clearly matters in practice and is often a major constraint of the network (especially since distributed sensor networks are usually built from low-powered sensors to cut communication and hardware costs), but can also lead to higher estimation accuracy.

The two papers below reveal this finding for multi-target detection and estimation with PHD filters implemented via Gaussian mixtures and via sequential Monte Carlo (particle filtering), respectively, and propose the notion of "partial consensus". They also offer some exploratory thoughts on random-set PHD consensus fusion, in particular clarifying and comparing the (simple yet overlooked) arithmetic average against the (currently mainstream) geometric average, and propose "real-time consensus", in which network communication in theory causes no delay to the filter and the two can, to some extent, run in parallel.

T. Li, J. M. Corchado and S. Sun, Partial Consensus and Conservative Fusion of Gaussian Mixtures for Distributed PHD Fusion, IEEE Trans. Aerosp. Electron. Syst., 2018, DOI: 10.1109/TAES.2018.2882960.
Link: https://ieeexplore.ieee.org/document/8543158

Partial Consensus and Conservative Fusion of Gaussian Mixtures for Distributed PHD Fusion
Tiancheng Li, Juan M. Corchado, Shudong Sun
Link: arXiv:1711.10783

We propose a novel consensus notion, called partial consensus, for distributed GM-PHD (Gaussian mixture probability hypothesis density) fusion based on a peer-to-peer (P2P) sensor network, in which only highly-weighted posterior Gaussian components (GCs) are disseminated in the P2P communication for fusion while the insignificant GCs are not involved. The partial consensus not only enjoys high efficiency in both network communication and local fusion computation, but also significantly reduces the effect of potential false data (clutter) on the filter, leading to an increased signal-to-noise ratio at the local sensors. Two conservative mixture reduction schemes are advocated for fusing the shared GCs in a fully distributed manner. One is given by pairwise averaging GCs between sensors based on Hungarian assignment, and the other merges close GCs based on a new GM merging scheme. The proposed approaches have a close connection to the conservative fusion approaches known as covariance union and arithmetic mean density. In parallel, average consensus is sought on the cardinality distribution (namely the GM weight sum) among the sensors. Simulations for tracking either a single target or multiple targets that appear simultaneously are presented based on a sensor network in which each sensor operates a GM-PHD filter, in order to compare our approaches with the benchmark generalized covariance intersection approach. The results demonstrate that the partial, arithmetic-average consensus outperforms the complete, geometric-average consensus.
The above paper was formally published in: IEEE Transactions on Aerospace and Electronic Systems, DOI: 10.1109/TAES.2018.2882960

Distributed SMC-PHD Fusion for Partial, Arithmetic Average Consensus
Tiancheng Li
Link: arXiv:1712.06128

We propose an average consensus approach for distributed SMC-PHD (sequential Monte Carlo probability hypothesis density) fusion, in which local filters extract Gaussian mixtures (GMs) from their respective particle posteriors, share them (iteratively) with their neighbors and finally use the disseminated GM to update the particle weights. There are two distinguishing features of our approach compared to existing approaches. First, a computationally efficient particles-to-GM (P2GM) conversion scheme is developed based on the unique structure of the SMC-PHD updater, in which the particle weight can be exactly decomposed with regard to the measurements and misdetection. Only significant components of higher weight are utilized for parameterization. The consensus, conditioned on partial information dissemination over the network, is called partial consensus. Second, importance sampling (IS) is employed to re-weight the local particles for integrating the received GM information, while the states of the particles remain unchanged. By this, the local prior PHD and likelihood calculation can be carried out in parallel with the dissemination/fusion procedure. To assess the effectiveness of the proposed P2GM parameterization approach and IS approach, two relevant yet new distributed SMC-PHD fusion protocols are introduced for comparison. One uses the same P2GM conversion and GM dissemination schemes as our approach, but local particles are regenerated from the disseminated GMs at each filtering iteration, in place of the IS approach. This performs similarly to our IS approach (as expected) but prevents the parallelization addressed above. The other disseminates the particles between neighbors, in place of the P2GM conversion. This avoids parameterization but is communicatively costly. The state-of-the-art exponential mixture density approach is also realized for comparison.
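As a concrete illustration of the partial-consensus idea in the two abstracts above, here is a toy sketch (my own simplification, not the papers' implementation): each sensor keeps only its highest-weight Gaussian components, and components from two neighbors are pairwise arithmetic-averaged after Hungarian association on their means. The thresholds, the moment-matched covariance, and the dropping of unmatched components are assumptions made for brevity.

import numpy as np
from scipy.optimize import linear_sum_assignment

def select_significant(weights, means, covs, keep=0.9):
    # Keep the fewest Gaussian components whose weights cover a `keep` fraction
    # of the total weight mass; only these would be disseminated to neighbors.
    order = np.argsort(weights)[::-1]
    csum = np.cumsum(weights[order]) / weights.sum()
    k = int(np.searchsorted(csum, keep)) + 1
    idx = order[:k]
    return weights[idx], means[idx], covs[idx]

def pairwise_aa_fuse(gm_a, gm_b):
    # Arithmetic-average fusion of components associated across two sensors by
    # Hungarian assignment on their means. Unmatched components (which a real
    # filter would retain) are simply dropped here for brevity.
    (w_a, m_a, P_a), (w_b, m_b, P_b) = gm_a, gm_b
    cost = np.linalg.norm(m_a[:, None, :] - m_b[None, :, :], axis=-1)
    ia, ib = linear_sum_assignment(cost)
    w = 0.5 * (w_a[ia] + w_b[ib])
    m = 0.5 * (m_a[ia] + m_b[ib])
    d_a, d_b = m_a[ia] - m, m_b[ib] - m
    # Moment-matched (conservative) covariance of the two-component average.
    P = 0.5 * (P_a[ia] + np.einsum('ni,nj->nij', d_a, d_a)
               + P_b[ib] + np.einsum('ni,nj->nij', d_b, d_b))
    return w, m, P

# Toy usage: two sensors hold 2-D GM-PHD posteriors over the same two targets.
gm_a = (np.array([0.9, 0.8, 0.05]),
        np.array([[0.0, 0.0], [10.0, 10.0], [40.0, 5.0]]),
        np.tile(np.eye(2), (3, 1, 1)))
gm_b = (np.array([0.85, 0.9]),
        np.array([[0.2, -0.1], [9.8, 10.3]]),
        np.tile(np.eye(2), (2, 1, 1)))
gm_a = select_significant(*gm_a, keep=0.95)   # the low-weight (clutter-like) GC is not shared
print(pairwise_aa_fuse(gm_a, gm_b))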
Category: Research Notes | 3339 reads | 4 comments
FUSION'17 Special Session: Sensor Data Mining for Tracking
Popularity 2 | JRoy | 2017-3-5 08:03
SS6: Sensor Data Mining For Tracking

Description: The rapid development of advanced sensors and their joint application provide a foundation for new paradigms to combat the challenges that arise in target detection, tracking and forecasting in harsh environments with poor prior information. As a consequence, the sensor community has expressed interest in novel data mining methods coupled with traditional statistical techniques for substantial performance enhancement. For example, the advent of multiple/massive sensor systems provides very rich observation at high frequency yet low financial cost, which facilitates novel perspectives based on data clustering and model learning to deal with false alarms and misdetection, given little statistical knowledge about the objects, sensors and the background. Numerical fitting and regression analysis provide another unlimited means to utilize unstructured context information, such as "the trajectory is smooth", for continuous-time target trajectory estimation. Incorporating additional, readily available information to constrain the adaptive response and to combat poor scenario knowledge has shown promise as a means of restoring sensor capability over a range of challenging operating conditions, as well as of dealing with a variety of challenging problems that make traditional approaches awkward. The purpose of this special session is to assemble and disseminate information on recent, novel advances in sensor signal and data mining techniques and approaches, and to promote a forum for continued discussion of future developments. Both theoretical and practical approaches in the area are welcome.

Organizers: Tiancheng Li (t.c.li@usal.es), Haibin Ling (hbling@temple.edu) and Genshe Chen (gchen@intfusiontech.com)

The topics of interest of this special session include, but are not limited to:
· Adaptive filtering
· Learning for state space models
· Manoeuvring target detection and tracking
· Object recognition/classification using sonar, radar, video, soft data sources, etc.
· Clustering approaches for tracking
· Regression analysis for trajectory estimation
· Multiple intelligent data association/fusion
· Machine learning technology for tracking

Submission link: http://www.fusion2017.org/submissions.html Submissions are welcome!

The 20th International Conference on Information Fusion (Fusion 2017) will be held in Xi'an, China during July 10-13, 2017.
Conference venue: Wyndham Grand Xian South
Video of Xi'an: http://www.fusion2017.org/video/Fusion2017_2.ogv
Category: Research Notes | 9969 reads | 3 comments
[Nisha Salon Notes: Revisiting Knowledge Graphs and Knowledge Acquisition]
Popularity 1 | liwei999 | 2016-1-3 19:31
Lei: What is the relationship between knowledge acquisition and knowledge graphs? Is the acquired knowledge represented as a graph?

Me: Knowledge acquisition usually refers specifically to ontological knowledge and has no direct relation to knowledge graphs. It is information extraction (IE) and mining (whose core is information fusion) that lead to knowledge graphs. This is not just my personal view; there is considerable consensus on it. With the explosive popularity of the term "knowledge graph", some people may now also throw ontology acquisition into that basket; even so, it is important to distinguish the two kinds of knowledge within the basket. An ontology is meta-knowledge about essences, with considerable stability and even permanence. The fluid relations and events that exist and occur in the world are not ontological meta-knowledge; they are intelligence. An ontology has no intelligence value. The core techniques also differ: ontologies rely mainly on clustering, which machine learning is good at, whereas graphs rely mainly on extraction, for which parsing is the nuclear weapon. Of course, parsing can also improve clustering quality; that was the main thrust of Dekang Lin's work back in the day.

Lei: An ontology is the computer version of a specialist dictionary, and a dictionary is not dynamic knowledge.

Me: Right. The extension of a dictionary is an encyclopedia; an encyclopedia can serve intelligence work, but it is not itself intelligence. (Of course, Wikipedia nowadays also includes dynamic news events, updated in real time by volunteers, which begins to blur the boundary between the two kinds of knowledge.) The relations, events or evaluations involving a specific entity are the intelligence, expressed as a graph and stored in a database. Ontology acquisition can mostly be done with keywords plus a few individual patterns. Dekang Lin went a step deeper: supported by a parser he wrote himself, he did ontological clustering on top of parsed structures.

A: Events that have already happened are not intelligence.

Me: Then you are talking about intelligence in a different sense. Intelligence in the broad sense does include events that have happened, because a past event still has intelligence value for those who have not heard of it, especially in the big-data era, when many past events are drowned in the ocean of data and need extraction and mining before they can serve a customer's intelligence needs.

Lei: An ontology is a framework for representing knowledge. Once the classes are defined, the program does the instantiation.

Me: That program is a retrieval-like tool; Cyc's program does inference in addition to retrieval.

Lei: In object-oriented computing, Microsoft Windows is the classic example. Windows is defined as a set of classes, and every Windows-based system is an unfolding, an instantiation, of those classes.

Me: Naturally. An ontology is itself a hierarchy of concept classes. The basic unit of a graph is the individual, whereas the basic unit of an ontology is the concept class. Because it deals with classes, an ontology carries no intelligence; for individuals, the intelligence value becomes apparent. Only when ontological knowledge supports intelligence mining does a class get instantiated, with the instance inheriting the class's taxonomy: for example, when the class Product is instantiated as iPhone 6S, the system inherits its hypernyms Product -- PhysicalObject -- Concrete -- Entity. Most mining, however, gets by without such ontological support.

Lei: Person is a class; Liwei is an instantiation of Person.

Me: Sure. Extracting and mining all the relations and deeds of this fellow Liwei forms one node of a knowledge graph, but that has nothing to do with acquiring ontological knowledge about Person. That is extraction, not acquisition. I discussed this recently; see the post 《泥沙龙笔记:知识习得对本体知识,信息抽取对知识图谱》.

Lei: How is it unrelated? Liwei's various deeds are all attributes of Person. Ontological knowledge is static; it is used to represent knowledge.

Me: The relation is this: acquisition does not need IE's support, while IE can use acquired knowledge for a little support or simply bypass it.

Me: That part is true. IE is dynamic.

Me: Nobody induces an ontology by generalizing from a knowledge graph; that is not how it is done. Although in theory one can say "Liwei's deeds are all attributes of Person", learning an ontology from such extracted individual deeds is not the right path: it is ineffective and unnecessary. The two kinds of knowledge sit at different levels and use different methods, and should not be conflated.

Lei: To be concrete, an ontology can be represented in RDF, and the knowledge is also represented in RDF.

Me: Ontology building is painstaking work with academic value but little direct practical value; it takes saints, semantic masters like Lenat and Mr. Dong (see 《语义三巨人》). Graphs are different: graphs serve applications directly and are the necessary backbone of knowledge products.

Lei: A graph is the instantiation of an ontology.

Me: Statements like that are universal truths that mean little; they offer no practical guidance and can easily mislead. One possible misreading is that once the ontological knowledge has been acquired, you can instantiate on top of it and obtain a graph. It does not work that way. First, extraction and mining for graphs can mostly bypass the ontology (bypassing includes reducing the elaborate ontology hierarchy to a handful of features). Second, from ontology to graph is still a very long road, not the seemingly simple step of "instantiation". Knowing that Liwei is a linguist, together with the taxonomy (person -- living thing -- object -- entity), still does not let you "instantiate" Liwei's graph without concrete extraction and mining. Ontological knowledge (including common sense) is, on the whole, neither necessary nor sufficient for graph work.

Peng: Thanks, Wei. I used to understand knowledge graphs only halfway; now I am up to two thirds.

Me: As an example of a graph, think of a résumé. A database of countless interlinked résumés is a knowledge graph of job seekers. See the post 《知识图谱的先行:从 Julian Hill 说起》.

Peng: Thanks. I regularly go from the quiet delight of knowing a thing or two to the embarrassment of knowing only a thing or two.

Lei: On the surface there is no ontology, but you have been following the ontology in your mind.

Hong (in verse): Long accumulation, timely release; Elder Wei needs no blue pill. Talking to himself or chatting with the group each day, he turns it straight into piping-hot blog posts.

Me: @Elder Hong, crude words, but the reasoning is sound.

【Related】
《泥沙龙笔记:知识习得对本体知识,信息抽取对知识图谱》
【新智元笔记:深度 parsing 的逻辑化】
《知识图谱的先行:从 Julian Hill 说起》
《语义三巨人》
【立委科普:信息抽取】
【立委科普:自然语言理解当然是文法为主,常识为辅】
【置顶:立委科学网博客NLP博文一览(定期更新版)】
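To make the class-versus-instance distinction in the discussion above concrete, here is a toy sketch (my own illustration, not from the discussion): the ontology holds stable class-level taxonomy facts, while the knowledge graph holds extracted instance-level facts, and an instance merely inherits its class's hypernym chain (the Product / iPhone 6S example mentioned above). The triples and names are made up for illustration.

# Ontology: class-level, stable "meta-knowledge" (a toy taxonomy).
ontology = {
    "Product":        "PhysicalObject",
    "PhysicalObject": "Concrete",
    "Concrete":       "Entity",
    "Person":         "LivingThing",
    "LivingThing":    "Object",
    "Object":         "Entity",
}

# Knowledge graph: instance-level, extracted "intelligence" about individuals.
knowledge_graph = [
    ("iPhone 6S", "instance_of", "Product"),
    ("iPhone 6S", "released_by", "Apple"),
    ("liwei999",  "instance_of", "Person"),
    ("liwei999",  "profession",  "linguist"),
]

def hypernym_chain(cls, taxonomy):
    # Walk up the ontology from a class to the root.
    chain = [cls]
    while cls in taxonomy:
        cls = taxonomy[cls]
        chain.append(cls)
    return chain

# An instance inherits its class's taxonomy, but the graph facts themselves
# (relations, events) come from extraction, not from the ontology.
for subj, pred, obj in knowledge_graph:
    if pred == "instance_of":
        print(subj, "->", " -- ".join(hypernym_chain(obj, ontology)))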
Category: Liwei's Popular Science | 6698 reads | 1 comment
UIUC Graduate School of Library and Information Science (GSLIS) Seminar: From Information Disagreement to Information Fusion
Popularity 1 | terahertz | 2015-2-14 06:55
At 4 pm local time on 12 February 2015, in Room 126 of the Graduate School of Library and Information Science at the University of Illinois at Urbana-Champaign, Dr. Jodi Schneider, visiting from France, gave a talk entitled "From Information Disagreement to Information Fusion".

Making sense of large amounts of information is extremely challenging. Turning information into knowledge requires resolving disputes and disagreements within the information, which calls for new fusion methods that combine machine and human intelligence. On Wikipedia, decisions are usually made through open online discussion. Every week the English Wikipedia holds roughly 500 discussions, each involving from 2 to 200 people, on whether a given topic should be deleted from the encyclopedia. The speaker presented new methods for supporting online sense-making, applied to such discussions, and demonstrated a new, reconfigurable web interface for conducting these discussions more meaningfully.

About the speaker: Jodi Schneider received her master's degree in library and information science from the University of Illinois at Urbana-Champaign and previously worked as a librarian at Amherst College and Appalachian State University. Her main contributions include helping to found the Code4Lib Journal and contributing to the W3C Library Linked Data Incubator Group Final Report, which has French, Spanish, Japanese and Chinese versions. She received her PhD in informatics from the National University of Ireland in 2014 and is currently a postdoctoral researcher at Inria, the French national institute for research in computer science and automation. Her research papers have appeared in Semantic Web and Biomedical Semantics and at academic conferences such as ACM Computer-Supported Cooperative Work. Her doctoral thesis used Semantic Web technology to organize discussions of information quality on Wikipedia. She is working with pharmacists to develop an intelligent crowdsourcing system for improving sources of clinical drug information.
Category: Studying Abroad | 3372 reads | 2 comments
The 14th International Conference on Information Fusion
huangfuqiang 2011-6-5 16:26
Topics of interest include (but are not limited to) the following:

1. Theory and Representation: Probability theory; Bayesian inference; Fuzzy sets and fuzzy logic; Dempster-Shafer, evidential reasoning, belief functions, logic-based fusion and preference aggregation; Random sets, finite set statistics; Topic modeling
2. Algorithms: Registration; Detection, localization and signal processing; Automatic target recognition and classification; Nonlinear filtering, tracking and data association; Automated situation assessment, prediction, pattern and behavioral analysis; Distributed fusion; Process and sensor resource management
3. Solution Paradigms: Sequential inference; Data mining; Graph analysis; Machine learning; Ontologies
4. Data-Specific Processing and Fusion: Image and video; Radar; Passive sensors; Soft data sources
5. Modeling, Simulation, and Evaluation: Target and sensor modeling; Benchmarks; Testbeds; Fusion performance modeling and evaluation
6. Applications: Aided fusion; Sensor networks; Persistent surveillance; Defense and intelligence; Security; Robotics; Transportation and logistics; Manufacturing; Economics and finance; Environmental monitoring; Medical care

Website: http://www.fusion2011.org/
Website of the International Society of Information Fusion
Category: IoT Engineering | 3448 reads | 0 comments
Research on Query Fusion Methods for Metasearch Engines (Thesis Abstract)
doublejing 2011-3-27 19:19
This is DJ's master's thesis, 《元搜索引擎提问融合方法研究》 (Research on Query Fusion Methods for Metasearch Engines); the abstract is posted here for exchange. Since the symbols cannot be displayed in the post body, please see the attachment: 摘要.doc (abstract).
2109 reads | 0 comments
