Titlebook

Thread starter: metabolism
Posted on 2025-3-28 17:37:08
Using Entities in Knowledge Graph Hierarchies to Classify Sensitive Information
… to the public. However, automatically classifying sensitive information is difficult, since sensitivity is often due to contextual knowledge that must be inferred from the text. For example, the mention of a specific named entity is unlikely to provide enough context to automatically know if the in…
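The snippet above only gestures at the idea; as a rough, hypothetical sketch (the hierarchy, example documents, and labels below are invented for illustration, not taken from the paper), one way to use knowledge-graph hierarchies is to expand each document with the ancestor categories of the entities it mentions and then train an ordinary text classifier on the enriched text:

```python
# Hypothetical sketch: enrich documents with knowledge-graph ancestor categories
# of the entities they mention, then train a standard text classifier.
# The hierarchy, documents, and labels are illustrative, not from the paper.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy hierarchy: entity -> chain of ancestor categories.
KG_ANCESTORS = {
    "MI6": ["intelligence_agency", "government_body", "organisation"],
    "Acme Corp": ["company", "organisation"],
}

def expand_with_ancestors(text):
    """Append ancestor-category tokens for every known entity found in the text."""
    extra = []
    for entity, ancestors in KG_ANCESTORS.items():
        if entity.lower() in text.lower():
            extra.extend(ancestors)
    return text + " " + " ".join(extra)

docs = [
    "Internal memo naming an MI6 source in the region.",
    "Acme Corp quarterly earnings press release.",
]
labels = ["sensitive", "public"]  # toy labels

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit([expand_with_ancestors(d) for d in docs], labels)
print(clf.predict([expand_with_ancestors("Briefing on an MI6 operation.")]))
```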
Posted on 2025-3-29 03:07:55
Query Expansion, Argument Mining and Document Scoring for an Efficient Question Answering System
… comparative question by retrieving documents based only on traditional measures (such as TF-IDF and BM25) does not always satisfy the need. In this paper, we propose a multi-layer architecture to answer comparative questions based on arguments. Our approach consists of a pipeline of query expansion, …
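For readers unfamiliar with the "traditional measures" the abstract contrasts against, here is a compact BM25 scorer (the standard formula with common default parameters k1=1.5 and b=0.75; the whitespace tokenizer and toy documents are illustrative only, not part of the paper's pipeline):

```python
# Compact BM25 scorer over a small in-memory collection.
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.5, b=0.75):
    """Score each document in `docs` against `query` (both plain strings)."""
    tokenized = [d.lower().split() for d in docs]
    avgdl = sum(len(d) for d in tokenized) / len(tokenized)
    n = len(docs)
    df = Counter(t for d in tokenized for t in set(d))  # document frequencies
    scores = []
    for doc in tokenized:
        tf = Counter(doc)
        score = 0.0
        for term in query.lower().split():
            if term not in tf:
                continue
            idf = math.log((n - df[term] + 0.5) / (df[term] + 0.5) + 1)
            score += idf * tf[term] * (k1 + 1) / (
                tf[term] + k1 * (1 - b + b * len(doc) / avgdl)
            )
        scores.append(score)
    return scores

docs = ["python is better than java for scripting",
        "java is faster than python in many benchmarks"]
print(bm25_scores("python vs java speed", docs))
```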
Posted on 2025-3-29 09:55:56
Transformer-Encoder-Based Mathematical Information Retrieval
… retrieval systems should not only be able to process natural language, but also mathematical and scientific notation to retrieve documents. In this work, we evaluate two transformer-encoder-based approaches on a Question Answer retrieval task. Our pre-trained ALBERT-model demonstrated competitive perfo…
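A minimal sketch of transformer-encoder retrieval in the spirit of this abstract, assuming the public albert-base-v2 checkpoint with simple mean pooling and cosine similarity (the paper's actual model, pooling strategy, and training data are not specified here):

```python
# Rough dense-retrieval sketch with a pre-trained ALBERT encoder:
# embed query and documents via mean pooling, rank by cosine similarity.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("albert-base-v2")
model = AutoModel.from_pretrained("albert-base-v2")

def embed(texts):
    """Mean-pool the last hidden states into one vector per input text."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state      # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)        # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)         # (B, H)

docs = ["The derivative of sin(x) is cos(x).",
        "BM25 is a lexical ranking function."]
query_vec = embed(["what is the derivative of sin x"])
doc_vecs = embed(docs)
sims = torch.nn.functional.cosine_similarity(query_vec, doc_vecs)
print(docs[int(sims.argmax())])
```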
Posted on 2025-3-29 22:42:11
Tracking News Stories in Short Messages in the Era of Infodemic
… [.]), its impact on the results and why it is key to this type of work. We used a supervised algorithm proposed by Miranda et al. [.] and K-Means to provide evaluations for different use cases. We found that TF-IDF vectors are not always the best ones to group documents, and that algorithms are sens…
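As a point of reference for the TF-IDF baseline the abstract questions, a minimal clustering sketch with scikit-learn (the messages and cluster count are made up, and the supervised algorithm of Miranda et al. is not reproduced here):

```python
# Minimal TF-IDF + K-Means baseline: vectorise short messages and
# group them into story clusters. Data and cluster count are illustrative.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

messages = [
    "New variant detected in several countries",
    "Health agency comments on the new variant",
    "Stock markets rally after rate decision",
    "Central bank holds interest rates steady",
]
X = TfidfVectorizer(stop_words="english").fit_transform(messages)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
for msg, label in zip(messages, km.labels_):
    print(label, msg)
```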
Posted on 2025-3-30 05:52:18
Rhythmic and Psycholinguistic Features for Authorship Tasks in the Spanish Parliament: Evaluation an…
… obtained by a BETO transformer, when the latter is trained on the original text, i.e., potentially learning from topical information. Moreover, we further investigate the results for the different authors, showing that variations in performance are partially explainable in terms of the authors’ poli…
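Purely as an illustration of the kind of topic-agnostic stylometric features the abstract alludes to (the specific features, speeches, and labels below are hypothetical, not the paper's feature set), one could feed a few surface statistics to a linear classifier instead of fine-tuning BETO on raw text:

```python
# Illustrative stylometric features for authorship attribution,
# contrasted with topical text classification. All examples are toy data.
import numpy as np
from sklearn.linear_model import LogisticRegression

def style_features(speech):
    """A few surface statistics that do not depend on topical vocabulary."""
    words = speech.split()
    sentences = [s for s in speech.split(".") if s.strip()]
    return [
        float(np.mean([len(w) for w in words])),                        # mean word length
        len(words) / max(len(sentences), 1),                            # mean sentence length
        sum(w.endswith(("ción", "dad")) for w in words) / len(words),   # crude suffix rate
    ]

speeches = ["La situación de la nación exige responsabilidad y unidad.",
            "Propongo que votemos la enmienda. Es breve. Es clara."]
authors = ["A", "B"]  # toy author labels

clf = LogisticRegression().fit([style_features(s) for s in speeches], authors)
print(clf.predict([style_features("La realidad de la sociedad es otra.")]))
```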