Cultivate posted on 2025-3-23 10:45:12

Introduction, … anyone doing Natural Language Processing (NLP). Representing words as vectors rather than as discrete variables, at least in theory, enables generalization across syntactically or semantically similar words; and easy-to-implement, easy-to-train word embedding algorithms (Mikolov et al., 2013a; Pennington …
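The excerpt's claim that vector representations enable generalization across similar words can be illustrated with a minimal sketch: nearby vectors stand in for related words. The three-dimensional vectors below are toy values chosen for illustration, not output of any trained model from the book.

```python
import math

def cosine(u, v):
    """Cosine similarity between two dense word vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 3-dimensional embeddings (illustrative values only).
vectors = {
    "cat": [0.9, 0.1, 0.0],
    "dog": [0.8, 0.2, 0.1],
    "car": [0.0, 0.1, 0.9],
}

# Semantically similar words end up closer in vector space,
# so a model trained on "cat" contexts can generalize to "dog".
print(cosine(vectors["cat"], vectors["dog"]))  # high (~0.98)
print(cosine(vectors["cat"], vectors["car"]))  # low  (~0.01)
```

With discrete (one-hot) variables, every pair of distinct words would be equally dissimilar; the graded similarity above is exactly what the dense representation adds.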

TEM posted on 2025-3-23 15:01:47

Cross-Lingual Word Embedding Models: Typology, … of alignment required for supervision, and the comparability these alignments encode. Unsupervised approaches are discussed in Chapter 9, where we show that they are very similar to supervised approaches, the only core difference being how they obtain and gradually enrich the required bilingual …

带来墨水 posted on 2025-3-23 20:45:59

A Brief History of Cross-Lingual Word Representations, … might seem like a novel phenomenon in representation learning, many of the high-level ideas that motivate current research in this area can be found in work that pre-dates the popular introduction of word embeddings inspired by neural networks. This includes work on learning cross-lingual clusters and …

audiologist posted on 2025-3-24 03:10:35

From Bilingual to Multilingual Training, … (2017) and Duong et al. (2017) demonstrate that there are clear benefits to including more languages, moving from bilingual to multilingual settings, in which the vocabularies of more than two languages are represented.

量被毁坏 posted on 2025-3-24 08:53:51

Unsupervised Learning of Cross-Lingual Word Embeddings, … than what was previously assumed. Vulić and Moens (2016) were the first to show this, and Artetxe et al. (2017), Smith et al. (2017), and Søgaard et al. (2018) explored using even weaker supervision signals, including numerals and words that are identical across languages. Several authors have recently proposed …
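The weak supervision signals the excerpt mentions, numerals and identically spelled words, can be turned into a seed translation lexicon with almost no machinery. The sketch below is a minimal illustration of that idea, not the procedure of any specific cited paper; the toy English/German vocabularies are invented for the example.

```python
def seed_dictionary(vocab_src, vocab_tgt):
    """Induce a weak-supervision seed lexicon from two monolingual
    vocabularies: strings that occur in both (identically spelled
    words and numerals) are assumed to translate to themselves."""
    shared = set(vocab_src) & set(vocab_tgt)
    # Keep alphabetic words and numeral strings as (source, target) pairs.
    return sorted((w, w) for w in shared if w.isalpha() or w.isdigit())

# Toy English/German vocabularies (illustrative, not from real corpora).
en = ["the", "computer", "1984", "hand", "house"]
de = ["der", "computer", "1984", "hand", "haus"]
print(seed_dictionary(en, de))
# [('1984', '1984'), ('computer', 'computer'), ('hand', 'hand')]
```

Such a seed lexicon can then bootstrap an alignment between the two embedding spaces, which is iteratively enriched, matching the excerpt's point that the supervision signal is gradually expanded rather than given up front.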

招惹 posted on 2025-3-24 10:47:54

Useful Data and Software, … by providing more and more directly usable code in readily accessible online repositories. In what follows, we provide a (non-exhaustive) list of links to online material that can offer hands-on support to NLP practitioners entering this vibrant field.

Exhilarate posted on 2025-3-24 16:21:19

General Challenges and Future Directions, … demonstrated the similarity of many of these models. It provided proofs that connect different word-level embedding models and described ways to evaluate cross-lingual word embeddings, as well as how to extend them to the multilingual setting. Below we outline existing challenges and possible future …

ligature posted on 2025-3-24 22:46:05

Cross-Lingual Word Embeddings. ISBN 978-3-031-02171-8. Series ISSN 1947-4040; Series E-ISSN 1947-4059.

View full version: Titlebook: Cross-Lingual Word Embeddings; Anders Søgaard, Ivan Vulić, Manaal Faruqui. Book, 2019. Springer Nature Switzerland AG, 2019.