Introduction (English writing notes)


  1. In recent years, recommender systems, which help users discover items of interest from a large resource collection, have been playing an increasingly important role in various online services.
    In recent years, ×××× have been playing an increasingly important role in various ××××; they help users ××××.

  2. Convolutional Neural Networks (CNNs) have been successfully applied to tackle problems such as image classification, semantic segmentation, or machine translation.
    ×××× have been successfully applied to problems such as ××××, ××××, or ××××.

  3. Nevertheless (= however)

  4. The success of recommender systems has made them prevalent in Web applications, ranging from search engines and E-commerce to social media sites and news portals.

  5. without exaggeration

  6. Entity alignment is the task of finding entities from different knowledge graphs (KGs) that refer to the same real-world identity.
    ×××× is the task of ××××. (usually used as the opening sentence)

  7. Recently, increasing attention has been paid to the utilization of KG representation learning rather than symbolic formalism for tackling this task.
    Recently, using ×××× to tackle this problem has received increasing attention.

  8. However, existing GNN-based entity alignment models still face a critical problem.
    However, existing ×××× models still face a critical problem.

  9. Most existing approaches to generating node embeddings are inherently transductive. The majority of these approaches directly optimize the embeddings for each node using matrix-factorization-based objectives, and do not naturally generalize to unseen data, since they make predictions on nodes in a single, fixed graph [×].
    Most existing ×××× approaches are inherently ××××. The majority of them directly use ××××, and do not ××××, since ×××× [×].

  10. The challenge of resolving this issue lies in the difficulty of fully mitigating the non-isomorphism in the neighborhood structures of counterpart entities from different KGs.
    The challenge of resolving this issue lies in ××××.

  11. Inspired by this recent work, we introduce an attention-based architecture to perform node classification of graph-structured data. The idea is to compute the hidden representations of each node in the graph, by attending over its neighbors, following a self-attention strategy. The attention architecture has several interesting properties:
    Inspired by this recent work, we introduce ××××. The ×××× architecture has several interesting properties: ××××.
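Phrase 11 actually describes a concrete mechanism: computing each node's hidden representation by attending over its neighbors with a shared self-attention. As a reading aid, here is a minimal NumPy sketch of that idea; the function name `attend_over_neighbors`, the LeakyReLU slope, and all shapes are illustrative assumptions, not the cited paper's implementation:

```python
import numpy as np

def attend_over_neighbors(h, adj, W, a):
    """GAT-style layer sketch: each node's new representation is a
    weighted sum of its (projected) neighbors, with weights produced
    by a shared self-attention mechanism."""
    z = h @ W                                   # project node features
    n = z.shape[0]
    out = np.zeros_like(z)
    for i in range(n):
        nbrs = np.where(adj[i] > 0)[0]          # neighbors (incl. self-loop)
        # attention logits: LeakyReLU(a^T [z_i || z_j])
        e = np.array([np.concatenate([z[i], z[j]]) @ a for j in nbrs])
        e = np.where(e > 0, e, 0.2 * e)         # LeakyReLU, slope 0.2
        alpha = np.exp(e - e.max())
        alpha /= alpha.sum()                    # softmax over the neighborhood
        out[i] = alpha @ z[nbrs]                # attention-weighted aggregation
    return out
```

A node whose only neighbor is itself simply keeps its projected feature, since its single attention weight is 1 after the softmax.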

  12. Motivated by the fact that the semantically-related information can appear in both direct and distant neighbors of counterpart entities, we propose the KG alignment network AliNet which aggregates both direct and distant neighborhood information.
    Motivated by the fact that ××××, we propose a ×××× method, named ××××.

  13. A recent research trend in deep learning is the attention mechanism, which deals with variable-sized data and encourages the model to focus on the most salient parts of the data.

  14. (attention mechanism) It has demonstrated its effectiveness in the deep neural network framework and is widely applied to various applications, such as text analysis [×], knowledge graphs [×], and image processing [×].

  15. Despite the success of the attention mechanism in deep learning, it has not been considered in the graph neural network framework for heterogeneous graphs.
    Despite the success of ×××× in deep learning, it has not been considered in the graph neural network framework for ××××.

  16. Although HIN-based methods have achieved performance improvement to some extent, there are two major problems for these methods using meta-path based similarities.
    Although ×××× methods have improved performance to some extent, these ×××× methods suffer from two major problems.

  17. However, current techniques fail to satisfactorily define and optimize a reasonable objective required for scalable unsupervised feature learning in networks.
    However, current techniques fail to satisfactorily define and optimize a reasonable objective required for ×××× in networks.

  18. In this paper, we propose a novel Heterogeneous graph Attention Network, named HAN, which considers both node-level and semantic-level attention. In particular, given the node features as input, we use a type-specific transformation matrix to project different types of node features into the same space. Then the node-level attention learns the attention values between the nodes and their meta-path based neighbors, while the semantic-level attention learns the attention values of different meta-paths for the specific task in the heterogeneous graph.
    In this paper, we propose ××××, which considers both ××××. In particular, given ××××, we use ××××. Then, ××××.
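Phrase 18 walks through three concrete steps (type-specific projection, node-level attention, semantic-level attention), so a toy NumPy sketch may make the sentence structure easier to parse. Everything here is a simplified assumption: node-level attention is replaced by plain neighbor averaging for brevity, and `q` stands in for the learned semantic attention vector; this is not HAN's actual implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def han_sketch(features_by_type, M_by_type, metapath_adjs, q):
    """HAN-style pipeline sketch with hypothetical shapes."""
    # 1) type-specific projection: each node type has its own matrix M
    #    mapping its features into a shared space
    h = np.vstack([features_by_type[t] @ M_by_type[t]
                   for t in sorted(features_by_type)])
    # 2) per-meta-path aggregation over meta-path neighbors
    #    (uniform averaging here, standing in for node-level attention)
    z = [(A / A.sum(1, keepdims=True)) @ h for A in metapath_adjs]
    # 3) semantic-level attention: score each meta-path embedding with q
    #    and fuse them with softmax weights
    w = softmax(np.array([(zp @ q).mean() for zp in z]))
    return sum(wi * zp for wi, zp in zip(w, z))
```

The three numbered comments line up one-to-one with the three sentences of the phrase, which is the main point of the sketch.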

  19. The contributions of our work are summarized as follows:

  20. Overall, our paper makes the following contributions:

  21. (service discovery) An example Web service network among Mashups, APIs, and tags. For simplicity, the annotation relationships between Mashups and tags are not shown for the case where Mashups and APIs share the tag space. The composition relationships characterize the functional dependencies between Mashups and APIs, while the annotation relationships characterize the functional similarities between Mashups and APIs.

  22. The last paragraph of the Introduction:
    In the rest of this paper, we first review the related work in Section 2, and introduce ×××××××× problem definition and analysis in Section 3. We present the proposed ×××× framework in Section 4 and show experiment results in Section 5. Finally, we conclude the paper in Section 6.

  23. (future work) As future work, we would like to utilize better attention mechanisms to fuse aspect-level latent factors. In addition, we can explore the strategy of automatic selection of meta-paths in different datasets.
    As future work, we would like to utilize better ×××× to ××××. In addition, we can explore ××××.
