论文里的好句子(7)

1.

However, intricate temporal patterns of the long-term future prohibit the model from finding reliable dependencies.

然而,长期未来的复杂时间模式使得模型无法找到可靠的依赖关系。

2.

First, it is unreliable to discover the temporal dependencies directly from the long-term time series because the dependencies can be obscured by entangled temporal patterns.

首先,直接从长期时间序列中发现时间相关性是不可靠的,因为相关性可能被纠缠的时间模式所掩盖。

3.

While performance is significantly improved, these models still utilize the point-wise representation aggregation.

虽然性能得到了显著提高,但这些模型仍然使用逐点表示聚合。

4.

This common usage limits the capabilities of decomposition and overlooks the potential future interactions among decomposed components.

这种常见的用法限制了分解的能力,并忽略了分解组件之间未来可能的交互。

5.

empower the deep forecasting models with immanent capacity of progressive

decomposition

使深度预测模型具有内在的渐进分解能力。

6.

Benefiting from this, we try to take advantage of the series periodicity to renovate the point-wise connection in self-attention.

我们试图利用系列周期性来更新自我注意中的点式连接。

7.

the immense importance of time series forecasting

时间序列预测的巨大重要性

8.

least favorable distribution

最不利分布 9.

2022 NIPS

Generating multivariate time series with Common Source CoordInated GAN (COSCI-GAN)

用COSCI-GAN生成多元时间序列

Generating multivariate time series is a promising approach for sharing sensitive data in many medical, financial, and IoT applications.

生成多元时间序列是一种很有前途的方法,可用于在许多医疗、金融和物联网应用中共享敏感数据。

A common type of multivariate time series originates from a single source such as the biometric measurements from a medical patient.

This leads to complex dynamical patterns between individual time series that are hard to learn by typical generation models such as GANs.

常见类型的多变量时间序列源自单个源,诸如来自医疗患者的生物测定测量。这导致了个体时间序列之间的复杂动态模式,这很难通过典型的生成模型(如GAN)来学习。

There is valuable information in those patterns that machine learning models can use to better classify, predict or perform other downstream tasks.

这些模式中包含有价值的信息,机器学习模型可以使用这些信息更好地进行分类、预测或执行其他下游任务。

We propose a novel framework that takes time series‘ common origin into account and favors channel/feature relationships preservation.

提出了一种新的时间序列分析框架,该框架考虑了时间序列的共同起源,并支持通道/特征关系的保持.

The two key points of our method are:

1) the individual time series are generated from a common point in latent space and

2) a central discriminator favors the preservation of inter-channel/feature dynamics.

1)从本征空间中的公共点生成各个时间序列,

以及2)中央鉴别器有利于保持声道/特征间的动态。

We demonstrate empirically that our method helps preserve channel/feature correlations and that our synthetic data performs very well in downstream tasks with medical and financial data.

我们通过经验证明,我们的方法有助于保持通道/特征相关性,并且我们的合成数据在下游任务中与医疗和金融数据一起表现得非常好。

10.

2022 NIPS

Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting

非静止变压器:时间序列预测中平稳性的探讨

Transformers have shown great power in time series forecasting due to their global-range modeling ability.

However, their performance can degenerate terribly on non-stationary real-world data in which the joint distribution changes over time.

然而,在联合分布随时间变化的非平稳真实世界数据上,它们的性能可能会严重退化。

Previous studies primarily adopt stationarization to attenuate the non-stationarity of original series for better predictability.

But the stationarized series deprived of inherent non-stationarity can be less instructive for real-world bursty events forecasting.

以往的研究主要采用平稳化的方法来减弱原始序列的非平稳性,以提高预测能力。但是,失去了固有非平稳性的平稳序列对实际突发事件的预测指导意义不大。

实际突发事件的预测

real-world bursty events forecasting

bursty events

突发事件

attenuate the non-stationarity

减弱非平稳性

real-world bursty events

真实世界的突发事件

can be less instructive

指导意义不大

This problem, termed over-stationarization in this paper, leads Transformers to generate indistinguishable temporal attentions for different series and impedes the predictive capability of deep models.

该问题在本文中被称为过平稳化,导致变压器对不同序列产生不可区分的时间注意力,阻碍了深度模型的预测能力。

impedes the predictive capability of deep models

阻碍深度模型的预测能力

This problem, termed over-stationarization in this paper

该问题在本文中被称为过平稳化

indistinguishable temporal attentions

不可区分的时间注意力

To tackle the dilemma between series predictability and model capability, we propose Non-stationary Transformers as a generic framework with two interdependent modules:

Series Stationarization and De-stationary Attention.

为了解决系列可预测性和模型能力之间的困境,我们建议将非稳态变压器作为一个通用框架,其中包含两个相互依赖的模块:序列平稳化与去平稳注意。

一个通用框架

a generic framework

Concretely, Series Stationarization unifies the statistics of each input and converts the output with restored statistics for better predictability.

具体而言,系列固定化统一了每个输入的统计数据,并将输出与恢复的统计数据转换,以提高可预测性。

To address the over-stationarization problem, De-stationary Attention is devised to recover the intrinsic non-stationary information into temporal dependencies by approximating distinguishable attentions learned from raw series.

为了解决过平稳化问题,提出了去平稳注意力模型,通过对原始序列的可区分注意力进行近似,将内在的非平稳信息恢复为时间依赖性。

你可能感兴趣的:(深度学习,人工智能)