Few-Shot Learning: Paper Reading List

References

  1. Discriminative k-shot learning using probabilistic models. arXiv preprint arXiv:1706.00326 (2017)
  2. A closer look at few-shot classification. In: ICLR (2019) [Read]
  3. Diversity with cooperation: Ensemble methods for few-shot classification. In: ICCV. pp. 3723–3731 (2019)
  4. One-shot learning of object categories. TPAMI 28(4), 594–611 (2006) [Read]
  5. Model-agnostic meta-learning for fast adaptation of deep networks. In: ICML. pp. 1126–1135. JMLR.org (2017) [Read]
  6. Few-shot learning with graph neural networks. In: ICLR (2018)
  7. Boosting few-shot visual learning with self-supervision. In: ICCV. pp. 8059–8068 (2019) [Read]
  8. Dynamic few-shot visual learning without forgetting. In: CVPR. pp. 4367–4375 (2018) [Read]
  9. Generating classification weights with GNN denoising autoencoders for few-shot learning. In: CVPR. pp. 21–30 (2019)
  10. Low-shot visual recognition by shrinking and hallucinating features. In: ICCV. pp. 3018–3027 (2017)
  11. Few-shot learning with metric-agnostic conditional embeddings. arXiv preprint arXiv:1802.04376 (2018)
  12. Cross attention network for few-shot classification. In: NeurIPS. pp. 4005–4016 (2019)
  13. Exploiting unsupervised inputs for accurate few-shot classification (2020)
  14. Edge-labeling graph neural network for few-shot learning. In: CVPR. pp. 11–20 (2019)
  15. Transductive few-shot learning with meta-learned confidence. arXiv preprint arXiv:2002.12017 (2020)
  16. Meta-learning with differentiable convex optimization. In: CVPR. pp. 10657–10665 (2019) [Read]
  17. Learning to self-train for semi-supervised few-shot classification. In: NeurIPS. pp. 10276–10286 (2019)
  18. Deep metric transfer for label propagation with limited annotated data. In: CVPR (2019)
  19. Learning to propagate labels: transductive propagation network for few-shot learning. In: ICLR (2019)
  20. Charting the right manifold: Manifold mixup for few-shot learning. arXiv preprint arXiv:1907.12087 (2019)
  21. Tadam: Task dependent adaptive metric for improved few-shot learning. In: NeurIPS. pp. 721–731 (2018)
  22. Few-shot image recognition by predicting parameters from activations. In: CVPR. pp. 7229–7238 (2018)
  23. Meta-learning with implicit gradients. In: NeurIPS. pp. 113–124 (2019)
  24. Optimization as a model for few-shot learning. In: ICLR (2017) [Read]
  25. Meta-learning for semi-supervised few-shot classification. In: ICLR (2018)
  26. Meta-learning with latent embedding optimization. In: ICLR (2019) [Read]
  27. Few-shot learning with graph neural networks. In: ICLR (2018)
  28. Prototypical networks for few-shot learning. In: NeurIPS. pp. 4077–4087 (2017) [Read]
  29. Meta-transfer learning for few-shot learning. In: CVPR. pp. 403–412 (2019)
  30. Learning to compare: Relation network for few-shot learning. In: CVPR. pp. 1199–1208 (2018) [Read]
  31. Matching networks for one shot learning. In: NeurIPS. pp. 3630–3638 (2016) [Read]
  32. Low-shot learning from imaginary data. In: CVPR. pp. 7278–7286 (2018)
  33. Adaptive cross-modal few-shot learning. In: NeurIPS. pp. 4848–4858 (2019)
  34. Distribution propagation graph network for few-shot learning. In: CVPR. pp. 13390–13399 (2020)
  35. Few-shot learning via embedding adaptation with set-to-set functions. In: CVPR. pp. 8808–8817 (2020) [Partially read]
  36. Transmatch: A transfer-learning scheme for semi-supervised few-shot learning. In: CVPR. pp. 12856–12864 (2020)
  37. Hybrid attention-based prototypical networks for noisy few-shot relation classification. In: AAAI. pp. 6407–6414 (2019) [Read]
  38. One-shot learning with memory-augmented neural networks. arXiv preprint arXiv:1605.06065 (2016) [Read]
  39. Transferrable prototypical networks for unsupervised domain adaptation. In: CVPR. pp. 2239–2247 (2019)
  40. Model-agnostic meta-learning for fast adaptation of deep networks. In: ICML (2017)
  41. Semi-supervised and active few-shot learning with prototypical networks (2017)
  42. Metric learning for large scale image classification: Generalizing to new classes at near-zero cost. In: ECCV (2012)
  43. Meta-learning: A survey. arXiv preprint arXiv:1810.03548 (2018) [Read]
  44. Generalizing from a few examples: A survey on few-shot learning. arXiv preprint arXiv:1904.05046 (2019) [Read]
  45. Large-scale few-shot learning: Knowledge transfer with class hierarchy. In: CVPR. pp. 7212–7220 (2019) [Partially read]
  46. Siamese neural networks for one-shot image recognition. In: ICML Deep Learning Workshop (2015) [Read]
  47. Meta-SGD: Learning to learn quickly for few-shot learning. arXiv preprint arXiv:1707.09835 (2017) [Read]
  48. Dense classification and implanting for few-shot learning. In: CVPR. pp. 9258–9267 (2019) [Read]
  49. A simple neural attentive meta-learner. arXiv preprint arXiv:1707.03141 (2017)
  50. Meta networks. In: ICML. pp. 2554–2563. JMLR.org (2017)
  51. Low-shot learning with imprinted weights. In: CVPR. pp. 5822–5830 (2018)
  52. Deep prototypical networks for imbalanced time series classification under data scarcity. In: CIKM. pp. 2141–2144 (2019)
  53. One-way prototypical networks. arXiv preprint arXiv:1906.00820 (2019)
  54. Prototypical clustering networks for dermatological disease diagnosis. arXiv preprint arXiv:1811.03066 (2018)
  55. Infinite mixture prototypes for few-shot learning. arXiv preprint arXiv:1902.04552 (2019)
  56. Prototype propagation networks (PPN) for weakly-supervised few-shot learning on category graph. arXiv preprint arXiv:1905.04042 (2019)
  57. Robustness of regularized linear classification methods in text categorization. In: SIGIR. pp. 190–197 (2003)

 
