The ACL 2020 accepted-papers list is out, with a 25.2% acceptance rate — did your paper make it?

Published by 机器之心 (Synced) on 2020-05-19


This year's ACL received 3,088 submissions, a slight increase over last year's 2,906. ACL 2020 accepted 779 papers in total — 571 long papers and 208 short papers — for an acceptance rate of 25.2%.


Scanning the list of accepted papers, we spotted many familiar names:

Christopher D. Manning (Professor at Stanford University; Director of the Stanford AI Lab):

  • Finding Universal Grammatical Relations in Multilingual BERT

  • Optimizing the Factual Correctness of a Summary: A Study of Summarizing Radiology Reports

  • Syn-QG: Syntactic and Shallow Semantic Rules for Question Generation

Yoshua Bengio (Canadian computer scientist; Professor at the Université de Montréal):

  • Exploiting Syntactic Structure for Better Language Modeling: A Syntactic Distance Approach

Yoav Goldberg (Senior Lecturer, Department of Computer Science, Bar-Ilan University, Israel):

  • A Formal Hierarchy of RNN Architectures

  • Null It Out: Guarding Protected Attributes by Iterative Nullspace Projection

  • Simple, Interpretable and Stable Method for Detecting Words with Usage Change across Corpora

  • Unsupervised Domain Clusters in Pretrained Language Models

  • A Two-Stage Masked LM Method for Term Set Expansion

  • Towards Faithfully Interpretable NLP Systems: How should we define and evaluate faithfulness?

Noah A. Smith (Professor of Computer Science & Engineering at the University of Washington):

  • A Formal Hierarchy of RNN Architectures

  • A Mixture of h − 1 Heads is Better than h Heads

  • Don't Stop Pretraining: Adapt Language Models to Domains and Tasks

  • Improving Transformer Models by Reordering their Sublayers

  • Social Bias Frames: Reasoning about Social and Power Implications of Language

  • The Right Tool for the Job: Matching Model and Instance Complexities

  • Recollection versus Imagination: Exploring Human Memory and Cognition via Neural Language Models

Percy Liang (Associate Professor of Computer Science at Stanford University; member of the Stanford AI Lab):

  • Robust Encodings: A Framework for Combating Adversarial Typos

  • Selective Question Answering under Domain Shift

  • Enabling Language Models to Fill in the Blanks

  • ExpBERT: Representation Engineering with Natural Language Explanations

  • Shaping Visual Representations with Language for Few-Shot Classification

Sebastian Ruder (Research Scientist at DeepMind):

  • A Call for More Rigor in Unsupervised Cross-lingual Learning

  • On the Cross-lingual Transferability of Monolingual Representations

Ming Zhou 周明 (Deputy Director of Microsoft Research Asia; President of the Association for Computational Linguistics):

  • A Graph-based Coarse-to-fine Method for Unsupervised Bilingual Lexicon Induction

  • Curriculum Pre-training for End-to-End Speech Translation

  • Document Modeling with Graph Attention Networks for Multi-grained Machine Reading Comprehension

  • Evidence-Aware Inferential Text Generation with Vector Quantised Variational AutoEncoder

  • Graph Neural News Recommendation with Unsupervised Preference Disentanglement

  • Improving Neural Machine Translation with Soft Template Prediction

  • LogicalFactChecker: Leveraging Logical Operations for Fact Checking with Graph Module Network

  • MIND: A Large-scale Dataset for News Recommendation

  • MuTual: A Dataset for Multi-Turn Dialogue Reasoning

  • Reasoning Over Semantic-Level Graph for Fact Checking

  • A Retrieve-and-Rewrite Initialization Method for Unsupervised Machine Translation

  • A Simple and Effective Unified Encoder for Document-Level Machine Translation

Tie-Yan Liu 劉鐵巖 (Deputy Director of Microsoft Research Asia):

  • A Study of Non-autoregressive Model for Sequence Generation

  • SEEK: Segmented Embedding of Knowledge Graphs

  • SimulSpeech: End-to-End Simultaneous Speech to Text Translation

Qun Liu 劉群 (Chief Scientist of Speech and Language Computing at Huawei Noah's Ark Lab):

  • Perturbed Masking: Parameter-free Probing for Analyzing and Interpreting BERT

  • Probabilistically Masked Language Model Capable of Autoregressive Generation in Arbitrary Word Order

  • Word-level Textual Adversarial Attacking as Combinatorial Optimization

Chengqing Zong 宗成慶 (Researcher at the Institute of Automation, Chinese Academy of Sciences):

  • Attend, Translate and Summarize: An Efficient Method for Neural Cross-Lingual Summarization

Maosong Sun 孫茂松 (Professor, Department of Computer Science and Technology, Tsinghua University):

  • Continual Relation Learning via Episodic Memory Activation and Reconsolidation

  • Fine-grained Fact Verification with Kernel Graph Attention Network

  • How Does NLP Benefit Legal System: A Summary of Legal Artificial Intelligence

  • Word-level Textual Adversarial Attacking as Combinatorial Optimization

Zhiyuan Liu 劉知遠 (Associate Professor, Department of Computer Science and Technology, Tsinghua University):

  • Continual Relation Learning via Episodic Memory Activation and Reconsolidation

  • Expertise Style Transfer: A New Task Towards Better Communication between Experts and Laymen

  • Fine-grained Fact Verification with Kernel Graph Attention Network

  • Grounded Conversation Generation as Guided Traverses in Commonsense Knowledge Graphs

  • How Does NLP Benefit Legal System: A Summary of Legal Artificial Intelligence

  • Word-level Textual Adversarial Attacking as Combinatorial Optimization

  • MOOCCube: A Large-scale Data Repository for NLP Applications in MOOCs

Minlie Huang 黃民烈 (Associate Professor, Department of Computer Science and Technology, Tsinghua University):

  • A Self-Training Method for Machine Reading Comprehension with Soft Evidence Extraction

  • KdConv: A Chinese Multi-domain Dialogue Dataset Towards Multi-turn Knowledge-driven Conversation

  • Multi-Agent Task-Oriented Dialog Policy Learning with Role-Aware Reward Decomposition

Xiaojun Wan 萬小軍 (Researcher at the Institute of Computer Science and Technology, Peking University):

  • Automatic Generation of Citation Texts in Scholarly Papers: A Pilot Study

  • Heterogeneous Graph Transformer for Graph-to-Sequence Learning

  • Jointly Learning to Align and Summarize for Neural Cross-Lingual Summarization

  • Learning to Ask More: Semi-Autoregressive Sequential Question Generation under Dual-Graph Interaction

  • Multi-Granularity Interaction Network for Extractive and Abstractive Multi-Document Summarization

  • Semantic Parsing for English as a Second Language

  • Multimodal Transformer for Multimodal Machine Translation

Xipeng Qiu 邱錫鵬 (Professor, School of Computer Science, Fudan University):

  • Extractive Summarization as Text Matching

  • Heterogeneous Graph Neural Networks for Extractive Document Summarization

  • Improving Image Captioning with Better Use of Caption

  • FLAT: Chinese NER Using Flat-Lattice Transformer

Song Han 韓松 (Assistant Professor, Department of Electrical Engineering and Computer Science, MIT):

  • HAT: Hardware-Aware Transformers for Efficient Natural Language Processing

If your paper was accepted to ACL 2020, we'd love to hear from you in the comments — 机器之心 will continue to recommend more high-quality papers.

The full list of ACL 2020 accepted papers is available at: https://acl2020.org/program/accepted/#long-papers
