Mt. Baldy, CA
August 17, 2019

"THE BEST WAY TO PREDICT THE FUTURE IS TO IMPLEMENT IT."

Yuchen Eleanor Jiang

ABOUT ME

Eleanor is the founder & CEO of AIWaves Inc. She was a Ph.D. researcher in the Institute for Machine Learning at ETH Zürich, where she was supervised by Ryan Cotterell and Mrinmaya Sachan. She was in the Direct Doctorate program and obtained her Master's in Computer Science at ETH Zürich. Previously, she did her undergraduate studies at Zhejiang University's Chu Kochen Honors College and spent some time at UCLA, supervised by Kai-Wei Chang.

 

Her research focuses on Natural Language Processing and Machine Learning. At the moment, she is mainly interested in generating and understanding long-form structured content (e.g., novels and dialogues). To this end, she has recently been dabbling in interactive generation, controlled text generation, document-level machine translation, coreference resolution, and coherence modeling. She is also a big fan of fun applications of natural language generation (ChatGPT, lyrics generation, etc.).

When Eleanor is not busy honing large language models, she indulges in hobbies like skiing, hiking, playing the piano, and learning new languages.

Drop her a line at eleanorjiang630@gmail.com if you're up for building a cool NLP product together.

If you are an MSc or Bachelor student at ETH Zürich looking for fun NLP projects, feel free to reach out to her at yuchen.jiang@inf.ethz.ch.

  • Google Scholar
  • GitHub
  • Twitter
  • LinkedIn
  • Blog

NEWS

  • [May 2023] 🎉 Two papers submitted to NeurIPS 2023:
    • RecurrentGPT: Interactive Generation of (Arbitrarily) Long Text. [demo]

    • Efficient Prompting via Dynamic In-Context Learning. 

  • [May 2023] 🎉 Two papers accepted at ACL 2023:

    • Don't Group, Just Rescore: A Simpler Alternative to Constrained Beam Search.

    • Discourse-Centric Evaluation of Machine Translation with a Densely Annotated Parallel Corpus.

  • [Apr. 2023] 🎉 One paper accepted at ICML 2023:

    • Controlled Text Generation with Natural Language Instructions.

  • [Feb. 2023] Back to Microsoft Research Asia! Ready for some serious waves with ChatGPT!

  • [Jan. 2023] 🎉 One paper accepted at EACL 2023:

    • Poor Man's Quality Estimation: Predicting Reference-Based MT Metrics Without the Reference.

  • [Dec. 2022] Attending EMNLP 2022 in Abu Dhabi🇦🇪!

  • [Oct. 2022] 🎉 One paper accepted at EMNLP 2022:

    • Autoregressive Structured Prediction with Language Models.

  • [Sept. 2022] Invited talk at the University of Tokyo, hosted by Miyao Group.

  • [Sept. 2022] Invited talk at Tokyo Institute of Technology, hosted by Okazaki Lab.

  • [August 2022] Interning at the National Institute of Advanced Industrial Science and Technology (AIST) in Japan, working with Dr. Hiroya Takamura.

  • [July 2022] Invited talk at TextShuttle.

  • [July 2022] Attending NAACL 2022 in Seattle!

  • [May 2022] 🎉 Two papers (both oral) accepted at NAACL 2022:

    • BlonDe: An Automatic Evaluation Metric for Document-level Machine Translation.

    • A Structured Span Selector.

  • [March 2022] Became a TA for Advanced Formal Language Theory (263-5352-00L).

  • [Feb. 2022] 🎉 Officially finished all the courses as an MSc student. Glad to be full-time on NLP research & teaching!

  • [Sept. 2020] Joining Rycolab and Mrinmaya's Lab!


Publications

BlonDe: An Automatic Evaluation Metric for Document-level Machine Translation.

Yuchen Eleanor Jiang, Tianyu Liu, Shuming Ma, Dongdong Zhang, Jian Yang, Haoyang Huang, Rico Sennrich, Mrinmaya Sachan, Ryan Cotterell, Ming Zhou. [NAACL 2022 oral]

Keywords: machine translation; evaluation

A Structured Span Selector.

Tianyu Liu, Yuchen Eleanor Jiang, Mrinmaya Sachan, Ryan Cotterell. [NAACL 2022 oral]

  • A structured model which directly learns to select an optimal set of spans for various span selection problems, e.g. coreference resolution and semantic role labeling. [pdf][code]

Keywords: coreference resolution; semantic role labeling

Autoregressive Structured Prediction with Language Models.

Tianyu Liu, Yuchen Eleanor Jiang, Nicholas Monath, Mrinmaya Sachan, Ryan Cotterell. [EMNLP 2022 findings]

  • An approach to model structures as sequences of actions in an autoregressive manner with pretrained language models, allowing in-structure dependencies to be learned without any loss. [pdf][code]

Keywords: entity and relation extraction; coreference resolution; NER

Poor Man's Quality Estimation: Predicting Reference-Based MT Metrics Without the Reference.

Vilém Zouhar, Shehzaad Dhuliawala, Wangchunshu Zhou, Nico Daheim, Tom Kocmi, Yuchen Eleanor Jiang, Mrinmaya Sachan. [EACL 2023 oral]

  • We propose a better approach to leverage monolingual data to boost machine translation performance, i.e. leveraging the metric estimation task for pre-training a quality estimation model.  [pdf][code]

Keywords: machine translation; quality estimation

A Bilingual Parallel Corpus with Discourse Annotations.

Yuchen Eleanor Jiang, Tianyu Liu, Shuming Ma, Dongdong Zhang, Ryan Cotterell, Mrinmaya Sachan.

  • The BWB corpus consists of Chinese novels translated by experts into English, and the annotated test set is designed to probe the ability of machine translation systems to model various discourse phenomena. [pdf][dataset]

Keywords: machine translation; discourse; narrative

Investigating the Role of Centering Theory in Neural Coreference Resolution.

Yuchen Eleanor Jiang, Mrinmaya Sachan, Ryan Cotterell. 

  • We investigate the connection between centering theory (the best-known linguistic model for discourse coherence) and modern coreference resolution systems. [pdf][code]

Keywords: coreference resolution; discourse coherence

Deconstructing the Reading Time--Surprisal Relationship.

Yuchen Eleanor Jiang, Clara Isabel Meister, Tiago Pimentel, Tianyu Liu, Mrinmaya Sachan, Ryan D Cotterell, Roger P. Levy. 

  • We investigate the non-linear relationship between "the predictive power of language models" (surprisal) and the cognitive processing load. [pdf][code]

Keywords: reading comprehension; human language processing

Learning Directional Sentence-Pair Embedding for Natural Language Reasoning.

Yuchen Jiang, Zhenxin Xiao, Kai-Wei Chang. [AAAI 2020 (Student Abstract)]

  • A mutual attention mechanism for modeling directional inter-sentence relations. [pdf][code]

Keywords: natural language reasoning

Learning Language-agnostic Entity Prototype for Zero-shot Cross-lingual Entity Linking.

Haihong Yang*, Yuchen Jiang*, Zhongkai Hu, Wei Zhang, Yangbin Shi, Boxing Chen, Huajun Chen. [arXiv]

  • A cross-lingual entity linking model and a knowledge-augmented dataset for cross-lingual entity linking. [pdf][code][dataset]

Keywords: entity linking; cross-lingual language model

BOSH: An efficient meta-algorithm for decision-based attacks.

Zhenxin Xiao, Puyudi Yang, Yuchen Jiang, Kai-Wei Chang, Cho-Jui Hsieh. [arXiv 2019]

  • An efficient meta-algorithm, which improves existing hard-label black-box attack algorithms through Bayesian Optimization (BO) and Successive Halving (SH). [pdf][code]

Keywords: machine learning; optimization; robustness 


EXPERIENCE

Research Internship at Microsoft

Feb. 2023 - Mar. 2023

remote

Natural Language Computing Group, Microsoft Research Asia

  • ChatGPT

Research Internship at National Institute of Advanced Industrial Science and Technology (AIST)

Aug. 2022 - Sept. 2022

Tokyo, Japan

Artificial Intelligence Research Center (AIRC)

  • Working with Dr. Hiroya Takamura

  • Controlled Generation in Finance

Teaching Assistant at ETH Zurich

Mar. 2022 - Aug. 2022

Zurich, Switzerland

Advanced Formal Language Theory

  • Pushdown Automata, Dependency Parsing

Research Internship at Microsoft

Oct. 2019 - May 2020

Beijing, China

Natural Language Computing Group, Microsoft Research Asia

  • Working on Document Level Machine Translation

CSST Researcher at University of California, Los Angeles

Jul 2019 - Oct 2019

Los Angeles, CA

Advised by Prof. Kai-Wei Chang, Department of Computer Science

  • Working on Commonsense & Sentence Representation

  • Working on algorithms for Adversarial Robustness

Internship at Alibaba Group

Sept. 2018 - Jun. 2019

Hangzhou, China

Knowledge Graph Group, AZFT

  • Working on Applications of Knowledge Graph

Research Assistant at Zhejiang University

Mar. 2018 - Jun. 2018

Hangzhou, China

Advised by Prof. Huajun Chen, Department of Computer Science

  • Working on Low Resource NLP

  • Working on Cross-lingual Entity Linking

Enter Technology (website)

Dec. 2017 - Jul. 2018

Hangzhou, China

Tech director and Co-founder of EasySleep Team

  • Working on EEG recognition and domain adaptation

  • Working on product promotion

Research Assistant at Zhejiang University

Nov. 2017 - Jun. 2018

Hangzhou, China

Advised by Prof. Zhengxing Huang, Department of Biomedical Engineering

  • Working on Medical Text Mining

  • Working on Applications of Blockchain Technology in Healthcare


AWARDS

Direct Doctorate Scholarship

  • Department of Computer Science, ETH Zürich

  • 2020 - 2022

Gold Medal Winner in China College Students' 'Internet+' Innovation and Entrepreneurship Competition

  • Top 0.1% out of 60,000 teams

Chinese National Scholarship (Top 1% across China)

  • Twice (2017 & 2018)

Tang Lixin Scholarship (Top 0.2%)

  • Lifetime honor.

  • One of only four students to receive this honor as a sophomore in 2017.

  • Twice (2017 & 2018)

First-class Scholarship of Outstanding Student (Top 3%)

First-class Scholarship of Academic Excellence (Top 3%)

  • Twice (2017 & 2018)

Honorable Mention in Mathematical Contest in Modeling, USA

  • March 2018
