Andrew Y. Ng

Also published as: Andrew Ng


2020

Combining Automatic Labelers and Expert Annotations for Accurate Radiology Report Labeling Using BERT
Akshay Smit | Saahil Jain | Pranav Rajpurkar | Anuj Pareek | Andrew Ng | Matthew Lungren
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)

The extraction of labels from radiology text reports enables large-scale training of medical imaging models. Existing approaches to report labeling typically rely either on sophisticated feature engineering based on medical domain knowledge or on manual annotation by experts. In this work, we introduce a BERT-based approach to medical image report labeling that exploits both the scale of available rule-based systems and the quality of expert annotations. We demonstrate superior performance from a biomedically pretrained BERT model that is first trained on the annotations of a rule-based labeler and then finetuned on a small set of expert annotations augmented with automated backtranslation. We find that our final model, CheXbert, outperforms the previous best rule-based labeler with statistical significance, setting a new state of the art for report labeling on one of the largest datasets of chest x-rays.
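
A minimal sketch (not the authors' released code) of the two-stage training the abstract describes: a BERT encoder with one classification head per condition is first trained on silver labels from a rule-based labeler, then finetuned on the small expert-annotated set, optionally expanded with backtranslated paraphrases. The encoder name, the 14-condition label space, and the 4-way output classes are assumptions based on the CheXpert setup.

```python
# Hypothetical sketch of CheXbert-style two-stage report labeling; the
# label space (14 conditions x 4 classes) follows the CheXpert convention.
import torch
import torch.nn as nn
from transformers import AutoModel

NUM_CONDITIONS = 14   # chest x-ray observations, per CheXpert
NUM_CLASSES = 4       # blank / positive / negative / uncertain

class ReportLabeler(nn.Module):
    def __init__(self, encoder_name="bert-base-uncased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # One softmax head per condition (multi-task labeling).
        self.heads = nn.ModuleList(
            [nn.Linear(hidden, NUM_CLASSES) for _ in range(NUM_CONDITIONS)]
        )

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # [CLS] representation
        return torch.stack([h(cls) for h in self.heads], dim=1)  # (B, 14, 4)

def train(model, batches, epochs=3, lr=2e-5):
    opt = torch.optim.AdamW(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for input_ids, attention_mask, labels in batches:
            logits = model(input_ids, attention_mask)
            loss = loss_fn(logits.view(-1, NUM_CLASSES), labels.view(-1))
            opt.zero_grad()
            loss.backward()
            opt.step()

# Stage 1: silver labels produced at scale by the rule-based labeler.
# Stage 2: expert labels, augmented with backtranslated report paraphrases.
#   train(model, silver_batches)
#   train(model, expert_plus_backtranslated_batches)
```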

2019

Neural Text Style Transfer via Denoising and Reranking
Joseph Lee | Ziang Xie | Cindy Wang | Max Drach | Dan Jurafsky | Andrew Ng
Proceedings of the Workshop on Methods for Optimizing and Evaluating Neural Language Generation

We introduce a simple method for text style transfer that frames style transfer as denoising: we synthesize a noisy corpus and treat the source style as a noisy version of the target style. To control for aspects such as preserving meaning while modifying style, we propose a reranking approach in the data synthesis phase. We evaluate our method on three novel style transfer tasks: transferring between British and American English varieties, between text genres (formal vs. casual), and between lyrics from different musical genres. By measuring style transfer quality, meaning preservation, and the fluency of generated outputs, we demonstrate that our method both produces high-quality output and retains the flexibility to suggest syntactically rich stylistic edits.
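
As a concrete illustration of the reranking step, the sketch below (a hypothetical helper, not the paper's code) scores each synthesized candidate by target-style fluency, via a length-normalized language-model log-probability, and by meaning preservation, via embedding similarity to the source, then keeps the highest-scoring candidates. `style_lm_logprob` and `embed` stand in for whatever fluency model and sentence encoder are available.

```python
# Hypothetical reranker for synthesized style-transfer candidates:
# balance target-style fluency against meaning preservation.

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv + 1e-8)

def rerank(source, candidates, style_lm_logprob, embed, alpha=0.5):
    """Sort candidates by fluency + meaning-preservation score."""
    src_vec = embed(source)

    def score(cand):
        # Length-normalized LM log-probability under the target style.
        fluency = style_lm_logprob(cand) / max(len(cand.split()), 1)
        # Embedding similarity to the source approximates meaning overlap.
        similarity = cosine(src_vec, embed(cand))
        return alpha * fluency + (1 - alpha) * similarity

    return sorted(candidates, key=score, reverse=True)
```

Tuning `alpha` trades off stylistic strength against fidelity: a low value keeps edits conservative, a high value favors fluent but freer rewrites.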

2018

Noising and Denoising Natural Language: Diverse Backtranslation for Grammar Correction
Ziang Xie | Guillaume Genthial | Stanley Xie | Andrew Ng | Dan Jurafsky
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)

Translation-based methods for grammar correction that directly map noisy, ungrammatical sentences to their clean counterparts are able to correct a broad range of errors; however, such techniques are bottlenecked by the need for a large parallel corpus of noisy and clean sentence pairs. In this paper, we consider synthesizing parallel data by noising a clean monolingual corpus. While most previous approaches introduce perturbations using features computed from local context windows, we instead develop error generation processes using a neural sequence transduction model trained to translate clean examples to their noisy counterparts. Given a corpus of clean examples, we propose beam search noising procedures to synthesize additional noisy examples that human evaluators were nearly unable to distinguish from nonsynthesized examples. Surprisingly, when trained on additional data synthesized using our best-performing noising scheme, our model approaches the same performance as when trained on additional nonsynthesized data.
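
The beam search noising idea can be illustrated with the sketch below, a simplified guess at the procedure rather than the paper's exact scheme: decode the clean-to-noisy transduction model with beam search, but perturb each hypothesis score with random noise so the synthesized corruptions stay diverse instead of collapsing to the single most likely error. `step_logprobs` is a hypothetical hook returning next-token log-probabilities for a partial hypothesis.

```python
# Hypothetical noised beam search over a clean-to-noisy transduction model.
import random

def noisy_beam_search(source, step_logprobs, beam_size=8, max_len=50,
                      noise_scale=1.0, eos="</s>"):
    beams = [([], 0.0)]  # (tokens, cumulative log-probability)
    for _ in range(max_len):
        candidates = []
        for tokens, score in beams:
            if tokens and tokens[-1] == eos:
                candidates.append((tokens, score))  # finished hypothesis
                continue
            for tok, lp in step_logprobs(source, tokens):
                # Perturb the score with r ~ Uniform(0, noise_scale) per
                # expansion, so beam search does not always keep the same
                # top corruptions.
                noise = random.uniform(0, noise_scale)
                candidates.append((tokens + [tok], score + lp - noise))
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_size]
        if all(t and t[-1] == eos for t, _ in beams):
            break
    return [" ".join(t) for t, _ in beams]
```

With `noise_scale=0` this reduces to ordinary beam search; increasing it trades likelihood for diversity in the synthesized noisy corpus.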

2015

Lexicon-Free Conversational Speech Recognition with Neural Networks
Andrew Maas | Ziang Xie | Dan Jurafsky | Andrew Ng
Proceedings of the 2015 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies

2014

Grounded Compositional Semantics for Finding and Describing Images with Sentences
Richard Socher | Andrej Karpathy | Quoc V. Le | Christopher D. Manning | Andrew Y. Ng
Transactions of the Association for Computational Linguistics, Volume 2

Previous work on Recursive Neural Networks (RNNs) shows that these models can produce compositional feature vectors for accurately representing and classifying sentences or images. However, the sentence vectors of previous models cannot accurately represent visually grounded meaning. We introduce the DT-RNN model, which uses dependency trees to embed sentences into a vector space in order to retrieve images that are described by those sentences. Unlike previous RNN-based models, which use constituency trees, DT-RNNs naturally focus on the action and agents in a sentence and are better able to abstract away from the details of word order and syntactic expression. DT-RNNs outperform other recursive and recurrent neural networks, kernelized CCA, and a bag-of-words baseline on the tasks of finding an image that fits a sentence description and vice versa. They also assign more similar representations to sentences that describe the same image.
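
The core composition can be sketched in a few lines, under assumed shapes and a single shared dependent matrix (the paper learns position-specific matrices and trains everything jointly against image features with a max-margin objective): each head word's vector is combined with the composed vectors of its dependents, so the sentence vector is anchored on heads rather than on linear word order.

```python
# Hypothetical sketch of DT-RNN-style composition over a dependency tree.
import numpy as np

DIM = 50
rng = np.random.default_rng(0)
W_head = rng.normal(scale=0.1, size=(DIM, DIM))  # transform for the head word
W_dep = rng.normal(scale=0.1, size=(DIM, DIM))   # shared dependent transform
                                                 # (paper: per-position matrices)

def compose(word_vector, child_vectors):
    """Combine a head word's vector with its composed dependents."""
    h = W_head @ word_vector
    for c in child_vectors:
        h = h + W_dep @ c
    # Normalize by fan-in, then squash, as in recursive neural composition.
    return np.tanh(h / (1 + len(child_vectors)))

def encode(tree, root, word_vecs):
    """tree: dict head_id -> list of dependent ids; returns the root vector."""
    kids = [encode(tree, d, word_vecs) for d in tree.get(root, [])]
    return compose(word_vecs[root], kids)

# Toy example: "dog chases cat", root "chases" (1) with dependents 0 and 2.
word_vecs = {i: rng.normal(size=DIM) for i in range(3)}
sentence_vec = encode({1: [0, 2]}, 1, word_vecs)
```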

2013

Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank
Richard Socher | Alex Perelygin | Jean Wu | Jason Chuang | Christopher D. Manning | Andrew Ng | Christopher Potts
Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing

Parsing with Compositional Vector Grammars
Richard Socher | John Bauer | Christopher D. Manning | Andrew Y. Ng
Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)

2012

Improving Word Representations via Global Context and Multiple Word Prototypes
Eric Huang | Richard Socher | Christopher Manning | Andrew Ng
Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)

Semantic Compositionality through Recursive Matrix-Vector Spaces
Richard Socher | Brody Huval | Christopher D. Manning | Andrew Y. Ng
Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning

2011

Learning Word Vectors for Sentiment Analysis
Andrew L. Maas | Raymond E. Daly | Peter T. Pham | Dan Huang | Andrew Y. Ng | Christopher Potts
Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies

Semi-Supervised Recursive Autoencoders for Predicting Sentiment Distributions
Richard Socher | Jeffrey Pennington | Eric H. Huang | Andrew Y. Ng | Christopher D. Manning
Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing

2008

Cheap and Fast – But is it Good? Evaluating Non-Expert Annotations for Natural Language Tasks
Rion Snow | Brendan O’Connor | Daniel Jurafsky | Andrew Ng
Proceedings of the 2008 Conference on Empirical Methods in Natural Language Processing

2007

Learning to Merge Word Senses
Rion Snow | Sushant Prakash | Daniel Jurafsky | Andrew Y. Ng
Proceedings of the 2007 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning (EMNLP-CoNLL)

2006

Semantic Taxonomy Induction from Heterogenous Evidence
Rion Snow | Daniel Jurafsky | Andrew Y. Ng
Proceedings of the 21st International Conference on Computational Linguistics and 44th Annual Meeting of the Association for Computational Linguistics

Solving the Problem of Cascading Errors: Approximate Bayesian Inference for Linguistic Annotation Pipelines
Jenny Rose Finkel | Christopher D. Manning | Andrew Y. Ng
Proceedings of the 2006 Conference on Empirical Methods in Natural Language Processing

A Graphical Framework for Contextual Search and Name Disambiguation in Email
Einat Minkov | William Cohen | Andrew Ng
Proceedings of TextGraphs: the First Workshop on Graph Based Methods for Natural Language Processing

2005

Robust Textual Inference via Graph Matching
Aria Haghighi | Andrew Ng | Christopher Manning
Proceedings of Human Language Technology Conference and Conference on Empirical Methods in Natural Language Processing