Sheng Shen

I am a first-year Ph.D. student in BAIR, EECS, at the University of California, Berkeley.

Berkeley, California
Email: sheng.s@berkeley.edu
Google Scholar
GitHub
Twitter
LinkedIn
Instagram

I study Natural Language Processing.
At Berkeley, I work closely with Prof. Kurt Keutzer, Prof. Dan Klein, and Prof. Michael Mahoney. Prior to Berkeley, I received my bachelor's degree in computer science from Peking University, where I was advised by Prof. Xuanzhe Liu.

Selected Publications (* equal contribution)

  • Reservoir Transformers ACL 2021
  • Noisy Self-Knowledge Distillation for Text Summarization NAACL 2021
  • PowerNorm: Rethinking Batch Normalization in Transformers ICML 2020
  • Train Large, Then Compress: Rethinking Model Size for Efficient Training and Inference of Transformers ICML 2020
  • Q-BERT: Hessian Based Ultra Low Precision Quantization of BERT AAAI 2020
  • Pragmatically Informative Text Generation NAACL 2019 short
  • Ermes: Emoji-Powered Representation Learning for Cross-Lingual Sentiment Classification WWW 2019

Experience

    Facebook AI Research, Research Intern
    Advised by Michael Auli and Douwe Kiela, May 2020 - Dec. 2020

    Berkeley AI Research, Junior Specialist II
    Advised by Prof. Kurt Keutzer, Prof. Dan Klein, and Prof. Michael Mahoney, Jun. 2019 - May 2020

    Tencent AI Lab, Research Intern
    Advised by Yaliang Li and Wei Fan, Apr. 2018 - Sept. 2018

    University of Illinois at Urbana-Champaign, Research Intern
    Advised by Prof. Aditya Parameswaran, Jun. 2017 - Sept. 2017