Compressing Large-Scale Transformer-Based Models: A Case Study on BERT

Transactions of the Association for Computational Linguistics (2021) 9: 1061–1080.
This article has been cited by the following articles in journals that are participating in Crossref Cited-by Linking.
  • David Peer, Sebastian Stabinger, Stefan Engl, Antonio Rodríguez-Sánchez. Pattern Recognition Letters (2022) 157: 76.
  • Shizhen Huang, Enhao Tang, Shun Li, Xiangzhan Ping, Ruiqi Chen. Electronic Research Archive (2022) 30 (10): 3755.
  • Abhijit Guha, Abdulrahman Alahmadi, Debabrata Samanta, Mohammad Zubair Khan, Ahmed H. Alahmadi. IEEE Access (2022) 10: 11341.
  • Bo Huang, Shuai Zhang, Jitao Huang, Yijun Yu, Zhicai Shi, Yujie Xiong. Applied Intelligence (2022)
  • Ferhat Demirkıran, Aykut Çayır, Uğur Ünal, Hasan Dağ. Computers & Security (2022) 121: 102846.
  • Kenneth Ward Church, Xingyu Cai, Yibiao Ying, Zeyu Chen, Guangxu Xun, Yuchen Bian. Natural Language Engineering (2022) 28 (4): 519.
  • Jou-An Chen, Wei Niu, Bin Ren, Yanzhi Wang, Xipeng Shen. ACM Computing Surveys (2022)