
PGB: Permutation and Grouping for BERT Pruning

Hye-Min Lim, Dong-Wan Choi

http://doi.org/10.5626/JOK.2023.50.6.503

Recently, pre-trained Transformer-based models have been actively used for various artificial intelligence tasks, such as natural language processing and image recognition. However, these models have billions of parameters and require substantial computation for inference, which severely limits their use in resource-constrained environments. To address this problem, we propose PGB (Permutation Grouped BERT pruning), a new group-based structured pruning method for Transformer models. PGB efficiently finds an optimal permutation of attention heads under given resource constraints and prunes unnecessary heads based on their importance, minimizing the model's information loss. Extensive comparative experiments show that PGB outperforms existing structured pruning methods for the pre-trained BERT model in terms of both inference speed and accuracy loss.
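The following is a minimal, illustrative sketch of the general idea of importance-based head permutation and pruning, not the paper's actual PGB algorithm. Per-head importance is approximated here by the L2 norm of each head's slice of the attention output projection (an assumption; the paper defines its own importance measure), heads are permuted in order of importance, and the least important heads are pruned under a budget. The names `w_o`, `head_importance`, and `permute_and_prune` are hypothetical.

```python
import torch

def head_importance(w_o: torch.Tensor, num_heads: int) -> torch.Tensor:
    # w_o: (d_model, d_model) attention output projection; its input
    # columns are the concatenated head outputs, so each head owns one
    # contiguous column block of width d_head.
    d_head = w_o.shape[1] // num_heads
    blocks = w_o.view(w_o.shape[0], num_heads, d_head)  # (d_model, H, d_head)
    # Proxy importance score: L2 norm of each head's block (assumption).
    return blocks.pow(2).sum(dim=(0, 2)).sqrt()         # (H,)

def permute_and_prune(w_o: torch.Tensor, num_heads: int, keep: int):
    scores = head_importance(w_o, num_heads)
    order = torch.argsort(scores, descending=True)  # permutation of heads
    kept = order[:keep]                             # drop the least important
    d_head = w_o.shape[1] // num_heads
    cols = torch.cat([torch.arange(h * d_head, (h + 1) * d_head)
                      for h in kept.tolist()])
    return w_o[:, cols], kept  # reduced projection and surviving head indices

# Toy usage with BERT-base dimensions (768 hidden, 12 heads), keeping 8 heads.
torch.manual_seed(0)
w = torch.randn(768, 768)
pruned, kept = permute_and_prune(w, num_heads=12, keep=8)
print(pruned.shape, kept.tolist())  # torch.Size([768, 512]) and the kept heads
```

Note that this sketch covers only permutation and pruning; the grouping step that gives PGB its name (combining heads into groups rather than discarding them outright) is specific to the paper and omitted here.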

