Search : [ keyword: deep learning (심층 학습) ] (2)

New Transformer Model to Generate Molecules for Drug Discovery

Yu-Bin Hong, Kyungjun Lee, DongNyenog Heo, Heeyoul Choi

http://doi.org/10.5626/JOK.2023.50.11.976

Among various generative models, recurrent neural network (RNN) based models have achieved state-of-the-art performance in the drug generation task. To overcome the long-term dependency problem that RNNs suffer from, Transformer-based models were proposed for the task. However, the Transformer models showed worse performance than the RNN models in the drug generation task, and we believe this is because the Transformer models were over-parameterized and prone to over-fitting. To avoid this problem, in this paper, we propose a new Transformer model that replaces the large decoder with simple feed-forward layers. Experiments confirmed that our proposed model outperformed the previous state-of-the-art baseline on the major evaluation metrics while preserving a similar level of performance on the other, minor metrics. Furthermore, when we applied our model to generate candidate molecules against the SARS-CoV-2 (COVID-19) virus, the generated molecules were more effective than commercially available drugs such as Paxlovid, Molnupiravir, and Remdesivir.
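The core architectural idea — keeping an encoder representation but replacing the large Transformer decoder with simple feed-forward layers that predict the next token — can be sketched as follows. This is a minimal toy illustration, not the authors' implementation: the vocabulary, dimensions, weight initialization, and the way the context vector is fused with the last token are all assumptions made for the sake of a runnable example.

```python
import math
import random

random.seed(0)

# Toy SMILES-fragment vocabulary; the real model would use a learned
# tokenizer over full SMILES strings (assumption).
VOCAB = ["C", "c", "O", "N", "(", ")", "=", "1", "<eos>"]
DIM, HIDDEN = 4, 8

def rand_matrix(rows, cols):
    return [[random.uniform(-0.5, 0.5) for _ in range(cols)] for _ in range(rows)]

# Untrained toy parameters standing in for learned weights (assumption).
EMBED = rand_matrix(len(VOCAB), DIM)                 # token embeddings
W1, B1 = rand_matrix(DIM, HIDDEN), [0.0] * HIDDEN
W2, B2 = rand_matrix(HIDDEN, len(VOCAB)), [0.0] * len(VOCAB)

def feed_forward(x):
    """Two-layer feed-forward head replacing the Transformer decoder."""
    h = [max(0.0, sum(xi * W1[i][j] for i, xi in enumerate(x)) + B1[j])
         for j in range(HIDDEN)]
    return [sum(hi * W2[i][j] for i, hi in enumerate(h)) + B2[j]
            for j in range(len(VOCAB))]

def softmax(logits):
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

def generate(encoder_state, max_len=20):
    """Autoregressively sample tokens from the feed-forward head."""
    tokens, last = [], [0.0] * DIM            # start with a zero "previous token"
    for _ in range(max_len):
        # Fuse the encoder context with the last token embedding (assumption).
        x = [e + t for e, t in zip(encoder_state, last)]
        probs = softmax(feed_forward(x))
        idx = random.choices(range(len(VOCAB)), weights=probs)[0]
        if VOCAB[idx] == "<eos>":
            break
        tokens.append(VOCAB[idx])
        last = EMBED[idx]
    return "".join(tokens)

smiles = generate([0.1, -0.2, 0.3, 0.05])
```

The point of the sketch is the parameter budget: a two-layer feed-forward head has far fewer parameters than a stack of masked self-attention decoder blocks, which is the over-fitting argument the abstract makes.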

Image Caption Generation using Object Attention Mechanism

Da-Sol Park, Jeong-Won Cha

http://doi.org/10.5626/JOK.2019.46.4.369

Explosive increases in image data have led to studies investigating image caption generation, i.e., expressing images in natural language. Current techniques for generating Korean image captions contain object co-occurrence errors attributed to datasets translated from English. In this paper, we propose an image caption generation model that employs a new attention-based loss function using nouns extracted from the reference captions. The proposed method achieved BLEU1 0.686, BLEU2 0.557, BLEU3 0.456, and BLEU4 0.372, which shows that it resolves high-frequency word-pair errors. We also showed that it improves performance over previous studies and reduces redundancy in the generated sentences. As a result, the proposed method can be used to build a caption corpus effectively.
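The BLEU1–BLEU4 figures reported above come from the standard modified n-gram precision metric. The sketch below is a generic BLEU implementation for reference, not the authors' evaluation code, and the example sentences are invented.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def modified_precision(candidate, references, n):
    """Clipped n-gram precision: each candidate n-gram counts at most
    as often as it appears in any single reference."""
    cand_counts = Counter(ngrams(candidate, n))
    if not cand_counts:
        return 0.0
    max_ref = Counter()
    for ref in references:
        for g, c in Counter(ngrams(ref, n)).items():
            max_ref[g] = max(max_ref[g], c)
    clipped = sum(min(c, max_ref[g]) for g, c in cand_counts.items())
    return clipped / sum(cand_counts.values())

def bleu(candidate, references, max_n=4):
    """BLEU-n: geometric mean of 1..n-gram precisions times brevity penalty."""
    if not candidate:
        return 0.0
    precisions = [modified_precision(candidate, references, n)
                  for n in range(1, max_n + 1)]
    if min(precisions) == 0.0:
        return 0.0
    log_avg = sum(math.log(p) for p in precisions) / max_n
    closest = min(references, key=lambda r: abs(len(r) - len(candidate)))
    bp = 1.0 if len(candidate) > len(closest) \
         else math.exp(1 - len(closest) / len(candidate))
    return bp * math.exp(log_avg)

cand = "a dog runs in the park".split()
refs = ["a dog is running in the park".split()]
score1 = bleu(cand, refs, max_n=1)   # unigram-only BLEU
```

Note that a low BLEU4 relative to BLEU1, as in the results above, is typical: 4-gram matches are much rarer than unigram matches, so the geometric mean drops as n grows.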



Journal of KIISE

  • ISSN : 2383-630X (Print)
  • ISSN : 2383-6296 (Electronic)
  • KCI Accredited Journal

Editorial Office

  • Tel. +82-2-588-9240
  • Fax. +82-2-521-1352
  • E-mail. chwoo@kiise.or.kr