Aspect-Based Opinion Mining (ABOM) mainly focuses on mining aspect terms (a product's features) and the related opinion polarities (e.g., positive, negative, and neutral) from users' reviews. The most prominent neural-network-based methods for ABOM tasks include BERT-based approaches such as BERT-PT and BAT. These approaches build separate models for each ABOM subtask, such as aspect term extraction (e.g., pizza, staff member) and aspect sentiment classification, and they use different training algorithms, such as post-training and adversarial training. Similarly, the BERT-LSTM/Attention approach applies different pooling strategies to the intermediate layers of the BERT model to achieve better results. However, these approaches do not consider the subtask of aspect categories (e.g., the category of the aspect pizza in a review is food) and the related opinion polarity.

This paper proposes a new system for ABOM, called BERT-MTL, which uses a Multi-Task Learning (MTL) approach. It differs from previous approaches by solving two tasks, aspect term extraction and aspect category extraction, simultaneously, taking advantage of the similarities between the tasks to improve the model's accuracy and reduce training time. The proposed system also builds models that identify users' opinions on aspect terms and aspect categories by applying different pooling strategies to the last layer of the BERT model. To evaluate the model's performance, we use the SemEval-14 Task 4 restaurant dataset. Our model outperforms previous models on several ABOM tasks, and the experimental results support its validity.
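The multi-task setup described above can be sketched as a single shared encoder feeding two task heads: a per-token head for aspect term extraction and a pooled head for aspect category classification. The sketch below is a minimal, hypothetical illustration in PyTorch; the tiny Transformer encoder, the head sizes, and the mean-pooling choice are stand-in assumptions (the paper's actual system uses a pretrained BERT model and its own pooling strategies).

```python
import torch
import torch.nn as nn

class MultiTaskABOM(nn.Module):
    """Hypothetical sketch of a BERT-MTL-style model: one shared encoder,
    two heads trained jointly (aspect term tagging + aspect category).
    A small Transformer stands in for the pretrained BERT encoder."""

    def __init__(self, vocab_size=1000, hidden=64, num_tags=3, num_cats=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=4,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)  # stand-in for BERT
        self.term_head = nn.Linear(hidden, num_tags)  # per-token BIO tags
        self.cat_head = nn.Linear(hidden, num_cats)   # sentence-level category

    def forward(self, input_ids):
        h = self.encoder(self.embed(input_ids))   # (batch, seq, hidden)
        tag_logits = self.term_head(h)            # aspect term extraction
        pooled = h.mean(dim=1)                    # pooling over the last layer
        cat_logits = self.cat_head(pooled)        # aspect category extraction
        return tag_logits, cat_logits

model = MultiTaskABOM()
ids = torch.randint(0, 1000, (2, 8))              # 2 reviews, 8 tokens each
tag_logits, cat_logits = model(ids)
print(tag_logits.shape, cat_logits.shape)
```

In training, the two heads' losses (e.g., token-level and sentence-level cross-entropy) would be summed so the shared encoder learns from both tasks at once, which is the source of the accuracy and training-time benefits claimed for the MTL approach.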