Studying Limitations of Generative Transformer based models for Aspect Based Sentiment Analysis

  • Author / Creator
    Mullick, Dhruv
  • Abstract
    Companies can only progress if they understand how their customers feel about their products and services. With companies maintaining an online presence, and with the availability of third-party review platforms like Yelp, it becomes critical to scour online reviews. Analysing millions of online reviews across various platforms is not a trivial task. Aspect-Based Sentiment Analysis (ABSA) is an NLP task useful for automating such analysis. ABSA solutions have historically used discriminative models, but recent advances in the field use generative transformer models (like T5 and BART), which treat the ABSA task as a text generation problem. We study the latest generative ABSA models and discuss some of their limitations. We find that state-of-the-art generative ABSA models perform well in standard ABSA settings, but face problems in certain real-life scenarios, such as cross-lingual settings and reviews that require coreference resolution. We propose solutions for these limitations and justify why they work.

  • Subjects / Keywords
  • Graduation date
    Spring 2023
  • Type of Item
    Thesis
  • Degree
    Master of Science
  • DOI
    https://doi.org/10.7939/r3-w2g1-xj34
  • License
    This thesis is made available by the University of Alberta Libraries with permission of the copyright owner solely for non-commercial purposes. This thesis, or any portion thereof, may not otherwise be copied or reproduced without the written consent of the copyright owner, except to the extent permitted by Canadian copyright law.
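
  • Note
    For readers unfamiliar with the text-generation framing mentioned in the abstract, the following is a minimal sketch (not the thesis's actual method) of how a generative ABSA model turns a review into aspect/sentiment pairs. It assumes the Hugging Face transformers library; "t5-base" is used only as a stand-in, since a vanilla checkpoint is not fine-tuned for ABSA, and the prompt format shown is an illustrative assumption.

    # Sketch: ABSA cast as sequence-to-sequence text generation with T5.
    from transformers import T5ForConditionalGeneration, T5Tokenizer

    tokenizer = T5Tokenizer.from_pretrained("t5-base")
    model = T5ForConditionalGeneration.from_pretrained("t5-base")

    review = "The pasta was delicious but the service was painfully slow."

    # A task prefix tells the model what to generate; the exact wording here
    # is an assumption, not the prompt used in the thesis.
    inputs = tokenizer("extract aspect sentiment pairs: " + review,
                       return_tensors="pt")

    # An ABSA-fine-tuned model would generate text such as
    # "pasta: positive; service: negative"; an untuned t5-base will not.
    output_ids = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))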