This story on HackerNoon has a decentralized backup on Sia.
Transaction ID: YKcrduxALLv6tyvCX7hsA-F9ojWWelYWjnXzWsHHBw8
Unimodal Training for Multimodal Meme Sentiment Classification: Hyperparameters and Settings

Written by @memeology | Published on 2024/4/7

TL;DR
This study introduces a novel approach that uses unimodal training to enhance multimodal meme sentiment classifiers, significantly improving both performance and efficiency in meme sentiment analysis.

Authors:

(1) Muzhaffar Hazman, University of Galway, Ireland;

(2) Susan McKeever, Technological University Dublin, Ireland;

(3) Josephine Griffith, University of Galway, Ireland.

Abstract and Introduction

Related Works

Methodology

Results

Limitations and Future Works

Conclusion, Acknowledgments, and References

A Hyperparameters and Settings

B Metric: Weighted F1-Score

C Architectural Details

D Performance Benchmarking

E Contingency Table: Baseline vs. Text-STILT

A Hyperparameters and Settings

Table 6: Hyperparameter values and settings used during model training by input type.

This paper is available on arXiv under a CC 4.0 license.
