<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>The 3rd Workshop &amp; Challenge on Human Behavior Analysis for Emotion Understanding (IJCAI-MiGA2025)</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="editor">
          <string-name>Haoyu Chen</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="editor">
          <string-name>Björn W. Schuller</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="editor">
          <string-name>Ehsan Adeli</string-name>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="editor">
          <string-name>Guoying Zhao</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>University of Oulu</institution>
          ,
          <country country="FI">Finland</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>University of Augsburg, Germany; Imperial College London, London, UK</institution>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Stanford University</institution>
          ,
          <country country="US">USA</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2026</year>
      </pub-date>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Preface</title>
      <p>The 3rd MiGA Workshop &amp; Challenge on exploring the use of body gestures for both hidden and general
emotional state analysis (MiGA3 for short) was jointly hosted at the IJCAI 2025 conference in
Guangzhou, China.</p>
      <p>As introduced in the 1st and 2nd MiGA workshops, we focus on a specific group of body gestures,
called micro-gestures (MGs), which are used in psychology research to interpret inner human feelings.
With growing research attention on micro-gestures, we organized this third workshop, which again focuses
on human behaviour analysis and, this year, adds a newly introduced task: how to utilize human
behaviours for emotion understanding.</p>
      <p>The third MiGA workshop and challenge seeks to broaden the research community focused on human
behaviour analysis and its applications in emotion understanding by introducing a brand-new task in the
MiGA series: MG-based emotion understanding. The event aims to foster dialogue among researchers
from academia and industry, highlighting the key attributes that influence gesture-based emotion recognition
and evaluating recent advancements in the field. As in the first and second editions of MiGA, we provide
two distinct datasets (SMG and iMiGUE) and corresponding benchmarks (MG classification,
online recognition, and a new task, MG-based emotion recognition), with the goal of shaping a new
trajectory for the emotion AI community.</p>
      <p>Building on the success of its previous editions, MiGA 2025 was organized as a half-day workshop in
Guangzhou, China. The workshop featured one invited talk and addressed topics spanning the
theoretical foundations, technological advancements, and practical applications of gestures and
micro-gestures in emotion understanding. The MiGA 2025 program, hosted in conjunction with IJCAI
2025, included a distinguished invited speaker: Assoc. Prof. Zheng Lian from the Institute of
Automation, Chinese Academy of Sciences, China. Additionally, 11 full papers, selected through a
rigorous peer-review process, were presented during the workshop. An invited paper summarizing the
workshop is also included.</p>
      <p>We extend our heartfelt thanks to Assoc. Prof. Zheng Lian for his insightful and thought-provoking
talk. We are equally grateful to all the participants for their invaluable contributions, which were
instrumental in making MiGA 2025 a remarkable event and a dynamic forum for knowledge exchange
within the community. Their engagement sparked vibrant discussions on pivotal contemporary
advancements, making for an exceptional program that exemplified cutting-edge work in emotion AI.
Special thanks are also given to Assoc. Prof. Xiaobai Li and Dr. Yante Li for assisting with this
event. We look forward to the opportunity to host future events of this caliber, continuing to foster
innovation and collaboration in this exciting field.</p>
      <p>The following full papers presenting original research were accepted; we divided them into three
sessions based on the content of the work.</p>
      <p>Session 1, MG Classification and Online Recognition, includes five papers reporting
competition-winning schemes for the MG recognition tasks.</p>
      <p>Session 2, MG-based Emotion Recognition, includes three papers reporting
competition-winning schemes for the MG-based emotion recognition task.</p>
      <p>Session 3, Human Behaviour Analysis for Emotion Understanding, includes four papers covering a broader
spectrum of research in human behaviour analysis for emotion understanding, as well as a summary
paper of the workshop and challenge.</p>
    </sec>
    <sec id="sec-2">
      <title>Declaration on Generative AI</title>
      <p>The authors have not employed any Generative AI tools for content generation.</p>
    </sec>
    <sec id="sec-3">
      <title>Invited talk</title>
      <p>Large-scale emotion understanding for in-the-wild human-computer interaction scenarios (Assoc.
Prof. Zheng Lian from the Institute of Automation, Chinese Academy of Sciences, China)</p>
    </sec>
    <sec id="sec-4">
      <title>Organizing Committee</title>
      <p>Haoyu Chen (University of Oulu, Finland)</p>
      <p>Guoying Zhao (University of Oulu, Finland)</p>
      <p>Ehsan Adeli (Stanford University, USA)</p>
    </sec>
    <sec id="sec-5">
      <title>Data Chairs</title>
      <p>Yueyi Yang (University of Oulu, Finland)</p>
      <p>Fang Kang (University of Oulu, Finland)</p>
      <p>Björn W. Schuller (University of Augsburg, Germany; Imperial College London, UK)</p>
    </sec>
  </body>
  <back>
  </back>
</article>