Awesome-LLM-for-SAR

An up-to-date list of papers and codebases working on applying large language models (LLM) to enable better social and socially assistive human-robot interaction.

Contents

Papers

Survey

Socially assistive robotics [IEEE 2011]
Socially assistive robotics [2016]
Social IQa: Commonsense Reasoning about Social Interactions [arXiv 2019]
Why robots? A survey on the roles and benefits of social robots in the therapy of children with autism [arXiv 2019]
A systematic review of research into how robotic technology can help older people [2018]
A survey of robot-assisted language learning [THRI 2019]
Use of social robots in mental health and well-being research: systematic review [JMIR 2019]
Impacts of low-cost robotic pets for older adults and people with dementia: scoping review [JMIR 2021]
Physical human-robot interaction influence in ASD therapy through an affordable soft social robot [JIRS 2022]
The usability and impact of a low-cost pet robot for older adults and people with dementia: qualitative content analysis of user experiences and perceptions on consumer websites [JMIR 2022]
Turn-taking in conversational systems and human-robot interaction: a review [CSL 2021]
Robotic vision for human-robot interaction and collaboration: A survey and systematic review [THRI 2023]
A systematic evaluation of large language models of code [SIGPLAN 2022]
A survey on multimodal large language models for autonomous driving [IEEE/CVF 2024]
Large language models for human-robot interaction: A review [BIR 2023]
A systematic literature review of experiments in socially assistive robotics using humanoid robots [arXiv 2017]
Learning to summarize from human feedback [NeurIPS 2020]
Recent advances in recurrent neural networks [arXiv 2017]

Natural Language Dialogue

State of the Art

A survey on recent advances in social robotics [MDPI 2022]
Knowledge-grounded dialogue flow management for social robots and conversational agents [IJSR 2022]

LLM and SAR Developments

Language Models for Human-Robot Interaction [HRI 2023]
A social robot connected with chatGPT to improve cognitive functioning in ASD subjects [Front Psychol 2023]
VITA: A Multi-modal LLM-based System for Longitudinal, Autonomous, and Adaptive Robotic Mental Well-being Coaching [arXiv 2023]
GPT Models Meet Robotic Applications: Co-Speech Gesturing Chat System [arXiv 2023]
Applying Large Language Models to Companion Robots for Open-Domain Dialogues with Older Adults [2023]
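The systems listed above share a common pattern: a chat-style LLM dropped into the robot's turn-taking loop between speech recognition and speech synthesis. Below is a minimal sketch of that pattern, not taken from any listed paper; it assumes the OpenAI Python client, the model name is illustrative, and `listen()` / `speak()` are hypothetical placeholders for the robot's ASR and TTS interfaces.

```python
# Minimal sketch: a chat LLM driving a turn-based dialogue loop for a
# companion robot. listen()/speak() are placeholders for ASR and TTS.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SYSTEM_PROMPT = (
    "You are a friendly companion robot for older adults. "
    "Keep replies short, warm, and easy to follow."
)

def listen() -> str:
    """Placeholder for the robot's speech recognition."""
    return input("User: ")

def speak(text: str) -> None:
    """Placeholder for the robot's text-to-speech."""
    print(f"Robot: {text}")

def dialogue_loop(max_turns: int = 5) -> None:
    history = [{"role": "system", "content": SYSTEM_PROMPT}]
    for _ in range(max_turns):
        history.append({"role": "user", "content": listen()})
        reply = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed model name; any chat model works
            messages=history,
        ).choices[0].message.content
        history.append({"role": "assistant", "content": reply})
        speak(reply)

if __name__ == "__main__":
    dialogue_loop()
```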

Multimodal Understanding

State of the Art

Emotional dialogue generation using image-grounded language models [CHI 2018]
Generalizing to unseen domains: A survey on domain generalization [IEEE 2022]
Toward personalized affect-aware socially assistive robot tutors for long-term interventions with children with autism [THRI 2022]
Intention understanding in human-robot interaction based on visual-NLP semantics [Front. Neurorobot. 2021]
Vision-language models for vision tasks: A survey [arXiv 2023]

LLM and SAR Developments

Learning transferable visual models from natural language supervision [PMLR 2021]
Scaling up visual and vision-language representation learning with noisy text supervision [PMLR 2021]
GPT4Vis: What Can GPT-4 Do for Zero-shot Visual Recognition? [arXiv]
PaLM-E: An embodied multimodal language model [arXiv 2023]
Prompting Visual-Language Models for Dynamic Facial Expression Recognition [arXiv 2023]
MLLM-Bench, Evaluating Multi-modal LLMs using GPT-4V [arXiv 2023]
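Several entries above build on contrastive vision-language models such as CLIP ("Learning transferable visual models from natural language supervision"). The sketch below shows zero-shot visual recognition with CLIP via Hugging Face Transformers, one common perception building block in this line of work; the label set and image path are illustrative, not drawn from any listed paper.

```python
# Minimal sketch: zero-shot image classification with CLIP.
from PIL import Image
import torch
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

labels = ["a smiling child", "a frowning child", "an empty room"]
image = Image.open("frame.jpg")  # hypothetical camera frame from the robot

inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# Similarity of the image to each text label, as probabilities.
probs = outputs.logits_per_image.softmax(dim=-1)[0]
for label, p in zip(labels, probs.tolist()):
    print(f"{label}: {p:.2f}")
```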

LLMs as Robot Policies

State of the Art

Escaping Oz: Autonomy in socially assistive robotics [2019]
Reinforcement learning approaches in social robotics [MDPI 2021]

LLM and SAR Developments

Do As I Can, Not As I Say: Grounding Language in Robotic Affordances [arXiv 2022]
Understanding social reasoning in language models with language models [arXiv 2023]
ProgPrompt: Generating situated robot task plans using large language models [IEEE]
A Sign Language Recognition System with Pepper, Lightweight-Transformer, and LLM [arXiv 2023]
Real-time emotion generation in human-robot dialogue using large language models [Front. Robot. AI]
CognitiveDog: Large Multimodal Model Based System to Translate Vision and Language into Action of Quadruped Robot [arXiv 2024]
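Papers in this group treat the LLM as a high-level policy or planner over a library of robot skills. The sketch below illustrates the general prompting pattern in the spirit of ProgPrompt/SayCan-style planning rather than reproducing either system: the skill names, prompt wording, and model name are assumptions, and the simple output filter stands in for the affordance grounding and safety checks real systems apply.

```python
# Minimal sketch: an LLM as a high-level robot policy that emits a plan
# over a fixed skill library. All skill names are hypothetical.
from openai import OpenAI

client = OpenAI()

SKILLS = ["greet(person)", "move_to(location)", "pick_up(object)",
          "hand_over(object, person)", "say(text)"]

def plan(instruction: str) -> list[str]:
    prompt = (
        "You control a socially assistive robot.\n"
        f"Available skills: {', '.join(SKILLS)}\n"
        f"Task: {instruction}\n"
        "Return one skill call per line, nothing else."
    )
    text = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[{"role": "user", "content": prompt}],
    ).choices[0].message.content
    # Keep only lines that call a known skill (a crude stand-in for grounding).
    names = {s.split("(")[0] for s in SKILLS}
    return [ln.strip() for ln in text.splitlines()
            if ln.strip().split("(")[0] in names]

print(plan("Bring a glass of water to Alice and check how she feels."))
```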

Risks and Safety

Between Reality and Delusion: Challenges of Applying Large Language Models to Companion Robots for Open-Domain Dialogues with Older Adults [2023]
Ethical and social risks of harm from language models [arXiv 2021]
A Survey of Safety and Trustworthiness of Large Language Models through the Lens of Verification and Validation [arXiv 2023]
Siren's Song in the AI Ocean: A Survey on Hallucination in Large Language Models [arXiv 2023]
Ethical use of electronic health record data and artificial intelligence: recommendations of the primary care informatics working group of the international medical informatics association [2020]