embodied-agent
There are 38 repositories under the embodied-agent topic.
hyp1231/awesome-llm-powered-agent
Awesome things about LLM-powered agents. Papers / Repos / Blogs / ...
zchoi/Awesome-Embodied-Agent-with-LLMs
A curated list of research on embodied AI and robots with large language models. Watch this repository for the latest updates! 🔥
TheShadow29/awesome-grounding
awesome grounding: A curated list of research papers in visual grounding
tmgthb/Autonomous-Agents
Research papers on autonomous agents (LLMs). Updated daily.
eric-ai-lab/awesome-vision-language-navigation
A curated list for vision-and-language navigation. ACL 2022 paper "Vision-and-Language Navigation: A Survey of Tasks, Methods, and Future Directions"
kyegomez/RT-2
Democratization of RT-2: "RT-2: New model translates vision and language into action"
haoranD/Awesome-Embodied-AI
A curated list of awesome papers on Embodied AI and related research/industry-driven resources.
RobotecAI/rai
RAI is a vendor-agnostic agentic framework for robotics that uses ROS 2 tools for complex actions, defined scenarios, free interface execution, log summaries, voice interaction, and more.
allenai/allenact
An open source framework for research in Embodied-AI from AI2.
zju-vipa/Odyssey
Odyssey: Empowering Minecraft Agents with Open-World Skills
mbodiai/embodied-agents
Seamlessly integrate state-of-the-art transformer models into robotics stacks
Yuxing-Wang-THU/SurveyBrainBody
Brain-Body Co-Design for Embodied Agents: Taxonomy, Frontiers, and Challenges
Gary3410/TaPA
[arXiv 2023] Embodied Task Planning with Large Language Models
iris0329/SeeGround
[CVPR'25] SeeGround: See and Ground for Zero-Shot Open-Vocabulary 3D Visual Grounding
Zhoues/MineDreamer
[IROS'25 Oral & NeurIPSw'24] Official implementation of "MineDreamer: Learning to Follow Instructions via Chain-of-Imagination for Simulated-World Control"
bigai-nlco/langsuite
Official Repo of LangSuitE
wendell0218/GVA-Survey
Official repository of the paper "Generalist Virtual Agents: A Survey on Autonomous Agents Across Digital Platforms"
mazpie/genrl
[NeurIPS 2024] GenRL: multimodal-foundation world models ground language and video prompts into embodied domains by turning them into sequences of latent world-model states. These latent sequences can be decoded with the model's decoder to visualize the expected behavior before training the agent to execute it.
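The GenRL blurb above describes a pipeline: a prompt is embedded into a sequence of latent world-model states, the sequence is decoded for inspection, and the agent is then trained to match it. The sketch below illustrates only that shape; every class and function in it is a hypothetical stand-in, not the actual GenRL API.

```python
# Conceptual sketch of the prompt -> latent states -> decode -> train flow
# described in the GenRL blurb. All names here are illustrative stand-ins.
from dataclasses import dataclass
from typing import List


@dataclass
class LatentState:
    # Toy latent: one float per step stands in for a real latent vector.
    z: float


def embed_prompt(prompt: str, horizon: int) -> List[LatentState]:
    """Hypothetical stand-in for the multimodal-foundation encoder:
    map a language prompt to a sequence of latent world-model states."""
    seed = sum(ord(c) for c in prompt) % 100
    return [LatentState(z=(seed + t) / 100.0) for t in range(horizon)]


def decode_states(states: List[LatentState]) -> List[str]:
    """Hypothetical world-model decoder: render each latent state so the
    expected behavior can be inspected before any agent training."""
    return [f"frame(z={s.z:.2f})" for s in states]


def imitation_loss(agent_traj: List[float], target: List[LatentState]) -> float:
    """Training-objective sketch: mean squared error between the agent's
    latent trajectory and the target latent sequence."""
    return sum((a - t.z) ** 2 for a, t in zip(agent_traj, target)) / len(target)


if __name__ == "__main__":
    target = embed_prompt("walk to the tree", horizon=4)
    print(decode_states(target))        # visualize the expected behavior
    agent_traj = [s.z for s in target]  # a perfect rollout for illustration
    print(imitation_loss(agent_traj, target))
```

A perfect rollout yields zero loss; a real training loop would instead update the agent's policy to drive this loss down.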
declare-lab/Emma-X
Emma-X: An Embodied Multimodal Action Model with Grounded Chain of Thought and Look-ahead Spatial Reasoning
xyz9911/FLAME
[AAAI-25 Oral] Official Implementation of "FLAME: Learning to Navigate with Multimodal LLM in Urban Environments"
Josh00-Lu/DiffusionVeteran
Official PyTorch Implementation of "What Makes a Good Diffusion Planner for Decision Making?" [ICLR 2025 Spotlight]
ZJLAB-AMMI/LLM4Teach
Python implementation of LLM4Teach, a policy-distillation approach for teaching reinforcement learning agents with a Large Language Model
opendilab/OpenPaL
Building open-ended embodied agent in battle royale FPS game
rese1f/STEVE
[ECCV 2024] STEVE, a Minecraft agent from "See and Think: Embodied Agent in Virtual Environment"
CEC-Agent/CEC
Official Implementation of NeurIPS'23 Paper "Cross-Episodic Curriculum for Transformer Agents"
GenesisOrigin/BodyGen
Official PyTorch Implementation of "BodyGen: Advancing Towards Efficient Embodiment Co-Design" [ICLR 2025 Spotlight]
LoopMind-AI/loopquest
A Production Tool for Embodied AI
BayesBrain/Habi
Official PyTorch Implementation of "Habitizing Diffusion Planning for Efficient and Effective Decision Making"
airs-cuhk/airsoul
Next-gen Foundation Model for Embodied AI
raphael-sch/map2seq_vln
Code for the ORAR agent for Vision-and-Language Navigation on Touchdown and map2seq
Charmve/PuppyGo
An embodied robot powered by a vision-language model and a large language model
eric-ai-lab/Naivgation-as-wish
Official implementation of the NAACL 2024 paper "Navigation as Attackers Wish? Towards Building Robust Embodied Agents under Federated Learning"
feifeiobama/Awesome-Embodied-Instruction-Following
A leaderboard for Embodied Instruction Following papers and BibTeX entries
eric-ai-lab/R2H
Official implementation of the EMNLP 2023 paper "R2H: Building Multimodal Navigation Helpers that Respond to Help Requests"
nilax97/Agent-Tracking
Tracking an embodied AI agent to estimate movement from observations
seanxuu/Awesome-Embodied-AI
This is a curated list of awesome papers on Embodied AI.