I am a senior undergraduate student majoring in Artificial Intelligence at Northeastern University, China. I am currently a Research Assistant working on Embodied AI, Spatial Intelligence, and Robotics under the supervision of Prof. Mingyu Ding at UNC-Chapel Hill and Prof. Yao (Mark) Mu at ScaleLab, SJTU.

My research interests mainly focus on:

  • Grounding language in spatial understanding and robotic manipulation
  • Egocentric embodied self-perception of MLLMs
  • Reasoning and planning in multimodal and embodied AI systems
  • Human-robot collaborative interaction in real-simulation hybrid environments

My long-term research goal is to develop embodied agents that learn from rich, physically informed data gathered through real-world interaction. I am dedicated to imbuing these agents with the capacity for causal reasoning and sophisticated decision-making, enabling them to collaborate seamlessly and safely with humans. This pursuit is guided by a core principle: to create technology with everyone, for everyone, ensuring that advancements in AI provide accessible assistance to people from all walks of life.

✨ I am looking for a Ph.D. position starting in Fall 2026. Please feel free to reach out!

📝 Publications

(*: equal contribution; †: corresponding author)

Tianxing Chen*, Zanxin Chen*, Baijun Chen*, Zijian Cai*, Yibin Liu*, … Ping Luo†, Yao Mu†. RoboTwin 2.0: A Scalable Data Generator and Benchmark with Strong Domain Randomization for Robust Bimanual Robotic Manipulation. arXiv 2025. (Webpage / Repo)

Nan Gao†, Yibin Liu, Xin Tang, Yanyan Liu, Chun Yu, Yun Huang, Yuntao Wang, Flora D. Salim, Xuhai Orson Xu, Jun Wei, Yuanchun Shi. The Homework Wars: Exploring Emotions, Behaviours, and Conflicts in Parent-Child Homework Interactions. ACM IMWUT/UbiComp 2025. (Paper)

Yibin Liu, Zhixuan Liang, Zanxin Chen*, Tianxing Chen, Mengkang Hu, Wanxi Dong, Congsheng Xu, Zhaoming Han, Yusen Qin, Yao Mu†. HyCodePolicy: Hybrid Language Controllers for Multimodal Monitoring and Decision in Embodied Agents. ICCV 2025 Workshop on Multi-Modal Reasoning for Agentic Intelligence. (Paper / Code)

Yibin Liu, Zhenghao Liu†, Yukun Yan, Shi Yu, Shuo Wang, Liner Yang, Yu Gu, Ge Yu, Huimin Chen. Self-Guide: An LLM Reasoning Enhancement Method Based on Self-Guided Planning. CCL 2024 / Journal of Chinese Information Processing. (Paper EN / Paper CN / Code)

📖 Research Experiences

University of North Carolina at Chapel Hill – Research Intern

Advisor: Prof. Mingyu Ding · 📅 June 2025 – Present, Remote

At UNC, I work on enhancing multimodal large language models (MLLMs) with spatial understanding and action-level reasoning for robotic manipulation, by leveraging human language instructions and interactive guidance in augmented reality (AR) environments. This research explores how embodied agents can better ground language in physical contexts to perform complex tasks more effectively.

Shanghai Jiao Tong University – Research Assistant

Advisor: Prof. Yao Mu · 📅 March 2025 – Present, Shanghai, China

At SJTU, I am a co-first author and core contributor of RoboTwin 2.0, a scalable benchmark for robust bimanual robotic manipulation. I led the design and implementation of the robot policy code generation agent, which provides the foundation for robust policy data generation in embodied agents.

Tsinghua University – Pervasive HCI Lab – Research Assistant

Advisors: A/Prof. Nan Gao and A/Prof. Chun Yu · 📅 June 2024 – January 2025, Beijing, China

At Tsinghua, I developed LLM-based methods to infer human behaviors and mental states from dialogue data, aiming to enhance self-awareness and promote well-being. I combined LLM-driven analysis with expert qualitative coding to process large-scale family education data and built a recommendation system that provides personalized guidance for parent–child interactions.

🏢 Industry Experiences

Horizon Robotics – Cloud Platform Intern

Mentor: Yusen Qin (VP of Technology, D-Robotics) · 📅 June 2025 – Present, Beijing, China · Hybrid

At Horizon Robotics, I focus on the development of the RDK-agent. I am building an LLM-powered Copilot system within VSCode to support robotic system development, with features including automatic coding, environment setup, test generation, code explanation, and manipulation data acquisition. The system bridges cutting-edge language models with industrial-grade robotics development workflows.

💬 Talks

  • 2024.08, “Retrieval-Augmented Generation Modeling” for Mingtong Weilai (Beijing) Digital Health Science & Technology Research Institute.

👥 Academic Service

🏆 Awards

  • 2025.07 Outstanding Poster at ChinaSI 2025 (ranked 1st of 61 posters; RoboTwin 2.0).
  • 2024.11 Outstanding Individual in Technological Innovation, Northeastern University.
  • 2024.05 Finalist, Mathematical Contest in Modeling (MCM/ICM 2024, top 1.69% of 10,387 teams).
  • 2023.11 Future Technology Taihu Scholarship.
  • 2023.10 National Third Prize, 2023 RoboCup China Competition, Simulation 3D League.
  • 2023.10 National Second Prize, 2023 FIRA SimuroSot China Competition.
  • 2023.09 Excellent Student Scholarship, Northeastern University.

💻 Projects

RDK Copilot: LLM-powered Development Copilot for Robotics at Horizon Robotics

  • Developed and deployed a VSCode plugin that uses LLMs to assist robotic system development, with features including automatic coding, environment setup, code completion, test generation, code explanation, and manipulation data acquisition.

MinRL: Minimal, Clean Code for Reinforcement Learning

  • Recognized and pinned by MathFoundationRL, the most popular RL course on Chinese platforms, under “Third-party code and materials.”

Bencao RAG Medical Intelligent Assistant

  • Developed a medical knowledge question-answering system that integrates context awareness, internet access, knowledge graphs, and retrieval-augmented generation (RAG) to provide accurate and personalized medical information.

🛠️ Technologies

Languages: Python, C++, C, HTML/CSS, JavaScript, Swift, SQL, MATLAB/Simulink, LaTeX

Frameworks & Tools: PyTorch, Hugging Face, scikit-learn, ROS, OpenCV, NumPy, Git, RAG, Linux, SLAM, iOS Development (SwiftUI, UIKit), ARKit, RealityKit


“Live, travel, adventure, bless, and don’t be sorry.🌍✨” — Jack Kerouac