Found 19 repositories (showing 19)
chengxuxin
[RSS 2024]: Expressive Whole-Body Control for Humanoid Robots
edpsw
Official implementation of "Exbody2: Advanced Expressive Humanoid Whole-Body Control"
EstellaHumanoids
Estella is an open-source humanoid robot dedicated to building a lifelike anatomical replicant with expressive design and human-centered interaction.
asxiaofengzi
No description available
Environment-aware humanoid locomotion via Dual-Advantage PPO
lyushipeng96
This code is designed for generating expressive behavior for humanoid robots
gentlefress
Project page for the paper "Do You Have Freestyle? Expressive Humanoid Locomotion via Audio Control"
M4YH3M-DEV
SENTIO, The Conscious Empathy Interface: a deep-tech humanoid empathy system by DevSora, combining AI cognition (AETHER) and expressive robotics (SENTIO) into a unified platform that perceives, understands, and responds with human-like emotional intelligence.
RoseVZ
RoboJackson is an end-to-end framework that teaches humanoid robots expressive dance movements by learning from short real-world videos. The project combines 3D pose estimation, motion retargeting, and reinforcement learning to enable lifelike, whole-body robotic dancing, starting from YouTube Shorts.
Evm7
Webpage for "Unsupervised human-to-robot motion retargeting via expressive latent space" (Humanoids 2023)
Ke-Wang1017
No description available
ShipengLYU
Enhancing End-user Cognition in Human-Robot Interaction with Expressive Humanoid Robots
dbdxnuliba
No description available
expressive-humanoid
No description available
ESP32-based autonomous humanoid serving robot featuring dual operating modes, IMU-stabilized locomotion, sensor-based obstacle avoidance, expressive OLED eyes, and dashboard-controlled interaction.
darshandoijode04-netizen
ESP32-based autonomous humanoid serving robot featuring dual operating modes, IMU-stabilized locomotion, sensor-based obstacle avoidance, expressive OLED eyes, and dashboard-controlled interaction.
PrinceRaut01
Voice-controlled humanoid robot using Arduino Mega 2560, PCA9685 servo driver, and DF2301Q offline voice module. Executes expressive arm, eye, mouth, and shoulder movements based on voice commands via I2C and UART, with GPIO indicators for each action.
This project develops a Graph Neural Network that maps human facial landmarks to servo motor commands, enabling humanoid robots to mirror expressions in real time. Using MediaPipe FaceMesh, PyTorch Geometric, and the i2Head InMoov platform, it advances natural, expressive human–robot interaction.
ThakkarVidhi
NAOgest is a software application that explores how the NAO humanoid robot can use expressive hand gestures to improve conflict resolution in customer support. Developed with Webots, the simulation evaluates the effects of verbal and non-verbal behaviors on user trust, empathy, and satisfaction during stressful interactions.