A cutting-edge 3D hand gesture-controlled game built with React 18, Babylon.js, and TensorFlow.js. Experience intuitive hand tracking with real-time gesture recognition to control interactive 3D objects in a virtual environment.
- Real-time Hand Detection: 60fps gesture recognition using the MediaPipe Hands model via TensorFlow.js
- Precise Overlay Alignment: Hand landmark overlays mapped accurately onto the camera feed
- Multiple Gesture Types: Open hand, closed fist, pinch gestures with confidence scoring
- 3D Motion Mode: Full spatial hand tracking with calibration system
- Adaptive Detection: Automatic adjustment for different lighting conditions
- Temporal Smoothing: Reduces jitter and improves tracking stability (see the smoothing sketch after this feature list)
- Debug Tools: Comprehensive coordinate debugging utilities for development
- Babylon.js 3D Engine: High-performance WebGL rendering
- Interactive Cube: Responds to hand gestures with visual feedback
- Physics Simulation: Realistic movement with gravity, friction, and collision detection
- Dynamic Lighting: Adaptive lighting system with multiple presets
- Smooth Animations: Framer Motion powered UI transitions
- Minimalistic Design: Clean, dark theme with accent colors
- Real-time Performance HUD: FPS, latency, and tracking quality metrics
- Status Indicators: Visual feedback for system components
- Responsive Layout: Works on desktop and mobile devices
- Error Boundaries: Comprehensive error handling with user-friendly messages
- WebGL Context Recovery: Automatic handling of graphics context loss
- Calibration System: Personalized 3D motion tracking setup
- Performance Monitoring: Real-time metrics and optimization
- State Management: Zustand-powered centralized state
- TypeScript Ready: Full type support for development
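The temporal smoothing feature above can be pictured as an exponential moving average over successive landmark frames. The sketch below is only illustrative; `alpha` and `smoothLandmarks` are assumed names, not the project's actual API in `src/core/`.

```javascript
// Minimal temporal smoothing sketch: exponential moving average over landmarks.
// `alpha` and `smoothLandmarks` are illustrative, not the project's real API.
const alpha = 0.6; // higher = more responsive, lower = smoother

function smoothLandmarks(previous, current) {
  if (!previous) return current;
  return current.map((point, i) => ({
    x: alpha * point.x + (1 - alpha) * previous[i].x,
    y: alpha * point.y + (1 - alpha) * previous[i].y,
    z: alpha * (point.z ?? 0) + (1 - alpha) * (previous[i].z ?? 0),
  }));
}
```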
- Node.js: Version 16.0 or higher
- Yarn: Package manager (required - npm not supported)
- Modern Browser: Chrome, Firefox, Safari, or Edge with WebGL 2.0 support
- Webcam: Required for hand tracking
- Good Lighting: Recommended for optimal hand detection
- Clone the repository
  ```bash
  git clone https://github.com/your-username/3d-hand-pose-game.git
  cd 3d-hand-pose-game
  ```
- Install dependencies
  ```bash
  yarn install
  ```
- Start the development server
  ```bash
  yarn dev
  ```
- Open your browser: navigate to `http://localhost:3002` (or the port shown in your terminal)
- Grant camera permissions: allow webcam access when prompted by your browser
- ✋ Open Hand: Move the cube by moving your hand
- ✊ Closed Fist: Grab and drag the cube with enhanced control
- 🤏 Pinch: Scale the cube by changing finger spread distance
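As a rough sketch of how such gestures can be derived from hand landmarks, the snippet below classifies pinch, open hand, and fist from MediaPipe-style keypoint distances. The thresholds and the `classifyGesture` helper are illustrative assumptions, not the project's actual gesture recognizer.

```javascript
// Illustrative gesture classification from MediaPipe-style landmarks
// (21 keypoints per hand). Thresholds and names are assumptions.
function distance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

function classifyGesture(landmarks) {
  const wrist = landmarks[0];
  const thumbTip = landmarks[4];
  const indexTip = landmarks[8];
  const palmSize = distance(wrist, landmarks[9]); // wrist to middle-finger MCP

  if (distance(thumbTip, indexTip) < 0.3 * palmSize) return 'pinch';

  // Average fingertip distance from the wrist, relative to palm size
  const tips = [8, 12, 16, 20].map((i) => distance(wrist, landmarks[i]));
  const spread = tips.reduce((sum, d) => sum + d, 0) / tips.length / palmSize;

  return spread > 1.6 ? 'open' : 'fist';
}
```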
- Click the "3D Motion" toggle in the top-left corner
- Follow the calibration process to set up your interaction space
- Enjoy full 3D spatial control with hand orientation tracking
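Conceptually, calibration records the range your hand covers on each axis and remaps it onto the scene's interaction bounds. The sketch below shows one way that mapping could look; the `calibration` and `sceneBounds` shapes are assumptions, not the real store fields.

```javascript
// Illustrative calibration mapping: remap the hand range captured during
// calibration (min/max per axis) onto the scene's interaction bounds.
function remap(value, inMin, inMax, outMin, outMax) {
  const t = (value - inMin) / (inMax - inMin || 1);
  return outMin + Math.min(Math.max(t, 0), 1) * (outMax - outMin);
}

function handToScene(hand, calibration, sceneBounds) {
  return {
    x: remap(hand.x, calibration.minX, calibration.maxX, sceneBounds.minX, sceneBounds.maxX),
    y: remap(hand.y, calibration.minY, calibration.maxY, sceneBounds.maxY, sceneBounds.minY), // screen y is inverted
    z: remap(hand.z, calibration.minZ, calibration.maxZ, sceneBounds.minZ, sceneBounds.maxZ),
  };
}
```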
- Optimal Positioning: Keep your hand centered in the camera view for best tracking accuracy
- Lighting: Ensure good, even lighting conditions without harsh shadows
- Distance: Maintain a distance of 1-3 feet from the camera for optimal detection
- Background: Use a contrasting background to improve hand detection
- Movement: Use smooth, deliberate movements for better tracking stability
- Calibration: Use 3D Motion calibration for personalized tracking optimization
```
src/
├── components/   # React components
├── core/         # Core hand tracking logic
├── 3d/           # Babylon.js 3D scene management
├── hooks/        # Custom React hooks
├── objects/      # 3D object classes
├── store/        # Zustand state management
├── utils/        # Utility functions
└── styles/       # CSS and styling
```
- React 18: Modern React with concurrent features
- Babylon.js 6.38: 3D graphics engine
- TensorFlow.js 4.15: Machine learning for hand detection
- Framer Motion: Animation library
- Tailwind CSS: Utility-first CSS framework
- Zustand: Lightweight state management
- Vite: Fast build tool and development server
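For context on the Zustand entry above, the centralized store in this stack typically looks like the minimal sketch below. Only the `create` call is Zustand's API; the state fields are assumptions and will differ from the actual store in `src/store/`.

```javascript
import { create } from 'zustand';

// Minimal store sketch; field names are assumptions, not the project's real state shape.
export const useAppStore = create((set) => ({
  gesture: 'none',
  trackingQuality: 0,
  is3DMotionEnabled: false,
  setGesture: (gesture) => set({ gesture }),
  setTrackingQuality: (trackingQuality) => set({ trackingQuality }),
  toggle3DMotion: () =>
    set((state) => ({ is3DMotionEnabled: !state.is3DMotionEnabled })),
}));
```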
```bash
# Build the project
yarn build

# Preview the production build
yarn preview
```
- CPU: Modern multi-core processor
- GPU: Dedicated graphics card recommended
- RAM: 4GB minimum, 8GB recommended
- Browser: Latest version with WebGL 2.0 support
- Adaptive quality settings based on performance
- Efficient hand detection with configurable frame rates
- WebGL context recovery for stability
- Memory management for long sessions
Create a `.env` file in the root directory:

```env
VITE_HAND_DETECTION_FPS=30
VITE_ENABLE_DEBUG_MODE=false
VITE_DEFAULT_QUALITY=high
```
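Vite exposes `VITE_`-prefixed variables to the client as strings on `import.meta.env`, so the app can read them roughly like the sketch below (the fallback values are assumptions mirroring the example above).

```javascript
// Read Vite environment variables; all values arrive as strings.
const detectionFps = Number(import.meta.env.VITE_HAND_DETECTION_FPS ?? 30);
const debugMode = import.meta.env.VITE_ENABLE_DEBUG_MODE === 'true';
const defaultQuality = import.meta.env.VITE_DEFAULT_QUALITY ?? 'high';

export const config = { detectionFps, debugMode, defaultQuality };
```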
- Modify `src/core/HandDetectionEngine.js` for detection parameters
- Adjust 3D scene settings in `src/3d/SceneManager.js`
- Customize UI themes in `tailwind.config.js`
Hand detection not working
- Check camera permissions in browser settings
- Ensure good lighting conditions
- Try refreshing the page
- Check browser console for errors
Poor performance
- Lower quality settings in the performance HUD
- Close other browser tabs
- Update graphics drivers
- Use a dedicated graphics card if available
3D scene not loading
- Verify WebGL 2.0 support: visit webglreport.com
- Update your browser to the latest version
- Disable browser extensions that might interfere
- Try incognito/private browsing mode
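As a quick alternative to webglreport.com, you can also check WebGL 2.0 support directly from the browser console:

```javascript
// Quick WebGL 2.0 check you can paste into the browser console.
const canvas = document.createElement('canvas');
const gl = canvas.getContext('webgl2');
console.log(gl ? 'WebGL 2.0 is supported' : 'WebGL 2.0 is NOT supported');
```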
- ✅ Chrome 90+
- ✅ Firefox 88+
- ✅ Safari 14+
- ✅ Edge 90+
We welcome contributions! Please see our Contributing Guidelines for details.
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Make your changes and test thoroughly
- Commit your changes: `git commit -m 'Add amazing feature'`
- Push to the branch: `git push origin feature/amazing-feature`
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- TensorFlow.js Team for the MediaPipe hand tracking model
- Babylon.js Community for the excellent 3D engine
- React Team for the amazing framework
- Open Source Community for inspiration and tools
- 📧 Email: [email protected]
- Video Capture: Real-time webcam feed processing
- MediaPipe Model: TensorFlow.js hand landmark detection
- Gesture Classification: Custom gesture recognition algorithms
- 3D Coordinate Mapping: Transform 2D landmarks to 3D space
- Smoothing & Filtering: Temporal smoothing for stable tracking
- Scene Setup: Babylon.js engine initialization
- Object Creation: Interactive 3D objects with physics
- Lighting System: Dynamic lighting with multiple presets
- Animation System: Smooth interpolation and transitions
- Render Loop: Optimized 60fps rendering
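The rendering side reduces to a familiar Babylon.js setup; the sketch below covers only the bare scene, camera, light, cube, and render loop, while the project's `src/3d/SceneManager.js` layers physics, lighting presets, and gesture bindings on top.

```javascript
import * as BABYLON from '@babylonjs/core';

// Bare-bones Babylon.js scene with an interactive cube and a render loop.
export function createScene(canvas) {
  const engine = new BABYLON.Engine(canvas, true);
  const scene = new BABYLON.Scene(engine);

  const camera = new BABYLON.ArcRotateCamera(
    'camera', -Math.PI / 2, Math.PI / 2.5, 6, BABYLON.Vector3.Zero(), scene
  );
  camera.attachControl(canvas, true);

  new BABYLON.HemisphericLight('light', new BABYLON.Vector3(0, 1, 0), scene);
  const cube = BABYLON.MeshBuilder.CreateBox('cube', { size: 1 }, scene);

  engine.runRenderLoop(() => scene.render());
  window.addEventListener('resize', () => engine.resize());

  return { engine, scene, cube };
}
```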
- Adaptive Quality: Dynamic quality adjustment based on performance
- Frame Rate Control: Configurable detection and rendering rates
- Memory Management: Efficient resource cleanup and disposal
- WebGL Recovery: Automatic context restoration on graphics errors
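Two of these optimizations map directly onto Babylon.js Engine APIs, as sketched below with illustrative thresholds (the project's actual tuning may differ).

```javascript
// Sketch of adaptive quality and context-loss handling using Babylon.js Engine APIs.
export function applyPerformanceGuards(engine, scene) {
  // Adaptive quality: render at a lower internal resolution when FPS drops.
  scene.onAfterRenderObservable.add(() => {
    const fps = engine.getFps();
    if (fps < 30) engine.setHardwareScalingLevel(1.5);      // render smaller, upscale
    else if (fps > 55) engine.setHardwareScalingLevel(1.0); // full resolution
  });

  // WebGL context recovery: Babylon raises observables on context loss/restore.
  engine.onContextLostObservable.add(() => console.warn('WebGL context lost'));
  engine.onContextRestoredObservable.add(() => console.info('WebGL context restored'));
}
```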
```bash
# Run all tests
yarn test

# Run tests in watch mode
yarn test:watch

# Run tests with coverage
yarn test:coverage
```
- Unit tests for core hand detection logic
- Integration tests for 3D scene interactions
- Performance benchmarks for optimization
- Browser compatibility testing
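A unit test for the gesture logic might look like the sketch below, assuming a Vitest-style runner and a hypothetical `classifyGesture` module path; adjust the import to the project's real structure.

```javascript
import { describe, it, expect } from 'vitest';

// Hypothetical module path used for illustration only.
import { classifyGesture } from '../src/core/gestures';

describe('classifyGesture', () => {
  it('reports a pinch when thumb and index fingertips touch', () => {
    // 21 MediaPipe-style landmarks; only the ones the classifier reads matter here.
    const landmarks = Array.from({ length: 21 }, () => ({ x: 0.5, y: 0.6, z: 0 }));
    landmarks[0] = { x: 0.5, y: 0.9, z: 0 };   // wrist
    landmarks[9] = { x: 0.5, y: 0.6, z: 0 };   // middle-finger MCP (palm reference)
    landmarks[4] = { x: 0.50, y: 0.40, z: 0 }; // thumb tip
    landmarks[8] = { x: 0.51, y: 0.40, z: 0 }; // index tip

    expect(classifyGesture(landmarks)).toBe('pinch');
  });
});
```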
```bash
# Install Vercel CLI
yarn global add vercel

# Deploy
vercel
```
```bash
# Build the project
yarn build

# Deploy dist/ folder to Netlify
```
```dockerfile
FROM node:18-alpine
WORKDIR /app
COPY package.json yarn.lock ./
RUN yarn install --frozen-lockfile
COPY . .
RUN yarn build
EXPOSE 3002
CMD ["yarn", "preview"]
```
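With this Dockerfile in the project root, the image can be built with `docker build -t hand-pose-game .` and started with `docker run -p 3002:3002 hand-pose-game`; the `hand-pose-game` tag is an arbitrary example, and the port matches the `EXPOSE` line above.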
- Multi-hand tracking support
- Gesture sequence recognition
- Multiple 3D objects interaction
- Voice commands integration
- VR/AR compatibility
- Multiplayer support
- Custom gesture training
- Advanced physics simulation
- Mobile app version
Made with ❤️ by GreenHacker