human-computer-interaction

There are 750 repositories under the human-computer-interaction topic.

  • MVIG-SJTU/AlphaPose

Real-Time and Accurate Full-Body Multi-Person Pose Estimation & Tracking System

    Language: Python
  • xinghaochen/awesome-hand-pose-estimation

    Awesome work on hand pose estimation/tracking

    Language: Python
  • mpatacchiola/deepgaze

    Computer vision library for human-computer interaction. It implements head pose and gaze direction estimation using convolutional neural networks, skin detection through backprojection, motion detection and tracking, and saliency maps. A short backprojection sketch follows this entry.

    Language: Python
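
    To illustrate the backprojection-based skin detection that deepgaze implements, here is a minimal OpenCV sketch of the technique. It is not the library's own API; the image file names are placeholders.

    ```python
    # Skin detection via histogram backprojection (conceptual sketch, not deepgaze's API).
    import cv2

    sample = cv2.imread("skin_sample.png")   # placeholder: small patch of skin pixels
    frame = cv2.imread("frame.png")          # placeholder: image to search for skin

    # Build a hue-saturation histogram of the skin sample in HSV space.
    sample_hsv = cv2.cvtColor(sample, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([sample_hsv], [0, 1], None, [30, 32], [0, 180, 0, 256])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

    # Backproject the histogram onto the frame: bright pixels are skin-like.
    frame_hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([frame_hsv], [0, 1], hist, [0, 180, 0, 256], 1)

    # Smooth and threshold the backprojection to obtain a binary skin mask.
    disc = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (9, 9))
    backproj = cv2.filter2D(backproj, -1, disc)
    _, mask = cv2.threshold(backproj, 50, 255, cv2.THRESH_BINARY)
    cv2.imwrite("skin_mask.png", mask)
    ```
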
  • b12io/orchestra

    Orchestra is a human-in-the-loop AI system for orchestrating project teams of experts and machines.

    Language: Python
  • charliegerard/gaze-detection

    👀 Use machine learning in JavaScript to detect eye movements and build gaze-controlled experiences.

    Language: JavaScript
  • DmitryRyumin/AAAI-2024-Papers

    AAAI 2024 Papers: Explore a comprehensive collection of innovative research papers presented at one of the premier artificial intelligence conferences. Seamlessly integrate code implementations for better understanding. ⭐ Experience the forefront of progress in artificial intelligence with this repository!

    Language: Python
  • Kingjux/Venocyber-md

    Introducing Venocyber MD bot, the personal chuddybuddy MD you were looking for: a powerful WhatsApp chat bot created to cover all your personal WhatsApp needs in one place ✍️👋👋

    Language: JavaScript
  • MVIG-SJTU/WSHP

    Code for CVPR'18 spotlight "Weakly and Semi Supervised Human Body Part Parsing via Pose-Guided Knowledge Transfer"

    Language: Python
  • alexanderkiel/phrase

    Clojure(Script) library for phrasing spec problems.

    Language: Clojure
  • RylanBot/awesome-hands-control

    An application based on gesture recognition for controlling desktop software | Custom control of computer programs through gesture recognition

    Language: TypeScript
  • quickpose/quickpose-ios-sdk

    Quickly add MediaPipe Pose Estimation and Detection to your iOS app. Enable powerful body- and hand-driven features in your app.

    Language: Swift
  • AmrMKayid/awesome-affective-computing

    A curated list of awesome affective computing 🤖❤️ papers, software, open-source projects, and resources

  • HARPLab/DReyeVR

    VR driving 🚙 + eye tracking 👀 simulator based on CARLA for driving interaction research

    Language: C++
  • phantasmlabs/phantasm

    Toolkit for creating a human-in-the-loop approval layer to monitor and guide AI agent workflows in real time; a minimal sketch of the approval pattern follows this entry.

    Language: Svelte
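
    The human-in-the-loop approval pattern behind Phantasm can be sketched in a few lines of Python. This is a generic illustration, not Phantasm's API; every function name here is hypothetical.

    ```python
    # Generic human-in-the-loop approval gate (hypothetical names, not Phantasm's API).
    def request_approval(action: str, payload: dict) -> bool:
        """Ask a human reviewer to approve a proposed agent action (console stand-in)."""
        print(f"Agent wants to run {action!r} with {payload}")
        return input("Approve? [y/N] ").strip().lower() == "y"

    def send_email(payload: dict) -> None:
        print(f"Sending email to {payload['to']}")

    # The agent proposes an action; it only executes once a human signs off.
    proposed = {"to": "customer@example.com", "subject": "Refund approved"}
    if request_approval("send_email", proposed):
        send_email(proposed)
    else:
        print("Action rejected by the reviewer; the agent continues without it.")
    ```
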
  • TobiasRoeddiger/GazePointHeatMap

    Easy-to-use Python command-line tool to generate a gaze-point heatmap from a CSV file (a conceptual sketch follows this entry). 👁️

    Language: Python
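
    The idea of turning a CSV of gaze points into a heatmap can be sketched as below. This is a conceptual example, not the tool's actual command-line interface; the column names, screen size, and blur width are assumptions.

    ```python
    # Gaze-point heatmap from a CSV (conceptual sketch; columns "x"/"y" are assumed).
    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt
    from scipy.ndimage import gaussian_filter

    points = pd.read_csv("gaze.csv")        # placeholder file with x/y pixel columns
    width, height = 1920, 1080              # assumed screen resolution

    # Bin gaze points into a 2D histogram the size of the screen.
    heat, _, _ = np.histogram2d(
        points["y"], points["x"],
        bins=[height, width], range=[[0, height], [0, width]],
    )

    # Blur the counts so individual fixations spread into a smooth heatmap.
    heat = gaussian_filter(heat, sigma=40)

    plt.imshow(heat, cmap="hot")
    plt.axis("off")
    plt.savefig("heatmap.png", bbox_inches="tight", dpi=150)
    ```
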
  • PedroLopes/openEMSstim

    openEMSstim: open-hardware module to adjust the intensity of EMS/TENS stimulators.

    Language: HTML
  • aruneshmathur/dark-patterns

    Code and data belonging to our CSCW 2019 paper: "Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites".

    Language: Jupyter Notebook
  • emexlabs/WearableIntelligenceSystem

    Wearable computing software framework for intelligence augmentation research and applications. Easily build smart glasses apps relying on built-in voice commands, speech recognition, computer vision, UI, sensors, smartphone connection, NLP, facial recognition, database, cloud connection, and more. This repo is in beta.

    Language: C++
  • Svito-zar/gesticulator

    The official implementation for ICMI 2020 Best Paper Award "Gesticulator: A framework for semantically-aware speech-driven gesture generation"

    Language: Python
  • takeyamayuki/NonMouse

    A webcam-based virtual gesture mouse that is easy to use with your hands on the desk; a conceptual sketch follows this entry.

    Language: Python
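
    A gesture mouse of this kind can be approximated with MediaPipe Hands and pyautogui. The sketch below only illustrates the idea and is not NonMouse's actual implementation; the confidence threshold and exit key are assumptions.

    ```python
    # Webcam gesture mouse sketch: move the cursor with the index fingertip.
    import cv2
    import mediapipe as mp
    import pyautogui

    screen_w, screen_h = pyautogui.size()
    hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
    cap = cv2.VideoCapture(0)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB frames; webcams deliver BGR.
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            # Landmark 8 is the index fingertip; coordinates are normalized to [0, 1].
            tip = result.multi_hand_landmarks[0].landmark[8]
            pyautogui.moveTo(int(tip.x * screen_w), int(tip.y * screen_h))
        cv2.imshow("gesture mouse sketch", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break

    cap.release()
    cv2.destroyAllWindows()
    ```
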
  • DirtyHarryLYL/HAKE-Action

    As part of the HAKE project, this repository includes the reproduced SOTA models and the corresponding HAKE-enhanced versions (CVPR 2020).

  • tpetricek/teaching

    :mortar_board: Materials for my lectures, including programming language design, software engineering, and human-computer interaction.

    Language: F#
  • mrezaei92/TriHorn-Net

    Official PyTorch implementation of TriHorn-Net

    Language: Python
  • abikaki/awesome-speech-emotion-recognition

    😎 Awesome lists about Speech Emotion Recognition

  • srogatch/ProbQA

    Probabilistic question-asking system: the program asks, the users answer. The minimal goal of the program is to identify what the user needs (a target), even if the user is not aware that such a thing/product/service exists. A toy Bayesian sketch follows this entry.

    Language: C++
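
    The core idea, maintaining a posterior over candidate targets and updating it with Bayes' rule after every answer, can be shown with a toy Python example. The targets, questions, and likelihoods below are invented for illustration; the actual ProbQA engine is far more elaborate.

    ```python
    # Toy Bayesian question-asking: update a posterior over targets from yes/no answers.
    targets = ["laptop", "bicycle", "novel"]
    prior = {t: 1.0 / len(targets) for t in targets}

    # P(answer == "yes" | question, target), chosen by hand for this example.
    likelihood = {
        "Is it electronic?": {"laptop": 0.95, "bicycle": 0.05, "novel": 0.02},
        "Can you read it?":  {"laptop": 0.30, "bicycle": 0.01, "novel": 0.98},
    }

    def update(posterior, question, answer):
        """Return the posterior after observing a yes/no answer to a question."""
        new = {}
        for target, p in posterior.items():
            p_yes = likelihood[question][target]
            new[target] = p * (p_yes if answer == "yes" else 1.0 - p_yes)
        total = sum(new.values())
        return {t: p / total for t, p in new.items()}

    posterior = update(prior, "Is it electronic?", "no")
    posterior = update(posterior, "Can you read it?", "yes")
    print(max(posterior, key=posterior.get))  # most probable target: "novel"
    ```
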
  • manikandan-ravikiran/HCI_Notes

    Notes for Human Computer Interaction course - CS6750

  • aadilkhalifa/virtual-try-on

    Augmented Reality (AR) app for shoe try-on and foot size measurement

    Language: Dart
  • lorenz-liu/awesome-hai

    All about human-AI interaction (HCI + AI).

  • chychen/BasketballGAN

    Basketball coaches often sketch plays on a whiteboard to help players get the ball through the net. A new AI model predicts how opponents would respond to these tactics.

    Language: Python
  • CrowdTruth/CrowdTruth-core

    CrowdTruth framework for crowdsourcing ground truth for training & evaluation of AI systems

    Language: Jupyter Notebook
  • montybot/FACSHuman

    FACSHuman plugin for MakeHuman project

    Language: Python
  • Collection-Space-Navigator/CSN

    Interactive Visualization Interface for Multidimensional Datasets

    Language: JavaScript
  • xiumingzhang/mosculp-demo-ui

    Demo for "MoSculp: Interactive Visualization of Shape and Time"

    Language: Python
  • ahmetozlu/human_computer_interaction

    Fist, palm, and hand detection & tracking for intelligent human-computer interaction: game character movement control with OpenCV in Java (Processing sketchbook).

    Language: Processing
  • FIGLAB/Vid2Doppler

    This is the research repository for Vid2Doppler: Synthesizing Doppler Radar Data from Videos for Training Privacy-Preserving Activity Recognition.

    Language: Python
  • NarendrenSaravanan/Emotion-Recognition-using-Python

    The Python code detects facial landmarks and predicts expressions such as a smile from them. It automatically takes a photo of the person when they smile; when both eyebrows are raised, the system plays music, and the music stops when the right eye blinks. A simplified sketch follows this entry.

    Language: Python
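
    A simplified version of the smile-triggered snapshot behaviour can be written with OpenCV Haar cascades, as sketched below. This is not the repository's landmark-based code; the cascade thresholds and output file name are assumptions, and the eyebrow/blink triggers are omitted.

    ```python
    # Smile-triggered snapshot sketch using OpenCV Haar cascades (not the repo's code).
    import cv2

    face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    smile_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_smile.xml")
    cap = cv2.VideoCapture(0)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
            # Look for a smile only inside the detected face region.
            roi = gray[y:y + h, x:x + w]
            smiles = smile_cascade.detectMultiScale(roi, scaleFactor=1.7, minNeighbors=20)
            if len(smiles) > 0:
                cv2.imwrite("smile_snapshot.jpg", frame)  # take the photo on a smile
        cv2.imshow("smile demo", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break

    cap.release()
    cv2.destroyAllWindows()
    ```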