human-machine-interface

There are 36 repositories under the human-machine-interface topic.

  • michpolicht/CuteHMI

    CuteHMI is open-source HMI (Human Machine Interface) software written in C++ and QML, using the Qt libraries as a framework. The GitHub repository is a mirror.

    Language: C++
  • NeuralAction/NeuralAction

    Neural Action is a real-time CNN-based gaze tracking application providing human-machine interface to improve accessibility.

    Language: C#
  • LMBooth/pybci

    Create real-time BCIs with the LSL, PyTorch, scikit-learn, and TensorFlow packages.

    Language: Python
  • cooelf/OpenIME

    Open Vocabulary Learning for Neural Chinese Pinyin IME (ACL 2020)

    Language: Lua
  • MitchellTesla/timechain

    🐝 Hive of The CyberHornets 🐝

    Language: M4
  • OpenSmock/Penfeld

    Penfeld is a User Interface (UI) definition model framework for Pharo.

    Language: Smalltalk
  • anion0278/mediapipe-jetson

    Google's MediaPipe (v0.8.9) and Python Wheel installer for Jetson Nano (JetPack 4.6) compiled for CUDA 10.2

    Language: C++
  • Amey-Thakur/ATVM-INTERFACE

    It is an interface for an Automatic Ticket Vending Machine.

    Language: JavaScript
  • Amey-Thakur/HUMAN-MACHINE-INTERACTION-AND-HUMAN-MACHINE-INTERACTION-LAB

    CSC801: Human Machine Interaction [HMI] & CSL801: Human Machine Interaction Lab [HMI Lab] <Semester VIII>

    Language: JavaScript
  • rparak/Unity3D_Robotics_ACOPOStrak

    A digital-twin of the ACOPOStrak transport system integrated into the Unity3D development platform.

    Language: Standard ML
  • GiovannyJTT/vr-app-handtracking

    Virtual Reality application implemented as part of my Master's thesis, using an Oculus Rift DK2, Leap Motion, Unity 3D, Nvidia 3D Vision glasses, and hand tracking for computer-human interaction. Hands are an indispensable way for humans to interact with their environment in daily life. Letting users interact with their hands inside a virtual world raises the degree of immersion and provides a natural form of interaction in virtual reality applications. This thesis develops a virtual reality application that combines stereoscopic visualization with hand interaction, and compares two stereoscopic visualization modes to determine whether participants cope better with either of them. We observe whether, regardless of gender, age, or profession, participants prefer the mode with a virtual headset. Thesis: riunet.upv.es/handle/10251/77848

    Language: C#
  • ibe16/Distributed-chains-monitoring

    :factory: :bar_chart: Online monitoring system for distributed manufacturing lines

  • yahiayasser/Door_Locker

    Door_Locker uses two microcontrollers. The first acts as the HMI (Human Machine Interface): it has DIO, LCD, Keypad, and UART modules and is used only for interfacing with the user. The second is responsible for controlling the motor (the actuation part): it has DIO, Timer, DC-Motor, and UART modules and is used only to drive the motor that acts as the lock for the door. The scenario is as follows:

    1. First-use mode: the user is prompted to enter a password and confirm it on the keypad. If the two entries match, the password is saved to the internal EEPROM of the first microcontroller; otherwise the user is told the passwords do not match and is asked to re-enter them.

    2. Operating mode: the LCD shows the supported operations. (a) Open the door: the user is prompted for the password; after 4 failed trials, the user must wait 30 seconds before re-entering it. If the password is right, a message is sent over UART to the other microcontroller to open the door. (b) Change password: the user must first enter the old password.

    3. Opening the door: when the open message is received, the second microcontroller rotates the motor 0.5 in the clockwise direction.

    4. Closing the door: the first microcontroller shows "[1]Close Lock"; if the user enters 1, a message is sent to the other microcontroller, which rotates the motor 0.5 in the anticlockwise direction.

    The LCD on the first microcontroller reflects every action and state of the system; for example, while the door is opening it shows "Lock is opened".
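    The HMI-side unlock flow described above (password compare, 4-trial limit, 30-second lockout) can be sketched as a small pure function. This is a minimal illustration, not the repository's code: `lock_state_t`, `try_unlock`, and the 5-character password length are hypothetical names and assumptions.

    ```c
    #include <string.h>
    #include <stdint.h>

    #define PASS_LEN    5   /* assumed password length */
    #define MAX_TRIALS  4   /* failed trials before the 30 s lockout */

    /* Result of one unlock attempt on the HMI microcontroller. */
    typedef enum { UNLOCK_OK, UNLOCK_RETRY, UNLOCK_LOCKED_OUT } unlock_status_t;

    typedef struct {
        char    stored[PASS_LEN];  /* password as saved in EEPROM */
        uint8_t failed_trials;     /* consecutive wrong entries   */
    } lock_state_t;

    /* Compare an entered password against the stored one, enforcing
     * the 4-trial limit. On the 4th failure the caller is expected to
     * wait 30 seconds before calling again; the counter then resets. */
    unlock_status_t try_unlock(lock_state_t *s, const char *entered)
    {
        if (memcmp(s->stored, entered, PASS_LEN) == 0) {
            s->failed_trials = 0;
            return UNLOCK_OK;          /* caller sends "open" over UART */
        }
        if (++s->failed_trials >= MAX_TRIALS) {
            s->failed_trials = 0;      /* reset after the lockout wait */
            return UNLOCK_LOCKED_OUT;
        }
        return UNLOCK_RETRY;
    }
    ```

    Keeping the comparison in a side-effect-free function makes the trial/lockout logic testable on a host machine, independent of the LCD, keypad, and UART drivers.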

  • natanaelalmeida/ceptron-mov

    Ceptron-mov is a study project that aims to apply the concepts of Machine Learning and Data Science to build a human-machine interface capable of identifying physical movements performed by humans.

    Language: Jupyter Notebook
  • stevenlowery011/pymensor

    Python driver for Mensor Modular Pressure Controllers

    Language: Python
  • umutcvc/HMI

    Human machine interface with Unity & Arduino

    Language: C#
  • aman-ankam/GESTURE_HAWK

    Controlling a robot using hand gestures through image processing

    Language: Python
  • andres48381/UPM-domoticaHMI

    Processing-based program for displaying the home's control panel. Serial communication via Bluetooth with the Arduino control board of the scale model.

  • rparak/BaR_Robotic-table-football

    Robotic table football controlled by B&R Automation system.

    Language: C
  • rparak/JXCP1_BaR_SMC

    An open-source library for controlling JXCP1 (Step Motor Controller) via B&R Automation PLC.

    Language: C
  • UtBotsAtHome-UTFPR/display_emotions

    ROS package that displays emotions through faces and speech characteristics

    Language: C++
  • will-chung/tritonai-hmi

    Web-based human-machine interface for autonomous racecars.

    Language: Svelte
  • anion0278/dms_moveit

    Improving Mutual Understanding for Human-Robot Collaboration

    Language: C++
  • cemtorun/Tremble

    🏆 HackWestern 1st Place (Data Visualization) - 🏥 Developed a Human Machine Interface through position tracking to accurately diagnose degenerative disorders such as Parkinson's disease. Built using Leap Motion hardware integrated with an FFT-based ML algorithm in Python, with visualization in JavaScript and Angular 7.

    Language: Python
  • neves-nvs/IPM

    IPM Project repo

    Language: AMPL
  • prozedur/prozedur

    Engineer & Designer

  • santimontiel/HMI-Eurobot

    📺 Human-machine interface for the competition robots of the Universidad de Alcalá Robotics Team, competing in Eurobot 2021. Written in Python, integrating PyQt5 with ROS.

    Language: Python
  • alexi-courieux/DUT_S2A_SubPay

    This project is a training exercise in HMI; the objective was to create an application that lets customers place orders on a terminal.

    Language: Java
  • anion0278/mediapipe

    Fork of Google's MediaPipe (v0.8.9) and Python Wheel installer for Jetson Nano (JetPack 4.6), compiled for CUDA 10.2

    Language: C++
  • Axelvel/FittsLaw

    User Interface created in Qt to demonstrate Fitts' Law.

    Language: C++
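    As background for the demo above (standard HCI material, not taken from the repository itself), Fitts' Law predicts the movement time MT to acquire a target of width W at distance D; in its common Shannon formulation:

    ```latex
    % Fitts' Law, Shannon formulation:
    % MT   - predicted movement time
    % D    - distance to the target,  W - target width
    % a, b - constants fitted empirically per device and user
    MT = a + b \log_2\!\left(\frac{D}{W} + 1\right)
    ```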
  • ElsevierSoftwareX/SOFTX-D-20-00094

    A C++ pre-processor for CAD (STL) feature extraction. To cite this software, see: https://www.sciencedirect.com/science/article/pii/S2352711021000650

  • industrialtablet/7inch-Human-Machine-Interface

    This repository contains documentation for the Allwinner A133 7-inch Human Machine Interface from HYY Technology Co., Ltd.

  • kwonus/Quelle-Obsolete

    A clear and concise HMI specification with an open-source reference implementation, supplied with projects that compile for .NET Framework 4.8 and .NET 5.

    Language: C#
  • MathematicFirmsofMemphis/-NSN-987654321---MIL-STD-67890---MIL-SPEC-12345---PART67890-

    Visual Effects and Simulations of America LLC | Search Pressure and Spatial Redux Hunt Machinery LLC | Smart Superior Anatomy and Kardashev 9 Human Eating Hunting LLC

    Language: C#
  • Pratyush-Sharma/VR-Vision

    The VR Vision bot was showcased at TechEvince 5.0, IIT Guwahati, as a simple image-processing-based human-machine interface: a Pi Camera fitted to the bot lets the user see live reality through an Oculus headset.

    Language: C++
  • yolanda93/human_machine_interface

    The Human Machine Interface allows operators or users of the system to communicate bi-directionally with each aerial robotic agent.

    Language: Logos