Aravindhan Thaninayagam, Arizona State University, Tempe, AZ
Abstract
The goal of this project is to implement a Tic Tac Toe game in which a camera detects the player's moves and a 6-axis collaborative robot (cobot) executes the opposing moves. The program utilizes computer vision techniques to detect the game board and allows the player to make moves by placing objects in front of the camera. An AI-controlled cobot then makes its own moves on the physical board, creating a dynamic and interactive gaming experience.
1. Introduction
Tic Tac Toe, a timeless game of strategy, simplicity, and universal appeal, inspired us to explore innovative ways to enhance its gameplay. The project introduces a novel concept by integrating a 6-axis collaborative robot (cobot) into the traditional Tic Tac Toe setup. This integration not only challenges players' strategic thinking but also adds an element of dynamism and unpredictability to the game. The use of computer vision techniques, coupled with a robust AI algorithm, ensures that the cobot autonomously engages in the gameplay, making moves that keep players on their toes. This article delves into the methodologies, technologies, and outcomes of the project.
2. Implementation
The implementation comprises three main components: Object Detection, Cobot Control, and Game Logic. The Object Detection component, managed by the Object_detect class, uses OpenCV for color-based object detection and ArUco markers to locate the Tic Tac Toe board within the camera feed. Cobot Control is achieved through the pymycobot library, which enables precise movements of the 6-axis cobot; the cobot is initialized, and specific movements are programmed to pick and place game pieces on the board. The Game Logic component includes functions for initializing the board, drawing it on the camera feed, and checking for game outcomes such as a winner or a draw. An AI-controlled opponent, using the minimax algorithm, autonomously makes moves on the physical board. Together, these components integrate computer vision, robotics, and game logic to create an interactive Tic Tac Toe experience.
2.1. Object Detection
The Object_detect class, a pivotal element of the project, leverages OpenCV for color-based object detection. This, coupled with ArUco markers, provides an efficient means to identify the Tic Tac Toe board's position and orientation within the camera feed. This section explores the intricacies of implementing computer vision for seamless interaction.
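As a concrete illustration, the sketch below shows one way such board localization could look with OpenCV's ArUco module (opencv-contrib, using the OpenCV 4.7+ detector API). The dictionary choice, the assumption of marker IDs 0-3 placed at the board's corners, and the corner-ordering convention are illustrative assumptions rather than the project's exact code.

```python
import cv2
import numpy as np

# Minimal sketch: locate the board via ArUco markers assumed to sit at its corners.
# DICT_4X4_50 and marker IDs 0-3 are placeholder choices, not the project's values.
aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(aruco_dict, cv2.aruco.DetectorParameters())

def find_board(frame):
    """Return four board corner points in pixels, or None if the markers are not all visible."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None or len(ids) < 4:
        return None
    # Use each marker's centre as one corner of the playing area.
    centres = {int(i): c.reshape(4, 2).mean(axis=0) for i, c in zip(ids.flatten(), corners)}
    try:
        return np.float32([centres[k] for k in (0, 1, 2, 3)])  # assumed marker IDs 0-3
    except KeyError:
        return None
```

The returned corners can then feed a perspective transform so the rest of the pipeline works on a top-down view of the board.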
2.2. Cobot Control
Cobot Control, facilitated by the pymycobot library, enables precise and dynamic movements of the 6-axis cobot. By programming specific movements, we ensured that the cobot picks and places game pieces on the board with finesse. This segment provides insights into the challenges and triumphs of integrating robotics seamlessly into the gaming experience.
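The sketch below illustrates the general shape of such a pick-and-place routine with pymycobot. The serial port, speeds, joint angles, and coordinates are placeholders rather than the values used on the actual setup, and the gripper calls assume a standard myCobot gripper.

```python
import time
from pymycobot.mycobot import MyCobot

# Sketch of a pick-and-place move; port, speeds, and coordinates are placeholders.
mc = MyCobot("/dev/ttyUSB0", 115200)

HOME_ANGLES = [0, 0, 0, 0, 0, 0]                      # assumed "home" joint angles (degrees)
PIECE_PICKUP = [150.0, -60.0, 120.0, -180, 0, -90]    # assumed [x, y, z, rx, ry, rz] (mm/deg)

def pick_and_place(target_coords, speed=40):
    """Pick a piece from the fixed pickup location and drop it at target_coords."""
    mc.send_angles(HOME_ANGLES, speed)
    time.sleep(3)
    mc.send_coords(PIECE_PICKUP, speed, 1)   # move to the piece (linear mode)
    time.sleep(3)
    mc.set_gripper_state(1, speed)           # close gripper on the piece
    time.sleep(1)
    mc.send_coords(target_coords, speed, 1)  # move above the chosen cell
    time.sleep(3)
    mc.set_gripper_state(0, speed)           # open gripper to release the piece
    time.sleep(1)
    mc.send_angles(HOME_ANGLES, speed)       # return home so the camera view stays clear
```

Returning to a fixed home pose after every move also keeps the arm out of the camera's line of sight between turns.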
2.3. Game Logic
The core of the project lies in the Game Logic, where we implemented functions for initializing the board, drawing it on the camera feed, and checking for game outcomes. An AI-controlled opponent, powered by the minimax algorithm, adds an element of challenge and unpredictability. This section delves into the complexities of AI integration and its impact on user experience.
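For reference, a minimal sketch of this game-logic core is shown below: a flat nine-cell board, a winner check, and a plain minimax search that returns the cobot's best move. The symbols ("O" for the cobot, "X" for the player) and the scoring are illustrative and may differ from the project's implementation in detail.

```python
# Minimal sketch of the game logic: board state, winner check, and minimax search.
LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
         (0, 3, 6), (1, 4, 7), (2, 5, 8),
         (0, 4, 8), (2, 4, 6)]

def winner(board):
    """Return 'X' or 'O' if a line is complete, else None."""
    for a, b, c in LINES:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, is_cobot):
    """Return (score, move) for the side to play; the cobot ('O') maximizes."""
    win = winner(board)
    if win == "O":
        return 1, None
    if win == "X":
        return -1, None
    if " " not in board:
        return 0, None
    best = (-2, None) if is_cobot else (2, None)
    for i, cell in enumerate(board):
        if cell == " ":
            board[i] = "O" if is_cobot else "X"
            score, _ = minimax(board, not is_cobot)
            board[i] = " "
            if (is_cobot and score > best[0]) or (not is_cobot and score < best[0]):
                best = (score, i)
    return best

# Example: the cobot picks the best empty cell for the current position.
board = ["X", " ", " ", " ", "O", " ", " ", " ", "X"]
_, move = minimax(board, True)
```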
2.4. Human-Computer Interaction
A crucial aspect of the project is how users interact with the game. This involves placing objects in front of the camera to make moves, with the cobot responding dynamically. Exploring the nuances of this human-computer interaction adds depth to the understanding of user experience design.
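One plausible way to read such a move is sketched below: segment the player's piece by color in a perspective-corrected, top-down view of the board and map the blob's center to one of the nine cells. The HSV range, view size, and area threshold are assumed values for illustration, not the project's tuned parameters.

```python
import cv2
import numpy as np

# Sketch of reading the player's move from a top-down view of the board.
LOWER, UPPER = np.array([40, 80, 80]), np.array([80, 255, 255])  # assumed green piece

def detect_player_move(board_view, size=300):
    """board_view: size x size image of the warped board. Returns a cell index 0-8, or None."""
    hsv = cv2.cvtColor(board_view, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    c = max(contours, key=cv2.contourArea)
    if cv2.contourArea(c) < 200:              # ignore specks of noise
        return None
    m = cv2.moments(c)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    col = min(2, int(cx // (size / 3)))
    row = min(2, int(cy // (size / 3)))
    return row * 3 + col
```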
3. Results and Discussion
The project integrates computer vision, robotics, and AI to create an interactive Tic Tac Toe game. However, the journey was not without its challenges. In this section, we discuss the outcomes of the project, highlighting both successes and areas for improvement.
3.1. Object Detection Challenges
Implementing object detection posed several challenges, with calibration being a critical aspect. Achieving accurate calibration between the camera and the cobot's movements was crucial for precise interactions. Fine-tuning this calibration proved to be a meticulous process, as small discrepancies could result in misalignments between the virtual and physical Tic Tac Toe boards. This aspect was particularly challenging due to the need for real-time adjustments based on lighting conditions and the placement of the game board.
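One common way to establish such a camera-to-cobot mapping, shown here only as an illustrative sketch rather than the project's actual calibration procedure, is to record a few pixel-to-cobot correspondences by hand and fit an affine transform. The correspondence values below are placeholders.

```python
import numpy as np
import cv2

# Sketch of a pixel-to-workspace calibration: measure a few cell centres in the
# image and the cobot (x, y) that reaches them, then fit an affine map.
pixel_pts = np.float32([[110, 95], [300, 100], [115, 290]])              # measured in the image
robot_pts = np.float32([[180.0, -60.0], [180.0, 60.0], [120.0, -60.0]])  # measured on the cobot (mm)

A = cv2.getAffineTransform(pixel_pts, robot_pts)  # 2x3 affine matrix

def pixel_to_robot(px, py):
    """Map a pixel coordinate to cobot (x, y) in mm using the fitted affine map."""
    x, y = A @ np.array([px, py, 1.0])
    return float(x), float(y)
```

With more than three correspondences, a least-squares fit (or a full homography) would absorb small measurement errors, which is one way to make the calibration less sensitive to the discrepancies described above.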
3.2. Accuracy in Predicting Box Coordinates
One of the central elements of the project was predicting the coordinates of the Tic Tac Toe grid boxes. While the system demonstrated commendable accuracy, occasional discrepancies were observed, especially under varying lighting conditions. This inconsistency in predicting box coordinates introduced an element of unpredictability that affected the overall gaming experience. This section discusses the strategies employed to enhance accuracy, considering factors such as ambient lighting and object placement; one possible mitigation is sketched below.
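For example, frame brightness can be equalized before color thresholding so that a fixed HSV range remains usable as ambient light changes. This CLAHE-based sketch is illustrative and not necessarily the strategy used in the project.

```python
import cv2

# Illustrative mitigation for varying ambient light: equalise the V channel with
# CLAHE before colour thresholding so a fixed HSV range keeps working.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))

def normalise_brightness(frame_bgr):
    """Return a copy of the frame with illumination changes flattened."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    h, s, v = cv2.split(hsv)
    v = clahe.apply(v)
    return cv2.cvtColor(cv2.merge([h, s, v]), cv2.COLOR_HSV2BGR)
```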
3.3. Precision Issues During Cobot Movements
The cobot's precision during box placement was another facet that required careful consideration. Despite meticulous programming, there were instances where the cobot placed game pieces slightly off-center or across grid lines. This was particularly noticeable during rapid movements, making it harder to keep the virtual and physical boards synchronized. Addressing this precision issue involved a combination of refining the cobot's movement routines and improving the coordination between the camera feed and cobot control. We elaborate on the strategies employed to minimize these discrepancies and enhance overall precision; one simple measure of this kind is sketched below.
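For instance, placement targets can be snapped to the geometric center of the chosen cell, derived from the calibrated board layout, rather than taken from a raw detection. The origin, cell pitch, and axis orientation below are hypothetical values for illustration only.

```python
# Sketch: always command the cobot to a cell's geometric centre instead of a raw
# detected position. Origin, pitch, and axis orientation are placeholder assumptions.
def cell_centre_robot_coords(cell, origin=(120.0, -60.0), pitch=40.0):
    """cell: 0-8. origin: cobot (x, y) of cell 0's centre in mm; pitch: cell spacing in mm."""
    row, col = divmod(cell, 3)
    return origin[0] + row * pitch, origin[1] + col * pitch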
3.4. Human-Computer Interaction Dynamics
The dynamics of human-computer interaction played a pivotal role in the success of the project. The user's ability to make moves by placing objects in front of the camera was a key element. However, ensuring a seamless interaction experience required addressing challenges such as occlusions, where objects could temporarily block the camera's view, leading to momentary disruptions. This section explores the strategies employed to enhance the user experience and minimize such disruptions during gameplay; one simple debouncing approach is sketched below.
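A simple way to tolerate brief occlusions, sketched below as an assumption rather than the project's exact mechanism, is to debounce detections so a move is only accepted after the same cell has been reported for several consecutive frames.

```python
# Sketch of a detection debounce for momentary occlusions.
REQUIRED_FRAMES = 15  # assumed threshold, roughly 0.5 s at 30 fps

class MoveDebouncer:
    def __init__(self):
        self.last_cell, self.count = None, 0

    def update(self, cell):
        """Feed the per-frame detection (cell index or None); return a confirmed cell or None."""
        if cell is not None and cell == self.last_cell:
            self.count += 1
        else:
            self.last_cell, self.count = cell, 0
        return cell if self.count >= REQUIRED_FRAMES else None
```

The game loop would additionally check that a confirmed cell is empty before registering it as the player's move.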
3.5. Future Considerations and Improvements
As we reflect on the outcomes and challenges faced, it becomes evident that future iterations of the project could benefit from advancements in computer vision algorithms, robust calibration methodologies, and enhanced cobot precision. We discuss potential avenues for improvement, including exploring advanced AI techniques for predicting box coordinates and refining the cobot's movement algorithms. Overall, while the project successfully integrates multiple technologies to create an interactive Tic Tac Toe game, addressing these challenges opens avenues for further research and development. The intersection of computer vision, robotics, and AI continues to offer exciting possibilities for enhancing gaming experiences.
4. Conclusion
In conclusion, the project not only successfully combined computer vision, robotics, and game logic to create an interactive Tic Tac Toe game but also revealed the complexities inherent in such interdisciplinary endeavors. As we celebrate the achievements, we recognize the journey as a stepping stone, inviting future exploration into enhanced calibration methodologies and precision in cobot movements. This project stands as a testament to the potential of technology to elevate traditional games into captivating, immersive experiences, marking a commitment to pushing the boundaries of innovation in human-computer interaction. Future improvements could include refining the object detection algorithms and exploring additional game features.
Acknowledgments
Sincere thanks to Prof. Sangram Redkar for the course, and to the graders Rohith, Ranga, and Tatwik for their constant support throughout the lab sessions.