SMILE: A Verbal and Graphical User Interface Tool for Speech-Control of Soccer Robots in Ghana
Vol. 6, No. 1, February 2019 | Pages: 1-4
Patrick Fiati, Electrical/Electronic Engineering, Cape Coast Technical University, Cape Coast, Ghana.
We present SMILE (Smartphone Intuitive Likeness and Engagement), a portable Android application that allows a human to control a robot using speech input. SMILE is a novel open-source, platform-independent tool that contributes to robot soccer research by allowing robot handlers to command robots verbally. The application resides on a smartphone embedded in the face of a humanoid robot; it uses a speech recognition engine to analyze the user's speech input, and it uses facial expressions and speech generation to give the user comprehension feedback. With the introduction of intuitive human-robot interaction into the arena of robot soccer, we discuss specific scenarios in which SMILE could improve both the pace of the game and the autonomous appearance of the robots.

The ability to communicate verbally is essential for any cooperative task, especially a fast-paced sport. In the game of soccer, players must speak with coaches, referees, and players on either team. Therefore, if humanoids are expected to compete on the same playing field as elite soccer players in the near future, then we must expect them to be treated like humans, which includes the ability to listen and converse. SMILE is the first platform-independent, smartphone-based tool to equip robots with these capabilities. Current humanoid soccer research focuses heavily on walking dynamics, computer vision, and intelligent systems, while human-robot interaction (HRI) is often overlooked. We address this gap by implementing SMILE, an Android application that sends data packets to the robot's onboard computer upon verbal interaction with a user.
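The abstract states that SMILE forwards recognized speech to the robot's onboard computer as data packets, but does not publish the wire format. The sketch below is a hypothetical illustration, not SMILE's actual protocol: it assumes a simple length-prefixed UTF-8 frame for a recognized utterance, of the kind a phone-side app could write to a TCP socket connected to the onboard computer. The class name, frame layout, and example command are all assumptions.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Hypothetical encoder for speech-command packets: a 2-byte
// big-endian length header followed by the UTF-8 payload.
public class CommandPacket {

    // Encode a recognized utterance (e.g. "kick left") into a frame
    // suitable for writing to a socket's OutputStream.
    public static byte[] encode(String utterance) {
        byte[] payload = utterance.getBytes(StandardCharsets.UTF_8);
        ByteBuffer buf = ByteBuffer.allocate(2 + payload.length);
        buf.putShort((short) payload.length); // header: payload size
        buf.put(payload);                     // body: the command text
        return buf.array();
    }

    // Inverse operation, as the onboard computer would perform it.
    public static String decode(byte[] frame) {
        ByteBuffer buf = ByteBuffer.wrap(frame);
        int len = buf.getShort() & 0xFFFF;    // unsigned 16-bit length
        byte[] payload = new byte[len];
        buf.get(payload);
        return new String(payload, StandardCharsets.UTF_8);
    }
}
```

On the phone side, the string passed to `encode` would come from Android's `android.speech.SpeechRecognizer` results; framing the payload with an explicit length lets the robot's onboard computer split commands reliably on a stream socket.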