Augmented Electronic Sensing and Robotics

We are currently witnessing the birth of a cultural movement where everyday people tinker with STEM (Science, Technology, Engineering, Mathematics) concepts, simply for the purpose of satisfying creativity and curiosity. This “maker movement” is powered by the wide availability of low-cost electronics and manufacturing tools, which allow amateur student makers to create interactive objects while exploring scientific phenomena. Dale Dougherty (2011), a founder of the maker movement, characterizes makers as just “playing with technology… They don't necessarily know what they're doing or why they're doing it. They're playing to discover what the technology can do, and probably to discover what they can do themselves.” Such environments have great potential to engage people with STEM concepts and activities while empowering individuals to physically manifest their dreams. However, in these learning environments, students focus more on completing a technological project than on comprehending scientific concepts. While some learning does happen in such contexts, the majority of maker activities consist of following a list of instructions and trusting that they will result in a functional project. We believe that emerging technologies, such as Augmented Reality, have the potential to radically transform STEM education by making challenging concepts accessible to students.

In this work we are developing systems for integrating electronic sensors with interactive augmented reality visualizations. This allows a student to look directly at an electronic system and understand its perception of the real world, even while the system is being manipulated or constructed. This capability can help novice programmers understand and debug electronic systems that integrate spatially distributed sensors and actuators. The work involves hardware and software development (C++, C#, Unity3D, HoloLens, Arduino, Raspberry Pi), as well as research studies using mixed methods and multimodal sensors (Kinect and physiological sensors) to measure learning, collaboration, and attitude change.

This work contributes to our understanding of how students' learning of STEM concepts and processes can be enhanced by new technologies such as augmented reality; contributes a set of reusable modules for visualizing and simulating the inner workings of digital sensing systems commonly encountered in maker activities; and produces guidelines for the design of innovative learning environments.

Funding

NSF Award #1748093: EAGER: Making with Understanding

Team Members

Iulian Radu, Michael English, Vivek Hv, Ronell Sicat, Elliot Lee, Bertrand Schneider

2018_06_15_ar_electronics_p2.mp4 (29.68 MB)