Note: Special thanks to Jim Cook & Jai Honeybrook-Carter at the Innovation TechLab at the University of Sydney, Rob Manson & Alex Young at BuildAR, and to the staff at the Powerhouse Museum for sharing their experiences, showing off their gadgets, and catalyzing change.
Brainwave headsets like the MindWave safely measure EEG signals and translate them into measures such as “attention” or “relaxation”. These devices output brainwave bands (e.g. gamma, theta, high alpha, low beta) which can be connected to software or other devices so that learners can (roughly, and with training) control simple interfaces using only their minds. Art educators at the University of Sydney have used the headsets as part of a public art exhibition: the Mindfulness Mind Painting Project allowed members of the public to shape a dynamic digital artwork based on their levels of mindfulness.
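To make the device-to-software link concrete: NeuroSky's ThinkGear Connector software streams headset readings as JSON text over a local socket. Here is a minimal sketch of parsing one such line to pull out the "attention" and "meditation" scores — the sample packet below is illustrative, not captured from a real headset:

```python
import json

def parse_esense(line):
    """Extract the 0-100 'attention' and 'meditation' scores from one
    JSON line in the ThinkGear style. Returns None if the line carries
    no eSense data (e.g. a raw EEG packet)."""
    packet = json.loads(line)
    esense = packet.get("eSense")
    if esense is None:
        return None
    return esense.get("attention"), esense.get("meditation")

# Illustrative sample line (not real headset output):
sample = '{"eSense":{"attention":63,"meditation":41},"poorSignalLevel":0}'
print(parse_esense(sample))  # -> (63, 41)
```

Once you have numbers like these, wiring them to an on-screen brush, a game, or a classroom dashboard is ordinary programming.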
Gesture Cameras like the XBox Kinect can detect your body position and movement in space. The Innovation TechLab at the University of Sydney is leading a project designed to teach correct musical instrument posture to music students. Students can adjust their posture based on real-time visual feedback showing them which parts of their body are not yet properly positioned.
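Under the hood, posture feedback like this boils down to comparing joint angles against a target. A minimal sketch, assuming we already have 3-D joint coordinates from the camera's skeleton data (the 90-degree target and the tolerance are made-up illustrative values, not the TechLab's actual rules):

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by segments b->a and b->c,
    from 3-D coordinates such as a gesture camera's skeleton reports."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    mag = math.dist(a, b) * math.dist(c, b)
    return math.degrees(math.acos(dot / mag))

def posture_feedback(angle, target=90.0, tolerance=10.0):
    """Simple rule: is the joint within the acceptable range?"""
    return "OK" if abs(angle - target) <= tolerance else "adjust"

# A right elbow bent at exactly 90 degrees:
elbow = joint_angle((0, 1, 0), (0, 0, 0), (1, 0, 0))
print(round(elbow), posture_feedback(elbow))  # -> 90 OK
```

A real system would check several joints at once and paint the out-of-range ones on screen, but each check is just this angle test.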
Gesture cameras, as well as standard web cameras, can also now roughly detect facial expressions and heart rate (based on skin tone changes). Researchers at the Positive Computing Lab have used facial recognition to detect student emotional responses during Intelligent Tutoring (e.g. engagement, boredom, curiosity, frustration).
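The skin-tone trick works because blood flow faintly modulates skin brightness from frame to frame, so a pulse shows up as a slow flicker in the camera feed. A toy sketch of the idea, run here on a synthetic brightness trace rather than real camera frames (a real pipeline would band-pass filter the signal and track a face region first):

```python
import math

def estimate_bpm(brightness, fps):
    """Estimate pulse rate from a per-frame mean skin-brightness trace:
    subtract the mean, then count upward zero crossings (roughly one
    per heartbeat) over the recorded window."""
    mean = sum(brightness) / len(brightness)
    centred = [b - mean for b in brightness]
    crossings = sum(1 for i in range(1, len(centred))
                    if centred[i - 1] < 0 <= centred[i])
    seconds = len(brightness) / fps
    return crossings * 60.0 / seconds

# Synthetic 10-second trace at 30 fps with a 1.2 Hz (72 bpm) flicker:
fps, hz, phase = 30, 1.2, 1.0
trace = [100 + 0.5 * math.sin(2 * math.pi * hz * n / fps + phase)
         for n in range(fps * 10)]
print(round(estimate_bpm(trace, fps)))  # -> 72
```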
3D Printers & Scanners like the MakerBot print objects. The printer looks kind of like a microwave. It works by extruding layer upon layer of liquified plastic (or some other material) over the space of 20 minutes to several hours (depending on the complexity of the object). The object designs can be purchased or modeled using software like Google SketchUp.
You can also now easily scan objects in the real world using a 3D scanner such as the Structure Sensor, which can be attached to an iPad. With these you smoothly move your scanner around the object you wish to scan (e.g. a person's head) and then pull the result into modelling software or perhaps print a 3D copy for them. 3D printers are incredible for allowing students to cheaply and quickly prototype and test ideas.
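Whether purchased, modeled, or scanned, printable objects typically end up as STL files, and the text ("ASCII") flavour of STL is simple enough to read by eye. As a peek under the hood, here is a sketch that emits that structure for a hypothetical one-triangle solid (real printable models are just many more of the same facet blocks):

```python
def ascii_stl(name, triangles):
    """Serialise triangles (each a tuple of three (x, y, z) vertices)
    into ASCII STL, the text format most slicers and 3-D modelling
    tools accept. Normals are written as 0 0 0; most tools recompute
    them from the vertex order."""
    lines = [f"solid {name}"]
    for tri in triangles:
        lines.append("  facet normal 0 0 0")
        lines.append("    outer loop")
        for x, y, z in tri:
            lines.append(f"      vertex {x} {y} {z}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)

# A single right triangle in the z=0 plane:
stl = ascii_stl("demo", [((0, 0, 0), (1, 0, 0), (0, 1, 0))])
print(stl.splitlines()[0])  # -> solid demo
```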
Virtual Reality headsets like the Oculus Rift allow the user to be totally immersed in a virtual world and to navigate that world either by walking around in real space, by using a standard keyboard and mouse, or by using an additional device like a 360 degree treadmill (e.g. Virtuix's Omni). The folks at Stanford's Virtual Human Interaction Laboratory use the Oculus for environmental education and for teaching empathy. For one project, they invite users to play the part of a coral immersed in a coral reef that is dying due to pollution.
In addition to custom 3D animated worlds, users of the Oculus can become immersed in off-the-shelf gaming environments like Minecraft, which dramatically increases practical possibilities for educators. If even the Oculus is beyond budget, Google's Cardboard (a cheeky low-tech approach to high-tech VR) makes use of any Android phone and some cardboard for instant DIY goggles.
A mountable device such as the Leap Motion can be attached to 3D goggles, allowing hand and arm movements to be tracked. This means students can pick up and interact with objects in the virtual world directly. For example, why not let students pull constellations out of a night sky (as the Leap demo shows), pick up sculpture at an art museum, or extract organs from a human body and move them around in 3D space?
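Picking up a virtual object typically reduces to a distance test on tracked fingertip positions: when thumb and index tip come close enough together, the software treats the hand as grabbing whatever it is touching. A minimal sketch (the coordinates and the 3 cm threshold are illustrative assumptions, not values from any tracker's API):

```python
import math

def is_pinching(thumb_tip, index_tip, threshold=0.03):
    """Treat the hand as 'grabbing' when the thumb and index fingertip
    positions (3-D coordinates in metres, as hand trackers typically
    report) come within `threshold` of each other."""
    return math.dist(thumb_tip, index_tip) < threshold

print(is_pinching((0.00, 0.10, 0.30), (0.01, 0.10, 0.30)))  # -> True
print(is_pinching((0.00, 0.10, 0.30), (0.08, 0.10, 0.30)))  # -> False
```

A grab test like this, run every frame, is what lets a student pluck a constellation out of the sky or lift an organ out of a virtual body.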
Immersive Video - Although most virtual worlds are animated, you can now also guide learners through video using a 360 degree video camera like the Ladybug. Imagine allowing students to wander a museum, a dangerous science lab, or a historic but remote location, or preparing them for fieldwork in advance. This can now be done with immersive video taken on site rather than an animated model.
Another way to get previously hard-to-get or prohibitively expensive imagery is by strapping a digital camera onto a drone. As long as you don't let your drone stray near the White House (or anywhere else you don't have permission to record), you're good to go. Jim Cook, Innovation Lead at the University of Sydney, described to me how he and his team helped the University's archeology department take overhead shots of digs by sending a drone over the site daily. This not only saved them money on aerial photography, but also let them take many more pictures over time, allowing for the creation of a time-lapse video of the project's evolution. (Time-lapse is another trick you can now do easily with your phone, by the way.)
Are you using wearable, 3D or other cutting edge devices for learning? Why not share your experience in the comments?