“ForWarn epitomizes the type of product envisioned when the Threat Assessment Centers were created. This tool literally puts space-age technology into the hands of forest resource professionals. It’s a remarkable collaborative achievement.”
Some of the possible types of forest disturbances that the tool can detect include those “caused by insects, diseases, wildfires, extreme weather, or other natural or human-caused events.” The tool is available at http://www.forwarn.forestthreats.org.
The problem of how to mitigate and clean up e-waste is top of mind for many researchers and policy-makers. A new infographic helps to put the problem into perspective, as do reports of some recent discussions about e-waste governance in the EU and the announcement of e-waste related research efforts in the US.
In Amsterdam last week, participants at an international e-waste conference concluded that a higher e-waste collection objective, to be met by 2021, was “unfeasible.” A United Nations University (UNU) announcement stated that, “The forthcoming EU collection objective for discarded electrical equipment and energy saving lamps (e-waste) is only achievable if governments are prepared to introduce additional measures.”
According to the UNU, part of the challenge of meeting the EU quota for e-waste collection is that the waste will be outside the mandated collectors’ reach due to legal or illicit exportation. In addition, much of the waste is simply thrown into the trash by consumers.
“Depending on how the nanotubes are used, they can be toxic – exhibiting properties similar to asbestos in laboratory mice. It’s an emerging technology. We want to get ahead of it and make sure that the progress is sustainable — in terms of the environment and human health.”
— Jean-Claude Bonzongo, associate professor of environmental engineering, College of Engineering, University of Florida
Shopping around for your next robot chef? Thinking about getting a little exercise with a virtual workout partner? Maybe you’re planning on taking your autonomous auto out for a spin, or searching for ways to lead wildlife to safety in the wake of a natural disaster?
As the following summaries show, scientists are working overtime to help make each of those visions a reality.
Researchers at New York University’s Polytechnic Institute (NYU-Poly) are exploring how robots might help guide wildlife away from environmental disasters and human-created dangers in the future. Their findings, described in the Journal of the Royal Society Interface, show that biomimetic robotic fish can actually assume leadership roles among live fish.
The paper reports that live fish in the experiments appear to “follow in the wake of the biomimetic robot fish, taking advantage of the energy savings generated by the robot.”
“These experiments may open up new channels for us to explore the possibilities for robotic interactions with live animals — an area that is largely untapped. By looking to nature to guide our design, and creating robots that tap into animals’ natural cues, we may be able to influence collective animal behavior to aid environmental conservation and disaster recovery efforts.”
Karlsruhe Institute of Technology (KIT) and the FZI Research Center for Information Technology are presenting innovations for everyday life in the future at CeBIT, the trade fair that showcases digital IT and telecommunications solutions, which will run from March 6 to 10, 2012, in Hanover, Germany. The innovations from KIT and FZI include a humanoid kitchen robot that will move around and the interactive HoLLiE service robot.
Kitchen Robot Learns by Watching
In addition to recognizing objects and grasping them with just the right amount of pressure, ARMAR, a humanoid robot, can negotiate its environment, understand and execute commands independently, and even learn by watching humans.
HoLLiE (House of Living Labs intelligent Escort) will show how it intuitively interacts with people at CeBIT. The system is designed to provide those who need care with food, medicine, and interactive entertainment.
“Thanks to a modern 3D sensor system, HoLLiE can understand the body movement of its counterpart. In the scenario presented, HoLLiE asks its counterpart to do some sports together in order to remain in good shape. This function may be of therapeutic value to elderly people or people in need of care, but also serve to entertain everybody regardless of age.”
— Press Release, Karlsruhe Institute of Technology
After building an autonomous car with his students, Peter Stone of The University of Texas at Austin has turned his attention to research on autonomous intersection management.
“A future where we sit in the back seat of the car, reading the newspaper while it drives us effortlessly through city streets and intersections, is not that far away. Computers can already fly a passenger jet much like a trained human pilot, but people still face the dangerous task of driving automobiles. Vehicles are being developed that will be able to handle most of the driving tasks themselves. But once autonomous vehicles become popular, we need to coordinate those vehicles on the streets.”
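Autonomous intersection management is often formulated as a reservation problem: an arriving vehicle requests the space-time “tiles” of the intersection it would occupy, and a manager grants or denies the request. A toy first-come-first-served sketch of that idea follows; the class name, tile model, and all values are hypothetical simplifications, not code from Stone’s project.

```python
# Toy reservation-based intersection manager. Vehicles request the
# (timestep, tile) cells they would occupy; a request is granted only
# if none of those cells is already reserved. All names here are
# illustrative assumptions, not an actual research implementation.

class IntersectionManager:
    def __init__(self):
        self.reserved = set()  # (timestep, tile) pairs already granted

    def request(self, path):
        """path: iterable of (timestep, tile) cells a vehicle would occupy.
        Returns True and records the reservation if conflict-free."""
        path = set(path)
        if path & self.reserved:
            return False  # some cell is already taken at that timestep
        self.reserved |= path
        return True

manager = IntersectionManager()
print(manager.request([(0, "NE"), (1, "NW")]))  # True: intersection empty
print(manager.request([(1, "NW"), (2, "SW")]))  # False: NW is busy at t=1
print(manager.request([(1, "SE"), (2, "SW")]))  # True: no overlap
```

A real system must also handle timing uncertainty, rejections, and fallback to conventional signals, but the conflict test above is the core of the reservation idea.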
Fans of Charles and Ray Eames won’t want to miss this one. Scale of the Universe 2 is an eye-opening take on the powers-of-ten concept that the mid-century design team pioneered. Fortunately for us, this version capitalizes on the powers of some pretty standard digital technology to let viewers take the wheel, so to speak.
In Scale of the Universe 2, 14-year-old twins Cary Huang and Michael Huang have created an annotated tour of both inner and outer space. It’s a must-see for anyone with even the slightest affinity for planetariums or microscopes.
“Just when you thought the age of the Flash-based web animation had come and gone — check out this amazing visualization of the universe, created by 14-year old Cary Huang, with the technical assistance of his twin brother, Michael. It lets you zoom through objects of any scale, starting with the Planck length and going all the way up to the whole observable universe.”
— Josh Rothman, Boston Globe
For those unfamiliar with the classic 1968 film, it’s easy to find online: simply search for “eames powers of ten video.”
Intelligent technologies for adding value to information and handling “noisy data” are among the expected advances showcased in a list of “Important Technologies” that, according to the World Economic Forum’s Global Agenda Council on Emerging Technologies, are poised to alter the world. The list was produced by a group of experts representing many areas, including science, business, and public policy.
“Accelerating progress in science and technology has stimulated a new age of discovery, and many of the technologies identified by the council are critical to building a sustainable and resilient future.”
In alphabetical order, the Council’s top 10 emerging technologies are as follows:
Enhanced Education Technology
Green Revolution 2.0 – Technologies for Increased Food and Biomass
High Energy Density Power Systems
Informatics for Adding Value to Information
Nanoscale Design of Materials
Personalized Medicine, Nutrition and Disease Prevention
Synthetic Biology and Metabolic Engineering
Systems Biology and Computational Modeling and Simulation of Chemical and Biological Systems
Utilization of Carbon Dioxide as a Resource
Wireless Power
While the Council members expect that technological progress in these areas could have major impacts on social, economic, and environmental issues, they also caution that their “safe and successful development is far from guaranteed.”
Several announcements in the past week report on advances in scientists’ understanding of the oceans, our galaxy, and the universe. New data collected using multibeam echo sounders, radio frequency observations, and even a natural anomaly serving as a magnifying lens are revealing the physical universe in ever-greater detail.
Earth’s Deepest Place More Accurately Sounded
On February 6, an announcement from the University of New Hampshire Center for Coastal and Ocean Mapping/Joint Hydrographic Center reported that its efforts to map the ocean have revealed new data about the deepest place on Earth, the Mariana Trench in the western Pacific.
Using multibeam echo sounder technology, the researchers discovered four “bridges” that cross the trench. In addition, according to the release, the ocean mapping expedition measured the depth of the trench “with greater precision than ever before.”
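Echo sounding recovers depth from the round-trip travel time of an acoustic ping: depth is half the two-way time multiplied by the speed of sound in water. A rough sketch of that calculation follows; the sound speed and travel time are nominal round numbers assumed for illustration, not values from the expedition.

```python
# Basic echo-sounding depth calculation. The constants below are
# nominal assumptions: ~1500 m/s is a commonly used average sound
# speed in seawater (real surveys apply depth-varying profiles).

SOUND_SPEED_SEAWATER = 1500.0  # m/s, nominal average

def depth_from_echo(two_way_travel_time_s, sound_speed=SOUND_SPEED_SEAWATER):
    """Depth in meters from the round-trip time of an acoustic pulse."""
    return sound_speed * two_way_travel_time_s / 2.0

# A ping taking ~14.6 s to return implies a depth near that of the
# Challenger Deep region of the Mariana Trench:
print(round(depth_from_echo(14.6)))  # → 10950 meters
```

Multibeam systems fire a fan of such pings across the ship’s track, which is what allows features like the trench-spanning “bridges” to be resolved.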
Earlier in the week, the Scripps Institution of Oceanography announced that the ocean terrain data in Google Earth had received a “major update.”
“The original version of Google Ocean was a newly developed prototype map that had high resolution but also contained thousands of blunders related to the original archived ship data. UCSD undergraduate students spent the past three years identifying and correcting the blunders as well as adding all the multibeam echosounder data archived at the National Geophysical Data Center in Boulder, Colorado.”
On February 3, the Naval Research Laboratory announced that scientists from around the globe have shared their radio observations to create the most precise map of the Milky Way galaxy’s magnetic field.
“The key to applying these new techniques is that this project brings together over 30 researchers with 26 different projects and more than 41,000 measurements across the sky. The resulting database is equivalent to peppering the entire sky with sources separated by an angular distance of two full moons.”
— Dr. Tracy Clarke, a member of the research team, U.S. Naval Research Laboratory
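The “two full moons” spacing in the quote can be sanity-checked with back-of-envelope arithmetic: spread 41,000 measurements evenly over the sky’s roughly 41,253 square degrees and each sits in about a one-degree patch, while the full moon spans roughly half a degree. The constants below are standard round figures, not values from the announcement.

```python
import math

TOTAL_SKY_SQ_DEG = 41253.0   # area of the celestial sphere in square degrees
MOON_DIAMETER_DEG = 0.52     # approximate angular diameter of the full moon

n_measurements = 41000       # figure quoted in the announcement

# With measurements spread evenly, each occupies ~1 square degree,
# so neighboring sources sit about 1 degree apart on average.
mean_spacing_deg = math.sqrt(TOTAL_SKY_SQ_DEG / n_measurements)
spacing_in_moons = mean_spacing_deg / MOON_DIAMETER_DEG

print(round(mean_spacing_deg, 2))   # → 1.0 (degrees)
print(round(spacing_in_moons, 1))   # → 1.9, i.e. about two full moons
```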
Natural ‘Zoom Lens’ Revealing Contours of Time Itself
Maps have always implied an element of timing, but NASA’s February 2 announcement provides a look at how the contours of time itself are being revealed by cutting edge technology.
Using what the agency called “a natural ‘zoom lens’ in space,” NASA’s Hubble Space Telescope, a project of international cooperation between NASA and the European Space Agency, has produced close-up images of the brightest “magnified” galaxy to be discovered so far.
“A so-called gravitational lens is produced when space is warped by a massive foreground object, whether it is the sun, a black hole or an entire cluster of galaxies. The light from more-distant background objects is distorted, brightened and magnified as it passes through this gravitationally disturbed region.”
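The strength of such a lens is commonly summarized by its Einstein radius, the characteristic angular scale of the distortion. The formula below is the standard point-mass expression; the cluster mass and distances plugged in are purely illustrative placeholders, not values from the Hubble result.

```python
import math

# Back-of-envelope Einstein-radius estimate for a cluster-scale
# gravitational lens. Physical constants are standard; the mass and
# distance inputs at the bottom are illustrative assumptions only.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
GPC = 3.086e25       # gigaparsec in meters

def einstein_radius_arcsec(mass_kg, d_lens, d_source, d_lens_source):
    """Angular Einstein radius (arcseconds) of a point-mass lens,
    given angular-diameter distances in meters:
    theta_E = sqrt(4GM/c^2 * D_ls / (D_l * D_s))."""
    theta = math.sqrt(4 * G * mass_kg / C**2 * d_lens_source / (d_lens * d_source))
    return math.degrees(theta) * 3600

# A ~1e15 solar-mass cluster lensing a source twice as far away
# yields a radius of tens of arcseconds, as seen around rich clusters:
print(einstein_radius_arcsec(1e15 * M_SUN, 1.0 * GPC, 2.0 * GPC, 1.2 * GPC))
```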
What if robots could run the way cheetahs do? Or take to the sky like a butterfly? Or clean out your fridge? For insights into those questions and more, here’s a quick summary of some of the most interesting robot-related research announced this week:
Floats like a butterfly…
By learning how butterflies get around with so much nimbleness and grace, Johns Hopkins engineers hope to help small airborne robots, commonly called micro aerial vehicles or MAVs, to imitate these types of movement and thereby prepare the way for a new generation of tiny flying machines.
The researchers are using three high-speed video cameras, capable of recording 3,000 one-megapixel images per second, to scrutinize the flight dynamics of painted lady butterflies. The work may lead to technology that supports safer reconnaissance, search-and-rescue, and environmental monitoring missions.
“…MAVs must be able to fly successfully through complex urban environments, where there can be tight spaces and turbulent gusts of wind. Flying insects are capable of performing a dazzling variety of flight maneuvers. In designing MAVs, we can learn a lot from flying insects.”
— Tiras Lin, undergraduate, Whiting School of Engineering
Ioannis Poulakakis of the University of Delaware hopes his efforts might help enable quadruped robots to move about quickly and effectively, avoid falls through self-correcting movements, and imitate the running motion of living animals. In addition, under the grant, Poulakakis is developing experiences for K-12 teachers that are intended to stimulate interest among budding engineers.
“Biomechanics research demonstrates that springs and running are intimately related. When you run, the knee of the leg that is on the ground initially bends and then extends to prepare the body for take-off. During knee bending, energy is stored in elastic elements such as tendons or muscle fibers. Then, this energy is released during knee extension, pushing the body upward and forward.”
— Ioannis Poulakakis, assistant professor, University of Delaware
“Thinking about building the perfect robot? It should have a humanlike voice that sounds neither too young nor too old. Interestingly, 51% also preferred a voice that is neither too feminine nor too masculine. In terms of appearance, it would be more humanlike than machinelike, a little on the funny side, more colorful than metallic, more round than square shaped, and allow for personal design, perhaps like buying a car.”
— Press Release, Persuadable Research Corporation
The researchers uncovered a long list of tasks people would like domestic robots to help with. It includes moving heavy things, providing home security, cleaning windows, doing laundry, and — my personal wish-list topper — washing floors and dishes.