SensorFly: a minimalist approach for emergency situational awareness
Quickly and safely monitoring the location and movement of a fire, or locating people inside a room, are critical activities in many emergency situations. For example, a burning building is a potentially life-threatening environment for emergency responders. Sensor networks can provide useful information about the surroundings—such as temperature and the location of obstacles—to responders in these situations. Traditional sensor network research focuses on long-term environmental sensing and understanding. For many currently available sensor networks, mobility is nonexistent (e.g., fixed sensors in the home) or uncontrollable (e.g., sensors on a person or an animal collar).1, 2 For applications such as emergency response and surveillance, however, human involvement in deploying and maintaining these types of systems can be costly or even impossible. In addition, a static approach cannot dynamically respond to changing application and environmental needs. To address these challenges, we have developed SensorFly: a minimalist, disposable, miniature-helicopter-based sensor that works in a group to provide real-time situational awareness to responders in an unknown arena without needlessly risking rescuers' lives.
Mobile systems have recently been explored in robotics research.3–6 They are designed for independent navigation with a large number of sensors per device. Such devices are usually quite capable, but the result is much heavier and more expensive platforms that are unsuitable, or even dangerous, for indoor use. Remote-controlled robots are also used in emergency scenarios to provide video feedback. However, these systems are slow and generally rely on a single camera, which constitutes a single point of failure.
SensorFly tackles these difficulties with its miniature helicopter-based mobile sensors (see Figure 1).7–10 The recent advent of new, miniature electric motors and improvements in battery technologies enabled us to develop the system, which is inexpensive, disposable and provides maximum accessibility in unknown environments. However, SensorFly presents many new challenges—most of which stem from the 30g weight limit of the helicopters—including limited individual sensing capability for each helicopter (gyroscope, compass, accelerometers and one add-on sensor) and limited individual processing capability for computational needs.
We addressed these issues through a combination of collaborative network communication, which supports navigation, and collaborative processing across multiple nodes. This structure leverages both physical node mobility and distributed processing in a heterogeneous system.
Figure 2 shows the concept of our collaborative system. The nodes (SensorFly helicopters) can operate in a number of roles. When nodes first fly into a building, they land at random positions in the environment. These nodes then transition from sensing nodes into control nodes. The control nodes provide locational information and maintain the local map to guide the sensing nodes that are still flying. In addition, these control nodes collectively process and filter the sensed information before passing the data to the base station. This approach significantly reduces the communication bandwidth requirement of the system.
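The role transition and in-network filtering described above can be sketched in a few lines. This is a simplified illustration under our own assumptions, not the actual SensorFly firmware: the class names, the mean-based filtering, and the base-station interface are all hypothetical.

```python
from enum import Enum, auto

class Role(Enum):
    SENSING = auto()  # flying and gathering data
    CONTROL = auto()  # landed; anchors localization and the local map

class Node:
    def __init__(self, node_id):
        self.node_id = node_id
        self.role = Role.SENSING
        self.local_map = {}  # obstacle reports keyed by (x, y) cell

    def on_landing(self, position):
        # A node that has landed transitions into the control role.
        self.role = Role.CONTROL
        self.position = position

    def filter_and_forward(self, readings, base_station):
        # Control nodes aggregate raw readings (here: a simple mean,
        # e.g. of temperature samples) before relaying them, which is
        # what cuts the bandwidth to the base station.
        if self.role is Role.CONTROL and readings:
            summary = sum(readings) / len(readings)
            base_station.append((self.node_id, summary))
```

The key design point is that aggregation happens at the landed control nodes rather than at the base station, so only summaries cross the long-haul link.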
The sensing nodes take location information from the control nodes and track their own motion through the environment. These nodes can also characterize the environment through the sensors they carry; each has an application-specific sensor, for example for temperature, images, sound or radiation. To discover the layout of unknown locations, they bump into obstacles, detect each bump with the accelerometer and inform the control nodes of the obstacles' locations. This approach avoids the heavy processing requirements of computer-vision-based navigation algorithms.
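A minimal sketch of this bump-based mapping follows. The spike threshold and the shared-map representation are illustrative assumptions, not values from the SensorFly system: a real implementation would tune the threshold to the airframe and filter out motor vibration.

```python
def detect_bump(accel_samples, threshold_g=2.0):
    """Flag a collision when the acceleration magnitude spikes.

    accel_samples: list of (ax, ay, az) tuples in g. In steady flight
    the magnitude stays near 1g (gravity); a bump produces a spike.
    threshold_g is a hypothetical tuning constant.
    """
    for ax, ay, az in accel_samples:
        magnitude = (ax**2 + ay**2 + az**2) ** 0.5
        if magnitude > threshold_g:
            return True
    return False

def report_obstacle(position, control_map):
    # The sensing node reports the bump location; control nodes add
    # the cell to the shared map used to guide the still-flying nodes.
    control_map[position] = "obstacle"
```

Because each bump is a single accelerometer event plus a short radio message, the per-node cost is tiny compared with running a vision pipeline onboard.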
SensorFly senses people through multi-point heterogeneous validation. Various SensorFly nodes equipped with different sensor types each produce an estimate of the likelihood of a person within their sensing area. Their individual estimates are combined to form a comprehensive estimate. When the likelihood of a person's presence reaches a certain threshold, images are sent to the base station for verification. Furthermore, owing to the low weight of the nodes (30g, roughly the weight of 10 pennies), a collision with an obstacle causes no damage, and a node causes only slight annoyance if it collides with a person. The device is designed to crash and then recover once it has landed on the ground: the system is not designed to avoid failures but to expect and tolerate them.
SensorFly is a first look at inexpensive, minimalist swarming technology. Not only do the networked nodes sense and understand the physical world through distribution of resources, they also interact with it because they are mobile. This research explores the new dimensions of collaborative mobility and multi-level distributed recognition with a minimalist approach. Future research activity in the SensorFly project will focus on a wide range of issues including: collaboration protocols for system mobility, maintenance and navigation; distributed, collaborative, mobility-enhanced sensing and understanding; and realistic experimental testing and implementation in miniature swarm SensorFly nodes.
SensorFly will enable research in collaborative control as well as collaborative sensing. Although the weight limit for a given size of micro-helicopter might increase in the future, increasingly smaller nodes will be attractive because of their minimal intrusiveness. Thus, this investigation into limited capability mobile sensors will have a lasting impact that extends beyond today's systems.