We deployed an experimental testbed at the BWN-lab, built from currently available off-the-shelf devices, to demonstrate the efficiency of our newly developed algorithms and protocols for multimedia communication over wireless sensor networks. The architecture of our testbed is as follows:
Architecture of the BWN-lab Multimedia Wireless Sensor Network Testbed
Our laboratory has a sensor network testbed with 6 Imotes from Intel and 50 MICAz scalar motes from Crossbow. We plan to increase the number of MICAz sensors in order to deploy a larger-scale testbed that allows testing more complex algorithms and assessing the scalability of the communication protocols under examination.
The testbed includes three different types of multimedia sensors:
- Low-end imaging sensors
- Medium-quality webcam-based multimedia sensors
- High-end pan-tilt cameras mounted on mobile robots
Low-end imaging sensors such as CMOS cameras can be interfaced with Crossbow's MICAz motes through the Cyclops platform. Cyclops is an electronic interface between a CMOS camera module and a wireless mote such as the MICA2 or MICAz, and it contains programmable logic and memory for high-speed data communication.
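To give a feel for what a low-tier imaging node does with such limited hardware, here is a minimal sketch of its duty cycle: capture a low-resolution frame, compare it with the previous one, and report an event only when enough pixels change. This is an illustrative simulation, not the Cyclops firmware; the capture function and the thresholds are stand-ins.

```python
# Illustrative sketch of a low-tier imaging node's duty cycle: grab a
# low-resolution frame, compare it to the previous one, and report an
# event only when the scene changes enough to justify the radio cost.
# capture_frame() is a stand-in for the Cyclops camera interface.
import random

FRAME_W, FRAME_H = 64, 64        # Cyclops-class low-resolution frame
CHANGE_THRESHOLD = 0.05          # fraction of pixels that must change

def capture_frame():
    """Stand-in for the CMOS camera read; returns a grayscale frame."""
    return [[random.randint(0, 255) for _ in range(FRAME_W)]
            for _ in range(FRAME_H)]

def fraction_changed(prev, curr, pixel_delta=30):
    changed = sum(1 for r_prev, r_curr in zip(prev, curr)
                    for a, b in zip(r_prev, r_curr)
                    if abs(a - b) > pixel_delta)
    return changed / (FRAME_W * FRAME_H)

def sensing_cycle(prev_frame):
    curr = capture_frame()
    if prev_frame is not None:
        if fraction_changed(prev_frame, curr) > CHANGE_THRESHOLD:
            # On the testbed this would be a short event packet sent over
            # the MICAz radio, not the full frame.
            print("event: significant scene change detected")
    return curr

if __name__ == "__main__":
    frame = None
    for _ in range(3):
        frame = sensing_cycle(frame)
```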
The medium-end video sensors are based on Logitech webcams interfaced with Stargate platforms. Stargate is a high-performance processing platform designed for sensor, signal processing, control, robotics, and wireless sensor networking applications.
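The sketch below illustrates how such a medium-tier node could push webcam frames to a collection point over its 802.11 link: grab a frame, JPEG-compress it, and send it length-prefixed over TCP. It assumes OpenCV is available on the node; the sink address, port, and frame rate are placeholders, not values from our deployment.

```python
# Minimal sketch of a medium-tier video node: grab a frame from a USB
# webcam, JPEG-compress it, and push it over TCP (e.g. the 802.11 link)
# to a collection point. Host, port, and frame rate are placeholders.
import socket
import struct
import time

import cv2  # assumes OpenCV is installed on the Stargate-class node

SINK_ADDR = ("192.168.1.10", 9000)   # placeholder sink address
JPEG_QUALITY = 60                    # trade quality for bandwidth

def stream_frames(num_frames=10, period_s=1.0):
    cap = cv2.VideoCapture(0)        # first attached webcam
    with socket.create_connection(SINK_ADDR) as sock:
        for _ in range(num_frames):
            ok, frame = cap.read()
            if not ok:
                break
            ok, buf = cv2.imencode(".jpg", frame,
                                   [cv2.IMWRITE_JPEG_QUALITY, JPEG_QUALITY])
            if not ok:
                continue
            data = buf.tobytes()
            # Length-prefix each frame so the receiver can delimit them.
            sock.sendall(struct.pack("!I", len(data)) + data)
            time.sleep(period_s)
    cap.release()

if __name__ == "__main__":
    stream_frames()
```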
The high-end video sensors consist of pan-tilt cameras installed on a robotic platform. The objective is to develop a high-quality mobile platform that can perform adaptive sampling based on event features detected by low-end motes. The mobile actor, as we call it, can redirect high-resolution cameras to a region of interest when events are detected by lower-tier, low-resolution video sensors that are densely deployed.
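The following sketch captures the adaptive-sampling idea in its simplest form: the low-tier nodes report event locations, and the mobile actor points its pan-tilt camera at the centroid of the recent reports. The coordinates and geometry are illustrative and do not use the Garcia platform's actual control API.

```python
# Sketch of adaptive sampling: low-tier nodes report event locations,
# and the mobile actor points its pan-tilt camera at the centroid of
# the recent reports. Coordinates and geometry are illustrative only.
import math
from collections import namedtuple

Event = namedtuple("Event", "x y")          # planar testbed coordinates

def region_of_interest(events):
    """Centroid of reported events = the region to inspect more closely."""
    cx = sum(e.x for e in events) / len(events)
    cy = sum(e.y for e in events) / len(events)
    return cx, cy

def pan_angle_deg(actor_xy, target_xy):
    """Pan angle the actor must turn to face the target (0 deg = +x axis)."""
    dx = target_xy[0] - actor_xy[0]
    dy = target_xy[1] - actor_xy[1]
    return math.degrees(math.atan2(dy, dx))

if __name__ == "__main__":
    reports = [Event(3.0, 4.0), Event(3.5, 4.2), Event(2.8, 3.9)]
    roi = region_of_interest(reports)
    print("region of interest:", roi)
    print("pan angle from origin: %.1f deg" % pan_angle_deg((0.0, 0.0), roi))
```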
The testbed also includes storage and computational hubs. These are needed to store large multimedia content and perform computationally intensive multimedia processing algorithms.
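As a companion to the webcam sketch above, here is a minimal hub-side sketch: it accepts length-prefixed JPEG frames over TCP and archives them to disk so that heavier multimedia processing can run offline. The listening port and storage path are placeholders.

```python
# Companion sketch for the hub side: accept length-prefixed JPEG frames
# (as produced by the webcam sketch above) and archive them to disk for
# offline multimedia processing. Port and paths are placeholders.
import socket
import struct
import time
from pathlib import Path

LISTEN_ADDR = ("0.0.0.0", 9000)          # placeholder listening port
ARCHIVE_DIR = Path("frame_archive")      # placeholder storage location

def recv_exact(conn, n):
    data = b""
    while len(data) < n:
        chunk = conn.recv(n - len(data))
        if not chunk:
            raise ConnectionError("sender closed the connection")
        data += chunk
    return data

def archive_frames():
    ARCHIVE_DIR.mkdir(exist_ok=True)
    with socket.create_server(LISTEN_ADDR) as server:
        conn, addr = server.accept()
        with conn:
            while True:
                try:
                    (length,) = struct.unpack("!I", recv_exact(conn, 4))
                    frame = recv_exact(conn, length)
                except ConnectionError:
                    break
                name = ARCHIVE_DIR / ("%s_%.3f.jpg" % (addr[0], time.time()))
                name.write_bytes(frame)

if __name__ == "__main__":
    archive_frames()
```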
Testbed hardware (photo captions):
- 44 MICAz motes equipped with whip antennas, placed in a rectangular grid formation.
- 6 Intel Imotes with Bluetooth radios, forming a tree-shaped scatternet (courtesy of Lama Nachman, Intel Research).
- Acroname Garcia, a mobile robot with a mounted pan-tilt camera and both 802.11 and ZigBee interfaces.
- Garcia deployed on the sensor testbed: it acts as a mobile sink and can move to the area of interest for closer visual inspection; it can also coordinate with other actors and has built-in collision avoidance.
- Stargate-based multimedia sensor interfaced with the MICAz sensor testbed, allowing video monitoring through the camera-Stargate interface and communication to the sink through the on-board sensor mote or via an independent 802.11 link to a laptop.