Current Lab Applications

Network Monitoring Applications
A common need in highly networked environments is monitoring network traffic to detect interesting or anomalous events. This monitoring must occur in real time so that the appropriate corrective action can be initiated. Telecommunications companies have enormous needs in this area. We have begun exploring the requirements of these applications with Verizon and Lucent, and could readily deploy a network monitoring system within the Internet Lab to discover operational patterns of use (e.g., congestion, poor Quality of Service). Similarly, we could prototype a financial networking application of the kind found in an organization like Fidelity (one of Brown's industrial partners). Fidelity wants to monitor large numbers of on-line customer transactions to detect potential fraud or malfunctioning data feeds. SAND could be used to manage the delivery and placement of key data in the network; Aurora* would be used to detect significant events; Pervasive Programming would be used to build component-based applications that react to these events; and our work on network security would have an obvious role in handling this type of highly sensitive information.
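To make the fraud-detection scenario concrete, the following is a minimal sketch of the kind of streaming check such a system might run: flag a customer whose transaction rate within a sliding time window exceeds a threshold. All names, the window size, and the threshold are illustrative assumptions, not part of any deployed system.

```python
# Hypothetical sliding-window rate check over a transaction stream.
# WINDOW and THRESHOLD are illustrative values, not tuned parameters.
from collections import defaultdict, deque

WINDOW = 60.0      # seconds of history kept per customer
THRESHOLD = 5      # more than this many events per window is anomalous

recent = defaultdict(deque)   # customer id -> timestamps inside the window

def observe(customer, timestamp):
    """Record one transaction; return True if the rate looks anomalous."""
    q = recent[customer]
    q.append(timestamp)
    while q and timestamp - q[0] > WINDOW:   # evict events that aged out
        q.popleft()
    return len(q) > THRESHOLD

# A burst of transactions from one account trips the detector.
alerts = [observe("acct-17", t) for t in [0, 1, 2, 3, 4, 5, 6]]
print(alerts)   # the last two events exceed the threshold
```

In a real deployment this logic would run continuously over push-based feeds; the sketch only illustrates the per-event, windowed nature of the computation.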

Aurora*
Aurora is a stream data management system. It is designed to efficiently process very large numbers of push-based inputs, potentially arriving at very high data rates, using a dataflow paradigm in which specialized operators (boxes) are connected by queues (arcs). We currently have a working prototype of this system. A major effort called Aurora* is underway to move Aurora into a distributed setting; Aurora* partitions an Aurora processing graph over many nodes. Technical problems in this context include load sharing and high availability. While we have been able to test Aurora on a small set of workstations, Aurora* presents more demanding requirements. First, we would need a collection of 10-50 dedicated workstations as participants. We would also need a dedicated network, whose traffic we can control, that we can configure to test the effects of varying degrees of bandwidth and intermittent connectivity. Finally, we need a set of tools to generate workloads with high data rates to test our load-sharing algorithms. The Internet Lab provides a perfect environment for this. While it might be possible to buy a modest number of nodes from existing grants, the Lab's network infrastructure and scale of resources would enable a much more realistic testbed for our work. With this scale of equipment, we could deploy a reasonable application with real data.
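The boxes-and-arcs dataflow model can be sketched in a few lines. The class and operator names below are illustrative, not Aurora's actual API: each "box" consumes tuples from an input queue ("arc") and pushes results onto an output queue, and a processing graph is just boxes chained through shared queues.

```python
# Toy illustration of a boxes-and-arcs dataflow graph (not Aurora's API).
from collections import deque

class Box:
    """A stream operator reading from an input arc, writing to an output arc."""
    def __init__(self, fn, in_arc, out_arc):
        self.fn, self.in_arc, self.out_arc = fn, in_arc, out_arc

    def step(self):
        """Process at most one queued tuple; drop it if fn returns None."""
        if self.in_arc:
            result = self.fn(self.in_arc.popleft())
            if result is not None:
                self.out_arc.append(result)

# A two-box graph: filter out low readings, then rescale the survivors.
source, mid, sink = deque(), deque(), deque()
filt = Box(lambda t: t if t > 100 else None, source, mid)
scale = Box(lambda t: t / 10.0, mid, sink)

source.extend([50, 150, 300, 90, 120])
while source or mid:          # run the boxes until the arcs drain
    filt.step()
    scale.step()

print(list(sink))   # -> [15.0, 30.0, 12.0]
```

Distributing such a graph, as Aurora* does, amounts to assigning boxes to different nodes and turning the connecting arcs into network channels, which is precisely what makes load sharing, bandwidth, and intermittent connectivity the interesting experimental variables.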