Distributed Resource Monitoring Tool and its Use in Security and Quality of Service Evaluation

Date

2002-04-10

Abstract

As networks become increasingly open and complex, managing them becomes correspondingly more important. Numerous commercial and open-source tools already exist that can provide useful network analysis. Unfortunately, most of the commercial tools are either quite expensive or require significant effort to deploy. Other tools are very specific in their tasks and offer little in the way of customization. Tools that do provide useful statistics often fall short when operating on loaded, high-bandwidth networks. Also, because the majority of network management tools operate at the packet level, they often require administrator-level access to capture the data. In addition, privacy concerns may limit who has access to which parts of the data; on large networks, this can amount to a great deal of data that must be scrutinized by a limited number of people. The purpose of this project was to develop an inexpensive, customizable network-monitoring tool (called the Resource Usage Monitor) that a) is capable of providing a variety of traffic-related statistical data on a high-bandwidth network, b) provides user-friendly, selective access to that data for users with different privacy privileges, and c) interfaces to a policy management toolset to allow proactive management of the network based on the security, quality, and resource information it gathers. The processing engine behind the Resource Usage Monitor (RUM) examines data at the monitored gateway, collecting inbound and outbound counts of bytes, packets, network and application flows, and similar quantities transmitted or received by each internal and external host. A web interface provides persistence graphs and reports that can disclose general and specific traffic patterns on the network. This information can be used to assess the security, resource usage, and quality of service (QoS) of the monitored network and hosts. For example, setting flow, load, and other activity thresholds at different levels of granularity allows anomalies to be detected throughout the network. Port scans, Denial of Service (DoS) attacks, and Trojan applications have been detected through surveillance of simple threshold-based patterns, while more complex, possibly multi-probe patterns can reveal much subtler anomalies and side effects. RUM operates in a statistical rather than a continuous mode, sampling the network every few minutes. After each sample, the collected data is analyzed and the appropriate warnings and interactions with the policy services are triggered. Collected packets are sorted and stored by pre-defined subnets, allowing the data to be processed in parallel. Separating the datasets in this way also enables secure access: an administrator sees only the network traffic for which he or she is responsible. Persistence data is kept in logs and in graphs generated with RRDtool, a round robin database utility. The information logged is completely customizable and can even be offloaded for analysis by other systems. This thesis describes the RUM architecture, the data it collects, its analysis modules, user handling, and other features.
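The threshold-based anomaly detection described above (flagging port scans, DoS attacks, and similar activity from per-host counters gathered each sampling interval) can be illustrated with a small sketch. The following Python example is hypothetical and not taken from the thesis: the flow-record layout, counter names, and threshold values are assumptions made purely for illustration.

```python
# Hypothetical sketch of threshold-based anomaly flagging over one sampling
# interval, in the spirit of RUM's per-host counters. Record layout, names,
# and thresholds are illustrative, not taken from the thesis.
from collections import defaultdict

# Each record: (src_host, dst_host, dst_port, bytes) observed at the gateway
# during one sampling interval (RUM samples every few minutes).
FLOW_THRESHOLD = 200       # distinct (host, port) targets per source (port-scan-like)
BYTE_THRESHOLD = 500_000_000  # bytes per source per interval (possible DoS / bulk transfer)

def flag_anomalies(records):
    ports_seen = defaultdict(set)
    bytes_sent = defaultdict(int)
    for src, dst, dport, nbytes in records:
        ports_seen[src].add((dst, dport))
        bytes_sent[src] += nbytes

    warnings = []
    for host, targets in ports_seen.items():
        if len(targets) > FLOW_THRESHOLD:
            warnings.append((host, "possible port scan", len(targets)))
    for host, total in bytes_sent.items():
        if total > BYTE_THRESHOLD:
            warnings.append((host, "unusual outbound volume", total))
    return warnings

if __name__ == "__main__":
    # One internal host probing many ports on one external host.
    sample = [("10.0.0.5", "192.0.2.1", p, 60) for p in range(1, 300)]
    for host, reason, value in flag_anomalies(sample):
        print(f"{host}: {reason} ({value})")
```

In the same spirit, per-subnet datasets could be fed to independent instances of such a check, matching the abstract's point that sorting packets by pre-defined subnets allows parallel processing and per-administrator access.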

Degree

MS

Discipline

Computer Science
