Not long ago, data was relatively difficult to distribute and copy. Early data was, of course, recorded in books or carved as hieroglyphics and made to last millennia without change. With the development of computers, data storage took on a new dimension. It was stored on paper cards, magnetic tape, giant magnetic drums, and even hard-wired circuits that required a human to physically move cables every time the data needed to change.
Many factors have changed how data is stored and distributed. But when it comes right down to it, data storage today is not dissimilar to the methods used years ago. Symbols or numbers still represent valuable information that we want to keep and share. We’ve just gotten much better at making the data small, efficient, and portable. Those improvements mean we can store more and more data and share it far more readily than by shipping someone an obelisk. Most of today’s information systems are connected via networks that can transfer massive amounts of data at the speed of light, anywhere in the world, with or without wires.
As the capacity for sharing and transferring data evolves, new challenges surface. For example, how do responsible parties monitor the information being exchanged? How can a security administrator verify that data is being transferred between authorized individuals and not intruders with nefarious intent? These challenges have certainly been around for a long time, but they grow exponentially more difficult as the volume of data increases, the speed of data transfer increases, and the connections between systems become more ubiquitous. Companies of every size and industry are finding it necessary to analyze the data that moves across their systems. This is all the more true as an ever-increasing number of employees have access to more and more data, and to simpler ways to compromise that data.
One approach administrators take is to monitor network traffic. This involves setting up specialized software or devices that examine all the data transferred across a network. Often the software or device can look for specific patterns and take a defined action. For example, if the software detects a known traffic pattern indicating an attacker infiltrating the network, it can capture the traffic for analysis, alert an administrator, shut down the router, or take almost any other action the administrator defines.
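Pattern-based inspection like this can be sketched in a few lines. The signatures and alert action below are hypothetical examples for illustration, not drawn from any real IDS ruleset:

```python
# Minimal sketch of signature-based traffic matching. The signatures and
# alert action are hypothetical examples, not from a real IDS ruleset.

SIGNATURES = {
    b"\x90\x90\x90\x90": "possible NOP sled (shellcode)",
    b"' OR '1'='1": "SQL injection attempt",
}

def inspect(payload: bytes):
    """Return a list of (signature, description) matches for one payload."""
    return [(sig, desc) for sig, desc in SIGNATURES.items() if sig in payload]

def handle(payload: bytes, alert):
    """Take a defined action -- here, call an alert callback -- on a match."""
    for sig, desc in inspect(payload):
        alert(f"ALERT: {desc}")
```

A real product would match compiled rule sets against reassembled streams, but the shape is the same: match a pattern, then run the administrator-defined response.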
Obviously this is a very powerful approach and can be useful in a variety of ways. Network traffic patterns can be monitored to analyze peak usage. Different types of traffic can be identified so an administrator can understand how the network is being utilized. Heavy-use locations and nodes can be identified and either allocated additional resources or throttled. And unauthorized use, such as watching videos or downloading music from the Internet, can be detected.
The tools used for monitoring network traffic must be high-performing and unobtrusive to the network; otherwise they will negatively impact productivity. For example, many intrusion prevention systems (IPSs) are implemented as network capture devices that sit physically between a corporate network and the Internet. As each packet of data crosses the IPS, it must be analyzed and a decision made almost instantaneously. Any delay results in performance degradation, and the problem only worsens as network traffic grows.
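The per-packet time budget an inline device faces is easy to estimate. The link speed and packet size below are illustrative assumptions, not figures from the text:

```python
# Back-of-the-envelope per-packet time budget for an inline IPS.
# Assumed link speed and packet size are illustrative.

def per_packet_budget_us(link_bps: float, packet_bytes: int) -> float:
    """Microseconds available to process each packet at line rate."""
    packets_per_sec = link_bps / (packet_bytes * 8)
    return 1_000_000 / packets_per_sec

# At 1 Gb/s with 1500-byte packets, roughly 12 microseconds per packet.
budget = per_packet_budget_us(1e9, 1500)
```

At 10 Gb/s the budget shrinks to about 1.2 µs per packet, which is why an inline device must decide on each packet with essentially no headroom.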
Consider voice over IP (VoIP) traffic. Your office might use VoIP for its phone system—many do. If you were to implement a network monitoring system that required analysis and authorization of all network traffic, your phone calls would all pass through this system. Every delayed packet would make the call choppy and laggy, akin to an overseas call in the 1970s. A passive system could monitor and capture the network traffic but would not be able to interfere with an attack.
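The latency pressure on VoIP can be made concrete with a little arithmetic. The 150 ms one-way target follows the common ITU-T G.114 guideline; the other figures are assumptions for illustration:

```python
# Rough model of how inline inspection delay eats into a VoIP latency budget.
# The 150 ms one-way target follows ITU-T G.114; other numbers are assumed.

ONE_WAY_BUDGET_MS = 150.0

def remaining_budget_ms(network_ms: float, codec_ms: float,
                        inspect_ms: float) -> float:
    """Latency headroom left after network, codec, and inspection delay."""
    return ONE_WAY_BUDGET_MS - (network_ms + codec_ms + inspect_ms)

# 60 ms of network delay plus 30 ms of codec/jitter buffering leaves 60 ms;
# a 5 ms inspection step still fits, but tens of ms per hop quickly do not.
headroom = remaining_budget_ms(60.0, 30.0, 5.0)
```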
When high-speed networks first evolved, high-performance network monitoring and analysis solutions were mostly the domain of technical network engineers and law enforcement agencies. Dedicated hardware, complex software, difficult setup, and complex analysis were huge challenges at the time. Luckily times have changed. Technology has evolved. And these problems are largely gone.
Network monitoring software tools are available today that run on commodity hardware, are high-performing, simple to use, and complete a great deal of analysis and reporting autonomously. Let’s look at each of these strengths.
Performing network monitoring tasks on commonly available hardware instead of dedicated, custom-built, and proprietary machines puts network monitoring capability into the hands of organizations and IT departments of virtually every size. Many companies today use older laptop or desktop computers to perform network analysis. (Laptops are often preferred for their portability.) A decommissioned laptop can easily be fitted with a USB network adapter (wired or wireless), loaded with a software network monitoring tool, and put to work capturing and analyzing data. This is a great way to recycle existing assets.
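Once frames are captured, decoding them needs nothing beyond commodity software. Here is a minimal sketch that parses the 14-byte Ethernet header of a frame; the frame is hand-built to stand in for real captured data, since live capture typically requires elevated privileges:

```python
import struct

# Sketch: decode the 14-byte Ethernet header of a captured frame. In
# practice frames would come from a capture source (raw socket, libpcap
# binding, etc.); here a hand-built frame stands in for captured data.

def parse_ethernet(frame: bytes) -> dict:
    dst, src, ethertype = struct.unpack("!6s6sH", frame[:14])
    return {
        "dst": dst.hex(":"),
        "src": src.hex(":"),
        "ethertype": hex(ethertype),  # 0x800 = IPv4, 0x86dd = IPv6
    }

# Broadcast destination, made-up source MAC, IPv4 ethertype.
frame = bytes.fromhex("ffffffffffff" "001122334455" "0800")
header = parse_ethernet(frame)
```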
Performance is important when running network analysis tools on hand-me-down hardware. If a software monitoring tool requires massive processor or memory resources, it will not be useful on most computers. It will exhibit behaviors that make it useless, including dropping pieces of data, misinterpreting other pieces, and lengthy delays in data analysis and report generation. Most software network monitoring tools available today are very efficient. They require a moderate amount of system resources but almost never monopolize the machine’s CPU or memory. And with many tools, analysis is done in real time, meaning the network traffic is analyzed at the moment it is captured.
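Real-time, low-overhead analysis usually means keeping running totals rather than storing every packet. A minimal sketch, with illustrative packet tuples standing in for live capture:

```python
from collections import Counter

# Sketch of real-time, low-overhead accounting: keep only running totals
# per protocol rather than storing every packet. Packets are illustrative.

class TrafficTally:
    def __init__(self):
        self.bytes_by_proto = Counter()

    def observe(self, proto: str, size: int):
        """Update totals at capture time; memory stays proportional to the
        number of distinct protocols, not the number of packets."""
        self.bytes_by_proto[proto] += size

    def top(self, n: int = 3):
        return self.bytes_by_proto.most_common(n)

tally = TrafficTally()
for proto, size in [("tcp", 1500), ("udp", 200), ("tcp", 1500)]:
    tally.observe(proto, size)
```

Because each packet only bumps a counter, the work per packet is constant, which is what lets such a tool keep up on modest hardware.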
Speaking of analysis, this task has been greatly automated and simplified in modern software network monitoring solutions. As mentioned, early solutions required manual scrutiny of every data packet and provided very little automation. The only individuals who could operate such solutions were dedicated network analysis staff, and their care and feeding were so expensive that only the largest organizations could afford them. Today much of that expertise has been encoded into analysis software and algorithms that automatically interpret data, convert it into a readable format, and store it efficiently for later analysis or use as evidence. These tools often also have simplified reporting capabilities that can take gigabytes of network data, scrutinize it, and produce a one-page summary of what’s happening on the network. That’s a skill the network analysis staff never mastered.
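Boiling a large volume of traffic records down to a short summary is straightforward to sketch. The flow-record fields here are assumptions for illustration, not any tool’s actual format:

```python
# Sketch of automated report generation: reduce many flow records to a
# short textual summary. Record fields are illustrative assumptions.

def summarize(flows) -> str:
    total = sum(f["bytes"] for f in flows)
    talkers = {}
    for f in flows:
        talkers[f["src"]] = talkers.get(f["src"], 0) + f["bytes"]
    top = max(talkers, key=talkers.get)  # the heaviest source address
    return f"{len(flows)} flows, {total} bytes total; top talker {top}"

flows = [
    {"src": "10.0.0.5", "bytes": 4000},
    {"src": "10.0.0.9", "bytes": 1500},
    {"src": "10.0.0.5", "bytes": 2500},
]
report = summarize(flows)
```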
In some cases, the network monitoring software performs its tasks autonomously. It knows when to capture and analyze network traffic, and it knows when to take action all by itself. This feature is particularly useful for small and medium businesses that cannot justify a 24-hour IT staff monitoring the network. While the software must be told what to look for and what to do when things happen, it is almost always worth the investment to have a monitoring solution that doesn’t sleep, doesn’t eat, and listens to its boss.
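The tell-it-what-to-look-for model can be sketched as a small rule engine; the metric names and thresholds below are hypothetical:

```python
# Sketch of unattended rule evaluation: the administrator declares what to
# look for and what to do, and the monitor acts on its own. The metric
# names and thresholds are hypothetical.

rules = [
    # (condition over a metrics snapshot, action to take)
    (lambda m: m.get("pps", 0) > 100_000, "rate-limit"),
    (lambda m: m.get("failed_logins", 0) > 50, "alert-admin"),
]

def evaluate(metrics: dict):
    """Return the actions triggered by the current metrics snapshot."""
    return [action for cond, action in rules if cond(metrics)]

# A traffic spike trips the rate-limit rule; the login count does not.
actions = evaluate({"pps": 250_000, "failed_logins": 3})
```

Run in a loop against live metrics, a table like this is exactly the boss-obeying, never-sleeping employee described above.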
Monitoring data transfers has long been a challenge. The proliferation of data and the increase of computer speed have made the challenge even more difficult. But the need to monitor data transfers has become more important, not less, over that same time.
Network monitoring software and hardware solutions used to be the sole domain of large corporations and law enforcement organizations that were spying on users or trying to trap a foreign agent. Today these solutions have evolved into inexpensive, simple to use, and versatile tools that every administrator should have in her network analysis and troubleshooting toolbox.