Thursday, August 20, 2020

Network Traffic Analysis (NTA) - The First Step in Intrusion Detection!

 

Next-generation intrusion detection systems (IDS) are supplanting their legacy predecessors to provide complete security for complex networks. This new breed of security solutions takes advantage of intelligent data and machine learning to provide full network traffic analysis (NTA).



NTA is a term coined by the research firm Gartner.

The company uses the term to separate legacy, mostly layer 3 technology from next-gen, layer 7-based technology. In practice, that means NTA analyzes network activity intelligently to provide comprehensive security.

NTA is now inextricably linked with modern IDS solutions, relying on intelligent data and machine learning to offer full visibility of the network. It works in tandem with, or is complementary to, perimeter protection offerings to provide a holistic view of the entire network, within and beyond the network’s edge.

To understand why NTA is an essential building block of next-gen IDS, we need to examine its critical role regarding data, traffic flow and network deployment.

NTA’s role in complex networks 

Network traffic and data are the two absolutes in any network. Unauthorized access and malicious behaviors occur as network activity, and can be detected within traffic data.

Next-generation IDS relies on complete, holistic data about all network traffic to work effectively. Consequently, all traffic and transactions taking place throughout the network must be analyzed to achieve 100% visibility. Network layers 2 through 7 must be thoroughly analyzed to detect threats and illicit behavior.

Of course, that’s where legacy IDS offerings fail, providing limited visibility of traffic in the upper layers of the network. NTA offers a clear view of all traffic and transactions, capturing data intelligently and automatically.

Less manual manipulation of data means fewer chances of human error. Beyond that, next-generation IDS solutions using NTA are typically lightweight and have no impact on network speed or quality of service once deployed. They work by acquiring relevant information from traffic packets and storing it as intelligent metadata.

This smart, cost-effective, lightweight approach to capturing and analyzing network data is what makes NTA so attractive for next-generation IDS solutions.

How next-generation intrusion detection systems use machine learning and analytics

Machine learning and analytics are critical components of next-generation IDS solutions. These capabilities bridge data to the network platform to offer signature-based, statistical, and anomaly-based protection against threats and malicious behavior.

Analytics and data intelligence are used for investigations and support of threat and behavior detection. They also trigger alerts and inform alert management, offering guidance about issues that have been pinpointed and suggested areas that need additional investigation.

Further, real-time data is seamlessly combined with historical data for advanced forensics and analytics. Warning signals for threats, indicators of compromise (IOCs), attacks and other malicious activity are triggered more accurately as well. The result is that organizations can find and remediate issues quickly and efficiently.

Why NTA is an important consideration when choosing a next-generation IDS 

Network architectures are becoming increasingly sprawling and complex, and IDS solutions need to be able to work with a variety of platforms. That includes public and private cloud environments, data centers and IaaS, PaaS and SaaS deployments.

Next-generation IDS offerings need to integrate easily with third-party applications and data to offer true visibility and coverage. That means a solution should be able to work with third-party APIs and orchestrators. It should also be able to integrate third-party threat intelligence and offer Active Directory integration to provide enriched incident context.

According to Gartner, many of the firm’s clients report that NTA has detected suspicious network traffic other security tools missed. Gartner believes NTA has a vital role to play in security operations and should be a strong consideration for any organization upgrading its network security.

Behavior-based machine learning detection will be a core component in next-gen security, and NTA places behavior analysis at its core. Scalability is another important consideration for many organizations, and NTA’s lightweight nature lends itself to easy and affordable scalability.

The ability to automatically investigate threats and attacks is a major factor in mitigating security breaches. NTA enables intelligent and automated investigation and response, making it an invaluable part of any next-generation IDS solution.

NTA should be an important consideration when choosing a next-generation IDS solution. It’s an ideal fit for today’s complex, sprawling multi-layered network topologies. By analyzing network traffic and behavior intelligently and automatically, NTA builds on its findings through machine learning to pinpoint malicious behavior quickly and efficiently.

NTA-based solutions are also designed to work with public and private cloud infrastructure as well as data centers and other network elements. The end result is a holistic solution offering a unified view of the entire network, its traffic and its behaviors.

This blog post is part of a three-part series on the importance of next-generation IDS solutions for securing complex networks. Our previous post discussed how next-gen IDS solutions can work effectively beyond the network edge: “How intrusion detection systems work effectively beyond the network edge”. 

Our next post, “Protecting against perimeter breaches with network traffic analysis (NTA) in next-generation intrusion detection systems”, will discuss the importance of NTA for detecting illicit activities and behaviors.


Author Andrey Yesyev  - Before joining Accedian as the Director of Cybersecurity Solutions, Andrey spent nearly 6 years at IBM as a security engineer and a threat analytics architect working on QRadar Incident Forensics and DNS Analytics projects. He was also a part of the IBM team that supports collaboration with Quad9, a secure public DNS service that was created as a collaboration between PCH, IBM, and the Global Cyber Alliance. With more than 10 years of experience in deep packet inspection and traffic analytics, Andrey placed 1st, 2nd, 3rd, and 2nd in the Network Forensic Puzzle Contest at DefCon 21, 22, 23 and 24, respectively. 

Wednesday, August 19, 2020

Empathy: A Requiem

 

It was the summer of 1977 when I drove with my fiancée’s younger brother to the Hollywood Boulevard Walk-of-Fame. Neither of us was that impressed with the famous names on the sidewalk; we were there to stand in line at Mann’s Chinese Theater for the very first Star Wars movie. These were the days when Hollywood was still a thing, and Mann’s (built in 1927) was a historic place to see a film. It also boasted the revolutionary new THX sound system. We were blown away before the opening crawl.

That was a long time ago...


For reasons of both nostalgia and curiosity, I sat down at home recently to re-watch Episode IV (that same film). No longer was I captivated by light sabers, the quirky patrons of Mos Eisley Cantina, or the mysteries of “The Force”. Our 4K HD flatscreen and Dolby Surround sound system did a respectable job of blowing me away, but the novelty of the film had worn off, and I was drawn more to the story arc and the various plot clues.


One memorable line was delivered by Sir Alec Guinness as Obi-Wan – “I felt a great disturbance in the Force, as if millions of voices suddenly cried out in terror and were suddenly silenced.” The characters soon discover that the entire planet Alderaan, with all its inhabitants, had just been destroyed. The Force, it seems, has a lot to do with empathy – the ability to connect with the feelings of others or to feel with them.


The entire Star Wars ennealogy, we are reminded at the beginning of each episode, takes place a long time ago in a galaxy far, far away. Assuming the filmmakers intended their work as a glimpse into our future, we can surmise that we will one day travel at the speed of light, colonize planets throughout the galaxy, and still not have paint that doesn’t peel off. It also appears that empathy will be the province of a select few in whom the Force is strong.


There is already some evidence that this waning of empathy is underway. It is hard to miss the impact of Internet anonymity on human civility – it is much like the road rage that arises when the protection of a four-wheeled steel cocoon frees some to vent their accumulated anger.

Not surprisingly, empathetic people are better at reading body language – difficult to do on the Internet – whereas non-empathetic people are less self-aware and more apt to need approval from others. As Kaitlin Phillips points out in “The Future of Feeling”, social technology may superficially link people, but it doesn’t foster genuine human connection. Reading about tragic world events may produce sympathy, but rarely does it lead to true empathy.


As empathy declines and our world becomes ever more divisive, it is easy to get discouraged. Some of the best advice for times like these comes from the late Fred Rogers, the indisputable master at explaining difficult and scary things to children. “Look for the helpers”, Fred would quote his mother as saying.


One such helper is Dylan Marron, an actor/activist who is combating debate, anger and hurt feelings with listening and understanding. Dylan’s podcast “Conversations with People Who Hate Me” is his attempt to turn up the empathy level. His approach is to reach out to people who have expressed anger on the Internet and engage them in a conversation where debating is to be replaced by listening and perhaps agreeing to disagree. The powerful takeaway is this – empathy is not endorsement.


Over forty years had passed when my wife’s younger brother and I went to the Arlington Theater in Santa Barbara to see the ninth and supposedly final movie of the Star Wars saga. The Arlington (opened in 1931) is another historic site for movie-goers. The sound system, special effects, and the film’s characters were no longer revolutionary. Movie theaters themselves were losing the entertainment battle to sophisticated home systems and streaming services.


While the Internet was hosting a lively debate over how the Star Wars plot clues accumulated over four decades would finally be resolved, it was also beginning to take note of a little-known city in central China named Wuhan. Three months later, a global pandemic had changed everything.


Cultural change is a sluggish thing, but from time to time the events of history can provoke a sudden shift. Movie theaters were already succumbing to technology, and it is difficult to imagine ever again paying premium prices and crowding into a room to watch a film, no matter how historic the venue. In a similar way, technology is leading us down a path of isolation where discerning the feelings of others is ever more challenging. It appears that empathy may also be a victim of technology and the times.


I have a very bad feeling about this.

Author Profile - Paul W. Smith - leader, educator, technologist, writer - has a lifelong interest in the countless ways that technology changes the course of our journey through life. In addition to being a regular contributor to NetWorkDataPedia, he maintains the website Technology for the Journey and occasionally writes for Blogcritics. Paul has over 40 years of experience in research and advanced development for companies ranging from small startups to industry leaders. His other passion is teaching - he is a former Adjunct Professor of Mechanical Engineering at the Colorado School of Mines. Paul holds a doctorate in Applied Mechanics from the California Institute of Technology, as well as Bachelor’s and Master’s Degrees in Mechanical Engineering from the University of California, Santa Barbara.

Tuesday, August 18, 2020

Investigating TCP Checksum Issues With Wireshark

Protocol analysis is an ever-changing art because of two significant variables:

Protocols

  • Every time an application gets an update, it might change the way it interacts with protocols.

  • Operating system upgrades may change the actual protocols or drivers.

  • Certain applications might come with their own ‘built-in protocols’.

Tools

  • Every protocol analyzer has its own GUI, its own protocol dissectors or decoders, and its own way of presenting data.

  • Even when you think you’ve got the hang of a tool, the vendor may decide it’s time for an upgrade, which can remove, add or break significant features.

In this example I will focus on Wireshark and TCP checksum issues.

A quick review: a checksum is calculated and included by the sender of the data. The receiver performs the same math, using the same formula, and should arrive at the same checksum value. If it does not, the receiver ‘may’ decide to discard that packet. I say ‘may’ because the behavior depends entirely on the vendor and the specific protocol in question.
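The math both sides run can be sketched in a few lines. This is a minimal stand-alone implementation of the one's-complement 16-bit checksum (RFC 1071) used by IP, TCP and UDP; the payload bytes are purely illustrative (a real TCP checksum also covers a pseudo-header):

```python
def internet_checksum(data: bytes) -> int:
    """One's-complement 16-bit checksum (RFC 1071), as used by IP/TCP/UDP."""
    if len(data) % 2:
        data += b"\x00"  # pad odd-length data with a zero byte
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]   # add the next 16-bit word
        total = (total & 0xFFFF) + (total >> 16)  # fold the carry back in
    return ~total & 0xFFFF

# Sender: compute the checksum over the data and transmit it with the segment.
payload = b"example segments"
cksum = internet_checksum(payload)

# Receiver: run the same math over the data plus the received checksum field.
# An undamaged segment sums to 0xFFFF, whose complement is zero.
assert internet_checksum(payload + cksum.to_bytes(2, "big")) == 0
```

The receiver's verification is just the sender's calculation repeated with the checksum field included: if anything changed in transit, the fold no longer comes out to 0xFFFF and the result is non-zero.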

When it comes to TCP, I have seen scenarios where a bad driver miscalculated the checksum and the receiver discarded the packet. In most cases the receiver will discard a packet that has a TCP checksum issue.

This is the important bit: if you see TCP checksum errors, take a moment to verify whether the ‘corrupted’ packets have responses, with no retransmissions or large delta times. If they do, the packets are not truly corrupted.

Depending on where Wireshark/Npcap captured the packet, it is entirely possible that the checksum had not yet been calculated at the moment of capture. In some cases, TCP checksum offload is enabled on the network card, which creates the same symptom: the hardware fills in the checksum after the capture driver has already seen the packet.
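A common offload symptom is a checksum field of all zeros in frames captured on the sending host, even though the peer responds normally. The helper names below and the hand-built header are hypothetical illustrations (not a Wireshark API), sketching that heuristic with only the standard library:

```python
import struct

def tcp_checksum_field(tcp_header: bytes) -> int:
    """Extract the 16-bit checksum field from a raw TCP header (bytes 16-17)."""
    (cksum,) = struct.unpack_from("!H", tcp_header, 16)
    return cksum

def looks_like_offload(cksum: int, has_response: bool) -> bool:
    """Heuristic (illustrative): a zero checksum on packets captured on the
    sending host, where the peer still responds normally, points to checksum
    offload rather than real corruption."""
    return cksum == 0 and has_response

# Minimal 20-byte TCP header with the checksum left at zero, the way a NIC
# doing checksum offload would hand it to the capture driver.
hdr = struct.pack("!HHIIBBHHH",
                  1234, 80,       # source / destination port (illustrative)
                  0, 0,           # sequence / acknowledgment numbers
                  5 << 4, 0x02,   # data offset (5 words), SYN flag
                  64240,          # window
                  0x0000,         # checksum not yet filled in by the NIC
                  0)              # urgent pointer

print(looks_like_offload(tcp_checksum_field(hdr), has_response=True))  # → True
```

The same idea is why a 'bad' checksum on a sender-side capture, paired with a healthy ACK from the far end, should be treated as an offload artifact rather than corruption.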

This is yet another reason why I prefer to capture packets after they have left the device.


