Thursday, October 15, 2020

Anatomy of a Connection

 When I analyze a trace file with a client, I always look for the beginning of the TCP connection or application.

The reason is quite simple: there are many items that you can only see at the beginning of the capture. Sure, you can make assumptions if you don’t have the beginning captured, but life is so much easier if you do.

A great example is the TCP 3-way handshake, which has many nuggets of info that you can’t easily see later in the trace. Something as simple as the MSS is easily identified during the 3-way handshake as the MSS= option. Sure, if you have a bunch of data packets and the TCP length is maxed out at 1,212 bytes, you can assume the MSS is 1,212 bytes, but either end can restrict this. Without the handshake, you will be guessing and need to do more testing.
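If you want to see how little work it takes to pull that value out of a capture programmatically, here is a minimal sketch. It assumes scapy is installed, and “capture.pcapng” is just a hypothetical file name, not anything from the video:

```python
# Minimal sketch, assuming scapy is installed and "capture.pcapng" is a
# hypothetical trace file; prints the MSS advertised in every SYN / SYN-ACK.
from scapy.all import rdpcap, TCP

packets = rdpcap("capture.pcapng")

for pkt in packets:
    if TCP in pkt and int(pkt[TCP].flags) & 0x02:      # SYN bit set
        for opt, value in pkt[TCP].options:             # options are (name, value) pairs
            if opt == "MSS":
                print(f"{pkt[TCP].sport} -> {pkt[TCP].dport}: MSS={value}")
```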

In this video I cover some of the basics that you can use for baselining, protocol education, and troubleshooting. A very important term that I use is “TCP Connect Time,” which is the time it takes for the SYN/ACK to come back in response to the initial SYN. If you have a system that reports this, you might want to check that the product is working properly. Scenarios such as packet loss and retransmissions can throw some of the application reporting tools for a loop.

Measuring the TCP connect time is a lot better and more accurate than any ping will ever be. You can then use the same methodology for HTTP or any other application command/response scenario.
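If you want to reproduce that measurement outside of Wireshark, here is a hedged sketch that pairs each initial SYN with its SYN/ACK and reports the delta. Again, scapy and the file name “capture.pcapng” are assumptions for illustration only:

```python
# Sketch: estimate TCP connect time (initial SYN to SYN/ACK) per connection.
# Assumes scapy; "capture.pcapng" is a hypothetical file name.
from scapy.all import rdpcap, IP, TCP

packets = rdpcap("capture.pcapng")
syn_seen = {}   # (client IP, server IP, client port, server port) -> time of first SYN

for pkt in packets:
    if IP not in pkt or TCP not in pkt:
        continue
    flags = int(pkt[TCP].flags)
    if (flags & 0x12) == 0x02:                          # SYN without ACK
        key = (pkt[IP].src, pkt[IP].dst, pkt[TCP].sport, pkt[TCP].dport)
        syn_seen.setdefault(key, pkt.time)              # keep only the first SYN
    elif (flags & 0x12) == 0x12:                        # SYN/ACK
        key = (pkt[IP].dst, pkt[IP].src, pkt[TCP].dport, pkt[TCP].sport)
        if key in syn_seen:
            delta_ms = (float(pkt.time) - float(syn_seen.pop(key))) * 1000
            print(f"{key[0]}:{key[2]} -> {key[1]}:{key[3]}  connect time {delta_ms:.3f} ms")
```

Note that this measures only the first SYN of each connection, so a retransmitted SYN would inflate the result, which is exactly the kind of scenario that confuses the reporting tools mentioned above.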



Thursday, October 1, 2020

Free ARP Scanner

I can never stress enough that you need to ‘get to know’ your tools, for a bunch of reasons. Here are a few examples:

- Understand how the tool behaves on the network

- Ensure that no extra communication is going on

- Better understand when it doesn’t ‘work’

- Know what changes network equipment or firewalls may require to get your tool to work


In this article I want to show you a cool utility from Nirsoft.net called Wireless Network Watcher (http://www.nirsoft.net/utils/wireless_network_watcher.html). Just a note that I’m not paid by Nirsoft, nor have I been asked by them to include their utility in my article.


This utility scans your network using ARP instead of the more popular pings that most tools use. In the video you will see that no ICMP packets were sent, yet it was still able to identify the host names. Having the MAC address is also helpful, and the ability to run this from the command line is a bonus. On top of that, it’s a portable tool, so you can put it on a USB key.
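To make the ‘get to know your tools’ point concrete, here is a rough sketch of what an ARP sweep looks like under the hood. This is my own illustration using scapy, not Nirsoft’s code; the subnet is a made-up example and it needs to run with administrator/root rights:

```python
# Sketch of an ARP sweep (my own illustration, not Wireless Network Watcher's code).
# Assumes scapy is installed; run with admin/root rights; subnet is hypothetical.
from scapy.all import ARP, Ether, srp

broadcast = Ether(dst="ff:ff:ff:ff:ff:ff")            # layer-2 broadcast frame
who_has = ARP(pdst="192.168.1.0/24")                   # one ARP request per address

answered, _ = srp(broadcast / who_has, timeout=2, verbose=False)

for _, reply in answered:
    # Each reply gives us the live host's IP and MAC -- no ICMP involved.
    print(f"{reply.psrc:<15}  {reply.hwsrc}")
```

Host names are a separate lookup (NetBIOS, mDNS, or DNS), which is presumably how the tool fills in that column without sending any pings.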



Wednesday, September 16, 2020

Analyzing Multiple Trace Files - Setup

I get many emails asking for assistance with analyzing multiple trace files. Let’s start with a quick review of the benefits of having multiple trace files:

  • Determine the source of lost packets

  • Determine network latency

  • Determine the source of out of sequence packets

The hardest part of this process is the setup, or preparing your trace files. I try to keep the capture points as consistent as possible. For example, if you have Wireshark installed on your server, I would prefer that you also have Wireshark installed on the client computer. If you span the server port, I would prefer we span the client port as well; that sort of thing.

If you use Wireshark, use Wireshark for all your captures; don’t take a Wireshark capture from the client but a capture from your router using its own capture software. I’m sure you see where I’m going with this.

The next step is to determine why you need multiple trace files; in this example I will start with something straightforward like looking for dropped/lost packets. The first thing I do is open both trace files and filter them between the two device IP addresses. Then I save each filtered trace file with a meaningful name. Now we can zero in on a conversation between the devices. I typically go to layer 4, in this case the TCP layer. I use the Statistics > Conversations > TCP tab and sort by Bytes or Packets. In this video I choose to sort by Bytes and apply a one-way filter from the data sender to the receiver.
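For anyone who would rather script this step than click through the GUI, here is a minimal sketch of the same “filter to the two device IPs, then save with a meaningful name” idea using scapy. The addresses and file names are hypothetical; the equivalent Wireshark display filter would be ip.addr==10.0.0.10 && ip.addr==10.0.0.20:

```python
# Sketch: filter a capture down to the two device IPs and save it with a
# meaningful name. Assumes scapy; addresses and file names are hypothetical.
from scapy.all import rdpcap, wrpcap, IP

CLIENT, SERVER = "10.0.0.10", "10.0.0.20"

packets = rdpcap("client_side.pcapng")
pair = [p for p in packets
        if IP in p and {p[IP].src, p[IP].dst} == {CLIENT, SERVER}]

# One-way filter from the data sender (here assumed to be the server) to the receiver.
one_way = [p for p in pair if p[IP].src == SERVER]

wrpcap("client_side_filtered.pcapng", one_way)
print(f"kept {len(one_way)} of {len(packets)} packets")
```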

Once I have the filter created on one side, I simply copy and paste the filter to the other trace file. This works because that trace file has the same Layer 3 and higher information. The process will be slightly different if address translation devices are involved in the communication path. I will create another video to cover this technique in the future.

Now I add the TCP sequence number as a column so I can easily see if a packet was retransmitted or lost. For the advanced user, you can export this data to Excel, a database, or a script to compare the sequence numbers and automate the process.
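As a starting point for that kind of automation, here is a hedged sketch that diffs the TCP sequence numbers of data segments between the two filtered traces. The file names are hypothetical, and it assumes both files already hold the same one-way conversation, so the raw sequence numbers are directly comparable:

```python
# Sketch: find sequence numbers seen at the sender but missing at the receiver.
# Assumes scapy; file names are hypothetical; both traces hold the same
# one-way conversation.
from scapy.all import rdpcap, TCP

def data_seqs(path):
    """Return the set of raw sequence numbers of segments that carry data."""
    return {p[TCP].seq for p in rdpcap(path)
            if TCP in p and len(p[TCP].payload) > 0}

sent = data_seqs("sender_side_filtered.pcapng")        # capture near the data sender
received = data_seqs("receiver_side_filtered.pcapng")  # capture near the receiver

missing = sorted(sent - received)
print(f"{len(missing)} data segments appear at the sender but not at the receiver")
for seq in missing[:20]:
    print(f"  raw sequence number {seq}")
```

A segment that shows up in the first set but not the second is a strong hint about where in the path it was lost, which is exactly what the column comparison in Wireshark shows you by eye.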

This will be the first of more videos to cover various aspects and techniques of multiple trace file analysis.


Tuesday, September 15, 2020

Common Sense


According to the Internet, the phrase “Common sense is not so common” originated with a Frenchman – Francois-Marie Arouet – who was a leading figure during the Age of Enlightenment. Francois, who had a knack for catchy phrases, began writing them at the age of 12. Eighteenth Century authorities were not always amused, and he often found himself in and out of the Bastille. He eventually moved to London and adopted the pen name Voltaire.

For those of us who work in the STEM fields, common sense is frequently the starting point from which we design our hypotheses and launch our experiments. The process, loosely defined as the “scientific method”, first appeared c. 1600 BCE, but is generally credited to Aristotle. The great Greek philosopher believed that because the world is a real thing, the best way to discern the truth is by experiencing it.


Such empiricism is the foundation upon which the scientific community has built its enviable reputation, reinforced by the rigor with which the method is applied, peer reviewed, and communicated. “Follow the science” is an oft-heard refrain when complex choices present themselves.


While science has made contributions that changed the course of humankind, not all its discoveries have been trustworthy; there have been some notable failures along the way. Rarely has the path to any scientific discovery been without a few missteps, but some results received far too much credibility before eventually being debunked.


Attaching a famous name to a scientific discovery may add gravitas where none is warranted. One of the most respected physicists of his time, William Thomson (aka Lord Kelvin) was known for his contributions to the study of thermodynamics. Scientists he deemed “soft” (e.g., biologists and geologists) opined about an ancient earth, and so it was only natural for a “hard” physicist to try to prove them wrong. Noting that the once molten earth was cooling, Kelvin used his thermodynamics calculations to estimate that the planet could be no more than 20-40 million years old. His arrogance and public influence further underpinned this “truth.”


Lord Kelvin’s bluster held fast until the advent of radiometric dating which provided a more accurate method for estimating the age of things. We know now that the Earth is around 4.5 billion years old, and Kelvin should have confined himself to his eponymous temperature scale.


Even as renowned a physicist as Albert Einstein was not immune to scientific blunders. Albert was known for his elegant theories of General and Special Relativity, where he wrestled with the effects of gravity, mass and the speed of light. He also went along with a substantial number of his contemporaries in believing that the universe was static.


As great as Einstein’s General Theory of Relativity was, however, there was one catch: in order for it to work, the universe had to be either contracting or expanding. He fixed the apparent contradiction by conjuring the cosmological constant, known in layman’s terms as a fudge factor. As gravity pulled the entire universe inward, this cosmological constant provided the repulsive force that kept everything from collapsing.


True to Aristotle’s vision, empirical data once again overruled learned speculation. Edwin Hubble’s 1929 observation of the red-shift of galaxies proved that they were in fact moving away from us, consistent with Einstein’s General Theory of Relativity but rendering his cosmological constant obsolete. When faced with the data, Einstein admitted his mistake.


The last example is perhaps the most familiar. Prior to their press conference in 1989, the names Stanley Pons and Martin Fleischmann were little known outside of their own specialty field of electrochemistry. That quickly changed when they announced to the world that they had achieved “cold fusion”, basically doing on the kitchen table what the Sun accomplishes at a temperature of around 27 million degrees Fahrenheit. In spite of much well-deserved scientific skepticism, the world wanted their claim to be true because it could lead to an essentially endless supply of clean energy.


It was little more than a month later, after independent attempts to duplicate the results failed, that the energy spikes reported by Pons and Fleischmann were attributed to tritium contamination in their apparatus. Instead of receiving the Nobel Prize for their work, the two electrochemists became forever branded as the originators of “Fusion Confusion.” Shortly thereafter, work on building enormous multi-billion-dollar fusion reactors resumed.


Over 2000 years ago, Aristotle recommended that we pay attention to the real world and base our conclusions on what we see. While a great reputation or the promise of an epic breakthrough are compelling, we should always be wary of results that don’t pass the ever-reliable smell test. Good science requires innovation, knowledge, experience, patience, hard work, and peer review along with a healthy dose of common sense.


Voltaire himself said it best – “Cherish those who seek the truth but beware of those who find it.”

Author Profile - Paul W. Smith - leader, educator, technologist, writer - has a lifelong interest in the countless ways that technology changes the course of our journey through life. In addition to being a regular contributor to NetWorkDataPedia, he maintains the website Technology for the Journey and occasionally writes for Blogcritics. Paul has over 40 years of experience in research and advanced development for companies ranging from small startups to industry leaders. His other passion is teaching - he is a former Adjunct Professor of Mechanical Engineering at the Colorado School of Mines. Paul holds a doctorate in Applied Mechanics from the California Institute of Technology, as well as Bachelor’s and Master’s Degrees in Mechanical Engineering from the University of California, Santa Barbara.
