Wednesday, September 16, 2020

Analyzing Multiple Trace Files - Setup

I get many emails asking for assistance with analyzing multiple trace files. Let’s start with a quick review of the benefits of having multiple trace files:

  • Determine the source of lost packets

  • Determine network latency

  • Determine the source of out of sequence packets

The hardest part of this process is the setup, or preparing your trace files. I try to keep the capture points as consistent as possible. For example, if you have Wireshark installed on your server, I would prefer that you also have Wireshark installed on the client computer. If you span the server port, I would prefer that you also span the client port, that sort of thing.

If you use Wireshark, use Wireshark for all your captures; don’t mix a Wireshark capture from the client with a capture from your router using its own capture software. I’m sure you see where I’m going with this.

The next step is to determine why you need multiple trace files; in this example I will start with something straightforward, like looking for dropped/lost packets. The first thing I do is open both trace files and filter them down to the two device IP addresses. Then I save each filtered trace file with a meaningful name. Now we can zero in on a conversation between the devices. I typically go to Layer 4, in this case the TCP layer. I use the Statistics > Conversations > TCP tab and sort by Bytes or Packets. In this video I sort by Bytes and apply a one-way filter from the data sender to the receiver.
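If you would rather script this step than click through the GUI, here is a minimal sketch of the same idea using Python and scapy (not the Wireshark workflow itself). The file names and IP addresses are placeholders for your own environment, and the display filters in the comments are the GUI equivalents.

```python
# Rough scripted equivalent of "filter both captures down to the two devices,
# apply a one-way filter from sender to receiver, and save with a meaningful
# name". File names and addresses below are placeholders.
from scapy.all import rdpcap, wrpcap, IP, TCP

CLIENT_IP = "10.1.1.50"      # data receiver in this example
SERVER_IP = "192.168.10.20"  # data sender in this example

captures = [("client_side.pcap", "client_side_filtered.pcap"),
            ("server_side.pcap", "server_side_filtered.pcap")]

for infile, outfile in captures:
    packets = rdpcap(infile)

    # Conversation between the two devices, i.e. the display filter
    # ip.addr==10.1.1.50 && ip.addr==192.168.10.20
    both_ways = [p for p in packets
                 if IP in p and {p[IP].src, p[IP].dst} == {CLIENT_IP, SERVER_IP}]

    # One-way TCP traffic from the data sender to the receiver, i.e.
    # ip.src==192.168.10.20 && ip.dst==10.1.1.50 && tcp
    one_way = [p for p in both_ways
               if TCP in p and p[IP].src == SERVER_IP]

    wrpcap(outfile, one_way)   # save the filtered trace with a meaningful name
    print(f"{infile}: kept {len(one_way)} of {len(packets)} packets -> {outfile}")
```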

Once I have the filter created on one side, I simply copy and paste the filter to the other trace file. This works because the other trace file contains the same Layer 3 and higher information. The process will be slightly different if address translation devices are involved in the communication path; I will create another video to cover that technique in the future.

Now I add the TCP sequence number as a column, and I can easily see whether a packet was retransmitted or lost. Advanced users can export this new trace file to Excel, a database, or a script to compare the sequence numbers and automate the process.
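As a small illustration of that automation idea, here is a sketch (again using Python and scapy, with placeholder file names and addresses rather than anything from the video) that collects the sender’s TCP sequence numbers from each filtered capture and reports any that were seen leaving the server but never arrived at the client.

```python
# Minimal sketch: compare the sender's TCP sequence numbers across the two
# filtered captures to spot packets that were lost in transit. File names
# and the sender address are placeholders; for real traffic you may also
# want to key on the TCP ports so separate streams are not mixed together.
from scapy.all import rdpcap, IP, TCP

SENDER_IP = "192.168.10.20"

def sender_seq_numbers(pcap_file):
    """Return the set of TCP sequence numbers sent by the data sender."""
    seqs = set()
    for p in rdpcap(pcap_file):
        if IP in p and TCP in p and p[IP].src == SENDER_IP:
            seqs.add(p[TCP].seq)
    return seqs

server_seqs = sender_seq_numbers("server_side_filtered.pcap")
client_seqs = sender_seq_numbers("client_side_filtered.pcap")

# Sequence numbers captured at the server but never seen at the client are
# candidates for packets dropped somewhere along the path.
missing_at_client = sorted(server_seqs - client_seqs)
print(f"{len(missing_at_client)} sequence numbers never reached the client")
for seq in missing_at_client[:20]:   # show the first few
    print(seq)
```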

This is the first of several videos covering various aspects and techniques of multiple trace file analysis.


Tuesday, September 15, 2020

Common Sense

 


According to the Internet, the phrase “Common sense is not so common” originated with a Frenchman – Francois-Marie Arouet – who was a leading figure during the Age of Enlightenment. Francois, who had a knack for catchy phrases, began writing them at the age of 12. Eighteenth Century authorities were not always amused, and he often found himself in and out of the Bastille. He eventually moved to London and adopted the pen name Voltaire.

For those of us who work in the STEM fields, common sense is frequently the starting point from which we design our hypotheses and launch our experiments. The process, loosely defined as the “scientific method”, first appeared c. 1600 CE, but is generally credited to Aristotle. The great Greek philosopher believed that because the world is a real thing, the best way to discern the truth is by experiencing it.


Such empiricism is the foundation upon which the scientific community has built its enviable reputation, reinforced by the rigor with which the method is applied, peer reviewed, and communicated. “Follow the science” is an oft-heard refrain when complex choices present themselves.


While science has made contributions that changed the course of humankind, not all its discoveries have been trustworthy; there have been some notable failures along the way. Rarely has the path to any scientific discovery been without a few missteps, but some results received far too much credibility before eventually being debunked.


Attaching a famous name to a scientific discovery may add gravitas where none is warranted. One of the most respected physicists of his time, William Thomson (aka Lord Kelvin) was known for his contributions to the study of thermodynamics. Scientists he deemed “soft” (e.g., biologists and geologists) opined about an ancient earth, and so it was only natural for a “hard” physicist to try to prove them wrong. Noting that the once molten earth was cooling, Kelvin used his thermodynamics calculations to estimate that the planet could be no more than 20-40 million years old. His arrogance and public influence further underpinned this “truth.”


Lord Kelvin’s bluster held fast until the advent of radiometric dating which provided a more accurate method for estimating the age of things. We know now that the Earth is around 4.5 billion years old, and Kelvin should have confined himself to his eponymous temperature scale.


Even as renowned a physicist as Albert Einstein was not immune to scientific blunders. Albert was known for his elegant theories of General and Special Relativity, where he wrestled with the effects of gravity, mass and the speed of light. He also went along with a substantial number of his contemporaries in believing that the universe was static.


As great as Einstein’s General Theory of Relativity was, however, there was one catch: in order for it to work, the universe had to be either contracting or expanding. He fixed the apparent contradiction by conjuring the cosmological constant, known in layman’s terms as a fudge factor. As gravity pulled the entire universe inward, this cosmological constant provided the repulsive force that kept everything from collapsing.


True to Aristotle’s vision, empirical data once again overruled learned speculation. Edwin Hubble’s 1929 observation of the red-shift of galaxies proved that they were in fact moving away from us, consistent with Einstein’s General Theory of Relativity but rendering his cosmological constant obsolete. When faced with the data, Einstein admitted his mistake.


The last example is perhaps the most familiar. Prior to their press conference in 1989, the names Stanley Pons and Martin Fleischmann were little known outside of their own specialty field of electrochemistry. That quickly changed when they announced to the world that they had achieved “cold fusion”, basically doing on the kitchen table what the Sun accomplishes at a temperature of around 27 million degrees. In spite of much well-deserved scientific skepticism, the world wanted their claim to be true because it could lead to an essentially endless supply of clean energy.


It was little more than a month later, after independent attempts to duplicate the results failed, that the energy spikes reported by Pons and Fleischmann were attributed to tritium contamination in their apparatus. Instead of receiving the Nobel Prize for their work, the two electrochemists became forever branded as the originators of “Fusion Confusion.” Shortly thereafter, work on building enormous multi-billion-dollar fusion reactors resumed.


Over 2000 years ago, Aristotle recommended that we pay attention to the real world and base our conclusions on what we see. While a great reputation or the promise of an epic breakthrough is compelling, we should always be wary of results that don’t pass the ever-reliable smell test. Good science requires innovation, knowledge, experience, patience, hard work, and peer review along with a healthy dose of common sense.


Voltaire himself said it best – “Cherish those who seek the truth but beware of those who find it.”

Author Profile - Paul W. Smith - leader, educator, technologist, writer - has a lifelong interest in the countless ways that technology changes the course of our journey through life. In addition to being a regular contributor to NetWorkDataPedia, he maintains the website Technology for the Journey and occasionally writes for Blogcritics. Paul has over 40 years of experience in research and advanced development for companies ranging from small startups to industry leaders. His other passion is teaching - he is a former Adjunct Professor of Mechanical Engineering at the Colorado School of Mines. Paul holds a doctorate in Applied Mechanics from the California Institute of Technology, as well as Bachelor’s and Master’s Degrees in Mechanical Engineering from the University of California, Santa Barbara.

Wednesday, September 2, 2020

Syslog – Use it!!

Syslog has been lumped in with SNMP as an ineffective, insecure way to monitor equipment, and I thought it was time I threw my two bits in.

I like to use syslog for the following reasons:

- Centralized location for many devices

- Standard interface when using different vendor makes and models

- Easy to define similar alerts across multiple devices

- Send alerts or ‘push’ as they happen

- I don’t need any device passwords to check device logs or events


A quick Google search will reveal a ton of syslog applications; just be prepared to spend some time learning the various product differences. Here’s what I look for:

- Support for a large number of vendors and devices

- The ability to add or customize alerts

- An easy filtering engine or interface

- Bonus: the ability to set email alerts

The only advice I can give when learning how to use syslog is to determine ahead of time what kind of devices you want to monitor and ensure the product fits that need. For example, in most cases you will use it with network equipment, but in some specific circumstances I’ve used it with printers when they are in a public area. The other point worth noting is to test your syslog server in various scenarios, like device boot-up, interface flapping and anything else you normally have to troubleshoot manually.
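If you want to see exactly what a device sends during those boot-up or interface-flap tests before committing to a product, a throwaway listener is an easy way to do it. The sketch below is plain Python (not any particular syslog product) and assumes the common default of UDP port 514; binding to a port below 1024 usually requires root/administrator rights.

```python
# Bare-bones syslog listener for testing what your devices actually send.
# Listens on UDP/514 and prints each message with the source address.
import socketserver

class SyslogUDPHandler(socketserver.BaseRequestHandler):
    def handle(self):
        # For UDP servers, self.request is (data, socket)
        message = self.request[0].strip().decode(errors="replace")
        print(f"{self.client_address[0]}: {message}")

if __name__ == "__main__":
    with socketserver.UDPServer(("0.0.0.0", 514), SyslogUDPHandler) as server:
        print("Listening for syslog on UDP/514 (Ctrl+C to stop)")
        server.serve_forever()
```

To generate a test message without touching a network device, the standard library’s logging.handlers.SysLogHandler can be pointed at the same address and port.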


