Wednesday, July 20, 2022

5 reasons TAPs are better than SPAN ports.

 

  1. If the SPAN port becomes overloaded, frames are dropped. Because a SPAN session copies full-duplex traffic, a fully loaded 1Gbps link can produce 2Gbps of traffic for the monitor port, oversubscribing that port and causing all output traffic beyond 1Gbps to be dropped (see the short sketch after this list). Note also that SPAN traffic is the lowest-priority traffic in the switch.

  2. Proper spanning requires that a network engineer configure the switches correctly, which takes time away from more important tasks; the configurations can also become a political issue, constantly creating contention between the IT team, the security team, and the compliance team. The TAP, on the other hand, is independent of the network endpoints making up a link. TAPs can be inserted at many different points in the network, offering access for a variety of analysis, compliance, and security tools.

  3. Because of a TAP’s independence from the network endpoints, it can copy 100% of the data to the monitor port. Physical layer errors, error packets, short frames, and other packets that might be filtered out on a SPAN session are all passed through TAPs to the monitor port(s). This provides the IT Manager with a legally defensible, pure data stream for analysis and reporting. TAPs guarantee access to all the data all the time.

  4. TAPs do not analyze packets, change packet timing, or otherwise alter or interfere with network traffic. Spanning or mirroring, by contrast, changes the timing of the frame interaction, so what you see is not what really happened.

  5. TAPs also provide flexibility in how they pass traffic to the monitor port. There are four different modes of operation: Breakout, Aggregation, Regeneration, and Inline (Bypass), allowing for both out-of-band and in-line operation.
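
To make the arithmetic in point 1 concrete, here is a minimal sketch in Python (the 1Gbps figures simply restate the example above; they are not measurements):

    # Oversubscription example from point 1: a full-duplex 1Gbps link can carry
    # traffic in both directions at once, but a 1Gbps SPAN/monitor port can only
    # forward 1Gbps toward the monitoring tool.
    link_speed_gbps = 1.0                # full-duplex link being monitored
    offered_gbps = 2 * link_speed_gbps   # both directions copied by the SPAN session
    span_capacity_gbps = 1.0             # output capacity of the monitor port

    dropped_gbps = max(0.0, offered_gbps - span_capacity_gbps)
    print(f"Offered to the SPAN session: {offered_gbps} Gbps")
    print(f"Monitor port can forward:    {span_capacity_gbps} Gbps")
    print(f"Dropped at full load:        {dropped_gbps} Gbps")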

In summary, IT Managers are increasingly turning to TAPs as the preferred method for connecting network, performance, and security tools. TAPs provide access to all the data to ensure an accurate analysis, and they offer fail-safe operation, avoiding the risk of network disruption caused by a power interruption or the failure of an appliance.


To learn more about network TAPs and the key features that optimize your network traffic visibility, visit https://www.networkcritical.com/network-taps and become an expert on your network!




Monday, July 18, 2022

Wireshark Silent Install

 I will keep this write-up short.


In this video, I show you how to silently install Wireshark, so no installation prompts appear.


It's a really helpful tip when you need to install Wireshark quickly or have someone install it for the first time.
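
If you'd rather not follow along with the video, the whole trick is to run the Windows installer with its silent switch. Here is a minimal sketch in Python, assuming the standard NSIS-based Wireshark installer (the file name, version, and path below are placeholders for whatever you download); run it from an elevated session, and note that the bundled Npcap capture driver may still need its own separate install:

    import subprocess

    # Path to the Wireshark installer you downloaded.
    # The file name and version here are placeholders only.
    installer = r"C:\Users\Public\Downloads\Wireshark-win64-3.6.7.exe"

    # /S is the standard NSIS silent switch: no prompts, no installer windows.
    subprocess.run([installer, "/S"], check=True)

    print("Wireshark installed silently.")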



Friday, July 15, 2022

Measuring Up

 

For better or worse, we humans have been measuring things for a long time. In Genesis Chapter 6, God provides Noah with detailed plans for building a very large wooden boat – 300 x 50 x 30 cubits to be exact. Noah presumably knew how to measure a cubit, the distance from the elbow to the tip of the middle finger. More than 4,000 years ago, the Egyptians built their pyramids using cubits as their standard measure. Long before there was an accurate way to measure time, Galileo (1564-1642) used musicians to supply a steady beat and help determine the acceleration due to gravity. Technology has come a long way, and we now have ridiculously accurate atomic clocks, along with lasers for precise length measurement.


By the Middle Ages, trade had expanded, and a need for recognized standards arose. In the late 18th Century, the French Academy of Sciences decided that the standard for length should be based on the shortest distance from the North Pole to the Equator (passing, of course, through Paris). One ten-millionth of this distance, which would be measured by a pair of French mathematician/astronomers, was christened the “meter.” While less dependent on human anatomy than the cubit, it did pose some difficulty in accuracy and replication. Thanks to modern science, we now know that there were errors in those original calculations, and the meter they produced falls about 0.2 mm short of the intended one ten-millionth. In testimony to the somewhat arbitrary nature of “standards”, that error has never been corrected. The half meridian through Paris hasn’t changed.


Today the maintenance of standards in the United States is the responsibility of the National Institute of Standards and Technology (NIST). You would expect them to have insanely accurate standards for length, weight, and time, and you would not be disappointed. As an example, the NIST strontium atomic clock is accurate to within one second over 15 billion years; it would not have gained or lost even a second if it had been started at the dawn of the Universe.


In a somewhat more esoteric vein, NIST maintains the SRM library, where over 1,300 standard reference materials are stored, including such items as whale blubber (SRM-1945) at $803/30g and domestic sludge (SRM-781) at $726/40g. There is even peanut butter (SRM-2387) at $1069 for three 170g jars. Precise scientific measurements can now confirm what some people have always questioned – domestic sludge and peanut butter are distinctly different.

For at least 3,000 years, beginning with the ancient Greek Olympic Games, we have also been measuring ourselves in one way or another. We periodically assemble athletes from all over the world to measure who runs the fastest, jumps the highest or throws things the furthest. Once the athletes started wearing clothes, corporate sponsors showed up and brought even more focus on the numbers. Today even arm-chair athletes can measure their key metrics with fitness tracking devices and apps.


Our obsession with measuring ourselves grew during the 19th Century. Until then, consumer products were made by skilled artisans who hand-crafted everything from start to finish. Once machines were invented that could stamp, cut, and otherwise fabricate components, manufacturing became more and more of a rote process. The skill level of the workforce dropped, and companies sought ways to measure and standardize worker performance.

Frederick Taylor, founder of Taylorism, believed that management knew too little about the capabilities and motivations of the workers to be effective. Fred deserves most of the blame for starting to quantify individual performance, as well as for the much-despised time-card. His time and motion studies gave rise to the assembly line, where workers were seen as mere mechanisms in the larger machine.


Most of us first felt the harsh impact of all this quantification in school, where we were sorted according to age, grade and rank and soon learned that our future success was at stake. GPAs, SAT scores, performance reviews, salaries, and net worth all signal our growing obsession with self-measurement. My annual physical includes a number of blood tests, producing dozens of cryptic readings which I can plot out over the course of many years. In an attempt to add clarity, each graph includes horizontal lines for “High”, “Low”, and “Average”. As long as my doctor smiles and tells me to come back in a year, I brush it all off and move on. Yet still I wonder, what is all this measuring doing to us?


Having worked as a professional in STEM for forty-plus years, I have personally witnessed and benefited from the continual advances in measurement instrumentation and standards, awaiting the next development with hopeful anticipation. While measurement is a powerful tool to help us understand and control our environment, it can also be somewhat fluid and random. It remains critical to “know your gauge.” As measurement technology improves, previously indeterminate entities will be revealed. If current trends continue, our future might even include a direct, standardized reading of our own intelligence and emotional stability.


I hope not.


Author Profile - Paul W. Smith - leader, educator, technologist, writer - has a lifelong interest in the countless ways that technology changes the course of our journey through life. In addition to being a regular contributor to NetworkDataPedia, he maintains the website Technology for the Journey and occasionally writes for Blogcritics. Paul has over 40 years of experience in research and advanced development for companies ranging from small startups to industry leaders. His other passion is teaching - he is a former Adjunct Professor of Mechanical Engineering at the Colorado School of Mines. Paul holds a doctorate in Applied Mechanics from the California Institute of Technology, as well as Bachelor’s and Master’s Degrees in Mechanical Engineering from the University of California, Santa Barbara.


