February 18, 2020

Wireshark Interfaces and File List Tip

Networkdatapedia.com has been asking for material that focuses on knowing your network and/or knowing your tools.


Sounds pretty simple, but trust me, this is anything but simple or obvious. When you use the same tool long enough that it becomes your ‘favorite’ or ‘go-to’ tool, you might be resistant to trying new ones.


A great example is back in the early 90s, when I was using Network General Sniffer products. I was getting very comfortable with it and was actually solving issues with no training. Through the years I heard of LANalyzer, Capsa, Cinco, NetXRay, Observer, Microsoft Network Monitor, Protocol Inspector and, of course, Ethereal (aka Wireshark), as well as a ton I’ve probably forgotten.


I remember showing my Sniffer salesperson Microsoft Network Monitor and Ethereal, explaining some of the features I liked. His response was “don’t waste your time on that free stuff,” followed up with “how good can it possibly be when it’s free?”


I soon figured out that every tool has its pluses and minuses and figuring out what works best for you is the toughest part. When you find that tool that you always reach for first, you need to take the time to learn all the nuances and what features new versions may bring – or break ;)

In this video I spend a few minutes showing you how to clear your ‘most recently used’ file list and how to hide network interfaces you won’t be using. It’s important to note that hiding an interface does not delete, disable or otherwise affect it.


For example, if you hide your Wi-Fi adapter in Wireshark, you can still use it to surf, ping, etc.; it will just be hidden from the available adapter list in Wireshark.


Enjoy.



February 10, 2020

Determining ARP Refresh Rate With Wireshark

 


There have been more than a few times when I had to illustrate that ARP was the issue.

Whether you suspect an ARP issue, need to understand how often a device ARPs, or just want a good challenge with your favorite packet analyzer, this is a great exercise.
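If you export ARP request timestamps from Wireshark (for example, apply the display filter arp.opcode == 1 and export the time and sender address fields), a few lines of Python can compute the refresh interval per host. This is a rough sketch with made-up sample data, not the method from the video:

```python
from collections import defaultdict

def arp_intervals(packets):
    """Group ARP requests by sender and compute the gaps between them.

    `packets` is a list of (timestamp_seconds, sender_ip) tuples, e.g.
    exported from Wireshark with the display filter `arp.opcode == 1`.
    Returns {sender_ip: [interval_seconds, ...]}.
    """
    last_seen = {}
    intervals = defaultdict(list)
    for ts, sender in sorted(packets):
        if sender in last_seen:
            intervals[sender].append(round(ts - last_seen[sender], 3))
        last_seen[sender] = ts
    return dict(intervals)

# Fake trace: one host re-ARPing roughly every 60 seconds.
trace = [(0.0, "10.0.0.5"), (60.1, "10.0.0.5"), (120.0, "10.0.0.5"),
         (5.0, "10.0.0.9")]
print(arp_intervals(trace))  # → {'10.0.0.5': [60.1, 59.9]}
```

A steady interval like the one above suggests a periodic refresh timer; wildly varying gaps usually point at cache churn or a misbehaving device.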

February 04, 2020

DNS Client Issues

In this video I wanted to show you an issue that I encounter quite often. DNS is one of those protocols we all take for granted, and most people believe that if it’s working, you can’t do much to tune it.

There are many things you can do to improve DNS performance. One of the more common techniques is to configure a device as a local DNS server, cache or relay for those scenarios where you might have slow internet connections.

I want to focus on the client configuration. I’ve seen DNS server entries (manually configured or DHCP-assigned) that are problematic: for example, DNS servers that no longer exist, typos, slow DNS servers, or DNS servers located on slow links or paths. In this case I highlighted what your packet trace will look like when you attempt to use a device that is not a DNS server as your DNS server. I also explain why the ICMP packets are important in this process.
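As a rough sketch of why a bad first entry hurts: a client typically times out and retries against the dead server before falling back to the next one in its list. The timeout and attempt counts below are illustrative only; real values vary by operating system and resolver:

```python
def worst_case_lookup_delay(servers, timeout=5.0, attempts=2):
    """Estimate how long a client stalls when some configured DNS
    servers are dead.

    `servers` is the client's resolver list in order; dead servers are
    marked False. Each dead server costs `attempts * timeout` seconds
    before the client falls back to the next entry. The timeout/retry
    defaults here are illustrative, not any particular OS's values.
    """
    delay = 0.0
    for alive in servers:
        if alive:
            return delay  # first healthy server answers
        delay += attempts * timeout
    return delay  # every server failed

# Stale primary (decommissioned box), healthy secondary:
print(worst_case_lookup_delay([False, True]))  # → 10.0 seconds of waiting
```

Note that if the dead address belongs to a live host that simply isn’t running DNS, you may get an ICMP port unreachable back instead of silence, which can shortcut the retries; that behavior is exactly what the trace in the video shows.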

The big takeaway is to review your DNS, or any name server, configurations every so often to ensure there aren’t any issues.



January 20, 2020

Slice It Smart: Extend Your Capture Time With Packet Slicing

I would say packet slicing is one of the most critical techniques to understand.

Back in the day when we had hard drives with limited disk space and we needed to capture for long periods of time, we used packet slicing.

Packet slicing in Wireshark is one of those features that doesn’t get much love, but once you use it, you wonder how you ever captured packets without it. The basic idea is simple: instead of grabbing the entire packet payload, you only capture the first N bytes. For many troubleshooting and analysis tasks, that’s more than enough to see headers, flags, and protocol behavior without hauling around a ton of unnecessary data.

One of the biggest benefits of packet slicing is smaller capture files. Full packet captures can balloon in size fast, especially on busy links or during long troubleshooting sessions. By slicing packets, you drastically reduce disk usage and make your capture files easier to store, share, and archive. This is especially handy when you need to send a capture to a colleague or attach it to a ticket without watching your email client cry.

Packet slicing also improves performance during both capture and analysis. Writing less data to disk means less I/O overhead, which can be critical on laptops, virtual machines, or resource-constrained systems. Later on, when you open the capture in Wireshark, smaller files load faster, filters apply quicker, and scrolling through packets feels noticeably smoother. Less data means less waiting.

Finally, packet slicing can help reduce risk and noise. By not capturing full payloads, you lower the chance of collecting sensitive or private data you don’t actually need for troubleshooting. In many cases, like diagnosing TCP handshakes, DNS issues, or routing problems, the headers tell the whole story. Packet slicing keeps your captures focused, efficient, and a little safer, proving that sometimes less really is more when you’re packet wrangling.

I use packet slicing for a slightly different situation. Sure, I might have a large drive, but network speeds are much higher than they were 15 years ago. The other important reason I use packet slicing is when the data is sensitive and we are not allowed to see the captured payloads. There are some other reasons covered in the video.
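As a minimal sketch of the idea (the byte counts below are illustrative), slicing is just truncating each captured frame to a snap length:

```python
def slice_packets(packets, snaplen=96):
    """Truncate each raw packet to the first `snaplen` bytes, the same
    idea as a capture snap length. 96 bytes comfortably covers an
    Ethernet (14) + IPv4 (20) + TCP (20) header plus common options,
    while discarding the payload entirely.
    """
    return [p[:snaplen] for p in packets]

# Three fake 1500-byte frames:
full = [bytes(1500) for _ in range(3)]
sliced = slice_packets(full, snaplen=96)

before = sum(len(p) for p in full)    # 4500 bytes
after = sum(len(p) for p in sliced)   # 288 bytes
print(f"saved {100 * (1 - after / before):.0f}% of capture size")
```

On full-size frames, keeping only the headers cuts the file by more than 90 percent, and as a bonus the payload, sensitive or not, never touches your disk.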

The point of the video is to introduce you to packet slicing, but you should go look at your own packet capture tool to determine whether it supports packet slicing and how to configure it.







January 13, 2020

A Random Walk(*) with Artificial Intelligence (by Paul W. Smith)

 


My wife and I recently joined some of the last homeowners to contribute to the $11 billion robot vacuum industry. Since I am usually the one who does the vacuuming, I liked the idea of completing this chore with the simple push of a button. It was an exciting day when I took our new Roomba out of the box, placed it on its little charging dock (which we located in the laundry room for our convenience), and then stood back and pressed “Clean” on my iPhone app.

Our Roomba (which we named “MaxBot”) banged up against the dryer a few times, then headed off down the hallway toward the living room. Our 13-year-old Shih Tzu soon began barking frantically, having failed to grasp the utility of this intruder. At his age, the only thing he does frantically anymore is eat; it was clear that a black hockey puck bigger than he is should not be scooting around the house unattended – at least not on his watch. Lesson #1: AI robots are not for everyone.

According to the promotional literature, robot vacuums use Artificial Intelligence to map out and store the boundaries of your home, calculating the most efficient way to cover every square foot. They are also supposed to anticipate their own demise, returning to the home base and recharging when needed.

On the morning after the first trial run, I awoke expecting to find a bunch of neatly vacuumed rows, like the freshly mowed patterns on a major league baseball field. I assumed the robot would be back in the laundry room, with full batteries and eagerly awaiting its next cleaning assignment. Instead, MaxBot was cowering guiltily under the dining room table and the living room looked as if it had been run over by a drunk monkey. While I was peacefully dreaming of things unrelated to household chores, MaxBot had texted a desperate message to my phone, pleading for a charge. The laundry room base station was the most convenient for us, but not for MaxBot which was unable to find its way back there. Lesson #2: AI will require some accommodations.

The naturally intelligent folks at Stanford have studied the impact of emerging AI developments on robotics, predicting that by the end of the next decade, domestic robots like ours will be much more common. They noted that current robot vacuums don’t do stairs, while the majority of homes have one or more of them (MaxBot will tip over about 30 degrees at our top stair before altering course in search of level ground). Those same folks who tape over the little camera on their laptop computer have also expressed concern that robotic vacuums contain valuable data about the size and floor-plan of our rooms, coupled with the geo-location of the home. Stanford NI believes these, and other problems, will eventually be overcome.

One of the most puzzling aspects of AI in many of its applications is that it can be hard for even the developers to understand what it’s up to. Algorithms based on self-learning, neural networks, or deep learning can make AI smarter without revealing exactly how it got that way. One example from the medical field was recently reported in New Scientist.

If your cardiologist diagnoses you with a serious incurable heart problem, you would understandably want to know just how much time you have left. Doctors expect this question and will make a reasonable and compassionate effort to answer it. Meanwhile, scientists in a Pennsylvania healthcare group have been evaluating an AI solution.

In order to avoid the inevitable ethical questions, their AI test system was given electrocardiogram data for patients whose date of demise was already known. The researchers not only found that the AI was much more accurate than the cardiologists in its predictions, it also forecast the risk of death in patients previously classified as having a normal ECG. Although this new AI technology can tell you when you will die, no one is quite sure how it knows.

All of which leads to the most important lesson of all, Lesson #3: Don’t expect too much from technology, especially one that cleans like a drunk monkey.

(*) The movements of an object or changes in a variable that follow no discernible pattern or trend.

Author Profile - Paul W. Smith - leader, educator, technologist, writer - has a lifelong interest in the countless ways that technology changes the course of our journey through life. In addition to being a regular contributor to NetworkDataPedia, he maintains the website Technology for the Journey and occasionally writes for Blogcritics. Paul has over 40 years of experience in research and advanced development for companies ranging from small startups to industry leaders. His other passion is teaching - he is a former Adjunct Professor of Mechanical Engineering at the Colorado School of Mines. Paul holds a doctorate in Applied Mechanics from the California Institute of Technology, as well as Bachelor’s and Master’s Degrees in Mechanical Engineering from the University of California, Santa Barbara.



