Friday, October 30, 2020

Implementing 802.11ac/ax Does Not Guarantee High Throughput

After upgrading a Wi-Fi network to new technologies that boast speeds in excess of 1 Gbps, network engineers are often frustrated to find that the performance of network applications and services still lags. After all, if the data rate promises 10x the speed, shouldn’t we see a similar improvement in end-user experience?


Well, not necessarily. By now, many of us have learned that, especially in the Wi-Fi space, 1 Gbps does not really mean 1 Gbps. Many variables go into delivering data over radio waves, and as network engineers we are tasked with delivering the best performance possible.


When testing and troubleshooting a Wi-Fi network, it is critical that we understand the difference between PHY data rate, bandwidth, and true throughput – as these values can vary dramatically depending on the environment. These measurements can also help to determine if a given area is performing as it should, or if there is room for improvement.


Wait – PHY data rate, bandwidth, and throughput – aren’t those basically the same thing? Not exactly…


Let’s define these metrics first, then look at how to measure them.


PHY Data Rate is the maximum rate at which the physical layer can transmit. It represents the rate at which data is transferred over the channel, including protocol headers and control and management frames. Usually, this is the number advertised by the access point or controller.


Bandwidth is often used synonymously with the data rate. It usually represents the maximum rate at which data can be transmitted but does not include any physical or data link layer overhead.


Throughput is the actual rate achieved by the data in flight. Only data frames are taken into consideration when measuring throughput; the measurement does not include control frames, management frames, retries, or any other protocol overhead.


To troubleshoot a performance problem, or to test the rate of transfer in each area, it is best to focus on actual throughput measurements. This is commonly done using tools like iPerf, which can create data streams to measure the throughput between two devices. It is not uncommon for the throughput to be less than 50% of the PHY data rate. If it falls to a much lower value, we can start to focus on other metrics such as noise, channel interference, retries and utilization to determine why it is low.
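
For illustration, here is a minimal Python sketch of that workflow. It assumes iperf3 is installed on the client, an iperf3 server is already running at a hypothetical address on the wired side, and an example 802.11ac PHY rate of 866.7 Mbps (2 spatial streams, 80 MHz):

    # Minimal sketch: run an iperf3 TCP test and compare measured throughput
    # to the advertised PHY rate. Assumes iperf3 is installed locally and an
    # iperf3 server is already running at SERVER (hypothetical address).
    import json
    import subprocess

    SERVER = "192.0.2.10"      # hypothetical iperf3 server on the wired side
    PHY_RATE_MBPS = 866.7      # example: advertised 2-stream, 80 MHz 802.11ac rate

    result = subprocess.run(
        ["iperf3", "-c", SERVER, "-t", "10", "-J"],  # 10-second test, JSON output
        capture_output=True, text=True, check=True,
    )
    report = json.loads(result.stdout)

    # "end" -> "sum_received" holds the receiver-side summary (bits per second)
    # for a standard TCP test
    throughput_mbps = report["end"]["sum_received"]["bits_per_second"] / 1e6
    ratio = throughput_mbps / PHY_RATE_MBPS

    print(f"Measured throughput: {throughput_mbps:.1f} Mbps "
          f"({ratio:.0%} of the {PHY_RATE_MBPS} Mbps PHY rate)")
    if ratio < 0.5:
        print("Below ~50% of PHY rate: check noise, interference, retries, utilization.")

Repeating the same comparison in each coverage area gives a simple throughput map of the facility.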




Monday, October 26, 2020

Example: Application Comparison/Baselining

There are many different terms I use to cover ‘Application Baselines’, such as ‘profile’, ‘snapshot’, and other synonyms.


As I mention in the video, you can do this exercise anytime with any application. Some applications are straightforward to figure out, since you see a Write or Read command decoded by your protocol analyzer of choice. Others might not be as easy, and you need to pay attention to other tip-offs, like the sender setting the TCP PSH bit at the end of the Writes/Reads, as well as the direction of traffic flow.
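
As a rough illustration of the PSH tip-off, this Python sketch (using scapy, with a hypothetical capture file name) tallies PSH-flagged TCP segments per traffic direction in a saved capture:

    # Minimal sketch: spot Write/Read boundaries by tallying TCP segments that
    # carry the PSH bit, per direction. Assumes a capture file named
    # "app_baseline.pcap" (hypothetical) and that scapy is installed.
    from collections import Counter
    from scapy.all import rdpcap, IP, TCP

    PSH = 0x08  # TCP flag bit for PSH

    packets = rdpcap("app_baseline.pcap")
    psh_by_direction = Counter()

    for pkt in packets:
        if pkt.haslayer(IP) and pkt.haslayer(TCP) and (int(pkt[TCP].flags) & PSH):
            direction = f"{pkt[IP].src} -> {pkt[IP].dst}"
            psh_by_direction[direction] += 1

    # Segments with PSH set typically mark the end of an application Write/Read,
    # so the dominant direction hints at which side is pushing the data.
    for direction, count in psh_by_direction.most_common():
        print(f"{direction}: {count} PSH-flagged segments")

The direction with the most PSH-flagged segments usually corresponds to the side issuing the Writes.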


The network and server get included in this analysis as well, since you are looking at the packet level. Things to look out for include retransmissions, lost packets, and large latency values.
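
Continuing the sketch above (same hypothetical capture file), a quick pass can also flag likely retransmissions and the largest gaps between packets. A full protocol analyzer such as Wireshark does this far more thoroughly with its tcp.analysis flags:

    # Minimal sketch: rough packet-level health check on the same capture --
    # flag likely TCP retransmissions (repeated sequence numbers with payload)
    # and report the largest gaps between consecutive packets.
    from scapy.all import rdpcap, IP, TCP

    packets = rdpcap("app_baseline.pcap")
    seen, retransmissions, gaps = set(), 0, []
    prev_time = None

    for pkt in packets:
        if prev_time is not None:
            gaps.append((float(pkt.time) - prev_time, pkt.summary()))
        prev_time = float(pkt.time)

        if pkt.haslayer(IP) and pkt.haslayer(TCP) and len(pkt[TCP].payload) > 0:
            key = (pkt[IP].src, pkt[IP].dst, pkt[TCP].sport, pkt[TCP].dport, pkt[TCP].seq)
            if key in seen:
                retransmissions += 1   # same segment seen again -> likely retransmission
            seen.add(key)

    print(f"Likely retransmissions: {retransmissions}")
    for gap, summary in sorted(gaps, reverse=True)[:5]:
        print(f"{gap * 1000:.1f} ms gap before: {summary}")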


This simple exercise really opens your eyes as to how the network, server, application and protocol come together to make things work.



Friday, October 23, 2020

The Most Common Reasons for Wi-Fi Roaming Problems

You are standing in one area of the building, wirelessly connected, and application performance is great. But as you walk to another area, you notice your radio icon loses a signal strength bar or two and things start to lag, or even drop completely.


When this happens, it’s common to initially blame signal coverage and assume that we have stumbled into a dead spot. However, it is entirely possible that another access point is within range of our radio and we are just having problems moving to it. This process is called roaming, and there are several reasons why devices can have problems transitioning from one access point to the next.


Let’s look at the top five causes of Wi-Fi roaming issues:


• Excessive Coverage – Excessive coverage happens when you have too many access points in an area, or the transmit power is higher than necessary for appropriate coverage. Many client devices will not actively look for a new access point until the current signal degrades significantly. Therefore, when there are too many APs, a client device can stay connected to a distant one even when a closer one offers a better signal. Client devices can rely on signal strength, signal-to-noise ratio, data rates, or packet retry rates to determine connection quality.


• Poor Signal Coverage – If there are areas of the building with poor coverage (low or no signal overlap), client devices could be completely disconnected from the network while trying to find a new access point to roam to. It is important to have a clear picture of where these problem areas are in the facility.


• Re-Authentication – Wi-Fi devices on a secure network could take longer to roam from one access point to another because of the time it takes to process EAP re-authentication. This can cause performance problems when using latency-sensitive applications like voice calls or live video streaming. Total roaming time should be kept to 100 ms or less; a rough way to check this from a packet capture is sketched after this list.


• Mismatched Configuration – Configuration mismatches can occur in environments without centralized access point controllers or when multiple controllers are used. If access points in an area are not configured with common settings, roaming can fail. To support roaming, access points need to share the same SSID, authentication type, security credentials, and SSID transmission settings.


• Hidden SSIDs – In environments where Wi-Fi roaming is required, it is best not to use hidden SSIDs. Although there are some use cases where hiding the SSID is necessary, some client devices can have problems jumping to a new AP with this configuration.
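
To illustrate the 100 ms roaming target mentioned above, here is a rough Python sketch (using scapy, with a hypothetical capture file and client MAC) that estimates roam time from a monitor-mode capture as the gap between the client’s reassociation request and the last EAPOL handshake frame that follows it:

    # Illustrative only: estimate roam time from a monitor-mode capture.
    # File name and client MAC are hypothetical; a dedicated WLAN analyzer
    # gives a more complete picture (802.11 auth, reassociation, EAPOL, DHCP).
    from scapy.all import rdpcap, Dot11, EAPOL

    CLIENT = "aa:bb:cc:dd:ee:ff"          # hypothetical client MAC
    packets = rdpcap("roam_capture.pcap")

    roam_start = None
    for pkt in packets:
        if pkt.haslayer(Dot11):
            d = pkt[Dot11]
            # Management frame (type 0), subtype 2 = reassociation request
            if d.type == 0 and d.subtype == 2 and d.addr2 == CLIENT:
                roam_start = float(pkt.time)
                roam_end = roam_start
            # EAPOL key frames to/from the client complete the 4-way handshake
            elif roam_start is not None and pkt.haslayer(EAPOL) and CLIENT in (d.addr1, d.addr2):
                roam_end = float(pkt.time)

    if roam_start is not None:
        print(f"Approximate roam time: {(roam_end - roam_start) * 1000:.1f} ms "
              f"(target: 100 ms or less)")
    else:
        print("No reassociation request from the client found in the capture.")

If the measured gap is consistently above 100 ms, the EAP re-authentication step is a good place to look, and fast-roaming mechanisms such as 802.11r may help where the infrastructure and clients support them.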


It is critical that network engineers have the tools to test for and troubleshoot these Wi-Fi issues, especially as the demand for quality performance increases. From the ground up, the goal is to design a Wi-Fi network that has thorough coverage, solid channel planning, and takes into consideration user capacity and performance. If all these points are met, client devices should have a clear path to jump from one access point to the next when roaming.


What kinds of tools can help? One test that was designed with these needs in mind is the Roaming Test on the AirCheck G2 or EtherScope nXG. As a network engineer walks the environment, the Roaming Test can pinpoint problems with signal coverage, high roaming times, access point misconfigurations, and hidden SSIDs. These can be quickly identified and resolved to help client devices roam quickly and efficiently.



Friday, October 16, 2020

Everything Else

 

Multi-Tasking is a fantasy. There are plenty of high-energy people out there who say they can Multi-Task, and many a job-seeker has laid claim to this skill in an interview. The truth is, they are lying. Humans can only do one thing at a time.

Our modern tools have propagated the Multi-Tasking myth. Our Smart TVs display a picture-in-picture view of two simultaneous channels and our computers show numerous windows. Today’s office workers typically sit facing multiple monitors. As you read this, your email inbox is filling up while text messages pop up on your digital watch and your phone vibrates with new voicemails. You are convinced you are Multi-Tasking, but you are delusional.

Multi-Tasking is commonly believed to be a straightforward case of doing two or more tasks simultaneously. Many of us spend our days trying to do this, not realizing just how little we are actually accomplishing and how stressed out it is making us. You might be able to walk down the stairs while sipping a latte and checking your smartphone, but true Multi-Tasking reaches beyond mere muscle memory. University of Michigan Psychology Professor David Meyer says that our brains simply aren’t wired for complex concurrent tasks.

My wife and I often turn to meal kits as a complement to grocery shopping, meal planning and eating out. For a non-chef like me, preparing food according to detailed instructions, with all the ingredients pre-measured and arranged on the counter in front of me, is a nice way to relax at the end of a busy WFH day. It is not unusual to have the oven, frying pan and saucepan all simultaneously cooking along on different timers. I am constantly shifting from one task to another to make sure nothing is burning. Although our kitchen is Multi-Tasking, I am not. Swapping attention among several tasks is what the experts call Context Switching – it is not true Multi-Tasking.

Cars are so commonplace that we can easily overlook just how truly amazing their technology really is. For a mere $37K (the average price of a new car in the US) you can own an absolute marvel of modern engineering. Upgrade to multi-zoned climate control, onboard navigation, Internet-connected infotainment, voice control/response, multiple cameras and radar proximity alert systems and it’s easy to forget that the goal is to drive the thing from one place to another. We’ve advanced from the kids chanting “Are we there yet?” to the car intoning “Fasten your seat belt”, “Return to the highlighted route” or “Are you attempting to back up?” For most of us, the real-time demands of navigation and traffic will take priority, requiring us to hastily shift our attention to the world beyond the dashboard.

The coolest tangible evidence that cars have gone high-tech is the prominent center-mounted touchscreen which consolidates the controls and readouts into a single familiar device. Not only has the number of options for driver/car interaction increased dramatically but gone is the tactile reassurance that your hand is on the correct control. The rapid shifts of attention from screen to roadway may seem like a Multi-Tasking workout, but in fact they represent another impostor classified by experts like Business School Professor Sophie Leroy as Attention Residue. 

Regardless of whether we are Multi-Tasking (we’re not), Context Switching or victimized by Attention Residue, there is a cost. Harvard Psychologist Daniel Gilbert estimates that we spend nearly 47% of our waking hours doing one thing and thinking about something else. Several studies agree that these task-juggling activities effectively reduce IQ by up to 10 points (and who can afford that?). There is degradation in the brain’s “clipboard” which manages key information and maintains focus. The state of “flow” which lowers our anxiety and enables creative thinking never happens. 

The experts, having focused on this one subject for a while now, agree that Single Tasking is the best way to get things done in less time and at a higher quality. Their advice is simple - concentrate on what we should do and not what we could do. The reduction in stress and the increase in creativity will be seen as proof that our brains are pleased.

The true benefit of modern technology is not that it enables us to Multi-Task. Until we figure out how to rewire our brains, that won’t happen. What it can be very good at, if we only accept it, is freeing us to focus on the one thing that matters most and do it well, while it takes care of everything else.

Author Profile - Paul W. Smith - leader, educator, technologist, writer - has a lifelong interest in the countless ways that technology changes the course of our journey through life. In addition to being a regular contributor to NetworkDataPedia, he maintains the website Technology for the Journey and occasionally writes for Blogcritics. Paul has over 40 years of experience in research and advanced development for companies ranging from small startups to industry leaders. His other passion is teaching - he is a former Adjunct Professor of Mechanical Engineering at the Colorado School of Mines. Paul holds a doctorate in Applied Mechanics from the California Institute of Technology, as well as Bachelor’s and Master’s Degrees in Mechanical Engineering from the University of California, Santa Barbara.
