Friday, July 15, 2022

Measuring Up

 

For better or worse, we humans have been measuring things for a long time. In Genesis Chapter 6, God provides Noah with detailed plans for building a very large wooden boat – 300 x 50 x 30 cubits to be exact. Noah presumably knew how to measure a cubit, the distance from the elbow to the tip of the middle finger. More than 4,500 years ago, the Egyptians built their pyramids using cubits as their standard measure. Long before there was an accurate way to measure time, Galileo (1564-1642) used musicians to supply a steady beat and help determine the acceleration due to gravity. Technology has come a long way, and we now have ridiculously accurate atomic clocks, along with lasers for precise length measurement.


By the Middle Ages, trade had expanded, and a need for recognized standards arose. In the late 18th Century, the French Academy of Sciences decided that the standard for length should be based on the shortest distance from the North Pole to the Equator (passing, of course, through Paris). One ten-millionth of this distance, which would be measured by a pair of French mathematician/astronomers, was christened the “meter.” While less dependent on human anatomy than the cubit, it did pose some difficulty in accuracy and replication. Thanks to modern science, we now know that there were errors in those original calculations, and the resulting meter came out about 0.2 mm shorter than intended. In testimony to the somewhat arbitrary nature of “standards”, that error has never been corrected. The half meridian through Paris hasn’t changed.


Today, the maintenance of standards in the United States is the responsibility of the National Institute of Standards and Technology (NIST). You would expect it to have insanely accurate standards for length, weight, and time, and you would not be disappointed. As an example, the NIST strontium atomic clock would neither gain nor lose a second in 15 billion years; it would still be spot-on had it been started at the dawn of the Universe.


In a somewhat more esoteric vein, NIST maintains the SRM library, where over 1,300 standard reference materials are stored, including such items as whale blubber (SRM-1945) at $803/30g and domestic sludge (SRM-2781) at $726/40g. There is even peanut butter (SRM-2387) at $1069 for three 170g jars. Precise scientific measurements can now confirm what some people have always questioned – domestic sludge and peanut butter are distinctly different.

For nearly 3,000 years, beginning with the ancient Greek Olympic Games, we have also been measuring ourselves in one way or another. We periodically assemble athletes from all over the world to measure who runs the fastest, jumps the highest, or throws things the furthest. Once the athletes started wearing clothes, corporate sponsors showed up and brought even more focus on the numbers. Today even armchair athletes can measure their key metrics with fitness tracking devices and apps.


Our obsession with measuring ourselves grew during the 19th Century. Until then, consumer products were made by skilled artisans who hand-crafted everything from start to finish. Once machines were invented that could stamp, cut, and otherwise fabricate components, manufacturing became more and more of a rote process. The skill level of the workforce dropped, and companies sought ways to measure and standardize worker performance.

Frederick Taylor, founder of Taylorism, believed that management knew too little about the capabilities and motivations of the workers to be effective. Fred deserves most of the blame for starting to quantify individual performance, as well as for the much-despised time-card. His time and motion studies gave rise to the assembly line, where workers were seen as mere mechanisms in the larger machine.


Most of us first felt the harsh impact of all this quantification in school, where we were sorted according to age, grade and rank and soon learned that our future success was at stake. GPAs, SAT scores, performance reviews, salaries, and net worth all signal our growing obsession with self-measurement. My annual physical includes a number of blood tests, producing dozens of cryptic readings which I can plot out over the course of many years. In an attempt to add clarity, each graph includes horizontal lines for “High”, “Low”, and “Average”. As long as my doctor smiles and tells me to come back in a year, I brush it all off and move on. Yet still I wonder, what is all this measuring doing to us?


Having worked as a professional in STEM for forty-plus years, I have personally witnessed and benefited from the continual advances in measurement instrumentation and standards, awaiting the next development with hopeful anticipation. While measurement is a powerful tool to help us understand and control our environment, it can also be somewhat fluid and random. It remains critical to “know your gauge.” As measurement technology improves, previously indeterminate entities will be revealed. If current trends continue, our future might even include a direct, standardized reading of our own intelligence and emotional stability.


I hope not.


Author Profile - Paul W. Smith - leader, educator, technologist, writer - has a lifelong interest in the countless ways that technology changes the course of our journey through life. In addition to being a regular contributor to NetworkDataPedia, he maintains the website Technology for the Journey and occasionally writes for Blogcritics. Paul has over 40 years of experience in research and advanced development for companies ranging from small startups to industry leaders. His other passion is teaching - he is a former Adjunct Professor of Mechanical Engineering at the Colorado School of Mines. Paul holds a doctorate in Applied Mechanics from the California Institute of Technology, as well as Bachelor’s and Master’s Degrees in Mechanical Engineering from the University of California, Santa Barbara.



Thursday, July 14, 2022

New EMA Research Report Amplifies Need for a Strong Network Visibility Architecture

 

The ongoing COVID-19 pandemic and the continued threat of cyber-attacks have put the IT and security industry on high alert, pushing many organizations to move their network infrastructure into the cloud. This shift to support remote work has made it harder to see security threats and maintain performance. To further explore the industry’s response to current visibility initiatives, EMA today released a new report sponsored by Keysight Technologies. The report takes an important look at how IT and security organizations use network visibility architectures to deliver network packets to critical performance and security analysis tools, and at how those architectures need to evolve as organizations adopt hybrid, multi-cloud environments.


Visibility architectures are essential for IT and network security personnel: they provide a holistic view of the enterprise’s entire network, along with a better understanding of what tools are in place, where the network is accessed, and what data feeds into those tools. In line with this focus, a companion website (www.getnetworkvisibility.com) helps network and security engineers explore the report’s findings, including visibility architecture benefits and challenges, companies’ requirements for network packet brokers (NPBs), and how to support hybrid, multi-cloud networks.


Visibility Architecture Benefits and Challenges


I’ve seen firsthand, at top global brands, the benefits of having a visibility architecture in place. By implementing visibility solutions, IT teams can expose hidden problems, eliminate blind spots, improve efficiency, reduce costs, and optimize troubleshooting efforts. The EMA findings support these benefits, stating that organizations using a network visibility architecture improve IT and security productivity and reduce overall security risk. Among survey respondents, 25.2% said visibility solutions improved capacity management, 22.5% reported optimized cloud migration, and 21.9% cited better network and application performance and resiliency. Other benefits of a visibility architecture include better cross-team collaboration, reduced compliance risk, and extended life of analysis tools.


As with any IT solution, there are challenges to adoption and implementation. For visibility architectures, organizations cite scalability, architectural complexity, insufficient budget, and limited cloud visibility among their top concerns. In most cases, the benefits of a visibility architecture end up outweighing these drawbacks. The first task is to choose a solution that fits your enterprise’s needs.


Budget concerns are not to be taken lightly, but there are ways to cut costs even within a visibility architecture. Many application performance, network performance, and security tools are priced by the volume of traffic they process. Filtering out unnecessary traffic, deduplicating packets, and applying other smart filtering functions can bring that cost down, more than paying for the taps and packet brokers while allowing a more scalable, future-proof approach.
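To make that concrete, here is a minimal sketch, in Python with the scapy library, of the kind of reduction a packet broker performs in hardware: drop traffic the analysis tools do not need and discard duplicate packets, such as the same packet captured at two taps. The file names, the port-443 filter, and the size of the deduplication window are illustrative assumptions, not anything prescribed by the report.

```python
# Minimal sketch: filter and deduplicate a capture before it reaches
# usage-priced analysis tools. Assumes scapy is installed and that
# "capture.pcap" is a local capture file (illustrative name).
import hashlib
from collections import OrderedDict

from scapy.all import rdpcap, wrpcap
from scapy.layers.inet import IP, TCP

DEDUP_WINDOW = 1024  # how many recent packet digests to remember


def interesting(pkt):
    """Keep only IPv4 TCP traffic to or from port 443 (illustrative filter)."""
    return pkt.haslayer(IP) and pkt.haslayer(TCP) and (
        pkt[TCP].sport == 443 or pkt[TCP].dport == 443
    )


def reduce_capture(in_file="capture.pcap", out_file="reduced.pcap"):
    seen = OrderedDict()  # recent packet digests, oldest first
    kept = []
    for pkt in rdpcap(in_file):
        if not interesting(pkt):
            continue                     # filtered out: tools never see it
        digest = hashlib.sha256(bytes(pkt)).hexdigest()
        if digest in seen:
            continue                     # duplicate (e.g., seen at two taps)
        seen[digest] = True
        if len(seen) > DEDUP_WINDOW:
            seen.popitem(last=False)     # forget the oldest digest
        kept.append(pkt)
    wrpcap(out_file, kept)
    return len(kept)


if __name__ == "__main__":
    print(f"kept {reduce_capture()} packets")
```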


The Architectural Core: Network Packet Brokers


The gold standard in network visibility is the network packet broker. The EMA report found that advanced features, such as packet filtering, manipulation, and metadata generation, are the characteristics most important to earning a return on investment in the technology. Resilience and reliability ranked second, with manageability and automation third.


Adding credibility to its findings, the report included a quote from an enterprise systems monitoring engineer with a Fortune 500 healthcare company: “Performance is number one for me. Then it’s ease of upgrades. We think about the longevity and stability of the company, too. I also want to know if their customer support is any good. Packet broker management tools are also very important if you have an enterprise-scale deployment.”


Network packet brokers from Keysight have several advantages over the competition, such as no dropped packets, FPGA-based hardware acceleration, ease of use, and multiple features that can run concurrently on the same packet broker. More specific benefits include processing at line rate, parallel zero-loss packet processing (something Gigamon’s CPU-based solutions can’t provide), a patented zero-error fully automated filter compiler, and a drag-and-drop GUI that makes configuring your network fast and simple. In addition, NPBs can load balance and optimize the flow of network data, getting the right data to the right tool at the right time.
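To illustrate the load-balancing point, the following is a rough Python sketch (not Keysight’s implementation) of the core idea: hash each packet’s 5-tuple so that every packet of a given flow is delivered to the same analysis tool. The tool names and the commented-out send_to_tool() call are hypothetical placeholders.

```python
# Minimal sketch of flow-aware load balancing: a stable hash of the 5-tuple
# keeps all packets of a flow going to the same tool. Requires scapy and
# capture privileges; the TOOLS list is purely illustrative.
import hashlib

from scapy.all import sniff
from scapy.layers.inet import IP, TCP, UDP

TOOLS = ["ids-1", "ids-2", "apm-1"]  # hypothetical pool of analysis tools


def flow_key(pkt):
    """Build a direction-insensitive 5-tuple key for the packet's flow."""
    proto = pkt[IP].proto
    sport = dport = 0
    if pkt.haslayer(TCP):
        sport, dport = pkt[TCP].sport, pkt[TCP].dport
    elif pkt.haslayer(UDP):
        sport, dport = pkt[UDP].sport, pkt[UDP].dport
    ends = sorted([(pkt[IP].src, sport), (pkt[IP].dst, dport)])
    return f"{proto}|{ends[0]}|{ends[1]}"


def pick_tool(pkt):
    digest = hashlib.md5(flow_key(pkt).encode()).digest()
    return TOOLS[digest[0] % len(TOOLS)]


def handle(pkt):
    if pkt.haslayer(IP):
        tool = pick_tool(pkt)
        print(f"{pkt[IP].src} -> {pkt[IP].dst} routed to {tool}")
        # send_to_tool(tool, pkt)  # actual delivery is out of scope here


if __name__ == "__main__":
    sniff(count=20, prn=handle)  # sample a few live packets (needs root)
```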


The report also confirmed the importance of packet brokers when it comes to cybersecurity, citing that the most valued packet manipulation or data generation feature in a packet broker is threat intelligence. In fact, EMA said “organizations that are the most successful with visibility architecture were the most likely to value threat intelligence, suggesting that it’s a best practice to seek this capability from packet broker vendors.”


Supporting Hybrid, Multi-Cloud Visibility


The EMA report confirmed what we already know: packet data is essential to cloud operations, especially for security monitoring and analysis. In fact, nearly 65% of respondents say packet data is important for security monitoring and analysis in the cloud.

Newer virtual packet brokers like Keysight’s CloudLens can cost-effectively provide private, public, and hybrid cloud packet data to tools for analysis, including east-west traffic visibility. These advances let operational personnel treat the cloud as just another extension of their physical network and move toward alignment with their security colleagues.


Another standout statistic from the report is that 99% of companies are making at least some attempt to collect packet data in the cloud and supply it to performance and security analysis tools. An infrastructure analyst with a Fortune 500 energy company was quoted in the report as saying, “[Network visibility architecture solutions] can definitely offer value in the cloud, because you need that network traffic when you’re doing end-to-end transactions. Without a packet broker in the cloud, I could deploy a fleet of Linux servers in the cloud running TCPDUMP, but that would be too costly.”
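For context on what that do-it-yourself alternative looks like, here is a minimal sketch that wraps tcpdump from Python; the interface name, rotation settings, and filter are illustrative assumptions. Every cloud workload would need to run something like this, plus a way to ship the files to the analysis tools, which is why the approach gets costly at scale.

```python
# Minimal sketch of per-host DIY capture with tcpdump (assumes a Linux host
# with tcpdump installed and root privileges; "eth0" is an assumed interface).
import subprocess

CAPTURE_CMD = [
    "tcpdump",
    "-i", "eth0",                # capture interface
    "-n",                        # don't resolve names
    "-w", "capture-%s.pcap",     # strftime %s: epoch timestamp in the filename
    "-G", "300",                 # rotate the output file every 5 minutes
    "-W", "12",                  # stop after 12 rotated files (about one hour)
    "tcp or udp",                # BPF filter: skip everything else
]

if __name__ == "__main__":
    # Each cloud instance would run this, multiplying cost per workload.
    subprocess.run(CAPTURE_CMD, check=True)
```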


The research from EMA makes it abundantly clear that network visibility architectures are essential to a company’s success. By implementing solutions such as Keysight Technologies’ network packet brokers, your enterprise will be strategically positioned for exceptional network visibility with no dropped packets or traffic loss. Take a look at the entire EMA report for a comprehensive view of how enterprises can benefit from a network visibility architecture.


For more information on understanding network visibility and resources on how to implement the tools, check out everything the www.getnetworkvisibility.com community has to offer. You can download a copy of the EMA report here.
