
Delivering Breakthrough Performance with 802.11ac

Breaking the wireless Ethernet gigabit barrier

Consumers continue to adopt multiple connected devices, and video content is expected to account for more than 70 percent of global traffic. This growth and the increased reliance on wireless networks are putting stress on existing 802.11a/b/g/n networks. As a result of this high usage, users are likely to experience degraded performance, choppy video and slower load times. At a time when IT managers report that network users now average more than one Wi-Fi connected device per person, solutions that can handle the rapid growth of devices are at a premium.

The next generation of the 802.11 standard, or IEEE 802.11ac, promises to finally break the wireless Ethernet gigabit barrier. This technology will deliver higher bandwidth while retaining better quality of experience (QoE) for end users, and is expected to be adopted rapidly into all markets: residential, enterprise, carrier and large venue.

Some of the first applications for 802.11ac's faster speeds will be better residential video streaming, data syncing between mobile devices, and data backup. Streaming digital media between devices faster, and connecting more wireless devices simultaneously, will be among the first benefits for consumers and enterprises. Service providers will be able to deploy the new technology to offload traffic from congested 3G and 4G-LTE cellular networks, and in dense operator hotspots 802.11ac will deliver better performance to more users.

To date, all 802.11 revisions have focused on increasing transport speeds, which leads to higher traffic delivery rates and ultimately to faster response times as experienced by the end user. The introduction of 802.11n brought the advances of MIMO (multiple-input, multiple-output), which delivers traffic over multiple spatial streams, and packet aggregation. MIMO delivered marked improvements in physical transport rates, enabling more bits per second to be transmitted over Wi-Fi than ever before. Packet aggregation delivered equally impressive improvements in transport efficiency, allowing devices to send more data once they had gained access to the wireless medium. The new 802.11ac protocol continues down this path by preserving aggregation techniques, advancing the physical transport rates yet again, and introducing parallel transport into Wi-Fi through a technique known as Multi-User MIMO (MU-MIMO), in which multiple client devices receive packets concurrently.
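As a rough illustration of how these physical-layer advances compound, the OFDM data rate is the product of spatial streams, data subcarriers, coded bits per subcarrier and coding rate, divided by the symbol duration. The sketch below uses that standard relationship; the helper function itself is hypothetical, while the subcarrier counts and modulation parameters are the published 802.11n/802.11ac values for the configurations shown:

```python
def phy_rate_mbps(n_ss, n_data_subcarriers, bits_per_symbol, coding_rate,
                  symbol_us=3.6):
    """OFDM PHY rate = streams * subcarriers * bits/symbol * code rate / symbol time.

    symbol_us=3.6 corresponds to the short guard interval used by 11n/11ac.
    """
    return n_ss * n_data_subcarriers * bits_per_symbol * coding_rate / symbol_us

# 802.11n, 40 MHz (108 data subcarriers), 64-QAM rate 5/6, one spatial stream:
n_rate = phy_rate_mbps(1, 108, 6, 5 / 6)     # 150.0 Mbps

# 802.11ac, 80 MHz (234 data subcarriers), 256-QAM rate 5/6, three streams:
ac_rate = phy_rate_mbps(3, 234, 8, 5 / 6)    # 1300.0 Mbps
```

Wider channels, denser modulation and more spatial streams multiply together, which is how 802.11ac moves the headline PHY rate from 150 Mbps to 1,300 Mbps and past the gigabit barrier.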

This is the first time in Wi-Fi history that directed traffic can be delivered to multiple client devices at the same time. This capability has a significant impact on the delivery of content to any location with multiple users, especially where content is revenue-generating or critical.

Achieving Increased Gigabit+ Performance with 802.11ac
To reach the best performance, 802.11ac uses a variety of advancements, addressing the need for performance improvement through three primary initiatives:

  1. Increasing Raw Bandwidth allows for the higher speeds associated with 802.11ac. It makes use of a higher rate encoding scheme known as 256-QAM, which transmits 33 percent more data than the 64-QAM used in the 802.11n standard. Signal-to-noise ratios that worked for 802.11n are no longer sufficient because the difference in detectable signal level is now significantly smaller.
  2. Multi-user Support makes 802.11ac a real information superhighway, unlike its predecessors that only allowed one device to transmit at a time. MU-MIMO allows an access point to transmit data to multiple client devices on the same channel at the same time. It works by directing some of the spatial streams to one client and other spatial streams to a second client. MU-MIMO is critical to performance improvements in environments with high client counts.
  3. Individual Client Channel Optimization is also a major performance booster. The concept behind channel optimization is transmit beamforming (TxBF). The reflections and attenuations, common during the transmission of 802.11 signals, have a significant performance impact on overall network performance. With TxBF, the access point communicates with the client devices to determine the types of impairment that are present in the environment. Then the access point "precodes" the transmitted frame with the inverse of the impairment such that when the next frame is transmitted and transformed by the medium, it is received as a clean frame by the client. Since no two clients are in the same location, TxBF needs to be applied on a client-by-client basis and constantly updated to reflect the changing environment.
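The precoding idea behind TxBF in item 3 can be sketched with a toy linear channel: if the access point knows the channel matrix H, transmitting H⁻¹·s means the medium itself restores the intended symbols s at the clients. This is a deliberately simplified zero-forcing sketch, not the 802.11ac mechanism itself (real beamforming uses sounding feedback and per-subcarrier steering matrices, and the 2x2 channel values here are made up):

```python
def inv2(h):
    """Inverse of a 2x2 complex matrix [[a, b], [c, d]]."""
    (a, b), (c, d) = h
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def matvec(m, v):
    """Multiply a 2x2 matrix by a length-2 vector."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

# Hypothetical channel matrix H modeling the reflections and attenuation
# between two AP antennas and two clients (values chosen for illustration).
H = [[1.0 + 0.2j, 0.3 - 0.1j],
     [0.1 + 0.4j, 0.9 - 0.2j]]

symbols = [1 + 1j, -1 + 1j]            # what each client should receive
precoded = matvec(inv2(H), symbols)    # AP "precodes" with the channel inverse
received = matvec(H, precoded)         # medium applies H; clients see clean symbols
```

Because the precoding must match each client's channel, and the channel changes as people and objects move, the access point has to recompute this per client and keep it constantly updated, exactly as the text describes.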

Overcoming Technical Challenges
One of the biggest frustrations for developers and users of 802.11 is that each new release must work with previous versions. It can also be extremely difficult to identify the root cause of development problems. For example, when an application performs poorly, it is often hard to determine whether an environmental, client, or network issue is to blame. The various devices in an 802.11 network are highly correlated, so an issue in one area quickly ripples through to many others. Developers have lacked an effective means to assess the total picture, from the RF to the application layer.

IEEE 802.11ac makes this problem significantly more challenging. In addition to being deployed into an existing environment with ten years' worth of previous releases, 802.11ac makes use of advanced technologies that are substantially more complex and demanding than previous versions. This latest generation of 802.11 requires a rethinking of how the technology is developed and tested to include a much more holistic view through the product development life cycle.

Traditionally, the RF section is verified using one set of equipment, and then the upper layer functions are tested using a second set of tools. The overall technical complexity and the introduction of new technologies such as TxBF demand coordination and control between the different layers of the protocol stack. Without this coordination, it would be difficult to utilize these functions and to quickly pinpoint performance issues.

802.11ac brings the promise of moving Wi-Fi into the limelight as a trusted and capable communication protocol, and it will require test equipment and rigor to match. The new generation of testing should be able to decode every frame in real time and determine each frame's RF characteristics and frame-level performance, as well as generate every frame without limitation in real time to adequately test receiver performance. Previous approaches use a digitized data record for both generation and analysis, creating or capturing what are known as I/Q files, on equipment typically adapted from the general-purpose RF domain. This results in equipment that is capable of only a single spatial stream and can generate or capture only a small fraction of the frames required for testing. To meet the need, the approach must generate and analyze all frames in real time to the limit of the specification, tightly integrate the RF and MAC functionality of 802.11ac, and include integral, real-time channel emulation to address TxBF performance.

Increasing Performance for All Markets
Gigabit+ performance for the residential, enterprise, carrier and large-venue markets is possible with the 802.11ac standard. But to realize its performance and density promise, chip and hardware developers must navigate some significant technical challenges, as detailed in this article. They must ensure graceful migration from existing deployed solutions by providing backward compatibility, and deliver high-performance RF transmit and receive behavior across a wide variety of signals. They must maintain high performance to multiple clients under the channel conditions that will exist in real deployments, while at the same time providing the reliability and feature robustness needed for enterprise- and carrier-grade 802.11 adoption. Ultimately, developers need to ensure that key application traffic - most notably video - can be delivered with quality.

More Stories By Joe Zeto

Joe Zeto serves as a technical marketing evangelist within Ixia’s marketing organization. He has over 17 years of experience in wireless and IP networking, on both the engineering and marketing sides. He has extensive knowledge and a global perspective of the networking market and the test and measurement industry.

Prior to joining Ixia, Joe was Director of Product Marketing at Spirent Communications, running the Enterprise Switching, Storage Networking, and Wireless Infrastructure product lines. He holds a Juris Doctor from Loyola Law School in Los Angeles, CA.



@ThingsExpo Stories
Wearable devices have come of age. The primary applications of wearables so far have been "the Quantified Self" or the tracking of one's fitness and health status. We propose the evolution of wearables into social and emotional communication devices. Our BE(tm) sensor uses light to visualize the skin conductance response. Our sensors are very inexpensive and can be massively distributed to audiences or groups of any size, in order to gauge reactions to performances, video, or any kind of presentation. In her session at @ThingsExpo, Jocelyn Scheirer, CEO & Founder of Bionolux, will discuss ho...
The true value of the Internet of Things (IoT) lies not just in the data, but through the services that protect the data, perform the analysis and present findings in a usable way. With many IoT elements rooted in traditional IT components, Big Data and IoT isn’t just a play for enterprise. In fact, the IoT presents SMBs with the prospect of launching entirely new activities and exploring innovative areas. CompTIA research identifies several areas where IoT is expected to have the greatest impact.
Every day we read jaw-dropping stats on the explosion of data. We allocate significant resources to harness and better understand it. We build businesses around it. But we’ve only just begun. For big payoffs in Big Data, CIOs are turning to cognitive computing. Cognitive computing’s ability to securely extract insights, understand natural language, and get smarter each time it’s used is the next, logical step for Big Data.
The 4th International Internet of @ThingsExpo, co-located with the 17th International Cloud Expo - to be held November 3-5, 2015, at the Santa Clara Convention Center in Santa Clara, CA - announces that its Call for Papers is open. The Internet of Things (IoT) is the biggest idea since the creation of the Worldwide Web more than 20 years ago.
We are reaching the end of the beginning with WebRTC, and real systems using this technology have begun to appear. One challenge that faces every WebRTC deployment (in some form or another) is identity management. For example, if you have an existing service – possibly built on a variety of different PaaS/SaaS offerings – and you want to add real-time communications you are faced with a challenge relating to user management, authentication, authorization, and validation. Service providers will want to use their existing identities, but these will have credentials already that are (hopefully) i...
17th Cloud Expo, taking place Nov 3-5, 2015, at the Santa Clara Convention Center in Santa Clara, CA, will feature technical sessions from a rock star conference faculty and the leading industry players in the world. Cloud computing is now being embraced by a majority of enterprises of all sizes. Yesterday's debate about public vs. private has transformed into the reality of hybrid cloud: a recent survey shows that 74% of enterprises have a hybrid cloud strategy. Meanwhile, 94% of enterprises are using some form of XaaS – software, platform, and infrastructure as a service.
The Industrial Internet revolution is now underway, enabled by connected machines and billions of devices that communicate and collaborate. The massive amounts of Big Data requiring real-time analysis is flooding legacy IT systems and giving way to cloud environments that can handle the unpredictable workloads. Yet many barriers remain until we can fully realize the opportunities and benefits from the convergence of machines and devices with Big Data and the cloud, including interoperability, data security and privacy.
The Internet of Things is tied together with a thin strand that is known as time. Coincidentally, at the core of nearly all data analytics is a timestamp. When working with time series data there are a few core principles that everyone should consider, especially across datasets where time is the common boundary. In his session at Internet of @ThingsExpo, Jim Scott, Director of Enterprise Strategy & Architecture at MapR Technologies, discussed single-value, geo-spatial, and log time series data. By focusing on enterprise applications and the data center, he will use OpenTSDB as an example t...
The 17th International Cloud Expo has announced that its Call for Papers is open. 17th International Cloud Expo, to be held November 3-5, 2015, at the Santa Clara Convention Center in Santa Clara, CA, brings together Cloud Computing, APM, APIs, Microservices, Security, Big Data, Internet of Things, DevOps and WebRTC to one location. With cloud computing driving a higher percentage of enterprise IT budgets every year, it becomes increasingly important to plant your flag in this fast-expanding business opportunity. Submit your speaking proposal today!
Scott Jenson leads a project called The Physical Web within the Chrome team at Google. Project members are working to take the scalability and openness of the web and use it to talk to the exponentially exploding range of smart devices. Nearly every company today working on the IoT comes up with the same basic solution: use my server and you'll be fine. But if we really believe there will be trillions of these devices, that just can't scale. We need a system that is open a scalable and by using the URL as a basic building block, we open this up and get the same resilience that the web enjoys.
SYS-CON Events announced today that MetraTech, now part of Ericsson, has been named “Silver Sponsor” of SYS-CON's 16th International Cloud Expo®, which will take place on June 9–11, 2015, at the Javits Center in New York, NY. Ericsson is the driving force behind the Networked Society- a world leader in communications infrastructure, software and services. Some 40% of the world’s mobile traffic runs through networks Ericsson has supplied, serving more than 2.5 billion subscribers.
Container frameworks, such as Docker, provide a variety of benefits, including density of deployment across infrastructure, convenience for application developers to push updates with low operational hand-holding, and a fairly well-defined deployment workflow that can be orchestrated. Container frameworks also enable a DevOps approach to application development by cleanly separating concerns between operations and development teams. But running multi-container, multi-server apps with containers is very hard. You have to learn five new and different technologies and best practices (libswarm, sy...
SYS-CON Events announced today that DragonGlass, an enterprise search platform, will exhibit at SYS-CON's 16th International Cloud Expo®, which will take place on June 9-11, 2015, at the Javits Center in New York City, NY. After eleven years of designing and building custom applications, OpenCrowd has launched DragonGlass, a cloud-based platform that enables the development of search-based applications. These are a new breed of applications that utilize a search index as their backbone for data retrieval. They can easily adapt to new data sets and provide access to both structured and unstruc...
Converging digital disruptions is creating a major sea change - Cisco calls this the Internet of Everything (IoE). IoE is the network connection of People, Process, Data and Things, fueled by Cloud, Mobile, Social, Analytics and Security, and it represents a $19Trillion value-at-stake over the next 10 years. In her keynote at @ThingsExpo, Manjula Talreja, VP of Cisco Consulting Services, will discuss IoE and the enormous opportunities it provides to public and private firms alike. She will share what businesses must do to thrive in the IoE economy, citing examples from several industry sector...
With major technology companies and startups seriously embracing IoT strategies, now is the perfect time to attend @ThingsExpo in Silicon Valley. Learn what is going on, contribute to the discussions, and ensure that your enterprise is as "IoT-Ready" as it can be! Internet of @ThingsExpo, taking place Nov 3-5, 2015, at the Santa Clara Convention Center in Santa Clara, CA, is co-located with 17th Cloud Expo and will feature technical sessions from a rock star conference faculty and the leading industry players in the world. The Internet of Things (IoT) is the most profound change in personal an...
All major researchers estimate there will be tens of billions devices - computers, smartphones, tablets, and sensors - connected to the Internet by 2020. This number will continue to grow at a rapid pace for the next several decades. With major technology companies and startups seriously embracing IoT strategies, now is the perfect time to attend @ThingsExpo, June 9-11, 2015, at the Javits Center in New York City. Learn what is going on, contribute to the discussions, and ensure that your enterprise is as "IoT-Ready" as it can be
An entirely new security model is needed for the Internet of Things, or is it? Can we save some old and tested controls for this new and different environment? In his session at @ThingsExpo, New York's at the Javits Center, Davi Ottenheimer, EMC Senior Director of Trust, reviewed hands-on lessons with IoT devices and reveal a new risk balance you might not expect. Davi Ottenheimer, EMC Senior Director of Trust, has more than nineteen years' experience managing global security operations and assessments, including a decade of leading incident response and digital forensics. He is co-author of t...
The Internet of Things is a misnomer. That implies that everything is on the Internet, and that simply should not be - especially for things that are blurring the line between medical devices that stimulate like a pacemaker and quantified self-sensors like a pedometer or pulse tracker. The mesh of things that we manage must be segmented into zones of trust for sensing data, transmitting data, receiving command and control administrative changes, and peer-to-peer mesh messaging. In his session at @ThingsExpo, Ryan Bagnulo, Solution Architect / Software Engineer at SOA Software, focused on desi...
While great strides have been made relative to the video aspects of remote collaboration, audio technology has basically stagnated. Typically all audio is mixed to a single monaural stream and emanates from a single point, such as a speakerphone or a speaker associated with a video monitor. This leads to confusion and lack of understanding among participants especially regarding who is actually speaking. Spatial teleconferencing introduces the concept of acoustic spatial separation between conference participants in three dimensional space. This has been shown to significantly improve comprehe...
SYS-CON Events announced today that the "First Containers & Microservices Conference" will take place June 9-11, 2015, at the Javits Center in New York City. The “Second Containers & Microservices Conference” will take place November 3-5, 2015, at Santa Clara Convention Center, Santa Clara, CA. Containers and microservices have become topics of intense interest throughout the cloud developer and enterprise IT communities.