Criteria for testing 802.11n Draft-N access points
The VeriWave tests were invaluable in pointing out an access point's flaws and quirks, and the real-world file-transfer tests indicated how it would perform in an office environment.
Until the 802.11n amendment gets final ratification, testing interim Draft-N access points can be a challenge. So this year we decided to use test equipment from VeriWave to get a more detailed picture of what was going on with each access point.
VeriWave’s WaveTest 20 suite has more than 20 kinds of tests, but for the sake of simplicity, we concentrated on some of the more basic ones.
The first was a simple throughput test using various data packet sizes at the 2.4 GHz frequency, with a simulated client on the local-area network port sending data to a single client on the wireless end. That is the most common use of an access point, as when a client browses the Internet or downloads a file from a network server.
Most Draft-N access points claim throughput speeds of 300 megabits/sec, the theoretical limit of such devices. In practice, a device might approach 300 megabits/sec momentarily, but the average speed will be lower because data is almost always transmitted in packets. A packet carries control data at its beginning and end that aid error detection and processing on the receiving end. Because the access point must compute checksums and perform other functions beyond transmitting the data, the average throughput rate falls short of the maximum.
Also, the smaller the packet size, the more often the access point has to perform those tasks, which lowers the average throughput. We used a range of common packet sizes: 64; 88; 128; 256; 512; 1,024; 1,280; and 1,518 bytes.
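The effect of packet size on average throughput can be sketched with some simple arithmetic. The overhead figures below are hypothetical round numbers chosen for illustration, not values from the VeriWave tests:

```python
# Illustrative sketch: why smaller packets lower average throughput.
# Both overhead constants are assumptions, not measured values.
PER_PACKET_OVERHEAD_BYTES = 40  # hypothetical control data per packet
PER_PACKET_GAP_US = 10          # hypothetical fixed processing time per packet
LINK_RATE_MBPS = 300            # Draft-N theoretical maximum

def effective_throughput_mbps(payload_bytes):
    """Average payload rate after per-packet overhead, for one packet size."""
    bits_on_air = (payload_bytes + PER_PACKET_OVERHEAD_BYTES) * 8
    air_time_us = bits_on_air / LINK_RATE_MBPS  # 1 Mbps = 1 bit per microsecond
    total_us = air_time_us + PER_PACKET_GAP_US
    return payload_bytes * 8 / total_us

for size in (64, 256, 1518):
    print(size, round(effective_throughput_mbps(size), 1))
```

With these assumed constants, a 64-byte packet spends most of its time on overhead, while a 1,518-byte packet gets much closer to the link rate, which is why the tests sweep the whole range of sizes.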
In the second test, we sent the data from the wireless client to the wired one. Those rates tended to be higher because an access point can typically process the control data from the wireless device more quickly.
To determine performance under real-world conditions, we ran the tests again with five simulated clients on each end. How much the average throughputs dropped determined how large a workgroup the access point could comfortably support.
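The workgroup comparison boils down to how much average throughput is lost when the load goes from one client to five per side. A minimal sketch of that percentage-drop calculation, with hypothetical throughput figures:

```python
# Minimal sketch of the workgroup comparison: the percentage of average
# throughput lost going from one client to five per side.
# The sample figures are hypothetical, not measured results.
def throughput_drop_pct(single_client_mbps, five_client_mbps):
    """Percentage drop from single-client to five-client average throughput."""
    return (single_client_mbps - five_client_mbps) / single_client_mbps * 100

print(round(throughput_drop_pct(180.0, 120.0), 1))
```

A smaller drop suggests the access point can comfortably support a larger workgroup.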
To determine the impact of security activities, we ran a single-user throughput test with the access point’s security set to Wi-Fi Protected Access 2 with a pre-shared key (WPA2-PSK).
And because implementation in an office environment has challenges that can’t be duplicated in a shielded enclosure, we ran file transfers over several distances.
We connected each access point to a desktop computer with a 10/100 Ethernet adapter and used a laptop PC with the manufacturer’s preferred USB client adapter when possible. With the adapters on the same subnet as the access point, we measured the time it took to transfer a folder containing 100 MB of typical office files. We took several readings in both directions at distances of 10, 20, 40, 80 and 160 feet to get a good average of the throughput at each distance.
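The file-transfer arithmetic is straightforward: a timed 100 MB folder copy converts into an average throughput in megabits/sec. A sketch, with hypothetical stopwatch readings standing in for the actual measurements:

```python
# Hypothetical sketch of converting timed 100 MB folder copies into an
# average throughput figure; the run times below are invented examples.
FOLDER_MEGABYTES = 100

def throughput_mbps(seconds):
    """Average throughput for a 100 MB transfer that took `seconds`."""
    return FOLDER_MEGABYTES * 8 / seconds

runs = [41.2, 39.8, 43.5]  # hypothetical timings at one distance, in seconds
avg_seconds = sum(runs) / len(runs)
print(round(throughput_mbps(avg_seconds), 1))
```

Averaging several timed runs per direction at each distance smooths out the run-to-run variation inherent in an office radio environment.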
The VeriWave tests were invaluable in highlighting an access point’s flaws and quirks. However, the real-world file-transfer tests indicated how a device would perform in an office and were the greatest factor in determining its performance grade.
VeriWave, 800-457-5915, www.veriwave.com.