Need some feedback regarding a huge discrepancy in test results between Windows Server 2008 R2 and Server 2016 hosts.
We are performing a fairly exhaustive test between 20 media agents in separate geographical locations. This is a feasibility study for implementing WAN-based backups across our available media agents (so the intent is to discover the best network paths between the media agents).
Each media agent is configured to listen continuously as a server on a port specific to that server; Task Scheduler then periodically launches the client-side test sequentially against every endpoint in question.
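For context, the client side is roughly equivalent to the loop below. This is a minimal Python sketch only: the CVNetworkTestTool.exe switches shown are placeholders (not the tool's actual arguments), and the endpoint list and results path are illustrative.

```python
import csv
import subprocess
from datetime import datetime, timezone

# Hypothetical endpoint list: (hostname, dedicated listening port) per media agent.
ENDPOINTS = [
    ("ma-nyc-01", 50001),
    ("ma-lon-01", 50002),
    ("ma-syd-01", 50003),
]

def run_tests():
    """Sequentially test every endpoint and append one CSV row per result."""
    with open(r"C:\nettest\results.csv", "a", newline="") as f:
        writer = csv.writer(f)
        for host, port in ENDPOINTS:
            started = datetime.now(timezone.utc).isoformat()
            # Placeholder switches -- substitute the real client-mode
            # arguments for CVNetworkTestTool.exe in your environment.
            proc = subprocess.run(
                [r"C:\Program Files\Commvault\CVNetworkTestTool.exe",
                 "-client", "-host", host, "-port", str(port)],
                capture_output=True, text=True, timeout=600,
            )
            writer.writerow([started, host, port, proc.returncode,
                             proc.stdout.replace("\n", " ")])

if __name__ == "__main__":
    run_tests()  # Task Scheduler invokes this script on the desired cadence
```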
Over time this captured 5,000+ data points on the network conditions between media agents at various times of day.
After parsing this data we discovered:
The 17 Server 2008 R2 hosts: average outbound rate was 0.9 MB/sec.
The 2 Server 2016 hosts: average outbound rate was 20 MB/sec.
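If it helps to see how the averages were derived, the aggregation step looks roughly like this. It assumes the CSV rows written by the wrapper sketched above, and the regex used to pull an outbound MB/sec figure out of the tool's stdout is illustrative rather than the tool's real output format.

```python
import csv
import re
from collections import defaultdict

# Illustrative pattern for an outbound rate embedded in captured stdout.
RATE_RE = re.compile(r"([\d.]+)\s*MB/sec")

totals = defaultdict(lambda: [0.0, 0])  # host -> [sum of rates, sample count]

with open(r"C:\nettest\results.csv", newline="") as f:
    for started, host, port, rc, stdout in csv.reader(f):
        match = RATE_RE.search(stdout)
        if rc == "0" and match:  # only count successful test runs
            totals[host][0] += float(match.group(1))
            totals[host][1] += 1

for host, (rate_sum, count) in sorted(totals.items()):
    print(f"{host}: {rate_sum / count:.2f} MB/sec average over {count} samples")
```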
Furthermore, we upgraded one media agent in place from 2008 R2 to Server 2016; after the upgrade, its test traffic went from 1.2 MB/sec to 26 MB/sec.
We have matched the netsh TCP/IP settings between servers, matched vNIC drivers, and disabled McAfee.
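For anyone wanting to reproduce the comparison, this is roughly how we captured the global TCP parameters on each host so the output files could be diffed side by side (the output directory is just an example):

```python
import socket
import subprocess

# Capture the global TCP stack parameters (autotuning level, chimney
# offload, RSS, etc.) into a per-host file for side-by-side diffing.
output = subprocess.run(
    ["netsh", "int", "tcp", "show", "global"],
    capture_output=True, text=True, check=True,
).stdout

path = rf"C:\nettest\tcp-global-{socket.gethostname()}.txt"
with open(path, "w") as f:
    f.write(output)
print(f"Saved TCP global settings to {path}")
```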
Using a separate utility, iperf, we see that the calculated bandwidth is far more consistent across the same baselines.
Is there something specific to CVNetworkTestTool.exe on Windows Server 2008 R2 that needs to be accounted for? Launch parameters or compatibility settings?
Having deployed the entire testing mechanism around this utility, I would rather not have to re-script around iperf or another alternative.