Me shakes me head when I see AV-Comparatives mis-read...or mis-understood.
Now..looking at their most important test..the "Dynamics" test...and seeing you say "right in between Norton and McAfee"...well...if you've read the Dynamics 2011 test..you'd see that Symantec came in first..detection rate of 99.5%. And McAfee came in 12th...detection rate of 96.7%. So that leaves between 2nd and 11th place open. But they didn't bundle Microsoft in on this recent batch.
The most recent test which they did that included MSE was the "Removal" test...and Microsoft came in with two stars.." * * " ..doing better than AVG, Avast, G-Data, Eset, McAfee, Panda, Sophos..and a few others. It was only bested by Symantec, Kaspersky, PC Tools, and BitDefender. It was tied with AntiVir, Webroot, and Trend.
I'm actually struggling looking around the AV-Comparatives site to find where MSE comes in with bad testing results..between Symantec and McAfee.
I took my data from the February 2011 "On-Demand Comparative", page 9.
http://www.av-comparatives.org/images/stories/test/ondret/avc_od_feb2011.pdf
Yes, it is a bit outdated, but that is what I had lying on my desk.
Looking at the newest August "On-Demand Comparative", page 9...
MSE comes in 13th @ a 92.3% detection rate, almost at the bottom, and worse than both McAfee and Norton.
As for the "Dynamics Test" being "The Most Important", I would have to disagree. The Dynamics test does not test "Detection Rates" per se; it tests the effectiveness of the AV product essentially during web browsing (URL blocking and such). Testing is performed by visiting malicious domains. This is but ONE attack vector, and is in no way a comprehensive test. The test also does not provide "Detection Rates" but rather "Protection Rates". The reason MSE isn't included in these tests is because MSE doesn't provide ANY additional protection beyond the heuristics and known-nasty lists. Therefore, it can't be tested!
http://www.av-comparatives.org/en/comparativesreviews/dynamic-tests
In this test all features of the product contribute protection, not only one part (like signatures/heuristic file scanning). So the ability of detection/protection should be higher than in testing only parts of the product. We would recommend that all parts of a product would provide high protection, not only single components (e.g. URL blocking protects only while browsing the web, but not against malware introduced by other means or already present on the system).
So, basically what you can take away from looking at the "On Demand Tests" and the "Whole Product Dynamic Tests" is that yes, Norton does a good job of blocking you from visiting malicious sites. But if you do slip into a malicious site that isn't on their list, or open an infected email, etc., you are probably going to get infected, given Norton's 12th-place finish, or 95.1% detection rate.
Further, the "On Demand Detection of Malicious Software" test says, on page 8..
http://www.av-comparatives.org/images/stories/test/ondret/avc_od_aug2011.pdf
A good detection rate is still one of the most important, deterministic and reliable features of an Anti-Virus product.
I didn't see any such quote or claim that the "Dynamic Test" was "one of the most important", only that it was their "NEWEST" released test.
Now, as far as the "Removal" test you quoted, with MSE doing well and others doing worse... well, ok... what does that mean? It means that MSE can remove and fix malware better than some others, but MSE doesn't work very well at preventing infection in the first place. To me, it is more important to NOT get the infection in the first place, rather than "fixing" what got broken because your AV sucks. MSE: 92.3% detection rate... period.
http://www.av-comparatives.org/images/stories/test/ondret/avc_od_aug2011.pdf page 9.