All of my clients seeking FIPS 140-2 validations are concerned about schedule. The elapsed time to the final validation award is usually more important than cost. The biggest element of that timeline is the long hiatus between submission of the test report by the test lab to the CMVP and the point at which it is picked out of the inbox for CMVP review.
That time interval can vary dramatically and capriciously, as demonstrated by two recent validations. The test report for #1051 was submitted on 2008-04-28 and the validation award came on 2008-11-17, approximately 7 months later. The test report for #1111 was submitted on 2008-02-29, but the validation award did not come until 2009-04-03, approximately 13 months later. Quite a difference, roughly half a year, sufficiently long in the latter case to spoil any commercial value of that validation.
How did the two validated products differ? Here’s the interesting part: both were based on the same source code! Even stranger, the “quick” validation was for source-code-based delivery and static linking, both well off the beaten path for most validations. The tardy validation was a bog-standard binary shared-library validation, the whole purpose of which was to quickly obtain a few validated binaries for DoD (the sponsor) while waiting for the source-code-based validation.
The FIPS validation process is so shrouded in secrecy that I will never know for sure why one validation took nearly twice as long as the other. The validations were performed by different test labs, but there was no evidence that I could see of negligence or incompetence on the part of the lab handling the slower one. The most likely cause was different reviewers at the CMVP. The CMVP review is (in my opinion) a very subjective process, and different reviewers show very distinct preferences in their commentary and requirements for document changes. Interestingly enough, the test lab informed me that the NIST reviewer in this case insisted on remaining anonymous; in the past I’ve always been told who was involved.
So there you have it — a very non-transparent process, anonymous bureaucrats, nearly a 2x difference in validation times for the same software. You pays yer money and you takes yer chances.
[Update 2015-12-11] An even better example of CMVP capriciousness:
The “RE” validation, an “Alternative Scenario 1A” clone of the #1747 validation, was approved on November 13, 2015 (http://csrc.nist.gov/groups/STM/cmvp/documents/140-1/140val-all.htm#2473).
It was submitted along with its identical twin “SE” validation on April 17, 2015. The two sets of paperwork differed in only one trivial aspect: “RE” in the module name for one versus “SE” for the other. Same module, same test lab, same paperwork, submitted together at the same time. A more perfect controlled study could not have been devised on purpose.
The “SE” validation was approved on June 25 (#2398), after a little more than two months (69 calendar days, 48 working days).
The “RE” validation was not approved for almost seven months (210 calendar days, 145 working days). That’s three times as long for the exact same submission. This is the most striking example yet of CMVP capriciousness.
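For the curious, the elapsed-time figures above can be reproduced with a few lines of Python. This is a minimal sketch, assuming the working-day counts exclude weekends plus the US federal holidays that fell within each review window, and that the window is counted from the submission date up to (but not including) the approval date; the dates and certificate numbers themselves are as given above.

    from datetime import date
    import numpy as np

    submitted   = date(2015, 4, 17)   # both reports submitted together
    se_approved = date(2015, 6, 25)   # "SE", certificate #2398
    re_approved = date(2015, 11, 13)  # "RE", certificate #2473

    # Assumption: working days exclude weekends and the US federal
    # holidays that fell inside the review windows.
    holidays = ['2015-05-25',   # Memorial Day
                '2015-07-03',   # Independence Day (observed)
                '2015-09-07',   # Labor Day
                '2015-10-12',   # Columbus Day
                '2015-11-11']   # Veterans Day

    for name, approved in (('SE', se_approved), ('RE', re_approved)):
        calendar_days = (approved - submitted).days
        working_days = np.busday_count(submitted.isoformat(),
                                       approved.isoformat(),
                                       holidays=holidays)
        print(f'{name}: {calendar_days} calendar days, '
              f'{working_days} working days')

    # Prints:
    #   SE: 69 calendar days, 48 working days
    #   RE: 210 calendar days, 145 working days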
Why the wild disparity? Well, probably because the two identical submissions were farmed out to two different reviewers. The review process is notoriously subjective, and in fact we received “comments” (requirements for changes) for the “RE” validation whereas the “SE” one was approved as-is. As a result the two Security Policy documents are no longer identical. That doesn’t explain the time discrepancy, though, as those “comments” weren’t received until long after “SE” had been approved.
The moral here is that FIPS 140-2 validations are a crapshoot; it’s impossible to make any reliable predictions about how long any validation action will take or how it will be received. If you have really deep pockets you can submit the same validation multiple times to hedge your bets (as was done for the #1051 and #1111 validations discussed above), but for most of us it’s an open-ended gamble: submit, hope, wait, …