Valuable, not-commonly-known information
In August 2010, CDRH's 510(k) Working Group published a preliminary report consisting of more than 60 recommendations grouped under seven findings aimed at improving the Center's effectiveness in implementing its mission. In our series of whitepapers, Kathleen and I have been reviewing the findings and recommendations, offering our observations on how they will affect the industry, and making alternative recommendations when we think we have a better idea.
Last week FDA issued its plans for Next Steps, but before we look at them I'd like to make some comments on Findings 6 and 7 because we uncovered some valuable, yet not-commonly-known information.
To recap, the first five findings were:
 1. There is insufficient clarity with respect to the definition of "substantial equivalence".
 2. CDRH's current practice allows for the use of some types of predicates that may not be appropriate.
 3. The de novo pathway is important and has not been optimally used across the Center.
 4. It is challenging for reviewers to obtain the information they need to make well-supported clearance decisions.
 5. CDRH's knowledge management infrastructure is limited.
In this final whitepaper we will look at Findings 6 and 7.
Finding 6: Variations in the expertise, experience, and training of reviewers and managers, including third-party reviewers, may contribute to inconsistency or uncertainty in 510(k) decision making.
More than 30% of reviewers (31.5%, to be exact) have less than two years' experience, which means that roughly one-third of the 510(k)s submitted to FDA will be reviewed by people with little agency history behind them. One former high-level manager from CDRH advised me to study the preambles to every regulation. Preambles have legal standing, but most reviewers have never read them. I once asked a mid-level manager from CDRH BiMo how they could give a warning letter to a sponsor over a consent form that had been approved by an IDE reviewer; she scoffed and said: "Reviewers don't have time to read all that stuff."
 Reviewer Memoranda
Reviewers prepare an internal "review memo" for every 510(k) reviewed. The review memo documents the rationale for their decision. The Working Group Report states that less experienced reviewers write lengthier review memos, which are presumably less efficient. Something unexplained seems to happen with reviewers with more than ten years' experience; their highest percentage falls into an "unknown page count" column. (See Table 5.9 of the Report.)
Working Group Recommendations
The Working Group did not make recommendations specific to review memos.
CDG recommends that the very existence of review memos be more widely publicized. A careful search of the FDA website, Device Advice, and Google turned up very little information about review memos. In fact, the only information I found concerned the 510(k) Quality Review Program, which states that a sample of 510(k)s will be selected for review each quarter. A phone call to CDRH's Office of Device Evaluation revealed that an internal review memo is written by the reviewer for every 510(k) reviewed, that the memos are not generally offered to the submitter, and that they are available under the Freedom of Information Act. A call to FOI Services Inc (301-975-9400) confirmed this information; copies of review memos for specific 510(k)s can be obtained through their office.
The review memo is of vital importance in gaining insight into the reviewer's thinking. Since the memos are already available under the Freedom of Information Act, why not post them in the 510(k) database so that everyone can see them without making an independent FOI request? Doing so would contribute to better submissions and more consistent reviews.
In addition, if your 510(k) submission received an NSE (not substantially equivalent) decision or was otherwise not cleared, the review memo should automatically be included in the FDA letter to the submitter.
 Third-Party Review
Class I and Class II 510(k)s that do not require clinical data may go through a pre-review by an accredited third party. While data show that submissions that have gone through pre-review tend to move through the final FDA review faster, there are no data showing whether the total review process (pre-review plus FDA review) is longer or shorter. Concerns have been raised about the quality and consistency of third-party reviews.
Working Group Recommendations
The Working Group recommends regular evaluation of what devices are eligible for third-party review and that the agency enhance its third-party training programs.
CDG agrees. The third-party review program should be promptly expanded as planned in the Next Steps. Kathleen's personal experience with third-party reviewers has been extremely positive. She recommends shopping for companies whose reviewers are experienced in your type of product, checking their review schedule, and assessing the cost of the review. Typically the reviewer will require less than a month to give tentative approval of (or discuss deficiencies in) the submission. In Kathleen's experience FDA has always concurred.
 Enhance reviewer training
The Working Group recommends that reviewer training, professional development, and knowledge-sharing be enhanced.
CDG agrees. A lack of reviewer experience has serious consequences for manufacturers: submissions are not reviewed consistently, reviewers may not have the scientific training to adequately review the submission, decisions may be delayed, review decisions may be inadequately documented, and unnecessary or even outdated or inappropriate testing may be requested. In one situation, the reviewer of an IVD submission utilizing de-identified data asked the submitter to obtain additional information from the clinical subjects: illegal, because it would violate privacy rules, and impossible, because the subjects' identities were not known. In the case of the IDE-approved informed consent form mentioned above, BiMo nevertheless issued a warning letter for alleged deficiencies.
Finding 7: CDRH does not currently have an adequate mechanism to regularly assess the quality, consistency, and effectiveness of the 510(k) program.
CDRH's primary tool for assessing the quality of 510(k) reviews is its 510(k) Quality Review Program. Through this program, managers in ODE and OIVD assess a sample of review memoranda on a quarterly basis. A standardized checklist is used to evaluate the completeness of each review memorandum, but not the adequacy or appropriateness of the reviewer’s decision making rationale and explanation.
Further, although CDRH collects information on device performance in the postmarket setting, important limitations, including the inability to consistently link postmarket events to specific 510(k)s, make this information, in isolation, an unreliable measure of program effectiveness.
Working Group Recommendations
The Working Group recommended that CDRH develop metrics to continuously assess the quality, consistency, and effectiveness of the 510(k) program, to periodically audit 510(k) review decisions, and to measure the effect of any actions taken to improve the program.
CDG agrees and is hopeful that such information will be made publicly available in ODE's annual reports. Submitters can only increase the quality of their submissions if they have examples to work from.
CDG Can Write Your 510(k)
CDG's extensive Network Staff has expertise in writing 510(k)s in a wide variety of therapeutic, management, and diagnostic areas. My co-author, Kathleen Johnson, is a regulatory consultant with more than 10 years' experience in medical devices. CDG has capabilities in regulatory submissions, clinical research, toxicology, biostatistics, information research, medical writing, and design control; we focus exclusively on medical device pre-approval issues.
Our style is to work collaboratively with a point-person on your side so that you are involved in the process every step of the way. Phone or email us at 773-489-5721 or firstname.lastname@example.org.
Nancy J Stark, PhD
President, Clinical Device Group Inc