

GerbenRienk Posts: 838 ✭✭✭
Hi All,

I'm interested to hear about experiences with EMA compliance: is it stricter than the FDA's? Are there doubts about OpenClinica? Any tips and caveats?
You can respond via the forum or, if you prefer, via direct message.
Kind regards,

Gerben Rienk


  • knelaus Posts: 12
    Hi Gerben,

    My colleague and I attended a conference in Sweden, the "4th Conference on Clinical Trials in the Nordic Countries", April 8-9, Stockholm, held and organized by the Nordic health and medicines authorities. There we gave a presentation with the following title: "Implementation of an Open Source FDA and GCP compliant Electronic Case Report Form (eCRF) system in an Oncology Department - an option?"

    We were somewhat surprised when several GCP inspectors (the counterparts of FDA inspectors) approached us and told us that the system we are using (OpenClinica Enterprise) is not GCP compliant with respect to the investigator holding an independent copy of the data (page 10 of the European Medicines Agency "Reflection paper on expectations for electronic source data and data transcribed to electronic data collection tools in clinical trials" and page 7 of the FDA "Guidance for Industry - Electronic Source Data in Clinical Investigations"). In short, we will receive remarks when a GCP inspector comes for an inspection. The situation is more serious if investigator trials are going to be commercialized: they would then be rejected by the health and medicines authorities, at least in the Nordic countries and possibly in the rest of Europe, I suppose.

    Recently we have been asked by small biotech companies to handle all their data management, but for the time being we have had to decline because of the EMA paper. I know that Akaza is aware of this important issue.

    Best regards
    Knud
  • GerbenRienk Posts: 838 ✭✭✭
    Apparently there has been a meeting between the inspectors' working group and the pharma companies and software vendors, among them Oracle, in which the latter expressed their concerns about the technical feasibility of this requirement. The result was:
    "It is essential that the principal investigator (PI) maintains an independent (i.e. out of the sponsor's control), contemporaneous copy of the CRF for which he has exclusive control. This could be at the PI site, although, it could also be maintained at an independent site/vendor when assuring the investigator's continuous access."

    The minutes of the meeting can be found at

  • lindsay.stevens Posts: 404 ✭✭✭
    I don't think this policy should be a problem for studies where the OpenClinica system is populated with a copy of source data that the investigator has control of (e.g. transcribed EMR / medical notes etc).

    If OpenClinica holds the original data (i.e. OpenClinica is serving as the source data system), then this policy would be an issue. However, the investigator can print subject casebooks for their records at any time, and having been given the means, it is the investigator's responsibility to maintain a copy of the data.

    In practice that would never happen, so in both of the above cases I'd send the investigator a copy of the eCRF data for the subjects at their institution at the end of the study, after data cleaning is complete.

    The underlying EMA requirement seems a bit elaborate though. The reflection paper says 'The sponsor should not have exclusive control of a source document. (Requirement 10, ICH GCP 8.3.13)' and goes on to say that the investigator should have a 'contemporaneous certified copy of the data'. The referenced section of GCP doesn't use that language; it is a list of essential documents required during study conduct 'as it becomes available', and indicates that source documents should be in the investigator files. In which case (from the sponsor perspective) the OpenClinica feature allowing investigators to print subject casebooks at any time meets the GCP requirement, and the sponsor making sure the investigator has a copy at the end of the study would exceed the requirement.
  • ccollins Posts: 383 admin
    This is a challenging issue for any web-based data capture application. As mentioned in the reflection paper, one way to address it is by having the EDC system hosted by a trusted party who is not the sponsor, and ensuring the software and hosting environment adhere to rigorous GCP and 21 CFR Part 11 controls.

    The cleaner solution is a technology architecture that gets an independent, contemporaneous copy of the casebook into the investigator's hands or some location under their control. OpenClinica's printable casebooks provide the needed record: the ability to generate, with one click, all the completed CRFs for a subject, ordered and grouped by visit, shows that the system has a logical and coherent way of organizing and displaying data. The completed data is augmented with a complete, integrated history of changes to the data, accountability to specific individuals, signatures, and any notes & discrepancies that were discussed along the way. A subject-by-subject record of this nature immediately provides the visibility, traceability, accountability, and chain of custody that inspectors desire.

    So the question is how to get it to the investigator without the threat of sponsor tampering with the data. Ideally this should happen simultaneously with form submission, and it must be pushed to the investigator since, as mentioned, it's not realistic to rely on site personnel to print/PDF it.

    We've done some thinking about it; see https://jira.openclinica.com/browse/OC-2930 for discussion of potential ways to handle this issue. In the case of e-source, the "earliest practically retainable record" should be considered the location of the source data, and it's the site's responsibility to control and archive this. So how about a rule designer-style service (described in the ticket and subsequent notes)? It would be hosted separately from the user's OC application, and could be subscribed to as a 3rd party app with contractual accountability directly to sites rather than just to the sponsor. The validation burden for such a service might be high, but it would be a great differentiator for OC vs competing EDC solutions. Have the REST API provide a table of study/site subjects with links to the subject casebooks, and the service would issue a curl-style request for the study, follow the links to each subject, use the PDF'er, zip up the PDF files, and offer them for download to the user. It could even archive them long term as well.
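As a rough sketch of the fetch-and-zip loop described above (the REST endpoints, JSON shape, and field names here are hypothetical assumptions, not a current OpenClinica API):

```python
# Hypothetical sketch of the casebook-archiving service described above.
# The endpoints (/rest/casebooks/...) and JSON keys are assumptions.
import io
import json
import urllib.request
import zipfile


def fetch_json(url: str, token: str) -> dict:
    """GET a JSON resource from the (assumed) OpenClinica REST API."""
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def archive_casebooks(subject_pdfs: dict) -> bytes:
    """Zip a mapping of subject key -> casebook PDF bytes into one archive."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for subject_key, pdf_bytes in subject_pdfs.items():
            zf.writestr(f"casebook_{subject_key}.pdf", pdf_bytes)
    return buf.getvalue()


def build_site_archive(base_url: str, study_oid: str, token: str) -> bytes:
    """Follow the subject table, fetch each casebook PDF, and zip them."""
    table = fetch_json(f"{base_url}/rest/casebooks/{study_oid}", token)
    pdfs = {}
    for row in table["subjects"]:
        with urllib.request.urlopen(row["casebookPdfUrl"]) as resp:
            pdfs[row["subjectKey"]] = resp.read()
    return archive_casebooks(pdfs)
```

The long-term archiving step would then just be a matter of retaining the returned zip bytes on the service side.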

    What do you think?

  • lindsay.stevens Posts: 404 ✭✭✭
    If you're convinced of the need to continually push data to the investigators, my suggestion is to use (or enhance) the existing extract features, rather than setting up a separate system.

    Using existing features:

    For each site, set up an extract definition that includes all data items and set it to run as a daily job (or at whatever frequency), with the notification email going to the investigator. The ODM XML 1.3 full extensions format has all the values, audit trail and discrepancy notes for all subjects at the site. If a nicer layout is required, the Saxon/FOP libraries already on board could be used to create an extract format that presents the data in a PDF (an example of this for a discrepancy notes report is at [1]), with bookmarks and everything.
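To illustrate what sits inside such an extract, here is a minimal sketch of pulling subject values out of an ODM XML 1.3 document; the sample document below is illustrative, not real study data:

```python
# Minimal sketch of reading an ODM XML 1.3 extract of the kind the
# full-extensions format produces. The sample below is illustrative only.
import xml.etree.ElementTree as ET

ODM_NS = {"odm": "http://www.cdisc.org/ns/odm/v1.3"}


def subject_values(odm_xml: str) -> dict:
    """Return {SubjectKey: {ItemOID: Value}} from an ODM ClinicalData block."""
    root = ET.fromstring(odm_xml)
    result = {}
    for subject in root.iterfind(".//odm:SubjectData", ODM_NS):
        items = {}
        for item in subject.iterfind(".//odm:ItemData", ODM_NS):
            items[item.get("ItemOID")] = item.get("Value")
        result[subject.get("SubjectKey")] = items
    return result


SAMPLE = """<ODM xmlns="http://www.cdisc.org/ns/odm/v1.3">
  <ClinicalData StudyOID="S_DEMO" MetaDataVersionOID="v1.0.0">
    <SubjectData SubjectKey="SS_001">
      <StudyEventData StudyEventOID="SE_VISIT1">
        <FormData FormOID="F_DEMOG">
          <ItemGroupData ItemGroupOID="IG_DEMOG">
            <ItemData ItemOID="I_AGE" Value="42"/>
          </ItemGroupData>
        </FormData>
      </StudyEventData>
    </SubjectData>
  </ClinicalData>
</ODM>"""
```

The audit trail and discrepancy notes travel in the same document as vendor extensions, so a transform to PDF has everything it needs in one file.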

    Enhancing existing features:

    Create a study-level configuration parameter to turn on casebook notifications/download emails (default off). When turned on, an extract format can be selected, and the frequency and job start time can be selected. When turned on, email address(es) must be provided for the investigator in each site configuration (at the moment only the name is required).

    When this feature is active in the study, a non-configurable extract definition that includes all data items, audit trail and discrepancy notes for subjects at the site is created and run at the times set in the study configuration. On top of that, the created zip file for each site could be password protected (e.g. using Zip4j) using a one-time password generated and sent to the investigator with the download link. Each new site extract overwrites the old one so only the latest data and one-time password are any good.

    This enhancement would allow OpenClinica to say that the data made available using this feature is contemporaneous (to whatever degree the sponsor thinks the server can handle), complete, and not amenable to tampering (the sponsor never sees the file password, and OpenClinica doesn't store it).

    [1] https://github.com/lindsay-stevens-kirby/openclinica_scripts/tree/master/dnotes
  • mikewor Posts: 35

    I agree with Lindsay - contemporaneous is not the same thing as simultaneous; contemporaneous can span even many years.

     I would add to Lindsay's requirements the need to optionally specify the sponsor (name and email address) as the sponsor has a similar need for a contemporaneous feed.  

    Also, the encryption could be done with a public/private key pair, as one-time passwords can get lost if the investigator does not perform the download immediately.

    My main concern with this policy is the assumption that the investigator will be the authoritative owner of the data for all time - as we know, investigators move on, and/or lose the data for their old studies. But that problem is not for this forum to solve.


  • kristiak Posts: 1,339 ✭✭✭

    Gerben and others,

    I am surprised that this became an issue at a meeting with Swedish GCP inspectors. All clinics in Sweden are required by law to keep any data collected from a patient in their local EHR or paper system. Therefore complete source data verification is essential. OpenClinica has an excellent way of creating records of such SDV, so this should fulfil the GCP requirements, as the OpenClinica data is then an exact copy of the investigator's data. Not all EHR systems have an audit trail, so we cannot be sure that the two databases stay in sync. In a few cases we have installed OC on a server in the hospital, which is a small task with OpenClinica. This approach should satisfy any GCP inspector!

    Swedish EHR systems leave a lot to be desired. I have described some of the issues in our paper:

    Kristianson KJ, Ljunggren H, Gustafsson LL: Data extraction from a semi-structured electronic medical record system for outpatients: a model to facilitate the access and use of data for quality control and research. Health Informatics J. 2009 Dec;15(4):305-19. PMID: 20007655.






  • kristiak Posts: 1,339 ✭✭✭

    Further to the above, I would add that it would be very simple to set up an OC database with an identical copy of the locked final database on an investigator's PC. Using the Community Edition of OC there would be no licence cost, and the time to set it up is no more than 15-30 minutes. Would this satisfy the GCP inspectors? If in doubt, this database could always be compared with the locked production database to satisfy the inspectors. The ease of doing this is definitely a major advantage of OC. Even if you had to give them a small laptop for the purpose, the cost would be nothing compared with the study cost, if this would solve the issue. Today you can find a laptop for less than 500 USD including Windows 8.1.



  • kristiak Posts: 1,339 ✭✭✭
    I forgot to attach the paper for anyone interested.
This discussion has been closed.