
OC 3.1 issues

Hi
A clutch of issues that I’d appreciate some help with, or at least to compare notes on. Partly in return, I’ve also attached a data extraction script that some might find of use.
In V3.1 –
1. In the Discrepancies screen I can filter the table on type, event, CRF etc., but try as I might I cannot filter on a date, no matter what format I try to put the date in. The default date setup used here is dd/Mmm/yyyy, e.g. 29/Sep/2011. Anyone else had this problem, or know of the magic formula required to filter on a date?
2. When I’ve uploaded rules I have left out the Run element, after discovering that the system seems to default to TRUE for all rule ‘opportunities’ – i.e. if I download the XML the Run statement is obligingly there with all data entry types marked as true, and the rule_action_run table also shows all the values as TRUE. (A sketch of what the element looks like when spelled out is included after this list.)
The rules fire OK on normal data entry, but when I import new data it goes in without a murmur. There does not appear to be any validation operating; in fact the feedback on the screen says that no validation checks fired, even when the data includes crazy values. Anyone else had this issue?
3. Probably a silly question but clarification would be appreciated: when I run batch validation, either on a rule or on a CRF, nothing seems to happen. Is that because the rules have already fired on initial data entry? In other words, is batch firing only of use if the data-entry firings have been turned off? Or should they fire anyway? (Yes, I could spend time running experiments to work it out either way, but I really don’t have the time.)
4. In some situations some validation checks keep firing every time the section is re-saved. Normally if a discrepancy note is raised then if I go back to the section and re-save (e.g. after changing another value) the original discrepancy is not re-fired, which is what I’d expect. But for at least one data item the discrepancy event fires again, even though the target item has not been changed. It will just keep raising new discrepancy notes – identical except for their IDs. The 2 rules on the (date) question are against the current date (not after) and a fixed date (not before). Again, anyone had a similar issue or know of an explanation?
5. After importing data the reason for change mechanism appears to lose the plot. Given that the form has been labelled by the system as complete, it posts a request for a ‘reason for change’ if a value is changed. But it also posts an RFC request for data items on the same section that have neither been changed nor imported. Furthermore, attempts to furnish an RFC for these questions seem to have no effect and the RFC is not saved. But the system still requests one, and so the form becomes unsaveable. Anyone else had this issue, or can anyone provide an explanation?
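(On issue 2, for reference: this is roughly what the Run element looks like when it is spelled out inside a rule action rather than left to default. It is a sketch from memory rather than a copy of a working file, and the rule OID and message are invented, so check the element and attribute names against the 3.1 rule import schema before relying on it.)

    <RuleRef OID="RULE_EXAMPLE">
      <DiscrepancyNoteAction IfExpressionEvaluates="true">
        <!-- When Run is omitted, 3.1 appears to default every phase to true -->
        <Run AdministrativeDataEntry="true" InitialDataEntry="true"
             ImportDataEntry="true" Batch="true"/>
        <Message>Value is outside the expected range.</Message>
      </DiscrepancyNoteAction>
    </RuleRef>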
And as for the script...
As much fun as XSL transforms might be (if I had the time), I have also been looking at getting data directly out of the DB. That was going OK until I also wanted to 'decode' any category values in the item_data table. The response_set table stores these as comma-delimited lists, so it didn't appear very easy, until I came across a neat PostgreSQL function called regexp_split_to_table, which can take such lists and turn them into rows on the fly and, more importantly, seems to expand neighbouring lists of the same size into matching rows. (OK, so I'm new to Postgres.)
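A minimal sketch of just that decoding step follows. The column names options_values and options_text are what I have in my 3.1 schema, so do check them against your own response_set table, and note that the naive split on ',' takes no account of escaped commas in the option text:

    -- Expand each comma-delimited list into rows; two set-returning calls of
    -- equal length expand in step, giving matching code/label pairs per row.
    SELECT rs.response_set_id,
           regexp_split_to_table(rs.options_values, ',') AS code,
           regexp_split_to_table(rs.options_text,   ',') AS label
    FROM   response_set rs;

    -- The decoded rows can then be joined back to item_data on the stored
    -- code (via the item's response_set_id) to recover the display label.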
The attached document details the script to get a full dataset for a particular study, and the steps in constructing it. It seems to work but I can't claim much testing. As the document says it almost certainly needs refinement to take into account the statuses of the various entities, but that will take some further work. This is shared as-is, on the chance that others might want to do / already have done a similar thing. If you find any errors I'd be very happy to receive the details.
Best wishes, Mit freundlichen Grüßen, Meilleures salutations
Steve

Comments

  • Hi Steve,

    A list of answers is below:

    1. You can only filter by a full date in that column, not a partial date. For example, in my instance the date format is DD-Mmm-YYYY. I have to filter by 21-Dec-2010 and I get 2 notes created on that day. If I said Dec-2010, I would get nothing. Please let me know if you are entering a full date and seeing an error.
    2. Yes, the rule evaluation will default to True if left blank. This has no impact on data being imported: OpenClinica does not support rules running at the time of data import, whether through scheduled jobs, the UI, or web services. You would have to run these rules in bulk after the data has been imported. A future release will support rules executing at the time of data import.
    3. Once a rule has been executed and an action taken, it will never run again until the target value is changed.
    4. Could you please provide your XML and CRF so that we can recreate this scenario? It sounds like you have multiple rules on a particular item and their evaluation is causing a loop. Based on your explanation of how the rule should work, i.e. a value neither after the current date nor before a fixed date, this should be handled in one Rule Expression.
    5. There are known issues with importing data and reason for change. When a subset of items is included in the imported XML file, the rest of the item records are not created in the item_data table. Once you start entering data in the CRF during Administrative Editing, these records will attempt to be created and therefore trigger the Forced Reason for Change. You can do one of two things with the current version of OpenClinica:
    a. Include all of the other items in the XML file for that section, just set their Value="". This will create the records in the table and you will not see the reason for change fire (a minimal fragment is sketched just below).
    b. Turn off the Forced Reason for Change study parameter for your study.
    In the future we will allow people to specify whether the CRF should be marked complete or not after the data has been imported, and also create a way for someone to log one Reason for Change that will apply to all items in a section that are being updated.
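    (To make option (a) concrete, the relevant fragment of the import file would look something like this; the OIDs here are invented placeholders, not from any real CRF:)

        <ItemGroupData ItemGroupOID="IG_DEMOG_UNGROUPED">
          <!-- the item actually being loaded -->
          <ItemData ItemOID="I_DEMOG_WEIGHT" Value="72"/>
          <!-- an item on the same section that is not being loaded: included
               with an empty Value so its item_data record is created and the
               Forced Reason for Change does not fire during later editing -->
          <ItemData ItemOID="I_DEMOG_HEIGHT" Value=""/>
        </ItemGroupData>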
    Thank you for contributing the script. I will let others in the community comment on what you have done if they are interested.

    Paul
    -----------------------------------------------------------------------
    Paul J. Galvin
    Project Manager
    OpenClinica
    781-547-8425
    www.openclinica.com
    Open Source Platform for Clinical Research
  • scanhamman
    Hi Paul
    and many thanks for the clarifications
    With reference to point 1, it is definitely a full date I am using, in as many different formats as I can think of, but none seem to work (though we have modified the format_en_GB file to use dd/Mmm/yyyy; please see my post of 1st August, which includes the modified file).
    I will see if combining the logic on point 4 works. If not, I'll post the logic.
    The info on point 5, on directly imported data, is very useful and explains a lot. At the risk of appearing a bit grumpy, should it not be included in the basic documentation on this topic? (please)
    I think both the proposals for further development would be really useful additions.
    best wishes
    Steve
This discussion has been closed.