Race to Market: Building an Efficient Discovery Engine


1 Race to Market: Building an Efficient Discovery Engine. Anja Gilis, Bioresearch Quality & Compliance, Janssen R&D. Research Integrity: Position yourself for success! Ghent, December 6th, 2017. Disclaimer: The views expressed in this presentation are solely those of the individual authors and do not necessarily reflect the views of their employers.

2 Agenda. Disclaimer: The views expressed in this presentation are solely those of the individual authors and do not necessarily reflect the views of their employers.
1. Roadmap to bringing a drug to market (10 min)
2. How can it go wrong? A workshop (30 min)
3. Reproducibility in research (30 min)
Break
4. Defining quality: a workshop (1 h)

3 Roadmap to bringing a drug to market

4 The Drug Discovery Process. Non-regulated to Health Authority regulated: Target Selection, Hit Identification, Hit to Lead & Lead Optimization, Development, Product Launch. Supporting activities: Target ID, Assay Dev, Screening; Medicinal Chemistry / Protein Engineering; In vitro assays; In vivo & Translational Models; CMC & Tox; Clinical Trials Phase 1, 2, 3; NME Declaration; Health Authority Filing.

5 Target Selection How do we decide which biological targets to pursue? Internal Based Discovery External Collaborations Literature

6 Increasing Awareness on Reproducibility: "scientists lift the lid on reproducibility" (Nature survey headline)

7 Increasing Awareness & Challenges. A team at Bayer HealthCare in Germany last year reported that only about 25% of published preclinical studies could be validated to the point at which projects could continue (Oncology, Women's Health, Cardiovascular). Prinz, F., Schlange, T. and Asadullah, K. Nature Rev. Drug Discov. 10, 712. Of 53 landmark oncology studies, the scientific findings were confirmed in only 6 cases (11%). (In these 6 studies, the authors had paid close attention to controls, reagents, investigator bias and description of the complete data set.) Begley, C.G. and Ellis, L.M. Nature 483.

8 Increasing Awareness & Challenges of Issues with Data Integrity in the Research Environment. Jan 2014: RIKEN (Japan's biggest research institute) reports in Nature an easy way of creating stem cells. Bloggers immediately pointed to possibly manipulated images. Replication was not possible and fraud was found, along with a lack of management oversight and a push for breakthrough results. RIKEN's Center for Developmental Biology was stripped of half its staff, renamed and placed under new management.

9 Increasing Awareness & Challenges of Issues with Data Integrity in the External Research Environment. Van Noorden, R. Nature 478, 26–28.

10 Where are the risks?

11 How can we be successful in this environment? It is all about the data! Only healthy data lead to healthy patients. Basic Research (Synthesis, Screening, Lead optimization) to Preclinical (Safety, Toxicology, PK) to Clinical (Phase 1, 2, 3, 4) to Registration: from >1,000,000 compounds, to 5,000 compounds, to 5–2 compounds, to 1 drug; years / $2.6 billion.

12 Key Learning. We need to ensure the highest data integrity, internally and externally, and from the very beginning. But how?

13 How can it go wrong? a workshop

14 Workshops:
Work in teams
Different fictitious issues / questions will be presented
Try to come up with an answer or solution
Group discussion

15 How can it go wrong? Workshop 1

16 Workshop 1 In-vivo efficacy test Effect of a compound on the size of subcutaneous tumors in mice. What are the issues in the experiment? Think about how these may impact the conclusion of the experiment.

17 Workshop 1: Reported data. Graph: median relative tumor volume (mm³) vs. time (day); vehicle (N=6), 30 mpk (N=5), 100 mpk (N=5). Conclusion: dose-dependent tumor regression. BUT: the 40 mpk dosing group was omitted; data from D30–D36 were omitted; exclusion criteria were not followed; animals were omitted without explanation.

18 Workshop 1: Graph with all data included. Median relative tumor volume (mm³) vs. time (day); vehicle (N=6), 30 mpk (N=5), 40 mpk (N=6), 100 mpk (N=6). NO dose-dependent tumor regression; tumor regrowth during treatment. Conclusion: efficacy is questionable.

19 Workshop 1 Key Learnings: Full disclosure is critical to enable reproducibility and sound decision making Predefined criteria should be closely followed
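The omissions in this workshop can also be caught mechanically by diffing the raw study design against what was reported. A minimal sketch (the group names and measurement days below are a hypothetical layout mirroring the workshop, not the slide's actual data):

```python
def disclosure_gaps(raw, reported):
    """Return dose groups and measurement days present in the raw data
    but absent from the report; each gap must be justified by a
    predefined exclusion criterion, never silently dropped."""
    missing_groups = sorted(set(raw) - set(reported))
    missing_days = {}
    for group, days in raw.items():
        gap = sorted(set(days) - set(reported.get(group, [])))
        if gap:
            missing_days[group] = gap
    return missing_groups, missing_days

# Hypothetical layout: tumor volume measured every 3 days up to D36
all_days = list(range(0, 37, 3))
raw = {g: all_days for g in ("vehicle", "30 mpk", "40 mpk", "100 mpk")}
# The "reported" version drops the 40 mpk group and days D30-D36
reported = {g: [d for d in all_days if d < 30]
            for g in ("vehicle", "30 mpk", "100 mpk")}

groups, days = disclosure_gaps(raw, reported)
print(groups)              # the silently omitted dosing group
print(days["vehicle"])     # the silently omitted late time points
```

Such a comparison does not decide whether an exclusion was legitimate; it only guarantees that every exclusion becomes visible and must be explained.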

20 How can it go wrong? Workshop 2

21 Workshop 2. 96-well plate, plate reader, data from the plate reader, processed data, reported data.

22 Workshop 2. For Compound 2, what are the issues in the experiment? Think about how these may impact the conclusion.

23 Workshop 2. Plate layout: rows A–H; concentrations 10 µM, 3 µM, 1 µM, 0.3 µM, 0.1 µM, 0.03 µM. Calculation of mean values: data used for the calculation of mean, control and SEM; data excluded from mean calculation.

24 Workshop 2. Processed data table: control (0 µM) and test compound 2 at 10 µM, 3 µM, 1 µM, 0.3 µM, 0.1 µM and 0.03 µM, each with mean and SEM. Reported data.
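The mean/SEM step in this workshop is where silent exclusions creep in. A sketch of how the calculation can force every exclusion to carry a documented reason (the well readings and the flagged well are hypothetical, not the slide's data):

```python
import math
from statistics import mean, stdev

def well_summary(values, excluded=(), reason=None):
    """Mean and SEM over replicate wells. Excluding a well requires a
    documented reason tied to a predefined criterion, and the exclusion
    travels with the result instead of disappearing from the report."""
    if excluded and not reason:
        raise ValueError("wells excluded without a documented reason")
    kept = [v for i, v in enumerate(values) if i not in set(excluded)]
    return {"n": len(kept),
            "mean": mean(kept),
            "sem": stdev(kept) / math.sqrt(len(kept)),
            "excluded": sorted(excluded),
            "reason": reason}

# Hypothetical readings for one concentration; well index 3 was
# flagged by the liquid handler as a failed dispense.
wells = [0.82, 0.79, 0.81, 0.20]
s = well_summary(wells, excluded=[3], reason="failed dispense (handler log)")
print(s["n"], round(s["mean"], 3))
```

Because the excluded indices and the reason are part of the returned record, the processed and reported tables can be audited against each other later.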

25 Workshop 2. Graph: Parameter x vs. concentration (µM); one data point not shown.

26 Workshop 2 Key Learnings:
Assay robustness is key
Criteria for test acceptance
Importance of internal reviews
Importance of transparency/full disclosure
Reminder: You should check with your organization's project leader, management/administration, and legal representatives to determine whether it has a written policy or guidelines pertaining to the recording of experimental data in a laboratory notebook system.

27 Agenda. Disclaimer: The views expressed in this presentation are solely those of the individual authors and do not necessarily reflect the views of their employers.
1. Roadmap to bringing a drug to market (10 min)
2. How can it go wrong? A workshop (30 min)
3. Reproducibility in research (30 min) — We are here
Break
4. Defining quality: a workshop (1 h)

28 Reproducibility in research Example 1

29 Reproducibility example 1. For your compound XYZ, an optimized cell-based assay followed by ELISA was run multiple times. You, as project leader, are asked to summarize and present the results. Therefore, you contact the lab manager to provide you with all the individual test results.

30 Reproducibility example 1 It is clear that a lot of variation is seen in the results. You have asked the lab members to provide more information to understand if the data is of comparable quality. Consider which of the following information would help determine which results can or should be used in the upcoming presentation.

31 Reproducibility example 1 Staff who produced the data

32 Reproducibility example 1: Experimental comments

33 Reproducibility example 1. Validation criteria (+/− controls) established in the protocol. Link to protocol.

34 Reproducibility example 1 Validation criteria (S/N ratio of >50) established in the protocol Link to protocol

35 Reproducibility example 1. Validation criteria (pIC50 of reference compound at 7.4 ± 0.4) established in the protocol. Link to protocol.

36 Reproducibility example 1. Results that meet the acceptance criteria: positive and negative controls; signal-to-noise ratio; reference compound performance.

37 Key Learnings: Reproducibility example 1
Criteria can be both quantitative and qualitative
Criteria determine the acceptance or rejection of assay results
Define criteria prior to the start of the experiments
Examples: range for controls; calibration curves; acceptable experimental error (e.g. % standard deviation, %CV); conditions or rules for determination of outliers
Establish a process to monitor ALL performed experiments
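Predefined acceptance criteria like the ones in this example can be encoded once and applied to every run. A sketch: the S/N > 50 and reference pIC50 7.4 ± 0.4 thresholds come from the preceding slides, while the 15% CV limit is a hypothetical placeholder for a control-variability criterion.

```python
# Predefined acceptance criteria, fixed before any runs are analyzed.
CRITERIA = {
    "min_signal_to_noise": 50.0,     # from the example protocol
    "ref_pic50_range": (7.0, 7.8),   # 7.4 +/- 0.4, from the example protocol
    "max_control_cv_pct": 15.0,      # hypothetical placeholder limit
}

def accept_run(signal_to_noise, ref_pic50, control_cv_pct, criteria=CRITERIA):
    """Return (accepted, failed_checks) for one assay run, so that a
    rejected run records exactly which predefined criterion it missed."""
    failures = []
    if signal_to_noise <= criteria["min_signal_to_noise"]:
        failures.append("signal-to-noise")
    lo, hi = criteria["ref_pic50_range"]
    if not (lo <= ref_pic50 <= hi):
        failures.append("reference compound pIC50")
    if control_cv_pct > criteria["max_control_cv_pct"]:
        failures.append("control %CV")
    return (not failures, failures)

ok, why = accept_run(signal_to_noise=62.0, ref_pic50=7.3, control_cv_pct=9.0)
print(ok)        # run meets all three criteria
ok2, why2 = accept_run(signal_to_noise=41.0, ref_pic50=6.4, control_cv_pct=9.0)
print(why2)      # which predefined checks failed
```

Keeping the criteria in one shared structure (rather than in each analyst's head) is what makes monitoring ALL performed experiments feasible.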

38 Reproducibility in research Example 2

39 Reproducibility example 2: A pivotal experiment is conducted. A scientist in your lab is asked to perform a pivotal experiment. The resulting data may be pivotal for decision making in a key project and will contribute to the intellectual property of your company's asset. What key pieces of information would you expect the scientist to capture in his or her lab notebook entry? Answer: title; objective; materials and equipment; procedures; analysis and/or statistical methods; references to raw data; results or conclusions; signatures and dates of the primary author and a witness.

40 Reproducibility example 2: Experiment record. The scientist completes the experiment and is thrilled to show you the results! He shows you a copy of the lab notebook that includes a thorough description of the objective, materials and procedures. He also pasted in a copy of the raw data as well as a graphical representation of the results from an Excel file he saved on his laptop. Do you have any comments or concerns related to the contents of the record? Capture: calculations used for analysis; reason for exclusion; statistical methods; location of raw and processed data.

41 Reproducibility example 2: A closer look. You work with the scientist to look further into the underlying files, and some issues are observed. While it was justified to remove an animal, the incorrect animal was excluded. A reanalysis of the calculations and graphing reveals that the result is no longer significant.

42 Key Learnings: Reproducibility example 2
Experiment records should contain (or cross-reference) the following information: title; names of individuals involved; date of each activity performed; objective; materials and equipment; procedures; analysis and/or statistical methods; raw, processed and final data; results or conclusions; signatures and dates of the primary author and a witness
Consider external data as well
Data storage
Reminder: You should check with your organization's project leader, management/administration, and legal representatives to determine whether it has a written policy or guidelines pertaining to the recording of experimental data in a laboratory notebook system.

43 Reproducibility in research Example 3

44 Reproducibility example 3: selective reporting of replicate tests. One compound was tested twice and the following results were generated. Are the reported results a complete representation of the experiments done? Results reported. 5th World Conference on Research Integrity, Amsterdam, The Netherlands.

45 Key Learnings: Reproducibility example 3
Full disclosure: report and/or list all findings (both positive and negative)
Clearly label replicates when presenting data in slide decks or reports

46 Reproducibility in research Example 4

47 Reproducibility example 4. Graph annotation: Why included? No criteria for exclusion of data points defined.

48 Reproducibility example 4. Two graphs of AUC (570), vehicle vs. treatment (P.O.): one without outlier exclusion and one with outlier exclusion, each annotated with a one-way ANOVA P value.

49 Key Learnings: Reproducibility example 4
Build criteria into your process to allow unbiased processing of data
Upfront outlier exclusion criteria
Consider automation of critical repetitive process steps
Consider peer review of analyzed and reported data
If an unexpected event happens during your experiment, document it in your data
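An upfront outlier rule, fixed before the data are seen and applied identically to every group, is the safeguard this example calls for. A sketch using the modified z-score of Iglewicz and Hoaglin (3.5 is their conventional threshold; the well readings below are hypothetical, not the slide's AUC data):

```python
from statistics import median

def mad_outliers(values, threshold=3.5):
    """Flag outliers by modified z-score: 0.6745 * |x - median| / MAD.
    The rule and threshold are fixed before the experiment and applied
    to every group, so exclusion cannot be steered by the hypothesis."""
    med = median(values)
    mad = median(abs(v - med) for v in values)
    if mad == 0:
        return values, []          # no spread: nothing can be flagged
    kept, removed = [], []
    for v in values:
        (removed if 0.6745 * abs(v - med) / mad > threshold else kept).append(v)
    return kept, removed

# Hypothetical replicate readings with one value far off the rest
kept, removed = mad_outliers([10, 11, 9, 10, 12, 80])
print(kept, removed)
```

A median/MAD rule is used here rather than a mean ± k·SD rule because with small group sizes (e.g. N=6) a single extreme value inflates the SD enough that it can never exceed 3 SD from its own group mean.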

50 Defining quality in research

51 Defining quality in discovery
1. What factors could boost research integrity?
2. Should there be common guidelines for non-regulated research? If yes, what should they look like?
3. Should there be an accreditation system for the quality of research? If yes, under which conditions could it work?

52 Discovery data integrity A quality approach for Janssen R&D

53 Janssen's Therapeutic Area Research Focus. End-to-end strategy focused on five therapeutic areas:
NEUROSCIENCE: schizophrenia, mood disorders, Alzheimer's disease, chronic pain
IMMUNOLOGY: inflammatory bowel disease, rheumatoid arthritis, psoriasis, pulmonary disease
INFECTIOUS DISEASES & VACCINES: HIV, hepatitis C, respiratory, vaccines
CARDIOVASCULAR & METABOLISM: cardiovascular disease, diabetes
ONCOLOGY: prostate, lung, B-cell malignancies, immuno-oncology

54 "…everything we do must be of high quality. We must experiment with new ideas… Research must be carried on, innovative programs developed…"

55 Finding the Balance: Innovation & Quality. Innovation, research, exploration vs. integrity, accuracy, reconstructability. Albert Einstein: "Imagination is more important than knowledge."

56 Implementation of a Quality System: as-is situation; monitor compliance; improvements and agreements; quality culture.

57 Janssen's DDI quality system
Culture & communication: poster campaigns; communication by senior management; interactive GRP training sessions
Quality program & metrics: spot checks on certain key data; data traceability and integrity; continuous improvement of data handling practices by joint Quality and Discovery efforts
Gap analysis: joint effort by QA and discovery scientists; solve local and group-specific issues; electronic notebook; implement best practices; senior leaders made accountable
Internal and external data flows: form multidisciplinary teams to leverage best practices and tackle gaps (IT, communications, biostatistics, QA, scientists); global, sustainable and scalable solutions

58 Janssen's DDI quality system
Basic training program and platform; 1 central DDI portal
Participative poster campaign ("Where did I put it?")
Where to store data; what data to store; safe data storage
Culture & communication; mandatory global DDI training; rewarding system
How to deal with fraud allegations/suspicions
Mandatory DDI section in electronic protocol templates
Internal and external non-regulated QA program; quality maintenance program
Reporting of individual experiments; clinical candidate reporting; reporting process
Bias prevention; increased biostat support; publication process
External contracts; external data
Status legend: planned / implementation phase / fully implemented

59 Key success factors
Role models: senior leaders' sponsorship & support; talking the talk, walking the walk
Mandatory education for all staff; community participative communication
Partnerships: Quality, IT, biostatisticians, communications
Simple, sustainable solutions and fit-for-purpose guidance, by scientists, for scientists
Transparency: central data sharing; spot check program (= measure of success); speak-up culture (hotline)

60 Resources available: quality non regulated scientific research/ scientific_recordkeeping.pdf

61 Towards a common quality system for non-regulated research?

62 EQIPD (IMI project): European Quality In Preclinical Data. What is the Innovative Medicines Initiative (IMI)?
First IMI consortium completely dedicated to improving preclinical data quality
Joint undertaking by big pharma, CROs, academia and scientific associations
Proof of concept in Neuroscience and Safety, facilitated by a Quality Management System
Expand R&D-wide if successful
Participants: 11 EFPIA partners; 18 applicants (10 universities, 7 CROs, 1 scientific society); 7 associate collaborators; 5 advisors

63 EQIPD (IMI project): European Quality In Preclinical Data. What is the Innovative Medicines Initiative (IMI)?

64 QUESTIONS

65 Thank you