Reporting Checklist for Nature Neuroscience

Corresponding Author: Hillel Adesnik
Manuscript Number: NNA58169T
Manuscript Type: Article
# Main Figures: 5
# Supplementary Figures: 6
# Supplementary Tables: 1
# Supplementary Videos: 0

This checklist is used to ensure good reporting standards and to improve the reproducibility of published results. For more information, please read Reporting Life Sciences Research. Please note that in the event of publication, it is mandatory that authors include all relevant methodological and statistical information in the manuscript.

Statistics reporting, by figure

Please specify the following information for each panel reporting quantitative data, and where each item is reported (section, e.g. Results, & paragraph number). Each figure should ideally contain an exact sample size (n) for each experimental group/condition, where n is an exact number and not a range; a clear definition of how n is defined (for example, x cells from x slices from x animals from x litters, collected over x days); a description of the statistical test used; the results of the tests; any descriptive statistics; and clearly defined error bars if applicable. For any experiments using custom statistics, please indicate the test used and the statistics obtained for each experiment. Each figure should include a statement of how many times the experiment shown was replicated in the lab; the details of sample collection should be sufficiently clear so that the replicability of the experiment is obvious to the reader. For experiments reported in the text but not in the figures, please use the paragraph number instead of the figure number. Note: mean and standard deviation are not appropriate on small samples, and plotting independent data points is usually more informative. When technical replicates are reported, error and significance measures reflect the experimental variability and not the variability of the biological process; it is misleading not to state this clearly.

FIGURE NUMBER: 1a (Results, paragraph 6)
TEST USED (WHICH TEST?): one-way ANOVA; unpaired t test (Results, paragraph 6)
EXACT n VALUE: 9, 9, 10, 15
n DEFINED? mice from at least 3 litters/group; 15 slices from 10 mice (Methods, paragraph 8; Results, paragraph 6)
DESCRIPTIVE STATS (AVERAGE, VARIANCE) REPORTED? error bars are mean ±; error bars are mean ± (Results, paragraph 6)
P VALUE (EXACT VALUE): p = 0.044; p = 0.0006 (Results, paragraph 6)
DEGREES OF FREEDOM & F/t/z/R/ETC VALUE: F(3, 36) = 2.97; t(28) = 2.808 (Results, paragraph 6)

FIGURE NUMBER: 1c, 1e, 1f, 3, 5, 5, 2b & 6, 2, 7, 6, 6, 7, 4a &, 4b &, 5c &, 5e &
TEST USED (WHICH TEST?): rank sum test; rank sum test; & 3
EXACT n VALUE: 32, 16, 3, 11, 5
n DEFINED? mice (both PV and SOM), one session each; mice (both PV and SOM), one session each; 17 cells; 11 and 11; 5; 74, 8 & 6 & 7; mice (both PV and SOM), one session each; identified SOM and PV cells; FS/PV and putative SOM cells; 11 SOM mice, 18 PV mice; & 3 & 4; 3 & 6 & 7; 6; 21 locked L2/3 RS cells; 6; 6; 17 locked L2/3 FS cells; 6; 7; 47 locked L2/3 RS cells; 7; 7 SOM mice, 4 PV mice; 10 SOM mice; 10 SOM mice
DESCRIPTIVE STATS (AVERAGE, VARIANCE) REPORTED? gamma power with size: error bars are mean ±; percent reduction w. ori. offs. surr.: mean ±; effect of size on IPSP power: error bars are; percent reduction w. phase offs. surr.: mean ±; 3
P VALUE (EXACT VALUE): 3.09e-21; 4.38e-4; 0.0003 (and Results 1 & 3 & 4)
DEGREES OF FREEDOM & F/t/z/R/ETC VALUE: chi2 = 105.85; z = 3.52; chi2 = 23.07; 3; 9.77e-4; 3; PPC mean ±; 5; 0.47; 5; z = 0.72; correlation @ 30 Hz ±; gamma power ctrl vs. SOM: mean ±; gamma power ctrl vs. PV: mean ±; PPC ctrl. vs. SOM; PPC ctrl. vs. SOM; PPC ctrl. vs. PV; stim frequency on band power (SOM); stim frequency on band power (PV); coherence iso vs. cross: mean ±; coherence ctrl. vs. SOM: mean ±; 5; 0.002; 5; z = 3.1; 6; 0.024; 7; 0.17; 6; 6; 3.21e-4; 5.03e-4; & 6 & 7; z = 1.37; z = 3.60; z = 3.48; 7; 0.01; 7; z = 2.59; legend; legend; legend; legend; 3.17e-5; 0.002; 0.002; 4a &; 4a &; 5c &; 5e &; chi2 = 38.45; chi2 = 22.44

FIGURE NUMBER: Suppl. 1a, Suppl. 1b, Suppl. 1c, Suppl. 1g, Suppl. 1i, Suppl. 2b, Suppl. 2b, Suppl. 2b, Suppl. 2b, Suppl. 2b, Suppl. 2e, Suppl. 3b, Suppl. 3b, Suppl. 3b, Suppl. 3b, Suppl. 3b, Suppl. 5a, Suppl. 5a
TEST USED (WHICH TEST?): correlation coefficient
EXACT n VALUE / n DEFINED? 25 mice (SOM and PV); 8 mice; 4 mice; 22 mice; 32 mice; mice; mice; mice; 7 mice; 11 mice; mice; 18 mice; 18 mice; 18 mice; 8 mice; mice; 11 mice; mice
DESCRIPTIVE STATS (AVERAGE, VARIANCE) REPORTED? high gamma with size: error bars are mean ±; high gamma with contrast: error bars are mean ±; high gamma with luminance: error bars are mean ±; high gamma running vs. quiescent; peak frequency with size: error bars are mean ±; relative gamma power ctrl vs. SOM; peak/trough gamma power ctrl. vs. SOM; absolute gamma power ctrl vs. SOM; absolute gamma power (light before vis) ctrl vs. SOM; high gamma power ctrl. vs. SOM; gamma center frequency with center; reduction in frequency with SOM; relative gamma power ctrl vs. PV; peak/trough gamma power ctrl. vs. PV; absolute gamma power ctrl vs. PV; absolute gamma power (light before vis) ctrl vs. PV; high gamma power ctrl. vs. PV; gamma power (nonrunning) ctrl. vs. SOM; gamma power (nonrunning) ctrl. vs. PV
P VALUE (EXACT VALUE): 3.04e-18; 0.002; 0.003; 5.3e-5; 0.0; 1.22e-4; 0.003; 2.44e-4; 0.02; 0.08; 9.01e-4; 0.06; 0.95; 1.96e-4; 0.008; 0.008; 9.77e-4; 0.001 (legend)
DEGREES OF FREEDOM & F/t/z/R/ETC VALUE: chi2 = 91.64; chi2 = 18.71; chi2 = 15.83; z = 4.04; chi2 = 10.68; r = 0.78; z = 1.85; z = 0.07; z = 3.72

FIGURE NUMBER: 4f &; 4f &; 11; Suppl. 6a; Suppl. 6a; Suppl. 6a
TEST USED (WHICH TEST?): Friedman test; 2-way ANOVA; 2-way ANOVA; 2-way ANOVA
EXACT n VALUE / n DEFINED? pairs (Legend); pairs (Legend); 7 cells; 6 and 4 mice; 6 and 4 mice; 3 and 4 mice
DESCRIPTIVE STATS (AVERAGE, VARIANCE) REPORTED? cross spectral power: Control vs. Feedback vs. Pseudofeedback (Legend); zero peak of cross correlation: Control vs. Feedback vs. Pseudofeedback; IPSP power control vs. light; effect of anesthesia on fold increase; anesthesia-frequency interaction; effect of brain area on fold increase (Legend)
P VALUE (EXACT VALUE): 8.95e-08; 4.3e-6 (Legend); 0.0217; 0.0021; 0; 0.44
DEGREES OF FREEDOM & F/t/z/R/ETC VALUE: chi2 = 48.11; chi2 = 32.42; chi2 = 5.27; F = 10.01; F = 7.54; F = 0.6

Representative figures

1. Are any representative images shown (including Western blots and immunohistochemistry/staining) in the paper? If so, what figure(s)?
Yes, supplemental figures 3a and 4a.

2. For each representative image, is there a clear statement of how many times this experiment was successfully repeated and a discussion of any limitations in repeatability? If so, where is this reported (section, paragraph #)?
Virally induced opsin expression in SOM and PV mice was repeated in all mice reported on here. There is no challenge in repeatability using the described protocol.

Statistics and general methods

1. Is there a justification of the sample size? If so, how was it justified? Even if no sample size calculation was performed, authors should indicate why the sample size is adequate to measure their effect size.
Sample size was not explicitly chosen and we collected data from as many cells as possible. In each case, a specific number of mice were available, they were recorded from, and the data were analyzed.

2. Are statistical tests justified as appropriate for every figure?
We used only non-parametric tests to determine significance, except for Suppl. 6a where we used a 2-way ANOVA to test for interaction terms. Each figure states the test, number of samples and the p value for each test.

a. If there is a section summarizing the statistical methods in the methods, is the statistical test for each experiment clearly defined?
There is a statistics section in the methods.

b. Do the data meet the assumptions of the specific statistical test you chose (e.g. normality for a parametric test)? Where is this described (section, paragraph #)?
The rank sum test, Friedman test and the other tests used do not require any assumptions to be met. We did specifically test for normality of the residuals for the ANOVA in Suppl. 6a.
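As a concrete illustration of the kind of analysis described above, the sketch below runs a two-sided Wilcoxon rank-sum comparison between two conditions and a Shapiro-Wilk normality check on within-group residuals of the sort one would consult before trusting an ANOVA. It is not the authors' analysis code (their pipeline used MATLAB toolboxes); the data, group sizes, variable names and the use of scipy are illustrative assumptions only.

```python
# Illustrative sketch (not the authors' code): a two-sided rank-sum test
# and a normality check on within-group residuals. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ctrl = rng.normal(1.0, 0.3, size=11)   # hypothetical gamma power, control trials
stim = rng.normal(0.6, 0.3, size=11)   # hypothetical gamma power, light-on trials

# Non-parametric, two-sided comparison (no normality assumption needed)
z_stat, p_val = stats.ranksums(ctrl, stim)
print(f"rank-sum: z = {z_stat:.2f}, p = {p_val:.3g}")

# Normality of residuals: subtract each group's mean, pool, then test
residuals = np.concatenate([ctrl - ctrl.mean(), stim - stim.mean()])
w_stat, p_norm = stats.shapiro(residuals)
print(f"Shapiro-Wilk on residuals: W = {w_stat:.3f}, p = {p_norm:.3g}")
```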

c. Is there any estimate of variance within each group of data? Is the variance similar between groups that are being statistically compared? Where is this described (section, paragraph #)?
Comparison of variance within and between groups was performed with the KW (Kruskal-Wallis) test.

d. Are tests specified as one- or two-sided?
All tests are two-sided; paired and rank sum tests are two-sided.

e. Are there adjustments for multiple comparisons?
No multiple comparisons were performed.

3. To promote transparency, Nature Neuroscience has stopped allowing bar graphs to report statistics in the papers it publishes. If you have bar graphs in your paper, please make sure to switch them to dot plots (with central and dispersion statistics displayed) or to box-and-whisker plots to show data distributions.
There are no bar graphs in the figures.

4. Are criteria for excluding data points reported? Was this criterion established prior to data collection? Where is this described (section, paragraph #)?
Yes; yes; methods section.

5. Define the method of randomization used to assign subjects (or samples) to the experimental groups and to collect and process data. If no randomization was used, state so. Where does this appear (section, paragraph #)?
No randomization was necessary.

6. Is a statement of the extent to which the investigator knew the group allocation during the experiment and in assessing outcome included? If no blinding was done, state so.
No blinding was done.

7. For experiments in live vertebrates, is a statement of compliance with ethical guidelines/regulations included?

8. Is the species of the animals used reported?

9. Is the strain of the animals (including background strains of KO/transgenic animals used) reported?

10. Is the sex of the animals/subjects used reported?
Both male and female mice were used; methods section.

11. Is the age of the animals/subjects reported?

12. For animals housed in a vivarium, is the light/dark cycle reported?

13. For animals housed in a vivarium, is the housing group (i.e. number of animals per cage) reported?

14. For behavioral experiments, is the time of day (e.g. light or dark cycle) reported?
No behavioral experiments were performed.

15. Is the previous history of the animals/subjects (e.g. prior drug administration, surgery, behavioral testing) reported?
N/A

a. If multiple behavioral tests were conducted in the same group of animals, is this reported?

16. If any animals/subjects were excluded from analysis, is this reported?
No exclusions.

a. How were the criteria for exclusion defined? Where is this described (section, paragraph #)?

b. Specify reasons for any discrepancy between the number of animals at the beginning and end of the study. Where is this described (section, paragraph #)?
N/A

Reagents

1. Have antibodies been validated for use in the system under study (assay and species)?

a. Is the antibody catalog number given? Where does this appear (section, paragraph #)?

b. Where were the validation data reported (citation, supplementary information, Antibodypedia)? Where does this appear (section, paragraph #)?

2. Cell line identity

a. Are any cell lines used in this paper listed in the database of commonly misidentified cell lines maintained by ICLAC and NCBI Biosample?

b. If yes, include in the Methods section a scientific justification of their use; indicate here in which section and paragraph the justification can be found.

c. For each cell line, include in the Methods section a statement that specifies: the source of the cell lines; whether the cell lines have been authenticated and, if so, by which method; and whether the cell lines have been tested for mycoplasma contamination.
N/A

Data availability

Provide a Data availability statement in the Methods section under "Data availability", which should include, where applicable:
Accession codes for deposited data
Other unique identifiers (such as DOIs and hyperlinks for any other datasets)
At a minimum, a statement confirming that all relevant data are available from the authors
Formal citations of datasets that are assigned DOIs
A statement regarding data available in the manuscript as source data
A statement regarding data available with restrictions
See our data availability and data citations policy page for more information.

Data deposition in a public repository is mandatory for:
a. Protein, DNA and RNA sequences
b. Macromolecular structures
c. Crystallographic data for small molecules
d. Microarray data

Deposition is strongly recommended for many other datasets for which structured public repositories exist; more details on our data policy are available here. We encourage the provision of other source data in supplementary information or in unstructured repositories such as figshare and Dryad. We encourage publication of Data Descriptors (see Scientific Data) to maximize data reuse.

Where is the Data Availability statement provided (section, paragraph #)?
All data can be made available upon reasonable request.

Computer code/software

Any custom algorithm/software that is central to the methods must be supplied by the authors in a usable and readable form for readers at the time of publication. However, referees may ask for this information at any time during the review process.

1. Identify all custom software or scripts that were required to conduct the study and where in the procedures each was used.
We used MClust spike-sorting software (available from the author) and FieldTrip, as well as the Chronux MATLAB toolboxes, for spectral analysis and coherence estimates (both available online).

2. If computer code was used to generate results that are central to the paper's conclusions, include a statement in the Methods section under "Code availability" to indicate whether and how the code can be accessed. Include version information as necessary and any restrictions on availability.
All code can be made available upon reasonable request.
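For readers unfamiliar with the spectral estimates mentioned above, the sketch below computes a Welch-averaged magnitude-squared coherence between two signals and averages it over a gamma-band window. It is only a schematic stand-in for the Chronux/FieldTrip-based pipeline the authors cite; the sampling rate, synthetic signals and scipy-based implementation are assumptions made for illustration.

```python
# Minimal illustrative sketch of a coherence estimate between two signals
# (e.g. an LFP and a reference channel). Not the authors' pipeline;
# signals and parameters are synthetic placeholders.
import numpy as np
from scipy.signal import coherence

fs = 1000.0                                  # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
common = np.sin(2 * np.pi * 30 * t)          # shared 30 Hz component
x = common + rng.normal(0, 1, t.size)        # noisy channel 1
y = 0.8 * common + rng.normal(0, 1, t.size)  # noisy channel 2

# Welch-averaged magnitude-squared coherence, then a band average
f, Cxy = coherence(x, y, fs=fs, nperseg=1024)
band = (f >= 25) & (f <= 35)
print(f"mean coherence in 25-35 Hz: {Cxy[band].mean():.2f}")
```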

Human subjects

1. Which IRB approved the protocol? Where is this stated (section, paragraph #)?

2. Is demographic information on all subjects provided?

3. Is the number of human subjects, their age and sex clearly defined?

4. Are the inclusion and exclusion criteria (if any) clearly specified?

5. How well were the groups matched? Where is this information described (section, paragraph #)?

6. Is a statement included confirming that informed consent was obtained from all subjects?

7. For publication of patient photos, is a statement included confirming that consent to publish was obtained?

fMRI studies

For papers reporting functional imaging (fMRI) results, please ensure that these minimal reporting guidelines are met and that all this information is clearly provided in the methods:

1. Were any subjects scanned but then rejected for the analysis after the data was collected?

a. If yes, is the number rejected and reasons for rejection described?

2. Is the number of blocks, trials or experimental units per session and/or subjects specified?

3. Is the length of each trial and interval between trials specified?
N/A

4. Is a block, event-related, or mixed design being used? If applicable, please specify the block length or how the event-related or mixed design was optimized.

5. Is the task design clearly described?

6. How was behavioral performance measured?

7. Is an ANOVA or factorial design being used?

8. For data acquisition, is a whole brain scan used? If not, state area of acquisition.

a. How was this region determined?

9. Is the field strength (in Tesla) of the MRI system stated?

a. Is the pulse sequence type (gradient/spin echo, EPI/spiral) stated?

b. Are the field-of-view, matrix size, slice thickness, and TE/TR/flip angle clearly stated?

10. Are the software and specific parameters (model/functions, smoothing kernel size if applicable, etc.) used for data processing and preprocessing clearly stated?

11. Is the coordinate space for the anatomical/functional imaging data clearly defined as subject/native space or standardized stereotaxic space, e.g., original Talairach, MNI305, ICBM152, etc.? Where (section, paragraph #)?

12. If there was data normalization/standardization to a specific space template, are the type of transformation (linear vs. nonlinear) used and the image types being transformed clearly described? Where (section, paragraph #)?

13. How were anatomical locations determined, e.g., via an automated labeling algorithm (AAL), standardized coordinate database (Talairach daemon), probabilistic atlases, etc.?

14. Were any additional regressors (behavioral covariates, motion, etc.) used?

15. Is the contrast construction clearly defined?

16. Is a mixed/random effects or fixed effects inference used?

a. If fixed effects inference was used, is this justified?

17. Were repeated measures used (multiple measurements per subject)?

a. If so, are the methods to account for within-subject correlation and the assumptions made about variance clearly stated?

18. If the threshold used for inference and visualization in figures varies, is this clearly stated?

19. Are statistical inferences corrected for multiple comparisons?

a. If not, is this labeled as uncorrected?

20. Are the results based on an ROI (region of interest) analysis?

a. If so, is the rationale clearly described?

b. How were the ROIs defined (functional vs. anatomical localization)?

21. Is there correction for multiple comparisons within each voxel?

22. For cluster-wise significance, are the cluster-defining threshold and the corrected significance level defined?

Additional comments