2019 National Child Welfare Evaluation Summit Call for Abstracts

August 20–21, 2019, Washington, DC

The Children's Bureau (CB), within the Administration for Children and Families at the U.S. Department of Health and Human Services, is pleased to host the 2019 National Child Welfare Evaluation Summit on August 20–21, 2019, at the Marriott Wardman Park in Washington, DC.

Background

CB is committed to advancing efforts that prevent maltreatment and strengthen families to nurture and provide for their children's well-being. CB's vision is that child welfare will become a more prevention-oriented, community-driven system that ensures families can thrive, reduces unnecessary family disruption, and keeps children in their communities whenever possible.

Child welfare systems across the country are in the process of developing 5-year strategic Child and Family Services Plans. CB has encouraged child welfare agencies and courts to use these plans as an opportunity to engage partners to set shared goals that reflect their communities' aspirations for children and families. In addition, jurisdictions are implementing Child and Family Services Review Program Improvement Plans (PIPs) and launching evidence-based, prevention-focused initiatives in response to the Family First Prevention Services Act (FFPSA). Many are also exploring state and local strategies to better understand risk of child abuse and neglect, testing innovative practices through federally funded grant projects, and wrapping up large-scale title IV-E waiver demonstration projects.

Data analysis, research, program evaluation, performance measurement, and ongoing continuous quality improvement (CQI) are crucial to achieving CB's vision for a more preventive system and delivering more effective services that will ensure safety, permanency, and well-being for children, youth, and families. The Evaluation Summit offers a timely opportunity for partners from child welfare systems and the research community to share relevant methods, findings, successes, and challenges that can help improve child welfare services and inform goals for the future of child welfare at the local and national levels.

Evaluation Summit Purpose and Goals

The theme of the Evaluation Summit is Leveraging Data and Evaluation to Strengthen Families and Promote Well-Being. The Evaluation Summit aims to strengthen the link between research and practice by offering a forum for exploring foundational questions about how stakeholders define and generate evidence in child welfare, as well as for sharing emerging research and evaluation findings. Participants will also discuss practical, ethical, and methodological dilemmas that accompany efforts to conduct analyses, research, evaluation, and CQI and apply findings to improve outcomes.

The Evaluation Summit will convene a wide variety of stakeholders in conversations that will help all participants improve the value and utility of research and evaluation and promote a more informed use of results. This event will engage up to 800 participants and target participation from the following groups: leaders from state, local, and tribal child welfare agencies; CQI and program managers; leaders from the legal and judicial community; child welfare researchers and program evaluators; directors of child welfare demonstration projects and studies; technical assistance providers; funders and policymakers; national and community-based partners and providers; and other stakeholders, including advocates, caregivers, and youth.

The Evaluation Summit supports three overarching goals: building evidence, strengthening practice, and informing policy. Building evidence describes efforts to examine the existing evidence base in child welfare, employ strategies that will generate credible and useful evidence, and share recent research and evaluation findings. Strengthening practice refers to efforts to inform and improve case-level child welfare practice and program- and system-level decision-making (including through CQI and quality assurance processes), as well as efforts to strengthen the practice and process of evaluation itself. Informing policy addresses the use of evaluation, data, and research findings to drive well-informed decision-making and policymaking, including selecting, designing, and implementing practices, programs, and initiatives that will improve outcomes for children, youth, and families.

Call for Abstracts

The Call for Abstracts for the 2019 Evaluation Summit is broad and inclusive. Any proposal that substantively contributes to one or more of the three primary goals (building evidence, strengthening practice, and informing policy) will be considered. We expect the submission selection process to be highly competitive. Proposals that advance CB's prevention-focused vision for the future of child welfare, inform efforts to implement the FFPSA, or promote the routine and practical use of data and evaluation to continue to improve child welfare practice are strongly encouraged. We also have an interest in proposals that demonstrate productive and meaningful collaboration, such as interjurisdictional and cross-disciplinary approaches to data collection, sharing, and analysis and/or partnerships between evaluators and child welfare agencies or communities of interest.

In general, we seek sessions that focus on longstanding and emerging evaluation issues and practice challenges, with special attention to points of debate, gaps in knowledge, innovative methods, and new findings that contribute to stakeholders' understanding of how to prevent and respond to child maltreatment more effectively. Proposals that engage multiple conference audiences, invite constructive dialogue, and present multiple points of view may be given preference. We hope to receive proposals that will address a wide array of topics, including submissions in the following areas:

1. Exploring Evaluation Design, Methods, and Measurement

Proposals in this area should explore the purposes of evaluation; the choices stakeholders make about evaluation approach, design, and measurement; and stakeholder analyses based on these choices. Presenters are encouraged to discuss strengths, limitations, and implications and how these affect what stakeholders can learn and how evaluation findings should be used.

In addition to presentations that describe innovative and practical solutions to evaluation challenges, proposals in this area could explore a variety of topics and issues. For example:

- Examining key factors that guide the selection of research approach, design, and methods
- Defining and measuring nebulous constructs or outcomes (e.g., quality hearings, meaningful family engagement, healthy culture and climate)
- Addressing measurement and sampling challenges at the child, family, organizational, community, or population levels
- Weighing the relative advantages of quantitative, qualitative, and mixed methods
- Using evaluation approaches driven by stakeholders and/or participants

2. Using Data to Understand Characteristics, Trends, Predictors, and Performance

Proposals in this area should examine how effective data collection, analysis, and interpretation facilitate understanding of the populations served by child welfare workers, the services these populations receive, and how worker interventions affect the lives of children, youth, and families. Presenters are encouraged to discuss the use of child welfare administrative data, other quantitative datasets, and qualitative data to identify and explore challenges, performance, and outcomes through evaluation and CQI.

In addition to presentations that describe the nimble use of data to understand populations and performance, proposals in this area could explore a variety of topics and issues. For example:

- Using child welfare administrative data to identify and assess agency needs and strengths
- Analyzing data at the child, family, organizational, community, or population levels to understand and improve performance
- Exploring opportunities, risks, and technical choices when using predictive analytics in child welfare
- Developing agreements and models for sharing data across systems and agencies
- Using quantitative and qualitative data to identify and confirm possible root causes
- Exploring variation in the data, including overrepresentation and disparities in service delivery and outcomes

3. Communicating and Using Findings to Improve Practice

Proposals in this area should explore effective means of translating, communicating, and using research and evaluation findings to improve practice. Presenters are encouraged to describe how they have made findings more consumable and useful and how these efforts have successfully facilitated data-driven decision-making and improvements.

In addition to presentations that explore efforts to bridge the research-to-practice gap by improving the accessibility and applicability of findings, proposals in this area could explore a variety of topics and issues. For example:

- Attempting to make findings more relevant, meaningful, consumable, and useful (e.g., through data visualization, use of Bayesian methods, etc.)
- Tailoring communication and dissemination approaches for intended audiences and communities (e.g., courts, tribes, etc.)
- Communicating and using negative and null findings
- Teaming to use data to drive decision-making and program improvement, including between agencies and courts
- Integrating research and evaluation with ongoing CQI processes
- Evaluating dissemination, reach, consumption, and use of findings in child welfare

4. Demonstrating Efficacy and Effectiveness in Child Welfare

Proposals in this area should focus on efforts to define evidence and how tests of efficacy and effectiveness are conducted in a variety of complex contexts with diverse populations. Presenters are encouraged to discuss dilemmas, challenges, and implications associated with balancing research rigor with feasibility, considering resource realities; the time-sensitive needs of children, youth, and families; and the FFPSA requirements.

In addition to presentations that describe efforts to test specific prevention and child welfare interventions and share formative and summative evaluation findings, proposals in this area could explore a variety of topics and issues. For example:

- Understanding evidence continuums and rating criteria in child welfare and other fields
- Operationalizing programs and practices in child welfare and performing rigorous formative evaluation (e.g., casework practice models, workforce strategies, etc.)
- Making randomized controlled trials more accessible, practical, and feasible
- Examining factors that make quasi-experimental designs necessary and most appropriate
- Answering research questions about effectiveness and cost
- Setting standards and requirements for evaluation that will promote evidence building and progression on evidence continuums

5. Evaluating Implementation and Sustainability

Proposals in this area should highlight effective implementation strategies and help stakeholders better understand and measure implementation and sustainability. Presenters are encouraged to discuss practical methods for monitoring and measuring implementation, share specific findings associated with these approaches, and address implementation dilemmas, challenges, and issues that affect organizational ability to improve outcomes.

In addition to presentations that share results and describe factors that both enable and hinder successful implementation, including implications for organizational change and performance improvement efforts, proposals in this area could explore a variety of topics and issues. For example:

- Assessing and measuring organizational readiness and capacity
- Performing rigorous implementation studies
- Defining and measuring fidelity
- Accelerating evaluation timeframes to produce timely and actionable findings for improvement
- Using implementation data to make decisions about scaling up, scaling down, adapting, or sustaining practice
- Evaluating organizational change (e.g., organizational culture, transfer of learning, workload studies)

6. Leveraging Technology and Innovation in Evaluation

Proposals in this area should explain innovative evaluation approaches, including the use of technology to enhance and facilitate evaluation, as well as present results from the evaluation of new technologies and practice innovations. Presenters are encouraged to identify and discuss security and privacy issues associated with the use of emerging technologies in research, evaluation, or CQI efforts.

In addition to presentations that share novel evaluation and performance improvement strategies and technologies, proposals in this area could explore a variety of topics and issues. For example:

- Exploring new frontiers of innovation in child welfare research and evaluation (neuroscience, epigenetics, pharmacology, artificial intelligence, etc.)
- Evaluating innovations and technological advancements in workforce development (e.g., simulation training, virtual reality, and distance learning)
- Using geospatial mapping to better understand context, services, and performance
- Weighing opportunities and risks associated with data collection from widespread and emerging information technologies (e.g., search engines, social media, mobile applications)
- Ensuring human subject protection, privacy, and security when collecting and using data in a rapidly changing technological and research environment
- Building innovations into data infrastructure, including through the Comprehensive Child Welfare Information System

7. Conducting Population-Specific Research and Evaluation

Proposals in this area should describe efforts to understand the needs and experiences of specific demographic groups and communities and to evaluate the effectiveness of programs and services designed to support them. Presenters are encouraged to discuss opportunities, challenges, and considerations when selecting methods, collecting and analyzing data, and interpreting findings, especially when focused on potentially vulnerable subpopulations.

In addition to presentations that highlight promising approaches and important considerations for population-specific research and evaluation, proposals in this area could explore a variety of topics and issues. For example:

- Empowering and protecting vulnerable groups throughout the evaluation process
- Designing or adapting methods and measures for specific communities (e.g., children who are victims of sex trafficking, youth in transition, prospective resource families, immigrant caregivers)
- Examining differences in worldviews and their implications for building evidence
- Defining and promoting rigorous evaluation with tribal communities
- Collaborating with the legal and judicial community to evaluate practice
- Using evaluation to support the development or cultural adaptation of interventions
- Conducting research and evaluation to understand and address disparity and disproportionality

Session Formats

Applicants are encouraged to submit abstracts for presentations of findings and methods, interactive workshops, expert panel discussions, issue forums, and poster presentations. We hope to create dynamic spaces for participants of all experience levels to share successes and challenges, brainstorm new ideas, exchange tools and methods, and identify shifts to research and evaluation practice that will enhance the quality and accessibility of findings to improve performance and outcomes for children, youth, and families. We encourage proposals that reflect innovative methods for presenting information and sparking engagement, critical thinking, and robust dialogue.

Presentation of Findings/Methods (60 or 90 minutes): These presentations increase participant knowledge of research issues or evaluation findings, methods, concepts, or implications. Presentations typically have dedicated time for authors of one or more studies on a related topic to disseminate key findings or lessons learned. To address differing backgrounds and levels of evaluation expertise among participants, presentations may vary in complexity of content and area of evaluation practice. In addition to delivering content, presenters should plan to use approximately one quarter of the session time to answer audience questions and facilitate discussion.

Skill-Building Workshop (90 or 180 minutes): These interactive workshops increase knowledge and develop skills by offering participants the chance to practice and apply specific techniques. Skill-building workshops typically include explanation or demonstration of approaches or methods followed by facilitated time for practice and exploration of the strategies. Examples might include skill building in a type of measurement or demonstration of a new technology. Presenters should plan to devote at least half of the session time to actively engaging the audience in application exercises and discussion.

Expert Panel Discussion/Issue Forum (90 minutes): These sessions increase collective understanding and critical thinking by presenting diverse perspectives on a specific issue to promote discourse about solutions and potential courses of action. Typically, a facilitator will lead panelists and audience participants in discussion. Sessions may use an expert panel format with dedicated time for panelists to share their views or respond to facilitated questions on a particular issue followed by audience questions, or use a roundtable format that facilitates audience interaction and dialogue throughout the session. In some cases, facilitators may engage the audience in generating the content of the session, including information and strategies that will be recorded to inform current or future research activities.

Poster Presentation (2-hour formal display): These presentations provide an opportunity for individuals or groups to display their program and research findings in a poster format. Presentations on recent and emerging findings and research from emerging scholars are encouraged. A 4x6 poster board area will be available for these presentations. While the official poster session will occur for 2 hours, the posters will remain on display throughout the Evaluation Summit.

Selection Criteria

The following criteria will be considered when evaluating abstracts:

- The written proposal is clear.
- The proposed topic and materials are likely to achieve the presenters' learning objectives for the intended target audience(s).
- The proposed session will make a meaningful contribution to the conference and will clearly support one or more of the conference goals: building evidence, strengthening practice, and informing policy.
- The proposed session is appropriate for the selected session format and adheres to conference guidelines, including time allotted for audience participation.
- The proposed session is relevant to the selected topic area (e.g., Exploring Evaluation Design, Methods, and Measurement).
- The proposed session is likely to engage conference participants in dynamic and constructive dialogue about current evaluation-related issues affecting child welfare.
- The proposed content is relevant to child welfare agencies and their practice (e.g., prioritizing prevention, FFPSA implementation, PIP implementation, ongoing CQI efforts).

When applicable, additional criteria may include:

- The proposed session features multiple viewpoints and/or diverse perspectives (e.g., from the evaluator, agency leader, caseworker, judge, parent, youth, or funder) when presenting and interpreting findings, explaining approaches and methods, or discussing challenges and issues.
- The proposed session demonstrates cross-system or interdisciplinary collaboration that will be highlighted during the session.
- The proposed session describes an approach that incorporates adult learning principles, including attention to personal benefit and experience, different learning styles, and active participation and application.

Proposal Submission

The deadline to submit applications through the Abstract Submission Portal is February 1, 2019. Applicants can submit up to three proposals, although no one can be designated more than once as the lead presenter. Content presented previously at other conferences or meetings is acceptable.

Required Proposal Elements

- Primary presenter information (name, address, organization, state/territory/country, and biographical sketch; 100-word maximum)
- Presentation title (20-word maximum)
- Brief summary of abstract (100-word maximum)
- Session format
- Other presenters (optional; no more than four; name, contact information, and biographical sketch; 100-word maximum)
- Target audience (from drop-down menu)
- Primary focus area and topical area (from drop-down menu)
- Full description (summary of the substantive content and format of the presentation; 500-word maximum)
- Learning objectives (two or three; no more than 25 words each)

Participation Guidelines

While the Evaluation Summit does not have a registration fee, presenters must register to participate.

Supplemental material, to be made available through the Evaluation Summit's mobile application (app), is expected for every selected presentation. Presenters cannot use session time to sell or market products.

By submitting an abstract for consideration, applicants acknowledge that they:

- Have read and agree to abide by the terms of this policy statement
- Understand that the purpose of the session is primarily to make information and resources available to those working in child abuse and neglect practice, research, and policy, and secondarily, to the general public
- Understand that any presentations and handouts provided in conjunction with the session may be posted on the World Wide Web and made available to the general public (more information on handouts will be sent to those presenters whose abstracts have been accepted)
- Understand that their travel, lodging, and incidental expenses will not be underwritten or reimbursed, if selected

Submitting Your Abstract

Abstracts should be submitted for consideration through the online Abstract Submission Portal. Abstracts must be submitted online NO LATER THAN 11:59 p.m. on February 1, 2019. We suggest you prepare your draft submission using the above list of required elements before beginning the online submission process.

Register a new Account

- Complete the fields: First Name, Last Name, Address, Create and Confirm Password.
- Choose Author/Submitter.
- Select the Sign Up button.
- Select I am Submitting an Abstract, then Create Abstract on the side menu.

Once you've selected Create Abstract, you will be presented with a form that you must fill out. One of those fields will be for adding presenters.

- When adding a presenter, choose the type you wish to add by clicking the appropriate button. A small window will appear where you can fill out the contact information for that presenter. Be sure to complete all the mandatory fields, then click Add.
- After clicking Add, the small window will close and the newly created presenter will be added to the list of presenters for this submission. At this point you can either add another presenter or change the presenter type for presenters that have already been added.
- When all the mandatory abstract fields have been filled out, click the SUBMIT button at the bottom of the page.
- If there is a problem with your submission, you will be notified either by an alert box indicating the problem, or one of the fields will be highlighted indicating the problem.
- If there were no problems with your submission, you will be brought to the View My Abstracts page, where you will find all of your submissions, including the new one. You can get to the page via the menu item I am a Presenter, then View All My Submitted Abstracts.

Check your submission for accuracy. Please check your confirmation and ensure that all of your contact information, the title of your presentation, and your summary are complete and correct.

The information you have provided at this time will be used, as you have submitted it, to create our program, agenda, and website. In particular, please ensure that your presentation title and summary statement are not truncated due to character and word limits. It is your responsibility to address any questions or make any changes requested by the Evaluation Summit Planning Committee online. Send any questions about your submissions to the Evaluation Summit Planning Committee well in advance of the February 1, 2019, deadline so that we can help you make the changes or complete the submission on time. Changes may not be made during or following the review period.

The Abstract Review Process

All submitted abstracts that meet minimal acceptance requirements (i.e., submitted by the deadline and containing sufficient information in each of the required elements for reviewers to make rating decisions) will be reviewed by a minimum of three qualified individuals. Proposals will be scored on each of the required proposal elements. The Children's Bureau reserves the right to make all final decisions regarding which proposals to accept. We anticipate that those submitting abstracts will be notified as to whether or not their proposal has been accepted no later than March 2019.