Section 6: Observations and Recommendations



Using the Survey Results To Improve Your Organization

Here are some ideas to consider before reviewing the specific observations and recommendations that follow. The survey results detailed in this report provide significant information about the following:

- The aspects of your organization that are perceived to be most important by your staff and contribute most to their satisfaction.
- The areas in which you may already be meeting or exceeding staff expectations.
- The highest and lowest performance gaps.
- Strengths and weaknesses of your organization's quality improvement process.
- Employee ratings of the quality of programs, services, and activities available to students and staff.
- Changes needed to create an organization that provides higher levels of personal satisfaction and professional fulfillment to your staff.

After you examine and reflect on the contents of this report, there are a variety of options for follow-up action. Some of these are listed below, and others will be identified through discussions within your organization.

Market your strengths. Capitalize on the strengths indicated by the survey. Publicize your successes in areas where expectations are being met. Promote these strengths to prospective staff, and use this information to recruit more effectively.

Implement possible quick-fix remedies. While many areas of deficiency require strategic action, some changes can be implemented immediately. Quick action on items with high performance gaps can bring rapid and significant returns to the organization in increased staff satisfaction and better service to customers.

Make minor budget adjustments. Since the CQS can identify areas that are not perceived as important by staff, money, personnel, and energy may be reassigned to areas where there is greater indication of need for improvement.

Include results in long-range planning. The information provided by the CQS can be useful in long-range planning. The survey points to payoff areas where expending effort, focusing budgets, and developing measurable goals can bring important results.

Change the climate. The lack of a sense of community in an organization should be of great concern to administrators. Staff feelings about the organizational climate relate directly to their level of satisfaction. When staff feel they belong, they are likely to deliver higher quality service to internal and external customers. They are also more likely to build their careers and continue employment rather than look elsewhere.

Provide competitive advantage. Regularly assessing staff expectations and levels of satisfaction should give you a definite edge over other organizations. The results of this measurement activity will be most valuable if monitored over time. The focus should be on continuous improvement, measured and compared each time the survey is administered. Administrators should avoid comparing results from department to department. Rather, the goal should be to determine whether meaningful change is occurring throughout the organization.

This report is designed to establish a self-portrait of the organization as viewed by the people who make it work. With this information, precise benchmarking and meaningful, long-term improvement are possible. Careful documentation of progress over time will track your organization's success in the quest for continuous quality improvement.

Performance Horizons CONSULTING GROUP

Williston State College
Overall Composite Observations and Recommendations

Introduction

This is the seventh time the Campus Quality Survey has been used at Williston State College. There are both positive and negative factors revealed by the findings, and these provide much opportunity for analysis, reflection, and action. With the 2002, 2004, 2006, 2008, 2010, and 2013 data as a baseline, examination of current results allows for identification of advances and declines. These will highlight opportunities for continued improvement efforts.

The Campus Quality Survey results provide a snapshot of employee perceptions at a given time. The information is useful as a basis for analysis of the past; however, the data is even more valuable in charting a course of action for the future of the campus.

Survey respondents often express their desire to see survey results. Some educational organizations distribute survey results to all employees. This is an excellent idea, as it affirms that their input has been valued and they can see how the organization as a whole has responded. Distribution of results can be accomplished by referencing and highlighting the various charts, tables, observations, and recommendations in staff communications. Some institutions place copies of the entire report in their resource center for checkout by employees or post the report on their internal website. This helps instill confidence by avoiding the perception that results are selectively revealed.

The Basis for These Observations and Recommendations

The observations and recommendations in this section are based on a comprehensive analysis including the following:

1. Data obtained from the survey responses is reviewed, including an analysis of the:
   - smallest and largest performance gap items
   - results of the composite averages of survey items related to the eight Presidential Award for Quality categories
   - results of survey data for items showing ratings of institutional programs, services, and activities
   - staff ratings for items 81 (employee satisfaction) and 94 (impression of quality)
   - employee comments and suggestions

2. The overall composite data is compared to averages from all other institutions in the Performance Horizons data bank. Note that there are further comparisons of data with national norms.

3. Data from 2015 is analyzed and compared with data from the previous six surveys (2002, 2004, 2006, 2008, 2010, and 2013).

While all three approaches yield valuable information, caution should be exercised when comparing data and results with other institutions and national norms. The size and types of institutions influence national norm data. Other factors include the number of overall responses, the number of survey participants in each employee group, and variables unique to individual institutions. The current number of survey respondents (83) reflects an 8% decrease from 2013 (90), yet represents the second highest number of respondents in seven surveys. The number of survey respondents has ranged from a low of 34 (2008) to a high of 90 (2013), with an average of 63 respondents. Refer also to the table on page 33.

Observations and Recommendations

1. Effective quality improvement is the result of a thousand things done a little bit better, not a few things done a lot better. The four key features of an improvement system are:

   - Measurable indicators of quality for all key processes
   - Focused improvement effort for each quality indicator
   - Continuous measurement, strong customer service, and process fine-tuning
   - Strong leadership that provides for and encourages total staff participation

We recommend that improvement initiatives currently employed be analyzed for conformance to the features of an effective improvement process listed above. Adjusting local improvement strategies accordingly should yield positive results. We emphasize that these recommendations should be viewed as highlights only. They by no means represent a complete list of areas in which action should be taken. There is no absolute performance gap value that indicates a firm cause for action. Indeed, every element of the report, no matter how large or small the performance gap, should be viewed as an opportunity for analysis, review, and improvement. Specific timetables and courses of action are best formulated based on local perspectives, goals, and priorities.

2. Individuals responsible for the five functional areas that received the highest overall ratings should be commended for their commitment to providing exemplary services to students and others who use their services. These services are listed on page 20 as well as in Chart 2-30, titled Employee Perceptions of Institutional Programs, Services, and Activities. The five highest rated programs, services, and activities are listed below in descending order of rating:

   - Bookstore services ***
   - Payroll services ******
   - Communicating with legislators and other politicians **
   - Continuing education and community programs and services **
   - Affirmative Action *

Note that the number of asterisks (*) indicates the number of surveys in which this functional area was among the five highest rated services. Bookstore services was also the highest rated service area in 2010. Payroll services has been among the five highest rated service areas in five previous surveys as well, representing the highest rated service area in 2008 and the second highest rated service area in 2010. Communicating with legislators and other politicians was also the third highest rated service area in 2013, while Continuing education and community programs and services represented the fourth highest rated service area in a previous survey. This is the first time Affirmative Action has been among the five highest rated service areas.

3. A review of the ten smallest performance gap items, as listed on Chart 2-1 and summarized on page 13, shows each of these items to have small performance gap values. Item 21, Administrators cultivate positive relationships with students, represents WSC's smallest performance gap for 2015 (0.610). Previous surveys show item 21 represented the eighth smallest performance gap in 2010 (0.776), the seventh smallest performance gap in 2002 (0.476), and the fourth smallest performance gap in 2004 (0.506). Performance gaps for item 21 have ranged from a low in 2002 (0.476) to a high in 2008. Nationally, item 21 is not among the ten smallest performance gap items (0.952).

WSC's second smallest performance gap (0.645) also addresses employee-student relationships: This institution promotes excellent employee-student relationships (item 11). Item 11 was among the ten smallest performance gaps in five of the previous surveys, representing the eighth smallest performance gap in 2002 (0.492) and 2008 (0.818), the fifth smallest performance gap in 2006 (0.490) and 2010 (0.771), and the smallest performance gap in 2004 (0.350). Nationally, item 11 is not among the ten smallest performance gap items (0.858).

As stated earlier, items with larger performance gap values generally represent areas needing attention. A review of Charts 2-11 to 2-13 shows 26 of the first 50 survey items (52%) with performance gap values at or above that level. This is below the national norm, which shows 28 of the first 50 survey items (56%) with performance gaps at or above that level. Refer also to the national norm charts.

4. The highest rated quality category based on the how it is now rating is Customer Focus (3.446). Previous surveys show Customer Focus was also the highest rated quality category in 2002, 2004, 2006, 2008, and 2010, and the third highest rated quality category in 2013. Nationally, Customer Focus is the highest rated quality category (3.414, and 3.499 for All Institutions) in the national norm data bank. These results are presented in the following table with the 2015 ratings printed in red above the 2013, 2010, 2008, 2006, 2004, and 2002 data. WSC's current overall rating decreased slightly from 2013, yet is above the national norm and slightly below the norm for All Institutions in the data bank.
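The performance gap arithmetic used throughout this section can be sketched in a few lines. This is a minimal illustration, assuming (consistent with the survey's two-column design) that an item's gap is its average "how it should be" rating minus its average "how it is now" rating; the item numbers match items discussed in this report, but the individual ratings below are invented, chosen only so the resulting gaps match the reported 2015 values.

```python
# Hypothetical sketch of the performance gap calculation. Ratings are made up;
# the Campus Quality Survey uses a five-point scale for both columns.

def performance_gap(should_be: float, is_now: float) -> float:
    """Gap = average 'how it should be' rating minus average 'how it is now' rating."""
    return round(should_be - is_now, 3)

# Invented average ratings for three items discussed in this section.
items = {
    "item 21": {"should_be": 4.2, "is_now": 3.59},
    "item 27": {"should_be": 4.5, "is_now": 2.5},
    "item 8":  {"should_be": 4.4, "is_now": 2.438},
}

gaps = {name: performance_gap(r["should_be"], r["is_now"]) for name, r in items.items()}

# Rank items from largest to smallest gap: larger gaps suggest areas where
# staff expectations most exceed perceived performance.
ranked = sorted(gaps, key=gaps.get, reverse=True)
```

Ranking items by gap in this way is what produces the "ten largest" and "ten smallest" performance gap lists referenced in the charts.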

[Table: Customer Focus quality category ratings, 2002-2015: WSC Overall Campus and All Institutions]

5. For the seventh survey in a row, the quality improvement category needing the greatest attention is Employee Training and Recognition. This is indicated by the largest performance gap overall (1.431) and in each employee group: Support/Classified (1.222), Faculty/Instructor (1.524), Department Chair (1.750), and Administrative/Professional (1.438). Nationally, Employee Training and Recognition is the largest performance gap category (1.286), as it is for Four-Year Institutions (1.348) and All Institutions (1.228) in the data bank. Performance gap values for the Employee Training and Recognition category are shown in the table below with 2015 results printed in red. WSC's current performance gap increased from 2013 and is above the national norm for All Institutions in the data bank.

[Table: Employee Training and Recognition performance gaps by year: WSC Performance Gap and All Institutions Performance Gap]

The specific survey items that make up the Employee Training and Recognition category are shown on the corresponding chart. Of the eight items listed in this category, three are among WSC's ten largest performance gap items for 2015 (items 8, 9, and 26).

Item 8 states: Processes for selecting, orienting, training, empowering and recognizing employees are carefully planned. Item 8 has been among WSC's four largest performance gap items since 2002 and currently represents the second largest performance gap for 2015 (1.962). Nationally, item 8 represents the third largest performance gap (1.496). The overall response history and national norms for item 8 are shown in the following table with the 2015 results printed in red above the 2013, 2010, 2008, 2006, 2004, and 2002 data.

Item 8: Processes for selecting, orienting, training, empowering and recognizing employees are carefully planned.

[Table: Item 8 response history, 2002-2015: WSC Now rating, WSC Gap, rank among the campus' ten largest gap items, and national norm Now rating and Gap & Rank]

Item 8 Summary: WSC's how it is now rating decreased while the performance gap increased from 2013 (indicating negative change). WSC's how it is now rating is below, while the performance gap is above, the national norm.

Item 9 relates specifically to employee training and states: Employees receive special training in improving customer service. Item 9 currently represents WSC's third largest performance gap (1.833) and was among the ten largest performance gap items in 2002, 2004, 2006, 2010, and 2013 as well. Nationally, item 9 represents the seventh largest performance gap (1.370). The table below presents item 9 results for WSC and national norms. The 2015 results are printed in red above the 2013, 2010, 2008, 2006, 2004, and 2002 data. Note that a hyphen (-) indicates item 9 was not among the ten largest performance gap items for that respective year.

Item 9: Employees receive special training in improving customer service.

[Table: Item 9 response history, 2002-2015: WSC Now rating, WSC Gap, rank among the campus' ten largest gap items, and national norm Now rating and Gap & Rank]

Item 9 Summary: WSC's how it is now rating decreased while the performance gap increased from 2013 (indicating negative change). The current how it is now rating represents the lowest result, and the performance gap the highest value, in seven surveys. The WSC how it is now rating is below, while the performance gap is above, the national norm.

Item 26 directly addresses employee rewards and states: Employees are rewarded for outstanding job performance. Item 26 represents Williston State College's seventh largest performance gap for 2015 (1.618), was among the ten largest performance gap items in each of the previous surveys, and represented the largest performance gap in 2002 and 2004, among other surveys. Nationally, item 26 represents the second largest performance gap (1.757). The overall response history and national norms for item 26 are shown in the table below with the 2015 results printed in red above the 2013, 2010, 2008, 2006, 2004, and 2002 data.

Item 26: Employees are rewarded for outstanding job performance.

[Table: Item 26 response history, 2002-2015: WSC Now rating, WSC Gap, rank among the campus' ten largest gap items, and national norm Now rating and Gap & Rank]

Item 26 Summary: WSC's how it is now rating decreased while the performance gap increased from 2013 (indicating negative change). WSC's how it is now rating and performance gap are both below the national norm.

It is recommended that professional development opportunities and employee recognition practices continue to undergo periodic review to determine where improvement is needed. They represent areas of high importance to employees. Some suggested strategies include:

- Appoint a study team on each campus to review professional development and recognition programs for faculty, staff, and administration.
- Study the current professional development and recognition processes, and determine where they can be improved.
- Hold focus group sessions with personnel at all levels to get feedback about current professional development and recognition programs.
- Survey personnel at all levels to determine their perceived professional development needs.
- Analyze the data obtained, and design an action plan that details the goals and strategies for improvement, together with measurement criteria and a responsibility chart. See the sample Action Plan format in the Appendix.

6. Communication is a focus area that warrants continuing study. The largest performance gap of the survey was calculated for item 27, which states: There are effective lines of communication between departments (2.000). Item 27 has been among WSC's ten largest performance gap items in each of the previous surveys as well. Nationally, item 27 represents the largest performance gap (1.817).

Item 27: There are effective lines of communication between departments.

[Table: Item 27 response history, 2002-2015: WSC Now rating, WSC Gap, rank among the campus' ten largest gap items, and national norm Now rating and Gap & Rank]

Item 27 Summary: WSC's how it is now rating decreased while the performance gap increased from 2013 (indicating negative change). WSC's how it is now rating is below, and the performance gap is above, the national norm.

Also addressing communication is item 45: Written procedures clearly define who is responsible for each operation and service. Item 45 represents the fourth largest performance gap for Williston State College (1.831) and has been among the ten largest performance gap items in 2002, 2004, 2010, and 2013 as well. Nationally, item 45 represents the eighth largest performance gap (1.348). The overall campus ratings along with national norms for item 45 are presented in the following table. The 2015 results are shown in red above the 2013, 2010, 2008, 2006, 2004, and 2002 data. Note that a hyphen (-) indicates item 45 was not one of the ten largest performance gap items for that respective year.

Item 45: Written procedures clearly define who is responsible for each operation and service.

[Table: Item 45 response history, 2002-2015: WSC Now rating, WSC Gap, rank among the campus' ten largest gap items, and national norm Now rating and Gap & Rank]

Item 45 Summary: WSC's how it is now rating decreased while the performance gap increased from 2013 (indicating negative change). WSC's how it is now rating is below, and the performance gap is above, the national norm.

In the Assessment of Our Programs, Services, and Activities section of the survey (Chart 2-30), employees were asked to rate communication with other departments using a five-point scale. Communication with other departments represents WSC's third lowest rated service area for 2015 (2.827), and has been among the five lowest rated service areas in 2002, 2004, 2008, and 2010 as well. The overall campus ratings for item 59 are shown in the table below with the 2015 result shown in red above the 2013, 2010, 2008, 2006, 2004, and 2002 data. Note that the current overall rating decreased from 2013 and falls in the Fair, Much Improvement Needed range.

Item 59: Communication with other departments

[Table: Overall Campus ratings for item 59 by year, 2002-2015]

Rating scale:
1 = Poor and Inadequate
2 = Fair, Much Improvement Needed
3 = Good, Still Needs Improvement
4 = Very Good and is Continually Improving
5 = Excellent as it is Now

Communication impacts all operating systems of an organization. It also plays an important role in the overall atmosphere and staff morale. Thus, communication must be considered among the highest priority areas of a campus' quality improvement process, and continued improvement efforts are encouraged. It is recommended that a cross-functional team be appointed to study the processes related to communication between departments and campuses. The following techniques may be effective:

- Review all the charts, tables, and data in this report, with special emphasis on items related to communication.
- Conduct interviews and focus group sessions among personnel at all levels.
- Survey employees at all levels on specific aspects of communication.
- Analyze the data obtained, and design an Action Plan that details goals and strategies for improvement, measurement criteria, responsibilities, and timelines. See the sample Action Plan format in the Appendix.

The action plan should include the development of written procedures for all cross-functional operating processes. The written procedures should help support more systematic interdepartmental communication.

7. It is recommended that Section 4: Comparison with Previous Survey Results be carefully reviewed. This section allows for easy identification of changes from previous surveys. Refer also to Charts 4-1 to 4-8. The following changes from 2013 to 2015 have been noted:

- How It Should Be ratings increased in six quality improvement categories and decreased in Top Management Leadership and Support and Customer Focus.
- How It Is Now ratings decreased in all eight quality improvement categories.
- Performance Gaps increased in all eight quality improvement categories.

8. The Level of Satisfaction with Employment ratings show an increase in the combined percentage of satisfied and very satisfied responses from 68% to 72%, representing the highest combined percentage since 2006 (77%). Neutral responses decreased slightly from 18% to 17%, while the combined percentage of somewhat dissatisfied and not satisfied at all responses decreased from 15% to 11%, representing the lowest combined percentage since 2004 (5%). The table below presents these results with the 2015 data printed in red.

Item 81: Level of Employee Satisfaction

Year   Combined Satisfied and Very Satisfied   Neutral   Combined Somewhat Dissatisfied and Not Satisfied At All
2015   72%                                     17%       11%
2013   68%                                     18%       15%
2010   70%                                     18%       12%
2008   63%                                     19%       18%
2006   77%                                     10%       12%
2004   83%                                     12%        5%
2002   88%                                      6%        6%

9. The Overall Impression of Quality results show a slight decrease in the combined percentage of excellent and good responses from 70% to 69%. Average responses decreased from 23% to 19%, representing the lowest percentage since 2006 (18%), while the combined percentage of below average and inadequate responses increased from 7% to 11%. These results are shown in the following table with the 2015 data printed in red.

Item 94: Overall Impression of Quality

Year   Combined Excellent and Good   Average   Combined Below Average and Inadequate
2015   69%                           19%       11%
2013   70%                           23%        7%
2010   62%                           27%       12%
2008   50%                           32%       18%
2006   80%                           18%        2%
2004   84%                           13%        3%
2002   90%                           10%        0%
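The combined percentages reported for items 81 and 94 are simple sums of response-category percentages. As a minimal sketch, assuming hypothetical raw counts for the 83 respondents (the counts below are invented, chosen only so the rounded results match the reported 2015 values for item 81):

```python
# Hypothetical sketch of how combined category percentages are derived from
# raw response counts on a five-point scale. Counts are invented.

from collections import Counter

responses = Counter({
    "very satisfied": 30,
    "satisfied": 30,
    "neutral": 14,
    "somewhat dissatisfied": 6,
    "not satisfied at all": 3,
})

total = sum(responses.values())  # 83 respondents in this made-up example

def combined_pct(categories):
    """Percentage of all responses falling in the given categories, rounded to a whole percent."""
    return round(100 * sum(responses[c] for c in categories) / total)

satisfied = combined_pct(["very satisfied", "satisfied"])
dissatisfied = combined_pct(["somewhat dissatisfied", "not satisfied at all"])
```

Because each category is rounded independently, the three reported percentages for a year may sum to slightly more or less than 100%, as seen in the tables above.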

10. As acknowledged earlier, a number of factors should be considered when comparing campus results with national norms. Nonetheless, the comparison is a logical starting point. Charts 3-11 and 3-12 show how it is now ratings are lower in seven quality categories, and higher in Strategic Quality Planning, than the average ratings of All Institutions in the data bank. When comparing Williston State College how it is now ratings with the national norm, WSC ratings are higher in four and lower in four quality categories. These results are presented in the table below with the 2015 ratings printed in red above the 2013, 2010, 2008, 2006, 2004, and 2002 data. Results are sorted in descending order of 2015 rating.

[Table: How it is now ratings by quality category (Customer Focus; Top Management Leadership and Support; Employee Empowerment and Teamwork; Strategic Quality Planning; Measurement and Analysis; Quality/Productivity Improvement Results; Quality Assurance; Employee Training and Recognition) for Williston State College, All Institutions, and the National Norm]

11. It is recommended that the lowest rated service areas, as shown on Chart 2-30 and summarized on page 21, be examined and improvement efforts concentrated as needed. The five service areas rated lowest by the overall campus and two or more employee groups are listed below in ascending order of rating:

- Health and nursing services *******
- Budget planning and coordination **
- Communication with other departments *****
- Career information and planning services *******
- Recruitment and orientation of new employees *****

Note that the number of asterisks (*) represents the number of surveys in which this functional area was among the five lowest rated services. Health and nursing services and Career information and planning services have been among the five lowest rated service areas in each of the seven surveys. Communication with other departments and Recruitment and orientation of new employees have been among the lowest rated services in four previous surveys as well, while Budget planning and coordination was also the second lowest rated service area in a previous survey. It is noted that each of these service areas has a rating in the Fair, Much Improvement Needed range. It is recommended that administration study the service areas that have repeatedly received low ratings and develop an action plan for improvement.

12. It is recommended that administration carefully review the Employee Comments and Suggestions section.

13. It is recommended that this survey be repeated. This will identify progress in closing the performance gaps as a result of the continuous quality improvement initiatives implemented.

Some Thoughts on the Validity of the Campus Quality Survey

Groups and individuals completing the Campus Quality Survey sometimes inquire about the validity of survey results. These results are communicated through the Campus Quality Survey Report, based on the data gathered from individual survey participant responses. Concern about validity is appropriate and should be discussed.

Generally, validity may be defined as the accuracy and reliability with which a survey confirms known facts or predicts outcomes that can be confirmed through other measurements or observation. The Campus Quality Survey seeks to quantify subjective perceptions of various aspects of an organization's functions and circumstances as experienced by the individuals working within it. The Survey does not address objective facts. With this in mind, the validity of the Survey can be addressed in two ways.

HOW VALID ARE SURVEY QUESTIONS?

Individual survey questions may be considered valid because they address issues that staff members in educational institutions have indicated to be significant in determining levels of satisfaction within their job environments. This satisfaction is important, as it is generally related to functional effectiveness and to personal and professional fulfillment. The questions have been tested in hundreds of surveys involving many thousands of individual respondents. Survey customers report a strong correlation between positive survey scores and a happy work environment, while negative survey results generally indicate a stressed work environment.

Individuals answer survey questions in a variety of ways depending on personal perspective. For example, some survey respondents feel satisfied about a particular aspect of an organization's operation, while others may feel entirely differently about the same observable phenomena or their experiences with them. Thus, the survey documents subjective information that becomes significant through the reporting of individual responses as aggregate arithmetic and statistical values. These values are meant to be analyzed and considered "good" indicators, or indicators that suggest remedial action, largely based on the established standards of the institution.

HOW VALID IS THE SURVEY REPORT?

The Survey Report is constructed thoughtfully and reports summary data based on individual survey participants' responses. The data is derived using valid statistical analysis tools and processes.

The Report clearly presents data that has been deemed useful by Survey customers over a period of more than ten years. Though each of the measures involved in the Campus Quality Survey deals with subjective judgment, each addresses an issue that can significantly impact the overall success of an organization, where the hearts as well as the minds of individuals must be engaged toward fulfilling the mission of the enterprise. Survey data provides a means to identify areas of strength as well as areas that call for remedial action. Thus, the Survey offers an important and economical way to engage the individuals working in the organization in building a more effective, functional, and satisfying place to work.

Some Thoughts on the Response Rates of Campus Quality Survey Participant Groups

How willingly and seriously participants in a survey actually participate has been a source of angst for many a survey designer and administrator. One way to gain insight regarding the engagement of survey participants is to analyze the survey response rate. To clarify what we mean: the Campus Quality Survey response rate is the number of completed surveys a campus receives back divided by the number of surveys it distributes, expressed as a percentage.

It may then be asked, what is the minimum response rate that will render a survey reliable? If it were only that easy. There are various ways of looking at this issue. On one extreme, we could simply say that if less than X% of the potential respondents participate, then the survey cannot be considered a legitimate indicator of anything. On the other extreme, we could say that any response can be the basis of useful, even valuable, information. The answer most would find acceptable is somewhere in between, and is the product of factors that often cannot be expressed by simple numbers. Let's reflect on this.

Range of Response Rates and Possible Reasons for Variation

In reviewing records of actual Campus Quality Surveys completed by various campuses and multi-campus schools, we find that participant returns range from under 10% to 100%. In general, participation tends to be highest at smaller campuses, though the real reasons are surely more complex than campus size. The wide range of returns among survey participants raises the question of why there is such a high discrepancy. Reasons can range from objective variables in the process used to engage staff in the survey's purpose and mechanics, to nonobjective circumstances such as perceived issues of staff morale and general willingness to cooperate with administrative leadership. We mention this as an important qualifier, as poor participation should give rise to immediate efforts to understand and mitigate the impact of these underlying factors.
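The response-rate arithmetic described above can be sketched in a few lines. This is a minimal illustration: the counts of 83 returned and 130 distributed surveys are invented for the example, and the 50% threshold reflects the planning target suggested elsewhere in this section.

```python
# Minimal sketch of the survey response-rate calculation: surveys returned
# divided by surveys distributed, expressed as a percentage.
# The counts below are hypothetical, not taken from the report.

def response_rate(returned: int, distributed: int) -> float:
    """Return the response rate as a percentage, rounded to one decimal place."""
    if distributed <= 0:
        raise ValueError("distributed must be positive")
    if returned > distributed:
        raise ValueError("cannot receive back more surveys than were distributed")
    return round(100 * returned / distributed, 1)

rate = response_rate(returned=83, distributed=130)
meets_target = rate >= 50.0  # 50% is the planning target discussed in this section
```

A campus that distributed 130 surveys and received 83 back would report a rate of 63.8%, comfortably above the 50% target.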

22 Valid Sample Size and Acceptable Return Ratio We know that an extremely small survey sample size can generate legitimate and reliable indicators if the sample can be ensured to be consistent with the target group as a whole, i.e. be a 'typical' sample. Much larger samples can be less-thanadequate if the sample cannot be assured to represent a typical cross section of the entire survey populace. Of course, sample size is not the same as participation ratio, and low ratios can be the result of many factors as touched on above. We encourage schools to be thoughtful in planning and thorough in implementing their survey process so that a 50% or greater response level is achieved. While a much lower participant level can reveal meaningful opportunities for the improvement of campus climate and services, higher participation generates greater confidence. This confidence extends both to the perception of survey validity and the ability to engage staff in improvement initiatives. The latter, after all, is the essence of the Survey's purpose. Striving for Higher Response Rates Some survey managers consider surveys to be less than acceptably reliable if the participant rate is below 65%. In any circumstance, is meaningful to contemplate how we can boost the level of participation. There are several ways to do this. Here are some worth mentioning. 1. Explain the benefits to employees. Outlining clearly how employees and the organization as a whole will benefit from the survey will help motivate participation. Communicate with staff prior to the survey administration. Senior management should explain the importance and provide opportunity for participation. 2. Secure the endorsement of senior management. Participants need to know that the process is supported at the highest level, and there is a strong commitment to make use of the survey's results. 3. Select the forum for survey administration thoughtfully, and ensure that the implementation process is thorough. 
The best return rates are achieved when survey instruments are distributed and collected during a planned assembly, such as a staff in-service event. During the in-person meeting, specifics regarding survey completion should be explained and questions answered before participants begin the survey.

Williston State College

Specifically, explain that:

- the purpose of the survey is to gather information and receive input from all employees about the organizational climate and the quality processes of this organization.
- the survey is being done to encourage and facilitate continuous improvement at the institution.
- the intent is not to compare different departments or work units, or to point fingers at any campus, department, or person.
- the information will be used by the administration and the committees involved in the continuous improvement process to make decisions about future goals and priorities.
- no individual or department will be identified in any way. The only distinctions made when reporting the results will be by personnel category (i.e. support/classified staff, faculty/instructor, department chair, or administrative/professional staff) and by employment status (full-time regular, full-time temporary, part-time regular, or part-time temporary). If there are multiple campuses, each campus will receive a report of the results compiled from the ratings at that campus.
- the instrument serial numbers are used only for ease of order fulfillment and tracking of survey instruments by campus. Individuals are not identified by instrument number.
- respondents need to record (fill in) one answer in both the left and the right columns. For every one of the fifty (50) items (or 60 items if the customized section is utilized), each person must complete two answers: one in the left column and one in the right column. Responses should be based on the individual's perception of the organization as it is now and as they think it should be.

4. Communicate "next steps." Let employees know what will be done with the information gathered through the survey. Participants need to know that their time is not being wasted.

5.
Keep promises. It should be clear that a low return from which meaningful improvement ideas are gleaned is more useful than a 100% return that is shelved with no follow-through.

Earn the respect and participation of staff members by saying what you mean and doing what you say. As in all areas of human endeavor, the quality of planning and implementation will be reflected in the quality of participation. Ensure a high-quality outcome by following the above guidelines as you implement the Campus Quality Survey at your institution.
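As an illustration of the sample-size and response-rate discussion above, the claim that "higher participation generates greater confidence" can be sketched numerically with standard survey statistics. The sketch below is not part of the Campus Quality Survey methodology; the campus size of 300 staff is a hypothetical figure, and the margin-of-error formula is the conventional one for a proportion with a finite-population correction.

```python
import math

def response_rate(returned, distributed):
    """Share of distributed survey instruments that were completed."""
    return returned / distributed

def margin_of_error(n, population, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion, using the
    conservative p = 0.5 and a finite-population correction."""
    se = math.sqrt(p * (1 - p) / n)
    fpc = math.sqrt((population - n) / (population - 1))
    return z * se * fpc

# Hypothetical campus of 300 staff at three participation levels
pop = 300
for returned in (90, 150, 195):  # 30%, 50%, 65% response
    rate = response_rate(returned, pop)
    moe = margin_of_error(returned, pop)
    print(f"{rate:.0%} response -> roughly ±{moe:.1%} margin of error")
```

Under these assumptions, moving from a 30% to a 65% response rate roughly halves the uncertainty around any reported proportion, which is one concrete way to see why the 50%-or-greater target is worth the planning effort.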