Performance Indicators in Innovation Business: A Case Study


Dr Fang Zhao, School of Management, Business Faculty, RMIT University, GPO Box 2476V, Melbourne 3001, Vic., Australia, fang.zhao@rmit.edu.au
Professor John Dalrymple, The Centre for Management Quality Research, Business Faculty, RMIT University, P.O. Box 71, Bundoora 3083, Victoria, Australia, john.dalrymple@rmit.edu.au

Keywords: Performance Indicators, Innovation, Co-operative Research Centres

1.0 Introduction

The Co-operative Research Centre (CRC) Program is a key component of the Australian Government's innovation strategy and plays an important role in the Australian innovation system by realising the value of science and technology. CRCs serve as a bridging mechanism linking the providers of innovative research and the users of new knowledge. In this regard, the CRC Program addresses some of the main problems perceived in the Australian innovation system, including disincentives to collaboration between research providers and Australian businesses; weak links between research organizations and users; low intensities of research and development (R & D) and research application; a lack of mobility of personnel between government research agencies, academia and industry; and a lack of strong links to leading international R & D organizations (Mercer & Stocker, 1998). The CRC structure is widely recognised as the flagship of the Australian research environment, in which business enterprises, academic institutions, research technology organizations and other partners collaborate on the types of research projects and programs that are crucial to the development of a competitive Australian economy. However, as with any joint venture in which organizations with different missions collaborate, there is considerable complexity involved in managing and judging the performance of such ventures.
This study indicates that the performance indicators currently in wide use by CRCs do not sufficiently address this complexity. The paper aims to explore and determine the effectiveness of performance measures for CRC innovation business. The research questions addressed in this paper are: What is different about CRCs? To what extent do the currently used Performance Indicators (PIs) measure the dimensions and complexity of collaboration between CRC participants?

This study follows a case study design, taking CRCs as the unit of study and analysis, and draws on data collected from the latest Annual Reports of CRCs and a recent on-line survey of CRC managers. Both statistical analysis and content analysis were used to achieve the aim of the paper and answer the above research questions. The study is important because it contributes to a deeper understanding of the complexity of performance measures for innovation business and should inform policy development to enhance the competitiveness of innovation organizations in the marketplace.

2.0 CRCs and Innovation Business

In Backing Australia's Ability - An Innovation Action Plan for the Future, launched by the Australian Government in early 2001, innovation was regarded as the key to Australia's international competitiveness, economic prosperity and social wellbeing. In the Action Plan, the Government announced that it would provide an additional 2.9 billion Australian dollars for innovation programs, including the CRC Program (ISR, 2001). The CRC Program is thus a key component of the Australian Government's innovation strategy.

Key Features of CRCs

The CRC Program was established by the Australian Commonwealth Government in 1990 to encourage the delivery of innovation and provide greater returns to Australia through greater uptake of more focused R & D efforts. With the inclusion of the new Centres announced in the latest round, there are 64 CRCs currently operating across Australia. All Centres fit within six targeted industry sectors - manufacturing technology, information and communications technology, mining and energy, agricultural and rural-based manufacturing, environment, and medical science and technology (ISR, 2001).
CRCs are established through a Centre Agreement, a contract among core participants from various organizations, and a Commonwealth Agreement, a contract between the participants and the Australian Commonwealth Government. Inter-organizational collaboration among participants, including research providers and research users, is fundamental to the operation and success of CRCs. Of the 64 CRCs operating, the majority are unincorporated joint ventures and only a few are incorporated companies. Those that have been incorporated are companies limited by guarantee and without share capital. The management structure of a CRC is similar to that of a company: it is governed by a Board with an independent chair and led by a Director reporting to the Board. Most CRCs have advisory committees with oversight of different aspects of CRC activities - research, education and training, interaction with users, commercialisation or administration. Some CRCs have their own incorporated companies that provide the Centre with administrative, commercial, financial and legal operations. It should be noted that there has recently been a remarkable change in the preferred organisational structure for CRCs: new CRCs are strongly encouraged to adopt an incorporated structure for their centre's operation. Most CRCs operate at more than one site, covering about 40 locations across Australia. CRCs are thus more like virtual organizations or networks than traditional organizations with vertical reporting relations (ISR, 2001). Government financial support of around $2.45 million per CRC per annum on average is granted to each CRC for a tenure of seven years. The additional Government funding of $227 million over the next five years, announced earlier in 2001, will enable larger grants for each Centre and an increase in the number of CRCs (ISR, 2001). The Commonwealth Government has invested $1.5 billion in the CRC Program over the past 10 years. Another $4 billion has been invested by industry, other government agencies, and research and educational organizations (CRCA, 2001). The CRC Program is claimed to have reduced impediments to interaction between public sector research organizations (mostly universities) and industry and other research users (Mercer and Stocker, 1998). The CRC Program has established the most important formal structure and mechanism for collaborative R & D and research training in Australia. The program provides innovation business with a unique opportunity to work in partnership with research institutions and to commercialise innovation through entrepreneurship.

Innovation Business

As shown above, CRCs are a unique entity in Australia, different from other government research agencies or publicly funded educational institutions. Moreover, the principal feature of CRCs is that they are built upon, and develop, innovation business/projects. Innovation is an important criterion upon which the selection of CRC applicants and the evaluation and awarding of individual CRCs are based. In the Award for Outstanding Achievement in Collaborative R & D, chaired by the Business/Higher Education Round Table in Australia, innovation is one of the key criteria for the Award.
Innovation, in the case of the Award, means new products or services; an innovative concept or idea; design, delivery and context; new barriers surmounted; and new challenges identified. In the Awards for Commercialisation and Utilisation of Research presented by the CRC Association in 2001, a clearly demonstrated innovative outcome is one of the main eligibility criteria (ISR, 2001). Some examples of CRC achievements in innovation business are given below:
1. The Australian Photonics CRC is taking a lead in providing much-needed bandwidth for communications. It has attracted in excess of AU$50 million in international and Australian investment in the past year and is expected to contribute more than 18,000 jobs with a turnover of about $2.4 billion in the next decade alone (CRCA, 2001; ISR, 2001).
2. The CRC for Sensor Signal and Information Processing has developed advanced high-speed subscriber communications equipment, which is successfully competing with major multi-national companies in the expanding field of access networks.
3. The CRC for Cochlear Implant and Hearing Aid Innovation has recently established a spin-off company to commercialise innovative microphone technology with applications in hearing aids, stage microphones and other communications fields (CRCA, 2001).
It is quite clear that the CRC Program is designed to encourage innovation, high quality research and research training, and to promote the application and commercialisation of research outcomes through collaboration between research providers and users. However, to succeed in innovation and achieve high quality collaborative R & D, CRC managers require effective management and entrepreneurial skills and capacities to undertake and develop innovative R & D and to enhance quality through collaboration among research organizations and between research providers and users.

3.0 CRC Evaluation Processes and Performance Indicators

The CRC Program is said to have extensive monitoring and evaluation processes (Industry Commission, 1995). The processes start with the selection of CRC applicants and proceed throughout the entire life of a CRC's operation. There have so far been seven selection rounds for CRCs since 1991, when the CRC Program was initiated. Existing CRCs completing their fifth year can submit proposals and seek funding for renewal in the next selection round. Standard Selection and Performance Criteria were established, against which Expert Panels and external referees are required to evaluate the proposals and prepare detailed written assessments. The Selection and Performance Criteria against which different aspects of CRCs are evaluated concentrate on five areas:
1. Cooperative arrangements
2. Research and researchers
3. Education and training
4. Application of research
5. Management and budget (DIST, 1998).
In addition, AusIndustry, which manages the CRC Program, carries out an annual Management Data Questionnaire survey (AusIndustry, 2001). The formal ongoing evaluation and review processes for CRCs comprise three components: a comprehensive annual report, including an audited financial statement; the appointment of a Visitor to each CRC; and formal reviews in the latter half of the first, second and fifth years of a CRC's operations. The Commonwealth Government has proposed a number of guidelines to ensure that CRCs are accountable to the Commonwealth for meeting CRC Program objectives and for the use of public funds contributed to each Centre. These guidelines include: Guidelines for (CRC) Applicants, CRC Visitor Guidelines, Annual Report Guidelines, Second Year Review Guidelines, and Fifth Year Review Guidelines. Accompanying each of these guidelines is a prescribed reporting format called the report pro forma. The report pro forma sets out general headings and the issues to be addressed under each heading. On the positive side, these guidelines and report pro formas help to standardise and streamline evaluation and review procedures and the overall management of CRCs, and facilitate the CRC Committee's collection of relevant performance data from each Centre. Most importantly, the guidelines were proposed to help Centres achieve CRC Program objectives in the five targeted areas: research activities, education and training activities, linkages, cooperation and management. On the negative side, however, the guidelines and the reporting pro forma may have limited the vision of CRC managers and the scope of CRC evaluation. A study of recent CRC annual reports found that most performance indicators used by CRCs to report against the evaluation criteria are, to a large extent, similar in content but different in wording and scope. The following list summarises the most commonly used performance indicators in CRC Annual Reports (CRCs, 2000).

A List of Performance Indicators Used by Most CRCs
Cooperative Arrangements
- Proportion of projects/programs involving more than one participant
- Number of personnel contributing from each participant
- Industry contributions as a proportion of total funding
- Number of joint projects/programs sharing major facilities
- Number of projects/programs involving other CRCs and international collaboration
Research and Researchers
- Number of publications (papers, presentations, [provisional] patents, etc.)
- Number and amount of external funding and awards
- Number of projects in progress
Education and Training
- Number of higher degree students enrolled and/or completed (theses submitted)
- Number of participant (non-university) staff contributing to research training and/or teaching
- Number of courses developed and introduced, and conferences/symposia/seminars held
Application of Research
- Number of patents, licenses and royalties applied for and/or granted/received
- Number of processes and/or projects commercialised
- Number of consultancies or earnings from consultation
- Number of new participants and/or associate members
- Number of promotional articles/research publications on research results and products for users
Management and Budget
- Proportion of research projects completed or milestones reached (in the planned time and within specified budgets)
- Total staff (full-time equivalents) including new appointments (CRC-funded)
- Efficiency and effectiveness of reporting systems, including the financial reporting system
- Number/frequency of internal reviews of activities/projects and strategies

These Performance Indicators demonstrate that the measurement of a CRC's success is mostly based on technology transfer, as this is regarded as a central theme of CRCs' agreed outcomes (DIST, 1998). In many CRC annual reports, achievements in technology transfer are reported in terms of the number and/or dollar value of projects and processes that have been commercialised, and the number of patents/licenses and royalties that have been obtained by CRCs, as shown in the Performance Indicator list (Zhao, 2000).

4.0 The On-line CRC Survey Findings and Discussion

The authors of this paper conducted a nation-wide survey of the 64 currently operating CRCs. The questionnaire for the survey was developed on the basis of a review of the recent annual reports of all 64 CRCs in Australia and the existing literature on CRCs. An electronic questionnaire was employed to maximise the response rate; respondents were able to reply either online or by e-mail. Given the scattered geographical locations of CRCs, and the expected high level of Internet literacy and access among the target population, an electronic survey was selected. The survey sought to explore the views of participants in these joint ventures about current and potential approaches to performance measurement in the CRC system. The survey collected quantitative and qualitative data on:
- the extent of use, importance and applicability of the current Performance Indicators for CRCs, and
- crucial factors affecting collaboration/partnering between CRC participating members.
The survey consisted of three parts:
Part 1: Performance Indicators for Collaboration
Part 2: Coordinating Inter-organisational Collaboration
Part 3: Development of Performance Indicators
The survey aimed to help innovation organizations taking the form of alliances, joint ventures and the like to achieve best practice in collaboration on innovation projects. The focus of the survey was performance indicators for evaluating collaborative work between CRC participants. The results of the survey may be used as a basis for developing a performance index to measure inter-organisational collaboration in innovative R & D.
The performance index may provide a means of benchmarking by which collaborative project managers and personnel can identify best practice in the industry and set goals to emulate it (these are the objectives of the next stage of this research on CRC performance measures). A total of 31 responses were collected from the survey. The following tables summarise the key findings relevant to this study. Note: compared with the list of most-used PIs in Section 3 of this paper, the PIs in our survey are more comprehensive in scope, particularly addressing the key issues of CRC inter-organisational collaboration, such as strategic alliances, open communication, mutual trust, sharing and interchange. Moreover, the PIs also address the management process and output of innovation projects (see PIs 15 to 21) and the output (see PI 25) of industry-university co-supervision of postgraduate students, which is innovative in nature.
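The summary statistics reported for the Likert-scale responses (minimum, maximum, mean and standard deviation per indicator) can be sketched as follows. This is a minimal illustration only: the response data below are hypothetical, not the study's actual survey returns, and the indicator names are reused purely as labels.

```python
# Sketch: summarising 5-point Likert responses per performance indicator
# (5 = Very important ... 1 = Not important). Data are hypothetical.
from statistics import mean, stdev

# Hypothetical responses for two of the surveyed PIs (N was 31 in the study).
responses = {
    "commitment to strategic alliances": [5, 4, 4, 5, 3, 4, 5],
    "projects completed within budget":  [3, 2, 4, 3, 2, 3, 4],
}

for pi, scores in responses.items():
    # Report the same four columns used in the paper's Table 1.
    print(f"{pi}: min={min(scores)} max={max(scores)} "
          f"mean={mean(scores):.2f} sd={stdev(scores):.2f}")
```

A minimum of 1 or 2 in such a table immediately flags that at least one respondent rated the indicator "not important" or "less important", which is how the gaps discussed below were identified.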

Respondents were asked to rank the importance of each of the PIs shown above, with a score of 5 = Very important; 4 = Quite important; 3 = Important; 2 = Less important; and 1 = Not important. The responses are presented in Table 1.

Table 1: Perceptions on Importance of the Performance Indicators (N = 31)
(Columns reported for each indicator: Minimum; Maximum; Mean; Std. Deviation)
1. commitment to strategic alliances
2. effectiveness of communication network
3. openness/honesty in communication
4. actual trust between participants
5. sharing/exchanging relevant info/data
6. sharing physical infrastructure
7. collaborative research/training program
8. joint publications
9. personnel interchanged
10. cash contribution
11. in-kind contribution
12. human resource contribution
13. personal commitment of participants
14. adherence to pre-agreed objectives
15. projects that met all defined milestones
16. projects with a signed agreement
17. projects with a full business case analysis
18. projects completed within budget
19. projects that realize market forecasts
20. projects which realize ROI
21. projects which fall within environmental impact forecasts
22. success in transferring research products
23. postgraduates co-supervised by ind/edu
24. theses completed within time
25. graduates employed within 3 months

Overall, the responding CRC managers gave a high ranking of importance to 22 of the 25 PIs surveyed (means between 3.0 and 4.7). However, for 14 of the 25 PIs, negative responses ("less important" and "not important") were given (see the Minimum column in Table 1). That is, a considerable proportion of the PIs were seen as less important or not important by some of the CRC managers surveyed. Responses showed variation in most cases (see the Std. Deviation column). PIs 17, 19 and 21 received means below 3. These three indicators are measures of an organization's project management, which involves comprehensive and strategic planning to achieve all-round corporate success and sustainability. CRCs are claimed to be undertaking innovative R & D projects.
Project management is vital to CRC operation, yet many of the CRC managers who responded to the survey did not regard these indicators as important. The results may indicate a lack of entrepreneurial vision among some CRC managers in managing innovation projects. Note: in the survey, respondents were also asked whether the PIs were applicable to their CRCs and whether the PIs were currently in use in their CRCs. The responses are presented in Table 2.

Table 2: Comparison of Importance, Applicability and Use of the Performance Indicators (N = 31)
(Columns reported for each indicator: Importance ranking (mean); Applicable to my CRC (frequency %); In use in my CRC (frequency %))
1. commitment to strategic alliances
2. effectiveness of communication network
3. openness/honesty in communication
4. actual trust between participants
5. sharing/exchanging relevant info/data
6. sharing physical infrastructure
7. collaborative research/training program
8. joint publications
9. personnel interchanged
10. cash contribution
11. in-kind contribution
12. human resource contribution
13. personal commitment of participants
14. adherence to pre-agreed objectives
15. projects that met all defined milestones
16. projects with a signed agreement
17. projects with a full business case analysis
18. projects completed within budget
19. projects that realize market forecasts
20. projects which realize ROI
21. projects which fall within environmental impact forecasts
22. success in transferring research products
23. postgraduates co-supervised by ind/edu
24. theses completed within time
25. graduates employed within 3 months

Table 2 provides three sets of comparable data which clearly show the gaps between CRC managers' perceptions of the importance and applicability of the PIs and the frequency with which they are used in CRCs (see responses to PIs 3, 4, 9, 13, 16, 17, 19, 20 and 25). As with the responses on importance, respondents regarded most of the PIs as applicable to their CRCs. However, when it came to the use of the PIs in their CRCs, the overall frequencies were significantly lower than those for applicability. Over 90 percent of respondents thought that the indicators "openness and honesty in communication between participants" and "actual trust between participants" were applicable to their CRC, but less than 30 percent of respondents said that their CRCs used these PIs (see Table 2). The primary reason expressed in respondents' comments is that these factors are hard to measure, although they are vital to CRC inter-organisational collaboration.

A further analysis of the correlation between perceptions of importance and applicability, and between applicability and frequency of use of the PIs in CRCs, was made using Spearman's rho. The results are shown in Table 3 (to save space and highlight the discrepancy, only those cases that demonstrate a significant discrepancy in responses are presented).

Table 3: A Correlation Analysis
(Columns reported for each indicator: Importance & Applicability (correlation coefficient); Applicability & In Use (correlation coefficient))
3. openness/honesty in communication
4. actual trust between participants
13. personal commitment of participants
16. projects with a signed agreement
17. projects with a full business case analysis
19. projects that realize market forecasts
21. projects which fall within environmental impact forecasts
25. graduates employed within 3 months

It is apparent that there is little correlation between the applicability and the frequency of use of the PIs surveyed (most correlation coefficients are below 0.3), but that a strong correlation exists between managers' perceptions of the importance and the applicability of the PIs (most rhos are above 0.55), as shown in Table 3.

Table 4: Perceived Barriers to CRC Operation
Leadership:
a. Board members often focus on their own organizations' interests rather than the CRC as a whole
b. Failure of project managers to get the best from staff
c. Incompatible systems
People Management:
a. Lack of ability to reward high fliers
b. Low motivation to contribute
c. Staff turnover and mobility
d. Staff lack of commitment
Communication:
a. Communication over large geographic distances
b. Communication with a dispersed end-user community
Culture:
a. Entrenched culture
b. Differing cultures between researchers and users
c. Different cultures in the research organizations
d. Adaptation of university researchers to a project/milestone-driven environment
e. Learning to work in the new CRC environment
Misconceptions:
a. Core partners viewing CRCs simply as an external funding body needing to achieve a fixed cash return on investment
b. Individual researchers unable to see the bigger picture beyond their own immediate interests
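The Spearman rank correlations used in the Table 3 analysis can be reproduced with a short routine like the one below: rank both variables (averaging tied ranks), then take the Pearson correlation of the ranks. This is a sketch only; the importance and applicability scores shown are hypothetical, since the paper's raw survey responses are not reproduced.

```python
# Sketch: Spearman's rho as the Pearson correlation of average ranks.
# The paired scores below are hypothetical illustration data.

def rank(values):
    """Assign 1-based ranks, averaging ranks over ties."""
    sorted_vals = sorted(values)
    ranks = []
    for v in values:
        first = sorted_vals.index(v)   # first position of v (0-based)
        count = sorted_vals.count(v)   # number of tied values
        # average of the tied 1-based rank positions first+1 .. first+count
        ranks.append(first + (count + 1) / 2)
    return ranks

def spearman_rho(x, y):
    """Pearson correlation computed on the rank-transformed data."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# e.g. importance vs applicability scores for one indicator (hypothetical)
importance    = [5, 4, 4, 3, 5, 2, 4]
applicability = [5, 5, 4, 3, 4, 2, 4]
print(round(spearman_rho(importance, applicability), 3))
```

A rho near 1 here would mirror the strong importance-applicability association reported in Table 3, while values below 0.3 would mirror the weak applicability-use association.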

Table 4 summarises the findings of a content analysis of CRC managers' responses regarding perceived barriers to the development and operation of CRCs. The table lists the barriers most frequently identified by the respondents surveyed. The cooperative organisational structure, in the form of inter-organisational joint ventures, appears to have encountered serious problems in management, culture and communication. These problems are likely to undermine the development of CRCs as innovation organizations in Australia. However, given the dimensions and scope of the performance indicators currently used in CRCs, they are not able to address the key issues identified in the above table. As a result, the existing performance measures for CRCs do not appear to sufficiently accommodate the complexity of CRC inter-organisational operation.

5.0 In Conclusion

The survey conducted for this research demonstrated significant gaps between CRC managers' perceptions of the importance and applicability of the PIs developed for effective inter-organisational collaboration and the actual frequency of their use in CRCs. The gaps can be narrowed if CRC decision-makers take an innovative approach to performance measurement for their innovative organizations - CRCs. For inter-organisational collaboration to be successful, total quality partnership should be introduced and evaluated in CRCs. This means extending TQM concepts into an inter-organisational perspective. Given the degree of complexity with which several organizations engage in collaborative action, performance measurement should address not only short-term outcomes and results but also the quality of the collaborative process and long-term social and environmental benefits.

6.0 Looking Forward - A Goal for Future Research

There has recently been a positive gesture from the CRC Program management body seeking improvement in the performance measures.
Earlier this year, the Department of Industry, Science and Resources (ISR) commissioned a research project called the Quantitative and Qualitative Outcome Study of the CRC Program. (Unfortunately, the authors have not been able to gain access to the detail of the findings, as they are not in the public domain. This paper could make a greater contribution if a comparative study were undertaken of the findings of this research and those of the ISR-commissioned research.) It is reported that some changes were made to the Management Data Questionnaire 2001 (MDQ) as a result of the project. The last section of the MDQ was designed particularly to elicit CRC managers' advice on improving the existing performance measures for CRCs (AusIndustry, 2001). Our study shows that more research on performance measures is needed to enhance collaboration in innovation business and to address the considerable complexity involved in managing and judging the performance of such collaboration. The future goal of our research is to formulate and improve performance indicators that serve the purpose of measuring performance in complex inter-organisational collaboration.

References

AusIndustry, 2001, Management Data Questionnaire 2001 (MDQ). Canberra: Author
Cooperative Research Centres Association (CRCA), 2001, website:
Cooperative Research Centres (CRCs), 2000, Annual Report , (64 CRCs). Australia: Authors
Department of Industry, Science and Tourism (DIST), 1998, CRC Program: Guidelines for Applicants and General Principles for Centre Operations. Canberra: DIST
Industry Commission, 1995, Research and Development, Report No. 44. Canberra: AGPS
Industry Science Resources (ISR), 2001, website:
Mercer, D. and Stocker, J., 1998, Review of Greater Commercialisation and Self-Funding in the CRC Program. Canberra: DIST
Zhao, F., 2001, Managing Innovation and Quality of Collaborative R & D. Conference Proceedings of the 5th International & 8th National Research Conference, Melbourne, February