LCTA: A Test Framework for Communications-Critical Large Scale Systems. IEEE webinar presentation by Mohammad Nabulsi, 22 January 2015. Copyright 2015



1 LCTA: A Test Framework for Communications-Critical Large Scale Systems. IEEE webinar presentation by Mohammad Nabulsi, 22 January 2015. Copyright 2015

2 Context
- A look at testing practices during large commercial IT projects
- Agreements between IT vendors and their clients
- Late and over-budget project deliveries
- Quality issues becoming apparent during UAT
- Test standards lead to structured formal activities but not test efficiency
- Requirements-based

3 Terminology
- LCTA: Layered CCLSS Test Architecture
- CCLSS: Communications-Critical Large Scale System
- CCLSS example: Emergency services mobilisation systems / IT change as well as organisational change / Large IT vendors

4 Objective of the Presentation
To share ideas about an alternative approach to organising and designing tests for CCLSS, including a case study and the method used to evaluate the benefits

5 Poll 1
Do existing commercial test practices, methodologies and standards effectively meet the needs of communications-critical large scale systems?

6 Introduction
- Thesis, then IEEE Software article
- How the ideas emerged
- Visual concepts and their importance
- Engineering precision in testing of CCLSS
- What is different about IT compared to other industries when it comes to testing?

7 Test Methodologies and Standards
To test a communications-critical large-scale system, many commercial test methodologies and standards can be adopted. Examples include:
- the V-Model
- agile testing
- the IEEE 829 software test documentation standard
- the BS component-testing standard
- the ISO/IEC software life cycle standard
- the IEEE 1012 verification and validation standard
- the ISO software assessment and improvement standard
- the more recent ISO/IEC/IEEE software testing standard
- and many company-specific methodologies

8 V-Model [figure-only slide]

9 LCTA
- What motivated its development
- How it was developed: based on a concept of the architecture rather than the development process
- What it looks like
- How it compared to other test methodologies/standards, e.g. the V-Model

10 LCTA [figure-only slide]

11 LCTA
- Non-IT commercial features are outside the testing's scope
- Infrastructure includes communications hardware, IT hardware and software packages, and the configuration and setup needed for the infrastructure
- Communications links and protocols
- Data
- Detailed functional features facilitate other higher-level functional features but aren't in themselves what the system is intended for
- High-level functional features describe how the system should achieve the intended business and operational processes
- Business and operational processes represent what the system should achieve
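As a purely illustrative sketch (not part of the framework materials themselves), the six testable layers above could be represented as an ordered type so that requirements and test cases can be tagged and grouped by layer; the Requirement class and the sample layer assignments below are assumptions invented for this sketch.

# Illustrative sketch only: the LCTA layers as an ordered enum so that
# requirements can be tagged and grouped by layer. Layer names come from
# the slide above (the out-of-scope non-IT commercial features are omitted);
# Requirement and the sample assignments are invented.
from dataclasses import dataclass
from enum import IntEnum

class LctaLayer(IntEnum):
    INFRASTRUCTURE = 1
    COMMUNICATIONS_LINKS_AND_PROTOCOLS = 2
    DATA = 3
    DETAILED_FUNCTIONAL_FEATURES = 4
    HIGH_LEVEL_FUNCTIONAL_FEATURES = 5
    BUSINESS_AND_OPERATIONAL_PROCESSES = 6

@dataclass
class Requirement:
    req_id: str
    summary: str
    layer: LctaLayer

requirements = [
    Requirement("Req B", "Full-duplex data transmission",
                LctaLayer.COMMUNICATIONS_LINKS_AND_PROTOCOLS),
    Requirement("Req D", "Receive and display new mapping overlays",
                LctaLayer.HIGH_LEVEL_FUNCTIONAL_FEATURES),
]

# Organise test analysis and design work layer by layer, lowest layer first
for req in sorted(requirements, key=lambda r: r.layer):
    print(req.layer.name, "->", req.req_id, "-", req.summary)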

12 LCTA
Inspired by concepts from telecommunications and protocol specifications and testing:
- OSI
- Network Management / OSS: eTOM/FAB / TMN
- Value Chain
- Enterprise Architecture frameworks

13 Case Study
- The project
- The requirements
- The communications layer
- 5 communications interfaces
- The 19 subcategories

14 [figure-only slide]

15 Example Requirements
Req A: In order to make efficient use of the CCI ports provided at each centre, the ICCS shall permit the number of logged-in ICCS terminal users to exceed the number of CCI ports.
Req B: The Communications Gateway functionality shall support full-duplex data transmission.
Req C: The protocol for communications between the MDT and the centre must provide an acknowledgement, i.e. for every transmission sent by the <System> to an MDT, the MDT terminal must acknowledge with that transmission's identifier. Acknowledgements must be logged for audit purposes.
Req D: The MDT GIS must provide the ability to receive and display new mapping overlays (for example, plume-related information sent via the First or the Secondary Bearer in relation to a specific incident).

16 Example Outline Test Case
Test to ensure that all types of data messages sent from an MDT to a <Centre> can be transmitted over any of the bearers available to <System>:
For each of the available bearers, operating under realistic load conditions:
- Send to a <Centre> a sample of each data message type (as defined in the technical specification) from each resource type.
- Both mobile and stationary resources should be used in the test.
- In the case of SRB, both GPRS and 3G modes should be tested with each message type.
Check that messages arrive at the destination <Centre> within the limits defined in <Schedule>.
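As a hedged illustration (not taken from the case study), an outline test case of this shape maps naturally onto a parameterized automated test. Every name below (BEARERS, MESSAGE_TYPES, send_message_via, and the 10-second limit standing in for the <Schedule> value) is invented for the sketch.

# Invented sketch: parameterizing the outline test case over bearers,
# message types and resource states. All names and values are stand-ins.
import itertools
import random
import pytest

BEARERS = ["First Bearer", "Secondary Bearer", "SRB (GPRS)", "SRB (3G)"]
MESSAGE_TYPES = ["status", "incident", "mapping_overlay"]  # per the technical specification
RESOURCE_STATES = ["mobile", "stationary"]
MAX_DELIVERY_SECONDS = 10  # placeholder for the limit defined in <Schedule>

def send_message_via(bearer, message_type, resource_state):
    """Stub standing in for the real test harness; returns delivery time in seconds."""
    return random.uniform(0.1, 5.0)

@pytest.mark.parametrize(
    "bearer,message_type,resource_state",
    itertools.product(BEARERS, MESSAGE_TYPES, RESOURCE_STATES),
)
def test_message_arrives_within_limit(bearer, message_type, resource_state):
    elapsed = send_message_via(bearer, message_type, resource_state)
    assert elapsed <= MAX_DELIVERY_SECONDS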

17 Case Study
Outcome:
- Simplified test analysis and design effort for the 5 communications interfaces
- All requirements for the 5 interfaces mapped to the 19 subcategories consistently
- Unplanned benefits: identification of gaps, inconsistencies and test risks; a list of technical design assurance activities
- Independence from the technical design; adherence to the original requirements

18 Evaluation
- The case study
- Randomized simulation, including comparison against a rival theory
- User-based evaluation

19 Randomized Simulations
- One possible estimate of test effectiveness is the retesting needed after a fault has been identified, e.g. related to a particular requirement
- An effective test framework should lead to well prioritized test cases, which in turn should lead to early detection of faults and reduced retesting

20 Dependency Count
- A dependency is a situation in which requirement X can't be fulfilled correctly until requirement Y is fulfilled correctly; that is, X is dependent on Y. Finding a fault in Y and fixing it will likely necessitate retesting the functionality of X
- Suppose there is a sequence of requirements X1, ..., Xn. For Xm, the simulation program counts Cm: one for the initial test of Xm, plus one for each requirement that is before Xm in the sequence and depends on it (and would therefore need retesting). The overall dependency count is the sum of Cm over m = 1, ..., n
- This simplified test efficiency calculation method produces a quantitative estimate of how optimal a particular ordering of a set of requirements is from a retest-effort viewpoint
- This method was adopted in a number of randomized simulations for the communications requirements. A single simulation run randomly ordered the requirements and determined the overall dependency count

21 Dependency Count Example
- A set of requirement IDs: A, B, C, D
- A is standalone and does not depend on any other; B depends on A; C depends on A and B; D depends on A, B and C
- The optimal order for testing the four requirements is therefore {A, B, C, D}. This order will have a dependency count of 4
- The order {D, C, B, A} would be the least efficient, with the highest dependency count of 10
- {B, A, C, D}: dependency count of 5
- {C, D, A, B}: dependency count of 1 for C + 1 for D + 3 for A + 3 for B = 8
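A minimal sketch of the dependency-count calculation, using the convention reconstructed from this example (one for the initial test of each requirement, plus one for every earlier-ordered requirement that depends on it). The asserts reproduce the four figures above; the dictionary layout is my own choice of representation.

# Sketch of the dependency count for a given test ordering.
# DEPENDS_ON maps each requirement to the set of requirements it depends on.
import random

DEPENDS_ON = {
    "A": set(),
    "B": {"A"},
    "C": {"A", "B"},
    "D": {"A", "B", "C"},
}

def dependency_count(ordering, depends_on):
    total = 0
    for m, req in enumerate(ordering):
        c = 1  # the initial test of req
        # plus a retest for each earlier-ordered requirement that depends on req
        for earlier in ordering[:m]:
            if req in depends_on[earlier]:
                c += 1
        total += c
    return total

# Reproduces the slide's figures
assert dependency_count(["A", "B", "C", "D"], DEPENDS_ON) == 4
assert dependency_count(["D", "C", "B", "A"], DEPENDS_ON) == 10
assert dependency_count(["B", "A", "C", "D"], DEPENDS_ON) == 5
assert dependency_count(["C", "D", "A", "B"], DEPENDS_ON) == 8

# One randomized simulation run, as described on the previous slide
run = list(DEPENDS_ON)
random.shuffle(run)
print(run, dependency_count(run, DEPENDS_ON))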

22 Results of the Simulations
For each interface, the following simulations were carried out:
- a randomized simulation of 1,000 orderings of the requirements, stratified according to LCTA's 19 communications subcategories
- a randomized simulation of 1,000 orderings, stratified according to the V-Model test phases of review, unit testing, integration testing, system testing, and user acceptance testing
- a simulation of 1,000 randomly generated orderings
A final all-requirements-combined set of simulations: all interfaces combined, compared to the V-Model and fully randomized sets. Each simulation involved 1,000 test orderings. A three-way ANOVA comparison showed significant differences, with LCTA being the most efficient
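A sketch of how one simulation set might be generated and the three sets compared, reusing dependency_count and DEPENDS_ON from the previous sketch. The reading of "stratified" here is my assumption, not the slides': requirements are shuffled only within each subcategory or phase, with the strata order fixed. The toy strata are invented; a real run would group the interface requirements by the 19 subcategories or by V-Model phase.

# Assumed sketch: stratified and fully random orderings, compared via ANOVA.
import random
from scipy.stats import f_oneway  # ANOVA across the three sets of counts

def stratified_ordering(strata):
    """Shuffle requirements within each stratum; the strata order stays fixed."""
    ordering = []
    for stratum in strata:
        group = list(stratum)
        random.shuffle(group)
        ordering.extend(group)
    return ordering

def fully_random_ordering(requirements):
    ordering = list(requirements)
    random.shuffle(ordering)
    return ordering

# Invented toy strata for the A-D example above
lcta_strata = [["A", "B"], ["C", "D"]]    # stand-in for the 19 subcategories
vmodel_strata = [["A", "C"], ["B", "D"]]  # stand-in for the V-Model phases
all_requirements = ["A", "B", "C", "D"]

RUNS = 1000
lcta_counts = [dependency_count(stratified_ordering(lcta_strata), DEPENDS_ON)
               for _ in range(RUNS)]
vmodel_counts = [dependency_count(stratified_ordering(vmodel_strata), DEPENDS_ON)
                 for _ in range(RUNS)]
random_counts = [dependency_count(fully_random_ordering(all_requirements), DEPENDS_ON)
                 for _ in range(RUNS)]

f_stat, p_value = f_oneway(lcta_counts, vmodel_counts, random_counts)
print(f"F = {f_stat:.2f}, p = {p_value:.4g}")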

23 Results of the final (all requirements combined) set of simulations [figure-only slide]

24 User-Based Evaluation
- Other users evaluating the framework via questionnaire and interviews
- Repeat the simulations for one of the interfaces
- Second (smaller) CCLSS case study

25 User-Based Evaluation
Criteria / questions asked:
- Suitability for testing communications-critical large scale systems
- Enabling testing to be linked to the system's requirements, and being the basis for evidence on whether the requirements have been fulfilled
- Enabling early detection of faults in a new system
- Applicable to the full lifecycle of a new system
- Supporting close cooperation between the test team and the rest of an IT project team
- Useful as a basis for defining a test strategy for a new system
- Provides a simplified conceptual view of a communications-critical large scale system's structure which can be used as an aid for the test analysis and design efforts
- Can be used as a basis for review and verification activities of the system requirements, technical design and technical specification
- If used as a basis for testing a communications-critical large scale system, allows the testers to start their verification and validation work from an early stage of the project
- Can facilitate test traceability and coverage analysis
- Has potential uses in an IT project outside purely testing, e.g. providing a shared view of a system for business analysts, developers, testers and suppliers/vendors

26 Summary
- LCTA: a new conceptual test management and design framework
- Communications layer detailed
- Case study
- Evaluation: randomised simulations and user-based evaluation

27 Benefits of LCTA
1- Effective categorization and prioritization of the test cases.
2- Identification of gaps and inconsistencies in the requirements and the technical design through the use of the test subcategories. This can help identify areas of potential contradiction or ambiguity in the requirements, e.g. by using the 19-subcategories diagram as a basis for structured analysis and review of the requirements.
3- Improved synergy between testing and the overall project activities and phases, because the framework helps maintain a continuous link between test activities and requirements, and the layers of the framework can be used to define the phases of an IT project.
4- Improved confidence in the results of tests during each phase of the project. This is due to the efficient prioritization of the tests, meaning fewer tests will be run too early or too late.

28 Potential further work
Potential work to develop and trial LCTA might include one or more of the following:
- defining the specific test approaches (both functional and nonfunctional) for the remaining five layers
- developing further the use of simulation to evaluate LCTA's prioritization benefits
- devising notations and formats to support more precise definition of the test cases and possibly their automatic generation
- incorporating the ideas into, or as an extension of, an established test or software engineering standard
- applying the ideas to other real-life CCLSS projects

29 Poll 2
Are the expected benefits (outlined earlier) of LCTA, on the whole, realizable if LCTA is adopted as the test framework for other real-life CCLSSs?

30 LCTA
Questions / Discussion
Thank You
mnabulsi@btinternet.com

31 Further reading
- The thesis: http://bura.brunel.ac.uk/handle/2438/8239
- Edited article to be published in IEEE Software in the March-April issue; currently available online as a pre-print version. Co-authored by Prof. Rob Hierons: http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber= &contentType=Early+Access+Articles