
A Prescriptive Adaptive Test Framework (PATFrame) for Unmanned and Autonomous Systems: A Collaboration Between MIT, USC, UT Arlington and Softstar Systems. Dr. Ricardo Valerdi, Massachusetts Institute of Technology, August 11, 2010.

Outline: PATFrame team; the challenge; test & evaluation decisions; PATFrame features; use case. "Anything that gives us new knowledge gives us an opportunity to be more rational." - Herb Simon

Sponsors Transition Partners

PATFrame team (http://mit.edu/patframe): Valerdi, Kenley, Medvidovic, Ferreira, Ligett, Deonandan, Edwards, Hess, Tejeda, Cowart.

The Challenge: Science Fiction to Reality. "You will be trying to apply international law written for the Second World War to Star Trek technology." Singer, P. W., Wired For War: The Robotics Revolution and Conflict in the 21st Century (Penguin, 2009).

Science & Technology Background
Current state of UAS T&E: UAS T&E is focused on single systems; one-shot planning for T&E and manual test strategy adaptation; value-neutral approach to test prioritization; autonomy not a key consideration; function-based testing; traditional acquisition process; physics-based, hardware-focused test prioritization and execution.
Future state of UAS T&E: systems of systems introduce complex challenges; accelerated and automated test planning based on rigorous methods; value-based approach to test prioritization; autonomy as a central challenge; mission-based testing; rapid acquisition process; multi-attribute decision making to balance cost, risk and schedule of autonomous, software-intensive systems of systems.

PATFrame Objective: To provide a decision support tool encompassing a prescriptive and adaptive framework for UAS SoS testing. PATFrame will be implemented as a software dashboard that enables improved decision making for the UAS T&E community. It is focused on addressing BAA topics TTE-6 (Prescribed System of Systems Environments) and MA-6 (Adaptive Architectural Frameworks). The three-university team (MIT-USC-UTA) draws on experts in test & evaluation, decision theory, systems engineering, software architectures, robotics and modeling. Based on Valerdi, R., Ross, A. and Rhodes, D., "A Framework for Evolving System of Systems Engineering," CrossTalk - The Journal of Defense Software Engineering, 20(10), 28-30, 2007.

Three Stages in Decision Making. Simon, H. (1976), Administrative Behavior (3rd ed.), New York: The Free Press.

Prescriptive Adaptive Test Framework [diagram: the prescriptive element (test strategy / test infrastructure) and the adaptive element (the system under test)].

Time Scale for Testing Decisions (PATFrame scope) [figure: testing decisions arranged along a time scale from minutes to years - test execution (real time), data collection and reprioritization, test planning (in a SoS environment), test development (i.e., design for testability), and long-term planning decisions/investments].

Difficulty of Testing SoS [figure: difficulty increases with the net-centricity of the environment (none, system, SoS) and the complexity of the system under test (human-operated toward AI); markers for testing a system in a SoS environment, testing a SoS in a SoS environment, the DARPA Urban Grand Challenge, the use case (UAS in SoS), the UAST focus, and the ultimate goal]. PATFrame initial focus: testing an autonomous system in a SoS environment.

Prescribed System of Systems Environment
Normative goal: construct the theoretical best SoS test (a metric set with best levels).
Descriptive goal: capture actual SoS tests (metrics at state-of-the-practice and actual levels; actual SoS tests include MetricA, MetricC, etc.).
Prescriptive goal: a synthetic framework for SoS testing at the single- and multi-program level.
Successful SoS Test = f(MetricA, MetricB, ..., MetricN)
[chart: for a given metric (e.g., MetricA), the normative (best) level, descriptive (state-of-the-practice) level, descriptive (actual) level, and the potential contribution of a new test A toward the state-of-the-practice limit.]
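The slide only sketches the success function as f(MetricA, MetricB, ...). A minimal illustration follows, assuming a simple weighted normalization of each metric against its normative (best) and descriptive (state-of-the-practice) levels; the weights, metric names, and scoring rule are hypothetical, not PATFrame's actual model.

```python
# Minimal sketch of Successful SoS Test = f(MetricA, MetricB, ...).
# The normalization and weighting scheme is an assumption for illustration;
# it is not the PATFrame team's actual scoring model.

def sos_test_success(actual, normative, state_of_practice, weights):
    """Score a SoS test by how far each metric has moved from the
    state-of-the-practice level toward the normative (best) level."""
    score = 0.0
    for metric, weight in weights.items():
        best = normative[metric]
        sop = state_of_practice[metric]
        obs = actual[metric]
        # 0.0 = no better than state of the practice, 1.0 = at the best level
        progress = (obs - sop) / (best - sop) if best != sop else 1.0
        score += weight * max(0.0, min(1.0, progress))
    return score / sum(weights.values())

# Hypothetical metric levels for one SoS test
weights = {"MetricA": 0.5, "MetricB": 0.3, "MetricC": 0.2}
normative = {"MetricA": 1.0, "MetricB": 100.0, "MetricC": 0.99}
state_of_practice = {"MetricA": 0.6, "MetricB": 40.0, "MetricC": 0.80}
actual = {"MetricA": 0.8, "MetricB": 70.0, "MetricC": 0.85}

print(f"SoS test success score: {sos_test_success(actual, normative, state_of_practice, weights):.2f}")
```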

Integrated Test Management
Primary inputs to PATFrame: SUT requirements, capabilities, and architecture; mission needs and goals; test resources and associated architecture (infrastructure); test requirements; schedule, resource and cost constraints.
Primary outputs for UASoS T&E planners: UASoS test strategy, recommended tests and their sequencing; feasible T&E test planning options; UASoS test strategy estimated cost; UASoS test strategy risks; undesirable emergent behavior; recommended and automatic adaptations to UASoS T&E issues.
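For concreteness, a minimal sketch of how these inputs and outputs could be carried as data structures; the field names and types are assumptions for illustration, not the PATFrame schema.

```python
# Sketch of PATFrame's primary inputs and outputs as data structures.
# Field names and types are illustrative assumptions, not the actual PATFrame schema.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class PATFrameInputs:
    sut_requirements: List[str]
    sut_architecture: str
    mission_goals: List[str]
    test_resources: Dict[str, int]          # e.g., {"range_hours": 40}
    test_requirements: List[str]
    budget: float                           # cost constraint
    schedule_days: int                      # schedule constraint

@dataclass
class PATFrameOutputs:
    recommended_tests: List[str] = field(default_factory=list)   # in execution order
    planning_options: List[str] = field(default_factory=list)
    estimated_cost: float = 0.0
    strategy_risks: List[str] = field(default_factory=list)
    predicted_emergent_behaviors: List[str] = field(default_factory=list)
    suggested_adaptations: List[str] = field(default_factory=list)
```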

Current Efforts: Ontology and Architecture Frameworks - ontology, meta-modeling, discrete event simulation.

Current Efforts: Normative Models - cost modeling, real options, value-based testing.

PATFrame Workflow
Inputs: UASoS and test settings; UASoS requirements and capabilities; UASoS architecture; test objectives (including test types / list of tests); test resources and settings; interdependencies of major inputs.
Data processing: cost model; design of experiments; real options; DSM analysis; discrete event simulation; value-based algorithm.
Outputs: cost estimate; list of prioritized tests; list of predicted undesirable emergent behaviors / conditions; list of suggested adaptations to undesirable emergent behaviors / conditions.
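A minimal sketch of this inputs, data-processing, outputs flow, assuming the data structures sketched above and treating each analysis as a pluggable step; the function and parameter names are hypothetical.

```python
# Sketch of the PATFrame inputs -> data processing -> outputs workflow.
# Each analysis step is a pluggable function; names are hypothetical, and the
# PATFrameInputs/PATFrameOutputs dataclasses from the earlier sketch are assumed.
from typing import Callable, List

def run_patframe_workflow(inputs: "PATFrameInputs",
                          analyses: List[Callable[["PATFrameInputs", "PATFrameOutputs"], None]]
                          ) -> "PATFrameOutputs":
    """Run each analysis in turn, letting it contribute to the shared outputs."""
    outputs = PATFrameOutputs()
    for analysis in analyses:
        analysis(inputs, outputs)   # e.g., the cost model fills estimated_cost,
                                    # the value-based algorithm fills recommended_tests
    return outputs

# Usage (with hypothetical analysis functions defined elsewhere):
# outputs = run_patframe_workflow(inputs, [cost_model, doe_analysis, dsm_analysis,
#                                          value_based_prioritization, real_options,
#                                          discrete_event_simulation])
```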

Cost Model Workflow: the same inputs, data-processing steps, and outputs as the PATFrame workflow above, with the cost model step highlighted.
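As an illustration of the cost-model step, a minimal parametric sketch in the spirit of COSYSMO-style models; the calibration constants, size drivers, and effort multipliers below are invented placeholders, not a validated T&E cost model.

```python
# Sketch of a parametric test-cost model (COSYSMO-style form:
# effort = A * size^E * product of effort multipliers).
# All constants below are illustrative placeholders, not calibrated values.
import math

def estimate_test_effort(size_drivers, effort_multipliers, A=0.25, E=1.06):
    """Return estimated effort in person-months for a UASoS test campaign."""
    size = sum(size_drivers.values())              # counts of requirements,
                                                   # interfaces, scenarios, etc.
    multiplier = math.prod(effort_multipliers.values())
    return A * (size ** E) * multiplier

size_drivers = {"test_requirements": 120, "sos_interfaces": 35, "test_scenarios": 60}
effort_multipliers = {"autonomy_level": 1.3, "range_availability": 1.1, "team_experience": 0.9}

print(f"Estimated effort: {estimate_test_effort(size_drivers, effort_multipliers):.1f} person-months")
```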

Design of Experiments Analysis Workflow: the same inputs, data-processing steps, and outputs as above, with the design of experiments step highlighted.
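A sketch of the design-of-experiments idea: enumerate combinations of test factors and select a subset to fit the budget. The factors, levels, and the random subsampling below are illustrative assumptions; a real DOE tool would construct a proper orthogonal fraction.

```python
# Sketch of the design-of-experiments step: enumerate a full-factorial design
# over test factors, then subsample a fraction to fit the budget.
# The factors and levels are hypothetical examples for a USV sense-and-avoid test.
import itertools
import random

factors = {
    "sea_state": [1, 3, 5],
    "traffic_density": ["low", "high"],
    "comm_latency_ms": [50, 500],
    "num_usvs": [1, 4],
}

full_factorial = [dict(zip(factors, combo))
                  for combo in itertools.product(*factors.values())]

# Fractional design: randomly sample a quarter of the runs (a stand-in for a
# properly constructed orthogonal fraction).
random.seed(1)
fraction = random.sample(full_factorial, k=len(full_factorial) // 4)

print(f"{len(full_factorial)} full-factorial runs, {len(fraction)} selected")
```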

Dependency Structure Matrix (DSM) Analysis Workflow: the same inputs, data-processing steps, and outputs as above, with the DSM analysis step highlighted.
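A sketch of how a DSM over tests could drive sequencing: the matrix below is a made-up example, and the ordering uses a plain topological sort rather than the DSM clustering algorithm named in the use case.

```python
# Sketch of the DSM-analysis step: a binary Dependency Structure Matrix over
# tests (dsm[i][j] = 1 means test i depends on information from test j),
# sequenced with a topological sort (Kahn's algorithm). Example data is invented.
from collections import deque

tests = ["waypoint_nav", "sense_and_avoid", "auto_coupling", "swarm_coordination"]
dsm = [
    [0, 0, 0, 0],   # waypoint_nav depends on nothing
    [1, 0, 0, 0],   # sense_and_avoid depends on waypoint_nav
    [1, 0, 0, 0],   # auto_coupling depends on waypoint_nav
    [1, 1, 1, 0],   # swarm_coordination depends on all of the above
]

def sequence_tests(tests, dsm):
    """Order tests so that each runs after the tests it depends on."""
    n = len(tests)
    indegree = [sum(dsm[i]) for i in range(n)]
    ready = deque(i for i in range(n) if indegree[i] == 0)
    order = []
    while ready:
        i = ready.popleft()
        order.append(tests[i])
        for k in range(n):
            if dsm[k][i]:
                indegree[k] -= 1
                if indegree[k] == 0:
                    ready.append(k)
    return order

print(sequence_tests(tests, dsm))
```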

Value-based Analysis Workflow: the same inputs, data-processing steps, and outputs as above, with the value-based algorithm highlighted.
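A minimal sketch of value-based test prioritization: rank tests by value per unit cost and select greedily within a budget. The test names, values, costs, and budget are invented, and the greedy rule is a simplification, not PATFrame's actual algorithm.

```python
# Sketch of the value-based step: prioritize tests by value per unit cost and
# select greedily within a budget. Values, costs, and the budget are invented.

def prioritize_by_value(tests, budget):
    """tests: list of (name, value, cost). Returns the tests chosen within
    budget, ordered by descending value/cost ratio, and the total cost."""
    ranked = sorted(tests, key=lambda t: t[1] / t[2], reverse=True)
    chosen, spent = [], 0.0
    for name, value, cost in ranked:
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen, spent

candidate_tests = [
    ("COLREGS head-on encounter", 9.0, 3.0),
    ("COLREGS crossing encounter", 8.0, 3.0),
    ("Automated coupling",         5.0, 4.0),
    ("Swarm comm degradation",     7.0, 6.0),
]
chosen, spent = prioritize_by_value(candidate_tests, budget=10.0)
print(chosen, f"cost = {spent}")
```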

Real Options Analysis Workflow: the same inputs, data-processing steps, and outputs as above, with the real options step highlighted.
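To illustrate the real-options step, a textbook binomial-lattice valuation of the flexibility to defer an investment (for example, in test infrastructure). All numbers are invented and the model is a standard one, not the PATFrame team's formulation.

```python
# Sketch of the real-options step: value the flexibility to defer an investment
# using a simple binomial lattice with early exercise. Numbers are invented.

def defer_option_value(value, cost, up, down, risk_free, periods):
    """Value of the option to invest (pay `cost` to receive `value`) at any node."""
    p = (1 + risk_free - down) / (up - down)        # risk-neutral up probability
    # Terminal payoffs: invest only if worthwhile.
    payoffs = [max(value * up**j * down**(periods - j) - cost, 0.0)
               for j in range(periods + 1)]
    for step in range(periods, 0, -1):              # roll back through the lattice
        payoffs = [
            max((p * payoffs[j + 1] + (1 - p) * payoffs[j]) / (1 + risk_free),
                value * up**j * down**(step - 1 - j) - cost)   # early exercise
            for j in range(step)
        ]
    return payoffs[0]

# Deferring beats investing now (value 10 - cost 9 = 1) because of uncertainty.
print(f"Option value: {defer_option_value(value=10.0, cost=9.0, up=1.3, down=0.8, risk_free=0.05, periods=2):.2f}")
```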

Discrete Event Simulation Workflow: the same inputs, data-processing steps, and outputs as above, with the discrete event simulation step highlighted.
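A minimal sketch of a discrete-event view of test execution: tests compete for a limited number of range slots and the simulation reports the overall completion time. Durations and the number of ranges are invented for illustration.

```python
# Sketch of the discrete-event-simulation step: estimate how long a test
# sequence takes when tests compete for a limited number of range slots.
import heapq

def simulate_schedule(test_durations, num_ranges):
    """Event-driven simulation: returns (makespan, finish time per test)."""
    ranges = [0.0] * num_ranges          # time at which each range becomes free
    heapq.heapify(ranges)
    finish_times = {}
    for test, duration in test_durations:
        start = heapq.heappop(ranges)    # earliest-available range
        finish = start + duration
        finish_times[test] = finish
        heapq.heappush(ranges, finish)
    return max(finish_times.values()), finish_times

tests = [("waypoint_nav", 2.0), ("sense_and_avoid", 4.0),
         ("auto_coupling", 3.0), ("swarm_coordination", 6.0)]
makespan, finishes = simulate_schedule(tests, num_ranges=2)
print(f"Makespan: {makespan} days", finishes)
```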

Use Case 1: Define and Prioritize Tests
Operational thread: NAVSEA unmanned surface vehicles (USVs) must comply with the International Regulations for Preventing Collisions at Sea (COLREGS).*
T&E thread: action to avoid collision shall be positive, obvious, and made in good time.** Validate the USVs' ability to self-adapt: learn, sense and avoid, perform automated coupling, optimize adaptive software.***
*Hansen, E. C., USV Performance Testing, April 14, 2010.
**Part B, Sect. I, Sec. 8, Convention on the International Regulations for Preventing Collisions at Sea, IMO (International Maritime Organization), 1972.
***Engineering Autonomous System Architectures.

Use Case 1: Define and Prioritize Tests
Use case name: Test selection and prioritization for NAVSEA UASoS.
Goal: Define and prioritize a set of tests for an unmanned and autonomous SoS comprised of NAVSEA's fleet of unmanned surface vehicles (USVs).
Summary: The SoS cannot be exhaustively tested, so tests must be chosen that provide the most value within the allocated time and budget.
Actors: Test planner personnel, program manager, scheduler, range safety officer, owners of USVs, regulatory organization(s).
Components: Dependency Structure Matrix (DSM) modeling interface, DSM clustering algorithm, XTEAM meta-modeling environment, value-based testing algorithm, LVC environment.
Normal flow: 1. Input information about each USV, such as architecture, sensors, communication attributes, etc. 2. Integrate the necessary LVC assets. 3. Input the types of possible tests to assess COLREGS compliance. 4. Input the desired confidence intervals and compliance criteria for COLREGS. 5. PATFrame outputs a prioritized set of tests to perform and the expected level of confidence of COLREGS compliance after each test.
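One way to read step 5's "expected level of confidence of COLREGS compliance after each test" is as a simple Bayesian update. The sketch below assumes a Beta-Binomial model with made-up encounter data; it is an illustration, not the value-based testing algorithm used in PATFrame.

```python
# Sketch of tracking confidence in COLREGS compliance as encounter tests pass.
# A Beta-Binomial model with a uniform prior is assumed for illustration.
import random

def prob_compliance_at_least(p_req, passes, failures, samples=50_000):
    """Monte Carlo estimate of P(true compliance rate >= p_req)
    under a Beta(1 + passes, 1 + failures) posterior."""
    hits = sum(random.betavariate(1 + passes, 1 + failures) >= p_req
               for _ in range(samples))
    return hits / samples

random.seed(0)
passes = 0
for test in range(1, 21):                  # 20 successful collision-avoidance encounters
    passes += 1
    if test % 5 == 0:
        conf = prob_compliance_at_least(p_req=0.9, passes=passes, failures=0)
        print(f"after {test} passing encounters: P(compliance >= 0.9) ~ {conf:.2f}")
```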

Use Case 1: Define and Prioritize Tests (continued).

Testing to Reduce SoS Risks vs. the Risks of Performing Tests on an SoS
SoS risks: What are the unique risks for UASs? For UASs operating in an SoS environment? How do you use testing to mitigate these risks? What metrics are you using to measure the level of risk?
Risks of testing an SoS: What are the unique programmatic risks that impact your ability to test UASs? To test UASs operating in an SoS environment? What methods do you use to mitigate these risks? What metrics are you using to measure the level of programmatic risk in testing?

Example PATFrame Tool Concept
Question: When am I done testing?
Technology: defect estimation model - trade quality for delivery schedule.
Inputs: defects discovered. Outputs: defects remaining, cost to quit, cost to continue.
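A minimal sketch of one such defect estimation model, assuming defect discovery decays exponentially across equal test intervals and using invented discovery counts and cost figures; the slide does not specify the actual PATFrame model.

```python
# Sketch of a "when am I done testing?" defect estimation model.
# Assumes defect discovery decays exponentially across equal test intervals;
# the discovery data and cost figures are invented for illustration.
import math

def estimate_remaining_defects(found_per_interval):
    """Fit found_i ~ a * exp(-b * i) by log-linear least squares and
    estimate how many defects the decaying tail still holds."""
    xs = list(range(len(found_per_interval)))
    ys = [math.log(f) for f in found_per_interval]
    n = len(xs)
    x_mean, y_mean = sum(xs) / n, sum(ys) / n
    b = -sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
        sum((x - x_mean) ** 2 for x in xs)
    a = math.exp(y_mean + b * x_mean)
    # Remaining defects = area under the fitted curve beyond the last interval.
    return a * math.exp(-b * n) / b

found = [34, 21, 14, 8, 5]                 # defects found in each test interval
remaining = estimate_remaining_defects(found)
cost_to_continue = 50_000                  # cost of one more test interval (assumed)
cost_per_escaped_defect = 20_000           # cost of a defect found in the field (assumed)
cost_to_quit = remaining * cost_per_escaped_defect
print(f"Estimated defects remaining: {remaining:.1f}")
print(f"Cost to quit now: ${cost_to_quit:,.0f} vs. cost to continue: ${cost_to_continue:,.0f}")
```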

When am I done testing?

Technical Specifications (* targets to be validated with stakeholders at workshop #3)
Parameter: automation level in predicting undesirable emergent behavior. Current performance level: manual. Current target*: semi-automated (moderate: based on identified best practices and rules from SMEs). Ultimate goal*: semi-automated (extensive: current target plus collected patterns in SUTs, test infrastructure, and associated test settings).
Parameter: probability of automatically predicting actual undesirable emergent behavior before testing (emergent behavior will occur). Current performance level: no capability. Current target*: 0.90. Ultimate goal*: 0.995.
Parameter: rate of falsely predicting emergent behavior automatically and before testing (emergent behavior will not occur: false positive). Current performance level: no capability. Current target*: 1 per 100 scenarios. Ultimate goal*: 1 per 1000 scenarios.

PATFrame Technology Maturation Plan
1. Applied Research Phase (TRL 3-4). 1A: Define prescribed SoS environments. 1B: Implement adaptive architectural frameworks. Critical UAS SoS T&E concepts are validated through a normative, descriptive and prescriptive framework (Successful SoS Test = f(MetricA, MetricB, ...), as in the prescribed SoS environment slide above). Basic ideas are implemented in a prototype and integrated through a UAS SoS T&E ontology model, a system dynamics model, and SoS architecture notions for dynamic adaptation of SoS T&E.
2. Development Phase (TRL 4-5). 2A: Develop the decision support system. The prescriptive framework (PATFrame) is developed as a decision support system to enable better outcomes of UAS T&E. 2B: Refine use cases. Use cases identified by transition partners are used to validate PATFrame in relevant environments; simulations are performed in specific operational domains (ground, air, space, water, underwater).
3. Deployment Phase (TRL 5-6). 3A: Conduct focused experiments. Readiness is demonstrated through focused experiments in simulated operational environments (i.e., Future Combat Systems). 3B: Deploy the decision support system. PATFrame, with documented test performance demonstrating agreement with analytical predictions, is deployed across facilities in the Army, Navy and Air Force engaged in UAS T&E.