Automated Security Reviews in a CI Environment. Richard Fry Information Security Manager


1 Automated Security Reviews in a CI Environment Richard Fry Information Security Manager

2 Bio Richard Fry, IT Security Manager, Swinton Insurance (the UK's largest high street insurance broker). 30 years' IT experience across manufacturing, software development, financial services, and government; 15 years' information security experience. CISM, CIRMP, CEH.

3 Swinton Insurance Group

4 Swinton Background: Insurance & Financial Services. Software Development!

5 Context A strategic 3-year business transformation programme: a new retail platform to be designed and launched. Increased reliance on contract development staff and Agile development meant increased risk, and manual code review was too time-consuming to cover all new code.

6 Requirements
Fit with existing technologies and processes: Agile development, continuous integration, automated testing, bug tracking
Regular updates to track emerging vulnerabilities (and the OWASP Top 10)
Output: diagnostics and remediation advice
High level of accuracy with low false positives
Good management information (MI)

7 Selection Process
Define the requirements (see the previous slide)
Review available approaches and technologies for capability and fit: can the technology and approach lend itself easily to a CI environment? Do we get the MI and feedback to the developers?
Proof of concept: make sure it worked as the vendors said it would
Implementation: develop a plan and documented approach for onboarding each new application
Validate the Return on Security Investment (ROSI)

8 Analysis Approaches: SAST / DAST / IAST
Static Application Security Testing (SAST): white-box testing, analysis of code at rest.
Pros: root-cause analysis; early, code-only testing.
Cons: noisy (false positives); specialised skills required; doesn't fit with Agile.
Vendors: Checkmarx, Veracode, Coverity, HP Fortify, IBM AppScan Source.
Dynamic Application Security Testing (DAST): black-box testing, analysis of external behaviour.
Pros: vulnerability evidence (incl. false positives); can be done remotely.
Cons: noisy (false positives); no root-cause analysis; crawling required; inaccuracies = not Agile.
Vendors: IBM AppScan, HP WebInspect, Qualys, Acunetix, Burp Suite.
Interactive Application Security Testing (IAST): runtime code and data analysis; an x-ray view of the running application.
Pros: real exploits and evidence; root-cause analysis; highly accurate; fits with Agile.
Cons: integration required; crawling required.
Vendors (true IAST, not hybrid or simple agents): Quotium, Contrast.

9 Why We Chose IAST
Fit with technologies and processes: fundamentally better data than can be achieved with SAST or DAST alone or as a hybrid; technically fits with DevOps tooling (CI, bug tracking, existing test automation)
Ease of use: the quality of output can be used immediately by the scrum team in the next sprint; evidence of the vulnerability and its root cause in source code/configuration, with remediation recommendations
Accuracy! Focus on real business issues: don't boil the ocean in a sprint! Address real vulnerabilities that exist at that time/build, in that release, in that application

10 Architecture
Testing is executed against a functional test environment
Seeker then generates HTTP/S requests with application-layer attacks
Passive agents monitor data flow through the application stack and code behaviour under attack conditions
Agents deliver proof of attack success and give root-cause analysis
IAST active client and agents are integrated into the target environment

11 How It Works for Swinton Repository → Build → Deploy → the same automated functional tests that run the smoke & integration tests → Pass/Fail fed back to Development
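The pass/fail gate at the end of this pipeline can be sketched as a small script run after the functional tests. This is a minimal illustration only: the results file name and JSON schema are assumptions, not Seeker's actual output format.

```python
import json
import os
import sys

# Hypothetical results file produced by the IAST tool after the functional
# test run; the file name and JSON schema are assumptions for illustration.
RESULTS_FILE = "iast_results.json"
BLOCKING_SEVERITIES = {"HIGH", "CRITICAL"}

def gate(results: dict) -> bool:
    """Return True if the build may proceed (no blocking vulnerabilities)."""
    blocking = [v for v in results.get("vulnerabilities", [])
                if v.get("severity", "").upper() in BLOCKING_SEVERITIES]
    for v in blocking:
        # Root-cause details let the scrum team fix the issue in the next sprint.
        print(f"BLOCKING: {v['type']} at {v.get('location', 'unknown')}")
    return not blocking

if __name__ == "__main__" and os.path.exists(RESULTS_FILE):
    with open(RESULTS_FILE) as f:
        # Non-zero exit status fails the CI build stage.
        sys.exit(0 if gate(json.load(f)) else 1)
```

In a CI server this would run as the last step of the test stage, so a blocking vulnerability fails the build and the developer sees the root cause immediately.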

12 Screenshots Dashboard Root Cause (source code) Video Evidence (live exploit)

13 Return on Security Investment
Manual code review: 1–2 hours per application per release cycle (weekly); 80 applications = approximately 80–160 developer hours a week!
vs
Automated code review on every build cycle: developers get instant feedback on issues; very little overhead once an application is initially set up (4–5 hours' effort)
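The savings figure follows from simple arithmetic on the numbers above; a minimal check (the figures are from the slide, the function name is mine):

```python
def weekly_review_hours(apps: int, low_hours: float, high_hours: float) -> tuple:
    """Weekly manual code-review effort range for a portfolio of applications."""
    return apps * low_hours, apps * high_hours

# Figures from the slide: 80 applications, 1-2 hours each per weekly release cycle.
low, high = weekly_review_hours(80, 1.0, 2.0)
print(f"Manual review burden: {low:.0f}-{high:.0f} developer hours per week")
# -> Manual review burden: 80-160 developer hours per week
```

The lower bound of 80 hours matches the "over 80 hours a week" saving claimed in the summary.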

14 In Summary
ROI: we save over 80 hours a week of review time
Risk down: checks on every build rather than weekly
Cost avoided: issues fixed early in the lifecycle
Agility retained

15 Automated Security Reviews in a CI Environment