Effective Methods of Software Testing Workshop

Register Now!

Course Description

Proactive Testing™ enables you to deliver better software in less time by testing more effectively, while providing value that overcomes traditional user, manager, and developer resistance. Applying special strategies and techniques that spot many of the highest-impact yet ordinarily overlooked risks, Proactive Testing™ ensures the most important unit/component, integration/assembly, system, and UAT testing gets done in the limited time available. Moreover, by managing within an overall quality perspective that catches more defects earlier, when they are easier to fix, and actually prevents many showstoppers and other errors, Proactive Testing™ can also cut developers’ time, effort, and aggravation. After establishing core concepts, this interactive workshop demonstrates proactive ways to apply powerful, proven, structured test planning and design techniques that produce value, not busywork. To enhance learning, participants practice each key technique in a series of exercises based on various aspects of a realistic case study.

Participants will learn:

  • A structured Proactive Testing™ model of the testing that should be performed throughout the life cycle.
  • Ways testing can actually cut time, effort, and aggravation for users, developers, and managers.
  • How to write industry-accepted test plans, designs, and cases that make testing easier and more reliable.
  • Multiple techniques and checklists for designing more thorough tests and discovering overlooked conditions.
  • How to manage test execution, including estimating/allocating resources and reporting defects and status.
  • How to apply risk analysis and reusable testware to perform more of the important testing in less time.

Who Should Attend?

This course is designed for testing professionals and others who manage and perform testing of software products, as well as for analysts, designers, and system/project managers who need to know how Proactive Testing™ can cut software development time and effort.

Course Length

3 days

Course Outline

Testing for correctness vs. testing for errors
Developer views of testing
Exercise: Your defined testing process
What is a process, why it matters
REAL vs. presumed processes
Why most IT process improvement efforts fail
Exercise: Your REAL testing process
Meaningful process measures, results, causes
Defect injection, detection, ejection metrics
Economics of quality problems in life cycle
Keys to effective testing
CAT-Scan Approach™ to find more errors
Dynamic, passive and active static testing
Developer vs. independent test group testing
V-model and objectives of each test level
Reactive testing—out of time, but not tests
Proactive Testing™ Life Cycle model
Proactive user acceptance criteria
Strategy—create fewer errors, catch more
Test activities that save the developer’s time
Applying improvements

Why test planning often is resisted
Buzzwords, boilerplate, platitudes, and paperwork
Test plans as the set of test cases
Six reasons to plan testing
Risk elements, relation to testing
Traditional reactive risk analysis, issues
IEEE Standard for Test Documentation
Overcoming controversial interpretations
Testing structure’s advantages
Enabling manageability, reuse, selectivity
Test plans, designs, cases, procedures

Exercise: Anticipating showstoppers
Spotting overlooked large risks
Involving key stakeholders, reviewing plans
Formal and informal risk prioritization
Dynamic identification of design defects
Risk-based way to define test units
Letting testing drive development
Preventing major cause of overruns
Stomach ache metric
Testing highest risks more and earlier, builds
Master Test Plan counterpart to project plan
Strategy approach, use of automated tools
Sequence of tests, sources of data
Entry/exit criteria, anticipating change
Test environment, supporting materials
Estimating testing, avoiding traps
Roles, responsibilities, staffing, training
Schedule, risks and contingencies, sign-offs
Management document, agreements
Maintaining the living document

IEEE Standard on Unit Testing
Requirements-based functional testing
Non-functional requirements challenges
Black Box testing strategy
3-level top-down test planning and design
Detailed Test Plans for large risks
Exercise: Functionality matrix
Test designs for medium-sized risks
Use cases, revealing overlooked conditions
Detailed Test Plan technical document

Structural (white box) degrees of coverage
Flowgraphing logic paths
Applying structural paths to business logic
Exercise: Defining use case test coverage
Flaws of conventional use-case testing
Exercise: Additional use case conditions
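A minimal sketch of the flowgraphing and branch-coverage idea above (the function, names, and values are illustrative, not course material): each binary decision in the logic adds a path, and cyclomatic complexity (decisions + 1) gives the minimum number of test paths needed to cover every branch outcome.

```python
# Illustrative business-rule function (hypothetical, not from the course).
def shipping_fee(order_total, is_member):
    """Return shipping fee: free over $100 or for members, else $7.50."""
    if order_total >= 100:   # decision 1
        return 0.0
    if is_member:            # decision 2
        return 0.0
    return 7.50

# Flowgraph view: 2 binary decisions -> cyclomatic complexity = 2 + 1 = 3,
# so at least 3 test paths are needed for full branch coverage.
tests = [
    (150, False, 0.0),   # decision 1 true
    (50,  True,  0.0),   # decision 1 false, decision 2 true
    (50,  False, 7.50),  # both decisions false
]
for total, member, expected in tests:
    assert shipping_fee(total, member) == expected
print("all 3 branch paths covered")
```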

Risks, issues integration testing addresses
Graphical technique to simplify integrations
Integration test plans prevent schedule slips
Smoke tests, increasing their value
Special tests
Load, performance, stress testing
Ongoing remote monitoring, reliability
Security, configurations, compatibility
Distribution and installation, localization
Maintainability, support, documentation
Usability, laboratories raising the bar

Why tests need to be designed
Appropriate use of exploratory testing
Exercise: Disciplined brainstorming
Checklists, ad hoc exploratory pros and cons
Data formats, data and process models
Exercise: Applying checklists
Business rules, decision tables and trees
Exercise: Create a decision table
Equivalence classes and boundary values
Exercise: Identify logical equivalence classes
Formal, informal Test Design Specifications
Exercise: Defining reusable test designs
Complex conditions, defect isolation
Test Cases for small risks
Test Case Specifications vs. test data values
Exercise: Writing test cases, script/matrix
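As a brief sketch of the equivalence-class and boundary-value techniques named above (the field, names, and range limits are illustrative assumptions, not course material): partition the input into classes that should behave identically, then test the values at and adjacent to each class boundary, where defects cluster.

```python
# Hypothetical input rule: an age field accepting 18..65 inclusive.
LOW, HIGH = 18, 65

def valid_age(age):
    return LOW <= age <= HIGH

# Three equivalence classes: below range, in range, above range.
# Boundary-value analysis tests the edges of each class.
boundary_cases = {
    LOW - 1:  False,  # just below lower bound -> invalid class
    LOW:      True,   # lower bound -> valid class
    HIGH:     True,   # upper bound -> valid class
    HIGH + 1: False,  # just above upper bound -> invalid class
}
for age, expected in boundary_cases.items():
    assert valid_age(age) is expected
print(f"{len(boundary_cases)} boundary cases pass")
```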

Maintenance vs. development, why it is so much harder
Improve attention and knowledge
Regression testing, minefield effect
Exercise: Testing maintenance changes
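One way to picture regression test selection for a maintenance change (the module and test names here are invented for illustration): trace which tests exercise the changed modules and re-run those first, since the "minefield" effect means defects cluster near recent changes.

```python
# Hypothetical coverage map: test -> modules it exercises.
coverage_map = {
    "test_checkout": {"cart", "pricing", "payment"},
    "test_reports":  {"reporting"},
    "test_login":    {"auth"},
}
changed_modules = {"pricing"}  # modules touched by the maintenance fix

# Select every test whose footprint intersects the change; these are
# the highest-yield regression tests to run first.
selected = sorted(t for t, mods in coverage_map.items()
                  if mods & changed_modules)
print(selected)
```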

Key test automation issues
Tools for a managed environment
Coverage analysis, execution aids
Test planning, design, administering
Automated test execution tools, issues
Scripting approaches, action words

What is a test case survey
Relevance for estimating test-based tasks
Traceability concepts and issues
Estimating non-test-based test project tasks
Defect reports that prompt suitable action
Determining defect age
Status reporting people pay attention to
Projecting when software is good enough
Defect density, reductions
Defect detection/removal percentages
Exercise: Measuring testing effectiveness

Course Director

Patrick von Schlag
Mr. von Schlag has more than 25 years of real-world experience managing IT and business organizations. He has served as a consultant, facilitator, and instructor in support of more than 200 ITSM program deployments, with a focus on practical benefits. He holds all 11 ITIL 2011 certifications and runs an accredited learning consultancy focused on Making ITIL Work™ in real organizations. His customer list includes The Walt Disney Company, Microsoft, Nike, Sears, US Marine Corps, US Army, US Air Force, 2nd and 5th Fleet US Navy, DISA, IRS, Federal Reserve, The Hartford, Citigroup, Amgen, Los Angeles County, Port of Long Beach, GDIT, Accenture, Serco, Deloitte, and hundreds of other market-leading companies.
