⚙️ Process & Workflows — Testing Toolkit
Test Lifecycle
ITIL 4 Service Validation & Testing Lifecycle
1. 📋 Test Planning
2. ✏️ Test Design
3. 🏗️ Test Environment Setup
4. ▶️ Test Execution
5. 🐛 Defect Management (decision point)
6. 🔄 Regression Testing
7. ✅ Test Completion & Sign-Off
UAT (User Acceptance Testing) Process
UAT Entry Criteria
- All functional and integration tests passed
- No open Critical or High defects
- Test environment is stable and production-equivalent
- Business test scenarios have been reviewed and agreed
- Business testers have been trained
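The entry criteria above can be sketched as an automated gate check; the status keys and data source here are assumptions for illustration, not part of any real tool.

```python
# Hypothetical UAT entry-gate check: every criterion must hold before UAT starts.
# The status-dict keys are illustrative assumptions.

def uat_entry_gate(status: dict) -> tuple[bool, list[str]]:
    """Return (ready, blockers) for the UAT entry criteria."""
    criteria = {
        "functional and integration tests passed": status.get("tests_passed", False),
        "no open Critical/High defects": status.get("open_crit_high_defects", 1) == 0,
        "environment stable and production-equivalent": status.get("env_ready", False),
        "business scenarios reviewed and agreed": status.get("scenarios_agreed", False),
        "business testers trained": status.get("testers_trained", False),
    }
    blockers = [name for name, ok in criteria.items() if not ok]
    return (not blockers, blockers)

ready, blockers = uat_entry_gate({
    "tests_passed": True,
    "open_crit_high_defects": 0,
    "env_ready": True,
    "scenarios_agreed": True,
    "testers_trained": False,
})
print(ready, blockers)  # prints: False ['business testers trained']
```

A gate like this can run as a pre-UAT pipeline step, failing the build until all blockers are cleared.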
UAT Execution Flow
- Kickoff meeting: Scope, timelines, roles, test scenarios reviewed
- Environment access: Business testers given access to UAT environment
- Scenario execution: Business testers execute real-world business scenarios
- Defect reporting: Business users log issues (with screenshots) in ITSM/Jira
- Fix and retest: Developer fixes; business retests within UAT cycle
- Sign-off: Business sponsor confirms acceptance; signs UAT sign-off document
UAT Exit Criteria
- All critical business scenarios tested and passed
- No Critical defects outstanding
- Agreed High defects have approved workarounds or resolution plans
- Business stakeholder sign-off obtained in writing
Defect Severity Classification
| Severity | Definition | Example |
|---|---|---|
| Critical | System crash or data loss; no workaround | Login fails for all users |
| High | Major functionality broken; workaround exists but complex | Report export fails |
| Medium | Minor functionality affected; simple workaround | Incorrect date format in UI |
| Low | Cosmetic issue; no functional impact | Typo in label, misaligned button |
Defect Report Fields
| Field | Content |
|---|---|
| ID | BUG-001 |
| Summary | Login fails with valid credentials on Chrome 124 |
| Severity | Critical |
| Priority | P1 |
| Environment | UAT |
| Build version | v3.2.1 |
| Steps to reproduce | 1. Navigate to /login 2. Enter valid credentials 3. Click login |
| Expected result | User redirected to dashboard |
| Actual result | Error 500 displayed |
| Screenshot | [attached] |
| Reporter | QA Engineer |
| Assignee | Backend Team |
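The report fields above map naturally onto a structured record; this dataclass sketch uses assumed field names, populated with the sample values from the table.

```python
from dataclasses import dataclass, field

# The defect report fields as a structured record (field names are assumptions).
@dataclass
class DefectReport:
    id: str                       # e.g. BUG-001
    summary: str
    severity: str                 # Critical / High / Medium / Low
    priority: str                 # e.g. P1
    environment: str              # e.g. UAT
    build_version: str
    steps_to_reproduce: list[str]
    expected_result: str
    actual_result: str
    reporter: str
    assignee: str
    screenshots: list[str] = field(default_factory=list)

bug = DefectReport(
    id="BUG-001",
    summary="Login fails with valid credentials on Chrome 124",
    severity="Critical", priority="P1", environment="UAT",
    build_version="v3.2.1",
    steps_to_reproduce=["Navigate to /login", "Enter valid credentials", "Click login"],
    expected_result="User redirected to dashboard",
    actual_result="Error 500 displayed",
    reporter="QA Engineer", assignee="Backend Team",
)
```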
CI/CD Test Integration
Modern, ITIL 4-aligned testing is embedded directly in CI/CD pipelines:
```yaml
# Example CI/CD pipeline with test gates (illustrative syntax;
# adapt to your runner, e.g. GitHub Actions or GitLab CI)
pipeline:
  stages:
    - lint_and_unit_tests:       # block merge if fails
        coverage_threshold: 80%
    - integration_tests:         # block merge if fails
        environment: integration
    - security_scan_SAST:        # block if Critical/High findings
        tool: SonarQube / Snyk
    - deploy_to_staging:
        approval: auto (if tests pass)
    - performance_smoke_test:    # block if p95 > baseline × 1.2
        tool: k6
    - DAST_security_scan:        # block if OWASP Critical findings
        tool: OWASP ZAP
    - UAT_gate:                  # manual approval required
        approvers: [product_owner, qa_manager]
    - deploy_to_production:
        change_ticket: auto-created
```

Performance Testing Approach
Test Types
| Type | Objective | Method |
|---|---|---|
| Load Test | Validate behaviour at expected peak load | Ramp to expected concurrent users |
| Stress Test | Find the breaking point | Increase load until system fails |
| Soak Test | Detect memory leaks over time | Sustained load for 4–8 hours |
| Spike Test | Test sudden traffic increases | Instantaneous 10× load spike |
Performance Acceptance Criteria (sample)
| Metric | Threshold |
|---|---|
| Response time p50 | < 500ms |
| Response time p95 | < 2000ms |
| Error rate under load | < 1% |
| System availability | > 99.9% during test |
| Memory stability (no leak) | Heap growth < 10% over 4h soak |
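The response-time and error-rate thresholds above can be checked mechanically from raw load-test samples; this sketch uses the nearest-rank percentile method and hard-codes the table's thresholds (the function names are assumptions).

```python
import math

# Sketch: evaluate load-test samples against the acceptance thresholds above.
def percentile(samples_ms: list[float], p: float) -> float:
    """Nearest-rank percentile over response-time samples (in milliseconds)."""
    s = sorted(samples_ms)
    rank = max(1, math.ceil(p / 100 * len(s)))
    return s[rank - 1]

def meets_criteria(samples_ms: list[float], errors: int, total: int) -> bool:
    """p50 < 500 ms, p95 < 2000 ms, error rate < 1% (from the table)."""
    return (percentile(samples_ms, 50) < 500
            and percentile(samples_ms, 95) < 2000
            and errors / total < 0.01)
```

Tools such as k6 report these percentiles directly; a standalone check like this is useful when post-processing raw result exports.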
KPIs
| Metric | Target |
|---|---|
| Test coverage (requirements) | > 90% |
| UAT defect escape rate | < 5% |
| Critical defects in production (post-release) | 0 |
| Automated regression coverage | > 70% |
| Test cycle time (change to UAT sign-off) | < 5 business days |
| Defect fix turnaround (Critical) | < 24 hours |
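For the UAT defect escape rate KPI, one common definition (assumed here, since the table does not spell it out) is production-found defects divided by all defects found in UAT and production combined:

```python
# Assumed definition: escape rate = defects escaping to production
# / (defects found in UAT + defects escaping to production).
def uat_defect_escape_rate(found_in_uat: int, escaped_to_prod: int) -> float:
    total = found_in_uat + escaped_to_prod
    return 0.0 if total == 0 else escaped_to_prod / total

rate = uat_defect_escape_rate(found_in_uat=48, escaped_to_prod=2)
print(f"{rate:.1%}")  # prints: 4.0% — under the < 5% target
```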
Downloadable Resources
| Resource | Format |
|---|---|
| Bug Tracking Register | Excel |
| Weekly Test Report | Excel |