English for QA Engineers: Bug Reports, Test Plans, and Reviews
The specific English vocabulary, phrases, and document structures QA engineers use every day — from writing clear bug reports to communicating acceptance criteria and test results.
QA engineers produce a surprisingly large amount of written English every day: bug reports, test plans, test cases, acceptance criteria, review comments, and release sign-off communications. Each document type has its own structure, vocabulary, and expectations.
This guide covers the English language skills that are most important for QA engineers working in English-speaking or international teams.
Bug Reports: The QA Engineer’s Core Document
A bug report is a formal written record of a defect. Its purpose is to give a developer everything they need to reproduce, understand, and fix the problem — without any back-and-forth.
The essential structure
Every bug report should have these sections. The exact names vary by team, but the content is standard:
| Section | What goes here |
|---|---|
| Title / Summary | One-sentence description of the bug |
| Severity / Priority | How bad is it? How urgently must it be fixed? |
| Environment | OS, browser, device, app version, backend environment |
| Steps to Reproduce | Numbered steps from scratch to the bug |
| Expected Result | What should happen |
| Actual Result | What does happen |
| Attachments | Screenshot, screen recording, log file |
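The sections in the table above can also be mirrored as structured data, which is useful when scripting against an issue tracker. A minimal sketch in plain Python — the field names are illustrative, not any particular tracker's API:

```python
from dataclasses import dataclass, fields

@dataclass
class BugReport:
    """One field per standard bug-report section."""
    title: str
    severity: str
    environment: str
    steps_to_reproduce: list[str]
    expected_result: str
    actual_result: str
    attachments: list[str]

def missing_sections(report: BugReport) -> list[str]:
    """Return the names of any empty sections, so incomplete reports can be flagged."""
    return [f.name for f in fields(report) if not getattr(report, f.name)]

report = BugReport(
    title="[Auth] Login form freezes on Chrome 122",
    severity="Major",
    environment="Chrome 122, macOS 14, app v2.4.0",
    steps_to_reproduce=["Open https://app.example.com/login", "Enter a valid email"],
    expected_result="User is redirected to /dashboard",
    actual_result="Loading spinner continues indefinitely",
    attachments=[],  # empty, so it will be flagged as missing
)
print(missing_sections(report))  # ['attachments']
```

A completeness check like this is a cheap way to enforce the "no back-and-forth" goal: a report with an empty section bounces back before a developer ever sees it.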
Title vocabulary
The title is the first thing developers read. Make it specific and structured.
❌ “Login doesn’t work”
✅ “[Auth] Login form freezes for 3–5 seconds on Chrome 122 when email field contains special characters”
Start the title with the component or feature in brackets, followed by a precise description of the symptom.
Steps to Reproduce: numbered, imperative
Write steps in numbered list format using imperative verbs (commands):
1. Open the application at `https://app.example.com/login`
2. Enter any valid email address
3. Enter the password: `Test@1234`
4. Click Sign In
5. Observe the loading spinner
Note: each step describes one action. Do not combine multiple actions in one step. Do not assume the developer knows what “the login page” looks like — start from a URL.
Expected vs. Actual Result
This is the clearest way to describe a bug in English.
Expected: The user is redirected to the `/dashboard` page within 1 second.
Actual: The loading spinner continues indefinitely. No redirect occurs. No error message is displayed.
Use neutral, factual language. Avoid emotional words like “broken”, “terrible”, or “obviously wrong”.
Severity and Priority Vocabulary
QA engineers regularly need to communicate the severity (how bad is the impact?) and priority (how soon must it be fixed?) of bugs. These are not the same.
| Term | Definition |
|---|---|
| Critical / Blocker | The application crashes or a core workflow is completely broken. Blocks testing or release. |
| Major / High | Significant functionality is impaired, but a workaround exists. |
| Minor / Medium | The feature partially works. The bug has limited impact. |
| Trivial / Low | Cosmetic issue: typo, wrong color, minor layout misalignment. |
Use these terms consistently. Teams often have their own priority scale (P1–P4 or S1–S4), but the vocabulary above maps universally.
Example sentences:
“This is a blocker — users cannot complete the checkout flow.”
“I’m logging this as P2. It’s significant but there’s a workaround via the API.”
“Setting this to Trivial — it’s a display issue on an older iOS version that affects fewer than 2% of users.”
Test Plans and Test Cases
A test plan is a document that describes the scope, approach, and schedule of testing for a feature or release. A test case is a single documented test scenario.
Test plan vocabulary
“This test plan covers the authentication module, including login, registration, password reset, and session management.”
“Out of scope: third-party OAuth providers (covered separately in the integration test suite).”
“Testing environment: staging server (`staging.example.com`) with a dedicated test database seeded with fixture data.”
“Pass/Fail criteria: all P1 and P2 test cases must pass. P3 failures will be logged and triaged in the next sprint.”
Key terms:
- in scope / out of scope — what is and is not covered
- pass criteria / exit criteria — conditions for completing testing
- smoke test — a minimal test to check that the build is not completely broken
- regression test — testing that previously working features have not broken
- edge case — an unusual or boundary input condition
- happy path — the main, expected flow with valid inputs
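Two of these terms — *happy path* and *edge case* — map directly onto how test inputs are chosen. A minimal sketch in plain Python, using a hypothetical `apply_discount` function invented for illustration:

```python
def apply_discount(price: float, percent: float) -> float:
    """Apply a percentage discount; reject out-of-range inputs."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Happy path: the main, expected flow with valid inputs
assert apply_discount(100.0, 20) == 80.0

# Edge cases: boundary input conditions
assert apply_discount(100.0, 0) == 100.0    # lower boundary
assert apply_discount(100.0, 100) == 0.0    # upper boundary

# Just past the boundary, the input should be rejected
try:
    apply_discount(100.0, 101)
except ValueError:
    print("edge case correctly rejected")
```

A test plan usually names both explicitly: "Happy path covered by TC-01; boundary edge cases (0%, 100%) covered by TC-02 and TC-03."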
Writing test cases
A test case has three required parts:
Precondition: The state before the test starts.
Steps: What the tester does.
Expected result: What should happen.
Example:
Test case: Password reset email is sent for registered email
Precondition: A user with email `test@example.com` exists in the database.
Steps:
1. Navigate to `/forgot-password`
2. Enter `test@example.com` in the Email field
3. Click Send Reset Link
Expected result: A success message is shown. The user receives a password reset email within 2 minutes.
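The same three-part structure carries over into automated tests: the precondition becomes setup, the steps become the action, and the expected result becomes assertions. A sketch using a hypothetical in-memory user store and a fake mailer (no real email is sent; all names here are invented for illustration):

```python
class FakeMailer:
    """Test double that records sent reset links instead of emailing."""
    def __init__(self):
        self.sent = []

    def send_reset_link(self, email: str):
        self.sent.append(email)

def request_password_reset(email: str, users: set[str], mailer: FakeMailer) -> str:
    """Send a reset email only if the address is registered; always show success."""
    if email in users:
        mailer.send_reset_link(email)
    return "If the address is registered, a reset link has been sent."

# Precondition: a user with email test@example.com exists
users = {"test@example.com"}
mailer = FakeMailer()

# Steps: submit the registered address
message = request_password_reset("test@example.com", users, mailer)

# Expected result: success message shown, reset email queued
assert "reset link" in message
assert mailer.sent == ["test@example.com"]
```

Keeping the Precondition / Steps / Expected result comments in the test body makes the automated test readable to anyone who knows the manual test case format.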
Acceptance Criteria Language
Acceptance criteria (AC) define the conditions that a user story or feature must meet to be considered done. They are written as Given/When/Then statements (BDD format) or as a numbered list.
Given/When/Then format
This is the most common format in Agile teams:
Given the user is not authenticated
When they attempt to access `/dashboard`
Then they are redirected to `/login` with a 302 status
Given the user enters an invalid email format
When they submit the registration form
Then an inline error message reads: “Please enter a valid email address” and the form is not submitted
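Given/When/Then statements translate naturally into test code: each clause becomes a phase of the test. A sketch of the first AC above, with a hypothetical `access` function standing in for the real route handler:

```python
def access(path: str, authenticated: bool) -> tuple[int, str]:
    """Return (status, location) for a request to a protected route."""
    if path == "/dashboard" and not authenticated:
        return 302, "/login"
    return 200, path

def test_unauthenticated_redirect():
    # Given the user is not authenticated
    authenticated = False
    # When they attempt to access /dashboard
    status, location = access("/dashboard", authenticated)
    # Then they are redirected to /login with a 302 status
    assert status == 302
    assert location == "/login"

test_unauthenticated_redirect()
print("AC verified")
```

When AC are written in this format from the start, the English document and the automated test read almost line for line the same — which is the point of the BDD convention.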
Plain-English checklist format
Some teams prefer a simple list:
- ✅ Unauthenticated users cannot access protected routes
- ✅ Redirects preserve the originally requested URL in a `?returnTo=` query parameter
- ✅ The session cookie uses the `HttpOnly` and `Secure` flags
Useful phrases for writing AC
“The system must / shall…” — formal, often used in specifications
“The user should be able to…” — describes user capability
“In the event that…” — error/edge case handling
“If [condition], then [result]” — clear conditional statement
“The feature is considered done when…” — defines completion
Test Result Communication
At the end of a testing cycle, QA engineers communicate results to the team and stakeholders.
Release sign-off language
“Testing complete. All P1 and P2 test cases pass. Three P3 issues logged (#312, #314, #315) — none blocking release. Sign-off granted for the `v2.4.0` release.”
“I am unable to sign off on this release. The checkout flow has a blocker (see bug #318). This must be resolved before shipping.”
“Partial sign-off: core functionality tested and passing. The mobile payment flow is still in progress — I’ll complete it by EOD.”
Reporting test coverage
“Coverage: 45 of 48 planned test cases executed. 2 tests skipped due to environment issue (DB seeding failed). 1 test blocked pending a fix from dev.”
“42 passed, 3 failed, 2 skipped. Test report available in Confluence.”
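A summary sentence like the one above can be generated mechanically from raw outcomes. A minimal sketch, assuming each test result is recorded simply as `passed`, `failed`, or `skipped`:

```python
from collections import Counter

def summarize(results: list[str]) -> str:
    """Format raw outcomes into the standard 'X passed, Y failed, Z skipped' sentence."""
    counts = Counter(results)
    return (f"{counts['passed']} passed, "
            f"{counts['failed']} failed, "
            f"{counts['skipped']} skipped.")

results = ["passed"] * 42 + ["failed"] * 3 + ["skipped"] * 2
print(summarize(results))  # 42 passed, 3 failed, 2 skipped.
```

Test runners such as pytest already print a line in this shape; mirroring the same wording in your written reports keeps the English and the tooling output consistent.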
Daily QA Communication Phrases
Asking for clarification on requirements:
“The acceptance criteria don’t specify the error state for an expired token. Can you clarify the expected behaviour?”
“Is this tested behaviour or undefined? If it’s undefined, I’d recommend we define it before testing starts.”
Communicating progress:
“I’ve completed smoke testing on the build. No critical issues. Moving on to regression testing now.”
“I’m about 60% through the test plan. I expect to finish by end of tomorrow.”
Flagging risk:
“There’s a risk here — we don’t have test data for edge cases in the payment processor. I’d like to delay sign-off until that’s addressed.”
“This area of the codebase has changed significantly. I’d recommend a full regression pass, not just spot checks.”
Key QA Vocabulary Reference
| Term | Meaning |
|---|---|
| regression | Testing that old functionality still works after changes |
| blocker | A bug so severe it stops testing or release |
| flaky test | A test that sometimes passes and sometimes fails for no obvious reason |
| false positive / false negative | Test passes when it should fail / fails when it should pass |
| test harness | The infrastructure that runs automated tests |
| fixture / seed data | Pre-populated test data used to set up test scenarios |
| smoke test | Quick check to verify the build is minimally functional |
| sanity check | Informal term for a quick verification |
| exploratory testing | Unscripted testing — using the application to discover unexpected issues |
| repro steps | Short for “reproduction steps” — how to recreate the bug |
| intermittent / non-deterministic | A bug that only happens sometimes, not consistently |
| root cause | The underlying reason a bug occurred |
| workaround | A way to avoid the bug without fixing it |
QA engineers are often the last line of defence before users encounter a bug. The quality of your written communication — how clearly you describe a bug, how precisely you write AC, how professionally you communicate a sign-off — directly impacts how fast your team can move. Language precision in QA is not a nice-to-have; it is the job.