TestNG Tutorial
Master TestNG - the most popular Java testing framework used with Selenium. Learn every annotation, assertion, DataProvider, parallel execution, and listener concept with real-world examples.
Why was TestNG created? — JUnit's limitations:

TestNG ("NG" for Next Generation) was created by Cédric Beust because JUnit 3/4 fell short for large test suites: only method-level @Before/@After hooks, no native parallel execution, no test grouping, no dependencies between tests, and no built-in data-driven testing or HTML reporting. TestNG was designed to provide all of these out of the box.
Key advantages of TestNG:
- ✓ Flexible annotations — Rich set of @Before/@After annotations at suite, test, class, and method levels, giving precise control over setup and teardown logic.
- ✓ Built-in reporting — TestNG generates HTML and XML reports automatically, with no extra setup. Reports show passed, failed, and skipped test counts with execution time.
- ✓ Data-driven testing — @DataProvider lets you run the same test method with multiple input sets (e.g., test login with 10 different username/password combinations).
- ✓ Parallel execution — Run tests simultaneously across multiple browsers, devices, or threads, massively reducing execution time for large test suites.
- ✓ Re-run failed tests — After a run, TestNG automatically generates a testng-failed.xml file, so you can re-run only the failed tests without changing any code.
- ✓ Seamless tool integration — Works with Maven, Gradle, Eclipse, IntelliJ IDEA, Jenkins, and all major CI/CD tools out of the box.
Method 1 — Maven (Recommended for professional projects):
```xml
<dependencies>
  <!-- TestNG dependency — always use the latest stable version -->
  <dependency>
    <groupId>org.testng</groupId>
    <artifactId>testng</artifactId>
    <version>7.9.0</version> <!-- latest stable as of 2024 -->
    <scope>test</scope>      <!-- scope=test: only needed for testing -->
  </dependency>
</dependencies>

<!-- Configure the Maven Surefire Plugin to use testng.xml -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <version>3.2.5</version>
      <configuration>
        <suiteXmlFiles>
          <suiteXmlFile>testng.xml</suiteXmlFile>
        </suiteXmlFiles>
      </configuration>
    </plugin>
  </plugins>
</build>
```
Method 2 — Install TestNG in Eclipse IDE:
- 1. Eclipse: Help → Eclipse Marketplace → search for "TestNG" → click "Install TestNG for Eclipse" → restart Eclipse. This adds the TestNG plugin so you can right-click and run TestNG tests directly from the IDE.
- 2. IntelliJ IDEA: TestNG support is bundled by default. Just add the Maven dependency and you're ready to go — no additional plugin needed.
Your first TestNG test class — structure:
```java
import org.testng.Assert;
import org.testng.annotations.Test;

public class FirstTestNGTest {

    @Test
    public void verifyGoogleTitle() {
        // Real-world example: verify a string equals the expected value
        String actualTitle = "Google";
        Assert.assertEquals(actualTitle, "Google");
        System.out.println("Test passed! Title is: " + actualTitle);
    }

    @Test
    public void verifyMathOperation() {
        int result = 10 + 5;
        Assert.assertEquals(result, 15, "Math result should be 15");
    }
}
```
All TestNG annotations with real-world purpose:

- @BeforeSuite / @AfterSuite — run once before/after the entire suite (e.g., start and stop a reporting service or Selenium Grid).
- @BeforeTest / @AfterTest — run before/after each <test> tag in testng.xml (e.g., per-browser setup in cross-browser runs).
- @BeforeClass / @AfterClass — run once per test class (e.g., launch and quit the browser).
- @BeforeMethod / @AfterMethod — run before/after every @Test method (e.g., navigate to the start page, clear cookies).
- @Test — marks a method as a test case.
- @DataProvider — supplies multiple data sets to a test method.
- @Parameters — injects values from testng.xml into test methods.
- @Listeners — attaches listener classes for reporting and screenshots.
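The before/after annotations form a nesting hierarchy. A minimal skeleton class showing every level (a sketch for illustration: the methods only print, and a TestNG runner is assumed — it is not a standalone program):

```java
import org.testng.annotations.*;

public class AnnotationHierarchy {

    @BeforeSuite  public void beforeSuite()  { System.out.println("@BeforeSuite  — once per <suite> in testng.xml"); }
    @BeforeTest   public void beforeTest()   { System.out.println("@BeforeTest   — once per <test> tag"); }
    @BeforeClass  public void beforeClass()  { System.out.println("@BeforeClass  — once per test class"); }
    @BeforeMethod public void beforeMethod() { System.out.println("@BeforeMethod — before EVERY @Test"); }

    @Test public void someTest() { System.out.println("@Test — the actual test case"); }

    @AfterMethod  public void afterMethod()  { System.out.println("@AfterMethod  — after EVERY @Test"); }
    @AfterClass   public void afterClass()   { System.out.println("@AfterClass   — once per test class"); }
    @AfterTest    public void afterTest()    { System.out.println("@AfterTest    — once per <test> tag"); }
    @AfterSuite   public void afterSuite()   { System.out.println("@AfterSuite   — once per <suite>"); }
}
```

Running this class under TestNG prints the lines top to bottom, making the nesting visible at a glance.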
Complete working example with all key annotations:
```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.*;

public class LoginTest {

    private WebDriver driver;

    @BeforeClass
    public void setUp() {
        // Runs ONCE before all tests in this class
        // Real-world: initialize WebDriver
        driver = new ChromeDriver();
        driver.manage().window().maximize();
        System.out.println("Browser launched");
    }

    @BeforeMethod
    public void navigateToLoginPage() {
        // Runs BEFORE every @Test — fresh start for each test
        driver.get("https://example.com/login");
        System.out.println("Navigated to login page");
    }

    @Test
    public void validLogin() {
        // Test Case 1: valid login
        System.out.println("Running: validLogin");
        // ... Selenium steps here
        Assert.assertTrue(true, "Login should succeed");
    }

    @Test
    public void invalidPasswordLogin() {
        // Test Case 2: wrong password
        System.out.println("Running: invalidPasswordLogin");
        Assert.assertTrue(true, "Should show error message");
    }

    @AfterMethod
    public void clearCookies() {
        // Runs AFTER every @Test — clean state for the next test
        driver.manage().deleteAllCookies();
        System.out.println("Cookies cleared");
    }

    @AfterClass
    public void tearDown() {
        // Runs ONCE after all tests finish — close the browser
        if (driver != null) {
            driver.quit();
            System.out.println("Browser closed");
        }
    }
}
```
Complete execution order (top = first, bottom = last):

@BeforeSuite → @BeforeTest → @BeforeClass → @BeforeMethod → @Test → @AfterMethod → @AfterClass → @AfterTest → @AfterSuite

The @BeforeMethod/@AfterMethod pair repeats around every @Test; all other annotations run once at their own level.
Real-world execution trace — what you'd see in console output:
```java
// Class has: setUp (@BeforeClass), navigateTo (@BeforeMethod),
// testLogin + testSearch (@Test x2), clearCookies (@AfterMethod), tearDown (@AfterClass)

setUp()         // @BeforeClass  — runs ONCE
navigateTo()    // @BeforeMethod — runs before testLogin
testLogin()     // @Test #1
clearCookies()  // @AfterMethod  — runs after testLogin
navigateTo()    // @BeforeMethod — runs before testSearch
testSearch()    // @Test #2
clearCookies()  // @AfterMethod  — runs after testSearch
tearDown()      // @AfterClass   — runs ONCE after both tests
```
All important @Test attributes with examples:
```java
import org.openqa.selenium.WebDriver;
import org.testng.annotations.Test;

public class TestAttributes {

    private WebDriver driver; // assumed to be initialized in a @BeforeClass (not shown)

    // ── priority: controls execution order (lower = earlier) ──────────
    @Test(priority = 1)
    public void testLogin() {
        // Runs FIRST — priority 1 < priority 2
        System.out.println("Step 1: Login");
    }

    @Test(priority = 2)
    public void testAddToCart() {
        // Runs SECOND — depends logically on login
        System.out.println("Step 2: Add to Cart");
    }

    @Test(priority = 3)
    public void testCheckout() {
        System.out.println("Step 3: Checkout");
    }

    // ── enabled: skip a test without deleting it ──────────────────────
    @Test(enabled = false)
    public void testFeatureUnderDevelopment() {
        // This test is SKIPPED — use when a feature is not yet ready.
        // Real-world: feature X is in dev; the test is written but not run yet.
    }

    // ── timeOut: fail the test if it takes too long (ms) ──────────────
    @Test(timeOut = 5000)
    public void testPageLoadSpeed() {
        // If this test takes more than 5000 ms (5 seconds), it FAILS.
        // Real-world: performance assertion — the page must load under 5 seconds.
        driver.get("https://example.com");
    }

    // ── invocationCount: run the same test multiple times ─────────────
    @Test(invocationCount = 3)
    public void testFlickerCheck() {
        // Runs this test 3 times — useful for detecting flaky tests
        // that pass sometimes and fail other times.
        System.out.println("Flicker test run: " + Thread.currentThread().getId());
    }

    // ── groups: tag a test for selective execution ────────────────────
    @Test(groups = {"smoke", "regression"})
    public void testHomePage() {
        // Belongs to BOTH groups — runs when either group is executed.
        // Real-world: quick smoke tests run on every deploy.
    }

    // ── description: document your test ───────────────────────────────
    @Test(description = "Verifies user can login with valid email and password")
    public void testValidLogin() {
        // The description appears in the TestNG HTML report — very useful for teams.
    }

    // ── expectedExceptions: assert that an exception IS thrown ────────
    @Test(expectedExceptions = ArithmeticException.class)
    public void testDivisionByZero() {
        // PASSES if ArithmeticException is thrown, FAILS if it is not.
        // Real-world: validate that invalid input throws the proper exception.
        int result = 10 / 0; // throws ArithmeticException — test passes!
    }
}
```
Hard Assert — Stops test immediately on failure:
```java
import org.testng.Assert;
import org.testng.annotations.Test;

public class HardAssertDemo {

    @Test
    public void testHardAssert() {
        // HARD ASSERT — fails immediately; remaining assertions are NOT checked
        Assert.assertEquals("Google", "Google"); // ✅ PASSES
        Assert.assertEquals(10, 20);             // ❌ FAILS HERE — test stops
        Assert.assertTrue(true);                 // NEVER REACHED
        // Result: test FAILED at the second assertion; the third is never checked.
    }

    @Test
    public void testCommonAssertions() {
        // Reference of common assertions (illustrative — actual, expected,
        // driver, errorMsg, and errorElement are placeholders, not real fields)

        // assertEquals — checks that actual equals expected
        Assert.assertEquals(actual, expected, "Failure message");

        // assertTrue — checks that a condition is true
        Assert.assertTrue(driver.getTitle().contains("Google"));

        // assertFalse — checks that a condition is false
        Assert.assertFalse(errorMsg.isDisplayed(), "Error should not be visible");

        // assertNotNull — checks that an object is not null
        Assert.assertNotNull(driver.findElement(By.id("logo")));

        // assertNull — checks that an object IS null
        Assert.assertNull(errorElement);

        // fail — explicitly fail a test with a message
        Assert.fail("Feature not implemented yet");
    }
}
```
Soft Assert — Continues test even after failure, reports all failures at end:
```java
import org.testng.annotations.Test;
import org.testng.asserts.SoftAssert;

public class SoftAssertDemo {

    @Test
    public void testProfilePage() {
        // SOFT ASSERT — collects all failures and reports them ALL at the end.
        // IMPORTANT: create a new SoftAssert object for each test method.
        SoftAssert softAssert = new SoftAssert();

        // Real-world: verify all fields on a user profile page
        // (getUsername(), getEmail(), etc. are page-object helpers, not shown)
        softAssert.assertEquals(getUsername(), "priya");        // ✅ passes — continues
        softAssert.assertEquals(getEmail(), "wrong@email.com"); // ❌ fails — continues!
        softAssert.assertTrue(isProfilePicVisible());           // ✅ passes — continues
        softAssert.assertEquals(getPhone(), "wrong-phone");     // ❌ fails — continues!

        // CRITICAL: must call assertAll() — this triggers the actual failure report.
        // Without assertAll(), the test PASSES even with failed assertions!
        softAssert.assertAll();
        // Result: test FAILED — shows BOTH failures (email + phone) in one report
    }
}
```
Complete testng.xml example — all major elements explained:
```xml
<!-- DOCTYPE tells TestNG where to find the DTD -->
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">

<!-- parallel: none | methods | classes | tests; thread-count: 3 threads run simultaneously -->
<suite name="E-Commerce Test Suite" parallel="methods" thread-count="3">

  <!-- Suite-level parameters — available to ALL tests -->
  <parameter name="browser" value="chrome"/>
  <parameter name="baseUrl" value="https://shop.example.com"/>

  <!-- Test Group 1: Smoke Tests — run on every deployment -->
  <test name="Smoke Tests">
    <groups>
      <run>
        <include name="smoke"/> <!-- only run @Test(groups = "smoke") -->
      </run>
    </groups>
    <classes>
      <class name="com.tests.LoginTest"/>
      <class name="com.tests.HomePageTest"/>
    </classes>
  </test>

  <!-- Test Group 2: Regression Tests — run nightly -->
  <test name="Regression Tests">
    <classes>
      <class name="com.tests.LoginTest">
        <!-- Run only specific methods from this class -->
        <methods>
          <include name="testValidLogin"/>
          <include name="testInvalidLogin"/>
          <exclude name="testRememberMe"/> <!-- skip this one -->
        </methods>
      </class>
      <class name="com.tests.CartTest"/>
      <class name="com.tests.CheckoutTest"/>
      <class name="com.tests.PaymentTest"/>
    </classes>
  </test>
</suite>
```
Using @Parameters to receive XML values in test methods:
```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Optional;
import org.testng.annotations.Parameters;
import org.testng.annotations.Test;

public class CrossBrowserTest {

    private WebDriver driver;

    // @Parameters reads the value from the testng.xml <parameter> tag
    @Parameters("browser")
    @BeforeClass
    public void setUp(String browser) {
        // browser = "chrome" — value comes from testng.xml
        if (browser.equals("chrome")) {
            driver = new ChromeDriver();
        } else if (browser.equals("firefox")) {
            driver = new FirefoxDriver();
        }
    }

    // @Optional provides a default if the parameter is not in the XML
    @Parameters("environment")
    @Test
    public void testLoginPage(@Optional("staging") String env) {
        // If "environment" is not in the XML, env defaults to "staging"
        System.out.println("Running on: " + env);
    }
}
```
Groups — tag tests for selective execution:
```java
import org.testng.annotations.Test;

public class ECommerceTests {

    // ── SMOKE tests: run on every code deploy (fast, critical only) ───
    @Test(groups = "smoke")
    public void testHomePageLoads() { /* ... */ }

    @Test(groups = "smoke")
    public void testLoginPageLoads() { /* ... */ }

    // ── REGRESSION: full suite — run nightly ──────────────────────────
    @Test(groups = {"smoke", "regression"}) // belongs to BOTH groups
    public void testUserLogin() { /* ... */ }

    @Test(groups = "regression")
    public void testForgotPassword() { /* ... */ }

    // ── PAYMENT: run only before payment module changes ───────────────
    @Test(groups = {"regression", "payment"})
    public void testCreditCardPayment() { /* ... */ }

    @Test(groups = "payment")
    public void testUPIPayment() { /* ... */ }
}
```
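To run only one of these groups, the <groups> element in testng.xml selects tests by tag. A minimal sketch (the com.tests package name is an assumption):

```xml
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Smoke Only">
  <test name="Smoke Run">
    <groups>
      <run>
        <include name="smoke"/>   <!-- run tests tagged "smoke" -->
        <exclude name="payment"/> <!-- and explicitly skip payment tests -->
      </run>
    </groups>
    <classes>
      <class name="com.tests.ECommerceTests"/>
    </classes>
  </test>
</suite>
```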
Dependencies — define test execution order based on logic:
```java
import org.testng.annotations.Test;

public class OrderFlowTest {

    // STEP 1: Login — no dependency
    @Test(groups = "flow")
    public void testLogin() {
        System.out.println("Logging in...");
        // If this FAILS → testAddToCart is SKIPPED (hard dependency)
    }

    // STEP 2: Runs only AFTER testLogin SUCCEEDS
    @Test(dependsOnMethods = {"testLogin"}, groups = "flow")
    public void testAddToCart() {
        System.out.println("Adding product to cart...");
        // If testLogin failed → this is SKIPPED automatically
    }

    // STEP 3: Depends on multiple methods
    @Test(dependsOnMethods = {"testLogin", "testAddToCart"})
    public void testCheckout() {
        System.out.println("Checking out...");
        // Runs only if BOTH testLogin AND testAddToCart passed
    }

    // ── Soft dependency — always runs, even if the dependency failed ──
    @Test(dependsOnMethods = {"testLogin"}, alwaysRun = true)
    public void testLogout() {
        // alwaysRun = true → soft dependency:
        // runs after testLogin regardless of pass or fail.
        // Real-world: cleanup/logout must always happen.
        System.out.println("Logging out...");
    }

    // ── dependsOnGroups — depend on an entire group ───────────────────
    @Test(dependsOnGroups = {"flow"})
    public void testOrderHistory() {
        // Runs only after ALL tests in the "flow" group pass
        System.out.println("Checking order history...");
    }
}
```
Basic DataProvider — login test with multiple users:
```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.testng.Assert;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class LoginDataDrivenTest {

    private WebDriver driver; // assumed to be initialized in a @BeforeClass (not shown)

    // ── Define the DataProvider ───────────────────────────────────────
    @DataProvider(name = "loginCredentials")
    public Object[][] getLoginData() {
        // Object[][] = 2D array; each inner array = one set of parameters
        // for one test run.
        return new Object[][] {
            { "priya@gmail.com", "Pass@123", "Dashboard" },            // valid
            { "priya@gmail.com", "wrongpass", "Invalid credentials" }, // wrong pwd
            { "nouser@gmail.com", "Pass@123", "User not found" },      // unknown user
            { "", "", "Email is required" },                           // empty
            { "admin@company.com", "Admin@456", "Admin Panel" }        // admin user
        };
    }

    // ── Use the DataProvider in @Test ─────────────────────────────────
    @Test(dataProvider = "loginCredentials")
    public void testLogin(String email, String password, String expectedText) {
        // This method runs 5 times — once per row in the DataProvider
        System.out.println("Testing with: " + email + " / " + password);
        driver.findElement(By.id("email")).sendKeys(email);
        driver.findElement(By.id("password")).sendKeys(password);
        driver.findElement(By.id("loginBtn")).click();

        // Verify the expected text appears on the page
        Assert.assertTrue(
            driver.getPageSource().contains(expectedText),
            "Expected: " + expectedText + " for user: " + email
        );
    }
}
```
DataProvider in a separate class — cleaner project structure:
```java
// ── File 1: TestData.java (separate data class) ───────────────────
import org.testng.annotations.DataProvider;

public class TestData {
    // Must be static when used from another class
    @DataProvider(name = "registrationData")
    public static Object[][] getRegistrationData() {
        return new Object[][] {
            { "Priya Sharma", "priya@gmail.com", "9876543210", "valid" },
            { "Rahul Mehta", "rahul@yahoo.com", "9988776655", "valid" },
            { "", "invalid-email", "1234", "invalid" }
        };
    }
}

// ── File 2: RegistrationTest.java — use the DataProvider from the other class
import org.testng.annotations.Test;

public class RegistrationTest {
    // dataProviderClass points to the class containing the DataProvider
    @Test(dataProvider = "registrationData", dataProviderClass = TestData.class)
    public void testRegistration(String name, String email, String phone, String type) {
        System.out.println("Registering: " + name + " (" + type + ")");
        // Real-world: test a registration form with multiple user profiles
    }
}
```
4 parallel modes in testng.xml:

- parallel="none" — the default; all tests run sequentially in a single thread.
- parallel="methods" — each @Test method runs in its own thread.
- parallel="classes" — each test class runs in its own thread (methods within a class stay sequential).
- parallel="tests" — each <test> tag in the suite runs in its own thread (the usual choice for cross-browser runs).
Cross-browser parallel testing — the most common real-world use:
```xml
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">

<!-- parallel="tests" + thread-count="3" = 3 browsers run simultaneously -->
<suite name="Cross Browser Suite" parallel="tests" thread-count="3">

  <!-- Chrome tests -->
  <test name="Chrome Tests">
    <parameter name="browser" value="chrome"/>
    <classes><class name="com.tests.LoginTest"/></classes>
  </test>

  <!-- Firefox tests — run in parallel with Chrome -->
  <test name="Firefox Tests">
    <parameter name="browser" value="firefox"/>
    <classes><class name="com.tests.LoginTest"/></classes>
  </test>

  <!-- Edge tests — also run in parallel -->
  <test name="Edge Tests">
    <parameter name="browser" value="edge"/>
    <classes><class name="com.tests.LoginTest"/></classes>
  </test>
</suite>
```
Thread-safe WebDriver — critical for parallel execution:
```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.edge.EdgeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Parameters;

public class BaseTest {

    // ThreadLocal ensures each thread has its OWN WebDriver instance.
    // Without ThreadLocal, parallel threads would share one browser — BUGS!
    private static ThreadLocal<WebDriver> driverThread = new ThreadLocal<>();

    @Parameters("browser")
    @BeforeMethod
    public void setUp(String browser) {
        WebDriver driver;
        if (browser.equals("chrome")) {
            driver = new ChromeDriver();
        } else if (browser.equals("firefox")) {
            driver = new FirefoxDriver();
        } else {
            driver = new EdgeDriver();
        }
        driverThread.set(driver); // store in the current thread's slot
    }

    // All test classes call getDriver() — each thread gets its OWN driver
    public WebDriver getDriver() {
        return driverThread.get();
    }

    @AfterMethod
    public void tearDown() {
        if (getDriver() != null) {
            getDriver().quit();
            driverThread.remove(); // prevent memory leaks
        }
    }
}
```
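The isolation guarantee ThreadLocal provides can be demonstrated without Selenium at all. In this plain-Java sketch, two threads store different values in the same ThreadLocal and each reads back only its own:

```java
public class ThreadLocalDemo {

    // One shared ThreadLocal — but each thread gets a private slot in it
    private static final ThreadLocal<String> SLOT = new ThreadLocal<>();

    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            String name = Thread.currentThread().getName();
            SLOT.set("driver-for-" + name); // like driverThread.set(driver)
            // Each thread reads back ONLY what it stored — no cross-thread mixing
            System.out.println(name + " sees " + SLOT.get());
            SLOT.remove();                  // like the tearDown() cleanup
        };

        Thread t1 = new Thread(task, "thread-1");
        Thread t2 = new Thread(task, "thread-2");
        t1.start();
        t2.start();
        t1.join();
        t2.join();
    }
}
```

Each thread prints its own value ("thread-1 sees driver-for-thread-1", "thread-2 sees driver-for-thread-2"), which is exactly why a ThreadLocal WebDriver keeps parallel browser sessions from colliding.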
ITestListener — screenshot on failure (most common real-world use):
```java
import java.io.File;
import java.io.IOException;

import org.apache.commons.io.FileUtils;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import org.openqa.selenium.WebDriver;
import org.testng.ITestListener;
import org.testng.ITestResult;

public class TestListener implements ITestListener {

    // Called when a test STARTS
    @Override
    public void onTestStart(ITestResult result) {
        System.out.println("▶ STARTED: " + result.getName());
    }

    // Called when a test PASSES
    @Override
    public void onTestSuccess(ITestResult result) {
        System.out.println("✅ PASSED: " + result.getName());
    }

    // Called when a test FAILS — take a screenshot automatically!
    @Override
    public void onTestFailure(ITestResult result) {
        System.out.println("❌ FAILED: " + result.getName());

        // Real-world: get the driver and take a screenshot automatically
        Object testInstance = result.getInstance();
        if (testInstance instanceof BaseTest) {
            WebDriver driver = ((BaseTest) testInstance).getDriver();
            File screenshot = ((TakesScreenshot) driver)
                    .getScreenshotAs(OutputType.FILE);
            try {
                // Copy the screenshot to the screenshots/ folder
                FileUtils.copyFile(screenshot,
                        new File("screenshots/" + result.getName() + ".png"));
            } catch (IOException e) {
                System.out.println("Could not save screenshot: " + e.getMessage());
            }
        }
    }

    // Called when a test is SKIPPED
    @Override
    public void onTestSkipped(ITestResult result) {
        System.out.println("⏭ SKIPPED: " + result.getName());
    }
}

// ── Attach the listener to a test class ───────────────────────────
// Method 1: @Listeners annotation on the class
// (requires import org.testng.annotations.Listeners)
@Listeners(TestListener.class)
public class LoginTest { /* ... */ }

// Method 2: in testng.xml (applies to ALL classes in the suite)
// <listeners>
//   <listener class-name="com.listeners.TestListener"/>
// </listeners>
```
IRetryAnalyzer — automatically retry failed tests:
```java
import org.testng.IRetryAnalyzer;
import org.testng.ITestResult;

public class RetryAnalyzer implements IRetryAnalyzer {

    private int retryCount = 0;
    private static final int MAX_RETRY = 3; // retry at most 3 times

    @Override
    public boolean retry(ITestResult result) {
        // TestNG calls this after each failure:
        // return true  = retry the test
        // return false = stop retrying, mark the test as FAILED
        if (retryCount < MAX_RETRY) {
            retryCount++;
            System.out.println("Retrying: " + result.getName()
                    + " | Attempt: " + retryCount + "/" + MAX_RETRY);
            return true; // retry
        }
        return false; // stop retrying
    }
}

// ── Attach the RetryAnalyzer to specific tests ────────────────────
public class FlakyTest {
    // This test will retry up to 3 times if it fails
    @Test(retryAnalyzer = RetryAnalyzer.class)
    public void testNetworkDependentFeature() {
        // Real-world: network calls sometimes fail due to latency.
        // The RetryAnalyzer retries automatically — no manual re-run needed.
        Assert.assertTrue(callExternalAPI()); // callExternalAPI() is a placeholder
    }
}
```
Follow these professional best practices to write maintainable, reliable, and scalable TestNG test suites used in real-world Selenium automation projects.
- 1. Use @BeforeClass / @AfterClass for browser setup and teardown — never @BeforeSuite. Unless you truly need one browser for the entire suite, use @BeforeClass. @BeforeSuite creates one browser for ALL tests: if one test leaves the browser in a bad state, all subsequent tests are affected. @BeforeClass gives each class its own fresh browser instance.
- 2. Use @BeforeMethod to navigate to a known starting point. Before each @Test, navigate the browser to the page you're testing. This keeps tests truly independent: a failing test cannot affect the next one by leaving the browser on the wrong page.
- 3. Always call driver.quit(), not driver.close(), in @AfterClass. driver.close() closes only the current window; driver.quit() closes ALL windows and terminates the WebDriver process. Not calling quit() causes memory leaks, with dozens of orphaned chromedriver.exe processes piling up on the CI server.
- 4. Use ThreadLocal<WebDriver> for parallel execution. Never use a static WebDriver when running parallel tests; each thread must have its own WebDriver stored in a ThreadLocal. A BaseTest class that all test classes extend, containing the ThreadLocal WebDriver setup, is the standard architecture in professional Selenium frameworks.
- 5. Prefer @Test(groups) over copying tests. If the same test needs to run in smoke AND regression, add groups = {"smoke", "regression"} instead of duplicating the method, and control which groups run via testng.xml. This keeps your code DRY (Don't Repeat Yourself).
- 6. Use @DataProvider for all data-driven scenarios. Never hardcode test data inside @Test methods; with a @DataProvider, test data can change without modifying test logic. For large data sets, read the data from an Excel file with Apache POI inside the @DataProvider — a widely used industry approach.
- 7. Use soft assertions for UI page verification. When verifying multiple elements on a single page (profile page, order summary, etc.), use SoftAssert so ALL failures are reported in one run. Always call softAssert.assertAll() at the end — without it, the test silently passes even with failed assertions.
- 8. Attach ITestListener via testng.xml, not @Listeners. If you add @Listeners on every test class, you'll forget it on some. Declaring the listener in testng.xml applies it to ALL test classes in the suite automatically — no class-level annotation needed.
- 9. Name test methods descriptively — they appear in reports. testLogin() is vague; testValidLoginRedirectsToDashboard() or testInvalidPasswordShowsErrorMessage() are clear. When a test fails in Jenkins, the method name is the first thing anyone sees, so make it self-explanatory.
- 10. Re-run testng-failed.xml after failures. After a run, TestNG automatically generates test-output/testng-failed.xml, containing ONLY the failed tests. In CI/CD, run this file as a second pass to confirm whether failures are real bugs or transient issues — a standard practice in professional QA teams.
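Practice #6 above mentions feeding a @DataProvider from an Excel file with Apache POI. A minimal sketch of that idea, assuming the poi-ooxml dependency (POI 4+) is on the classpath and a hypothetical LoginData.xlsx with a header row followed by one data row per credential set:

```java
import java.io.File;
import java.io.IOException;

import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;
import org.testng.annotations.DataProvider;

public class ExcelDataProvider {

    // Hypothetical file: src/test/resources/LoginData.xlsx,
    // sheet 0, columns: email | password | expectedText (all text cells)
    @DataProvider(name = "excelLoginData")
    public static Object[][] readLoginData() throws IOException {
        try (Workbook workbook = WorkbookFactory.create(
                new File("src/test/resources/LoginData.xlsx"))) {
            Sheet sheet = workbook.getSheetAt(0);
            int rows = sheet.getLastRowNum();            // data rows after the header
            int cols = sheet.getRow(0).getLastCellNum(); // columns in the header row

            Object[][] data = new Object[rows][cols];
            for (int r = 1; r <= rows; r++) {            // skip header row 0
                Row row = sheet.getRow(r);
                for (int c = 0; c < cols; c++) {
                    data[r - 1][c] = row.getCell(c).getStringCellValue();
                }
            }
            return data;
        }
    }
}
```

A test method would then reference it with @Test(dataProvider = "excelLoginData", dataProviderClass = ExcelDataProvider.class), so testers can add rows to the spreadsheet without touching Java code.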
"TestNG has several advantages over JUnit. First, TestNG supports multiple levels of setup/teardown annotations — @BeforeSuite, @BeforeTest, @BeforeClass, @BeforeMethod and their After counterparts — giving fine-grained control over the test lifecycle; JUnit 4 has only @Before and @After (equivalent to @BeforeMethod/@AfterMethod). Second, TestNG has built-in parallel execution support via testng.xml, critical for cross-browser testing with Selenium Grid; JUnit 4 had no native parallel support. Third, TestNG's @DataProvider makes data-driven testing elegant — running the same test with multiple inputs without code duplication. Fourth, TestNG's grouping feature lets teams tag tests as 'smoke' or 'regression' and run specific groups in CI/CD. Fifth, TestNG generates detailed HTML reports automatically, with pass/fail/skip counts and execution times. And finally, TestNG's dependsOnMethods attribute allows defining test execution order based on business logic — essential for E2E flow tests like login → addToCart → checkout."
Ready to Master TestNG in Real Projects?
STAD Solution's QA Automation course covers TestNG with Selenium — complete framework design, real projects, CI/CD integration, and 100% placement support.
Explore Courses at STAD Solution →