
Understanding Stealth Automation

Sep 14th 2025 · 9 min read
medium · javascript (ES6) · playwright 1.55.0 · web

Automation testing often needs to go beyond simply running scripts in a browser. Many websites try to identify when they are being accessed by automated tools and may respond with different behavior than they would for a real visitor. Stealth automation is the practice of making automated browsers appear more like genuine users so that testers can verify the true user experience without interference from bot detection systems.

How Websites Detect Automation

Websites often rely on subtle browser checks to decide whether they are dealing with a real user or an automated tool. These checks are usually fast, client-side scripts that look for inconsistencies in the browsing environment. Some of the most common techniques include:

- Reading the navigator.webdriver flag, which browsers set to true when driven by automation
- Checking for empty or missing navigator.plugins and navigator.languages values
- Scanning the userAgent string for markers such as "Headless"
- Comparing canvas and WebGL fingerprints against those produced by typical browsers

Websites use these techniques to block malicious bots that scrape content, abuse services, or perform fraud. The challenge is that these same checks often affect legitimate uses such as research, monitoring, and quality assurance testing. As a result, automation engineers need ways to bypass these signals when the goal is to validate the real experience of human users.

What Stealth Automation Does

Stealth automation works by overriding or masking the signals that websites use to identify automated tools. Instead of letting the browser advertise that it is running in automation mode, stealth techniques adjust certain properties so the environment looks more like a normal user session.

For example, the navigator.webdriver flag can be forced to return false, hiding the fact that the browser is being driven by automation. Missing values such as navigator.plugins or navigator.languages can be replaced with realistic defaults. Even canvas or WebGL fingerprints can be adjusted so they produce outputs that match typical browsers instead of obvious headless signatures.
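The mechanics behind these overrides are plain JavaScript property shadowing. As a minimal sketch, using a mock `nav` object to stand in for the real `navigator` (this runs outside a browser, so the object and its values are illustrative):

```javascript
// Mock object standing in for the browser's navigator:
// the prototype getters model what a driven headless browser reports.
const nav = Object.create({
  get webdriver() { return true; },
  get languages() { return []; },
});

console.log(nav.webdriver); // true — automation is visible

// Defining an own property shadows the inherited getter,
// which is exactly what stealth init scripts do to navigator.
Object.defineProperty(nav, "webdriver", { get: () => false });
Object.defineProperty(nav, "languages", { get: () => ["en-US", "en"] });

console.log(nav.webdriver); // false — the flag is hidden
console.log(nav.languages); // ["en-US", "en"]
```

The same `Object.defineProperty` calls work against the real `navigator` inside a page, because the detectable values live as getters on the prototype and an own property takes precedence.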

Modern frameworks make this easier. In Playwright, you can use context.addInitScript to inject scripts before any page loads, ensuring properties are patched consistently across tests. In the Puppeteer ecosystem, the puppeteer-extra-plugin-stealth library bundles many of these evasions together, handling common detection points without requiring custom code.

The purpose of stealth automation is not to deceive for malicious use but to allow testers and researchers to interact with websites as if they were real users. This ensures that automated tests cover the same code paths and behaviors that genuine visitors experience.

Demo Website

To make the ideas concrete, the demo is a single small HTML page with a tiny detection script. The page runs a few quick client-side checks and shows a simple result: either a friendly confirmation that the browser looks like a real user or a warning that automation was detected.
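For orientation, the page skeleton can be as small as a button and an output area. This is a hypothetical sketch, not the demo's exact markup, but the button label and the output element's id match the selectors the Playwright tests rely on:

```html
<!DOCTYPE html>
<html>
<body>
  <button onclick="runDetection()">Run Detection</button>
  <div id="output"></div>
  <script>
    function runDetection() {
      const reasons = [];
      if (navigator.webdriver) reasons.push("navigator.webdriver is TRUE");
      // ...remaining checks go here...
      document.getElementById("output").textContent =
        reasons.length
          ? "Automation detected: " + reasons.join(", ")
          : "No automation detected";
    }
  </script>
</body>
</html>
```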

Here is the core detection function used by the demo. It is intentionally minimal so you can see exactly what each check looks for.

// Collected reasons why this environment looks automated
const reasons = [];

// 1. webdriver flag
if (navigator.webdriver) reasons.push("navigator.webdriver is TRUE");

// 2. plugins check
if (!navigator.plugins || navigator.plugins.length === 0) reasons.push("No plugins");

// 3. languages check
if (!navigator.languages || navigator.languages.length === 0) reasons.push("No languages");

// 4. userAgent headless
if (/Headless/.test(navigator.userAgent)) reasons.push("Headless userAgent");

// Any collected reason means the page reports automation
const detected = reasons.length > 0;

What the page shows when you visit it: either a confirmation that no automation was detected, or a warning that automation was detected together with the list of reasons collected by the checks above.

Because the demo is small and explicit, you can modify or extend each check easily. That makes it a useful teaching tool for QA teams who need to see the difference between raw automation behavior and a stealth-enabled session.
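You can also experiment with the checks outside a browser by wrapping them in a function and running it against mock navigator objects. This is a sketch for exploration; `detectAutomation` and the two mocks are illustrative, not part of the demo:

```javascript
// The demo's checks, wrapped so they can run against any navigator-like object
function detectAutomation(nav) {
  const reasons = [];
  if (nav.webdriver) reasons.push("navigator.webdriver is TRUE");
  if (!nav.plugins || nav.plugins.length === 0) reasons.push("No plugins");
  if (!nav.languages || nav.languages.length === 0) reasons.push("No languages");
  if (/Headless/.test(nav.userAgent)) reasons.push("Headless userAgent");
  return reasons;
}

// A headless-like environment trips every check
const headless = {
  webdriver: true,
  plugins: [],
  languages: [],
  userAgent: "Mozilla/5.0 (X11; Linux x86_64) HeadlessChrome/120.0",
};

// A realistic environment passes clean
const realistic = {
  webdriver: false,
  plugins: [{ name: "PDF Viewer" }],
  languages: ["en-US", "en"],
  userAgent: "Mozilla/5.0 (X11; Linux x86_64) Chrome/120.0",
};

console.log(detectAutomation(headless));  // all four reasons
console.log(detectAutomation(realistic)); // []
```

This is handy for quickly checking how a proposed stealth patch changes the outcome before wiring it into a real browser session.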

Testing With and Without Stealth

To understand the impact of stealth automation, let's look at two simple Playwright tests that target the demo detection page. The first test runs in a plain headless browser without any modifications. The second test introduces stealth-like overrides that disguise automation signals. By comparing the outcomes of these two runs, we can clearly see the difference stealth makes.

The first test is straightforward. It launches a Chromium instance in headless mode, navigates to the demo page, triggers the detection script by clicking the Run Detection button, and then reads the result displayed on the page. Because nothing in the environment is altered, the demo site easily recognizes automation. The test output confirms this with a message like “Automation detected.”

const { test, expect } = require("@playwright/test");
const { chromium } = require("playwright");

// Adjust URL to wherever the demo page is served
const URL = "http://localhost:3000/";

test("Automation is detected", async () => {
  const browser = await chromium.launch({ headless: true });
  const page = await browser.newPage();
  await page.goto(URL);
  await page.click("text=Run Detection");
  const result = await page.textContent("#output");
  console.log("Result (detected):", result);
  expect(result).toContain("Automation detected");
  await browser.close();
});

The second test takes a different approach. Instead of exposing the default environment, it applies patches before the page loads using addInitScript. Properties like navigator.webdriver, navigator.plugins, and navigator.languages are overridden to values that mimic what a regular browser would report, and the telltale “Headless” marker is stripped from the userAgent. With these changes in place, the detection script no longer sees obvious signs of automation. When the test clicks the same button and retrieves the result, the message instead reads “No automation detected.”

test("Automation is not detected (stealth)", async () => {
  const browser = await chromium.launch({ headless: true });
  const context = await browser.newContext();

  await context.addInitScript(() => {
    Object.defineProperty(navigator, "webdriver", { get: () => false });
    Object.defineProperty(navigator, "plugins", {
      get: () => [{ name: "FakePlugin" }],
    });
    Object.defineProperty(navigator, "languages", {
      get: () => ["en-US", "en"],
    });
    // Headless Chromium reports "HeadlessChrome" in its userAgent,
    // so rewrite it before the detection script can read it
    const ua = navigator.userAgent.replace("HeadlessChrome", "Chrome");
    Object.defineProperty(navigator, "userAgent", { get: () => ua });
  });

  const page = await context.newPage();
  await page.goto(URL);
  await page.click("text=Run Detection");
  const result = await page.textContent("#output");
  console.log("Result (stealth):", result);
  expect(result).toContain("No automation detected");
  await browser.close();
});

The contrast between the two runs demonstrates the value of stealth automation. Without stealth, the environment is quickly flagged and the test only sees the detection path. With stealth applied, the test is able to experience the page in the same way a real user would, which is essential when verifying features that depend on undisturbed user behavior.

The Arms Race

The relationship between detection systems and automation tools is best described as an arms race. Websites constantly refine their methods of spotting bots, from monitoring subtle browser fingerprints to tracking interaction patterns that humans naturally produce but scripts struggle to replicate. In response, automation frameworks evolve with increasingly sophisticated stealth features that mask or randomize the very signals used for detection.

For QA engineers, this contest has practical consequences. When a site deploys bot-blocking code paths, a standard automated test may never see the real user experience. Instead, it interacts with a degraded or restricted version of the site, which skews test results and hides issues that affect actual users. Stealth automation fills this gap by allowing tests to pass undetected and observe the same features, flows, and edge cases that real visitors encounter.

The point is not to bypass security for malicious purposes, but to ensure that testing remains accurate in environments where automation is treated as suspicious by default. By understanding the arms race and applying stealth where needed, QA teams can stay aligned with the real-world behavior of their applications.

Conclusion

Stealth automation is not about tricking systems for the sake of it, but about ensuring that quality assurance reflects the true experience of real users. As detection techniques become more advanced, testers need equally advanced tools to verify features accurately and avoid being misled by bot-blocking code paths. By combining detection demos with stealth-enabled tests, teams gain a clear view of how automation interacts with their sites and why disguising it can sometimes be essential.

The complete code examples used in this post can be found on our GitHub page. Bye 👋.