SmartBear announced AI enhancements for API testing, UI test automation, and test management across its product suite, the SmartBear Application Integrity Core™.
Software teams spent the last decade reinventing how they ship code. Releases that once happened quarterly now happen hourly. CI pipelines enforce quality in minutes. Infrastructure is elastic, tests are automated, and deployments are continuous.
So far so good, and not a small achievement.
But accessibility never made this transition. It stayed slow, manual and disconnected from the development cycle. In the process, DevOps unintentionally broke accessibility: not because anyone wanted to, but because modern engineering velocity outpaced the speed and structure of traditional accessibility workflows. Software raced ahead. And accessibility? Well, it got left behind.
No wonder only 5% of website home pages are accessible.
How It Happened
Before fully automated DevOps took hold, teams operated on longer release cycles. Those cycles created natural pauses where manual audits and static reviews could function. Accessibility fit awkwardly, but it did fit.
Continuous integration and deployment erased those pauses. Now:
■ engineers push multiple changes per day
■ apps evolve constantly
■ testing and security checks run automatically
■ quality gates operate in minutes, not weeks
Some engineering teams now ship code more than 1,000 times each week in mature DevOps environments, which means manual checks never see most of what reaches production.
Nothing in traditional accessibility workflows was built for this environment. Manual audits require stable builds. Legacy scanners expect static pages. Reports often arrive long after code has changed.
As a result, accessibility defects slip into production unnoticed. Developers don't notice them at the moment they are introduced, CI doesn't surface them, and product managers often discover them through customer complaints or risk reviews. Which is to say, way too late.
A Workflow Problem
Let's say that, a few years ago, your team wanted to improve accessibility. You'd have faced three key workflow problems:
■ Tool mismatch. Developers rely on rapid feedback loops. If your team relied on legacy tools, false negatives would have let hundreds of accessibility bugs through to production. If it relied on manual accessibility checks instead, those could take days or even months to complete. On that timeline, manual checks simply cannot influence daily development: by the time results appear, the pull request is merged or the code path has changed.
■ Static-only analysis. Legacy tools required a manual trigger for each event or view. But modern web applications span SPAs, mobile screens, nested routes, dynamic rendering, shadow DOM and iFrames. They routinely contain dynamic states, asynchronous rendering, encapsulated components, modal layers and flows that cross domains. Many issues appear only after an interaction, after login or after a certain state transition. Missing a single step in a flow can hide thousands of critical issues. And the alternative, manually stepping through each DOM mutation or state, is both painful and likely to create duplicate issues in reports.
■ Non-actionable reports. Developers usually received long, noisy lists of findings with little context, vague or nonexistent fix instructions, and duplication across components. Sorting the signal from all that noise took more time than fixing the issues themselves. And remember, these issues were often reported from production three months earlier, so someone also had to reproduce them in a current build.
These problems can easily cause accessibility issues to grow slowly and invisibly until they become too expensive to repair. And as the complexity of web and mobile ecosystems grows, the gap between legacy scanning and real user experience grows with it.
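To make the report-noise problem concrete, here is a minimal Python sketch of the kind of grouping that turns a raw finding list into something reviewable. The field names (rule, component, selector) are invented for illustration; real scanners each have their own schema. The idea is simply that duplicates of the same rule on the same component collapse into one work item with a count.

```python
from collections import defaultdict

# Hypothetical raw scanner output: one finding per DOM node, heavily duplicated.
raw_findings = [
    {"rule": "color-contrast", "component": "NavLink", "selector": "nav a:nth-child(1)"},
    {"rule": "color-contrast", "component": "NavLink", "selector": "nav a:nth-child(2)"},
    {"rule": "color-contrast", "component": "NavLink", "selector": "nav a:nth-child(3)"},
    {"rule": "image-alt", "component": "HeroBanner", "selector": "header img"},
]

def group_findings(findings):
    """Collapse per-node findings into one actionable entry per (rule, component)."""
    groups = defaultdict(list)
    for f in findings:
        groups[(f["rule"], f["component"])].append(f["selector"])
    return [
        {"rule": rule, "component": comp, "occurrences": len(sels), "examples": sels[:2]}
        for (rule, comp), sels in groups.items()
    ]

grouped = group_findings(raw_findings)
# Four raw findings become two fixable work items.
```

Four near-identical findings become one entry ("color-contrast on NavLink, 3 occurrences") plus one other item, which is a list a developer can actually act on.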
Adding Accessibility to Automation
For accessibility to succeed in DevOps, it must integrate directly into existing tools, languages, and workflows. Strong programs are timely, consistent, collaborative, and avoid slowing delivery or flooding teams with noise. Automation and developer enablement are essential pillars.
Modern teams should expect:
■ CI integration. Accessibility checks should run on every pull request and commit, keeping pace with large engineering organizations.
■ Support for real user flows. Automated accessibility tools must handle multi-page paths, mobile navigation, deep linking, authenticated sections, and Multi-Factor Authentication (MFA). If a task can be completed by a user, automation should be able to analyze it.
■ Full support for modern architectures. Shadow DOM, iFrames, component libraries, and mobile rendering must be scanned accurately, reflecting how real UI frameworks behave.
■ Performance. CI steps should add minimal overhead, so checks remain enabled.
■ Actionable reporting. Findings should be grouped, enriched with remediation guidance, and mapped to WCAG standards. Low-noise, high-signal reporting drives adoption.
■ Shareable output. HTML or JSON reports enable cross-functional collaboration among QA, engineering, design, and accessibility specialists.
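The CI-integration expectation above can be sketched as a small gate script. This is an illustration only: the findings schema (rule, impact, wcag) and the blocking policy are assumptions, and a real pipeline would consume whatever JSON its scanner actually emits.

```python
import json

# Hypothetical findings file produced by an accessibility scan step in CI.
FINDINGS_JSON = """
[
  {"rule": "image-alt", "impact": "critical", "wcag": "1.1.1"},
  {"rule": "color-contrast", "impact": "serious", "wcag": "1.4.3"},
  {"rule": "region", "impact": "moderate", "wcag": "1.3.1"}
]
"""

BLOCKING_IMPACTS = {"critical", "serious"}  # example policy: fail the PR on these

def gate(findings):
    """Return (exit_code, blocking) so the CI step fails fast on serious issues
    while staying quiet on minor ones -- low noise keeps the check enabled."""
    blocking = [f for f in findings if f["impact"] in BLOCKING_IMPACTS]
    return (1 if blocking else 0), blocking

code, blocking = gate(json.loads(FINDINGS_JSON))
for f in blocking:
    print(f"BLOCKING: {f['rule']} (WCAG {f['wcag']}, {f['impact']})")
# A real pipeline step would end with: raise SystemExit(code)
```

Because the gate only blocks on the configured impact levels, it adds a quality signal to every pull request without turning every minor finding into a broken build.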
Good, but Not Enough
The good news is, tools now exist that can address all these requirements. But to our way of thinking, even fixing this DevOps mismatch problem in accessibility isn't enough. Finding issues in CI is good. Finding and fixing them even earlier is better.
Modern developer workflows increasingly rely on context-aware tools inside the IDE. Through emerging standards such as the Model Context Protocol (MCP), accessibility engines can run checks directly within the editor, giving developers immediate feedback while writing or reviewing code.
This has two major advantages:
1. Developers discover accessibility issues before the code ever leaves their workstation.
2. When CI flags a problem, the IDE can assist with guided fixes, documentation and relevant component-level insights.
This reduces the need to bounce between logs, browser tools and external documentation. By surfacing issues where developers already work, accessibility becomes part of the development rhythm rather than something discovered later.
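As a toy illustration of the kind of check an editor-integrated engine could run on the snippet a developer is editing, the sketch below flags `<img>` tags with no alt text using only Python's standard-library HTML parser. The MCP plumbing itself is omitted; this is just the check, not a real MCP server.

```python
from html.parser import HTMLParser

class ImgAltChecker(HTMLParser):
    """Flags <img> tags that have no alt attribute -- the sort of instant,
    in-editor feedback an editor-connected accessibility engine could surface."""
    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            line, _col = self.getpos()
            self.findings.append(
                f"line {line}: <img> is missing an alt attribute (WCAG 1.1.1)"
            )

def check_snippet(html):
    """Run the checker over an in-editor HTML snippet and return findings."""
    checker = ImgAltChecker()
    checker.feed(html)
    return checker.findings

findings = check_snippet(
    '<div>\n<img src="logo.png">\n<img src="x.png" alt="Logo">\n</div>'
)
```

The developer sees "line 2: missing alt" while the file is still open, instead of finding the same issue in a CI log or a production audit weeks later.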
Don't Forget QA
Even with strong CI and IDE support, dedicated QA testers play a critical role. They interact with the product the way users do. They exercise flows automation may miss. They explore edge cases and uncover conditions that require human judgment.
Modern accessibility guidance stresses that QA teams need a blend of automation and human-led exploration. Teams should avoid the trap of choosing between "test less, test later, or slow development to a crawl," and instead use automation to scale while preserving manual depth.
Semi-automated exploration for web and mobile: testers should be able to run accessibility checks as they interact with the product, whether in a browser or on a device. This helps them identify issues that appear only after specific interactions or in complex app states. Real-world examples discussed in accessibility testing workshops reinforce the value of this combined approach.
One-click issue creation: when testers do find issues, they need to be able to surface them immediately to PMs and development teams through the issue-tracking tools those teams already use. By integrating directly with those platforms, accessibility tools can reduce inter-team friction.
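One-click issue creation mostly comes down to translating a finding into a well-formed ticket. The sketch below shows that translation step in Python; the payload fields (summary, description, labels) are generic placeholders, since a real integration would target a specific tracker's API, and no network call is made here.

```python
def to_issue_payload(finding, build_url):
    """Turn one grouped accessibility finding into an issue-tracker payload.
    Field names are generic for illustration; a real integration would map
    them onto a specific tracker's create-issue API."""
    return {
        "summary": f"[a11y] {finding['rule']} on {finding['component']} ({finding['occurrences']}x)",
        "description": (
            f"Rule: {finding['rule']} (WCAG {finding['wcag']})\n"
            f"Component: {finding['component']}\n"
            f"Example selector: {finding['example']}\n"
            f"Found in build: {build_url}"
        ),
        "labels": ["accessibility", finding["impact"]],
    }

# Hypothetical grouped finding, as a tester's tool might hold it in memory.
finding = {"rule": "color-contrast", "wcag": "1.4.3", "component": "NavLink",
           "occurrences": 3, "example": "nav a:nth-child(1)", "impact": "serious"}
payload = to_issue_payload(finding, "https://ci.example.com/builds/123")
```

Because the ticket carries the rule, the WCAG reference, an example selector and the build it came from, the developer who picks it up can reproduce it without a separate triage round-trip.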
Our Recommendation
Accessibility becomes sustainable when it becomes routine. The path forward requires combining three layers of feedback.
At development time: IDE- and MCP-powered tools catch issues at creation.
At CI time: Automated checks on pull requests catch regressions quickly.
At QA time: Semi-automated tools validate real-world flows and mobile contexts.
This multi-layer approach mirrors how DevOps already handles quality, security and performance. When accessibility is visible at every stage, it stops being a late-phase concern and becomes part of how products are built. Instead of a reaction to a complaint, accessibility becomes part of the craft of writing — and shipping — great code.
Industry News
JFrog announced its partnership with iZeno Pte Ltd, a Singapore-headquartered enterprise technology solutions provider.
Red Hat announced an expanded collaboration with Google Cloud to help organizations accelerate application modernization and cloud migrations.
The Linux Foundation, the nonprofit organization enabling mass innovation through open source, announced the contribution of SQLMesh, an open source data transformation framework, to the Foundation by Fivetran.
Check Point® Software Technologies Ltd. released the AI Factory Security Architecture Blueprint — a comprehensive, vendor-tested reference architecture for securing private AI infrastructure from the hardware layer to the application layer.
CMD+CTRL Security won the following awards from Cyber Defense Magazine (CDM), the industry’s leading electronic information security magazine: Most Innovative Cybersecurity Training and Pioneering Secure Coding: Developer Upskilling.
Check Point® Software Technologies Ltd. announced the Check Point AI Defense Plane, a unified AI security control plane designed to help enterprises govern how AI is connected, deployed, and operated across the business.
Oracle announced the latest updates to Oracle AI Agent Studio for Fusion Applications, a complete development platform for building, connecting, and running AI automation and agentic applications.
The Cloud Native Computing Foundation® (CNCF®), which builds sustainable ecosystems for cloud native software, announced that Istio has launched a host of new features designed to meet the rising needs of modern, AI-driven infrastructure while reducing operational complexity.
Chainguard announced Chainguard Repository, a single Chainguard-managed experience for pulling secure-by-default open source containers, dependencies, OS packages, virtual machine images, CI/CD workflows, and agent skills that have built-in, intelligent policies to enforce enterprise security standards.
Backslash Security announced new cross-product support for agentic AI Skills within its platform, enabling organizations to discover, assess, and apply security guardrails to Skills used across AI-native software development environments.
The Cloud Native Computing Foundation® (CNCF®), which builds sustainable ecosystems for cloud native software, announced the graduation of Kyverno, a Kubernetes-native policy engine that enables organizations to define, manage and enforce policy-as-code across cloud native environments.
Zero Networks announced the Kubernetes Access Matrix, a real time visual map that exposes every allowed and denied rule inside Kubernetes clusters.
Apiiro announced AI Threat Modeling, a new capability within Apiiro Guardian Agent that automatically generates architecture-aware threat models to identify security and compliance risks before code exists.
GitLab released GitLab 18.10, making it easier and more affordable to use agentic AI capabilities across the entire software development lifecycle.