Shadow AI is creeping its way into software development – more than half of developers admit to using unauthorized AI tools at work, and it’s putting companies at risk
Enterprises need to create smart AI usage policies that balance the benefits and risks
With software developers increasingly flocking to AI tools to support daily activities, new research suggests a concerning portion are using unauthorized solutions.
Findings from Harness’ State of Software Delivery Report show that more than half (52%) of developers use AI coding tools that aren’t approved by IT, raising significant compliance and intellectual property concerns.
“Perhaps the most alarming observation was around the use of company-approved coding tools - or lack thereof,” the report states.
“The unauthorized adoption of AI codegen tools creates significant shadow IT challenges that extend far beyond immediate security concerns.”
Shadow AI is a serious cause for concern for teams, the report added, with developers potentially exposing sensitive code snippets to third-party AI services without proper governance.
“Ultimately, they can’t track the origin of AI-generated code, nor can they ensure consistent security standards across teams,” Harness said.
Software developers aren’t the only ones flocking to shadow AI
The rise of shadow AI has become a recurring talking point over the last two years as enterprises around the world adopt the wide range of AI tools now on the market.
In its Chasing Shadows report, Software AG found that 75% of knowledge workers already use AI, a figure set to rise to 90% in the near future, and that more than half of this group use personal or non-company-issued tools to do so.
Another study by customer service platform Zendesk noted there has been as much as a 250% rise year on year in the use of unsanctioned AI tools in certain industries.
The financial services sector was found to be the worst affected by this phenomenon, with a 250% year-on-year spike compared to 2023 levels, but healthcare (230%) and manufacturing (233%) also exhibited very high levels of shadow AI use.
Developing robust AI usage policies will be integral to ensuring this growing reliance on unvetted AI tools does not expose your enterprise to unnecessary risks.
Harness’ report listed the critical gaps software engineering leaders identified in their organizations’ AI coding tool policies.
Three-fifths of engineering leaders said companies need policies prescribing how code should be assessed for vulnerabilities or errors, while 58% said policies should outline the specific use cases where AI is or isn’t safe to use.
Bharat Mistry, field CTO at Trend Micro, told ITPro the policies outlined in the Harness report were all wise, but highlighted the importance of training when trying to shape employee behaviour and foster responsible use of personal AI systems.
“I agree with the policies given above, however for me it begins with the human aspect. By investing in comprehensive training and awareness programs, businesses can empower their employees to use AI responsibly, identify and mitigate risks and contribute to the development of ethical and effective AI solutions,” he argued.
“This proactive approach not only enhances the organization’s AI capabilities but also builds a culture of trust and accountability around AI technologies.”
Speaking to ITPro, Steve Ponting, director at Software AG, echoed Mistry’s comments, noting that training will be essential in mitigating the risks associated with employees using their own AI tools.
“Workers have been clear: they will use AI whether it’s sanctioned or not. This means that businesses could struggle to manage AI applications, leading to cyber-security risks, skills gaps, and inaccurate work,” he explained.
“Businesses must have a plan in place to reduce risk, build skills and plan for AI’s inclusion in daily work. If people are determined to use their own AI, training is vital in this regard. Better training would make 46% of employees use AI more, but crucially, they would use it effectively and responsibly.”

Solomon Klappholz is a former staff writer for ITPro and ChannelPro. He has experience writing about the technologies that facilitate industrial manufacturing, which led to him developing a particular interest in cybersecurity, IT regulation, industrial infrastructure applications, and machine learning.