
What Changes AI Might Drive in the IT Technical Services Industry





The Beginning of a Shift in Systems Integration


For most of the modern systems integration era, technical implementation skill has been the industry’s primary currency. The engineer who understood the configuration structures, the scripting languages, the APIs, the command syntax, and the operational behaviour of complex platforms was the person who created value.

That value model is beginning to shift.

Large Language Models are rapidly changing how technical work is performed. Tasks that previously required deep platform familiarity are increasingly becoming instruction-driven. Generating configuration files, writing integration logic, analysing logs, querying data structures, troubleshooting service behaviour, and navigating vendor documentation are moving from manual technical exercises into conversational workflows guided through AI systems.

This changes the role of systems engineers and technicians.

The junior engineer entering the industry over the next few years is likely to spend less time memorising commands and more time learning how to direct AI systems effectively, validate outcomes, identify operational risk, and translate business intent into executable technical actions.

A request that once required detailed implementation knowledge may increasingly begin with a prompt.

An engineer connected to modern AI tooling can already interact with systems in ways that dramatically reduce the friction between intent and execution. A database structure can be analysed in seconds. API relationships can be mapped conversationally. Configuration changes can be generated with contextual awareness of surrounding systems. Troubleshooting increasingly becomes an iterative dialogue between operator and machine.
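As a loose sketch of that dialogue between intent and execution, the pattern is essentially generate-then-validate. Everything below is hypothetical illustration rather than any real tool's API: `ai_generate` stands in for LLM-backed tooling, and the MTU policy rule is invented.

```python
# Hypothetical sketch: AI-generated output treated as a draft that must
# pass explicit validation before anything is applied to a live system.

def ai_generate(intent: str) -> dict:
    # Stand-in for an LLM-backed tool; a real system would call a model here.
    return {"interface": "eth0", "mtu": 9000, "admin_state": "up"}

def validate(config: dict) -> list:
    """Return a list of problems; an empty list means the draft is acceptable."""
    problems = []
    if config.get("mtu", 1500) > 1500:
        problems.append("mtu exceeds the (illustrative) site standard of 1500")
    if config.get("admin_state") not in ("up", "down"):
        problems.append("unknown admin_state")
    return problems

def apply_with_oversight(intent: str) -> bool:
    draft = ai_generate(intent)
    issues = validate(draft)
    if issues:
        print("draft rejected:", issues)  # human review or re-prompt from here
        return False
    print("draft applied:", draft)
    return True
```

The point of the sketch is the shape of the workflow: the machine produces the draft quickly, and the engineer owns the validation gate.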

The impact is not limited to senior engineering roles. The operational ladder that traditionally trained junior technicians through repetitive low-risk technical work is also changing. Entry-level technical tasks are becoming highly automatable, particularly where processes are structured, repeatable, and well documented.

As a result, technical recall is beginning to lose some of its historical commercial value.

The differentiator moves upward toward judgement, context, prioritisation, validation, and business understanding. Knowing what outcome should be achieved becomes increasingly important relative to manually constructing every step required to achieve it.

This shift creates opportunities as much as it creates pressure.

Lower technical barriers increase the speed at which systems can be deployed, modified, integrated, and supported. Smaller teams gain capabilities that previously required far larger engineering departments. Businesses gain faster access to specialised implementation knowledge. Experienced engineers gain leverage through automation and augmentation rather than purely through manual execution capacity.

The organisations that adapt fastest are likely to be the ones that recognise the shift early and begin redesigning their operational models around it.


From Configuration Knowledge to Outcome Knowledge


The effects of AI-assisted execution are not limited to software development or cloud infrastructure. Almost every technical industry built around configuration, procedural knowledge, and operational implementation is likely to experience some version of the same shift.

A Cisco engineer who previously spent years building deep familiarity with routing protocols, IOS syntax, ACL structures, redistribution rules, and vendor-specific implementation behaviour may increasingly spend less time constructing configurations manually and more time defining the operational behaviour the network should achieve. The configuration itself becomes something generated, refined, validated, and tested collaboratively with AI systems.
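To make "configuration as something generated from intent" concrete, here is a deliberately simplified sketch. The rule format and the rendered output are invented for illustration, not real IOS syntax; the validation check stands in for the kind of judgement the engineer still owns.

```python
# Illustration only: network intent expressed as data, rendered into a
# simplified ACL-like form, then checked by an engineer-level rule.

INTENT = [
    {"action": "permit", "proto": "tcp", "src": "10.0.10.0/24",
     "dst": "10.0.20.0/24", "port": 443},
    {"action": "deny", "proto": "ip", "src": "any",
     "dst": "10.0.20.0/24", "port": None},
]

def render_acl(name: str, rules: list) -> list:
    # Generation step: in practice this is where AI tooling would operate.
    lines = [f"ip access-list extended {name}"]
    for r in rules:
        port = f" eq {r['port']}" if r["port"] else ""
        lines.append(f" {r['action']} {r['proto']} {r['src']} {r['dst']}{port}")
    return lines

def ends_with_explicit_deny(rules: list) -> bool:
    # Validation step the engineer defines: the policy must close with a deny.
    return bool(rules) and rules[-1]["action"] == "deny"
```

The engineer's contribution shifts from typing the middle function to defining the intent structure and the checks around it.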

Network simulation tools may still exist, version control may still exist, and operational governance will remain critically important, but the engineer’s daily activity begins moving upward from syntax construction toward systems reasoning.

The same pattern appears in ERP implementation.

An Odoo or Microsoft Dynamics consultant historically created value through detailed platform familiarity. Knowing where settings lived, how workflows interacted, how reporting structures behaved, and how integrations should be configured formed much of the implementation effort. An MCP server is a stopgap for now, but even that looks likely to change: AI systems are increasingly capable of navigating large portions of that configuration layer conversationally.

At that point, the consultant who creates the greatest value is no longer necessarily the one who knows the platform most mechanically. The valuable consultant becomes the one who understands how the business actually operates. How inventory affects finance. How procurement affects cash flow. How warehouse structure affects operational throughput. How sales incentives influence reporting requirements. The implementation layer becomes progressively easier to generate while the business understanding layer becomes progressively harder to replace.

The same transition appears outside traditional IT environments.

A CNC engineer who once operated almost entirely at the instruction level may increasingly work alongside systems capable of generating or refining machining instructions automatically. The differentiator moves toward understanding tolerances, production quality, throughput, material behaviour, manufacturing constraints, and end-product requirements rather than manually constructing every operational instruction.

In each case, the technical layer doesn’t disappear. It becomes increasingly abstracted.

The engineer still needs enough technical understanding to validate outcomes, identify risk, recognise failure conditions, and maintain operational discipline. But the commercial value of memorising implementation detail begins to diminish as AI systems become increasingly capable of generating that detail themselves.

That shift has implications for both individuals and organisations.

A business that continues measuring technical value primarily by procedural implementation knowledge may eventually find itself competing directly against internally hosted AI tooling whose cost amounts to little more than subscription fees. The organisations that adapt most successfully are likely to be the ones that move their people upward toward operational understanding, systems thinking, governance, commercial awareness, and outcome ownership.


Process, Documentation, and Team Structure Will Evolve


Most operational processes inside IT businesses were built around the assumption that technical work would be performed manually by people following structured procedures.

That assumption shaped everything from training manuals and implementation guides through to escalation models and staffing structures, and even how KPIs or OKRs are created and measured. A junior engineer learned by repeatedly performing smaller technical tasks under supervision, gradually building operational familiarity and confidence before taking on larger responsibilities. Responsibilities carrying more risk for the business were taken on later, once an engineer was more experienced.

As AI systems become more capable of handling structured technical execution, many of those processes begin to change shape.

A deployment document that once existed primarily to guide a technician through configuration steps increasingly becomes a document focused on intent, constraints, validation, rollback conditions, and operational policy. The procedural knowledge itself starts moving into the AI layer.
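A minimal sketch of what such an intent-first deployment record might contain, using plain Python dataclasses; every field name and value here is illustrative, not a standard.

```python
from dataclasses import dataclass, field

@dataclass
class DeploymentIntent:
    outcome: str                                     # what the change should achieve
    constraints: list = field(default_factory=list)  # e.g. change windows
    validation: list = field(default_factory=list)   # checks that must pass
    rollback_trigger: str = "any validation check fails"

# Hypothetical example record.
change = DeploymentIntent(
    outcome="Route branch VoIP traffic over the secondary MPLS link",
    constraints=["apply outside business hours", "no impact to data VLANs"],
    validation=["jitter below 30 ms on test calls", "routing table converges"],
)
```

Notice what is absent by design: the step-by-step procedure. That is the part moving into the AI layer, while intent, constraints, validation, and rollback policy stay human-owned.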

The same applies to troubleshooting. Historically, support escalation often depended on progressively more experienced engineers manually analysing logs, configurations, integrations, and infrastructure states until someone identified the root cause. Increasingly, much of the analysis phase can be performed almost instantly by AI systems with access to the relevant operational context. I don't want to predict the end of tickets being closed unsolved just yet, but expectations are changing almost as fast as the deliverables are.
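Part of assembling that operational context is mundane and needs no AI at all. As a sketch, narrowing raw logs to the lines worth analysing before any model sees them (the log content is invented):

```python
# Illustrative pre-filter: reduce raw logs to the relevant operational
# context before any AI-assisted analysis step.

LOG = """\
2024-05-01 09:00:01 INFO  service started
2024-05-01 09:00:02 INFO  heartbeat ok
2024-05-01 09:03:17 ERROR connection refused to db:5432
2024-05-01 09:03:18 WARN  retrying in 5s
2024-05-01 09:03:23 ERROR connection refused to db:5432
"""

def relevant_lines(log: str, levels=("ERROR", "WARN")) -> list:
    # Keep only lines whose severity marker matches one of the given levels.
    return [line for line in log.splitlines()
            if any(f" {lvl} " in line for lvl in levels)]
```

Filtering this way keeps the AI-facing context small and focused, which in practice also tends to improve the quality of the analysis it returns.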

That changes what technical teams spend their time doing.

The engineer becomes less occupied with manually executing repetitive technical procedures and more focused on validating outcomes, managing risk, understanding operational impact, and resolving edge cases that require broader contextual understanding.

This also affects how organisations structure technical departments.

Many businesses currently rely on large operational layers because implementation effort scales roughly in proportion to workload. AI-assisted execution changes that relationship. A relatively small technical team equipped with effective tooling may eventually deliver operational output that previously required substantially larger departments.

That doesn’t remove the need for technical staff. It changes where their value sits.

Businesses begin placing greater emphasis on people who understand systems holistically. An engineer who understands how a warehouse operation functions, how a finance department interacts with ERP workflows, how manufacturing bottlenecks affect reporting structures, or how routing policy affects application performance becomes increasingly valuable because those contextual relationships are far harder to reduce into isolated procedural instructions.

The same pattern affects training.

Historically, repetitive technical work formed part of the learning path. Over time, much of that repetitive work becomes automated. Junior engineers may develop differently from previous generations, spending less time memorising procedures and more time learning operational reasoning, systems interaction, governance, validation, and business alignment.

In many organisations, the first major AI-driven transformation is likely to develop less in the technology stack itself and more in the surrounding operational model.


Today’s Frustrations Are Part of the Transition


Anyone working closely with LLMs today quickly discovers that the technology still requires supervision, and a lot of it.

An AI system may produce excellent work for several hours and then suddenly ignore half the instructions it was given. It may misunderstand context, introduce unnecessary complexity, lose track of operational constraints, or confidently produce output that requires correction. Anyone using these systems operationally today learns fairly quickly that they still require validation, oversight, and structured control.

At the same time, the operational gains are already significant enough that many engineers continue using these systems despite the friction, and despite the cost.

An engineer can analyse infrastructure, generate configurations, troubleshoot integrations, inspect logs, query APIs, review schemas, and automate repetitive technical work at a speed that would have been extremely difficult to imagine only a few years ago. Even when correction is required, the overall productivity gain is often substantial.

The experience today frequently resembles working with an extremely fast junior engineer. The output volume is remarkable. The initiative can be useful. The accuracy varies. Supervision remains necessary. Yet the overall operational leverage is still commercially valuable because the throughput is so high.

The important point is not whether the systems are currently perfect. The important point is that they are already useful while still improving rapidly.

Each new generation improves instruction following, contextual understanding, reasoning capability, execution quality, and integration awareness. The operational gaps that currently require constant human correction are narrowing steadily. Two or three years from now, many of today’s frustrations may simply become normal historical limitations associated with early-generation systems.

At the same time, AI itself is likely to become progressively less visible as a standalone product category.

Today, many businesses interact directly with platforms like Claude or ChatGPT. Over time, that capability will increasingly become embedded inside existing software platforms, operational systems, infrastructure tooling, ERP suites, development environments, monitoring systems, networking platforms, manufacturing systems, and business applications.

The same way cloud capability eventually became part of ordinary software architecture, AI capability is likely to become part of ordinary software behaviour.

ERP systems will increasingly include conversational process design, reporting generation, workflow optimisation, and operational analysis directly inside the product. Infrastructure platforms will include AI-assisted troubleshooting and configuration generation natively. Network management systems will analyse topology, routing behaviour, and policy structures conversationally. Industrial systems will optimise production behaviour dynamically.

In many cases, businesses may interact with AI without thinking of themselves as “using an AI platform” at all.

That may also gradually change the role of the major LLM providers themselves. We might speculate that companies like Anthropic and OpenAI may at some point operate primarily as foundational AI infrastructure providers whose models power features embedded inside thousands of third-party business products and operational platforms.

Software vendors have strong incentives to move quickly in this direction. Any platform that fails to integrate meaningful AI-assisted capability risks gradually reducing its own value as external AI systems become increasingly capable of operating around it. Vendors that rely on an external application to supply AI features are likely to find the ground they are standing on softer and less certain than it once was. That said, the "in-app purchase" model might somewhat level the competitive advantage that an integrated AI model brings: if one must pay either the software producer or an external third party, the advantages diminish to some degree.

As a result, much of the next phase of competition in enterprise software may centre less around whether AI exists and more around how effectively it becomes integrated into operational workflows, decision-making, and execution.


Preparing for the Next Operational Model


The systems integration industry is unlikely to change all at once. Most transitions of this kind happen gradually at first, then suddenly become normal.

For a period of time, many organisations will continue operating with a mixture of traditional implementation approaches and AI-assisted workflows. Some engineers will continue working largely as they do today, while others begin shifting toward models where a significant portion of technical execution is generated, analysed, or validated through AI systems.

The important strategic question for businesses is not whether every prediction arrives exactly on schedule. The important question is whether operational structures are being designed with the direction of change in mind.

Many companies still recruit technical staff primarily around procedural implementation knowledge. Training paths still focus heavily on memorisation, platform familiarity, and manual execution capability, as do the certifications those paths result in. Documentation is often structured around detailed human-operated processes. Operational models frequently assume that scaling delivery requires scaling implementation headcount proportionally.

Those assumptions may gradually become less reliable.

As technical implementation becomes increasingly assisted by AI systems, the value layer begins moving upward. The organisations likely to create the greatest long-term value are not necessarily the ones performing the largest volume of manual configuration work, but the ones helping customers understand what they should be building, why they should be building it, how systems should interact, how information should flow through the business, and how operational decisions support commercial outcomes.

That creates room for greater specialisation, not less.

The systems integrator of the future may spend less time manually configuring software and more time operating as a strategic partner across business operations, process design, automation strategy, data strategy, customer engagement, operational visibility, and decision support.

An ERP consultant may increasingly create value by understanding warehouse efficiency, procurement flow, reporting structures, operational bottlenecks, and management visibility rather than primarily by configuring workflow rules manually.

A network engineering business may increasingly focus on resilience modelling, application performance strategy, segmentation philosophy, operational continuity, and security posture, even more than they are today.

A data-focused consultancy may create value through interpretation, modelling, forecasting, and commercial insight while much of the underlying technical preparation becomes increasingly automated.

In many cases, customers themselves may gain access to increasingly powerful implementation capability directly through the software platforms they already use. At that point, the commercial value of a service provider shifts further upward toward strategy, interpretation, operational alignment, governance, and business optimisation.

The entry-level technician of the near future may develop very differently from previous generations. Technical understanding will still matter enormously, but increasingly as part of a broader operational skillset rather than as an isolated implementation capability.

The organisations that adapt successfully are unlikely to be the ones that simply attempt to automate away technical work. They are more likely to be the ones that recognise where human value becomes more important as technical execution becomes easier.

The implementation layer becomes progressively more automated. The strategic layer becomes progressively more valuable.

All of that said, I anticipate that e-commerce checkout will become more of an "AI personal shopper" experience and less of an integrated AI checkout operation.

 
