Column in IT-kanalen: AI does not do the work for us – it reveals whether we know our job

Jan 15, 2026

Simon Wallin, Co-Founder, Crux Comms

Simon Wallin, Co-Founder of Crux Comms, recently contributed this column to IT-kanalen on AI and the future of work.

AI has quickly become an integral part of knowledge work. The tools are faster, more accessible, and more powerful than many had anticipated. Production has become frictionless. Precisely for that reason, one thing is becoming increasingly clear: AI does not replace competence. It exposes the lack of it.

When AI is used superficially, it acts as a shortcut. When used consistently, it serves as a stress test. It quickly reveals who understands their subject, context, and responsibilities – and who merely produces output.

In environments where AI is fully integrated, it is no longer possible to hide behind pace, volume, or polished presentation. As everything speeds up, weak strategies, unclear goals, and poor judgment become immediately visible. AI does not fill in the gaps. It magnifies them.

Human work has not become less important. It has become more concentrated. Scoping, prioritization, framing, and context are now what determine quality. Knowing what should not be done is often more important than producing more.

AI is very good at answering questions. It is bad at determining whether the question is properly posed.

This is where many organizations go wrong. AI is introduced as a production tool before governance is in place. The result is not laziness, but overproduction: more documents, more analyses, more drafts – but fewer decisions. The flow increases, but the direction is missing.

Used correctly, AI instead acts as a quality requirement. It forces clear goals, explicit assumptions, and accountability. Unclear reasoning can no longer be hidden behind sheer effort. When something fails, it quickly becomes clear where the problem lies.

This imposes new demands on leadership. Not technical demands, but demands for intellectual discipline. What has been decided? What is still open? What is merely supporting material? What should actually be done?

The same applies to competence development. AI can be an effective training tool, but only if the requirements are maintained. Those who use AI must be able to explain why an answer is reasonable, see what is missing, and stand by the conclusion. Without that, the tool becomes a substitute for thinking, not a support for it.

In practice, AI acts as a litmus test. It reveals whether an organization has thought clearly before it produces, and whether it is ready to take responsibility when production becomes cheap.

As the AI hype has now settled, what remains is what has always created value: judgment, prioritization, and responsibility. Technology changes the pace. It does not change the demand for craftsmanship.