
How We Apply ISACA Frameworks to Product Engineering at Lemorange

Lemorange Team · 10 min read

Why an ISACA Vocabulary Matters for a Product Builder

ISACA frameworks are written for auditors, governance committees, and risk officers, but the most valuable users in 2026 are the engineering organisations that build the systems being audited. The vocabulary itself, especially the COBIT 2019 distinction between governance and management, between intent and execution, removes a class of arguments that engineering teams have been having for forty years. Who decides what the system must achieve? Who is accountable for delivering it? Who validates that it was delivered? The framework names each role and bounds it.

At Lemorange we build products in environments that range from regulated financial services to consumer facing fintech to integrated accounting and ecommerce systems. The common thread is that every project sooner or later meets an audit, a regulator, an underwriter, a board, or a procurement committee. When that happens, a clean answer to "how do you make decisions about IT and security?" is worth more than another lap around the same architectural debate. We use ISACA's vocabulary because it is internally consistent, externally recognised by 185,000 plus members worldwide, and explicitly designed to bridge the engineering and assurance worlds.

This article walks through the specific frameworks we lean on: COBIT 2019 for governance and management structure, the 2026 ITAF 5th Edition for audit and assurance practice, the Risk IT Framework 2nd Edition for risk scenarios, the CSX cybersecurity model for operational security, the COBIT for DevOps Audit Program for delivery discipline, and CMMI for measuring how mature any of those processes actually are.

COBIT 2019: The Five Domains, the Forty Objectives

COBIT 2019 is structured around 40 governance and management objectives, increased from 37 in COBIT 5, grouped into one governance domain and four management domains. The governance domain is Evaluate, Direct and Monitor (EDM), where the governing body evaluates strategic options, directs senior management on chosen options, and monitors achievement of strategy. The four management domains are Align, Plan and Organise (APO), Build, Acquire and Implement (BAI), Deliver, Service and Support (DSS), and Monitor, Evaluate and Assess (MEA). These are the explicit names from the framework, not paraphrases.

For an engineering organisation, the BAI and DSS domains are home territory. BAI maps to how a system gets defined, acquired, integrated, and put into production. It covers requirements management (BAI02), solution identification and build (BAI03), availability and capacity management (BAI04), and change management (BAI06). DSS covers operational delivery: continuity (DSS04), security services (DSS05), business process controls (DSS06), and incident management (DSS02). When a client asks us to demonstrate that our build process is rigorous and our operations are accountable, we map our practice to specific COBIT objective numbers and show evidence per objective.
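A minimal sketch of how that mapping can live in code. The objective IDs are real COBIT 2019 identifiers; the register shape and the artefact names are hypothetical illustrations, not our actual evidence store:

```python
# Hypothetical evidence register: COBIT objective ID -> evidence artefacts.
# The objective IDs are real COBIT 2019 identifiers; the artefact names
# are illustrative placeholders.
EVIDENCE_REGISTER: dict[str, list[str]] = {
    "BAI02": ["requirements-baseline.md", "traceability-matrix.csv"],
    "BAI06": ["change-approval-records/", "rollback-test-report.pdf"],
    "DSS02": ["incident-log.jsonl", "postmortem-2026-01.md"],
    "DSS05": ["access-review-q1.xlsx"],
}

def coverage_gaps(register: dict[str, list[str]], claimed: set[str]) -> set[str]:
    """Return the COBIT objectives we claim alignment with but hold no evidence for."""
    return {obj for obj in claimed if not register.get(obj)}
```

Running `coverage_gaps(EVIDENCE_REGISTER, {"BAI02", "DSS06"})` flags `DSS06`, which is exactly the kind of gap we want to surface before a client review does.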

COBIT 2019 also defines six capability levels (level 0 through level 5), based on the underlying CMMI capability model. This answers the inevitable follow up question: "Yes, you have a process, but how mature is it?" Capability levels let us be honest about where a given practice actually sits. A change management process at level 2 (Managed) is a real thing, but it is not the same as one at level 4 (Quantitatively Managed), and saying so explicitly is the difference between a credible governance posture and a marketing one.

ITAF 5th Edition (2026): What Changed and Why It Matters Beyond Audit

ISACA released the IT Audit Framework, 5th Edition (ITAF 5) in 2026, the first major update since 2020. According to the ISACA press release accompanying the launch, the 5th edition broadens the audit framework's scope to cover cloud computing, AI and machine learning, business automation, data analytics, agile auditing, continuous assurance, and AI governance. It explicitly adds expectations for transparency, ethical technology use, and oversight of automated systems.

The reason this update matters to product builders, not just auditors, is that the audit framework now reflects the actual technology stack we ship. ITAF 4 was already a good framework but it carried 2020 era assumptions about IT controls. The 5th edition has language for what we do in 2026: AI features in production, automated decision systems, continuous deployment pipelines, cloud native architectures, and the agile development cadence that makes traditional point in time audits obsolete. When we build systems for clients in regulated sectors, we now have a defensible reference for "auditable by ITAF 5" that did not exist a year ago.

ITAF 5 is published as a free download, which is meaningfully different from the model used by some competing frameworks. For an engineering organisation that wants to align with audit expectations from day one, the cost of entry is the time to read the document, not a license fee. We treat ITAF as a checklist of evidence types: when we are asked for an audit trail, an access review, a change record, or an AI model lineage artefact, the framework tells us what an auditor will reasonably expect to see.
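One way to express that checklist in code, with request categories and evidence types that are our own shorthand rather than quotations from ITAF:

```python
# Hypothetical mapping from auditor request to expected evidence types.
# The categories are our shorthand, loosely inspired by ITAF's evidence
# expectations, not quoted from the framework.
ITAF_EVIDENCE_TYPES: dict[str, set[str]] = {
    "audit_trail": {"immutable_log", "clock_sync_attestation"},
    "access_review": {"entitlement_export", "reviewer_signoff"},
    "change_record": {"approval", "diff", "deploy_timestamp"},
    "model_lineage": {"training_data_hash", "model_version", "eval_report"},
}

def missing_evidence(request: str, supplied: set[str]) -> set[str]:
    """Evidence types an auditor would reasonably expect that we cannot produce."""
    return ITAF_EVIDENCE_TYPES.get(request, set()) - supplied
```

The point is not the lookup table itself but the habit: every auditor request resolves to a named set of artefacts, and a gap is computable rather than discovered mid audit.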

The Risk IT Framework, 2nd Edition: Scenarios, Not Vibes

ISACA's Risk IT Framework, 2nd Edition (released 2020) is the more useful counterpart to COBIT for the parts of engineering work that turn on uncertainty. Its central technique is the risk scenario: a structured narrative that names the asset at risk, the threat actor (with intent or motivation where relevant), the threat method, the timing or frequency, the vulnerability that enables the scenario, and the impact pathway from technical event to business consequence. The 2nd edition was updated for stronger cybersecurity focus and explicit alignment with COBIT 2019.

We use risk scenarios at three points in a project lifecycle. At project initiation we draft a scenario set covering the most plausible adverse paths: an authentication bypass, a data exfiltration through a third party integration, a regulator initiated audit, a supplier insolvency, an AI model drift event. Each scenario is written as a narrative, not as an uncalibrated probability, because narratives generate clearer decisions. During delivery we revisit the set when the architecture meaningfully changes. After release we map actual incidents back to the scenario set and update the scenarios that did or did not predict the event.
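The scenario structure described above can be captured as a simple record; the field names follow the Risk IT elements, while the concrete shape and the example values are our own sketch:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RiskScenario:
    """One Risk IT style scenario. Field names follow the structural elements
    (asset, actor, method, timing, vulnerability, impact); the record shape
    itself is our own illustration, not an ISACA artefact."""
    name: str
    asset: str
    actor: str          # threat actor, with intent or motivation where relevant
    method: str         # threat method / vector
    frequency: str      # timing or frequency assumption
    vulnerability: str  # what enables the scenario
    impact: str         # pathway from technical event to business consequence

# Example drawn from the scenario set above (illustrative values).
auth_bypass = RiskScenario(
    name="authentication-bypass",
    asset="customer account data",
    actor="external attacker, financially motivated",
    method="credential stuffing against the login API",
    frequency="continuous, automated",
    vulnerability="no rate limiting on the token endpoint",
    impact="account takeover -> fraud losses -> regulator notification",
)
```

Freezing the dataclass is deliberate: a scenario is a versioned narrative, so revisions at architecture changes produce a new record rather than silently mutating the old one.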

A 2025 ISACA Journal piece on quantum risk assessment using the Risk IT Framework illustrates where this is going. The framework is being applied to scenarios that did not exist when it was first written: post quantum cryptographic transition, AI model poisoning, autonomous agent supply chain attacks. The structural method (asset, actor, intent, vector, vulnerability, impact) generalises gracefully, which is the property a framework needs to remain useful for a decade.

CSX and the NIST Five Functions in Production

ISACA's Cybersecurity Nexus (CSX) framework derives its operational structure from the NIST Cybersecurity Framework's five functions: Identify, Protect, Detect, Respond, Recover. (NIST CSF 2.0 added a sixth function, Govern, in 2024, but the operational spine remains these five.) The same five function spine informs EU member states' national cybersecurity strategies (Cyprus, for example, added Evaluate and Improve but kept the original five at the core). Using a single shared vocabulary for security operations, regardless of which side of the Atlantic the client sits on, removes a great deal of friction.

In practice we use the five functions as a cross check for security architecture in every system we design. Identify covers asset inventory, data classification, third party dependency mapping, and the regulatory scope analysis. Protect covers access control, secure development practices, encryption, and configuration baselines. Detect covers logging, anomaly detection, behavioural baselining, and the alert triage chain. Respond covers the incident response playbook, the legal notification clock under NIS2, GDPR, and sectoral rules, and the communications plan. Recover covers backup and restore design, business continuity, and the lessons learned loop back into Identify. When a system is missing visible controls under any one function, we treat that as a finding before the auditor does.
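That cross check reduces to a few lines of code; the control names in the sample design below are hypothetical:

```python
# The five NIST CSF functions that form the CSX operational spine.
NIST_FUNCTIONS = ("Identify", "Protect", "Detect", "Respond", "Recover")

def uncovered_functions(controls: dict[str, list[str]]) -> list[str]:
    """Flag any function with no visible controls: a finding we raise
    before the auditor does."""
    return [f for f in NIST_FUNCTIONS if not controls.get(f)]

# Hypothetical security design under review; control names are illustrative.
design = {
    "Identify": ["asset inventory", "data classification"],
    "Protect": ["SSO with MFA", "encryption at rest"],
    "Detect": ["central logging", "anomaly alerts"],
    "Respond": ["incident response playbook"],
    # "Recover" intentionally absent to show the gap being caught.
}
```

Here `uncovered_functions(design)` reports `["Recover"]`, turning a design review conversation into a named, trackable finding.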

The CSX certifications, notably CSX-P (the Cybersecurity Practitioner credential), are performance based rather than multiple choice, which matters when the question is whether someone can actually configure an intrusion detection system or harden a server, rather than recite definitions. We treat the CSX skill domains as a competency map for security oriented engineering work, in the same way we use COBIT for governance work.

COBIT for DevOps Audit Program: Auditable Delivery

ISACA released the COBIT for DevOps Audit Program in 2022, a companion to the COBIT 2019 Focus Area: DevOps publication. It maps DevOps practices to specific COBIT management objectives across the four management domains. The audit program tells you exactly what evidence an auditor expects to see for deployment automation, source control discipline, artefact integrity, secret handling, and rollback capability.

We have integrated the audit program into our delivery pipeline as a self assessment checklist. Each pull request that touches deployment infrastructure references the relevant objective number. Each release runs a pre flight check that confirms the artefacts an auditor would request still exist and are accessible: signed commits, dependency provenance, change approval records, infrastructure as code diffs, secret rotation logs. None of this is novel as a DevSecOps practice, but the COBIT mapping turns it from "we follow good practice" into "we follow practice with named external alignment to a recognised audit program," which is a different conversation with regulators and procurement teams.

The internal auditor's role in DevSecOps, as ISACA blog content from 2023 and 2024 has argued, is increasingly that of a catalyst rather than a gatekeeper. We have found the same in our own work. When the audit framework is accessible to the engineering team and the engineering practice is accessible to the audit framework, the friction collapses and the velocity goes up.

CMMI: Saying Out Loud How Mature a Process Actually Is

ISACA's CMMI (Capability Maturity Model Integration) provides five maturity levels: Initial, Managed, Defined, Quantitatively Managed, and Optimising. In 2023 ISACA added three new model domains to CMMI: Data Management, People Management, and Virtual Work, which together make the model usable for the kinds of distributed product organisations that have become standard since 2020.

We use CMMI maturity levels honestly. For any given practice (incident management, secure code review, AI model governance, third party risk assessment) we will say which level we are at and which level we are working towards. A practice at level 1 (Initial) is ad hoc and personality dependent. A practice at level 2 (Managed) is documented and repeatable. A practice at level 3 (Defined) is standardised across the organisation. Level 4 (Quantitatively Managed) means the practice is measured and the measurements drive decisions. Level 5 (Optimising) means the measurements drive continuous improvement of the practice itself. Most healthy engineering organisations live at level 2 to level 3 across most practices, with level 4 reserved for the practices where measurement actually pays back.
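The level names below are CMMI's own; the practices and level assignments are an illustrative self assessment, not a statement of our actual posture:

```python
from enum import IntEnum

class Maturity(IntEnum):
    """CMMI maturity levels; comments paraphrase the descriptions above."""
    INITIAL = 1                  # ad hoc and personality dependent
    MANAGED = 2                  # documented and repeatable
    DEFINED = 3                  # standardised across the organisation
    QUANTITATIVELY_MANAGED = 4   # measured; measurements drive decisions
    OPTIMISING = 5               # measurements drive improvement of the practice

# Hypothetical self assessment: practice -> (current level, target level).
ASSESSMENT = {
    "incident management": (Maturity.DEFINED, Maturity.DEFINED),
    "secure code review": (Maturity.MANAGED, Maturity.DEFINED),
    "AI model governance": (Maturity.INITIAL, Maturity.MANAGED),
}

def roadmap(assessment: dict) -> list[str]:
    """Practices where the target level exceeds the current one."""
    return [p for p, (cur, tgt) in assessment.items() if tgt > cur]
```

Because `IntEnum` levels compare numerically, the roadmap is a computation over the assessment rather than a slide: here it lists secure code review and AI model governance, each with a stated current level and a stated target.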

The honest level assessment matters because clients and underwriters are increasingly asking the maturity question directly, and an answer of "we follow best practice" is no longer enough. A credible level 2 with a roadmap to level 3 is a stronger answer than an aspirational level 4 that does not survive contact with evidence. ISACA's framework gives both parties the same scale to converse on.

Where the Frameworks Converge for Product Engineering

COBIT 2019, ITAF 5, Risk IT, CSX, the COBIT for DevOps Audit Program, and CMMI are not separate frameworks competing for attention; they are layers of the same governance structure. COBIT names the objectives, ITAF defines how to audit against them, Risk IT structures the uncertainty, CSX structures the security operations, the DevOps Audit Program operationalises delivery, and CMMI measures how well any of the above is actually implemented. ISACA has approximately 185,000 members globally and a credentialling stack that includes CISA (held by 151,000 plus professionals), CISM (named the 2025 Best Professional Certification Program by Certification Magazine), CRISC (30,000 plus holders), CGEIT (8,000 plus holders), and a growing AI focused credential set. That reach means the vocabulary is shared by enough professionals worldwide for the framework references to be portable.

For Lemorange, the value of building products inside this framework set is that engineering decisions can be defended at every level: to the engineers themselves (who get a clean separation of concerns), to clients (who get a recognised reference for what they bought), to auditors (who get the evidence types they expect), and to regulators (who get the documented governance that NIS2, GDPR, the Digital Operational Resilience Act, and sectoral rules increasingly demand).

We did not adopt ISACA frameworks because they are fashionable. We adopted them because the alternative is reinventing a vocabulary every time a client asks a hard question, and the cost of that reinvention compounds across projects and across years. The frameworks are imperfect (COBIT can feel heavy, CMMI was famously over claimed in the 1990s, ITAF still carries some of its accountancy origins) but they are the strongest set of shared references that exist for the work we do, and using them honestly is cheaper, faster, and more durable than any alternative we have tried.

  • Map every project's decisions to COBIT objective numbers, especially in the BAI and DSS domains
  • Treat ITAF 5 as a checklist of evidence types, not a compliance burden
  • Write Risk IT scenarios at project start, revise them at architecture changes, validate them after incidents
  • Use CSX (Identify, Protect, Detect, Respond, Recover) as a cross check on every security design review
  • Run the COBIT for DevOps Audit Program self assessment in the deployment pipeline, not as an annual exercise
  • State CMMI maturity levels honestly and present a roadmap rather than an aspiration
