“Governance” is one of those words that feels like it belongs to someone else. A compliance team, maybe, or a VP who sends emails with “audit” in the subject line. It’s easy to assume it doesn’t have much to do with your day-to-day work as a developer. That assumption is wrong. If you work in a regulated industry, governance is already yours whether you wanted it or not. And if you’re using AI tools to write code, you’re producing work faster than anyone around you can review it.

Governance is a chain of contracts

Strip away the jargon, and governance is really a chain of contracts. Each link in the chain creates obligations that flow downward.

Your company holds a certification (ISO 27001, SOC 2, PCI DSS, pick your flavor; Upsun holds all of them). That certification comes from a certification body that audits your company against a set of controls. Your company agreed to maintain those controls in exchange for the right to display the certification logo on its website and hand it to prospective customers.

Your company also has contracts with its customers. Those contracts often reference the certification explicitly: “We are SOC 2 Type II compliant, and here’s the report to prove it.” Customers signed on that basis. They’re trusting that the controls described in the certification are actually in place.

Then there’s your employment contract. You agreed to follow company policies, which include the controls required by the certification. You might not have read that part carefully, but the obligation is there.

So the chain looks like this: certification body sets the standard, company commits to the standard, you commit to the company. Governance is the practice of keeping this chain coherent: making sure that what the certification body expects, what the company promises its customers, and what you actually do on a daily basis are all the same thing.

Blame culture vs. root cause

Here’s where governance gets its bad reputation. In a poorly run organization, the contract chain becomes a weapon. Something goes wrong in production, and management traces the chain downward until they find someone to blame. The chain exists, but it’s only invoked after the fact, as a forensic tool for assigning fault. This is not governance. This is dysfunction wearing governance’s clothes.

In a well-run organization, governance does the opposite. Because the chain of contracts makes expectations explicit, everyone knows what they’re accountable for before something goes wrong. When an incident happens, you can run a blameless postmortem precisely because accountability was already clear. You’re not trying to figure out who should have done what; you’re figuring out why the system allowed a failure to happen despite the controls.

The scary part of governance isn’t the accountability itself. It’s ambiguity. When nobody knows where their obligations start and end, every incident becomes a political exercise. Governance, done properly, removes that ambiguity.

AI didn’t change the chain

Your SOC 2 auditor doesn’t care whether Copilot wrote your code or you typed every character by hand. The control says “code must be reviewed before merging to production.” Whether the code originated from a human brain or a large language model is irrelevant to the control. Your customer’s SLA doesn’t have an “AI did it” clause. If a deployment breaks their service, the contractual obligation is the same regardless of how the breaking change was authored. Your employment contract doesn’t distinguish between code you typed and suggestions you accepted from an AI assistant. You committed the code. You pushed it. Your name is on the pull request.

The chain of contracts hasn’t changed at all. Every link is exactly where it was before AI tools entered the picture. The certification body still expects the same controls. The company still promises the same things to its customers. You’re still bound by the same employment terms.

The speed gap

What AI did change is the speed at which you produce work that flows through this chain. Before AI tools, writing code was usually the slowest part of the chain. It generally took longer than reviewing it, testing it, or deploying it. The governance controls (code review, testing, approval workflows) tended to have spare capacity because the input rate was naturally limited by how fast a person could think and type. AI flipped that. You can now generate code faster than anyone can review it. The bottleneck moved from production to verification, and the governance structure wasn’t designed for that.

This is a hard constraint. A reviewer can only hold so much context in their head at once. They can only carefully read so many pull requests per day, and every hour spent reviewing is an hour they’re not writing their own code. There’s no “review harder” setting. When AI enables you to open ten pull requests in the time you used to open three, the review quality per pull request drops, or the review queue grows, or both. Either way, the controls that the certification body expects are degraded.

The chain doesn’t need to change for this to become a problem. Certification bodies don’t need to say anything about AI; the existing controls already cover the situation. “Code must be reviewed before production” doesn’t stop applying because the code was generated faster. The question isn’t whether the governance chain will adapt. It’s whether you and your team can find the right balance between AI-assisted speed and the review throughput your controls actually require.
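The arithmetic behind that claim is worth making concrete. Here is a minimal back-of-envelope sketch; the throughput numbers are purely illustrative assumptions, not figures from any real team:

```python
# Back-of-envelope model of the review bottleneck: PRs are opened at a
# constant rate, reviewed at a constant rate, and anything above review
# capacity piles up in the queue. All rates below are made-up examples.

def review_backlog(prs_opened_per_day: float,
                   reviews_completed_per_day: float,
                   days: int) -> float:
    """Pull requests still waiting for review after `days`, assuming
    constant rates and an initially empty queue."""
    surplus = max(0.0, prs_opened_per_day - reviews_completed_per_day)
    return surplus * days

# Pre-AI pace: 3 PRs/day against 4 reviews/day of capacity -> no backlog.
print(review_backlog(3, 4, 5))   # 0.0

# AI-assisted pace: 10 PRs/day against the same 4 reviews/day.
# After one five-day week, 30 PRs are waiting for review.
print(review_backlog(10, 4, 5))  # 30.0
```

The model is deliberately crude, but it shows why the problem compounds: as long as the production rate exceeds review capacity, the backlog grows linearly with time, and the only levers are producing less or reviewing less carefully.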

Understanding the chain gives you clarity

None of this should make you anxious. It should make you informed. The accountability was already yours before AI tools showed up. Your name was already on the pull request. Your company was already bound by its certifications. Your customers were already relying on those commitments. What’s different now is that you can produce more work, faster, than any human can verify. The governance structure doesn’t need to change, but how you work within it does.

Knowing the chain exists, understanding where the contracts connect, and recognizing the limits of human review capacity gives you something useful: the ability to make deliberate choices about how fast you move and how carefully you verify. Governance isn’t a bureaucratic obstacle. It’s a map of who promised what to whom. Reading the map is the first step toward working within it without getting surprised by it.
Last modified on April 14, 2026