The CLOC Conversations Every Legal Ops Team Should Be Having

It was great to be back at CLOC this year. The conversations across sessions made it clear that legal teams are navigating a real moment of change, not just in the tools they're using, but in how they're structured, how they operate, and what they're being asked to own. Here are three themes that kept coming up.

1. When AI isn't working, look at the infrastructure first

In one session, presenters cited figures showing that 41% of organizations have no formal AI policy and more than half have no formal AI training in place, and yet adoption is already happening. When AI doesn't deliver the results people expect, the instinct is to blame the tool. More often, it's a training and governance problem. The tool is only as reliable as the process built around it. That doesn't mean pausing adoption until every policy is in place. It means treating governance and training as active workstreams, not afterthoughts.

2. Legal Ops has evolved into a strategic function

One session on the evolving role of legal ops named something a lot of people in the room were already feeling. The function is no longer primarily about right-sourcing and efficiency metrics. Legal ops is increasingly an advisor to the GC, and GCs are sitting at the C-suite table as strategic advisors to the CEO and board, not just the people who sign off on risk.

That expanded scope is reshaping what legal teams need to look like. What roles do you need that didn't exist three years ago? How do you hire for the intersection of process, legal, and technology, and how do you train for it? Those questions don't have clean answers yet, but they're the ones legal ops leaders are actively working through.

But building those new roles is only part of it. The consistent friction point is change management. Not the technology, not the budget, not the org chart. Getting people to change their behavior is the hardest part of the job, and it's consistently underestimated when teams are scoping new initiatives.

3. Data and process gaps don't disappear when you add technology

Several sessions came back to the same sequencing problem: teams are buying technology before they understand their data gaps, and the technology ends up masking process problems rather than solving them. The questions that belong at the front of any tech evaluation are simple ones: What is the goal of the data? What do you need it for? What's missing? Technology deployed to extract and surface data is healthy. Technology deployed before you've answered those questions tends to create a different set of problems.

On the process side, a useful diagnostic came up: if the same question gets different answers depending on who you ask, or two people on your team are handling the same task differently, you don't have a documented process. Testing, documentation, and ongoing monitoring aren't optional steps, and neither is governance, even when it slows things down. When it comes to training and onboarding, showing people real examples of both good and bad use cases is an underused tool for building the shared understanding that makes a process actually stick.

The Big Picture

Across all three conversations, the pattern was the same: the foundation matters more than the tool. Process gaps don't close themselves when you add technology, roles don't evolve without investment in training, and AI doesn't build trust on its own. Getting those things right is the work.