Aimed at heads of legal innovation, IT directors and managing partners scoping AI projects at UK law firms. The single biggest determinant of whether a project goes well is how it is briefed at the start — not the cleverness of the technology.
Start With the Problem, Not the Technology
The most common mistake firms make is leading with the technology rather than the problem. "We want to implement AI in the firm" or "we want to use LLMs in the practice" are not useful briefs. They put the cart before the horse and tend to produce projects that look impressive technically but solve nothing specific.
The most productive conversations start the other way: here is a task we do repeatedly, here is how long it currently takes and who does it, here is what the output needs to look like. That specificity is what allows a delivery partner to tell you quickly whether automation is feasible, what it would cost, and what the realistic time saving would be.
The Information That Actually Matters
When preparing to brief a project, focus on collecting the following:
A description of the current manual process
Walk through what someone does today to complete the task, step by step rather than at a high level. If it involves reading documents — which documents, how many, in what format? If it involves pulling data from multiple sources — which sources, how is the data accessed, what format is it in? If it involves compiling a report — what does the report look like and who receives it (deal team, partner, client)?
The volume and frequency
How often does the task happen — daily, weekly, monthly, per matter? How many units of work per instance — how many documents, how many data points, how many records? Volume and frequency together determine the ROI case, so they are the most important numbers to have ready.
Who does the work and at what cost
What level of seniority currently does the task — trainee, paralegal, associate, senior associate? An approximate fully loaded cost (salary plus overhead, not charge-out rate) strengthens the business case. Even a rough figure is useful: "a mid-level associate costing the firm about £80 per hour fully loaded" is far more useful than nothing.
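The arithmetic behind the business case is simple enough to sketch. The figures below are purely illustrative placeholders, not benchmarks — substitute your own volume, frequency and cost numbers:

```python
# Illustrative ROI sketch for an automation business case.
# Every figure here is a hypothetical placeholder.

hours_per_instance = 6      # time spent per run of the task
instances_per_year = 50     # frequency, e.g. roughly one matter a week
fully_loaded_rate = 80      # GBP/hour: salary plus overhead, not charge-out
manual_residual = 0.2       # share of the work still done by hand afterwards

annual_cost_today = hours_per_instance * instances_per_year * fully_loaded_rate
annual_cost_after = annual_cost_today * manual_residual
annual_saving = annual_cost_today - annual_cost_after

build_cost = 25_000         # one-off build estimate, GBP
payback_years = build_cost / annual_saving

print(f"Annual saving: £{annual_saving:,.0f}")        # £19,200
print(f"Payback period: {payback_years:.1f} years")   # 1.3 years
```

The point of having these numbers ready before the scoping call is that a delivery partner can run exactly this sum in front of you and tell you whether the project clears a sensible payback threshold.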
What the output needs to look like
What does the finished product look like today? A DD report, a red-flag schedule, a Word memo, an entry in iManage or a matter management system, a structured database? And who uses it — internal, or does it go to a client? The output format determines what the automation has to produce, which directly affects the build.
Sample materials
If possible, bring examples of the inputs and outputs (suitably redacted or from a closed matter). A sample document the system would need to process. A copy of the report it would need to produce. Even approximate examples are valuable — they let a delivery partner assess quickly whether the automation is straightforward or complex, and whether there are edge cases that need handling.
What a Good Scoping Conversation Looks Like
A competent AI delivery partner should be able to tell you, by the end of a 45-minute call, whether your problem is automatable, what the likely approach is, roughly what it would cost to build, and what the realistic time saving would be. If they cannot give you a directional answer after a single conversation, they either lack experience or your brief is still too vague — go back and tighten it.
The conversation should cover inputs (what exactly goes in), outputs (what exactly comes out), edge cases (what happens when input is ambiguous or non-standard), and constraints (data residency, integration with iManage or your matter system, privilege considerations, budget range). If those questions are not being asked, that is a warning sign about the partner you are talking to.
Evaluating a Proposal
When the proposal arrives, scrutinise the following:
- Is the scope specific? A good proposal describes exactly what the system will and will not do. Vague scope is a risk — it gives the supplier flexibility to deliver less than you expected while technically meeting the brief.
- Is the accuracy claim realistic? Any claim of 100% accuracy should be treated with scepticism. A credible proposal specifies expected accuracy on the specific document types involved and describes the validation mechanism — how errors are caught before they reach the partner.
- What happens when it goes wrong? What is the exception-handling approach for documents the system cannot process confidently? A well-designed system flags uncertainties rather than guessing; the proposal should describe this.
- How are privilege and confidentiality handled? Where is data processed? Which sub-processors are involved? Is there a written DPA? Has the supplier worked with regulated UK firms before?
- What does ongoing maintenance look like? Document formats evolve. Regulation changes. Processes change. What is the arrangement for maintaining and updating the system after delivery?
What Makes a Project Go Well
The projects that succeed quickest share three characteristics: a specific bounded scope; a clear owner on the firm side who can make decisions about output format and validation criteria (usually a knowledge lawyer or innovation lead, not a remote IT contact); and real example documents to build and test against. The projects that run into difficulty have vague or expanding scope, no designated decision-maker, and no real materials to work with until late in the build.
Getting these three things in place before you start is the best investment you can make in a successful project. If you would like to talk through scoping a specific workflow at your firm, get a quote.