TL;DR:
Building contract AI yourself is costly. More problematically, it’s hard to get the tech right. You risk spending a significant amount of time and effort and still ending up with a solution (that you have to maintain) which isn’t as good as what competitors have through licensing tech.
As the leader of a software vendor, you (and your teams) are constantly making buy/build decisions. I’ve been there. Over a decade, I was CEO of Kira Systems as we grew from two people to over 200, building a dominant position in our main market.
Increasingly, we’re seeing software builders and systems integrators interested in adding contract analysis AI features to their apps. This makes a lot of sense. Contracts are like the operating system for businesses, and knowing what they say can help businesses make better decisions. If you’re going to implement a contract analysis AI feature, you have a range of choices that essentially boil down to:
- Build it yourself:
  - From scratch.
  - By buying a company that has already built it and incorporating their tech.
- License a pre-built system from another vendor (like Zuva, my selfish reason for writing this post instead of doing something more fun with this time!).
In this post, we’ll walk through building it yourself vs licensing in more detail. In a future post, we’ll drill more deeply into the case for buying a company and incorporating their technology. While this post is long, building or licensing contracts AI is a decision with lasting implications, and it makes sense to put in a bit of time to make a thoroughly-thought-through decision here.
Let’s start by covering a key problem of building contract analysis AI, then focus on costs and benefits.
Contract Analysis AI Is Hard To Build Well
In January 2011, I found myself at a Starbucks in Toronto, pitching four University of Waterloo computer science PhD grads on the idea that pulling data from contracts could be valuable. They thought this sounded like a not-too-hard problem, where leading open-source machine learning software plus maybe 100 annotations per category—about four months’ worth of work—would be likely to yield a well-performing system. I clicked well with one of the comp sci PhDs (my co-founder Dr. Alexander Hudek), and we set to work. Six-some-odd months later, we realized that the problem was a lot harder to solve than expected. After about another year of intense work, our system got okay, and with another year it got good. Over time, we built out our machine learning research team and continued to advance the tech. Looking back, I think we were lucky to get it to work well so quickly (even though it took much longer than expected). Also, it helped that getting our AI to work was basically the be-all-and-end-all of our business at the time. If we didn’t solve this problem, we would fail. Period.1
What went wrong? Essentially, the state of the art wasn’t where we thought it was, and many contracts turn out to have a lot more variation than expected. Sure, there are pretty standard contracts and easy clauses (governing law is great! Assignment, confidentiality, and term can all be good), but the trick is that many contracts are on third-party paper, and our experience was that many critical clauses could be drafted a lot of different ways. Exacerbating the problem, experienced lawyers tend not to agree on exactly what a given clause should capture (as we later found when we did research on this). Building accurate contracts analysis AI is hard, and harder than expected.2
My sense is that a number of companies have been down this same path of building their own contracts AI. I’m skeptical that many got to as positive a result with their tech. Publicly available machine learning tech may have advanced since we started, but I believe the problem remains hard. The biggest risk of building your own contracts AI is spending a significant amount of time and effort and still ending up with a solution that doesn’t work well. Even if you do get it to work well, it is very unlikely to be as good as the best-in-class licensable contracts AI. That means your competitors can get this feature built quicker and better by licensing it.
It gets worse. Not only is building your own contracts AI likely to take significantly longer; if you go down that path and your AI underperforms the best-in-class licensable tech, you may eventually end up switching to licensable tech anyway, having wasted the intervening years. Basically, if you think an AI contracts analysis feature is useful now, get it to work now.
With that overarching issue out of the way, let’s get to considering costs and benefits of building versus licensing.
Costs
There are three main types of cost you’ll face with building contract analysis AI from scratch or licensing it:
- Build/development cost. How much does it cost to get the feature working?
- Running cost. What is your ongoing cost of running and maintaining the feature?
- Opportunity cost. What else might you have done in the interim? Time spent doing one thing means less time to spend on another.
| | Build It Yourself | License |
|---|---|---|
| Build / Development Cost | Likely higher cost. If building an AI solution takes only 6 months (unlikely) for a development team of 10 (including data scientists, product management, software devs, and data annotation), then you’ll spend $500k–$1M just to build a solution that you still need to integrate (assuming you can even hire and retain the right talent).3 | Potentially some upfront licensing cost (e.g., to commit to a subscription tier, locking in lower per-unit pricing), plus the cost to integrate the licensed software into your product. Note that the tech talent needed to integrate licensed contracts AI is probably less specialized (and expensive) than that needed to build good contracts AI from scratch. |
| Running Cost | You will have processing costs. You will need to ensure that your offering’s performance scales with increased usage and that you have resources available to facilitate this (e.g., the necessary computing hardware, and staff to monitor the infrastructure). While some maintenance costs can be offset using cloud services, you’ll often still need people to monitor those services and ensure uptime. You will also need to maintain your AI feature and fix bugs in it; maintaining features can be costly and can distract your product development teams from other priorities. | With most licensed AI contract analysis software, you’ll likely pay a vendor primarily based on usage; the vendor may offer discounts in exchange for volume or duration commitments. This licensing cost is likely to be higher than the pure processing costs of running a homegrown AI solution. You may face additional operating costs if you self-host the licensed software, and you may have to do upgrades from time to time (depending on how you have deployed the software; this shouldn’t be an issue for cloud deployments). Your AI vendor should maintain the licensed software and fix bugs in it. |
| Opportunity Cost | Significant. Even if you have personnel on staff who could build an AI contract analysis feature, competitors can get this feature done with less effort through licensing (leaving more time available to solve other problems). | Note that if you build a “good” framework around a licensed solution, you can potentially trial or iterate through different licensed products more easily (kind of like a SIM card: you can switch phones while people can still reach you at the same number); see the sketch after this table. |
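To make the “SIM card” point concrete, here is a minimal sketch of the kind of thin abstraction layer I mean. It is in Python, and every name in it (ContractAIProvider, VendorAProvider, extract, Extraction) is hypothetical; it doesn’t reflect any particular vendor’s API, just the general adapter pattern that keeps vendor details out of the rest of your product.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import List


@dataclass
class Extraction:
    """One piece of data pulled from a contract (e.g., a governing law clause)."""
    field: str          # which category was extracted, e.g. "governing_law"
    text: str           # the extracted clause text
    confidence: float   # provider-reported confidence, normalized to 0..1


class ContractAIProvider(ABC):
    """The narrow interface your app codes against, regardless of vendor."""

    @abstractmethod
    def extract(self, document: bytes, fields: List[str]) -> List[Extraction]:
        ...


class VendorAProvider(ContractAIProvider):
    """Adapter for one (hypothetical) licensed contracts AI service."""

    def __init__(self, api_key: str):
        self.api_key = api_key

    def extract(self, document: bytes, fields: List[str]) -> List[Extraction]:
        # Call the vendor's API here and map its response into your own
        # Extraction objects. Only this adapter knows vendor-specific details.
        raise NotImplementedError("wire up the vendor's SDK or API here")


def review_contract(provider: ContractAIProvider, document: bytes) -> List[Extraction]:
    """Application code depends only on the interface, not on any one vendor."""
    return provider.extract(document, fields=["governing_law", "assignment", "term"])
```

The specific classes aren’t the point; the point is that if the rest of your product only touches ContractAIProvider, trialing a second vendor (or, eventually, your own model) means writing one more adapter rather than rewriting your app.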
Benefits
Here are potential benefits you might consider. These aren’t here to try to convince you to implement contract analysis AI—maybe I’ll focus on that another time :) —but rather to outline a set of factors where build vs license could play out differently.
- Marketing benefits. Makes your product appear more attractive, helping brand or demand generation efforts.
- Automated contract data detection seems like a popular feature with buyers, and you are likely to promote that you have this feature.
- My sense is nearly all licensees tout that they have incorporated a third-party AI. That said, depending on the terms of your contract with the AI licensor, you may not need to say whether you built or licensed your AI.
- You need to decide for yourself which will give you the most marketing impact: being able to say you built your own AI or licensed best-in-class tech built by someone else.
- If you decide to build yourself, you should be prepared to answer questions from market commentators, prospects, and customers about how well your solution measures up against best-in-class licensable contract analysis AI.
- Sales benefits. Helps you close deals faster?
- We have heard that AI contract analysis features can matter for sales. One exec recently mentioned that they had double-digit Q4 prospects in their funnel where this feature was important to them.
- If the feature matters, getting it done quickly and with high certainty might cut towards licensing.
- You can probably fake a demo of your feature if it doesn’t work super well, but you will have more trouble in trials, and more trouble still if you get a reputation for your feature not working as well as advertised.
- If you build yourself, prospects are increasingly likely to have questions about how your homegrown tech matches up against best-in-class licensable AI.
- Customer success impact. Makes your existing customers happier, more likely to retain, and increase their spend.
- A state-of-the-art contracts AI system today generally still makes mistakes (misses and false positives) but can be impressive at many tasks.
- Customers are really going to care about how well your feature works.
- Whether you build yourself or license, your customers are likely to have lots of questions about how your contracts AI works, so be sure you make/get quality documentation.
- My experience has been that customers can be very intolerant of imperfect features (even when you are clear about limitations). Generally, building a feature with significant limitations means you will have to over-prioritize maintaining and improving it.
- This cuts towards licensing the technology, since you will then be able to get access to best-in-class AI, and have your provider maintain it.
- Competitive advantage. Note that your product needs to be more than different; it needs to be better (or perceived as better) to achieve a competitive advantage.
- If your competitors have contract analysis AI embedded into their apps, then having the feature itself does not give competitive advantage.
- The quality of your feature versus your competitors’ could become a competitive advantage (or disadvantage).
- Unless your homegrown feature is better than the best-in-class licensable alternatives, it will likely create negative differentiation. That is, your product will be differentiated from competitors’, but in a way that makes yours worse.
- Company valuation enhancement. Investors might perceive your company as more valuable because you have AI.
- In general, IP is valuable. If you own your own AI (and it’s good) that should enhance your company’s value.
- Investors have gotten more sophisticated over time with AI, and are likely to ask how a homegrown AI stacks up against a best-in-class licensable offering.
- If your homegrown AI doesn’t match up particularly well, it may be perceived as “technical debt” in that you will likely feel pressure to replace it with a licensed alternative in time.
- My sense is that a long term licensing deal might be enough to satisfy investors.
- Licensing shows investors that you are focused on areas where you can positively differentiate.
Hopefully this was a helpful run through my thinking on the topic. I’d love to hear any feedback on how you think about the problem.
With thanks to Dr. Adam Roegiest, whose comments and thinking on this (through our collaboration on “A Guide to Evaluating Contract Analysis AI Solutions”) influenced my views on the subject.
1 We later realized that we could have pivoted to find a market where semi-accurate contracts AI would have been appealing, instead of continuing down the exceptionally difficult path we were then on.
2 Note that it is fairly easy to get 60–75% accurate contracts AI to work, but getting it more accurate can be very tough, in our experience.
3 Given current market conditions for tech talent, these numbers seem conservative.