Navigating AI in the UK as a Founder or Entrepreneur
AI is rapidly changing the way we do business. AI presents exciting opportunities for founders and emerging companies, but the laws governing AI in the UK are in a state of flux. There are many different laws and pieces of guidance to understand and contend with, which can be challenging for emerging businesses.
Join AI and Technology partner Charlotte Walker-Osborn and IP partner Steven James to hear about some of the areas you need to be thinking about as a founder, investor or advisor and when to start having those discussions.
[00:00:00] Welcome to MoFo Perspectives, a podcast by Morrison Foerster, where we share the perspectives of our clients, colleagues, subject matter experts, and lawyers.
[00:00:19] Steven: Hello, welcome to the MoFounders videocast, which is also available to be downloaded as a podcast. My name is Steven James, a partner at Morrison Foerster. MoFounders is a Morrison Foerster group designed to bring together a community of entrepreneurs, investors, and advisors. In this episode, we’re talking about AI and in particular, what you need to think about if you’re a business that is in this space or is planning to be.
[00:00:45] There’s been a lot of discussion about how legal frameworks around the world may need to change to address the challenges and opportunities afforded by AI. To navigate our way through these opportunities and challenges in the UK and elsewhere, I’m delighted to be joined today by Morrison Foerster partner Charlotte Walker-Osborn, who jointly leads MoFo’s European Tech and AI practice.
[00:01:06] Charlotte, welcome to the MoFounders videocast. So, if we could start off by just talking through some of the challenges. Say you’re a startup or emerging company in this space and you want to be in AI, which I appreciate encompasses all sorts of different things and all the different types of technology. What are the things that you need to be aware of if you’re, say, launching in the UK?
[00:01:29] Charlotte: Good and big question, Steven. Well, I think there are two aspects really. There’s what I’d call the back-office element of AI that might be used in the founder’s business, but also, as you say, if they’re specifically using AI in their products or services, there’s that aspect, and those are two different things.
[00:01:49] I haven’t counted the regulations, but there are hundreds of regulations around the world. In the UK, I think it’s a question of saying, is this just going to be for the UK? In which case there’s a number of regulations to think about. If you are in the UK, but you’re going to market to other countries, you’re going to have to think about those regulations.
[00:02:16] There’s good and bad news. There are a lot of incoming principles across all this legislation that companies need to think about, and whether they are going to be challenged by customers in terms of their products and services, by governments in terms of how they govern their internal back office, or even on exit.
[00:02:39] What people will look at, what we’re really looking at, is: is there transparency in the AI and how it’s being used? Is it explainable? Is there a human in the loop? If there’s an issue with bias or discrimination, can that be sorted? These are some of the very, very basic principles. Now, we could spend hours on this, so I won’t go into detail, but basically founders need to think very carefully about how they are going to answer some of these key principles.
[00:03:11] And then in the EU, as you know, there is a big piece of law coming soon. So if they’re marketing to the EU, that’s one thing, but focusing back on the UK, at the moment the UK is a bit mixed in its approach to AI. On the one hand, it’s saying it’s pro-innovation. Good news for founders. But on the other hand (and I’ll ask you a question about intellectual property shortly), it does have some very important things for founders to think about.
[00:03:36] So, the UK is basically saying, we’re not going to have, at the moment, our own AI regulation. And so, founders have to think about all the regulations they’re looking at already: data privacy, if there’s personal data; other laws around product safety, if it’s AI in a product that’s doing a technical thing; and various others.
[00:04:04] But what the government is saying, and it’s literally just had a consultation and then a response, is that it’s asking all those regulators to put in place some AI-specific rules. So, that means founders are going to have to keep an eye, during 2023 and into 2024, on what those specific rules are, because they will then become statutory, and they will need to be looked at.
[00:04:29] But I do want to ask you about intellectual property, because for me, that’s actually the biggest, most important point here. If you’re in the UK and you’ve got an AI model, or you’re developing one and you want to train it in the UK with data, the UK’s position isn’t that pro-innovation. So, I wouldn’t mind you just talking to that point a little bit.
[00:04:53] Steven: Yeah, sure. And it’s a very real issue. I think part of the challenge is that when the UK exited the European Union, it hadn’t transposed what was going to be a much broader, what they call, text and data mining exemption, which the EU has, and that permits effectively mining for commercial purposes, subject to an opt-out.
[00:05:18] In the UK, there was a consultation, potentially in accordance with this pro-innovation approach, on having an even broader exemption, which would be for any purpose and which you couldn’t opt out of. And this was something that was being discussed and mooted around 2022. There was a consultation on it. But when this prospect of a super broad exemption came out, there was a huge pushback from the creative industries.
[00:05:45] And the UK obviously has a very strong reputation in the creative sector and among content providers. As a consequence, the plans to have this very broad exemption were scrapped early in 2023. So, what we’re left with is what feels like a slightly archaic, outdated exemption for text and data mining, which allows you to do those activities only for noncommercial purposes, which, of course, is going to be unhelpful for the majority of founders, entrepreneurs, and small businesses, who obviously want to do the training for commercial purposes. And the other problem with the UK position in terms of copyright is that the exemptions are far narrower than, say, the U.S. position, where you have these very broad fair use principles, which allow you to do transformative things with the copyright-protected material that you have.
[00:06:43] So that is a real challenge. And I think that’s something that the government and legislators are ultimately going to have to work with, because at the moment there are many, many risks if you’re going to be training in the UK. So, I imagine most providers would be looking to train elsewhere, which is easier said than done and much more challenging if you’re a startup that’s based here and wants to get into this space.
[00:07:07] So it’s going to be a very difficult thing to navigate. And I would expect movement in this space: there’s a UK Intellectual Property Office consultation looking at this, trying to get the balance right between content holders, rights holders, and the AI providers. But it’s going to be a very difficult balance to strike because of these competing tensions: wanting to support tech and innovation, but also being mindful of people’s intellectual property rights and their right to be fairly compensated.
[00:07:35] Charlotte: Yeah. And I think it is something to grapple with. You and I both work in the space of licensing in IP and tech. I think how we’ve both been seeing companies deal with it well is actually just to be really cautious about what data they are using for training if they want to train in the UK.
[00:07:55] So there are many companies, founders, scale-ups, and bigger companies, who are just complying with those rules. So, thinking: I probably can’t scrape data, but I can go to public areas where I’ve got a free license to use that data, or I can strike up data licensing deals, or I can use my own data.
[00:08:20] So I think it depends on what they’re doing and what they need to train. But I think the key point you’re making is that this is going to be a very contentious area if it’s not right at the start, and that can affect the value of the model. So that is absolutely critical, and it can also affect the exit.
[00:08:40] Those are the sorts of things that the diligence we’re helping with will look at very, very closely: how has the data been captured? So not even just where it’s trained, which is what we’re talking about here, but if it’s personal data, data of individuals, has that been captured in a fair way under privacy laws or consumer laws?
[00:09:00] So I think, unfortunately, that money and time, and it doesn’t all need to be with lawyers, is time well spent trying to work that position out and get it right early on. And then, of course, lots of people are talking about GenAI, and a lot of people are using GenAI models themselves rather than building their own.
[00:09:20] I think it is just a question of looking at, well, what does the contract say about those rights to use? And are you comfortable, if you really embed that in your products and services, that it is going to be something you can use in the long term? And whether that’s GenAI or someone else’s AI, I think that’s something you and I have been talking to businesses about.
[00:09:41] If you’re embedding AI, critically even into your back-office systems, you need to make sure there won’t be an IP infringement, one way or another, and that the AI isn’t suddenly going to be something you have to stop using; and if it is, what’s the workaround?
[00:10:02] So there’s just some important stuff to grapple with. Obviously, I think we’re both in the tech geek space, so we don’t see this as an issue. You just have to get it right early on, which, unfortunately, as we know, is hard when we often want to just get going. But this is going to be really critical.
[00:10:20] Steven: Yeah, and you’ve touched upon it there as well. There’s a lot of money, obviously, around the AI space at the moment, lots of investment going in. But what are the things you need to make sure you have in order if you want to get investment or if you want to have an exit? What other things do you really need to be thinking about and getting right if you’re in that space and looking to achieve one of those outcomes?
[00:10:43] Charlotte: Yeah. And bringing it back to these overall principles: whether I am licensing in some AI or building it myself, how do we know the data is allowed to be used? As we said: personal data, public data, data that has a copyright or an IP holder. But then crucially, once you’ve got that data, have you tagged it correctly?
[00:11:13] And have you ensured the AI model is trained in a non-biased manner? And forgive me if you’re a gentleman who is white and of a certain age, and Steven’s heard me say this before: even if you’ve got the right data, you have to think really carefully, if that’s historical data, about whether it’s still going to produce the right result.
[00:11:37] Because there was something in America, and this is one of the high-risk, mooted areas of AI: using AI for recruitment. If you ask who should be a board member, and you looked at historical data, maybe not from the last 10 years, but even then, and certainly from before, it would probably say a good board member will be a white male of 60 years old.
[00:12:00] So it’s really about applying a lot of common sense. The good news is, I think, a lot of tech founders, startups, and scale-ups kind of know this stuff instinctively, but there’s just a lot of legislation and a lot of focus on this. And again, going back to exit, that means it gets crawled over by people who are looking to buy companies and/or by investors.
[00:12:25] Steven: Yeah. So, it sounds like what you’re saying, in summary, is that planning and thinking about these things up front, managing the risk, engaging with different stakeholders really early, making sure you’ve considered all these different aspects: that’s what these kinds of businesses need to be thinking about.
[00:12:39] Charlotte: Yeah, exactly. And the one thing that I think is critical is also: how are you going to have the processes, or a person, or even some tech in place so that, if the model or the result is challenged, you can explain why and/or do something about it? And that technical question: how easy is it to unpick the model? So, you’re absolutely right. It just needs a lot of thinking about at the start, because it’s not so easy to fix later.
[00:13:11] Steven: Yeah. I think what we’re taking from this is that this area is really in a state of flux. There’s going to be a huge amount of development in this sector, and we will be giving further updates as the legislation, the principles, the codes of practice, and the legislative framework in the UK and elsewhere develop.
[00:13:31] And these are things that if you are a founder, if you are in an early‑stage business, you need to be engaging with and grappling with at an early stage. So please do look out for further MoFounders videocasts and podcasts in this area. Thanks very much, Charlotte.
[00:13:46] Charlotte: Thanks, Steven.
[00:13:53] Please make sure to subscribe to the MoFo Perspectives podcast so you don’t miss an episode. If you have any questions about what you heard today or would like more information on this topic, please visit MoFo.com/podcasts. Again, that’s MoFo, M-O-F-O, dot com, slash podcasts.