
AI in Business: Risks, rewards and legal realities

From a corner café to the latest app, there’s no easy path to success in business. Matthew Pelkey ’10 sees that every day as director of the law school’s Entrepreneurship Law Center Clinic.       

And that path has become increasingly complex with the rapid growth of artificial intelligence tools. AI has permeated the business world, promising unparalleled efficiencies and opportunities while at the same time presenting thorny problems of ethics and technical limitations.

It’s a development with increasing relevance for the entrepreneurs whom the clinic advises, and one that Pelkey and his students are confronting. Pelkey spoke to the issue last month at a major State University of New York symposium on AI, held at the University at Buffalo. For that event, Pelkey was on a panel with two members of SUNY’s Office of General Counsel, discussing the intersection of AI and the law.

UB Law Links asked Pelkey, who practices business law with Colligan Law LLP in Buffalo in addition to his teaching, to reflect on how artificial intelligence is playing out in start-ups and other businesses, as well as in the classroom.


Matthew Pelkey ’10

There has been much talk about how lawyers and firms can use artificial intelligence, and some have begun using AI tools. Is this something the clinic can benefit from?

There is certainly a role for AI tools, but it is also critical that students learn the foundational tools of practice first—sort of a “learn to walk before you run” situation. I want students to understand how to do something manually, the right way, and then learn how to do it faster and more efficiently.

Are law students rushing to adopt AI?

I think it is unavoidable. There is certainly an interest in it, but it goes beyond that. AI is everywhere, whether we realize it or not. Our clients are creating AI tools. They are using AI tools in working with us. They are using AI tools to know what questions to ask us. They are checking our answers with AI tools. It introduces some efficiencies for them, but what red flags does that raise?

It really depends on the context. It can certainly assist with something like drafting a contract, but there are areas where it is not well suited. For instance, AI tools do not work particularly well if you’re trying to build a capitalization table or calculate pro-formas and the tax consequences of a proposed transaction. AI tools (at least right now) do not do as well with more quantitative work. It also tends to be very generalized. That can give you an OK place to start from, but it is rarely the complete answer and often misses jurisdictional nuances.

Are start-ups more conversant with AI than older established businesses? Do they find a competitive advantage in these tools?

I think more often it is a question of necessity for early-stage startups. They often do not have the same resources as more established companies, so they may be forced to rely on more cost-effective options like AI tools for something like drafting a simple non-disclosure agreement. I’m not sure that’s really an advantage. At the same time, there are start-ups that embrace AI tools in markets that are ripe for disruption, and that can certainly be a competitive advantage. A good example of this is legal AI tools. I think we will continue to see these proliferate, and ideally the legal profession—and law schools—will take a more active role in their development.

Overall, in a business context, do you see more peril or more promise in the advent of readily usable artificial intelligence?

I think it is both. There is no way to avoid it at this point. You cannot put the AI genie back in the bottle. Some areas and tools show incredible promise, others probably less so. What is clear to me is that our clients will continue using these tools whether we as lawyers decide to or not. It is a very effective solution to the cost pressures that clients are facing. Maybe it is not a perfect solution, but if clients can get an 80 percent answer for a fraction of the cost, many will decide that is good enough. We, as a profession, need to figure out how we can add value beyond what AI tools can provide.

How can businesses guard against the pitfalls that might result?

Awareness of the limitations of AI tools is important. Yes, ChatGPT can give you an answer, but that does not mean it is the correct answer or the best answer for you under the circumstances. I am not saying businesses should not use these tools to build foundational knowledge, but they should understand that it is still important to consult with competent legal counsel.

It is also important, if you are a start-up developing AI tools, to recognize the legal risks that can result from machine learning models—data privacy, discrimination and bias, and intellectual property rights are very real legal issues that can arise when deploying these tools in the marketplace. These are areas where attorneys will undoubtedly be needed.