A few years back, I attended a Zoom meeting with a company that teaches recruiters how to incorporate AI as a helpful tool. They suggested three services: ChatGPT, Gemini, and Claude. Each excels at different things. I didn’t sign up for the course at the time, as I didn’t know how much of an asset AI really would be.
There have been multiple studies done on these large language models (Gemini, Claude, and ChatGPT), and I would encourage people to experiment. Our firm decided to try out two of the recommended AI models: ChatGPT and Claude. I’ve since added another, Perplexity, but because I haven’t used that one as extensively, I won’t discuss my experiences with it here. I also didn’t use Gemini in depth, so my comments are limited to my experiences using Claude and ChatGPT.
After a year of experimenting with AI in my legal recruiting practice, here are my thoughts:
What was helpful
- Writing job descriptions
I found AI helpful in creating job descriptions. The more information I could input, the better the description came out. In my recruiting practice, I spend a lot of time with each client discussing their hiring needs and ideal candidate profiles. After that conversation, I’m able to articulate what I think is important, what the client does and does not want, and their priorities. The more I talk to either the client or a candidate, the better idea I have of what they want. The same is true with AI. The more information I input into the chat, and the more I refine the requests (over several attempts at prompts, refining each time), the better the AI output. It’s not one question or statement in, and we’re done. When all is said and done, I think what AI was able to produce was well done. It wasn’t perfect, but it was a great start.
- Explaining complex law
Explaining complex concepts and areas of law that I was not familiar with was another area where I found AI helpful. Most recruiters have been to law school or have some background in law (but not all). However, not everyone remembers everything they learned in the classes they took, nor will they be well versed in obscure areas of law. Especially if, like me, you attended law school more than just a few years ago, memories can become hazy. Plus, law continues to change over time. I found putting questions about specific areas of law and legal concepts into a chat query to be a very good starting point. AI is quite good at pointing out important things you may need to understand when talking about any particular subject.
- Initial research
AI is an excellent entry point for research, but you can go down a rabbit hole if you start going back and forth with it. Using it to find information on law firms, for example, is very useful. It’s helpful for researching salary information (in general), such as what partners are making (in general). It’s useful for asking basic questions, like who oversees recruiting at a particular law firm, when you don’t have a contact.
Caveats
- Double check
I use both of the chat models I’m experimenting with to compare and double check the results. What a chat gives you may not necessarily be 100% correct, so it can’t be relied upon blindly. The one thing I remembered from the introductory lecture on AI that I mentioned above is that it can lie. If you need to count on something AI tells you, try to double check it. (The Gemini AI model runs through Google, so it’s free of charge to some degree and might be a good place to start your double checking.)
- Incomplete information
The information AI provides is only as correct as the data its developers have inputted and uploaded. Therein lies the problem, and why I don’t think I would rely on just one AI system, or on AI completely. Also, for example, Claude’s most recent training data ends in January 2025, so information about anything that happened during most of 2025 is lacking. You may have to do more digging to make sure you have the most accurate and updated information possible. But for recruiters, as with lawyers, our reputations are important, so we have to do our own diligence.
- Bot talk
While AI can write 50 letters in mere moments if you give it the correct information, you want your work to sound like you, not like a robot wrote it. If I do have AI help me write a letter or job description, I always go back and add my language and how I speak. I found that AI can come close to sounding like the person it’s been chatting with over a period of time. In fact, Claude can be very familiar in its communications with the person asking the questions, which sometimes is quite amusing—especially the time it responded to me, “No sh*t!”
There are differences between ChatGPT and Claude in how they interact with the user. Claude tends to be more human in conversation; hence, the “No sh*t.” It speaks in a familiar way. It can remember conversations, so it may volunteer something you were not actually asking about, dropping a pearl of wisdom into the chat. I find it mimics a more human voice. On the other hand, I find ChatGPT to be crisper in reasoning and deep research. It writes well, too. I think the choice may come down to personal preference.
Conclusion
These are just a few examples of where AI helped me in my recruiting practice. There are a ton of things it can be used for, and I know I haven’t used it to its fullest capacity. Currently, for me, it’s like bouncing ideas off a colleague. In the end, AI is a tool and will not have the instincts that I, the recruiter, do, because recruiting is about people—and AI is only a robot.