COMPLIANCE
AI Administrative Use Guidelines
By Scott Snipkie
If you read enough about artificial intelligence (AI), it certainly seems capable of doing anything. My personal favorite “use” of AI is tasking DALL•E to create a picture of Winston Churchill and Bart Simpson playing badminton in the style of Goya. You’ve probably got higher-minded or more industrious ideas about its use—I know my clients do.
Over the last few years, I’ve been watching the expansion of AI into the advisory industry with hints of hope and trepidation. Some of its earlier uses seemed to revolve around the specific acts of creating financial plans and managing client assets, but more recently I’ve noticed a shift: advisers are using AI to unburden themselves of administrative and ministerial work like capturing meeting notes.
That shift to using AI for those necessary but time-consuming administrative tasks feels like a great idea, and it gets advisers back to doing what they love and what we need them to do: providing personalized investment advice. However, it seems incumbent upon me in my role as a compliance consultant to add a touch of gray to this silver lining.
Focusing on my area of expertise, it’s important I note that nearly every task an adviser performs comes with compliance considerations and obligations. Interposing a tool between yourself and those tasks doesn’t alleviate those considerations, and AI is no different. In fact, using AI for administrative tasks probably complicates things somewhat by implicating multiple compliance considerations at once.
My intention here isn’t to scare you away from using AI. My goal is to highlight some areas to consider in your use, so when incorporating AI into your administrative processes, you’ll be able to see around a few corners and stay buttoned up in key areas of your operations.
A few caveats before I proceed. AI is still new, and while I’m trying to provide some guidance here, what I’m about to note is by no means exhaustive. Also, when adopting any sort of tool into your processes or practice, make sure to talk it over with your compliance team to ensure you’ve got the appropriate policies, procedures, and disclosures in place. Finally, everyone’s business is a little different, so keep that in mind as you read. With those caveats noted, below are some things to think about when considering using AI to help with your administrative load.
- One of the first things to consider is whether the AI software/service you’re going to use records meetings with clients. If it does, that recording likely constitutes a record of a client communication, which means you may need to retain it, or at least have access to it, so you could produce it for an examiner. Creating that recording may complicate your practice: if you have to retain it, destroying it or failing to retain it might create an issue with your regulator, while retaining it might be a substantial burden both in terms of sheer storage space and in terms of whether it’s something you’re comfortable sharing with examiners.
- In that same vein, if the software/service makes a recording of your meeting, you’ll probably want to implement a process for getting the client’s permission to record. While some states allow people to record conversations without the consent of the other parties involved, plenty don’t, so the best practice is likely to be open and honest and always get a client’s permission so you’re never in violation of the law.
- Even if the software/service doesn’t record your meetings, you’re likely picking one that either creates a transcript of the conversation or produces notes of it; that’s the point of what you’re doing, right? You’re probably going to rely on those notes at a later date just as you rely on the notes you take yourself, so you need to be sure they’re accurate. How do you do that? A best practice for transcripts or notes of meetings (apart from, of course, retaining them for books and records purposes) is a contemporaneous review to establish their accuracy. At a bare minimum, this likely looks like reviewing the notes or transcript upon receipt to ensure they match your recollection of events and then making a note that you conducted that review. I can’t make any guarantees, but if I were under regulatory examination and used AI software to capture notes of client meetings, I’d be shocked if an examiner didn’t ask me about the process I use to ensure my notes are accurate.
- Some more areas to consider in your use of AI for administrative purposes are cybersecurity and privacy. Because everyone’s compliance manual is drafted a little differently, tailored to prevent the sorts of securities-law violations posed by that firm’s business model, these may not apply to everyone. With that said, client meetings often cover sensitive information about a client’s life and financial situation, and exposure of client non-public information (NPI) is a concern for every firm, one likely addressed in some way in every manual, so the following should be good food for thought:
- Check to ensure you’ve got a written agreement with the vendor licensing the software or providing the service.
- Review the agreement to confirm it contains a clause requiring the vendor to notify you in the event of a breach and that it outlines whose responsibility it is to notify your clients if that breach occurs.
- Determine whether that agreement also contains a confidentiality clause that defines confidential information to include client NPI and that addresses the vendor’s use, retention, and destruction of any client NPI in its possession.
- Determine whether you need to generate and retain a record that tracks and catalogs that vendor’s access to client NPI.
- Consider whether you should generate and retain a record of the diligence you’ve performed on this vendor, addressing cybersecurity and privacy concerns, in anticipation of using its software/service.
- Finally, you should consider reviewing your privacy policy notice with an eye toward whether it adequately informs clients of the exposure of their NPI to the vendor associated with this software/service, what sort of NPI the vendor collects, and how and when it collects that information. It’s entirely possible you may need to revise your privacy policy to use AI in this way, and that would mean delivering the revised notice to all your clients, which could lead some of them to opt out of your use of AI with them altogether.
As I noted above, I think the use of AI by advisers can have a generally positive impact on firms and clients alike; it’s just the nature of the compliance consultant to be the worrying sort. I hope none of the foregoing has scared you away from its use but rather that it’s given you a base of things to consider moving forward, helped you better prepare, and maybe even spurred your consideration of things I haven’t mentioned here.
Scott Snipkie, a Mizzou (JD, MA) and Penn State (BA) alum, joined Adviser Compliance Services in 2019, and keeps busy away from work by boring his wife, son, and dog with baseball and compliance soliloquies while watching Penn State football. Reach him at scott@advisercompliancesvcs.com or 573-416-8076.