The SEC’s proposed rule governing how brokerages and investment management firms use artificial intelligence and predictive analytics will lead many self-directed investors to be cut off from access to markets altogether, an attorney for Robinhood warned during a discussion at last week’s Consumer Federation of America conference on financial services.
The commission’s proposed rules are meant to limit “conflicts of interest” that can arise when brokerages and asset managers use AI and predictive analytics to make investment recommendations and trading decisions that “optimize for, predict, guide, forecast, or direct investment-related behaviors or outcomes.” Many investor advocates take issue with investment apps that seek to “gamify” investing, using engagement tools more akin to those of social media platforms, which can lead users to make impulsive and potentially harmful investment decisions.
But Robinhood Deputy General Counsel Lucas Moskowitz said that the proposal goes too far and would eliminate the “innovation and technology” that ease access to the markets by self-directed investors who otherwise would not use a traditional brokerage firm. That would lead to the “real tragedy” of those investors leaving the market altogether.
He argued that while FINRA-registered brokerages already operate under a best-interest standard, layering additional rules onto how self-directed investors operate on the platforms is a “slippery slope.”
“I think you just go to the next logical conclusion, which is that customers just shouldn’t do this on their own on self-directed platforms, and I think that’s not a result that I would hope anyone would really want,” he said.
But the argument that firms would have to give up some technology is a “red herring,” countered Stephen Hall, the legal director of the consumer watchdog group Better Markets.
“It doesn’t say that, nor would it necessarily have that effect,” he said. “What it says is if you have harmful conflicts of interest in the technology you’re using, then you have to rid your technology of that, period. You can still use it.”
Hall said rules currently on the books were not “up to the task” of overseeing AI-related conflicts, because Regulation Best Interest (Reg BI) relies more heavily on disclosure.
Even if firms delivered mandated disclosures to investors in time, and even if those disclosures were understandable, that would not solve AI-related conflicts because the disclosures would not give investors the tools to decide how best to use the information, he said.
To Hall, the industry needed to face up to the fact that digital engagement practices too often amounted to a “scam.”
“Is the client better off if they save 10 bucks because they have commission-free trading, but they are induced into trading options and on margin in ways that cause them to lose thousands of dollars?”
The North American Securities Administrators Association also threw its support behind the SEC’s proposal. During the panel, Kristen Hutchens, NASAA’s director of policy and government affairs, said state regulators worried about the twin risks of heightened “emotional investing” and digital scams when firms use AI tools.
She charted historical markers suggesting that the younger generations flocking to self-directed apps were primed to do so, from the rise of CNBC and 24/7 coverage of every nuance in the market to the introduction of things like the Nintendo Game Boy and Facebook.
“In hindsight, it may not be a surprise we ended up with a Robinhood,” she said.
Jasmin Sethi, an associate director of policy research for Morningstar, echoed concerns raised in the organization’s letter to the SEC that the rule tends to treat every technology “pretty much the same,” and advocated instead for the commission to take a risk-based approach.
“If it’s going to be a rule dealing with technology, it should be a rule about technology, not mixing conflicts and technology,” she said. “Those are separate things.”
The proposed rule was released in late July, followed by a 60-day comment period. A revised final version will likely be released in 2024.