Rounds leads call for SEC to change AI conflict-of-interest proposal

U.S. Sen. Mike Rounds (R-SD) and a bipartisan group of his colleagues recently requested that the U.S. Securities and Exchange Commission (SEC) withdraw and substantially change its proposed artificial intelligence (AI) conflict-of-interest rule.

The proposed rule, “Conflicts of Interest Associated with the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers,” which the SEC issued in July 2023, would harm American innovation and curtail financial services firms’ use of many technologies, including AI. Such a move could limit market access for both retail and institutional investors, according to a May 23 letter four lawmakers sent to SEC Chairman Gary Gensler.

“The proposal would upend today’s rules by adopting broad definitions of foundational regulatory concepts and a one-size-fits-all approach to addressing conflicts of interest,” wrote Sen. Rounds and his colleagues, who included U.S. Sen. Martin Heinrich (D-NM).

The proposed rule would apply to firms’ current and future use of “covered technologies” in “investor interactions,” and its broad definitions of those terms would extend its reach to technologies and firm activities far beyond the rule’s stated purpose, the senators wrote.

For example, the definition of “covered technologies” would capture virtually every technology firms use, including AI, spreadsheet formulas, and numerous other basic tools that have been in common use for decades, while the definition of “investor interaction” would cover almost any communication or engagement with a current or prospective investor, according to their letter.

“Taken together, these definitions establish a very broad scope with no clear limit,” wrote Sen. Rounds and his colleagues. “Firms would need to review almost all technology — regardless of use — and document compliance. This would be extremely challenging and expensive.”

The senators pointed out that federal regulatory agencies implementing rules on the use of data analytics, AI, and other technological tools need to design them carefully so that they complement existing rules governing human behavior and narrowly target the specific and unique properties of the technology.

“Although we recognize there are risks and challenges that the SEC must address as AI evolves, the agency should strive to keep regulation outcome based and aim for technology neutral regulations, except where emerging technology poses unique risks that cannot be adequately addressed by those technology neutral regulations,” they wrote.

The SEC should consider re-proposing the rule only after the lawmakers’ concerns and “robust stakeholder engagement” have informed “extensive, material changes” to it, the senators wrote.