Surveying the risks: RICS proposed updates tackling financial crime
Financial crime is on the rise. In response to the new technologies criminals are using, such as AI and cryptocurrency, RICS launched a consultation calling on members, regulated firms and key stakeholders to respond to proposed changes to "The Financial Crime Standard" (the RICS Countering Financial Crime: Bribery, Corruption, Money Laundering, Terrorist Financing, and Sanctions Violation Professional Standard). Originally launched in 2019 as a professional statement and upgraded to a mandatory standard in July 2023, the standard is being revisited because RICS recognises that financial crime is evolving and it is clear the profession needs to adapt its ability to address issues such as bribery, corruption, money laundering, terrorist financing and sanctions violations in order to protect the public.
We await the results but outline the proposed changes below.
Updates
- Expanded key terms, e.g. AI, Nominated Reporting Officer, SAR, Specially Designated National, Tipping Off, Proliferation Financing, Sanctions, and Trade-Based Money Laundering. RICS also recognises that, whilst AI is useful and should be considered (e.g. to improve customer due diligence (CDD)), it may not be cost-effective for all firms, and viability will depend on their resources and specific operational demands. Separately, RICS launched a consultation on the Responsible Use of AI, which closed at the end of April 2025.
- Updated to reflect emerging financial crime risks driven by AI, cryptocurrency, and conflict-related trade in resources like gold and timber (often used for money laundering). Indeed, where clients ask for funds to be paid in cryptocurrency, members should remember that terrorists favour this method and should therefore undertake enhanced due diligence.
- Strengthened the bribery and corruption section with in-depth guidance on risk and regular reviews. This includes regular training for employees, developing policies and procedures to manage potential risks, and recording the risks identified.
Additions
- Trade-based money laundering (hiding illegal money through trade): watch out for unusually high or otherwise suspicious transactions.
- Terrorist financing: scrutinising supply chains and applying UN Security Council Resolution 2195.
- Electronic verification: using technology to make checks faster, more accurate and more efficient (see the illustrative sketch after this list).
- Due diligence outsourcing: members remain responsible for checks even where they are outsourced.
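
To make the electronic verification point concrete, below is a minimal, purely illustrative sketch (not RICS guidance; the list entries and matching threshold are hypothetical) of screening a client name against a sanctions list using fuzzy matching, with possible hits escalated for manual review:

```python
from difflib import SequenceMatcher

# Hypothetical extract of a consolidated sanctions list (illustrative only;
# real screening would use the official, regularly updated lists).
SANCTIONS_LIST = ["Ivan Petrov", "Acme Trading FZE", "Maria Rossi"]


def similarity(a: str, b: str) -> float:
    """Case-insensitive similarity ratio between two names (0.0 to 1.0)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


def screen_client(name: str, threshold: float = 0.85) -> list[str]:
    """Return possible sanctions-list matches at or above the threshold.

    A hit is a prompt for escalation to the firm's Nominated Reporting
    Officer and enhanced due diligence, not an automatic decision.
    """
    return [entry for entry in SANCTIONS_LIST
            if similarity(name, entry) >= threshold]


if __name__ == "__main__":
    # A transliteration variant still scores above the threshold.
    print(screen_client("Ivan Petrof"))  # ['Ivan Petrov']
```

The fuzzy threshold matters because sanctioned names often appear in transliterated or lightly altered forms; an exact-match check would miss them.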
On the face of it, these new standards may seem straightforward. However, it is important to ensure members understand the enhanced risks, particularly in relation to AI usage. For example, if a member is established in the EU and looking to use or provide an AI system, they are likely to be in scope of the EU's AI Act. Whilst it is unlikely that AI used for due diligence will fall within the high-risk category, it may well fall within the limited-risk category. In practice, this means that if a member undertaking due diligence provides a chatbot for clients to input their own details, the member will need to disclose that clients are interacting with an AI chatbot and not a person (transparency). Also, if a member is using AI within the EU, they will need to train people within the firm about AI (AI literacy); the training required is proportionate to the use and the risk presented by the AI system. Whilst there does not appear to be a financial penalty for breach of the AI literacy requirement, breach of the transparency requirement can attract fines of up to €15m or 3% of annual worldwide turnover.
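
As a purely illustrative sketch of the transparency point (the wording and flow are hypothetical, not text prescribed by the AI Act), a due diligence chatbot might surface an AI disclosure before collecting any client details:

```python
AI_DISCLOSURE = (
    "Please note: you are chatting with an automated AI assistant, "
    "not a person. Your answers will be used for due diligence checks."
)


def start_due_diligence_chat() -> list[str]:
    """Begin a chat session, showing the AI disclosure first.

    Surfacing the notice before any data is collected is one way a firm
    might approach the transparency obligation; the exact wording and
    placement are matters for legal advice.
    """
    return [
        AI_DISCLOSURE,
        "To begin, please confirm your full legal name.",
    ]


if __name__ == "__main__":
    for message in start_due_diligence_chat():
        print(message)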