A Guide for Tech Companies on New AI Rules in Switzerland

New AI Rules Released by FINMA in Guidance 08/2024

Switzerland’s financial regulator, FINMA, has published new guidelines on the use of artificial intelligence (AI). The rules were released on December 18, 2024, in a document called “Guidance 08/2024.”

These guidelines are important for any technology company that sells AI products to Swiss banks or insurance companies. While the rules are aimed at financial institutions, they directly affect you as a technology provider. Your clients must prove your AI tool is safe and well-managed. If they cannot, they will not buy your product.

This article explains what these new rules mean for your business and how you can prepare.

What the New FINMA Rules Say

FINMA’s guidance is based on principles, not exact technical instructions. It sets clear goals for managing AI risks.

Someone Must Be in Charge

A human must be responsible for every AI system. Companies cannot blame the AI when something goes wrong. Banks and insurers must assign a person to be accountable for each AI tool.

Keep a List of All AI Tools

Financial companies must create and maintain a full inventory of all their AI applications. This list must include tools they build themselves and tools they buy from other companies, like yours.
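
To make this concrete, here is a minimal sketch of what one inventory record might contain, written in Python. The field names are our own illustration, not a schema prescribed by FINMA; the guidance only requires that such an inventory exists and covers both in-house and purchased tools.

```python
from dataclasses import dataclass

@dataclass
class AIInventoryEntry:
    """One record in a bank's AI application inventory (illustrative fields only)."""
    name: str               # e.g. "Credit Scoring Assistant"
    vendor: str             # "in-house" or the name of the third-party provider
    purpose: str            # the business use case the tool supports
    risk_rating: str        # the bank's own classification, e.g. "high" / "low"
    accountable_owner: str  # the named person responsible for the tool
    last_review: str        # date of the most recent risk review (ISO format)

# A bought tool is listed alongside anything built in-house:
inventory = [
    AIInventoryEntry(
        name="Credit Scoring Assistant",
        vendor="ExampleVendor AG",  # hypothetical third-party provider
        purpose="pre-screening of loan applications",
        risk_rating="high",
        accountable_owner="Head of Retail Credit",
        last_review="2025-01-15",
    ),
]
```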

Leaders Must Understand AI

The company’s board and executive team have the final responsibility for AI risk. They need to understand AI well enough to set safety limits and oversee its use. They are expected to ask tough questions about how AI systems work.

Know the Dangers of AI

FINMA lists several AI risks that companies must manage. These risks include biased decisions, a lack of clear explanations for AI outputs, and errors from generative AI. Banks must figure out which of these risks apply to each AI system they use.
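
As one concrete illustration of the bias risk, the snippet below computes a demographic parity difference: the gap in positive-decision rates between two groups. This is only one of many possible fairness metrics, and the data is invented for the example; FINMA does not prescribe a specific metric.

```python
def positive_rate(decisions):
    """Share of decisions that were positive (e.g. loan approved)."""
    return sum(decisions) / len(decisions)

# Invented example data: 1 = approved, 0 = rejected, split by a protected attribute.
group_a = [1, 1, 0, 1, 1, 0, 1, 1]
group_b = [1, 0, 0, 0, 1, 0, 0, 1]

gap = positive_rate(group_a) - positive_rate(group_b)
print(f"Demographic parity difference: {gap:.2f}")  # a large gap warrants investigation
```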

Banks Are Responsible for Your Tools

This point is critical for tech companies. When a bank uses your AI product, the bank is still 100% responsible for it. The bank cannot pass the risk on to you. This requirement means the bank must closely check and monitor your product to ensure it meets FINMA’s standards.

AI Must Explain Its Decisions

The results from an AI model must be understandable to the staff using it. “Black box” models, where the decisions are a mystery, are considered very high-risk. If your AI cannot explain how it makes decisions, Swiss banks will find it very difficult to buy your product.
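
One straightforward way to avoid the “black box” label is to expose per-decision contributions. The sketch below does this for a simple linear scoring model, where each feature’s contribution is just weight times value; the weights and features are invented, and more complex models would need dedicated explanation techniques.

```python
# Invented weights for an illustrative linear credit score.
WEIGHTS = {"income": 0.5, "debt_ratio": -0.8, "years_employed": 0.3}

def score_with_explanation(applicant):
    """Return a score plus each feature's contribution to it."""
    contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
    return sum(contributions.values()), contributions

score, why = score_with_explanation(
    {"income": 1.2, "debt_ratio": 0.4, "years_employed": 0.5}
)
print(f"score={score:.2f}")
for feature, contribution in sorted(why.items(), key=lambda kv: -abs(kv[1])):
    print(f"  {feature}: {contribution:+.2f}")  # staff can see what drove the score
```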

Why This Matters for Your Tech Company

FINMA’s rules apply directly to banks, not to you. But in practice, this makes little difference. Your customers are the banks. They must follow these rules, so they will require you to meet the same standards.

Before a Swiss bank buys your AI product, they must perform checks. They will ask you for documents, explanations, and proof that you manage risks like data bias. If you cannot provide this information, you will lose the sale. Being compliant is now a requirement to enter the market.

FINMA’s survey in April 2025 showed that about 50% of Swiss financial firms already use AI. This widespread use means these rules affect a very large market. Your ability to meet these new standards will determine your access to that market.

Switzerland’s Rules vs. The EU AI Act

Switzerland and the European Union are taking different paths to regulate AI. Understanding the difference is key for your business strategy.

  • Switzerland’s Approach: The Swiss method is based on principles. FINMA applies existing financial rules to AI. It tells banks to “manage the risk” but does not give a long list of specific technical rules.
  • The EU’s Approach: The EU AI Act is based on strict rules. It creates fixed categories for AI, such as “High-Risk” or “Prohibited.” Each category comes with specific legal duties.

In Switzerland, whether an AI is “high-risk” depends on how it is used. The same AI tool could be low-risk in one situation and high-risk in another. In the EU, the classification is often fixed by law.

However, many Swiss banks also operate in the EU, so they must follow the EU AI Act anyway. Because of this overlap, they often apply the stricter EU standards to all their systems. For tech companies, this means building your product to meet EU standards is a good strategy to access both markets.

A Practical Checklist for Tech Companies

Here is what you should prepare to meet your clients’ needs.

Documents and Explanations

  • Model Facts: Create a simple sheet that documents your AI’s data sources, training methods, and known limits. A sketch of such a sheet follows this list.
  • Clear Explanations: Write a non-technical summary of how your AI makes decisions. This summary is vital for selling to banks.
  • Data Quality: Be ready to show proof that you have checked your training data for errors and bias.
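
As a starting point for the “Model Facts” sheet mentioned above, here is a minimal sketch of a machine-readable version, serialized as JSON. The structure and field names are our suggestion, not a standard required by FINMA or by any bank; every value is a placeholder.

```python
import json

# Illustrative model facts sheet; all values below are placeholders.
model_facts = {
    "model_name": "ExampleRisk v2.1",  # hypothetical product name
    "data_sources": ["anonymised loan outcomes, 2018-2023"],
    "training_method": "gradient-boosted trees, retrained quarterly",
    "known_limits": [
        "not validated for applicants under 18",
        "accuracy degrades on incomes outside the training range",
    ],
    "bias_checks": "demographic parity gap below 0.05 at last audit",
    "last_updated": "2025-02-01",
}

print(json.dumps(model_facts, indent=2))
```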

Oversight and Monitoring

  • Change Log: Keep a record of all updates to your model. Your clients will need to know how you test the AI after it changes.
  • Backup Plan: Define what happens if your AI fails. Does a person take over? Make the process clear; a sketch of one fallback pattern follows this list.
  • Test Results: Provide reports showing your model works well, even in difficult situations.
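
The sketch below shows one way to make the backup plan concrete: if the model’s confidence falls below a threshold, or the model fails outright, the decision is routed to a human reviewer and the event is logged. The threshold, function names, and dummy output are all illustrative assumptions.

```python
import logging

logging.basicConfig(level=logging.INFO)
CONFIDENCE_THRESHOLD = 0.8  # illustrative cut-off; tune per use case

def model_predict(case):
    """Stand-in for the vendor's model; returns (decision, confidence)."""
    return "approve", 0.65  # dummy output for the example

def decide(case):
    """Use the model when it is confident; otherwise hand over to a person."""
    try:
        decision, confidence = model_predict(case)
    except Exception:
        logging.exception("Model failed; escalating to human review")
        return "human_review"
    if confidence < CONFIDENCE_THRESHOLD:
        logging.info("Low confidence (%.2f); escalating to human review", confidence)
        return "human_review"
    return decision

print(decide({"applicant_id": 123}))  # -> "human_review" in this dummy setup
```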

How This Affects Your Company’s Funding

Investors look closely at whether a tech product can be legally sold. If your AI product cannot meet these new standards, your potential market in Switzerland is much smaller. This market limitation makes your company a riskier investment.

Getting funding often depends on successful pilot projects with banks. But banks are reluctant to pilot a “black box” AI because of the regulatory risk. Without these pilot projects, it becomes much harder to raise money.

Investors now check for AI governance during their review process. Having your documents, risk plans, and oversight procedures ready will help you move faster through fundraising.

For Swiss investors, navigating this new landscape is key. It highlights the importance of balancing high-risk technology bets with more predictable investments. CapiWell helps investors achieve this balance by providing access to a range of stable, alternative assets, creating a well-rounded multi-asset portfolio.

References

  • FINMA, “Guidance 08/2024: Governance and Risk Management when using Artificial Intelligence (AI)” (December 18, 2024)
  • FINMA, “FINMA survey: artificial intelligence gaining traction at Swiss financial institutions” (April 24, 2025)
  • PwC Switzerland, “What FINMA’s Guidance 08/2024 means for your institution”
  • Lenz & Staehelin, “FINMA Issues Guidance on AI Use in Financial Institutions”
  • Pestalozzi Attorneys at Law, “FINMA guidance on governance and risk management when using AI”
