
Data law


Whistleblower protection 

Background

The Whistleblower Protection Act (HinSchG) came into force on July 2, 2023, transposing Directive (EU) 2019/1937 into German law. It serves to protect whistleblowers and obliges companies above a certain size or in certain sectors to set up internal reporting channels. The aim is to protect whistleblowers from reprisals and give companies the opportunity to resolve grievances internally before they become public. 

An effective whistleblower protection system is not only a legal obligation, but also offers companies numerous advantages: it strengthens compliance, reduces the risk of legal violations, and prevents economic damage from reputational harm or high administrative fines. Companies that take whistleblower protection seriously also benefit from a better working atmosphere and an open culture of error management.

Why whistleblower protection is crucial for companies 

Companies benefit in several ways from effective whistleblower protection. 

Early problem detection

Internal reports enable violations to be identified at an early stage and countermeasures to be taken. 

Reputation and trust

Companies that actively protect whistleblowers are perceived as responsible and enjoy greater trust among employees, customers, and business partners. 

Reduction of liability risks

Proactive measures help avoid administrative fines, penalties, and damage to your image. 

Open corporate culture

A whistleblower protection system creates transparency and promotes a culture of open communication.

Who must set up a whistleblower protection system? 

Companies and public authorities with at least 50 employees are generally required to set up an internal reporting channel. In certain sectors, such as the financial sector, this obligation applies regardless of the size of the company. 

Your advantages with our whistleblower protection system HintPilot

Our law firm offers you a comprehensive whistleblower protection system that meets all legal requirements and can be seamlessly integrated into your corporate structure. With our secure reporting channel HintPilot, we enable simple, confidential, and efficient processing of reports.

General Data Protection Regulation (GDPR)

GDPR – Background 

The General Data Protection Regulation (GDPR) has applied since May 25, 2018, and forms the central legal basis for the protection of personal data in the European Union. It specifies how controllers and processors may process personal data in order to protect the privacy of natural persons. The aim of the regulation is to ensure a uniform level of data protection within the EU and to give data subjects more control over their data.

Basic principles of the GDPR 

The GDPR is based on several basic principles that must be observed by all controllers: 

Lawfulness, fairness, transparency

The processing of personal data must have a legal basis and be comprehensible to the data subjects.

Purpose limitation

Personal data may only be collected and processed for specified, explicit, and legitimate purposes.

Data minimization

Companies should collect only as much data as is necessary for the intended purpose.

Accuracy

Personal data must be accurate and kept up to date. 

Storage limitation

Personal data must not be stored for longer than necessary.

Integrity and confidentiality

Companies must take appropriate security measures to protect personal data. 
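
To make the principles of data minimization and storage limitation more concrete, the following sketch shows one way a retention rule might be applied in practice. It is an illustrative assumption only; the names (CustomerRecord, RETENTION_DAYS, purge_expired_records) and the retention period are not prescribed by the GDPR and must be determined for each processing purpose.

    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone

    # Illustrative retention period (assumption); the appropriate period depends
    # on the purpose of processing and any statutory retention duties.
    RETENTION_DAYS = 365

    @dataclass
    class CustomerRecord:
        # Data minimization: store only the fields needed for the purpose.
        customer_id: str
        email: str
        created_at: datetime

    def purge_expired_records(records: list[CustomerRecord]) -> list[CustomerRecord]:
        """Storage limitation: keep only records within the retention period."""
        cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
        return [r for r in records if r.created_at >= cutoff]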

Obligations of the controller

In addition to these basic principles, controllers must fulfill very specific obligations when processing personal data. These include, for example: 

  • Ensuring the rights of data subjects
  • A reporting obligation in the event of personal data breaches 
  • The appointment of a Data Protection Officer 
  • Ensuring technical and organisational measures for data protection
  • General documentation and accountability obligations 
  • Rules for transfers of personal data to third countries

Penalties for violations

The GDPR provides for severe penalties for violations. These range from warnings to administrative fines of up to €20 million or 4% of a company’s global annual turnover, whichever is higher. In addition to financial penalties, violations can also lead to reputational damage and a loss of trust among data subjects.
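
As a rough illustration of how this upper limit is determined under Art. 83(5) GDPR, the following sketch computes the fine ceiling from a company’s annual turnover. The function name and the example figure are assumptions for illustration only; the actual fine in a given case is set by the supervisory authority.

    def gdpr_fine_ceiling(global_annual_turnover_eur: float) -> float:
        """Upper limit under Art. 83(5) GDPR: EUR 20 million or 4% of
        worldwide annual turnover, whichever is higher."""
        return max(20_000_000.0, 0.04 * global_annual_turnover_eur)

    # Example: a company with EUR 1 billion turnover faces a ceiling of EUR 40 million.
    print(gdpr_fine_ceiling(1_000_000_000))  # 40000000.0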

Would you like to get an initial overview of the level of data protection in your organisation? Take advantage of our free GDPR quick check. 

Interview with Scheja & Partners

Artificial Intelligence Act (AI Act)

AI Act – Background

The AI Act aims to create a harmonized set of rules within the EU that protects the safety and fundamental rights of citizens while promoting innovation. Clear guidelines are intended to minimize the risks associated with the use of AI.

Classification of AI systems 

A central element of the AI Act is the risk-based approach, which classifies AI systems into different categories:

Unacceptable risk

AI systems that pose a threat to safety or fundamental rights are prohibited. 

High risk

These systems are permitted but subject to strict requirements.

Specific AI systems

These systems are subject to special transparency requirements.

AI competence

The AI Act requires companies, public authorities, associations, and NGOs to ensure that their employees have a “sufficient level of AI competence.” “AI competence” includes knowledge of the opportunities and risks of AI, the rights and obligations under the AI Act, and the ability to use AI systems competently. As this applies to all AI systems, regardless of their risk classification, all entities that use AI systems are affected.

Consequences of using AI

The core element of the AI Act is the determination of the specific obligations that apply to you. The first step is to check whether an “AI system” within the meaning of the AI Act exists at all. If this is the case, it must be determined what “role” the company, public authority, association, or NGO concerned plays and what risk the respective AI system poses. This gives rise to specific obligations that must be observed when using AI.
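
To illustrate the sequence of questions described above, the following sketch models the determination of obligations as a simple decision flow. The risk categories follow the classification listed above; the role values, function name, and the mapping to example obligations are simplified assumptions and do not replace a legal assessment.

    from enum import Enum

    class Role(Enum):
        PROVIDER = "provider"
        DEPLOYER = "deployer"

    class Risk(Enum):
        UNACCEPTABLE = "unacceptable risk"
        HIGH = "high risk"
        SPECIFIC_TRANSPARENCY = "specific transparency requirements"

    def determine_obligations(is_ai_system: bool, role: Role, risk: Risk) -> list[str]:
        """Simplified decision flow: no AI system -> no AI Act obligations;
        otherwise obligations depend on role and risk class (illustrative)."""
        if not is_ai_system:
            return []
        if risk is Risk.UNACCEPTABLE:
            return ["use of the system is prohibited"]
        # Baseline duty regardless of risk class: AI competence of staff.
        obligations = ["ensure a sufficient level of AI competence among employees"]
        if risk is Risk.HIGH:
            obligations.append(f"meet the strict requirements for high-risk systems as {role.value}")
        if risk is Risk.SPECIFIC_TRANSPARENCY:
            obligations.append("comply with the special transparency requirements")
        return obligations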

Scheja & Partners provides comprehensive advice on all legal issues relating to the AI Act, including the complex determination of roles and risks, and thus identifies your individual obligations. This allows you to concentrate fully on your core business and ensure that AI systems are used in compliance with the law.

Let’s talk.

Jens-Martin Heidemann, LL.M.