Have management forgotten their GDPR obligations?


Blog post by Dr Christina J. Colclough, Founder of The Why Not Lab. Published 9 December, 2024



If management are processing employee data, they have a number of obligations towards their employees. These are clearly set out in the GDPR, yet much indicates that these provisions are widely violated. It’s time to act!


Over the past years, I have held workshops and keynote speeches for hundreds of European workers, shop stewards and union officials on the digitalisation of work, and worker and union strategies to protect workers’ rights.

In most of these events, an overwhelming percentage (close to 100%) of participants confirm that (1) they do not know whether management are using digital systems that process workers’ personal data or personally identifiable information (PII), and if so, which ones; and (2) they have never been consulted in connection with a data protection impact assessment (DPIA).

This is problematic. Firstly, management can be using digital management systems that, albeit unintentionally, have negative consequences on the freedoms, rights and autonomy of workers, yet if workers and/or their representatives don’t know, they can hardly do anything about it.

Secondly, the vast majority of digital systems used by management are developed by third parties. This adds a dimension to the necessary due diligence. Have the systems been adapted to local laws and regulations? Has management negotiated guardrails that protect against the reuse, selling or other transfer of workers’ data? Can the third party refuse to amend or recode the system if (unintentional) harms are identified? Do the systems actually do what they claim: reduce administrative burdens, improve productivity and efficiency, etc.? If so, at what cost, not only to labour but also to management’s own autonomy?

Thirdly, management are obliged by law - the GDPR - to inform employees of all of the above, but given the responses from the shop stewards and workers in my workshops, they apparently are not doing so.

This is leading to an information asymmetry that is weakening the power of organised labour, of labour market laws and agreements and the collective voice. So what GDPR obligations are management likely violating?

Possible GDPR Violations

Articles 13 and 14 GDPR: Information to be provided

The GDPR sets out clear transparency obligations, notably in articles 13 and 14. Under these, employers (in their role as controllers) must provide employees with a range of information “at the time when personal data are obtained”, meaning before the data is first collected or immediately after. This includes information about:

  1. “the purposes of the processing for which the personal data are intended as well as the legal basis for the processing.” (art 13 (c))

  2. the recipients or categories of recipients of the personal data, i.e. whether workers’ personal data or PII are disclosed to any public or private entity other than the controller, the processor, and persons who, “under the direct authority of the controller or processor, are authorised to process personal data” (art 13 (e) and 4 (9)).

  3. “the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject” (art 13 (f)).

Whilst articles 13 and 14 contain more transparency obligations, not least on the employees’ rights as data subjects, the three mentioned here are important for the following reasons.

  • Knowing the purpose of the data processing allows employees to know (a) what data is being processed, and (b) why. Knowing this will enable employees to monitor for purpose-slip, meaning the employer starting to use that personal data for a new purpose. This is important so employees can negotiate guardrails and redlines. For example, if the employer issues personal electronic keycards to each employee for security purposes in case of an emergency, but later uses the data from these cards to evaluate employee performance, that is a purpose-slip workers should address.

  • Knowing which third parties receive employees’ personal data or PII gives employees the possibility to negotiate guardrails around the reuse, repurposing or selling of these data. The GDPR does place strict limitations on third parties’ rights to do so, but again, if employees don’t know, they can’t do what they must to protect their personal data.

  • Profiling is one of the most opaque, yet potentially most impactful, aspects of digitalisation. It refers to the aggregation and comparison of large amounts of data to find (dis)similarities, probabilities and patterns that are then used to profile employees. For example, a profile could be: “likely to leave the workplace soon”, or “less productive and less likely to be appreciated by customers”. These profiles can have a detrimental effect on workers who, because of them, might not get a job, a promotion or a pay rise, or who might be labelled as someone who should be fired. Profiling is a form of automated decision-making, but there are many more algorithmic systems that are partially or entirely automated. In the Netherlands, in a case between Uber and its drivers, the Amsterdam Court of Appeal found that the human review of the decisions that led to two workers being fired was nothing more than a symbolic act. Management are, in other words, responsible for ensuring that automated systems are reviewed and questioned, not relied upon 100%. If the use of digital systems has an impact on workers, it is always management’s responsibility. They cannot blame the system.

Article 35: Data Protection Impact Assessment

This article states:
“Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data. A single assessment may address a set of similar processing operations that present similar high risks.”

What constitutes a ‘high risk’ is defined in recital 75. Here it says:
“Where personal aspects are evaluated, in particular analysing or predicting aspects concerning performance at work.”

This means that the processing of workers’ data by management for the purpose of assessing their performance is a high risk and a data protection impact assessment (DPIA) must therefore be made before a digital management system is taken into use.

Further, article 35 (9) says:

“Where appropriate, the controller shall seek the views of data subjects or their representatives on the intended processing, without prejudice to the protection of commercial or public interests or the security of processing operations.”


The data subjects in our case are the workers, their representatives are the shop stewards, and the controller is the employer. In other words, management should consult with the shop stewards or, if there are none, the workers when conducting a DPIA prior to the implementation of a digital management system that processes workers’ data or PII.

Article 35 (9) gives workers the opportunity to offer input into how the systems in question might affect different categories of workers, such as male vs female, young vs older, ethnic minorities vs majorities, disabled vs abled, or any mix of these. Workers can also provide input into how the systems might affect working conditions, stress, burnout, the right to organise, the right to collective bargaining and much more. In other words, this article is an important entry point into having a say over the digital systems and, in cooperation with management, determining what management must keep a special eye on, for whom and how often.

Having a say

Workers can’t affect what they don’t know exists. Fortunately, management are obliged to tell employees which digital systems they are using that process workers’ data. In addition, the GDPR includes the oftentimes overlooked article 35 (9), which provides an additional opportunity to query a digital system’s impact on different categories of workers and on working conditions. By flagging potential risks, employees and/or their representatives can hold management accountable for monitoring harms and remedying them. This is the basis for negotiation over the technologies used, a basis that apparently isn’t being used, but should be.

For more guidance on how to use your GDPR rights as a worker, head over to our guide called Workers Data Rights. It offers a step-by-step way to map your legal rights and management’s obligations.



