Introduction
This guide is designed to help organizations understand and evaluate the data protection aspects of using Indivd’s Quality Assurance (QA) service for People Counting. According to the GDPR, a Data Protection Impact Assessment (DPIA) may be advisable where processing, even if limited, could impact the rights and freedoms of individuals. This guide supports such assessments.
Indivd’s QA service is optional and can be used to verify the statistical accuracy of automated people counting insights. Customers may alternatively choose to perform QA independently using Indivd’s tools and infrastructure. This guide reflects documented practice and privacy-by-design controls.
This guide is based on documented experience from previous DPIAs; it is not legal advice but a practical tool to support your DPIA process. For questions or more comprehensive documentation, please reach out to privacy@indivd.com.
1. Why and when to conduct a DPIA
While the QA process does not involve large-scale or continuous monitoring, it temporarily processes personal data (image data) and may therefore benefit from DPIA documentation under Article 35 of the GDPR. The Swedish Authority for Privacy Protection recommends DPIAs for such borderline cases as a precaution, even where high risk is not clearly present.
Based on a previously conducted DPIA, this service may be considered to pose a low risk to the rights and freedoms of individuals. In such cases, a DPIA is not required by law, and no prior consultation with the supervisory authority is necessary. However, documenting a DPIA is good practice to support transparency and accountability.
QA is activated only upon customer request and is fully controlled by the customer in terms of timing, scope, and data volume. It is not conducted in real time, and all data processing is limited and purpose-specific.
- Key insight: The QA process relies on the legitimate interest legal basis (Article 6(1)(f) GDPR). No profiling, biometric analysis, or continuous surveillance is performed. All data is deleted automatically within a strict 48-hour window.
A previously conducted DPIA for this processing activity is attached at the end of this guide to support transparency and accountability and to facilitate your own assessment.
2. Understanding the processing activity
Indivd’s Quality Assurance (QA) process involves the temporary collection, storage, and review of image data. This data supports the validation and calibration of automated people counting systems. All manual review is performed on full-body masked video fragments, not raw video. Raw recordings are stored only for short-term automatic reprocessing, such as during calibration, and are never accessed by any human.
Image: During the QA process, QA analysts count the number of times a box (i.e., a masked person) crosses the green lines.
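To make the full-body masking concrete, the sketch below shows one way such a mask could be applied in code: every detected person's bounding box is blanked out before a frame can be displayed. This is a hypothetical illustration only, not Indivd's implementation; the `mask_full_body` function and the upstream person detector that supplies the boxes are assumptions.

```python
import numpy as np

Box = tuple[int, int, int, int]  # (x1, y1, x2, y2) from an assumed person detector

def mask_full_body(frame: np.ndarray, boxes: list[Box]) -> np.ndarray:
    """Return a copy of the frame with every detected person fully occluded.

    Each detection becomes a solid opaque box, so no body pixels reach a
    reviewer; analysts count how often these boxes cross measurement lines.
    """
    masked = frame.copy()
    for x1, y1, x2, y2 in boxes:
        masked[y1:y2, x1:x2] = 0  # solid block over the whole body
    return masked
```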
Process Overview:
- Customers schedule QA sessions and define scope (location, time, and sample size)
- Snapshots are taken outside opening hours to define measurement lines
- Short video recordings are captured and encrypted
- Raw video is stored for a maximum of 48 hours for automated system reprocessing
- Masked (full-body) video fragments are reviewed once by Indivd QA analysts
- All masked (full-body) fragments are automatically and permanently deleted after review (the sketch below mirrors these stages)
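As a rough illustration of how these steps could be sequenced, the sketch below models a single QA session as a strictly forward-moving state progression. The stage names and the `QAStage` enum are invented for this sketch; they mirror the overview above, not Indivd's actual system.

```python
from enum import Enum, auto

class QAStage(Enum):
    """Lifecycle of one QA session, mirroring the process overview."""
    SCHEDULED = auto()       # customer defines location, time, and sample size
    LINES_DEFINED = auto()   # snapshot outside opening hours fixes measurement lines
    RECORDED = auto()        # short video recording captured and encrypted
    REPROCESSED = auto()     # raw video -> full-body masked fragments (raw kept <= 48 h)
    REVIEWED = auto()        # analyst counts line crossings on masked fragments, once
    DELETED = auto()         # masked fragments permanently and automatically removed

# Stages only ever advance; a session never returns to an earlier stage,
# reflecting that data is not reused once it has been deleted.
QA_PIPELINE = list(QAStage)
```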
What data do we collect (for up to 48 hours):
- Raw video recordings: Used exclusively for immediate reprocessing into full-body masked video fragments. The raw video recordings are not accessible to any individual.
- Masked (full-body) video fragments: Reviewed by QA analysts for accuracy verification. These are shown once and deleted automatically after the QA analyst has manually counted line crossings.
- Statistical aggregates: Anonymous aggregated statistical data is calculated and compiled into a report. The data consists of the total number of visitors crossing a measurement line according to both the automatic and the manual count. These statistics are used to calculate the accuracy of the automatic counting (one illustrative formula follows this list) and to verify that the system works properly.
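One straightforward way to express that accuracy is the relative agreement between the two totals. The formula below is an illustrative assumption rather than Indivd's documented metric; `automatic_count` and `manual_count` stand for the two figures described above.

```python
def counting_accuracy(automatic_count: int, manual_count: int) -> float:
    """Accuracy (%) of the automated count against the manual QA reference.

    Illustrative metric: 100% minus the relative deviation between the
    automatic total and the analyst's manually counted total.
    """
    if manual_count <= 0:
        raise ValueError("manual reference count must be positive")
    deviation = abs(automatic_count - manual_count) / manual_count
    return round((1.0 - deviation) * 100.0, 2)

# Example: the system counted 970 line crossings, the analyst counted 1,000.
assert counting_accuracy(970, 1000) == 97.0
```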
Purpose: To verify that the newly installed people counting system works as intended and to detect installation problems that may require further system calibration.
Safeguards:
- Raw video is encrypted and stored only within EU-based infrastructure (GleSYS AB, Sweden)
- Only masked (full-body) fragments are used for manual QA review
- Neither customers nor analysts have access to raw recordings
- All data is deleted automatically after processing and review
Not collected or retained:
- Facial recognition or biometric identifiers
- Special category data (e.g., ethnicity, beliefs) or other demographic attributes (e.g., age)
- Real-time surveillance or live tracking
- Persistent identifiers or any link to individual visitors
These restrictions are technically enforced through system design and privacy-by-default principles. It is not technically possible to perform facial recognition or identify individuals through the QA process.
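One way to picture that technical enforcement: the analyst-facing interface can be built so that raw footage has no read path at all and each masked fragment is destroyed in the same step that serves it. The `FragmentStore` below is a hypothetical sketch of that privacy-by-default pattern, not Indivd's API.

```python
class FragmentStore:
    """Hypothetical storage facade: masked fragments only, no raw-video access."""

    def __init__(self, masked_fragments: dict[str, bytes]):
        # Raw recordings are deliberately absent from this interface;
        # they live behind a separate, machine-only reprocessing service.
        self._masked = masked_fragments

    def review_once(self, fragment_id: str) -> bytes:
        """Serve a masked fragment exactly once, deleting it as it is served.

        pop() removes the fragment before it reaches the analyst's viewer,
        so a second request for the same id always fails with KeyError.
        """
        return self._masked.pop(fragment_id)
```

Because deletion happens in the same operation as retrieval, "shown once, then deleted" becomes a property of the interface rather than a policy analysts must remember to follow.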
3. What the system is used for
The QA process validates that the people counting system works as intended. It helps confirm the statistical reliability of automated insights and supports fine-tuning of system settings (e.g., measurement lines). Target group estimates, when applicable, are derived mathematically from validated group statistics, not manually assessed.
Purpose: The processing purposes of the main processing activity can only be achieved if data quality is assured; the Quality Assurance service fulfils that purpose.
4. Your role and responsibilities
As a customer, you are the data controller; Indivd acts as the data processor. Your responsibilities include deciding whether and when to initiate quality assurance, setting parameters (scope, location, sample size), and ensuring appropriate signage and privacy information for visitors. Detailed processor obligations and safeguards are set out in Indivd’s Data Processing Agreement (DPA), which governs all third-party processor relationships.
Third-party processors (within the EU):
- GleSYS AB (Sweden): Infrastructure.
- DigitalOcean EU B.V. (Germany): Platform data storage.
5. Transparency and visitor information
Clear signs must be posted at all entrances where cameras are used for people counting and quality assurance. These can be integrated with existing security signage and supplemented with brochures, QR codes, or web links to a privacy notice. In accordance with Article 13 of the GDPR and the EDPB’s video surveillance guidance, layered transparency is essential to ensure that visitors are adequately informed.
Visitors are considered data subjects but are not directly consulted, and the relationship is limited to in-store presence. Given the short retention time of data (typically under 48 hours), data subject rights such as access, objection, or erasure are unlikely to be practically exercisable.
- Recommendation: Use Indivd’s privacy signage examples and ensure compliance with Article 13 GDPR and the EDPB’s guidance on video processing. These resources are included below.
6. Internal communication and union involvement
Although QA cannot identify individuals, internal communication ensures transparency. If applicable, union or employee representatives should be consulted before QA implementation. This helps build trust and prevent misunderstandings.
7. Ethical & legal residual risks and mitigations
Beyond technical compliance, ethical considerations are essential to responsible data processing. Even when data is masked (full-body), the perception and context of data use can influence public trust. This section outlines residual ethical and legal risks, and how they are mitigated through Indivd’s privacy-by-design architecture, technical safeguards, and transparent communication.
Residual ethical risks:
- Risk of re-identification: Mitigated through strict full-body masking, encryption, role-based access, and automated deletion after one-time review. Raw video is never viewed by individuals and is retained only for automatic system processing.
- Misunderstanding of QA purpose: Addressed with layered signage and clear privacy information.
- Standing access: Prevented via access logging, role separation, and deletion within 48 hours.
- Function creep: Prevented through strict time-limited use, no reuse of data, and separation between QA and production systems.
Controls in place:
- Data processed only in secure EU environments
- QA is strictly retrospective, never real-time
- All data is deleted automatically after use (an illustrative purge sketch follows this list)
- No personal data is exported or linked to any persistent identifier
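As an illustration of the automatic-deletion control, retention can be enforced by a scheduled purge that removes anything older than the 48-hour ceiling, whether or not a review completed. The table name and schema below are assumptions made for this sketch.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(hours=48)  # hard ceiling from the QA policy

def purge_expired(conn: sqlite3.Connection) -> int:
    """Delete every stored recording older than the retention window.

    Meant to run on a frequent schedule (e.g., every few minutes) so the
    48-hour bound holds even for sessions that were never reviewed.
    """
    cutoff = (datetime.now(timezone.utc) - RETENTION).isoformat()
    deleted = conn.execute(
        "DELETE FROM recordings WHERE recorded_at < ?", (cutoff,)
    ).rowcount
    conn.commit()
    return deleted
```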
Identified ethical and legal risks and mitigations:
- Privacy risk
Risk: Sensitive or intrusive data collection may violate individual privacy through unconsented observation.
Mitigation: Only masked data (full-body) is ever accessed by QA analysts. Raw video is encrypted and processed automatically, never viewed by humans. Clear information allows visitors to understand or avoid the processing.
- Security risk
Risk: Unauthorized access or data breaches could compromise the confidentiality of individuals' data.
Mitigation: Data is stored within secure EU-based infrastructure. Encryption, access controls, and logging are enforced. Data is automatically deleted within 48 hours. Only masked data (with full-body masks) is viewable by the QA analysts.
- Fairness risk
Risk: Surveillance may be perceived as intrusive or discriminatory, potentially impacting individual autonomy.
Mitigation: The QA process cannot identify individuals. No persistent identifiers are used, and data is masked (full-body) before review. No demographic traits (age/gender/ethnicity) are accessible to reviewers. The only effect of the system is to generate a total count of people crossing a line for comparison against the total count generated by the automated counting system.
- Transparency risk
Risk: Insufficient communication about AI processing could limit individual awareness and control.
Mitigation: Clear privacy signage and documentation aligned with GDPR are provided, explaining the purpose-limited QA, full-body masking, and 48-hour deletion.
- Governance risk – AI literacy compliance
Risk: Failure to communicate AI functionality clearly could affect legal compliance and public trust.
Mitigation: Indivd provides clear privacy signage and documentation aligned with GDPR and EDPB guidance. QA is described as time-limited, anonymous, and non-intrusive.
- Perceived biometric categorization risk
Risk: Estimation of traits through masked data may raise concerns about inferred profiling.
Mitigation: No facial features or biometric identifiers are used in the QA process. Individuals captured on camera are masked automatically by the QA systems before any viewing by QA analysts, providing further privacy protection.
8. AI Act – Assessment of prohibited use cases
Indivd’s Quality Assurance process has been evaluated against Article 5 of the EU Artificial Intelligence Act, which outlines practices that are explicitly prohibited due to their risk to fundamental rights. The QA process does not fall into any of the prohibited categories defined by the Act. Specifically:
- Biometric categorization: Not applicable. The QA process does not involve classification of individuals based on sensitive attributes such as race, ethnicity, religion, political opinion, or sexual orientation. No biometric characteristics are processed or accessed during QA.
- Surveillance of workers: Not applicable. The QA process is used solely for validating the people counting system's accuracy. It does not involve monitoring employee behavior or activity. The process is limited in duration, uses masked data (full-body), and is entirely retrospective.
- Manipulation of human behavior: Not applicable. QA is a passive verification step with no interaction, feedback, or influence on individuals' decisions or behaviors. It serves solely to confirm technical accuracy.
- Exploitation of vulnerable individuals: Not applicable. The system does not exploit or adjust its behavior based on individual characteristics such as age, disability, or socio-economic status. The quality assurance is statistical and technical, without processing at the individual level.
- Social scoring: Not applicable. No data used in QA could be linked to personal behaviors or reputational traits. QA generates masked (full-body), aggregate-level validation metrics, not scores or profiles.
- Biometric identification in public spaces: Not applicable. QA does not involve identifying individuals in any space, public or private. No biometric identifiers are collected, stored, or processed.
- Facial recognition database or scraping: Not applicable. The QA process does not build or access any facial recognition database. Individuals in video data are masked (full-body) before any human review, and raw video is not available for facial analysis.
- Emotion recognition: Not applicable. No emotional states are inferred, analyzed, or interpreted at any stage of the QA process.
Indivd’s Quality Assurance method is designed according to privacy-by-design and data minimization principles. The system does not meet the technical, operational, or legal definitions of any prohibited AI use case under Article 5 of the AI Act. It operates strictly within the scope of statistical system validation, without personal impact or human rights implications.
9. Recommended actions
To complete your DPIA:
- Review your legitimate interest assessment
- Confirm internal communication and external signage
- Use Indivd documentation to support record-keeping and compliance
- Schedule QA only when needed, and document each session
- Maintain summaries of QA results for auditing and insight confidence (a minimal session-record sketch follows)
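For the documentation and auditing actions above, a minimal per-session log entry might capture the parameters you set and the resulting accuracy figure. The field names below are suggestions for your own records, not a format prescribed by Indivd; note that the record holds only aggregate counts, never image data.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class QASessionRecord:
    """One entry per QA session, supporting auditing and insight confidence."""
    location: str            # store or entrance covered by the session
    performed_at: datetime   # when the recording window ran
    sample_size: int         # number of masked fragments reviewed
    automatic_count: int     # line crossings per the automated system
    manual_count: int        # line crossings per the QA analyst
    accuracy_pct: float      # agreement between the two totals
```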