Introduction
This guide is designed to help organizations understand and evaluate the data protection aspects of using Indivd’s People Counting service.
All people counting systems (2D cameras, 3D cameras/sensors) rely on image data and therefore constitute large-scale processing of personal data. A DPIA is consequently always required before deploying such a system, in accordance with Article 35 of the GDPR.
This guide walks through the relevant steps of a DPIA, supported by documentation from previously completed assessments (DPIAs) and a favorable prior consultation with the Swedish Authority for Privacy Protection (IMY). It is intended to support your understanding of the processing and help you meet your obligations as a data controller.
The guide is not legal advice but a practical tool to support your DPIA process. For questions or more comprehensive documentation, please reach out to privacy@indivd.com.
1. Why and when to conduct a DPIA
A DPIA is required for processing operations that involve large-scale monitoring of publicly accessible areas or the use of innovative technologies, especially when these involve image data. Indivd’s service processes video from cameras and immediately anonymizes the data using a patented method.
Despite these safeguards, the scale and nature of the monitoring mean that a DPIA is mandatory for all uses of people counting technology.
- Key Insight: The Swedish Authority for Privacy Protection confirmed during prior consultation that, under the specific implementation and safeguards presented, processing could rely on legitimate interest (Article 6(1)(f) GDPR). However, each data controller remains responsible for conducting their own legitimate interest assessment and ensuring full implementation of required technical and organizational measures. This guide does not substitute for that responsibility.
2. Understanding the processing activity
Indivd’s People Counting service captures video of, for example, store entrances, applies a patented anonymization technique, and deletes the original image data within milliseconds.
The anonymization process transforms image data into group-level anonymized statistics. While this significantly reduces privacy risks, the classification of data as 'anonymized' under the GDPR depends on context. According to Recital 26 of the GDPR, data is only truly anonymized if re-identification is not reasonably likely, including through the use of external data. The term 'anonymous' is therefore used here to describe data that is technically and procedurally protected against identification, while recognizing that its status must always be assessed in context under applicable law.
Processing workflow:
- Video is recorded from in-store cameras.
- Data is encrypted and transferred to a local processing instance.
- Anonymization occurs in real time (Group ID).
- Image data is deleted within 1–3 milliseconds.
- Anonymized statistics are securely transferred and used for insights.
No personal or special category data is ever stored or retained at any step. The sketch below illustrates this anonymize-then-delete flow.
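To make the workflow concrete, here is a minimal, hypothetical sketch of an anonymize-then-delete pipeline in Python. It is not Indivd’s actual implementation: the function names (process_frame, anonymize_to_group_stats), the GroupStats structure, and the zeroed counts are all illustrative assumptions, and the patented anonymization step itself is represented only by a placeholder.

```python
# Hypothetical sketch of an anonymize-then-delete pipeline. This is NOT
# Indivd's actual code; all names and logic are illustrative assumptions.
from dataclasses import dataclass
import time


@dataclass
class GroupStats:
    """Anonymized, group-level output: counts only, no identifiers."""
    timestamp: float
    entered: int
    exited: int


def anonymize_to_group_stats(frame: bytes) -> GroupStats:
    """Placeholder for the anonymization step (the real method is patented)."""
    return GroupStats(timestamp=time.time(), entered=0, exited=0)


def process_frame(frame: bytes) -> GroupStats:
    """Anonymize in memory, then drop the raw frame immediately.

    The raw frame exists only in RAM inside this function; nothing here
    writes it to disk, so only anonymized statistics leave the scope.
    """
    try:
        stats = anonymize_to_group_stats(frame)
    finally:
        del frame  # drop the local reference so the image can be reclaimed
    return stats
```

The design point the sketch illustrates is structural: raw frames are consumed and discarded inside a single in-memory scope, and only aggregate statistics are ever returned.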
What data we collect:
All data is instantly anonymized, and no personal data is stored. Anonymized data is retained for up to 3 hours, solely for statistical aggregation; a sketch of such a rolling retention window follows this list. When data is collected is controlled by the customer in the Indivd platform, per individual location.
- Counting data: Number of people entering/exiting/passing zones.
- Movement trend data: Patterns of anonymized visitor flows per hour.
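As an illustration of the 3-hour retention rule, the following is a minimal sketch of a rolling aggregation buffer. Again, this is an assumption-laden illustration rather than Indivd’s implementation: RollingAggregator, the tuple layout, and the hourly bucketing are all invented for this example.

```python
# Hypothetical sketch of a 3-hour rolling retention window for anonymized
# statistics. Not Indivd's implementation; all names are illustrative.
import time
from collections import deque

RETENTION_SECONDS = 3 * 60 * 60  # anonymized records kept for at most 3 hours


class RollingAggregator:
    """Holds anonymized counts only as long as aggregation requires."""

    def __init__(self) -> None:
        self._records: deque = deque()  # (timestamp, entered, exited)

    def add(self, entered: int, exited: int) -> None:
        self._records.append((time.time(), entered, exited))
        self._evict_expired()

    def _evict_expired(self) -> None:
        cutoff = time.time() - RETENTION_SECONDS
        while self._records and self._records[0][0] < cutoff:
            self._records.popleft()  # anything older than 3 hours is dropped

    def hourly_totals(self) -> dict:
        """Aggregate the current window into per-hour entry/exit totals."""
        self._evict_expired()
        totals: dict = {}
        for ts, entered, exited in self._records:
            hour = time.strftime("%Y-%m-%d %H:00", time.localtime(ts))
            prev_in, prev_out = totals.get(hour, (0, 0))
            totals[hour] = (prev_in + entered, prev_out + exited)
        return totals
```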
What we do not collect:
❌ No personal data is ever stored or retained: Raw image data is deleted before it can be saved to disk, ensuring it never touches an HDD or permanent storage.
❌ No special category data is processed: Our system is not capable of identifying health status, ethnicity, beliefs, or biometric characteristics.
❌ No biometric data is processed: Facial features are not analyzed or retained. The camera resolution is too low (60–100 pixels per meter), and the anonymization method prevents any facial recognition.
❌ No facial recognition or identification is possible: Group ID assignment is random and non-deterministic. It is mathematically and technically impossible to track individuals.
❌ No tracking of specific individuals or groups: The system has no identifiers or attributes to distinguish individuals, meaning groups such as children or employees cannot be isolated or followed.
These limitations are not just policy decisions; they are enforced by the system design, technical constraints, and anonymization methods. As such, it is technically impossible for the system to collect or store such data.
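The non-deterministic nature of Group ID assignment can be illustrated with a short sketch. To be clear, Indivd’s patented method is not public, and assign_group_id below is a hypothetical stand-in; the sketch only shows why an identifier drawn at random, rather than computed from a person’s features, cannot be used to follow anyone between observations.

```python
# Illustrative sketch only: a random, non-persistent group identifier.
# Not Indivd's patented method; this merely demonstrates the principle.
import uuid


def assign_group_id() -> str:
    """Return a fresh random ID unrelated to the person observed.

    Because the ID is drawn independently each time rather than derived
    from any facial or bodily feature, two observations of the same person
    yield unrelated IDs, so no trajectory can be reconstructed.
    """
    return str(uuid.uuid4())
```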
3. What the system is used for
The anonymized data is used solely by the customer to understand visitor patterns, improve store layouts, and compare performance between locations. These are the same purposes presented and approved during the prior consultation with the Swedish Authority for Privacy Protection. Any change or expansion of these purposes requires prior approval and must be reflected in an updated DPIA.
Purposes:
- Understand how different objects in the store environment (shelves, aisles, furnishings, entrance, signage, lighting, etc.) attract interest from visitors (attention, interaction, conversion), in order to make changes that render the store environment more efficient and relevant to all visitors.
- Understand where and when queues form (e.g. checkouts, dressing rooms, toilets, customer service counters) in order to make changes that streamline store operations and minimize queue formation.
- Understand how visitor flows develop in order to make changes that streamline customer flows in the store.
- Streamline internal operations by understanding visitor flows and then adapting staffing at different locations/departments in the store.
- Understand how visitors navigate the store’s planned visitor journey (i.e. which paths visitors walk in the store) in order to implement changes that increase the efficiency and relevance of the visitor journey.
- Enable clearer comparisons and thereby understand differences between stores, e.g. the impact of our store environments, queuing, visitor flows, the attraction of objects, visitors’ navigation of the stores’ planned visitor journeys, etc., in order to learn from the differences and make changes that streamline our organization’s overall operations. We could, for example, make two separate investments in two stores with similar conditions: in one store invest in education for the staff, and in the other invest in a new store interior. If the results of such a study showed that it is more favorable to provide our staff with more education, we could restructure our investments and increase the efficiency of our organization’s overall operations.
- Enable clearer comparisons and thereby understand differences between stores, e.g. the impact of our store environments, queuing, visitor flows, the attraction of objects, visitors’ navigation of the stores’ planned visitor journeys, etc., in order to streamline or adapt planned establishments, including identifying new cities or areas for establishment.
- Enable clearer comparisons and thereby understand differences between our stores and other stores over time, for example regarding the impact of store environments, the attraction of objects, queuing, visitor flows, and visitors’ navigation of the stores’ planned visitor journeys, in order to generate and try new ideas and offers, e.g. by introducing a yoga studio or running club, to increase innovation and strengthen our competitiveness. For example, the insight that a large proportion of visitors are attracted to (pay attention to, interact with) sports products could lead us to trial a running club in the store. A recurring in-store activity could bring visitors together to run and discuss running, strengthening the store’s brand with that target group. An increased understanding of such differences, the attractiveness of objects, visitor flows, etc. could also improve our understanding of an investment’s profitability if (i) more visitors visit us, (ii) new target groups visit us, (iii) other products become more attractive, or (iv) visit frequency and attractiveness increase relative to benchmark data, which in itself strengthens competitiveness.
4. Your role and responsibilities
As a customer, you are the data controller. Indivd acts as the data processor. You are responsible for ensuring appropriate signage and transparency measures (GDPR Article 13). Detailed processor obligations and safeguards are set out in Indivd’s Data Processing Agreement (DPA), which governs all third-party processor relationships.
Third-party processors (within the EU):
- GleSYS AB (Sweden): Infrastructure.
- DigitalOcean EU B.V. (Germany): Platform data storage.
5. Transparency and visitor information
Clear signs must be posted at all entrances. These can be integrated with existing security signage and supplemented with brochures, QR codes, or web links. In accordance with Article 13 GDPR and the EDPB’s video surveillance guidance, layered transparency is essential.
Even with strong anonymization, perceived surveillance can raise ethical concerns and affect public trust. Therefore, public communication should clearly explain what data is used for, how it is anonymized, and how individuals’ rights are respected. This includes easily accessible privacy notices, understandable language for non-technical audiences, and providing contact details for data protection inquiries.
- Recommendation: Use Indivd’s signage examples and ensure compliance with GDPR Article 13 and EDPB video guidance. These are listed below in this article.
6. Internal communication and union involvement
For systems involving workplace environments, it’s essential to address internal transparency and ensure that employees understand the system’s purpose and limitations. Because the system cannot distinguish between individuals, it is not technically or operationally capable of monitoring employee behavior.
- Union representation: If your organization has union representation or safety delegates, involve them early in the DPIA process and before rollout. Doing so strengthens trust, helps prevent misunderstandings, and demonstrates accountability.
Identified risks and mitigation measures:
- Human (Risk: Low, R1–R2): Information will be posted at each entrance/exit and all doors leading into the store from back of house (BOH), clearly explaining the system and its purpose. Staff will be briefed on the background, purpose, and data use before rollout.
- Technology/Physical work environment (Risk: Low, R1–R2): The strict anonymization method and low-resolution video ensure no data can be linked to individuals or recreated.
- Organisation (Risk: Low, R1–R2): Store teams will not access the system or data, eliminating the need for operational training. Separate information briefings are provided instead.
- Organisation (Risk: Low, R1–R2): Signage and briefings ensure internal transparency and prevent misinterpretation of the system’s role and impact.
7. Ethical & legal residual risks and mitigations
Beyond technical compliance, ethical considerations are essential to responsible data processing. Even with anonymization, the perception and context of data use can influence trust and public acceptance. This section outlines residual ethical and legal risks and how they are addressed through Indivd’s technical, organizational, and communicative measures.
Residual ethical risks:
- Re-identification: Not possible due to Group ID and data noise.
- Transparency misunderstanding: Addressed with layered visitor and employee information.
Controls in place:
- Patented anonymization method.
- Instant data deletion in transit; personal data is never stored (never touches an HDD).
- EU-hosted infrastructure.
- Union consultation and employee communication for workplace deployments.
- A strict anonymization policy that requires a documented risk analysis for any change, update, or enhancement to the anonymization method, guarding against function creep. This ensures that no modification can reduce the system’s anonymization capability. Changes that might affect anonymity must be approved by both the Product Owner and the Head of AI, while any change that would reduce anonymity is categorically prohibited.
Identified ethical and legal risks and mitigations:
- Beneficial use risk
Risk: Sensitive or intrusive data collection may harm organizational reputation.
Mitigation: Indivd ensures that all personal data is anonymized in real time (within 1–3 milliseconds). The patented anonymization method prevents identification and only retains anonymous group-level data, reducing privacy impact and enhancing compliance with ethical standards.
- Security risk
Risk: Unauthorized access or data breaches could compromise confidentiality and trust.
Mitigation: A comprehensive security framework is in place, including encrypted data storage, secure transmission protocols, role-based access control, and regular internal audits. Incident response procedures are clearly defined in the Data Breach Response Policy.
- Fairness risk
Risk: Employee or individual surveillance may be perceived as intrusive or discriminatory.
Mitigation: Indivd’s platform is specifically designed to prevent tracking or profiling of individuals, including employees. All data is anonymized and aggregated, and no personal identifiers are retained, thereby avoiding unfair or biased treatment.
- Governance risk – Children’s data
Risk: Accidental capture of minors’ data may result in GDPR breaches and reputational damage.
Mitigation: The system does not process or identify age-related or biometric data. Anonymization ensures that individuals, including children, cannot be singled out or tracked. Nonetheless, care must be taken to avoid unintended inferences about vulnerable groups (e.g., children, employees) from aggregated patterns. As part of your DPIA, you should assess whether your specific deployment context, such as camera placement or store type, might indirectly affect such groups. This ensures both ethical and legal risks are appropriately accounted for.
- Governance risk – AI literacy compliance
Risk: Failure to communicate AI functionality clearly could affect legal compliance and public trust.
Mitigation: Indivd adheres to AI ethics principles and GDPR transparency requirements by providing layered information (e.g., signage, brochures, digital notices) and public guidance on how AI systems operate and individuals’ rights under the GDPR.
- Perceived biometric categorization risk
Risk: Even though data is anonymized, the estimation of traits such as gender or age range may raise concerns about inferred biometric profiling.
Mitigation: Indivd’s anonymization process uses group-based statistical methods that do not allow for unique identification. No facial features are stored or processed. The technology has been reviewed by the Swedish Authority for Privacy Protection, which accepted that the system does not involve biometric identification as defined by the GDPR.
8. AI Act – Assessment of prohibited use cases
Indivd’s AI-powered system has been thoroughly assessed against Article 5 of the EU Artificial Intelligence Act (AI Act), which outlines explicitly prohibited AI practices. Below is a clear summary explaining why Indivd’s technology is compliant and does not fall within any prohibited categories.
- Biometric categorization: Indivd does not categorize or classify individuals based on sensitive or protected characteristics such as race, ethnicity, sexual orientation, or political affiliation. When customers use the optional Target Group Insights, data is only aggregated based on non-sensitive categories like age and gender, strictly at a group level, never identifying or profiling individuals.
- Surveillance of workers: Indivd’s technology exclusively provides insights on customer behavior and journeys during store operational hours. It is not designed or capable of monitoring employees or collecting data outside store hours. Thus, it does not infringe upon worker privacy or constitute worker surveillance.
- Manipulation of human behavior: Indivd provides statistical insights without influencing or manipulating individual decisions. The system generates data on customer movement and interaction without employing techniques designed to control or distort behavior.
- Exploitation of vulnerable individuals: Indivd does not exploit vulnerabilities related to age, socio-economic status, or other sensitive attributes. The system’s outputs are purely analytical, supporting ethical decision-making guidelines provided to customers.
- Social scoring: Indivd’s technology does not evaluate or classify individuals or groups based on social behavior or personal characteristics. It exclusively generates group-level behavioral insights intended to enhance customer experiences in retail settings. There is no mechanism within the system to create or contribute to social scoring practices.
- Biometric identification in public spaces: Indivd’s technology does not perform biometric identification. All image data captured is immediately anonymized through a patented, irreversible method, rendering the information unusable for individual identification. The process ensures compliance with GDPR and the AI Act by not storing or utilizing biometric identifiers.
- Facial recognition database or scraping: Indivd does not create or expand facial recognition databases, nor does it use any scraping techniques from online sources or CCTV footage. Data collected is anonymized instantly, aggregated for statistical use, and does not support facial recognition.
- Emotion recognition: The system does not analyze, infer, or use emotional state data. Metrics collected, such as movement patterns and dwell time, do not contain emotional indicators or parameters.
Indivd’s AI system does not meet the criteria of any prohibited AI practices defined by Article 5 of the EU AI Act. The technology has been intentionally designed around privacy-by-design principles, robust anonymization, data minimization, and ethical use, ensuring lawful, compliant, and ethical data processing.
9. Recommended actions
To complete your DPIA:
- Review the processing workflow and data categories.
- Confirm that the purposes align with GDPR principles.
- Use the templates provided to conduct a residual risk assessment.
- Document your findings and decision rationale.
- Prepare internal communications and signage.
- Keep the DPIA under regular review.