Whistleblower reveals Netherlands’ use of secret and potentially illegal algorithm to score visa applicants
In 2022, a report commissioned by the Dutch Ministry of Foreign Affairs concluded that the agency’s internal culture was riddled with structural racism. Some employees recounted being described as “monkeys” while others were continually asked to disavow terrorist attacks. In response, the foreign minister Wopke Hoekstra promised reform. “This is not who we want to be,” he said.
The Ministry of Foreign Affairs is not the only Dutch institution to come under the spotlight for structural racism. The Netherlands prides itself on using automated decision making to reduce bias, but over the last two years reporting from Lighthouse Reports has revealed the wide-scale use of algorithmic risk profiling systems across the country.
New, leaked documents obtained by Lighthouse Reports and NRC reveal that at the same time Hoekstra was promising change, officials were sounding the alarm over a secretive algorithm that ethnically profiles visa applicants. They show the agency’s own data protection officer — the person tasked with ensuring its use of data is legal — warning of potential ethnic discrimination. Despite these warnings, the ministry has continued to use the system.
Unknown to the public, the Ministry of Foreign Affairs has since 2015 been using a profiling system to calculate risk scores for short-stay visa applicants seeking to enter the Netherlands and the Schengen area.
An investigation by Lighthouse and NRC reveals that the ministry’s algorithm, referred to internally as Informatie Ondersteund Beslissen (IOB), has profiled millions of visa applicants using variables like nationality, gender and age. Applicants scored as ‘high risk’ are automatically moved to an “intensive track” that can involve extensive investigation and delay.
“Family members of Dutch citizens with a migration background are prevented in all kinds of ways by the Ministry of Foreign Affairs from getting a visa for short stays,” said Kati Piri, a Dutch MP. Whatever efficiency gains the ministry claims for the IOB system, “from countries like Morocco and Suriname, it is incredibly difficult to get a visa,” Piri added.
In February 2023, Lighthouse obtained crucial, previously unpublished documents pointing to heated internal discussion over the Ministry of Foreign Affairs’ use of nationality and gender in its algorithmic risk profiling.
According to the documents, the ministry’s internal Data Protection Officer advised it to “immediately stop profiling visa applicants to distinguish them partly on the basis of nationality and then treating them unequally on the basis of that distinction.”
Examples of so-called “risk profiles” used by the algorithm include Surinamese men aged 26 to 40 who applied from Paramaribo, and unmarried Nepalese men aged roughly 35 to 40 who applied for a tourist visa. Officials say the risk profiles also draw on data from third parties to check whether groups of applicants of the same nationality had previously attempted to apply for asylum.
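The ministry has not disclosed how IOB is implemented, so the following is a purely hypothetical sketch, written only to illustrate the kind of rule-based profile matching the leaked documents describe. The class and function names, fields, and routing logic are all assumptions; the two example profiles paraphrase the ones reported above.

```python
from dataclasses import dataclass

# Hypothetical illustration only — IOB's internals are not public.
# This sketches how rule-based "risk profiles" of the reported kind
# could route an applicant into an intensive or normal track.

@dataclass
class Applicant:
    nationality: str
    gender: str
    age: int
    application_post: str  # where the application was filed
    visa_type: str
    marital_status: str

# Profiles paraphrased from the leaked documents (illustrative predicates).
RISK_PROFILES = [
    lambda a: (a.nationality == "Surinamese" and a.gender == "male"
               and 26 <= a.age <= 40 and a.application_post == "Paramaribo"),
    lambda a: (a.nationality == "Nepalese" and a.gender == "male"
               and a.marital_status == "unmarried" and 35 <= a.age <= 40
               and a.visa_type == "tourist"),
]

def track(applicant: Applicant) -> str:
    """Any matching profile sends the applicant to the 'intensive' track."""
    if any(profile(applicant) for profile in RISK_PROFILES):
        return "intensive"
    return "normal"
```

Under this sketch, an applicant matching any single profile is flagged regardless of their individual circumstances, which is precisely the kind of group-level distinction the Data Protection Officer warned about.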
Despite the Data Protection Officer’s serious and repeated warnings since at least 2021, the documents suggest that Hoekstra delayed making a decision about the system. With reporting partner NRC, we spoke with internal sources who confirmed it is still active and uses nationality as a variable.
The ministry’s internal watchdog, the Data Protection Officer (DPO), has pressed officials since mid-2022 to immediately halt the algorithmic profiling of visa applicants based on their nationality. Despite pushback from officials, the DPO has continued to insist that the algorithmic assessment system is potentially discriminatory.
With the national newspaper NRC, we chronicled how the ministry turned to algorithmic profiling in the midst of a larger drive to cut costs. Ministry officials maintain that the algorithm helps to centralise workloads and remove the bias of individual caseworkers. They also claim that three consulted experts concluded that the use of nationality in the algorithmic profiling system was proportional and not discriminatory.
Statistics from the ministry suggest that being flagged as a high risk can carry serious consequences. As of March 2023, 33 percent of applications in the intensive, ‘high risk’ track were rejected, whereas only 3.5 percent of applications in the normal, ‘low risk’ track were rejected.
Being flagged for the intensive track can mean months of delays in an already difficult bureaucratic process. With NRC we spoke to Saadia Ourhris, a Moroccan-Dutch mother who recounted a relative of hers receiving constant automated rejections when attempting to visit her in the Netherlands.
Reacting to our findings, Dutch MP Kati Piri described the use of algorithmic visa profiling as “downright shocking.”