Automating Distrust

Digital profiling of Dutch low-income, minority neighbourhoods is spiralling, a series of local investigations reveals

When Lighthouse and partners revealed in 2022 that dozens of so-called “problem” neighbourhoods had been singled out and profiled by secretive algorithms to detect welfare fraud, it was clear that further revelations would follow. Cities across the Netherlands were continuing to use low-income neighbourhoods as testing grounds for high-risk tech.

We teamed up with three local media partners, De Limburger, Omroep Brabant and Bureau Spotlight, along with national outlet Follow The Money, to investigate the deployment of this technology in their own communities. Over the course of a year, we unravelled a web of data exchange and dragnet profiling spanning dozens of municipalities and government ministries.

The Netherlands has a long history of using algorithms and data exchange to profile and pursue vulnerable communities. Describing these systems in 2019, Philip Alston, United Nations Special Rapporteur on extreme poverty and human rights, compared them to “the digital equivalent of fraud inspectors knocking on every door in a certain area and looking at every person’s records in an attempt to identify cases of fraud, while no such scrutiny is applied to those living in better off areas.”

Months of local reporting showed how, in low-income neighbourhoods, nearly a dozen forms of data-driven profiling were stacked atop each other. Residents were surveilled at every stage of life, from entering school to starting a business or applying for government benefits. Internal documents reveal how local and national agencies trampled privacy regulations and overrode their own data protection officers to profile vulnerable residents with little or no legal basis.

METHODS

Local governments in the Netherlands are increasingly deploying algorithms with little oversight. Local journalism, with its deep connections to the communities most directly affected, has a vital role to play in holding these systems accountable. In 2022, with support from the Dutch Journalism Fund (SVDJ), we partnered with De Limburger, Bureau Spotlight and Omroep Brabant to produce nearly 20 investigations across 12 months.

Freedom-of-information requests were sent to 21 municipalities seeking documents on the various types of profiling they deployed, as well as internal correspondence about their legal basis. Meanwhile, community reporting revealed the harrowing consequences for families who find themselves constantly pursued in nearly every aspect of their lives.

Over the course of the investigation it became increasingly clear that these tools were being deployed with little to no legal basis, and that insiders had been sounding the alarm for over a decade. “We will find fraud here because we have looked; we will find ourselves further confirmed in the assumption that more fraud takes place in ‘poor deprived areas,’” one data protection officer wrote.

After months of reporting that began with a collaborative investigation between Lighthouse Reports and Argos, the Dutch government announced in November it would halt all ‘neighbourhood-oriented’ projects to detect benefits fraud.

STORYLINES

In Limburg, a province in the southeast of the Netherlands, De Limburger investigated how the region had become a hotspot for algorithmic profiling. In the city of Kerkrade, officials targeted neighbourhoods with ‘high welfare density’ and used risk indicators like ‘single mother on welfare of which the father of the child is unknown’ to flag people for invasive house checks. Meanwhile, a project in the city of Venlo that targeted migrant workers and shared their data with government agencies was criticised by its data protection officer. Internal documents we obtained show how the city knowingly pressed ahead, stating that “ignoring privacy laws” was a “political decision.”

In Brabant, a province in the south of the Netherlands, local broadcaster Omroep Brabant discovered how municipal and national agencies and the police were collaborating to profile residents for petty crimes. One project that cost hundreds of thousands of euros attempted to predict which businesses were involved in subversive crime. Having too few Google reviews or being located on a remote street could result in a business being flagged. One business owner recounted how 15 police officers and city officials descended on his office. Yet reports requested by Omroep Brabant show that the only wrongdoing found in any of the province’s checks was a fire extinguisher hanging in the wrong place and a blocked emergency exit.

In Gelderland, the breadbasket of the Netherlands, regional investigative newsroom Bureau Spotlight went deep on the story of Fatima, a young woman with multiple sclerosis. Documents and internal emails obtained by Spotlight show how Fatima and her entire family came into the crosshairs of city and government officials through a project that profiled residents based on their family relations, water consumption and age, among other things. They were subjected to invasive house visits, and neighbours recalled seeing investigators hiding behind bushes near their home.

Internal emails suggest racial discrimination. When investigators received her case, one wrote in an email that they saw “last names that make me wonder.” After the investigation, city authorities accused Fatima of lying and claimed that she actually lived with her parents. Officials blocked her welfare support, even though Fatima was entitled to benefits regardless of whose house she lived in.

With national partner Follow The Money, we tied together the strands of our regional reporting in a story on the city of Breda, where the city’s poorest neighbourhood, Hoge Vucht, has been the subject of at least six different profiling projects. The piece shows how Breda is at the forefront of a growing trend wherein the government increasingly deploys algorithms and data-driven profiling to interfere in the private lives of residents. The result, according to residents and experts, is a state of constant distrust on both sides.