
The Great Replacement Chronicles: ‘Correctly Identified’

AP Photo/Alastair Grant

Archiving the “strange death of Europe,” as Douglas Murray put it, and the West more broadly, at the hands of the neoliberal technocracy.

TikTok migrant milks content out of scaring British women on the Underground

The London Underground was once the pride of Britain — until the migrants showed up and started farming TikTok content by acting like feral animals on the train.

Like this guy, whose clever shtick is sitting next to British women and scaring the bejesus out of them by suddenly bursting into extremely loud recitations of urban music.

Do you get the joke?  

You see how she reflexively jumps in fear at the sound of his loud and aggressive outburst and then cowers next to him, too afraid to move or ask him to stop blasting her eardrums with his urban “singing” (loosely defined)?

So hilarious!

If you don’t appreciate the humor, that’s probably because you’re a deplorable racist. Some quality time in Starmer’s gulag to reconsider your bigotry might do you some good.

Related: British PM: We Censor Anti-Migrant Protests ‘For the Children’

The fun on the London Underground never stops; here’s another British gentleman of the urban persuasion harassing, fondling, and threatening to kill a gaggle of young British women at a train station.

The high-quality audio suggests that he's mic'd up and, therefore, probably also producing the scene for social media content.

UK removes surveillance cameras because they ‘correctly identify’ too many black criminals

The key modifier in this article is "correctly": there is no claim that the cameras were flagging innocent black people. The supposed problem is that they "correctly identify" too many black criminals and are therefore somehow racist.

Via GBNews (emphasis added):

A police force has stopped the use of live facial recognition (LFR) cameras after it identified more black people than other ethnicities.

Essex Police's facial recognition cameras, mounted on vans and used to identify people on watchlists, will be paused over "bias" concerns.

The force said it paused the use after "potential bias in the positive identification rate" - but now believes the issue has been corrected with an update to its algorithm.

University of Cambridge researchers tested LFR during one of Essex Police's deployments, with the help of nearly 200 volunteers…

The study found it was significantly more likely to correctly identify black people.

Black people were 27 per cent more likely to be identified than all other ethnicities, and 31 per cent more likely than white people.

The technology was also 14 per cent more likely to spot men than women.

The research found it correctly identified around half of those on the watchlist, and that it was extremely rare for someone to be flagged who was not on the list.

Related: Migrants Committed Almost 70% of Violent Crimes in France Last Year

The language is somewhat difficult to parse (perhaps intentionally so, although GB News is less politically correct than most British mainstream outlets). It's unclear whether the cameras were simply failing to identify white people on the watchlist at the same rate or whether, as appears to be the case, they were just correctly identifying criminals on a watchlist that is presumably disproportionately populated by black people.

Again, the key phrase here, which points to the latter being the case, is “correctly identified.”
