This article, authored by John Loftus, is republished under the Creative Commons “CC BY-NC-ND” license with permission from The Daily Caller News Foundation.
I’m not sure this is what Steve Jobs envisioned when he set out to change the world by starting Apple. But it might be what George Orwell foresaw earlier in the 20th century.
The tech behemoth announced Jan. 29 that it had purchased Q.ai, a little-known Israeli tech startup, for nearly $2 billion. That’s quite an eye-popping valuation. But Q.ai’s software, at least from a layman’s perspective, seems very powerful. It might even be dystopian.
One Q.ai patent filing says the technology is capable of “using facial movements to generate a conversational record” and “determining an emotional state of an individual based on facial skin micromovements.” A separate patent filing claims the software “synthesises speech in response to words articulated silently by the test subject.” And yet another describes a “sensing device configured to fit an ear of a user, with an optical sensing head which senses light reflected from the face and outputs a signal in response. Processing circuitry processes the signal to generate a speech output.”
So, if our brains send signals to our muscles and our mouths move before we even talk out loud, Q.ai’s sensors can detect these extremely subtle “micromovements.” At least that’s what the company is claiming. On its stripped-down website, Q.ai says, “In a world full of noise we craft a new kind of quiet. Magic. Realized.”
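To make that claimed signal chain a bit more concrete, here is a purely illustrative sketch of the kind of pipeline the patent language suggests: an optical reading from an ear-worn sensor, a check for abrupt shifts that would stand in for “facial skin micromovements,” and some output derived from them. Every name, number, and threshold below is a hypothetical assumption for illustration; Q.ai has published no code or specifications, and the real system would require trained models far beyond a simple threshold.

```python
# Illustrative sketch only: optical sensor readings -> "micromovement" events -> output.
# All names and values are hypothetical; nothing here comes from Q.ai or Apple.

from dataclasses import dataclass
from typing import List


@dataclass
class SensorFrame:
    timestamp_ms: int     # when the reflected-light reading was taken
    reflectance: float    # hypothetical normalized optical reading, 0.0-1.0


def detect_micromovements(frames: List[SensorFrame], threshold: float = 0.02) -> List[int]:
    """Flag timestamps where the reflected-light signal shifts abruptly,
    a stand-in for the 'facial skin micromovements' the patent filing mentions."""
    events = []
    for prev, curr in zip(frames, frames[1:]):
        if abs(curr.reflectance - prev.reflectance) > threshold:
            events.append(curr.timestamp_ms)
    return events


def movements_to_text(events: List[int]) -> str:
    """Placeholder for the hard part: mapping movement events to words.
    A real system would need a trained model; here we only report counts."""
    if not events:
        return "(no silent speech detected)"
    return f"(detected {len(events)} micromovement events)"


if __name__ == "__main__":
    # Fake readings standing in for light reflected from the face.
    frames = [SensorFrame(t * 10, 0.50 + (0.03 if t % 7 == 0 else 0.0)) for t in range(50)]
    print(movements_to_text(detect_micromovements(frames)))
```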
So you can see where Apple may want to take this, and why it paid a handsome sum for the company. Imagine using an iPhone without having to type or dictate. Imagine wearing a pair of glasses that detects whether your newborn infant is on the verge of crying. Imagine a doctor or nurse wearing these glasses, monitoring a patient susceptible to strokes. The idea is almost like telekinesis: being able to control our physical world without speaking or physically interacting with it.
Which is frankly mind-boggling.
At the same time, though, you can also see how this could get incredibly dystopian and lead to all kinds of privacy concerns. With concepts like “silent speech” and “pre-speech” being thrown into the mix, how far are we from the concept of “pre-crime”?
Imagine a police officer wearing sunglasses with the technology to scan people’s faces and determine their moods. A cop could scan a crowd of protesters and single out the person who appears most agitated, whose expressions register the most anger and aggression, someone who might turn violent. The officer could then target that person, harassing them more than other protesters or firing a canister of tear gas in their direction, even if they are not committing a crime. This would be facial-recognition technology, already used by the federal government, on steroids.
Or, imagine someone wears the glasses as they walk through a seedier neighborhood of a city, and it allows them to look at strangers to determine which ones appear more aggressive, which ones to avoid, and when to cross the street to avoid potential danger. All of this sensory data picked up from other humans is then collected in a database. Would this not be yet another colossal invasion of everyone’s privacy? In this scenario, the people being observed never consent to a terms and conditions agreement, the way you do when you create an Apple account and set up a new iPhone. They wouldn’t have a choice.
Besides the civilian applications, militaries would certainly put this technology to use. Just like the above scenario, imagine a soldier patrolling a crowded urban market. Being able to scan all the faces in the crowd might help him discover the enemy combatant who’s hiding a gun or a bomb under his jacket. It could give him that split-second advantage, the difference between life and death. But what if the technology flags the wrong person? What if that person is completely innocent?
Whether or not the claims in Q.ai’s patents are to be believed, and whether or not the technology ever sees the light of day for consumers, law enforcement, or militaries, there is a dystopian element to it that hints at a bleak future.