Voice?
Recently the British tabloid The Sun ran a report on voice clones created by artificial intelligence (AI).
Say you have audio uploaded to social media sites such as Facebook, Instagram, or LinkedIn. Or YouTube. (Oh, dear, my sermons!) Well, it’s no difficult matter for AI tools to rip off your voice and clone it.
(The one doing this devious deed need have hardly any technical prowess. It is simply and automatically accomplished by easily available advanced tools.)
What can they do with a clone of your voice?
Dr. Justin Lo, Lecturer in Security and Protection Science at Lancaster University, U.K.:
“AI-generated voices are already a threat to personal identity in at least two contexts. First, the voice can be used to breach speaker verification software in order to gain access to systems such as bank accounts. Second, the AI-generated voice can be used to deceive other human listeners. The technology already exists to produce speech which is indistinguishable to most listeners in best case scenarios.”
And cyber thieves are increasingly scouring social media for clips of people’s voices so they can mimic them in potentially devastating scams. Some sites claim they need as little as ten seconds of audio from a person to pull off the vicious stunt.
But Dr. Luca Arnaboldi, Assistant Professor of Computer Science at the University of Birmingham, U.K., wasn’t so sure about that:
“I am skeptical of how good they’ll be with ten seconds. But I’ve tested other AI sites needing 50 seconds that work better. The more data, the easier it is, so obviously having your voice out there is an inherent risk.”
(And I have … hours of sermons!)
Dr. Arnaboldi continued:
“We as users should be conscious of what we share, but I don’t think it should be on the user. I think part of the burden has to be on companies not to let themselves be taken in by such malfeasants.”
In any case, if you get a call that sounds like a loved one but comes from an unknown number, treat it as a massive red flag. Hang up and ring them back on the number you already have for them to see whether the claimed crisis is real. And if you’re really concerned about AI voice clones, there’s a clever trick you can set up with family: a safe word, like a password.
Dr. Lo added:
“Social media accounts have been prime targets for personal information even before the rise of deepfakes, so taking care around how much, and what kind of, information you put on social media applies not only to the voice. Given that the voice can be vulnerable to impersonation, it might be worth thinking carefully about how much we wish to rely on it as the sole or primary means of verifying identity in different parts of our personal lives.”
There is only one voice that is unclonable:
The voice of Yahweh is upon the waters …
The voice of Yahweh in power,
The voice of Yahweh in majesty.
The voice of Yahweh breaks the cedars ….
The voice of Yahweh splits flames of fire.
The voice of Yahweh quakes the wilderness ….
The voice of Yahweh quakes the large trees and strips the forests [bare].
Psalm 29:3–5, 7–9
Listen to that one, for sure, and add your voice to these:
And in His temple everyone says,
“The glory [of] Yahweh sits [enthroned] at the flood;
And Yahweh, King, sits [enthroned] forever. …
Yahweh, He blesses His people with peace.”
Psalm 29:9–11
No worries about voice impersonation!
SOURCE: The Sun