How will AI invade the world's elections this year?
PLUS Voice cloning and protecting your identity...
Remember when terrible speeches and presentations would always start with an Oxford English Dictionary definition of the topic? That has been replaced by “I asked ChatGPT what it thinks about X… <wait for reaction of laughs or gasps>”. It’s excruciating, but once that novelty wears off, what will the world be like when AI is completely commonplace?
This week Meta’s oversight board found that it was okay with Facebook keeping a video that suggests US President Joe Biden is a paedophile… because it was created WITHOUT AI. The board is funded by Meta but run independently; even so, the ruling is a shrewd move, because Meta doesn’t want to get caught policing the accuracy of all the content on its platforms, especially going into an election year. It is easier for Meta to simply target content that has been manipulated by robots.
Around 49% of the world’s population is set to go to the polls in at least 64 countries (plus the European Union) during 2024. The AP has said that as AI deepfakes become mainstream and social media guardrails are taken down, false narratives are going to be easier to spread. So even if individual pieces of disinformation are caught and debunked, the larger invented narratives (because of AI) will become harder to dispel.
Forbes has reported on how deepfaked news segments that appear to be delivered by well-known journalists could influence this year’s elections. Columbia University’s School of International and Public Affairs hosted a panel where top minds laid out the problem and offered solutions, though all of them will take huge resources. What is clear is that governments need to stay up to date, use technology to educate voters, and avoid perpetuating problematic information with their own hallucinating chatbots.
How this relates to audio…
I have been worrying about voice cloning for a year now, and nothing that has happened in the past 12 months has dampened those fears. We are now living in a world where a politician’s voice can be cloned (with just a few minutes of audio) and used against them. What has concerned me even more is how readily we (me included) hand over voice samples to paid services like ElevenLabs to clone our own voices and make our lives more convenient.
The security debate got supercharged last week when ElevenLabs announced a way for people to detect whether a sample of audio was created by their service. So, if you get a voice note that you think is fake, you need to go to ElevenLabs, upload it, and they will tell you whether it is a cloned voice. The idea that this will be useful against the rapid disinformation cesspool of platforms like WhatsApp is laughable: it requires someone to know about voice cloning, to know about ElevenLabs, and to know how to screen the audio. Pure lip service.
There is another wrinkle to this: ElevenLabs (who I subscribe to and really like) have opened a “store” for voice actors. Apple has enticed everyone to chase the dream of running a digital store and skimming off the top. The ElevenLabs store involves you putting your own voice up for sale: content creators across the globe can browse it, and if they like your sultry tones you’ll make some cash. This beautifully circumvents the security question by introducing a financial incentive. If you wilfully document and sell your voice for scraps, surely you are giving up a major part of your IP? Our voices are for sale… and in the process we will make revenue and muddy the question around vocal privacy.
How this relates to everyone…
This is a good time to tell you about a new podcast called Challenging The Truth, which is coming out at the end of February. It will explore how journalists and activists are reporting on disinformation in Africa. The show is produced by our sister organisation Develop Audio in partnership with DW Akademie and the German Federal Ministry for Economic Cooperation and Development.
A different journalist from a different African country will host each episode and tell their own story about how they are figuring out this problem.
This week’s AI tool for people to use…
I have been exploring how AI is being used in creating art (it is going to be the topic of next week’s newsletter). In that spirit, Open Art AI boasts that you don’t even need to use “complicated prompts” to generate an image. We are already living in a post-prompt world. But in some ways it is a step towards having more control over the AI and getting the results you want.
What AI was used in creating this newsletter?
I transcribed the video of the Columbia panel using Descript, because the GPTs inside ChatGPT that claim they can transcribe are mostly just chatbots that advise you on which paid services you could use. I then used ChatGPT to summarise the video, and Bing to create the pixel art image.
In the news…
The latest episode of The Future of Journalism podcast from The Reuters Institute for the Study of Journalism at Oxford University paints a very informed but pretty bleak picture of where media is going…
Meanwhile, the second part of this Newsroom Robots podcast is energising and exciting about how AI is going to change the media.
What’s new at Develop Media?
I want to give a huge thanks to The Global Investigative Journalism Network for singling out Develop Audio’s podcast series The Last Afternoon In The Garden as one of the best investigative podcasts in the world last year. It is an incredible honour. It wouldn’t have been possible without funding from The Henry Nxumalo Foundation.
We have had a very positive response to our two newsletter courses. One is on Investigative Podcasting and will teach you how to build a serialised, in-depth piece of audio journalism. The second is on Branded Podcasting and will teach you how to build a series for a client. Thanks must go to DW Akademie, Barbara Gruber and her team for making these courses a reality. You can sign up now and start immediately. They are completely free.
See you next week. All the best,
Join our WhatsApp Community and visit our website.
You can email me directly on paul@developai.co.za.
Listen to Develop Audio’s podcasts and sign up for their free newsletter courses.
Contact us on X, Threads, LinkedIn, Instagram or TikTok.
Physically we are based in Cape Town, South Africa.
If you aren’t subscribed to this newsletter, click here.