YouTube Changes Security Measures Prior to 2020 Election
YouTube has become the latest social media firm to rethink its security measures and content policies as the 2020 United States presidential election gets underway. The Google-owned company has announced plans to remove any and all content that aims to mislead voters, along with any other disinformation on its service.
This new security update from YouTube arrived just before the start of the Iowa Democratic Party caucuses – the first stepping stone in the 2020 election. Accurate reporting of the results was temporarily halted due to a fault in the tallying app. While the fault was ruled not to be a cybersecurity failure, poor code and even poorer training were clearly the culprits.
Social media has come a long way over the last decade and is now changing the face of politics as we know it. With many claiming that YouTube, Facebook, and others were responsible for the outcome of the 2016 election and the UK’s Brexit referendum, it has become a priority for social media giants to rein in their potential to sway results.
New Security Policies
The 2020 election promises to be even more contentious than the last. America is prospering financially for most everyday people, with total personal income growing by $101.7 billion over the last year and disposable income by $87.7 billion, the highest on record in American history.
But both sides of the political spectrum are taking credit, and general election polls consistently indicate that American voters are torn over whom to vote for, with neither party holding an obvious advantage.
In an update to its website, YouTube announced that any videos containing clear misinformation, including inaccurate information about the voting process, will be removed. Content or ad scams stating false information about candidates or party members will be deleted, as will anything that gives poor logistical information, such as the wrong voting date.
YouTube will also clamp down on accounts that promote content smearing or over-inflating a candidate’s record, especially content that relates to a candidate’s “technical eligibility requirements”. The platform even referenced attempts to cast doubt on former President Barack Obama’s place of birth during the 2008 election. Essentially, anything designed to manipulate voters will find itself in YouTube’s crosshairs.
YouTube will also delete any channels that attempt to conceal their government connections, hide their true country of origin, or impersonate another person or channel with influence over the election. Likewise, it will remove channels that try to inflate their success using deceptive tactics such as false advertising – for example, a video that promises a certain story or footage but delivers nothing of the sort.
YouTube runs on an open, user-generated content model, which means anyone can create an account and contribute to the platform. But it also means that bad actors can reach thousands of viewers more easily than ever before, and the need for YouTube to take responsibility has become clear.
As a result, these steps should go a long way toward ensuring that YouTube controls its influence on the US elections. By changing its content policies to restrict the effect bad actors can have on the election, the argument goes, it helps empower voters to make genuinely informed choices rather than falsely informed ones. At a time when election interference by foreign powers is still making headlines, this is more crucial than ever. But it is not the final step.
Bringing Real Journalism to the Forefront
To truly assist voters and promote a fair election, YouTube is also attempting to cultivate an environment of real journalism. One way it plans to do this is by ranking the most reliable, authoritative news and political content higher in the “watch next” and recommended panels.
New ‘Breaking News’ and ‘Top News’ sections will highlight the best-quality journalism, while information panels will make it clear when video publishers receive government funding. This makes it easier for viewers to distinguish sponsored information from independent reporting.
When YouTube’s users search for videos and information on presidential candidates, the service will present an information panel with details about the candidates as well as links to each candidate’s official YouTube channel. This is similar to what it offered users during the 2019 EU parliamentary elections and the 2018 US midterms.
This is important because, while Facebook may consistently rank as the number one source of misinformation (and thus sit at the forefront of the push against it), YouTube is still the second most used social media platform, particularly for news content. With over two billion monthly users, YouTube must make a serious effort to meet its responsibility toward the election.
Dangerous New Technologies
Numerous new technologies are spreading false information and directly threatening people’s online security today as well. This is one reason the United States government plans to spend $15 billion on cybersecurity in 2020 – protecting infrastructure, businesses, and individual consumers – an increase of 4.1% from two years ago.
For example, the rise of deep fakes and doctored videos has been widely reported. Videos have circulated showing movie roles artificially recast with different actors, or incredibly believable impersonations of public intellectuals. As this technology grows more sophisticated, its power to influence elections becomes more profound.
After all, if a convincing deep fake can be made of a candidate saying something offensive or contrary to their actual beliefs, this could be used to sway voters one way or another.
Of particular note was the infamous video of Nancy Pelosi slurring her words – a video that Facebook refused to take down. The video was doctored and designed to cast Pelosi in a bad light, bringing the company under fire from the Democratic Party and others, notably Pelosi herself.
Given the backlash Facebook faced, one could suppose that YouTube’s motivation for blocking doctored videos and deep fakes is simply to save face. Regardless, doing so could have a net positive effect.
Conclusion
These new rules should help stem the flow of bad actors looking to use YouTube to manipulate voters, and start the process of making its content more accurate, trustworthy, and responsible. However, YouTube still has a long way to go before it can curtail the spread of misinformation on its platform.