In response to the question "What would you like to see YouTube become?" asked by Charlie Rose back in 2006, Chad Hurley, one of the co-founders, replied, "We see ourselves building the next-generation platform to serve media worldwide" (Charlie Rose, 2006). And he was right: it was the beginning of a social revolution (Barkham, 2010).
YouTube is a brilliant example of a classic Silicon Valley garage-to-glory tale (Delaney, 2006).
What started off as a simple idea, a way to tackle the problem of uploading videos online, grew into a platform that became a destination in its own right (Moylan, 2015).
The initial days were marked by struggles: the product was primitive, and the website itself picked out videos for people to watch (Wasserman, 2015).
The "Me at the Zoo" video was a breakthrough that gained people's attention.
During the beta, the founders were unsure what exactly YouTube was, and considered whether it could be a dating site (inspired by HotOrNot) or a video-messaging site.
A barbecue dinner party turned the tables for YouTube and answered all its money woes: it was there that Karim met Keith Rabois, and later Roelof Botha, who went on to invest (Wasserman, 2015). YouTube made sharing easy, and this prompted the boom of online video; the founders had found their big break (Wasserman, 2015).
Google acquired YouTube on October 9, 2006, and it has been a social revelation for YouTube ever since (Summers, n.d.). Chen was inspired by how Google, the biggest search giant, managed to monetize its platform without hurting users: "It translated over to YouTube as well. There are people that create content, view content and pay for content. If there is a situation to make all three parties happy, it's a win-win" (Summers, n.d.).
YouTube's central revenue stream is advertising (Wikipedia, n.d.). In his book Internet and Society, Christian Fuchs writes, "YouTube is an example of a business model that is based on combining the gift with the commodity. The first is free, and the second yields profit" (Wikipedia, n.d.). The shared and remixed free content is the allurement on which the company builds its revenue foundation of advertising and premium services (Wikipedia, n.d.).
YouTube CEO Susan Wojcicki gave an insight into the platform's numbers, revealing that YouTube currently has 2 billion monthly active users, a roughly 5% increase on the 1.9 billion logged-in users reported in 2018 (Iqbal, 2019).
It is the world's second-biggest social network platform, as reported in We Are Social and Hootsuite's Digital 2019 report (Iqbal, 2019). YouTube's watch time has skyrocketed to more than 250 million hours per day, 39% higher than the 180 million hours reported in 2018 (Iqbal, 2019). 95% of the worldwide internet population views YouTube, and there are more than 50 million creators (Omnicore, 2019).
This demonstrates a steady digital ecosystem, a platform where users and creators co-exist and mutually benefit from each other. Google is turning to YouTube as its next potential driver of growth, with YouTube's net advertising revenue at $4.95 billion (Omnicore, 2019).
Private Regulation of content on the Internet
In YouTube's Creator Blog, Susan Wojcicki addresses the implications of having a platform built on the premise of openness (Wojcicki, 2019).
She says, "Without an open system, diverse and authentic voices have trouble breaking through" (Wojcicki, 2019). But openness brings a fair share of problematic content, which has a monstrous impact on the community and potentially harms the ecosystem, giving reason to doubt an open model that has nurtured and built a sharing community.
The practical difficulties of monitoring content on the Internet are huge, and private regulation often entails heavy financial costs, which involuntarily disincentivize the big players from monitoring user-generated content online (Balteanu & Marcu, 2014).
The Internet's status as a global commons makes it difficult to regulate data online, owing to its borderless nature (Nair, 2007). There has been growing friction between governments and private companies over the transparency of content spreading online through the Internet services these companies provide (Arthur, 2017). France and Germany have imposed fines on companies that allowed Nazi-related content to remain online unattended (Arthur, 2017).
Twitter has created a venue where users can be harassed at astonishing scale, as in the Gamergate dispute and alt-right harassment campaigns (Arthur, 2017). The UK government has put forward the Online Harms White Paper, proposing a mandatory duty of care on platforms, pushing regulation and responding to the mental-health impacts of harmful online content (Lomas, 2019).
Facebook has faced the wrath of being the platform of choice for disinformation and fake news spread by governments and political parties around the world (Hanbury, 2019). YouTube sees about 400 hours of video uploaded every minute, and research shows that extremist content can remain on the platform for days, although the median period is 24 hours (Arthur, 2019).
This open system poses challenges that weigh heavily on these companies, with dire consequences if not addressed.
YouTube’s Revenue Model – Ads on YouTube
YouTube's primary revenue comes from sponsored, embedded and landing-page advertisements (Dutta, 2019). Google is the biggest data repository, capturing our interests and search preferences, and it helps YouTube target advertisements by creating affinity and in-market audiences (Smith, 2019).
It also has a freemium revenue model in YouTube Premium. Taking inspiration from Patreon's business model, channel memberships are another source of income, with YouTube retaining 30% of each subscriber's membership fee. Similar to its competitor Hulu, YouTube TV is another addition to the revenue model, with more than a billion users (Dutta, 2019).
Importance of Digital Brand Safety
According to Forrester, fraudulent online ads will lead to a revenue loss of $10.9 billion by 2021 (Zitto, 2018). Over the years, programmatic advertising has seen the biggest boom. Although its targeting and relevancy have led many companies to big sales numbers, it has downsides of a rather dark nature.
Sometimes, in an open marketplace, advertisers focus more on cheaper inventory than on transparency and individual ad placements (Vranica, 2017). "Malgorithms" are misaligned ads placed alongside irrelevant content, and such misalignments can often damage a brand's reputation (Bannerflow, 2017).
Companies that run open platforms and act as libraries of online content are often bogged down by the challenge of controlling and analysing this inflow and outflow of content.
Facebook has taken a considerable dip in brand safety owing to its fake-news scandal, with YouTube following closely after the string of brand-safety breaches within the company. Unwanted bot traffic and fraudulent clicks have added to the difficulty of maintaining 100% brand safety online (Bannerflow, 2017).
It is important to look into the contextual implications of the online content where your ads are placed, ensure it fits your brand-care guidelines, and personalize the brand-safety strategy at the ground level (Welch, 2019).
There have been considerable AI- and machine-learning-driven advancements to improve transparency and build tools that can review and classify content at a more granular level. YouTube has had a flurry of events that affected the company's brand image and had severe implications for customer loyalty on the platform.
According to analysts at Nomura Instinet, the advertiser boycott could cause YouTube's revenues, estimated at approximately $10.2 billion for 2017, to take a hit of up to 7.5% (Rath, 2017).
YouTube uses a theme-guided algorithm to place ads right before videos on monetized channels (Alexandra, 2019). Advertisers do not pay much heed to individual ad placements; instead, they are more concerned with buying audiences through the programmatic method of bidding on each ad slot (O'Reilly, 2017).
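The programmatic bidding described above can be illustrated with a minimal sketch of a sealed-bid second-price auction, a mechanism commonly used by ad exchanges. The advertiser names and bid values here are hypothetical, and real exchanges layer in targeting, price floors and fees on top of this core logic.

```python
# Illustrative sketch of programmatic ad bidding: a simplified
# second-price auction for a single ad slot. All names and numbers
# are hypothetical, not taken from any real exchange.

def run_auction(bids):
    """Return (winner, price) for a sealed-bid second-price auction.

    bids: dict mapping advertiser name -> bid (e.g. CPM in dollars).
    The highest bidder wins the slot but pays the second-highest bid.
    """
    if len(bids) < 2:
        raise ValueError("need at least two bids")
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]  # winner pays the runner-up's bid
    return winner, price

slot_bids = {"BrandA": 4.50, "BrandB": 3.80, "BrandC": 2.10}
winner, price = run_auction(slot_bids)
print(winner, price)  # BrandA 3.8
```

Because the winner pays the runner-up's price rather than its own, bidders have no incentive to shade their bids downward, which is one reason this auction format became the default in programmatic advertising.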
What marked the beginning of this dreaded Adpocalypse was when YouTube sensation Felix Kjellberg was found portraying offensive and insensitive rhetoric online, owing in particular to his reaction to a "Death to all Jews" sign in one of his videos.
There was public backlash, with heavy consequences: he was dropped by Disney, lost his YouTube Red series, and was removed from the Google Preferred list (Alexandra, 2019). In March 2017, an article in The Times confirmed instances where UK government-funded advertisements appeared on hate sites promoting anti-Semitism and on offensive videos by white nationalist David Duke (Palladino, 2017).
The implications of such promotions came down on YouTube, and soon many top advertisers, including Wal-Mart, Starbucks, General Motors, FX Networks, Johnson & Johnson and more than 200 others, joined the blackout and removed their ads (Sherr, 2017).
What followed was even more appalling: sexual imagery portrayed through unlicensed Disney characters, curated to look kid-friendly (D, 2017). The videos contained sexual instances, drugs, toilet humour, fetishes and disturbing events shown through beloved animated characters (Wikipedia, n.d.).
Their titles, descriptions and keywords were tailored to skirt past the built-in child-safety algorithms, and they also leveraged the ads automatically placed on them (Placido, 2017).
The child-exploitation scandal came to light when ads started appearing on videos of "scantily-clad children", prompting companies like Adidas, Mars, Cadbury, Hewlett-Packard and Deutsche Bank to put their global ads on hold (Handley, 2017).
BuzzFeed brought YouTube's autocomplete feature to light, which displayed results that autofilled to child sexual exploitation (Gillispie, 2018). Toy Freaks, a channel with 8.5 million subscribers, was terminated for posting such contentious content (YouTubeWiki, n.d.).
Another public upheaval followed soon after, when YouTube star Logan Paul was embroiled in controversy for filming a dead body in Japan's "suicide forest" and reacting to it in his video blogs with no remorse (Leskin, 2019). YouTube was facing the wrath of users misusing the platform, unable to moderate content fast enough to avoid such situations.
A Pew Research Center experiment showed how the YouTube algorithm recommended progressively longer videos and how popular videos were constantly repeated in recommendations (Iqbal, 2019).
The experiment ran 175,000 trials, in which 134 videos were recommended at least 100 times; of the top 50, 11 were classified as targeting children (Iqbal, 2019). The most frequently recommended video category was children's content (Iqbal, 2019).
The #YouTubeWakeUp movement became prominent when YouTuber MattsWhatItIs posted a video identifying a "wormhole" that allowed pedophiles to contact each other and trade child porn (Katzowitz, 2019). He outlined how YouTube's recommendation algorithm connected pedophiles, who traded contact information, images of children in compromising positions, and actual child pornography links in the comment sections through timestamps (Katzowitz, 2019).
It took him less than 10 minutes, and sometimes fewer than 5 clicks, to enter this vicious wormhole from a never-before-used YouTube account; once down the rabbit hole, the algorithm surfaced only child-exploitation videos (Katzowitz, 2019).
Nestlé, Epic Games, Disney and Peloton reacted to this debacle by stopping advertisements on YouTube (Francis, 2019). YouTube's inability to retain brands showed when AT&T, which had just resumed advertising on YouTube after a two-year halt, stopped its ads again over the fiasco (Francis, 2019).
Steven Crowder, a conservative YouTube commentator, set off another fire in this ongoing war over content moderation and hate speech when he was accused of making jokes and insulting comments about a Vox journalist, Carlos Maza (Goggin, 2019).
Maza went on to accuse YouTube of failing to moderate content that prompted homophobic harassment, which was followed by outrage from many left-wing groups over the lack of action taken (Goggin, 2019). YouTube's response suggested that such comments were opinions, that nothing amounted to a personal attack against Maza, and that the videos did not violate its policies (Re, n.d.).
Succumbing to criticism, YouTube introduced tightened monetization policies, completely stripping Crowder of monetization; several right-wing channels, such as Black Pigeon Speaks and Louder with Crowder, were also demonetized (YouTubeWiki, n.d.).
According to BuzzFeed, the most basic searches on YouTube would lead to conspiracy theories, and Motherboard reported on the mass of 9/11 newscasts being recommended to YouTube users (Dozier, 2019).
How has YouTube worked towards better Digital Brand Safety?
YouTube is constantly fighting a losing battle when it comes to content moderation, owing to the sheer volume of content being uploaded every second. Although human intervention is needed, YouTube has to leverage technological advancements to better analyse and monitor content online.
Soon after the company was slapped with the accusation of ads playing on extremist content, Google General Counsel Kent Walker laid emphasis on video-analysis models that would leverage machine learning and AI to learn from previously removed violent content and train machine "content classifiers" to identify and terminate such content (Kho, 2017).
YouTube uses an "anti-abuse" machine learning algorithm that recognizes violent content and automatically flags it for human review; reports have shown that 75.9% of automatically flagged content was taken down before clocking any user views (Shu, 2017).
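The flag-then-review workflow described above can be sketched as a minimal pipeline: a classifier score above a threshold triggers an automatic flag, and flagged items are withheld from serving until a human reviews them. The scoring function, field names and threshold here are stand-ins for illustration, not YouTube's actual system.

```python
# Minimal sketch of an automated-flagging pipeline: score each upload,
# auto-flag high-scoring items into a human review queue, serve the rest.
# The classifier and threshold are hypothetical placeholders.

from collections import deque

REVIEW_THRESHOLD = 0.9  # assumed confidence cutoff for auto-flagging

def classify(video):
    # Stand-in for an ML content classifier; returns an abuse probability.
    return video.get("abuse_score", 0.0)

def ingest(videos):
    review_queue = deque()  # items awaiting human review
    served = []             # items published immediately
    for video in videos:
        if classify(video) >= REVIEW_THRESHOLD:
            # Auto-flag: withhold from serving until a human verifies it,
            # so violating content accrues no views in the meantime.
            review_queue.append(video["id"])
        else:
            served.append(video["id"])
    return served, review_queue

served, queued = ingest([
    {"id": "v1", "abuse_score": 0.97},
    {"id": "v2", "abuse_score": 0.12},
])
print(served, list(queued))  # ['v2'] ['v1']
```

Holding flagged uploads out of circulation until review completes is what makes the "taken down before clocking any user views" statistic possible.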
Google announced that the new tightened safeguard policies would make ads eligible to play only on authorized creators in the YouTube Partner Program, and would give advertisers more power over individual ad placements (Spangler, n.d.).
Machine learning has enabled the removal of 70 percent of extremist content within eight hours of upload, work that would otherwise have required 180,000 people working 40 hours a week (YouTube Blog, n.d.).
YouTube made use of "The Redirect Method", which redirects potential terrorist recruits searching for hateful content to curated YouTube videos debunking violent rhetoric (Weiss, 2017). The method will be integrated with more dynamic search terms and queries in different languages (The Redirect Method, n.d.).
YouTube implemented the "trusted flagger" program, allowing vetted partners to flag content of a borderline nature, and made use of image-matching techniques to stop re-uploads of already removed terrorist content (Gutelle et al., 2017).
In 2016, YouTube developed a "hash-sharing database" to collect and share digital footprints of terrorist content. In July 2017, this was taken further when YouTube, Microsoft, Facebook and Twitter collectively formed the Global Internet Forum to Counter Terrorism (GIFCT) to share these distinctive digital hash databases and tools, making the online dissemination of harmful content more difficult (Google Transparency Report, n.d.).
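The idea behind a shared hash database can be shown in a few lines: each removed file contributes a fingerprint, and incoming uploads are checked against the pooled set. This sketch uses plain SHA-256, which only catches byte-identical copies; the systems shared through GIFCT use more robust perceptual hashes, and all names and sample bytes here are illustrative.

```python
# Simplified sketch of a shared hash database for blocking re-uploads
# of known harmful content. Real deployments use perceptual hashing
# that survives re-encoding; SHA-256 here is a stand-in that matches
# only exact byte-for-byte copies.

import hashlib

shared_hash_db = set()  # fingerprints contributed by participating platforms

def fingerprint(data: bytes) -> str:
    """Compute a content fingerprint (here, a SHA-256 hex digest)."""
    return hashlib.sha256(data).hexdigest()

def register_removed_content(data: bytes) -> None:
    """Add a removed file's fingerprint to the shared database."""
    shared_hash_db.add(fingerprint(data))

def is_known_reupload(data: bytes) -> bool:
    """Check an incoming upload against the shared database."""
    return fingerprint(data) in shared_hash_db

register_removed_content(b"example-removed-video-bytes")
print(is_known_reupload(b"example-removed-video-bytes"))  # True
print(is_known_reupload(b"example-harmless-video-bytes"))  # False
```

Sharing only hashes, rather than the content itself, lets platforms block known material without redistributing it, which is the key design property of such databases.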
YouTube has partnered with Faith Matters, JFDA, Licra and Observatorio Web to tackle the grey area of hateful content and bring in more linguistic subject-matter experts (Google Transparency Report, n.d.). The YouTube Intelligence Desk was set up in January 2018 to monitor newly uploaded problematic content by analysing new media trends, news and Google user queries (Kaser, 2018).
In its Community Guidelines Enforcement Report, the company reported removing 8.3 million videos over a single quarter: 6.7 million flagged by machines, 1.1 million by trusted flaggers, 400,000 by individual users, 64,000 by NGOs and 73 by government agencies (Weiss, 2018).
How is YouTube taking care of Child Safety?
YouTube has long faced criticism over child safety and has been trying to tackle the issue for the past few years. It has leveraged CSAI Match technology, which identifies child sexual abuse imagery (CSAI) online and flags it for human intervention.
It makes use of hashing technology and maintains a database of these hashes; YouTube has been reported to have shared over 100,000 CSAI hashes with the National Center for Missing and Exploited Children (YouTube Community Guidelines Enforcement Report, n.d.).
YouTube stamps out pedophilic behaviour by taking down comments on videos featuring minors, removing predatory comments, and reporting them to law enforcement (Shieber, 2019). Following the Elsagate scandal, the company responded by removing ads from about 3 million videos that used family entertainment as a cover for violent and inappropriate content (YouTube Official Blog, 2017).
Efforts are being made by YouTube to further automate content monitoring by building a new "classification tool" that identifies and removes abusive user comments twice as fast as before (Shieber, 2019).
YouTube has been slapped with a $170 million fine for illegally using cookies and tracking mechanisms to collect the personal data of children without parental consent. The company plans to stop personalized ads on content for children, eliminate video notifications and comments, and limit user engagement and data collection on such content to a bare minimum (Sarah, 2019).
It will leverage machine learning to correctly identify and classify videos that target children, and will require creators to declare whether their content is directed at this demographic (Solsman & Nieva, 2019).
There has been talk of YouTube weighing a decision to move all kids' content to the YouTube Kids app, which has very extensive parental-supervision measures (Hale, 2019). A $100 million fund will be initiated to improve the YouTube Kids product and create technology that aids kid-friendly programming and manages its content better (Sloane, 2019).
What is Next for YouTube?
Despite all challenges, YouTube continues to grow in size, scale and revenue every year. This continuous growth makes it tougher for the company to regulate content on the platform. Strict enforcement of policies, a strong content-flagging system and a healthy number of human reviewers are what YouTube is banking on to keep its platform safe.
A Social Revolution With YouTube. (2019, Dec 06). Retrieved from https://studymoose.com/a-social-revolution-with-youtube-essay