Humans have always been gifted at adapting to their environment. Difficult decisions, such as which location to settle in or move to, or which leader to side with in battle, were made based on either the experiences of the individual or the general opinion of a very small group. However, as time progressed, so did means of communication throughout the world, such as the telephone, especially following the emergence of the internet in the 1980s and the introduction of the World Wide Web in 1990. As these means of communication improved, so did people's ability to gain information from others about their interests, personalities and relationships.
This information has largely been given out voluntarily through services such as Google, Facebook, Twitter and other large websites, because it has generally been assumed that privacy would still be protected. However, the development of large algorithms, such as those used by Facebook, has taken that information and used it to create more personal experiences for users: serving targeted advertisements based on searches or likes, suggesting restaurants and other public services, and changing website layouts and accessibility based on how a user actually navigates the page.
Facebook founder Mark Zuckerberg would find this revolutionary because it makes experiences more personal, which keeps people connected to the platform.
However, this “automated thought” removes decision making from people who are content with tailored advertisements and suggestions. The consequences of this new form of decision making can be drastic: reduced privacy for users and their personal information; an exponential increase in power for tech companies, who are able to use that information to better meet their bottom line; and the ability for those who control these algorithms to dictate what users can see and search for, reducing their ability to make their own social, religious and political decisions.
Privacy for users of services such as Facebook and Google has always been a major concern for those who understand the information they might be giving away. For example, the use of cookies – small text files that a webpage uses to recognize individuals – has long been on people's minds because they generally get notified about them. However, while cookies are something most people know about, the far more sinister use of algorithms to track users is not. Indeed, as Franklin Foer writes in “Facebook’s War on Free Will,” “(Facebook’s) algorithm was developed in order to automate thinking, to remove difficult decisions from the hands of humans, to settle contentious debates”. Cookies are unlike algorithms in that they are not a step-by-step set of instructions on what to show or not show a user; rather, they store the information needed for the algorithm to work in the first place, such as user preferences and personal details like usernames and passwords. This lack of privacy is not because companies have sneakily added these programs without telling the user, but because users voluntarily agree to terms of service that take away their privacy, since that is the only way to use services such as Facebook, which has a massive presence on the internet and is one of the only places to keep in contact with friends and family.
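To make the distinction between the two concrete, the following is a minimal, purely illustrative sketch in Python; every name in it (the posts, the preference fields, the scoring weights) is invented for illustration and does not reflect Facebook's actual systems. The "cookie" here is just stored identity and preference data, while the "algorithm" is the set of rules that consumes that data to decide what the user sees first.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    likes: int

def read_cookie():
    # Stand-in for the small text file a website stores in the browser:
    # it only holds identity and preference data; it makes no decisions.
    return {
        "user_id": "u123",
        "liked_topics": ["cooking", "travel"],
        "friends": ["alice", "bob"],
    }

def rank_feed(posts, cookie):
    # The "algorithm": step-by-step rules that consume the stored data
    # and decide what the user sees first.
    def score(post):
        s = post.likes * 0.1
        if post.topic in cookie["liked_topics"]:
            s += 5.0   # boost topics the stored data says the user likes
        if post.author in cookie["friends"]:
            s += 3.0   # boost posts from known connections
        return s
    return sorted(posts, key=score, reverse=True)

feed = [
    Post("alice", "cooking", 12),
    Post("carol", "politics", 300),
    Post("bob", "travel", 4),
]
for post in rank_feed(feed, read_cookie()):
    print(post.author, post.topic)

The cookie by itself is inert; only when a ranking function like the hypothetical rank_feed above reads it does the stored data begin to shape what a user is shown.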
At the beginning of Facebook’s terms of service it states very clearly, “We use the data we have – for example, about the connections you make, the choices and settings you select, and what you share and do on and off our Products – to personalize your experience” (Facebook). This very blunt statement confirms the entire point of Facebook: to have users engage with friends and services so that it can collect and use their data under the guise of creating a better experience. While this kind of personalization is liberating in the sense that it unites people with similar interests and personalities, it can also be damaging to those whose views differ from those of the algorithms’ creators, because they are more easily targeted by the algorithm. For example, a certain group may be banned from using online services, and those who share opinions similar to that group’s may be targeted themselves. As Foer puts it, “the point is that Facebook has a strong, paternalistic view on what’s best for you, and it’s trying to transport you there”. The information Facebook collects allows such easy access that companies can refuse to hire, or even fire, individuals because they chose to share information about themselves and their opinions over the internet. However, according to Zuckerberg this may be a good thing, because “having two identities for yourself is an example of a lack of integrity”. As these revelations about the lack of privacy become better known to the public, questions will continue to be raised about how governments can control what information companies take from users, and what they can do with it.
Algorithms have drastically increased the power of the companies that use them effectively and have large user bases, to the point where those companies are able to control the public conversation. In a September 7, 2018 Facebook post, Zuckerberg himself stated: “My personal challenge for 2018 has been to fix the most important issues facing Facebook -- whether that's defending against election interference by nation states, protecting our community from abuse and harm, or making sure people have control of their information and are comfortable with how it's used.” While what Zuckerberg is saying seems genuine and without harmful intent, what counts as abuse or harm is subjective, and with the algorithms so large it seems increasingly difficult to control what information Facebook can take from people, let alone to take back the information it already has. In fact, these algorithms have gotten so large that they are able to shift public opinion on a wide variety of issues and even increase voter turnout. As Foer puts it, “(Facebook) has bragged about how it increased voter turnout (and organ donation) by subtly amping up the social pressures that compel virtuous behavior.” While it may seem reasonable to encourage people to vote, with this amount of influence over such personal issues it seems very likely Facebook could not only increase voter turnout, but perhaps do so for only one side of the political spectrum. This amount of power in the hands of a private company undermines democracy by essentially streamlining the decision-making process for people, not only in who or what they vote for, but also in what stores they shop at, the charities they give to and the values they instill in their children. This monopoly of power does not just apply to democratic countries, but also, and especially, to countries with authoritarian governments such as China, where tech companies are in collusion with the government, as with the Chinese company Baidu, China’s equivalent of Google.
In a 2005 Guardian article, “The man behind China's answer to Google: accused by critics of piracy and censorship,” Jonathan Watts quotes Baidu co-founder Robin Li as saying, “As a locally operated company we need to obey the Chinese law. If the law determines that certain information is illegal, we need to remove it from our index”. This shows that even when a large company appears to have its own political and social interests, it is sometimes steered by the government itself, most likely for its own gain and because of the overwhelming power the government holds over companies. Indeed, according to Ambrose Leung’s article in the South China Morning Post from June 4, 2009, following the 20th anniversary of the Tiananmen Square Massacre, the Chinese government forced internet companies to hold a “national internet server maintenance day” covering most major online networking websites. Foreign sites such as Facebook and Twitter were made inaccessible in China, while their Chinese counterparts such as WeChat and Weibo were brought under the influence of the Chinese government so that they could continue to operate. To summarize, the algorithms of major social media companies in both democratic and authoritarian countries have, in essence, become the predictor and the guide of human decision making, and there does not seem to be any sign of this stopping.
As the trend of companies controlling what users see continues to become more complex and valuable, the great human ability to make one's own decisions based on one's own experiences and the opinions of others could diminish drastically, to the point where everyone thinks and acts the same. This would happen because companies would be able not only to make suggestions, but also to censor and stop the spread of certain information, not just political information but also less consequential suggestions such as where to eat and what to watch. As Foer puts it, “algorithms are meant to erode free will, to relieve humans of the burden of choosing, to nudge them in the right direction.” The continued presence of algorithms unchecked by privacy laws will have a lasting impact on diversity of thought and individualism, which does not seem to bother the leaders of big tech companies; Facebook, according to Foer, “has a team, poached from academia, to conduct experiments on users”.
As time goes on, the incredibly damaging trend of automated thinking will not only continue to undermine democracy, but will also create a greater reliance on big tech for information, advice and even how to act. On a multinational level, political unions such as the European Union will continue to change what information companies can take from users and how they can use it. This may stall what seems inevitable: the end of free thinking in favor of a more unified system of thought that prevents conflict and makes the need for privacy a thing of the past, especially if people believe they are getting a better experience from services that take all their information.