Analysis of Business Ethics

This should provide you with an overview and template; be sure to add your own opinion. I have listed several additional sources for you.

1. What makes an ethical decision or issue ethical? How would you explain the differences between ethical/nonethical and ethical/unethical? What ethical issues or dilemmas have you experienced in the workplace?

Decision-making is one of the fundamental keys to the survival of an organization, all the more so now that economic boundaries between countries crumble, business becomes more complex, and the results of decisions often have global impact.

Decisions are made constantly in business; they are part and parcel of being effective in one's job. Regular innovation and improvement are required to maintain and sharpen the ability to make rational decisions, and some psychologists even believe that the ability to make effective decisions is at the core of an individual's success or failure within an organization (Porter, 1998).

Managers, in particular, realize that if their organizations are to survive in this dynamic and uncertain environment, they have to make decisions concerning new business opportunities, products, customers, suppliers, markets and technical developments.

This clearly indicates that the most important managerial attribute is the ability to make the right decision. The outcomes of those decisions are used as the benchmark for evaluating whether managers are successful (Drucker, 2001).

However, not all decisions can be made on a relaxed or considered timetable. Most managers would prefer a reflective form of decision making, one in which they had a bit of time to review the options, to do any primary or secondary research, and to glean views and opinions from colleagues.


In an ideal situation this would be possible, but as we know, day-to-day business decisions are not ideal. Sometimes expedient decision making must occur, and for that, businesses must rely on the experience, wisdom, and trust that are part of any quality management team (Drucker). Several biases come into play when business decisions are made, often unrecognized by the decision-maker. One of the most common is the “confirmation trap,” in which decision makers seek out information and references that support what they believe to be true while ignoring information that would refute their decision (Bratvold, 2002).

Hindsight bias is also common in decision making: individuals overestimate the degree to which they would have predicted an outcome once they have been informed that the outcome has indeed occurred. New research has shown that hindsight bias, while negative in some respects, plays an influential role in the human memory system (Hoffrage, 2000). Additionally, “anchoring” or “focalism” is one of the most common biases in decision making. Essentially, anchoring is the tendency to rely far too heavily on one trait, piece of information, comment, or personal view when making decisions.

Individuals then typically adjust the rest of their view to match that one piece, again negating other data (“Anchoring Bias,” 2009). Of course, ordinary human bias also enters into the decision-making process: the way the individual perceives reality, the illusion of control, personal feelings about or background in the subject, or even something as simple as not liking the individual or organization on whom the decision is centered (Simon, 1979). Decision-making can be critical in two very common business scenarios.

First, when an organizational change occurs there are numerous risks that can take many forms and that require new and unique decision-making patterns. There is confusion about what the change means and how to manage it, loss of confidence and morale among staff and stakeholders, and general panic about the new processes of decision making (Drucker). Krispy Kreme, the decades-old pastry maker, recently underwent a severe and rapid change of direction when two events collided and caused a conundrum both internally and externally.

First, the popularity and media coverage of the no-carb, low-carb Atkins diet was at its height during 2003-04, causing a loss of market share for the company. At the same time, both franchisees and the SEC began to notice accounting irregularities that overinflated company value while diminishing franchise returns. The combination of these two events was dramatic, and the culture needed immediate change. A new CEO was hired, new decisions were made, new strategic plans were launched, and the company is now, despite the recession, well on the road to recovery, all based on rational, strategic decision making (Taub, 2005). Another very common business scenario is the merging of two corporate cultures into one. Decisions that were once solid become amorphous; normative business-culture behavior changes; and employees and managers alike are left wondering what criteria will be used to make the new decisions that will affect them personally.

For example, when Starbucks acquired Seattle's Best Coffee, speculation and panic arose, both throughout the organization and in the media, about some of the very basic decisions that affected employees and managers in both companies: what would happen to the SBC stores and employees, what style of roast consumers would prefer, and what changes would be made in the short and long term (“Starbucks to Buy,” 2003). Styles and methods of decision making are a critical part of both tactical and strategic management in the business world.

Styles have been categorized, types of decisions defined, and numerous materials written about formulating logical and rational means of using information to make the best decisions possible. In the final analysis, however, it is the maturity and ability of the decision maker to synthesize data, guard against excessive bias, and rationally weigh the various options that are the key to positive decision-making behavior.

When one looks at the history of any philosophical subject, however, it is important to note that differing concepts of philosophy often arise out of the very historical and cultural fabric of their time, and then evolve so that they become more acceptable to future generations than to contemporaneous ones (MacIntyre, 2006). While the topic of ethics is vast, and a long essay could be written about each individual contribution, we can divide Western philosophy into seven major periods, at least to group basic currents of thought appropriately.

First, for western philosophy, the basis for the study of ethics arose in Ancient Greece. Socrates, for example, believed that happiness is the ultimate goal of human action, and that virtue (morality) is the means of reaching it. Since all humans intrinsically seek happiness, it must follow that all humans are moral and just, and that any evil is nothing more than ignorance. Socrates' student Plato expanded on this view, indicating that virtue is not just the absence of ignorance but consists in moral virtues such as justice, fortitude, temperance, and harmony.

Aristotle, drawing on both ideas, used his observation of the world to show that it is really only happiness that all humans strive for, and that everything else is simply a part of that ultimate activity (Roochnick, 2004). The bridge between Greece and Rome seems to be the philosophy of moral Hedonism, which holds that it is virtue itself that makes humans happy, and that anyone seeking wisdom will move away from human law toward a higher calling (a view also called Cynicism).

Moving from this idea, the Stoic movement in both Greece and Rome (Seneca, Epictetus, and Marcus Aurelius) held that focus should rest directly on virtue, and that passion, or any deviation from that focus, is inherently evil. Cicero of Rome is interesting in that he took views from most previous philosophers and established the idea that actions are not good or bad because of human action, but because of their own, individual nature.

This was a more “reasonable conglomeration” of ethics; Cicero focused on the rational and on obedience to a higher authority, which, interestingly enough, seemed to lead right into the next phase, Christianity (Saunders, 1997). Even though Cicero focused on obedience, none of the ancient philosophies thus far really explained the nature of ethical behavior and morality in the relation between man and God. This may have been because the concept of God was not a single, solid framework, but rather a plurality of gods for individual occasions, events, and so forth.

It was not until the rise of Christianity that the relation of the “temporal” world to the “spiritual” world began to be examined. Most of this early philosophy centered on the new Christian Gospels, later adapted and enhanced by the medievalists St. Augustine, St. Thomas Aquinas, and other Catholic philosophers. Whereas earlier ethics sought to explain man's condition in a more humanistic vein, the very nature of Catholic society showed that it was all-encompassing and needed the guidance of an intermediary (the Pope), of Jesus as the Son of God, and then of God himself (Tomkins, 2006).

With the rise of Protestantism and the Renaissance thinkers' renewed focus on Greece and Rome, newer ethical philosophy began to reexamine the role of the individual and the human desire to attain perfection. Thomas Hobbes, for example, believed that humans were born into a state of perpetual sin, darkness, and lack of moral continuity; it was then the job of government and society to control the urges of humanity and guide them toward perfection.

John Locke, in contrast, believed that humans are born good and wise, and intrinsically want to attain the highest good, cooperate, and move toward a more just society. It is government's job, then, to assist in that discovery through benevolent support rather than control (Syse, 2004). Modern philosophers, of whom there are many, tend to take the sociological and technological changes in society and combine them with previous philosophical work to help decide just what it is that humanity strives toward.

There have been the nihilists (Nietzsche, who declared God “dead”); the evolutionists, spurred by the work of Charles Darwin; and the Marxists (Karl Marx, Friedrich Engels), who believed that within each human lies a certain task that is their contribution to society as a whole, “from each according to his ability, to each according to his needs,” a view that spurred some of the major revolutions of the 20th century (V. Lenin and Mao) (Scruton, 2001).

Of course, the study of ethics did not end with Lenin and Mao. Modern social philosophy has its roots in Descartes (“I think, therefore I am”), Kant (bringing unity to rationalism), and the existentialists (thought begins internally and all else flows outward), and it extends to the newer contemporary philosophies surrounding the use of language and culture (Bertrand Russell), deconstructionism (the critical analysis of thought and work through minute examination of the text), and even combination theories like Post-Structuralism (there are no absolute truths, and the goal is to expose the fraudulence of the self as the final entity) (Delacampagne, 2001).

Rather than viewing the overview of ethics as a “classical” versus “modern” approach, what seems apparent is that there has been a powerful but gradual evolution in theory that began in Ancient Greece and has simply been reinterpreted in light of more contemporary cultural and societal ideologies.

References:
Delacampagne, C. (2001), A History of Philosophy in the 20th Century, Johns Hopkins University Press.
MacIntyre, A. (2006), A Short History of Ethics, Routledge.
Roochnick, D. (2004), Retrieving the Ancients: An Introduction to Greek Philosophy, Wiley.
Saunders, J. (1997), Greek and Roman Philosophy After Aristotle, Free Press.
Scruton, R. (2001), A Short History of Modern Philosophy, Routledge.
Syse, H. (2004), Natural Law, Religion, and Rights, St. Augustine's Press.
Tomkins, S. (2006), A Short History of Christian Philosophy, Eerdmans.

4. What is the difference, in your mind and in common usage, between a perception, a generalization, and a stereotype? Provide an example of each. Are each or all consistently unethical judgments, or are they sometimes or always ethically justified in their use and implementation?

There are no absolutes; there are no firm rules across cultures; nothing is always or never. Basic definitions: a perception is a process by which an individual attains awareness or understanding of the world. Perception is the way we see something, not necessarily how it is.

When people view something with a preconceived concept, they use that concept to form their perception of reality, because the human mind can only contemplate what it has been exposed to. For example, someone who had never seen an airplane would describe it in terms of their own world, probably as a bird. How you perceive something is unique and individual, based on experience. A generalization takes a conception to less-specific criteria; it is a foundation of logic and human reasoning, and a way for us to make the world workable for communication. An animal is a generalization of a bird, because every bird is an animal but not every animal is a bird. A stereotype is an oversimplification, a way to define a group based on characteristics that may be true at times but are not universal, and it is the basis of prejudice.

For example: all X-Men have claws; therefore, all X-Men are evil. Culturally, for example, consider the prescriptive norms, rules, roles, and networks of a specific culture. Culture is defined as a way of life developed and shared by a group of people and passed down from generation to generation. Culture provides us with a framework to organize our activity, and thus also allows us to predict the behavior of others. There are different cultural formations, and these formations depend on complex elements; the major ones are language, regional differences, religious beliefs, and political systems. People from different nations provide a good example, such as Turks (from an eastern culture) and the British (from a western culture).

Not all members of a cultural group share all elements of their culture; within larger cultural groups, smaller cultural groups form. A very basic example is Alevi, Sunni, and Shia Muslims. People usually belong to more than one cultural group, and different subcultures are created among people; these distinguish themselves within their main culture while still sharing a set of common characteristics. It is possible to build understandable messages by describing and comparing cultures. Results from such studies tell us about the values and beliefs of a specific cultural group, what is accepted, and what counts as appropriate behavior. The roles people play in a society can be understood from these accepted behaviors as well.

The main dimensions of cultural difference can be listed as Individualism-Collectivism, High and Low Context, Power Distance, Masculinity-Femininity, and Uncertainty Avoidance. Culture is learned, passed down through parents and peers, reinforced with positive responses or discouraged with negative ones. For this very important reason it is possible for us to communicate across cultures. Learning a culture for communicative purposes can be done using the methods of the following fields: psychology, sociology, anthropology, linguistics, political science, economics, and history.

5. Term papers on practically every subject imaginable are available on the Internet.

Many of those who post the papers defend their practice in two ways: (a) these papers are posted to assist in research in the same way any other resource is posted on the Web, and should simply be cited if used; and (b) these papers are posted in order to encourage faculty to modify paper topics or exams rather than simply bring back assignments that have been used countless times in the past. Are you persuaded by either argument? Is there anything unethical about this service in general? If so, who should be held accountable: the poster, the ultimate user, or someone else?

Ethical behavior is up to the user. It is not necessarily unethical to produce a gun, for instance, because that gun can be used to hunt, for self-defense, or simply as a deterrent. However, that gun can also be used for something unethical; so could a butter knife. In the same way, it is not unethical to produce or sell term papers; it depends on their use.

For instance, if someone is assigned a semester term paper on the medical practices of Saudi Arabia but is completely in the dark about the topic, purchasing a template can be a great idea IF it is used as a template, a guide for further research, and a way to get some ideas on organization, much like looking up an encyclopedia article, a journal, or even a book chapter. If, however, a paper is purchased, changed only in font or pagination, and given a new title page, it is immoral, because the student, even though they paid money for it, is passing it off as their own work rather than using it as a research tool. For instance, the following is from an article at http://www.internationalstudents.org/academic-ethics-term.php?idlv2=69&idlv3=226: If you analyze the purpose for writing a term paper, you realize that your instructors want you to research and learn about a particular topic and understand it well enough to explain it thoroughly in your own words, in an organized and reader-friendly way.

Depending on the assignment, some research papers summarize and pull together all the pertinent information about the topic in your own words, whereas other papers present not only the information available but expand on it and sometimes build a new theory or case based on it. This second kind of paper demonstrates your grasp of the information and challenges you to take it further based on your own interests and goals. However, in this day of high technology and easy access, people tend to get lazy and rely on others to do the hard work for them. Certainly this has its advantages in areas where it doesn't make sense to “reinvent the wheel,” but in situations where the work itself builds your skills, experience, and understanding, it only makes sense to do the work yourself. What will a degree in your field profit you if you don't actually have the knowledge and expertise to go with it?

So, while you will find such resources, and other students will pressure you to take advantage of the term paper writing services available, ethically you should not consider paying someone else to write your paper for you. Do not hesitate to use the services of the Writing Center or a tutor to help you say what you want to say, or to learn how to organize your information and write it clearly, but do not pay someone to do the work for you. Term paper writing services will try to convince you that buying their services will actually benefit you, but remember the bottom line: the purpose of writing your term paper is not to see how much someone else knows or understands about your topic and what they think about it, but what you know, understand, and think.

Many, if not all, of the companies that offer term paper writing services plagiarize (use someone else's words), recycle their papers, make up information, and/or botch the bibliography. The very fact that they sell these services to students who will earn degrees based on information they have not learned or processed should cause you to question their ethics right away. Knowing the state of the world we live in, teachers have had to create some tricks of their own to discourage this practice. University professors have methods for recognizing when students attempt to pass off someone else's work as their own, and the consequences often include failing the class and even being dropped from the degree program. They take this kind of behavior very seriously.

Analysis of Business Ethics. (2018, Oct 18).
