In the 21st century, software is everywhere imaginable: when one checks out at the grocery store, pays with a credit card, drives a car, or listens to music on a new MP3 player, to name just a few examples. Software development has changed greatly since its early days. Awareness of the software crisis has forced engineers to address its problems with various processes and methodologies for "Best Practice", and industry is realizing that formal software processes lead to better products with higher quality and reduced costs. And yet the symptoms of the software crisis are still with us.
Software projects still run over budget, are delivered late, contain large numbers of errors, and are completed against the wrong requirements. There is a crisis in the development of IT projects among software engineers and developers. The term "software crisis" was used in the early days of the software engineering field to describe the consequences of the rapid increase in computational power and the complexity of the problems that could now be tackled. It refers to the difficulty of writing correct, understandable and verifiable computer programs.
The consequences of the software crisis were IT projects that ran over budget and over time, produced low quality, became unmanageable, or did not even meet their requirements. "Best Practice" therefore became important for IT developers. This paper discusses the reasons behind the use of the "Best Practices" that are needed to properly implement IT projects such as software development. Is the "Best Practices" approach an effective means of controlling the implementation of IT projects? Yes: this paper argues that "Best Practice" is an effective means of controlling the implementation of IT projects.
It is important to utilize "Best Practice" for IT projects such as software development. Today, software exists in almost any imaginable device; more and more products are enhanced with software embedded in small-scale computers. As software grows ever more important, the need for quality in our software keeps increasing. A large software development project requires a team of programmers rather than a single individual. Software products are not without their problems; some systems do not function as expected, or do not function at all.
Sometimes software is written that contains faults but is delivered anyway because it works "well enough". The difficulty of writing correct, understandable and verifiable computer programs has led many engineers to declare that we are living in a software crisis. "The roots of the software crisis are complexity, expectations and change. The contradictions of requirements have always troubled development processes: users always demand more features, customers want to minimize the amount they must pay for the software, and the developing company wants to minimize the time required for its development".
While a successful result does not necessarily depend on good project management, poor project management will definitely lead to failure. IT projects run over budget, run over time, or suffer from low quality. To make matters worse, as computers became more powerful, the techniques for developing software stayed the same. With the rapid pace of change in computer hardware, the gap between software technology and hardware technology makes it more difficult for engineers to take advantage of hardware improvements.
As software becomes more and more embedded in consumer electronic products and safety-critical appliances, it has to be more free of errors and risks. However, getting software to work properly the first time is hard for many developers, leading both the developers and their companies into expensive testing. Maintaining software has become very expensive compared to what it was in the 1980s. "American and European giants compete with Asian development companies for contracts; in response many are forming subsidiaries overseas".
Computer science is a very young science. If the 1960s are regarded as the equivalent of a software stone age, the software industry can arguably be regarded as still being only in its middle age, where all the problems of the software crisis are still with us. The term "software crisis" was coined in the early days of software engineering, in the late 1960s. Since then, the processes by which we conduct our software engineering, design our systems, and develop our software have undergone rapid change. Good software development involves a lengthy and continuing process usually referred to as the software's life cycle.
Although there are many different process models and practices in the software engineering community, the life cycle of any software begins with the spark of an idea; in other words, we have a real-world problem that we wish to solve, and the solution is to apply a computer and some software to that problem. Even though software process models differ in many respects, most involve the same general steps. One activity that runs throughout the project is documentation; every phase of the project has to be properly documented.
Today, a large-scale software system is considered to be one that contains more than 50,000 lines of high-level language code, and it is large-scale systems that suffer the most from the software crisis. Large-scale software is developed by teams consisting of project managers, requirements analysts, software engineers, documentation experts and programmers. With so much professionalism and such organized ways of working, where is the problem? Why does the team produce fewer than 10 lines of code per day over the average lifetime of the project? Why are sixty errors found per thousand lines of code?
Why is one of every three large projects scrapped before ever being completed? And why is only one in eight finished software projects considered successful? "Best Practice" is an effective way to reduce most of the problems faced by companies and their IT projects. The greatest advantage to be gained from the practice of project management is the knowledge it yields for further improvement. "Knowledge Management is one way to provide others with experience that is known by the organization. It can help to dramatically mitigate risks by allowing the pitfalls of previous projects to be exposed and understood".
Other practices that need to be taken into consideration are continuous improvement, team practices, front-end planning, good communications, and risk registration and documentation. It is clear that without processes, any large-scale project will almost surely fail. It is striking how broad, deep and old the crisis is. Even though not all IT projects fail, many do, and it can be hard to see why, given all the supposedly great tools and techniques that should work if applied correctly. Perhaps that is exactly what goes wrong: using "Best Practice" techniques and tools in the wrong way leads toward sure failure.
The crisis has been studied by many researchers over the decades, and their work has given us the myriad models, tools and practices of today. Software engineering researchers are encouraged to come up with more unified models and practices that can become a standard in both large-scale and small-scale industry IT projects.

Conclusion

Software products can sometimes be very intangible. As Mike Wooldridge says, "It's hard to claim a bridge is 90% complete if there is not 90% of the bridge there", but it is easy to claim that a software project is 90% complete even though there is no visible outcome of the product.
As shown in this paper, the software crisis is a very broad topic that spans many areas of the IT industry. The problems are many; there seems to be no single pattern, process or form of testing that will solve the issues of quality and time to market. Although a lot of software works and has become a large part of our lives, it can be asked whether we are really living in a software crisis. Can the software crisis and the software era co-exist? It is the crisis, or perhaps the awareness of the crisis and the resulting use of "Best Practice", that drives the IT industry toward success.
Without the problems there would be no research or new technologies; necessity is the mother of all invention. But there is always a risk that the tools and techniques of "Best Practice" invented today will need time to mature and to be introduced into industry, and by then it may be too late. To answer the earlier question: software teams may produce fewer than 10 lines of code per day over the average software lifetime because programmers are less motivated when using tools that leave less room for creativity.
One reason why sixty errors are found in every thousand lines of code is that the testing tools used to test the programs are inadequate or not efficient enough. Every third large project is scrapped before ever being completed due to a lack of "Best Practice", as well as the many problems with software engineers and developers who are not properly trained or experienced enough to see their own limitations.

References

Wikipedia: History of software engineering, http://en.wikipedia.org/wiki/History_of_Software_engineering; accessed 17 July 2007
W. W. Gibbs, Software's Chronic Crisis, http://www.cis.gsu.edu/~mmoore/CIS3300/handouts/SciAmSept1994.html; accessed 17 July 2007
Yongxue Cai, Sunny Ghali, Michael Giannelia, Aaron Hughes, Adam Johnson and Tony Khoo, http://www.pmforum.org/library/papers/2004/itpractices.pdf; accessed 18 July 2007
S. L. Pfleeger, Software Engineering: Theory and Practice. Prentice Hall, 2001.
Lecture 5: Software Project, http://www.csc.liv.ac.uk/mjw/teaching/softeng/lect05.pdf; accessed 18 July 2007