Database Development Essay

Custom Student Mr. Teacher ENG 1001-04 23 April 2016

Database Development


This paper defines the phases of the Software Development Life Cycle (SDLC), specifically the Waterfall method, with a review of tasks that improve the quality of datasets throughout the cycle. It includes recommended actions for optimizing record selection and enhancing performance based on data quality assessment. Although full optimization may be reached during the SDLC, continued maintenance must be pursued to keep the database error-free and protected. Three maintenance plans, and activities that ensure each is implemented, are evaluated, along with an efficient method for planning concurrency control and choosing among the available lock granularities so as to minimize potential security risks. Finally, the serializability isolation model is introduced, which ensures transactions produce less record-level locking while the system operates, and a verification method is described that allows review of proper inputs and error checks to increase consistency.


There are several Software Development Life Cycle (SDLC) methods available; the Waterfall SDLC is the most desirable here because of its simplicity and straightforward, step-by-step structure, and it is the model discussed throughout this paper. The benefits of this model include departmentalization and managerial control: a schedule can be set for each phase, much as a factory line proceeds from one step to the next until the product is complete. However, once the testing phase begins, it is difficult to revert to earlier phases to make additional changes (SDLC Models, n.d.).

Tasks to Improve Dataset Quality Using SDLC Methodology

The Waterfall SDLC incorporates the following stages of planning and executing software: requirements specification, design, implementation, testing, and maintenance. The requirements phase ensures that requirements are clearly defined by all parties involved in the process. Deliverables in this stage include a requirements document, which contains descriptions of requirements, diagrams, and references to necessary documentation, as well as a Requirements Traceability Matrix (RTM), which displays how the products being developed will interact and correlate with components that have already been developed. When requirements are properly defined, this phase prepares the datasets for integrity success throughout the SDLC process (The Software Development Cycle (SDLC), n.d.).
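The RTM deliverable described above can be sketched as a small data structure that traces each requirement to design elements and test cases; the requirement IDs, design names, and helper function below are purely illustrative, not drawn from any cited source:

```python
# Illustrative Requirements Traceability Matrix (RTM): each requirement
# is linked to the design artifacts that realize it and the test cases
# that verify it, so gaps in coverage are easy to spot.
rtm = {
    "REQ-001": {"design": ["ERM: Customer"], "tests": ["TC-01"]},
    "REQ-002": {"design": ["ERM: Order"], "tests": []},
}

def untested(matrix):
    """Return requirements that have no linked test case yet."""
    return [req for req, links in matrix.items() if not links["tests"]]
```

Running `untested(rtm)` on the sample matrix flags `REQ-002`, showing how the RTM correlates new work with already-developed components.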

The design phase lists software features in detail with pseudocode, entity-relationship models (ERMs), hierarchy diagrams, layout hierarchies, tables of business rules, a full data dictionary, and business process diagrams. This phase transforms the requirements into system design specifications, so it is important here to review software and hardware specifications and the system architecture; this creates the foundation for the implementation phase. Lastly, the implementation phase begins the coding process, in which portions of programs are developed and tested. Clearly defined requirements are expressed via use-case scenarios, which enable context-based definitions and a visualization of the completed product, clarifying the accuracy and completeness of each requirement request (SDLC Models, n.d.).

Actions to Optimize Record Selections and Improve Database Performance

Actions to optimize record selection and improve database performance include automated controls applied in the design phase of the SDLC. In this phase it is important for developers to set proper automated controls, such as input, processing, and output controls, to enhance the integrity, security, and reliability of the system and its datasets. Input controls such as completeness checks and duplication checks ensure that blank fields and duplicate information are not entered into the datasets, while automated process controls ensure systems correctly process and record information (FFIEC IT Examination Handbook InfoBase – Design Phase, n.d.). Quality management techniques that improve quality assessments include error detection, process control, and process design; these processes detect missing values, reduce recurring errors, and help optimize efficiency (Even & Shankaranarayanan, 2009).
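The completeness and duplication input controls mentioned above can be sketched in a few lines; the field names and the `validate_record` helper are hypothetical illustrations, not part of the FFIEC guidance:

```python
# Two automated input controls: a completeness check (no blank required
# fields) and a duplication check (no record keyed twice).
REQUIRED_FIELDS = ("id", "name", "email")

def completeness_check(record):
    """Reject records with missing or blank required fields."""
    return all(str(record.get(f, "")).strip() for f in REQUIRED_FIELDS)

def duplication_check(record, existing_ids):
    """Reject records whose primary key is already present."""
    return record["id"] not in existing_ids

def validate_record(record, existing_ids):
    """Run both input controls before the record reaches the dataset."""
    if not completeness_check(record):
        return False, "completeness check failed"
    if not duplication_check(record, existing_ids):
        return False, "duplication check failed"
    return True, "ok"
```

Placing such checks at the input boundary, as the design phase prescribes, keeps blank and duplicate rows out of the dataset before processing begins.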

Three Maintenance Plans and Three Activities to Improve Data Quality

Three types of maintenance plans that improve data quality are preventative, corrective, and adaptive maintenance; activities that support them include database backups, integrity checks, and index optimization. Preventative maintenance involves creating and continuously maintaining daily and/or weekly backups for data loss prevention. Corrective maintenance ensures system errors are corrected; one corrective activity is resolving deadlocks, which occur when two or more tasks permanently block each other. Adaptive maintenance enhances system and database performance based on utility assessments and optimized queries (Coronel, Morris, & Rob, 2013).
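The backup, integrity-check, and statistics-refresh activities can be sketched against SQLite, used here only as a stand-in for whatever DBMS is in play; a production system would rely on its vendor's own backup and consistency-check utilities:

```python
import sqlite3

def run_maintenance(db_path, backup_path):
    """One pass of routine maintenance, sketched with SQLite."""
    src = sqlite3.connect(db_path)
    try:
        # Preventative: snapshot the database to a backup copy.
        dest = sqlite3.connect(backup_path)
        src.backup(dest)
        dest.close()
        # Corrective: verify that on-disk structures are consistent.
        (status,) = src.execute("PRAGMA integrity_check").fetchone()
        # Adaptive: refresh planner statistics so queries stay optimized.
        src.execute("ANALYZE")
        return status  # "ok" when no corruption is found
    finally:
        src.close()
```

Scheduling such a routine daily or weekly implements the preventative plan, while the integrity check surfaces the errors that corrective maintenance must resolve.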

Methods for Planning Proactive Concurrency Control and Lock Granularity

Concurrency issues revolve around conflicts that occur when simultaneous tasks are performed on multiple systems; such conflicts may cause inconsistencies. The goal of concurrency control is to maintain consistent throughput and accurate results across concurrent operations. Granular locking schemes enable locking pages, tables, rows, and cells. After reviewing "Process-centered Review of Object Oriented Software Development Methodologies," the methodologies mentioned fall outside the scope of concurrency and lock granularity. However, two approaches, high granularity and low granularity, will enable a distributed database to remain consistent. High granularity offers maximum concurrency but requires more overhead, whereas low granularity offers minimum overhead but reduces concurrency. Accepting additional overhead by locking granularly at different levels of an object-oriented hierarchy helps create proactive concurrency control within the system, and provides additional security through the ability to control which users are modifying the database at the same time (Ellis, n.d.).
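The high- versus low-granularity trade-off can be pictured as a two-level lock manager; the class below is an illustrative sketch using standard threading locks, not an implementation from Ellis's paper:

```python
import threading
from collections import defaultdict

class GranularLockManager:
    """Two lock levels: a coarse per-table lock (low granularity, little
    overhead, little concurrency) and fine per-row locks (high
    granularity, more lock objects to manage, more concurrency)."""

    def __init__(self):
        self.table_locks = defaultdict(threading.Lock)
        self.row_locks = defaultdict(threading.Lock)

    def lock_table(self, table):
        # One lock serializes every writer touching the table.
        return self.table_locks[table]

    def lock_row(self, table, row_id):
        # Distinct rows get distinct locks, so writers to different
        # rows of the same table can proceed concurrently.
        return self.row_locks[(table, row_id)]
```

Under row locks, transactions updating different accounts never block each other; under the single table lock they would serialize, which is exactly the overhead-versus-concurrency trade the text describes.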

System Analysis to Ensure Transactions Do Not Record-Level Lock the Database in Operation

In a multiuser database, transactions executing simultaneously must produce consistent results, so it is vital to have control over concurrency and consistency. A transaction isolation model named serializability provides processes for this control: it gives the illusion that transactions execute one at a time. The multiversion consistency model gives multiple users separate, concurrent views of the data, which prevents record-level locking from affecting the database (Data Concurrency and Consistency, n.d.). Once updates are committed to the system, the Verify method (with VerifyOption) can be utilized to ensure the integrity of the data entered, enhancing system effectiveness (SqlCeEngine.Verify Method (VerifyOption) (System.Data.SqlServerCe), n.d.).
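The multiversion idea can be sketched as a store that keeps committed snapshots, so a reader pins the version current when it began and is never blocked by a writer; this is a simplified illustration of the concept, not Oracle's actual implementation:

```python
class MultiVersionStore:
    """Minimal multiversion read consistency: every commit appends a new
    snapshot, and readers keep seeing the snapshot that was current when
    they started, so no record-level read locks are taken."""

    def __init__(self):
        self.versions = [{}]  # committed snapshots, oldest first

    def begin_read(self):
        """Pin the latest committed snapshot for this reader."""
        return len(self.versions) - 1

    def read(self, snapshot_id, key):
        return self.versions[snapshot_id].get(key)

    def commit(self, updates):
        """Copy-on-write: build the next snapshot from the last one."""
        snap = dict(self.versions[-1])
        snap.update(updates)
        self.versions.append(snap)
```

A reader that began before a concurrent commit still sees its original snapshot, which is the behavior that lets transactions appear to execute one at a time without record-level locking.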


In conclusion, the material discussed includes an analysis of specific tasks that will improve the quality of datasets within a database, and a review of the Software Development Life Cycle (SDLC), specifically the Waterfall methodology. Recommended actions in the design phase that enhance the optimization of record selection are considered, along with three maintenance plan options and activities that improve the quality of data within the database. The serializability isolation model ensures transactions produce less record-level locking while the system operates, and verification methods allow review of proper inputs and error checks to increase consistency. Overall, research shows that the utility of a multiuser distributed database depends on the specific functions created from the origination of the product in the SDLC through the finished product, and on continued maintenance for consistent and efficient performance.

Data Concurrency and Consistency. (n.d.). Oracle Documentation. Retrieved September 12, 2013.

Ellis, R. (n.d.). Lock Granularity. Granularity of Locks and Degrees of Consistency in a Shared Database. Retrieved September 12, 2013.

Even, A., & Shankaranarayanan, G. (2009). Quality in Customer Databases. Retrieved September 12, 2013, from the ACM Computer database.

FFIEC IT Examination Handbook InfoBase – Design Phase. (n.d.). FFIEC IT Examination Handbook InfoBase – Welcome. Retrieved September 12, 2013.

Process-centered Review of Object Oriented Software Development Methodologies. ACM Computer Database, 15, 3, 4, 5. Retrieved September 12, 2013, from the ACM Computer database.

Rob, P., & Coronel, C. (2002). Database systems: design, implementation, and management (5th ed.). Boston, MA: Course Technology.

SDLC Models. (n.d.). One Stop QA. Retrieved September 12, 2013.

SqlCeEngine.Verify Method (VerifyOption) (System.Data.SqlServerCe). (n.d.). MSDN, the Microsoft Developer Network. Retrieved September 12, 2013.

The Software Development Cycle (SDLC). (n.d.). Pelican Engineering. Retrieved September 13, 2013.
