Conventional Software Management
Project Management - 1
Table of contents
- Waterfall Model
- Barry Boehm’s "Industrial Software Metrics Top 10 List"
- Software Economics
- Cost estimation
- Improving software economics
- Improving software economics
- Improving automation through Software Environment
- Principles of Conventional Software Management
- Principles of Modern Software Management
- Transition to an Iterative Process
A project is a series of tasks that must be carried out in order to reach a desired outcome. Typical characteristics of a project include the following:
- It involves non-routine tasks and requires planning.
- Specific objectives are to be met, or a specific product is to be created.
- It has a predetermined time span.
- Work is carried out for someone other than yourself.
- People are formed into a temporary work group to carry out the task.
- Work is carried out in several phases.
- The project is large or complex.
Software Project Management
Software project management covers the complete software development procedure, from requirements gathering through testing and maintenance, carried out according to defined execution methodologies within a specified period of time to achieve the desired software product.
Management involves planning, organizing, staffing, directing, monitoring, controlling, innovating, and representing. It is important because money is at risk and projects are not always successful: projects run late and exceed their budgets.
There are two approaches to software management: the Old Way (Conventional Software Management) and the New Way (Evolution of Software Economics).
The old way is good in theory but falls short in practice; the waterfall model is the classic example. Under it, software development is highly unpredictable, management discipline is more of a discriminator between success and failure than technology advances are, and the level of software scrap and rework is indicative of an immature process.
The Waterfall Model is a traditional, linear approach to the software development process. It involves breaking down the development process into distinct stages, with each stage having defined objectives, inputs, and outputs. The key feature of the waterfall model is that each stage must be completed before the next phase can begin.
The Waterfall Model is the earliest SDLC approach used for software development. It illustrates the software development process as a linear sequential flow: any phase in the development process begins only when the previous phase is complete, and the phases do not overlap.
There are two essential steps common to the development of computer programs: analysis and coding. In order to manage and control all of the intellectual freedom associated with software development, one must introduce several other “overhead” steps, including system requirements definition, software requirements definition, program design, and testing. These steps supplement the analysis and coding steps.
The development process follows a sequence of stages: requirements gathering and analysis; system design; implementation; testing; operation/deployment; and maintenance.
Drawbacks of the Waterfall Model
Protracted integration and late design breakage.
Late risk resolution.
Requirements-driven functional decomposition.
Adversarial (conflict or opposition) stakeholder relationships.
Focus on documents and review meetings.
Improvements to the Waterfall Model
Five necessary improvements can be made to the Waterfall Model to make it more effective:
Complete program design before analysis and coding begin.
Maintain current and complete documentation.
Do the job twice if possible.
Plan, control, and monitor testing.
Involve the customer.
Barry Boehm’s "Industrial Software Metrics Top 10 List"
Barry Boehm’s “Industrial Software Metrics Top 10 List” is a good, objective characterization of the state of software development. It provides insight into the challenges and opportunities that software developers face.
The Top 10 List
Finding and fixing a software problem after delivery costs 100 times more than finding and fixing the problem in the early design phases.
You can compress software development schedules to 25% of nominal, but no more.
For every $1 you spend on development, you will spend $2 on maintenance.
Software development and maintenance costs are primarily a function of the number of source lines of code.
Variations among people account for the biggest differences in software productivity.
The overall ratio of software to hardware costs is still growing. (In 1955 it was 15:85; in 1985, 85:15)
Only about 15% of software development effort is devoted to programming.
Software systems and products typically cost 3 times as much per SLOC as individual software programs. Software-system products (i.e., systems of systems) cost 9 times as much.
Walkthroughs catch 60% of the errors.
80% of the contribution comes from 20% of the contributors.
Software Economics is a mature research area in software engineering that deals with the most difficult and challenging problems and issues of valuing software and determining or estimating costs usually involved in its production. In this article, we will discuss the evolution of software economics and how it can be applied to improve software design, development, and evolution.
Evolution of Software Economics
Most software cost models can be abstracted into a function of five basic parameters: size, process, personnel, environment, and required quality.
Size: The size of the end product (in human-generated components), which is typically quantified in terms of the number of source instructions or the number of function points required to develop the required functionality.
Process: The process used to produce the end product, in particular the ability of the process to avoid non-value-adding activities (rework, bureaucratic delays, communications overhead).
Personnel: The capabilities of software engineering personnel, and particularly their experience with the computer science issues and the applications domain issues of the project.
Environment: The environment, which is made up of the tools and techniques available to support efficient software development and to automate the process.
Required Quality: The required quality of the product, including its features, performance, reliability, and adaptability.
Relationship Between Parameters
Effort = (Personnel) (Environment) (Quality) (Size ^ process)
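The relationship above can be sketched in code. The calibration below is hypothetical, chosen only to show the shape of the model, not to be a validated estimator: factors of 1.0 are nominal, and a process exponent greater than 1.0 models a diseconomy of scale, so process improvement pays off most on large projects.

```python
def estimated_effort(size_ksloc, process_exponent,
                     personnel_factor, environment_factor, quality_factor):
    """Royce-style abstraction: effort = P * E * Q * size^process.

    All factors are hypothetical multipliers (1.0 = nominal); the
    exponent > 1.0 captures the diseconomy of scale of an immature process.
    """
    return (personnel_factor * environment_factor * quality_factor
            * size_ksloc ** process_exponent)

# A more mature process (lower exponent) matters little at 10 KSLOC
# but cuts the 100-KSLOC estimate roughly in half.
small_immature = estimated_effort(10, 1.2, 1.0, 1.0, 1.0)
large_immature = estimated_effort(100, 1.2, 1.0, 1.0, 1.0)
large_mature = estimated_effort(100, 1.05, 1.0, 1.0, 1.0)
print(f"{small_immature:.0f} {large_immature:.0f} {large_mature:.0f}")
```

Note how the size parameter sits in the exponent position: halving what must be custom-built reduces effort more than proportionally, which is why the later sections emphasize reducing product size first.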
Generations of Software Development
Conventional: This generation of software development occurred in the 1960s and 1970s and was characterized by craftsmanship. Organizations used custom tools, custom processes, and virtually all custom components built in primitive languages. Project performance was predictable only in that cost, schedule, and quality objectives were almost always underachieved.
Transition: This generation of software development occurred in the 1980s and 1990s and was characterized by software engineering. Organizations used more-repeatable processes and off-the-shelf tools, and mostly (>70%) custom components built in higher-level languages. Some of the components (<30%) were available as commercial products, including the operating system, database management system, networking, and graphical user interface.
Modern Practices: This generation of software development occurred in 2000 and later and was characterized by software production. This philosophy is rooted in the use of managed and measured processes, integrated automation environments, and mostly (70%) off-the-shelf components. Perhaps as few as 30% of the components need to be custom-built.
Cost estimation is an important process in project management that involves forecasting the cost and other resources needed to complete a project within a defined scope. In this article, we will discuss the important points of cost estimation, cost estimation models, attributes of good cost estimation, and the need for good cost estimation.
Important Points of Cost Estimation
Three questions dominate cost estimation:
Which cost estimation model should be used?
Should software size be measured in source lines of code or in function points?
What constitutes a good estimate?
Cost Estimation Models
There are several cost estimation models available for software development projects. Some of the most popular models include COCOMO, CHECKPOINT, ESTIMACS, KnowledgePlan, Price-S, ProQMS, SEER, SLIM, SOFTCOST, SPQR/20 etc.
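As an illustration of the simplest member of the COCOMO family, Basic COCOMO estimates effort and schedule from size alone using Boehm's published mode constants; this is only a sketch of the basic model, and real estimates would use effort multipliers and calibration data as well.

```python
# Basic COCOMO (Boehm, 1981): effort and schedule from size in KLOC.
COCOMO_MODES = {
    # mode: (a, b, c, d) where effort = a * KLOC**b person-months
    # and schedule = c * effort**d elapsed months
    "organic":      (2.4, 1.05, 2.5, 0.38),
    "semidetached": (3.0, 1.12, 2.5, 0.35),
    "embedded":     (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(ksloc, mode="organic"):
    """Return (effort in person-months, schedule in months)."""
    a, b, c, d = COCOMO_MODES[mode]
    effort = a * ksloc ** b
    schedule = c * effort ** d
    return effort, schedule

effort, months = basic_cocomo(32, "organic")
print(f"{effort:.1f} person-months over {months:.1f} months")
```

Dividing effort by schedule also yields an average staffing level, which is one reason such parametric models are useful for sanity-checking a plan.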
Predominant Cost Estimation Process
Attributes Of Good Cost Estimation
It is conceived and supported by the project manager, architecture team, development team, and test team accountable for performing the work.
It is accepted by all stakeholders as ambitious but realizable.
It is based on a well-defined software cost model with a credible basis.
It is based on a database of relevant project experience that includes similar processes, similar technologies, similar environments, similar quality requirements, and similar people.
It is defined in enough detail so that its key risk areas are understood and the probability of success is objectively assessed.
Need For Good Cost Estimation
Enables you to weigh benefits against costs to see whether the project makes sense.
Allows you to see whether the necessary funds are available to support the project.
Serves as a guideline to help ensure that you have sufficient funds to complete the project.
Benefits Of Good Cost Estimation
A valuable tool for decision-making.
Provides a starting point from which to begin evaluation of a project.
Allows comparison between investments required for the project.
Makes it easy to exclude bad projects from consideration.
Cost Estimation Techniques
Algorithmic or Parametric Models: This technique involves using mathematical algorithms to estimate the cost and other resources needed for a project. These models are based on historical data and can be used to estimate the cost of similar projects.
Expert Judgment: This technique involves using the knowledge and experience of experts to estimate the cost and other resources needed for a project. Experts can provide valuable insights into the project’s requirements and can help identify potential risks.
Top-Down Approach: This technique involves estimating the cost and other resources needed for a project at a high level and then breaking it down into smaller components. This approach is useful when there is limited information available about the project.
Bottom-Up Approach: This technique involves estimating the cost and other resources needed for each component of a project and then aggregating them to determine the total cost. This approach is useful when there is detailed information available about the project.
Estimation by Analogy: This technique involves using historical data from similar projects to estimate the cost and other resources needed for a new project.
Pricing to Win Estimation: This technique involves estimating the cost and other resources needed for a project based on what it would take to win the contract.
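The bottom-up approach described above can be sketched in a few lines. The component names and figures here are hypothetical; the point is the mechanics: estimate each component separately, pad each by its own risk buffer, then aggregate.

```python
# Bottom-up estimation sketch. All names and numbers are hypothetical.
components = {
    "ui":       {"effort_pm": 4.0, "risk_buffer": 0.10},
    "services": {"effort_pm": 7.5, "risk_buffer": 0.20},
    "database": {"effort_pm": 3.0, "risk_buffer": 0.15},
}

def bottom_up_total(components):
    """Sum per-component estimates, inflating each by its risk buffer."""
    return sum(c["effort_pm"] * (1 + c["risk_buffer"])
               for c in components.values())

total = bottom_up_total(components)
print(f"total estimate: {total:.2f} person-months")
```

A top-down estimate would work in the opposite direction, allocating a single high-level figure across the same components; comparing the two is a common cross-check.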
Improving software economics
Improving software economics is an important part of software development that involves reducing the cost and other resources needed to complete a project within a defined scope. In this article, we will discuss the five basic parameters of the software cost model that can help improve software economics.
Five Basic Parameters of the Software Cost Model
Reducing the size or complexity of what needs to be developed: This parameter involves simplifying the project’s requirements and reducing its complexity. By doing so, developers can reduce the amount of time and resources needed to complete the project.
Improving the development process: This parameter involves improving the development process by using best practices, such as agile development, continuous integration, and automated testing. By doing so, developers can reduce the amount of time and resources needed to complete the project.
Using more-skilled personnel and better teams: This parameter involves using more-skilled personnel and better teams to complete the project. By doing so, developers can reduce the amount of time and resources needed to complete the project.
Using better environments (tools to automate the process): This parameter involves using better environments and tools to automate the development process. By doing so, developers can reduce the amount of time and resources needed to complete the project.
Trading off or backing off on quality thresholds: This parameter involves trading off or backing off on quality thresholds to reduce the amount of time and resources needed to complete the project.
Important trends in improving software economics
| Cost model parameter | Trends |
| --- | --- |
| Size: abstraction and component-based development technologies | Higher-order languages (C++, Ada 95, Java, Visual Basic, etc.); object-oriented analysis, design, and programming |
| Process: methods and techniques | Process maturity models |
| Personnel: people factors | Training and personnel skill development |
| Environment: automation technologies and tools | Integrated tools (visual modelling, compiler, editor, debugger, change management, etc.); hardware platform performance; automation of coding, documents, testing, and analyses |
| Quality: performance, reliability, accuracy | Hardware platform performance; statistical quality control |
Reducing software product size
Reducing software product size is an important part of software development that involves producing a product that achieves the design goals with the minimum amount of human-generated source material. In this article, we will discuss four approaches to reducing software product size: higher-order languages, object-oriented methods, reuse, and commercial components.
Reducing Software Product Size: Language
Universal function points (UFPs) are useful estimators for language-independent, early life-cycle estimates; SLOC metrics become useful only after a candidate solution is formulated and an implementation language is known. The number of SLOC required per UFP varies widely across languages, so choosing a higher-order language directly shrinks the amount of human-generated source material.
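As a rough illustration, an early UFP count can be converted into a SLOC estimate once a language is chosen. The expansion ratios below are commonly cited approximate figures (in the style of Capers Jones's language tables) and should be treated as planning assumptions, not precise constants:

```python
# Approximate SLOC-per-UFP expansion ratios. These are commonly cited
# rough figures; treat them as planning assumptions, not constants.
SLOC_PER_UFP = {
    "assembly":     320,
    "c":            128,
    "c++":           56,
    "java":          53,
    "visual basic":  35,
}

def size_estimate(function_points, language):
    """Convert an early function-point count into a SLOC estimate."""
    return function_points * SLOC_PER_UFP[language.lower()]

# The same 100 UFPs of functionality imply far fewer lines in a
# higher-order language than in C or assembly.
print(size_estimate(100, "c"))
print(size_estimate(100, "java"))
```

This is the whole argument of the section in miniature: the functionality is constant, but the human-generated size, and hence the cost, depends heavily on the implementation technology.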
Reducing Software Product Size: Object-Oriented Method
An object-oriented model of the problem and its solution encourages a common vocabulary between the end-users of a system and its developers, thus creating a shared understanding of the problem being solved. The use of continuous integration creates opportunities to recognize risk early and make incremental corrections without destabilizing the entire development effort. An object-oriented architecture provides a clear separation of concerns among disparate elements of a system, creating firewalls that prevent a change in one part of the system from rending the fabric of the entire architecture.
Booch also summarized five characteristics of a successful object-oriented project:
A ruthless focus on the development of a system that provides a well-understood collection of essential minimal characteristics.
The existence of a culture that is centered on results, encourages communication, and yet is not afraid to fail.
The effective use of object-oriented modelling.
The existence of a strong architectural vision.
The application of a well-managed iterative and incremental development life cycle.
Reducing Software Product Size: Reuse
Reuse is another method for reducing software product size. Common architecture, development environment, operating system, DBMS, networking products, and office applications are some examples of reusable components.
Reducing Software Product Size: Commercial Components
A common approach being pursued today in many domains is to maximize the integration of commercial components and off-the-shelf products. The use of commercial components is certainly desirable as a means of reducing custom development. However, there are advantages and disadvantages to using commercial components.
| Approach | Advantages | Disadvantages |
| --- | --- | --- |
| Commercial components | Predictable license costs; broadly used, mature technology; dedicated support organization; rich in functionality | Up-front license fees; recurring maintenance fees; dependency on vendor; run-time efficiency sacrifices; integration not always trivial; no control over upgrades and maintenance; unnecessary features that consume extra resources; often inadequate reliability and stability |
| Custom development | Complete change freedom; smaller, often simpler implementations; often better performance; control of development and enhancement | Expensive, unpredictable development; unpredictable availability date; undefined maintenance model; often immature and fragile; drain on expert resources |
Improving software economics
Improving software economics also means reducing the cost and other resources needed to complete a project within a defined scope. In this article, we will discuss two further methods for improving software economics: improving the software process and improving team effectiveness.
Improving Software Process
There are three distinct process perspectives:
Metaprocess: an organization’s policies, procedures, and practices for pursuing a software-intensive line of business. The focus of this process is on organizational economics, long-term strategies, and software ROI.
Macroprocess: a project’s policies, procedures, and practices for producing a complete software product within certain cost, schedule, and quality constraints. The focus of the macroprocess is on creating an adequate instance of the metaprocess for a specific set of constraints.
Microprocess: a project team’s policies, procedures, and practices for achieving an artefact of the software process. The focus of the microprocess is on achieving an intermediate product baseline with adequate quality and adequate functionality as economically and rapidly as practical.
Although these three levels of process overlap somewhat, they have different objectives, audiences, metrics, concerns, and time scales.
Improving Team Effectiveness
Teamwork is much more important than the sum of the individuals. Some maxims of team management include the following:
A well-managed project can succeed with a nominal engineering team.
A mismanaged project will rarely succeed, even with an expert team of engineers.
A well-architected system can be built by a nominal team of software builders.
A poorly architected system will flounder even with an expert team of builders.
Boehm Five Staffing Principles
Boehm's five staffing principles are:
The principle of top talent: Use better and fewer people.
The principle of job matching: Fit the tasks to the skills and motivation of the people available.
The principle of career progression: An organization does best in the long run by helping its people to self-actualize.
The principle of team balance: Select people who will complement and harmonize with one another.
The principle of phase-out: Keeping a misfit on the team doesn’t benefit anyone.
Software project managers need many leadership qualities in order to enhance team effectiveness. The following are some crucial attributes of successful software project managers that deserve much more attention:
Hiring skills: Placing the right person in the right job seems obvious but is surprisingly hard to achieve.
Customer-interface skill: Avoiding adversarial relationships among stakeholders is a prerequisite for success.
Decision-making skill: The jillion books written about management have failed to provide a clear definition of this attribute. We all know a good leader when we run into one, and decision-making skill seems obvious despite its intangible definition.
Team-building skill: Teamwork requires that a manager establish trust, motivate progress, exploit eccentric prima donnas, transition average people into top performers, eliminate misfits, and consolidate diverse opinions into a team direction.
Selling skill: Successful project managers must sell all stakeholders (including themselves) on decisions and priorities, sell candidates on job positions, sell changes to the status quo in the face of resistance, and sell achievements against objectives. In practice, selling requires continuous negotiation, compromise, and empathy.
Improving automation through Software Environment
Improving automation through the software environment is an important part of software development: a good environment automates the development process and pays back in quality, in the ability to estimate costs and schedules, and in overall productivity using a smaller team. In this article, we will discuss forward and reverse engineering and the life-cycle activities, from requirements analysis through project management and progress assessment, that automation can support.
Forward engineering is the automation of one engineering artefact from another, more abstract representation. For example, compilers and linkers have provided automated transition of source code into executable code.
Reverse engineering is the generation or modification of a more abstract representation of an existing artefact. For example, creating a visual design model from a source code representation.
Requirements analysis and evolution activities consume 40% of life-cycle costs.
Software design activities have an impact on more than 50% of the resources.
Coding and unit testing activities consume about 50% of software development effort and schedule.
Test activities can consume as much as 50% of a project’s resources.
Configuration control and change management are critical activities that can consume as much as 25% of resources on a large-scale project.
Documentation activities can consume more than 30% of project engineering resources.
Project management, business administration, and progress assessment can consume as much as 30% of project budgets.
Achieving required quality
Achieving required quality is an important part of software development that involves an environment supporting early and continuous configuration control, change management, rigorous design methods, document automation, and regression test automation. In this article, we will discuss key practices that improve overall software quality; metrics and indicators for measuring the progress and quality of an architecture as it evolves from a high-level prototype into a fully compliant product; visual modelling and higher-level languages that support architectural control, abstraction, reliable programming, reuse, and self-documentation; and early and continuous insight into performance issues through demonstration-based evaluations.
Key practices that improve overall software quality include the following:
Focusing on driving requirements and critical use cases early in the life cycle
Focusing on requirements completeness and traceability late in the life cycle
Focusing throughout the life cycle on a balance between requirements evolution, design evolution, and plan evolution
Metrics and Indicators
Using metrics and indicators to measure the progress and quality of an architecture as it evolves from a high-level prototype into a fully compliant product is an important part of achieving the required quality. This can help developers better understand how their architecture is evolving over time and identify areas where improvements can be made.
Visual Modeling and Higher-Level Languages
Using visual modelling and higher-level languages that support architectural control, abstraction, reliable programming, reuse, and self-documentation is another important part of achieving the required quality. This can help developers better understand their architecture and identify areas where improvements can be made.
Early and Continuous Insight into Performance Issues
Early and continuous insight into performance issues through demonstration-based evaluations is also important for achieving the required quality. This can help developers identify performance issues early on in the development process and make changes before they become more difficult to address.
The Typical Chronology of Events in Performance Assessment
The typical chronology of events in performance assessment was as follows:
Project inception: The proposed design was asserted to be low risk with adequate performance margin.
Initial design review: Optimistic assessments of adequate design margin were based mostly on paper analysis or rough simulation of the critical threads. In most cases, the actual application algorithms and database sizes were fairly well understood.
Mid-life-cycle design review: The assessments started whittling away at the margin, as early benchmarks and initial tests began exposing the optimism inherent in earlier estimates.
Integration and test: Serious performance problems were uncovered, necessitating fundamental changes in the architecture. The underlying infrastructure was usually the scapegoat, but the real culprit was the immature use of the infrastructure, immature architectural solutions, or poorly understood early design trade-offs.
General quality improvements with a modern process
| Conventional process | Modern iterative process |
| --- | --- |
| Unknown until late | Understood and resolved early |
| — | Still a quality driver, but trade-offs must be resolved early in the life cycle |
| Late in the life cycle, chaotic and malignant | Early in the life cycle, straightforward and benign |
| Mostly error-prone manual procedures | Mostly automated, error-free evolution of artifacts |
| — | Tunable to quality, performance, and technology |
| Paper-based analysis or separate simulation | Executing prototypes, early performance feedback, quantitative understanding |
| Software process rigor | Managed, measured, and tool-supported |
Principles of Conventional Software Management
As technology continues to evolve, the importance of software management in creating high-quality software has become increasingly apparent. Conventional software management provides a set of principles to guide the development process and ensure that the final product meets customer needs.
Make quality a priority and establish mechanisms to ensure it is achieved.
Prove that high-quality software is achievable by using techniques such as involving customers, prototyping, simplifying design, conducting inspections, and hiring skilled professionals.
Give products to customers early to obtain real feedback on their needs.
Explore alternatives and determine the problem before writing requirements.
Evaluate design alternatives after the requirements have been agreed upon.
Use an appropriate process model that makes the most sense for the project.
Use different languages for different phases of the project.
Minimize intellectual distance by aligning software structure with the real-world structure.
Prioritize techniques over tools to ensure disciplined engineering practices.
Get it right before making it faster; don't optimize prematurely.
Inspect code to find errors more effectively than testing alone.
Good management is essential to motivate people to do their best.
People are the key to success: hire skilled professionals with appropriate experience and training.
Follow with care: don't adopt something just because others are doing it.
Take responsibility for the quality of the software being developed.
Understand the customer's priorities to meet their needs effectively.
The more customers see, the more they will need; balance functionality and performance against customer expectations.
Plan to throw one away for entirely new applications, architectures, interfaces, or algorithms.
Design for change by using architectures, components, and specifications that accommodate change.
Design without documentation is not design; ensure proper documentation is created.
Use software tools to increase efficiency, but be realistic about their capabilities.
Avoid tricky code that can cause issues later on.
Encapsulate information for easier testing and maintenance.
Use coupling and cohesion to measure software maintainability and adaptability.
Use the McCabe complexity measure to report software complexity.
Developers should not be the primary testers of their own software.
Analyze the causes of errors to prevent them in the future.
Realize that software entropy increases over time, and continuous change will lead to complexity.
People and time are not interchangeable; measure projects by their outcomes, not solely by person-months.
Expect excellence, and employees will perform better when expectations are high.
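The McCabe complexity measure mentioned in the list above can be approximated by counting decision points: V(G) is one plus the number of branches in the control flow. A minimal sketch over Python source using the standard `ast` module (boolean operators are counted once per occurrence, so this is an approximation, not a full control-flow-graph analysis):

```python
import ast

def cyclomatic_complexity(source):
    """Approximate McCabe complexity: one plus the number of
    decision points (branching constructs) in the parsed source."""
    decision_nodes = (ast.If, ast.For, ast.While, ast.IfExp,
                      ast.ExceptHandler, ast.And, ast.Or)
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, decision_nodes)
                   for node in ast.walk(tree))

snippet = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    for _ in range(3):
        pass
    return "positive"
"""
print(cyclomatic_complexity(snippet))  # if + elif + for -> 3 decisions -> 4
```

Straight-line code scores 1; each additional branch adds an independent path that testing must cover, which is why the measure is used to flag code that is hard to test and maintain.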
Principles of Modern Software Management
Here are the top 10 principles of modern software management:
Base the process on an architecture-first approach: This means achieving a balance among driving requirements, architecturally significant design decisions, and lifecycle plans before committing resources for full-scale development.
Establish an iterative life-cycle process that confronts risk early: With sophisticated software systems, it's not possible to define the entire problem, design the entire solution, build the software, then test the end product in sequence. An iterative process that refines the problem understanding, an effective solution, and an effective plan over several iterations encourages a balanced treatment of all stakeholder objectives. Major risks must be addressed early to increase predictability and avoid expensive downstream scrap and rework.
Transition design methods to emphasize component-based development: Moving from a line-of-code mentality to a component-based mentality is necessary to reduce the amount of human-generated source code and custom development.
Establish a change management environment: The dynamics of iterative development necessitate objectively controlled baselines.
Enhance change freedom through tools that support round-trip engineering: Round-trip engineering is the environment support necessary to automate and synchronize engineering information across different formats (such as requirements specifications, design models, source code, executable code, and test cases).
Capture design artifacts in rigorous, model-based notation: A model-based approach (such as UML) supports the evolution of semantically rich graphical and textual design notations.
Instrument the process for objective quality control and progress assessment: Life-cycle assessment of the progress and the quality of all intermediate products must be integrated into the process.
Use a demonstration-based approach to assess intermediate artifacts: It's essential that the software management process drive toward early and continuous demonstrations within the operational context of the system, namely its use cases.
Plan intermediate releases in groups of usage scenarios with evolving levels of detail: Organizing each intermediate release around a group of usage scenarios, elaborated in increasing detail, keeps the management process geared toward early and continuous demonstrations.
Establish a configurable process that is economically scalable: No single process is suitable for all software developments. Therefore, it's important to establish a configurable process that is economically scalable and can be adapted to meet the needs of different projects.
Modern Process Approach Solving Conventional Problems
| Conventional process: top 10 risks | Effect | Modern process: inherent risk resolution features |
| --- | --- | --- |
| Late breakage and excessive scrap/rework | Quality, cost, schedule | Automated change management |
| Attrition of key personnel | Quality, cost, schedule | Successful, early iterations; trustworthy management and planning |
| Inadequate development resources | — | Environments as first-class artifacts of the process; industrial-strength, integrated environments; model-based engineering artifacts |
| Necessary technology insertion | — | Use case modelling; demonstration-based performance assessment; early architecture performance feedback |
| Overemphasis on artifacts | — | Objective quality control; early prototypes, incremental releases |
Transition to an Iterative Process
Modern software development processes have moved away from the conventional waterfall model towards iterative processes.
Application Precedentedness: Domain experience is crucial in understanding how to plan and execute a software development project. In unprecedented systems, one of the key goals is to confront risks and establish early precedents, even if they are incomplete or experimental. This is one of the primary reasons why the software industry has moved to an iterative life-cycle process. Early iterations establish precedents from which the product, the process, and the plans can be elaborated in evolving levels of detail.
Process Flexibility: Modern software development is characterized by a broad solution space and numerous interrelated concerns, necessitating continuous incorporation of changes. Project artifacts must be supported by efficient change management commensurate with project needs. A configurable process that allows a common framework to be adapted across a range of projects is necessary to achieve a software return on investment.
Architecture Risk Resolution: Architecture-first development is a crucial theme underlying a successful iterative development process. A project team develops and stabilizes architecture before developing all the components that make up the entire suite of application components. An architecture-first and component-based development approach forces the infrastructure, common mechanisms, and control mechanisms to be elaborated early in the life cycle and drives all component make/buy decisions into the architecture process.
Team Cohesion: Successful teams are cohesive, and cohesive teams are successful. Advances in technology (such as programming languages, UML, and visual modelling) have enabled more rigorous and understandable notations for communicating software engineering information, particularly in the requirements and design artifacts that previously were ad hoc and based completely on paper exchange. These model-based formats have also enabled the round-trip engineering support needed to establish change freedom sufficient for evolving design representations.
Software Process Maturity: The Software Engineering Institute's Capability Maturity Model (CMM) is a well-accepted benchmark for software process assessment. One of the key themes is that truly mature processes are enabled through an integrated environment that provides the appropriate level of automation to instrument the process for objective quality control.
Support Vishesh Raghuvanshi by becoming a sponsor. Any amount is appreciated!