The following learning objectives
are covered in this lesson:
- Apply a generic problem-solving model to an acquisition
situation.
- Apply one or more selected qualitative tools (e.g.,
fishbone diagram) to resolve a problem.
- Identify developer practices essential for the creation of
high-quality software.
- Identify the requirements for interoperability testing.
1. One
problem-solving technique is the cause-and-effect, or "fishbone,"
diagram. The fishbone diagram focuses on determining the root cause of a
problem, rather than on symptoms or solutions, by analyzing all of the
problem's possible causes. Typically, the diagram begins with a statement of
the problem in a box on the right side of the diagram--the "head" of the
fish. Categories of major causes are then identified and drawn to the
left--the "bones" of the fish. These major causes are broken down into all
the related causal factors that might contribute to them. Finally, the
causal factors are examined and narrowed down to the most significant
elements of the problem to determine the ultimate cause or causes.
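To make that structure concrete, here is a minimal Python sketch of a fishbone diagram as nested data. The problem statement, categories, and causal factors are hypothetical examples, not drawn from the lesson.

# Minimal sketch of a fishbone (cause-and-effect) structure.
# The problem, categories, and causal factors below are hypothetical
# examples chosen only to illustrate the head/bones layout.

fishbone = {
    "problem": "Software deliveries are consistently late",  # the "head"
    "causes": {                                              # the "bones"
        "People":   ["understaffed test team", "high turnover"],
        "Process":  ["no entry/exit criteria", "requirements churn"],
        "Tools":    ["unstable build environment"],
        "Schedule": ["optimistic initial estimates"],
    },
}

# Walk the diagram from the head through the bones to the causal factors.
print(f"Problem: {fishbone['problem']}")
for category, factors in fishbone["causes"].items():
    print(f"  {category}:")
    for factor in factors:
        print(f"    - {factor}")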
2. The Software Program Managers
Network has identified several software best practices based on interviews with
software experts and industry leaders. Here is a synthesized list of some
of those practices, which are essential for the creation of high-quality
software:
Adopt Continuous Program Risk Management
Risk management is a continuous
process beginning with the definition of the concept and ending with system
retirement. Risks need to be identified and managed across the life of the
program.
Estimate Cost and Schedule
Empirically
Initial software estimates and
schedules should be treated as high risk because little definitive
information is available at the time they are defined. Estimates should be
updated as actual, measured program data becomes available.
Use Metrics to Manage
All programs should have a
continuous metrics program in place to monitor issues and determine the
likelihood of risks occurring. Metrics information should be used as one of the primary
inputs for program decisions.
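As a simple illustration of metrics as a primary decision input, the Python sketch below flags any metric that crosses a risk threshold. The metric names and threshold values are assumptions made for the example, not prescribed by the lesson.

# Hypothetical continuous-metrics check: each metric has an agreed
# threshold, and crossing it flags a risk for program-level attention.

thresholds = {"defect_arrival_rate": 12.0, "schedule_variance_pct": 10.0}
observed   = {"defect_arrival_rate": 15.5, "schedule_variance_pct": 4.2}

for name, limit in thresholds.items():
    value = observed[name]
    status = "RISK" if value > limit else "ok"
    print(f"{name}: {value} (limit {limit}) -> {status}")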
Track Earned Value
Earned value requires each task to
have both entry and exit criteria and a step to validate that these criteria
have been met prior to the award of the credit. Earned value credit is binary
with zero percent being given before task completion and 100 percent when
completion is validated.
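Because the credit rule is binary, it reduces to a one-line check. The sketch below assumes hypothetical task records with completion and validation flags.

# Binary earned-value credit: 0 percent until a task's exit criteria
# have been validated, 100 percent afterward. No partial credit.

def earned_value_credit(task: dict) -> int:
    """Return percent credit for a task under the binary rule."""
    return 100 if task["complete"] and task["exit_criteria_validated"] else 0

tasks = [
    {"name": "design review", "complete": True,  "exit_criteria_validated": True},
    {"name": "unit test",     "complete": True,  "exit_criteria_validated": False},
]
for task in tasks:
    print(task["name"], earned_value_credit(task))  # 100, then 0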
Track Defects against Quality
Targets
All programs need
pre-negotiated quality targets; meeting them is an absolute requirement
prior to acceptance by the customer. Programs should implement practices to
find defects early in the process, as close in time to the creation of the
defect as possible, and should manage this defect rate against the quality
targets. Meeting quality targets should be a subject at every major program
review.
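As a small worked example of managing a defect rate against a target, the sketch below compares a measured defect density to a pre-negotiated limit. All of the numbers are invented for illustration.

# Compare a measured defect rate to a pre-negotiated quality target
# (values are assumptions for the example, not real program data).

target_per_ksloc = 0.5              # pre-negotiated target: defects per KSLOC
defects_found = 42
size_ksloc = 120.0

rate = defects_found / size_ksloc   # 0.35 defects per KSLOC
print(f"defect rate: {rate:.2f}/KSLOC (target {target_per_ksloc})")
print("meets target" if rate <= target_per_ksloc else "exceeds target: investigate")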
Treat People as the Most Important
Resource
A primary program focus should be
staffing positions with qualified personnel and retaining this staff through
the life of the project. The program should not implement practices (e.g.,
excessive unpaid overtime) that drive staff to leave voluntarily. The effectiveness
and morale of the staff should be a factor in rewarding management.
Adopt Life Cycle Configuration
Management
All programs, irrespective of size,
need to manage information through a preplanned configuration management (CM)
process. This discipline requires, at a minimum (a brief code sketch follows this list):
- Control of shared information
- Control of changes
- Version control
- Identification of the status of controlled items (e.g.,
memos, schedules), and
- Reviews and audits of controlled items.
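One way to picture that minimum discipline is a small record per controlled item that carries identification, a version, a status, and a change history. The field names in this Python sketch are assumptions, not a CM standard.

# Illustrative record for a controlled item: identification, version
# control, change control, and status accounting in one structure.

from dataclasses import dataclass, field

@dataclass
class ControlledItem:
    item_id: str
    version: int = 1
    status: str = "draft"              # e.g., draft, reviewed, baselined
    change_log: list = field(default_factory=list)

    def apply_change(self, description: str) -> None:
        """Record an approved change and bump the version."""
        self.version += 1
        self.change_log.append((self.version, description))

memo = ControlledItem("MEMO-001")
memo.apply_change("incorporate review comments")
print(memo.item_id, f"v{memo.version}", memo.status, memo.change_log)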
Manage and Trace Requirements
Before any design is initiated,
requirements for that segment of the software need to be agreed to.
Requirements need to be continuously traced from the user requirement down
to the lowest-level software component.
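A trace from a user requirement down to the lowest-level components can be pictured as a small tree. The requirement IDs and links in this sketch are hypothetical.

# Hypothetical requirements trace: user requirement -> system
# requirements -> software components.

trace = {
    "UR-01": ["SR-01", "SR-02"],   # user requirement -> system requirements
    "SR-01": ["SWC-10"],           # system requirement -> software component
    "SR-02": ["SWC-11", "SWC-12"],
}

def trace_down(req_id: str, depth: int = 0) -> None:
    """Print everything that traces from req_id, depth-first."""
    print("  " * depth + req_id)
    for child in trace.get(req_id, []):
        trace_down(child, depth + 1)

trace_down("UR-01")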
Use System-Based Software Design
All methods used to define system
architecture and software design should be documented in the system engineering
management plan and software development plan and be frequently and regularly
evaluated through audits conducted by an independent program organization.
Ensure Data and Database
Interoperability
All data and database implementation
decisions should consider interoperability issues and, as interoperability
factors change, these decisions should be revisited.
Define and Control Interfaces
Before system-level requirements are
completed, a full inventory of all external interfaces needs to be
assembled. Internal interfaces should be defined as part of the design process.
All interfaces should be agreed upon and individually tested.
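A lightweight way to track that every interface is both agreed upon and individually tested is a simple inventory check. The interface names and fields below are assumptions made for illustration.

# Hypothetical external-interface inventory: each entry must be agreed
# (baselined) and individually tested before it is considered closed.

interfaces = [
    {"name": "GPS feed",      "agreed": True, "tested": True},
    {"name": "logistics API", "agreed": True, "tested": False},
]

for iface in interfaces:
    missing = [k for k in ("agreed", "tested") if not iface[k]]
    print(iface["name"], "-> OK" if not missing else f"-> missing: {', '.join(missing)}")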
Design Twice, Code Once
Traceability needs to be maintained
through the design and verified as part of the inspection process. Design can
be incrementally specified when an incremental release or evolution life cycle
model is used provided the CM process is adequate to support control of incremental
designs.
Assess Reuse Risks and Costs
The use of reuse components, COTS
(Commercial Off-The-Shelf), GOTS (Government Off-The-Shelf) or any other
non-developmental items (NDI) should be a primary goal, but any such use
should be treated as a risk and managed through risk management.
Inspect Requirements and Design
All products that are placed under
CM and are used as a basis for subsequent development need to be subjected to a
formal inspection defined in the software development plan. The program needs
to fund inspections and track rework savings.
Manage Testing as a Continuous
Process
All testing should follow a
preplanned process, which is agreed to and funded. Every test should be
described in traceable procedures and have pass-fail criteria.
Compile and Smoke Test Frequently
Smoke testing should qualify a new
capability or component only after the regression tests have completed
successfully. All smoke tests should be based on a traceable procedure and
run by an independent organization (not the engineers who produced the
software). Smoke test results should be
visible and provided to all project personnel.
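The gating rule just described (regression first, then the smoke test for the new component) can be sketched as follows; the suite contents are placeholders standing in for real test procedures.

# Gate the smoke test for a new component behind a passing regression
# suite. The test callables here are placeholders for real procedures.

def run_suite(tests: dict) -> bool:
    """Run a named suite of zero-argument checks; True if all pass."""
    return all(check() for check in tests.values())

regression = {"existing_feature_a": lambda: True, "existing_feature_b": lambda: True}
smoke      = {"new_component_starts": lambda: True, "new_component_responds": lambda: True}

if run_suite(regression):
    print("regression passed; smoke test result:", run_suite(smoke))
else:
    print("regression failed; do not smoke test the new component")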
3. Interoperability problems are best identified by testing with actual, live systems, which mitigates risk. Joint interoperability is defined as the ability of systems to provide services to and accept services from other systems, and to use the services exchanged to enable them to operate effectively together. The Joint Interoperability Test Command is responsible for verifying the interoperability of systems against the parameters outlined in the Initial Capabilities Document (ICD), Capability Development Document (CDD), Capability Production Document (CPD), and Information Support Plan (ISP).