It all started around September 2001, when the organization decided to look at implementing the Capability Maturity Model (CMM). A formal Software Engineering Process Group (SEPG) was formed to achieve this. The SEPG conducted a Gap Analysis of the templates and guidelines in the Quality Management System (QMS) that existed at the time, and updated them to comply with the CMM Level 2 Key Process Areas (KPAs). Training sessions on the QMS were then conducted for Project Leads and above. The first audits were conducted on the projects to assess their compliance with the newly defined standards.
Once we were comfortable that the projects were following the defined standards, the team set a goal of reaching Level 3 by December 2002 and engaged the services of an external consultant.
In the last week of June 2002, the external consultant conducted a Gap Analysis called a CMM-Based Abridged Assessment. This is an abridged version of a regular assessment, in which members of the organization are interviewed and the gaps between the process followed by the organization and the CMM framework are identified. As part of this exercise, he interviewed 60 members in 9 group sessions. Altogether, about 40 weaknesses were identified. Subsequently, the SEPG, along with the external consultant, came up with an action plan for closing these weaknesses.
At this point, new members were added to the SEPG. All the members of the SEPG underwent a full-day training course on Process Writing, where the finer aspects of defining a process were explained at length. Over the next couple of months, July and August, the entire QMS was revamped to be compliant with Level 3. At the end of August, the external consultant reviewed the QMS and identified the gaps that remained with respect to the CMM framework. In September, all these gaps were closed. This QMS was subsequently published on the QMS web site as QMS 1.0.
Simultaneously, the SEPG was also preparing the training material on the QMS. Every technical member of the organization had to be trained in the process. In order to accomplish this, SEPG delivered a total of 52 hours of training in 11 sessions and trained 215 members of the organization. Once the training was completed, all the projects were asked to start implementing the process and to come up with their respective Software Development Plans. It was during this time that the Software Quality Assurance (SQA) team was formed.
The objective of the SQA team was to help the projects document their processes and plans, and then to audit the projects to ensure that they were following the processes they had defined. The SQA team consisted of practicing Project Leads from the organization and was trained by the external consultant. Every project in the organization was allocated an SQA Representative who performed the SQA activities for that project. The SQA group worked according to a published calendar and performed audits approximately once every three weeks. Since its inception in November 2002, the SQA group has conducted 163 audits in 8 rounds.
At the end of November, the SEPG, realizing that the quantum of work remaining was immense and that we were already behind schedule, decided to schedule the assessment for March 2003.
In January, the SEPG, along with the Training Group and other members of the organization, prepared a Peer Review training course and a Software Configuration Management (SCM) training course and trained all the technical members of the organization. In the Peer Review training, a total of 181 members were trained in nine four-hour sessions; in the SCM training, 158 members were trained in eight four-hour sessions. With this, the mandatory training for every technical member of the organization was complete.
In February 2003, a team, along with the external consultant, conducted a pre-assessment. A pre-assessment is conducted just like an assessment, but in an abridged manner, to gauge the organization's readiness for the complete assessment. It also aims to find any remaining gaps so that they can be closed before the final assessment. The pre-assessment consisted of the assessment team interviewing 55 members of the organization, playing an assortment of technical and managerial roles from various levels in the organization. These members were interviewed in nine different sessions and were asked about the processes they followed in the various technical and managerial tasks they performed. At the end of the pre-assessment, the assessment team produced a Findings Report containing the strengths and weaknesses of the organization, along with an action plan for the SEPG to address the weaknesses identified. A total of 25 weaknesses were identified in this report.
Looking at the gaps identified in the pre-assessment, it became apparent that we could not go for the final assessment in March 2003, so the SEPG rescheduled the final assessment for May 2003. This time the date was firm: the SEPG decided that it would go for the assessment in May, irrespective of what the result was going to be.
Based on the action plan agreed between the pre-assessment team and the SEPG, all the process gaps in the QMS were closed, and the SEPG rolled out the last version of the QMS, Version 1.4, on the 28th of March 2003. The SEPG froze this version of the QMS as the final version that would be used in the assessment. The QMS went through four revisions in all, and the inputs for these changes also came from sources other than the action plans of the CMM-based activities with the consultant: the SQA team conducted audits and, based on their feedback, the SEPG made changes to the QMS. A QMS Request System was also created, and members of the organization were requested to post their queries and suggestions to it.
The following table illustrates the quantum of change that the QMS has undergone since being initially defined.
| Type of Document | Legacy QMS | QMS 1.4 |
| --- | --- | --- |
| Standards and Guidelines | 10 | 17 |
In April 2003, preparation for the final assessment began in earnest. The first task was to select 14 members of the organization, from whom a team of 6 would be chosen for the final assessment team. Over a period of five and a half days, this team of 14 underwent a course on the CMM framework as well as the Assessment Team Member (ATM) training. Once these training sessions were completed, the final assessment team was chosen, with the external consultant as the Lead Assessor.
The assessment consisted of two phases: pre-onsite activities and onsite activities. The pre-onsite activities consisted of every project completing a questionnaire about the project. The SEPG also completed a questionnaire about the organization, detailing the number of people in the organization, the types of projects, the business goals, the structure of the organization, and other related information. Based on this information, four projects were selected as a representative sample of the projects being done in the organization. The Project Managers of these projects were then asked to fill out a questionnaire, the Maturity Questionnaire, about the process being followed in their projects. Using the results of the Maturity Questionnaire and the review results of the QMS, the ATM team came up with the questions they wished to ask in the interviews.
The onsite activities of the assessment began with the opening meeting, where the Sponsor of the assessment (the head of the organization) and the Lead Assessor described the process and activities of the assessment to the participants. After this, a series of interviews was conducted. First, Senior Management was interviewed on their policies and on the support and checks they provide for the smooth and efficient functioning of the projects. Next, the Project Manager and Project Lead of each of the short-listed projects were interviewed in depth on the process they followed. Once these interviews were completed, about eight representatives from various functional areas were selected and interviewed in groups. These functional areas were Project Leads, SEPG, SQA, Training and Support, SCM, Testers, and Team Members.
On completion of all the interviews, the ATM team consolidated the data, presented the findings to the participants of the assessment, and gathered their feedback. This feedback was discussed, and the necessary modifications and follow-up verification took place. Once all the issues were discussed and resolved, the rating exercise commenced. Once the rating decision was made, the ATM team prepared the Final Findings Presentation, which was delivered to the organization in May 2003, when the organization was assessed to be CMM Level 3 compliant.