EuroSPI98 
How to Reap the Business Benefit from SPI
European Software Process Improvement
SPI Installation & Implementation

SPI in Embedded Software Applications

Bjarne Månsson
Software Group Manager
BARCO Communication Systems Denmark
 
 

BARCO Communication Systems AS

BARCO Communication Systems (BCS) has three divisions, each dealing with a specific range of broadcast video and audio products:

BCS Denmark

BCS Denmark has a long tradition of handling audio and video signals. In 1980, the company RE Technology built its business on audio test and measurement equipment, which in 1989 changed into telecommunication PCM transmission equipment (34 Mbit/s, 140 Mbit/s). In 1992, the know-how in audio signals and in telecommunication was combined into audio and video broadcast communication before, in 1997, the company became a member of BARCO Communication Systems.

The main products in BCS Denmark are codecs: digital video and audio compression equipment for high-quality transmission via common telecommunication networks.

The high-quality transmission is typically used for primary contribution of video and audio.


Fig. BjM.1 : BCS product application

Product techniques

A modern video codec features high-speed conversion of video signals at 300 Mbit/s into compressed video at telecommunication bandwidths of 45 Mbit/s, 34 Mbit/s, and down to 8 Mbit/s (MPEG-2).

A corresponding audio codec features conversion of audio signals at 384 kbit/s into compressed audio at telecommunication bandwidth of 64 kbit/s.
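
To give a feel for the compression involved, the bit rates above correspond to roughly 9:1 (34 Mbit/s) and up to about 38:1 (8 Mbit/s) for video, and 6:1 for audio. A minimal sketch of that arithmetic (illustration only, not BCS code):

#include <stdio.h>

/* Rough compression-ratio figures for the rates mentioned above. */
static double ratio(double source_rate, double line_rate)
{
    return source_rate / line_rate;
}

int main(void)
{
    printf("Video 300 Mbit/s -> 34 Mbit/s: %.1f:1\n", ratio(300.0, 34.0));  /* ~8.8:1  */
    printf("Video 300 Mbit/s ->  8 Mbit/s: %.1f:1\n", ratio(300.0, 8.0));   /* ~37.5:1 */
    printf("Audio 384 kbit/s -> 64 kbit/s: %.1f:1\n", ratio(384.0, 64.0));  /*  6.0:1  */
    return 0;
}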

Because of the high requirements on conversion rates, the codec contains a great deal of dedicated electronics in the form of ASICs and FPGAs.

Embedded software handles system control and monitoring, but performs signal processing only at low bit rates (audio).

PC software is used to replace the equipment display and keyboard, providing equipment control and monitoring.



Fig. BjM.2 : BCS product software techniques

The initial problem

During 1994, BCS released a major codec product, the RE 3400 ETSI video codec. The development of the codec had involved a project team larger than any previously experienced in the history of the company. In order to cope with a project of this size, a product life cycle model had been set up before the start of the project.


Fig. BjM.3 : BCS product life cycle model



The software crisis


Though the product development complied with the life cycle in all phases, the first releases suffered a number of drawbacks such as:

The customer reactions were even worse:

The ISO 9001 issue


During 1994, BCS decided that it would like to be ISO 9001 certified. At the preliminary audits by the certification institute, the software issue was brought up. The QA procedures were based on the product life cycle model, and as the product was mostly hardware based, the software was dealt with in only 5 lines of text!
 

SPI initiated

The BCS management became very concerned about the alarming reports on the delayed codec releases and on the software problems obviously causing the delays and the bad quality. Furthermore, the road to the ISO 9001 certification was blocked by insufficient software procedures.

Another hint was given. BCS had always promoted the use of the latest development techniques, "the state of the art". When somebody told the management that on a maturity scale from 1 to 5, BCS was on level 1 (not telling, however, that this was also the situation for 90% of other companies), a dissatisfied roar rolled through the company:

Do something about that software!

The SPI task force

Out of the roar came the establishment of a software group consisting of a newly employed group manager and the four present software engineers. The initial task was to evaluate answers to the outstanding questions. These answers were presented to the management, who instructed the task force to set up an action plan for how to solve the problems. In the spirit of that time, the objective of the task force was expanded with:

Better software in a shorter time at a lower price!

Where to start


In my 20 years of software development, I have been looking for some way to get hold of this unpredictable, shapeless, intangible workmanship which some have even dared to call art. Though everybody knew of the problems, no firm philosophy, method, or tool had emerged, although a few attempts had been made, ref. [1]. This is very strange considering that hardware development is performed with well-known methods, as described in the product life cycle.
 
 

Instead it starts as "best practice", grown out of the experience from many software developments' "seek and try". The inspiration came from the Capability Maturity Model (CMM), presented in Denmark during the spring of 1995, ref. [2].
 
1: Initial      (no key process areas)
2: Repeatable   Management:     Project Planning, Requirement Management, Quality Assurance,
                                Configuration Management, Project Tracking, Subcontract Management
3: Defined      Management:     Intergroup Coordination, Integrated Software Management
                Organizational: Process Focus, Process Definition, Training Program
                Engineering:    Software Product Engineering, Peer Reviews
4: Managed      Management:     Quantitative Process Management
                Engineering:    Software Quality Management
5: Optimizing   Organizational: Process Change Management, Technology Change Management
                Engineering:    Defect Prevention

Fig. BjM.4 : The CMM key process areas

Buzzwords arise all the time, and in 1995 everybody said "OOM", "CASE tool", and "Reuse". The CMM indicated which key process areas to start with. It points out, for example, that no method or tool can solve the lack of requirement specifications, and that in order to gain benefit from reuse, configuration management must be introduced.

Later on, we did not strictly follow the order of areas in the CMM, but addressed a little of every topic, which is also the basic idea in the Bootstrap model.
 

Initial SPI

Being a development company, BCS is very project oriented. In order to be understood by the management, the task force (the software group) handled its task as a project. The software process action plan 1995/96 set the outlines of the project with the following headlines:

In order to make the ISO 9001 certification possible, the software was introduced into the QA procedures by software guidelines, which give recommendations for the essential documentation of:

For the project RE 3400, which initiated this SPI, we recommended two actions for every new release:

Short-term results

It is important for every project to have some immediate results showing that the project is making progress.

The first obvious result was the ISO 9001 certification in mid-1995, in which software was an integrated part.

Another result was the recording of an improved error rate measured on the RE 3400 codec releases.
 
 
Software errors

Version   Found in beta test   Known in release   Found after release   Removed
-                  -                  -                   24                -
1.5                -                 24                   17                -
1.6               42                 37                    9                4
1.61              20                 37                    -                -
1.62              10                 36                   25               10
2.0               34                 21                    5               40
3.0               39                 13                    2               13
3.1               13                 15                    6                0
3.11               1                 15                    0                6
4.0               28                  9                    1                6
3.12               3                 10                    1                0
Total            190                  -                   90               80

Fig. BjM.5 : The RE 3400 error rates on releases
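
The figures above were collected per release. As a minimal sketch of how such release records can be tallied into totals (the record layout is hypothetical; the values are a subset of Fig. BjM.5):

#include <stdio.h>

/* One record per release; fields mirror the columns of Fig. BjM.5. */
struct release_errors {
    const char *version;
    int found_in_beta;      /* errors found in beta test      */
    int known_in_release;   /* errors known when released     */
    int found_after;        /* errors reported after release  */
    int removed;            /* errors removed in this release */
};

int main(void)
{
    /* A subset of the RE 3400 figures from Fig. BjM.5. */
    struct release_errors rel[] = {
        { "1.6",  42, 37,  9,  4 },
        { "1.62", 10, 36, 25, 10 },
        { "2.0",  34, 21,  5, 40 },
    };
    int n = sizeof rel / sizeof rel[0];
    int beta = 0, after = 0, removed = 0;

    for (int i = 0; i < n; i++) {
        beta    += rel[i].found_in_beta;
        after   += rel[i].found_after;
        removed += rel[i].removed;
    }
    printf("Totals: beta %d, after release %d, removed %d\n", beta, after, removed);
    return 0;
}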
 

SPI continued

Having obtained the first results, we followed the same road when continuing the SPI. We now left the strict division into CMM key process areas and focused on two key subjects:
  1. Software development methods and tools.
  2. Software project management.

Development methods and tools


Despite good experience in writing software requirement and design specifications, we still had difficulties in eliciting all the relevant requirements from the hardware to the software. Though everybody still said "OOM", we were advised, for real-time applications, to go for Structured Analysis and Design (SA/SD-RT). This was introduced in late 1995 together with the tool Select Yourdon, ref. [3] and [4].

Fig. BjM.6 : Example on SA/SD-RT

The original objective of the SA/SD-RT was for the hardware engineers and the software engineers to have a common base for discussing the full implementation. But it turned out to be a method for the software engineers to find all the relevant questions in connection with the requirements.

Introducing the SA/SD-RT as early as 1995 has enabled us to adapt the method and the tool to our specific use. One example is that, because we do not use entities (database data flows), we have removed this item from our recommended templates.
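
How a control transformation from such a model typically ends up in the embedded code can be sketched with a small state machine; the states and events below are invented for illustration and are not taken from the RE 3400 model:

#include <stdio.h>

/* Hypothetical control states and events, as they might appear in an
   SA/SD-RT state-transition diagram for codec control. */
enum state { IDLE, CONFIGURING, TRANSMITTING };
enum event { EV_CONFIGURE, EV_START, EV_STOP };

/* One switch per state keeps the code easy to map back to the diagram. */
static enum state next_state(enum state s, enum event e)
{
    switch (s) {
    case IDLE:         return (e == EV_CONFIGURE) ? CONFIGURING  : IDLE;
    case CONFIGURING:  return (e == EV_START)     ? TRANSMITTING : CONFIGURING;
    case TRANSMITTING: return (e == EV_STOP)      ? IDLE         : TRANSMITTING;
    }
    return s;
}

int main(void)
{
    enum state s = IDLE;
    s = next_state(s, EV_CONFIGURE);   /* IDLE -> CONFIGURING          */
    s = next_state(s, EV_START);       /* CONFIGURING -> TRANSMITTING  */
    s = next_state(s, EV_STOP);        /* TRANSMITTING -> IDLE         */
    printf("Final state: %d\n", (int)s);
    return 0;
}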

In the software implementation phase we had to set a number of rules:

These strange rules originate from the fact that previous software projects were isolated from each other. Having one software engineer on each of these projects, they had "succeeded" in choosing a different compiler for each project.

We could now see that the upcoming projects would require a number of software engineers, which triggered some more guidelines:

/***********************************************
* Project     : Demoproject for PVCS
* Used in     : PVCS demonstration
* Description : The module only contains a
*               demo description.
*             :
$Workfile: demo.c $
$Log: F:/sw_faggr/Projdemo/Source/vcs/demo.c_v $
*
* Rev 1.0   18 Feb 1997 13:34:08   BjM
* Description of the change in this revision
*
***********************************************/

Fig. BjM.7 : Example of programming guidelines (program header)



Software test and verification is a subject too often put aside. In our case, we already had some very bad experiences from not properly testing the software together with the hardware. Choosing the V-model was very natural, considering that the hardware already followed this model, ref. [5].


Fig. BjM.8 : The test and verification V-model

But finding the proper method and tool for each of the V-model stages turned out to be a difficult task. Software engineers were used to testing the software in the "monkey" way - testing what they thought should be tested, which is not much more than the usual debug testing.
 
 

After some research, we recommended the following test methods and tools:


Fig. BjM.9 : Software test tools

One hurdle to overcome was the strong belief of every software engineer that a test tool can do the methodical work too! "Is the tool not able to generate the test cases for me?" No, there is still a lot of test work left for the software engineer.
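
The division of labour can be sketched as follows: the engineer designs the cases (normal value, boundaries, illegal inputs), and the harness or tool merely executes them. The function under test and the values below are hypothetical:

#include <assert.h>
#include <stdio.h>

/* Hypothetical module under test: clamps a requested bit rate (kbit/s)
   to the range supported by the audio codec. */
static int clamp_bit_rate(int kbit_per_s)
{
    if (kbit_per_s < 64)  return 64;
    if (kbit_per_s > 384) return 384;
    return kbit_per_s;
}

int main(void)
{
    /* Hand-designed cases: normal value, both boundaries, illegal inputs. */
    assert(clamp_bit_rate(128) == 128);  /* normal value      */
    assert(clamp_bit_rate(64)  == 64);   /* lower boundary    */
    assert(clamp_bit_rate(384) == 384);  /* upper boundary    */
    assert(clamp_bit_rate(0)   == 64);   /* illegal, too low  */
    assert(clamp_bit_rate(999) == 384);  /* illegal, too high */
    printf("All module test cases passed.\n");
    return 0;
}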
 

Project management

Software project management has the same content as hardware project management. In the beginning, the project managers (being hardware based) did not believe this. But when we produced a project management guideline for handling software in hardware products, they started to be convinced, ref. [6].
 
 

The management guideline covers the following subjects:

When we filled in descriptions of how to perform these subjects, it became more or less a repetition of what we had already described for the "product" project management (the ISO 9001 QA procedures).

As to the software project creation we confirmed the following items:

Each risk is given a possibility (Poss., 0.2..0.8) and an effect (1..10); the weight is the product P*E. For the heaviest risks, preventive actions, prepared actions, and warning signals are listed.

1.0  The product

1.1  Is the product technically wrong?
     - Are the technical requirements difficult? Is the product state-of-the-art?
     Poss. 0.4, Effect 10, Weight 4
     Preventive actions: Check with Barco

1.2  Is the product wrong for the market?
     - Is the market moving? Do we know the market?
     Poss. 0.8, Effect 5, Weight 4
     Preventive actions: Check with marketing communications
     Signals: Customers decline use of management network

1.3  Is the quality too bad?
     - Do we usually see many errors after release?
     Poss. 0.2, Effect 8, Weight 1.6

2.0  The frames of the project

2.1  Are the goals and subgoals unclear?
     - Are the goals too ambitious?
     Poss. 0.2, Effect 10, Weight 2

2.2  Is the project description still unsettled by DR1?
     - Is the requirement specification well prepared? Is the project plan by DR1 realistic?
     Poss. 0.5, Effect 5, Weight 2.5

2.3  Inadequate resources?
     - Do you expect additional resources during the project? Do you expect overtime work?
     Poss. 0.8, Effect 5, Weight 4
     Preventive actions: Commitment from Product Council. Agreement with Barco.
     Prepared actions: Asking Barco for resources. External resources.
     Signals: Increasing delays on START of items

Fig. BjM.10 : Example of risk analysis
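
The Weight column of the example is simply the possibility multiplied by the effect (P*E), and the risks with the highest weight get preventive and prepared actions first. A minimal sketch of that calculation, using entries from the example above:

#include <stdio.h>

/* One row of the risk analysis: Weight = Possibility * Effect. */
struct risk {
    const char *question;
    double possibility;   /* 0.2 .. 0.8 */
    int    effect;        /* 1 .. 10    */
};

static double weight(const struct risk *r)
{
    return r->possibility * (double)r->effect;
}

int main(void)
{
    struct risk risks[] = {
        { "Is the product technically wrong?",    0.4, 10 },
        { "Is the product wrong for the market?", 0.8,  5 },
        { "Is the quality too bad?",              0.2,  8 },
        { "Inadequate resources?",                0.8,  5 },
    };
    int n = sizeof risks / sizeof risks[0];

    for (int i = 0; i < n; i++)
        printf("%-40s weight %.1f\n", risks[i].question, weight(&risks[i]));
    return 0;
}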
 

As to the software project planning we confirmed the following items:

Fig. BjM.11 : QA checklist

As to the software project reviews we confirmed the following items:

* Structured code?

* Requirements mapped into the code?

* Is the SD-RT diagram implemented?

* Is the interface to the module OK?

* Is the code easy to maintain?

Fig. BjM.12 : Code review checklist



As to the software project follow-up, we confirmed the following items:

Fig. BjM.13 : Example on QA deviation reports

As to the software project metrics, we confirmed the following items:

Milestone                            Hit rate (on time or better)   Hit rate (delay ≤ 20%)
Prototype milestone                  45 %                           60 %
Product matured milestone            35 %                           55 %
Release milestone                    30 %                           45 %

Milestone differences
From prototype to product matured    65 %                           70 %
From product matured to release      75 %                           80 %

Fig. BjM.14 : Release plan tracking
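
The hit rates above are the share of milestones met on time, or within a 20% delay, out of all milestones tracked. A minimal sketch of that calculation, assuming each milestone is recorded with its planned and actual duration in weeks (record layout and values hypothetical):

#include <stdio.h>

/* Planned versus actual duration up to a milestone, in weeks. */
struct milestone {
    double planned_weeks;
    double actual_weeks;
};

/* Share of milestones where the delay stays within 'allowed_delay'
   (0.0 = on time or better, 0.2 = delay of at most 20%). */
static double hit_rate(const struct milestone *m, int n, double allowed_delay)
{
    int hits = 0;
    for (int i = 0; i < n; i++)
        if (m[i].actual_weeks <= m[i].planned_weeks * (1.0 + allowed_delay))
            hits++;
    return 100.0 * hits / n;
}

int main(void)
{
    /* Hypothetical prototype milestones from four projects. */
    struct milestone proto[] = { {10, 10}, {12, 13}, {8, 11}, {20, 19} };
    int n = sizeof proto / sizeof proto[0];

    printf("On time or better: %.0f %%\n", hit_rate(proto, n, 0.0));
    printf("Delay <= 20%%     : %.0f %%\n", hit_rate(proto, n, 0.2));
    return 0;
}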

In BCS, software metrics have been an overlooked subject, so even though we introduced these items to the projects, we could not retrieve information from the "experience" database. And because we did not get any immediate results out of the metrics, the trend was that we did not even do any metrics on the new projects.

In summary, we did not introduce any new software project management items; we merely confirmed the existing hardware-related items. In a few cases, e.g. risk analysis, we expanded the subject.
 
 
 

SPI assessment

One of the things that started the SPI in BCS was the allegation that our maturity level was level 1! Naturally, one metric of the SPI project is the maturity level itself.

As to the action plans, we set the goal of reaching a specific level:

But how do we measure our maturity level? In the CMM, you can only be on one of the levels 1, 2, 3, 4, or 5, and only if you comply with all key process areas of that level. Improvements belonging to a higher level do not count as long as a lower level is not fulfilled.

This is handled in the Bootstrap model, which compiles all process improvements into a single number in steps of a quarter level. In this way even minor improvements can be measured, which is good for the motivation, ref. [7].
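
How questionnaire answers might be compiled into such a quarter-step figure is easy to illustrate, although the sketch below is only a toy and not the real Bootstrap algorithm, which weights the answers per process area; here the share of fulfilled practices is simply mapped onto the range 1..5 and rounded down to the nearest quarter:

#include <stdio.h>

/* Toy mapping of questionnaire answers to a maturity figure:
   the share of "yes" answers spread over levels 1..5,
   rounded down to the nearest quarter level.
   NOT the real Bootstrap algorithm - illustration only. */
static double maturity_level(int yes_answers, int total_questions)
{
    double raw = 1.0 + 4.0 * yes_answers / total_questions;  /* 1.0 .. 5.0    */
    return (double)((int)(raw * 4.0)) / 4.0;                 /* quarter steps */
}

int main(void)
{
    printf("150 of 370 questions fulfilled -> level %.2f\n", maturity_level(150, 370));
    printf("250 of 370 questions fulfilled -> level %.2f\n", maturity_level(250, 370));
    return 0;
}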

The Bootstrap assessment can be performed by either self-assessment or by certified assessment.

The self-assessment involves a questionnaire which one or more people from the organisation may answer.

The certified assessment involves three days of interviews with management and with a number of project groups, conducted by external auditors.

At the end of 1996, BCS decided to do both a self-assessment and a certified assessment. The self-assessment was based on three different questionnaires and it gave the following results:

Bootstrap assessment method          Level
BOOTCHECK (46 questions)             2.5 - 3.5
ESSI committee (43 questions)        2.5 - 3.0
SYNQUEST (370 questions)             2.0 - 2.5
Certified ("3 days of questions")    2.3 - 2.5

Fig. BjM.15 : Assessment results

The questionnaire containing the most questions (SYNQUEST) comes nearest to the certified assessment (luckily enough!), ref. [8]. But in any case, any self-assessment gives a clue to the present maturity level:

Better do some self-assessment than none at all!

SPI summary


You get most benefit from SPI if you apply the improvements on concurrent projects.

We applied the SA/SD-RT and implementation methods on 3 major projects with good results. But the introduction of the test and verification V-model was delayed compared with the progress of the 3 projects, resulting in very bad software quality on one of them.

Also, software project management was not introduced early enough on the 3 projects. The software time planning (time estimates), software project plan, and the risk analysis (all due in the early phase of the project) were hardly used in the product project.

And now we have to wait for the next major project before we can benefit from these methods!

An advantage of introducing SPI as part of the QA procedures was that we were not delayed by any "pilot project evaluation". Though we did some trial investigations before introducing a major new method or tool, we gained a lot of knowledge and motivation from everyone being educated to the same level at the same time.
 
 

Fig. BjM.16 : SPI Summary with respect to Concurrent Projects



SPI lessons learned

Introducing SPI in a development-based company requires strong health and lots of good spirits - like any other project matter!

But keeping a few rules of thumb in mind can help you through the worst of the bad luck.

Below I sum up some of the main good and bad experiences from the 3 years of SPI introduction.

The SPI iceberg

Beware of the SPI iceberg. It is easy to "buy" a method or a tool but it is much harder to get it working inside the company.



Fig. BjM.17 : The SPI iceberg



The SPI "silver bullet" life cycle

Beware of the SPI "silver bullet" life cycle, ref. [9]. It is easy to set up unrealistic expectations of the outcome of the SPI, but they only last until the SPI is used in real life. The hard work is to get SPI down to earth and get it working in actual practice.



Fig. BjM.18 : The SPI "silver bullet" life cycle

The SPI good experience

The SPI bad experience

SPI in future

Still, many SPI subjects and items have to be fulfilled - and changed! This will happen when new projects require revised methods and tools!

For the next action plan, the main topics will be:

BCS software began as a supplier to the hardware, but this is not the case anymore. We must look upon each other as equals in the common cause of product development.
 
 
 

Abbreviations

 
CASE: Computer Aided Software Engineering
CMM: Capability Maturity Model
OOM: Object Oriented Model
SA/SD-RT: Structured Analysis and Design in Real-Time
SPI: Software Process Improvement

References

[1] S. Biering-Sørensen, F. Overgaard Hansen, S. Klim, P. Thalund Madsen: Håndbog i Struktureret Programudvikling, Teknisk Forlag, 1988.
[2] Capers Jones: The Path to Software Excellence: Becoming "Best in Class", SPR, Inc., March 10th, 1995.
[3] P. Ward, S. Mellor: Structured Development for Real-Time Systems, Prentice-Hall, 1984.
[4] F. Overgaard Hansen, F. Hansen: SA/SD-RT Kompendie 940131, DTI, Århus, 1994.
[5] Boris Beizer: Software Testing Techniques, Van Nostrand Reinhold, New York, 1990.
[6] W.S.Humphrey: Managing the Software Process, Addison-Wesley, Reading, MA, 1989.
[7] P. Kuvaja, et al.: Software Process Assessment and Improvement – The Bootstrap Approach, Blackwell Business, Oxford, 1994.
[8] SynQuest, version 1.5: Selfassessment for Softworkers, SynSpace GmbH, 1996.
[9] F.P. Brooks: No Silver Bullet: Essence and Accidents of Software Engineering, IEEE Computer, Vol. 20, No. 4, April 1987.
 
 

Appendix: The author and the company

 

Bjarne Månsson

With more than 20 years of software background, Bjarne Månsson has experienced the need for and the requirements of the SPI movement. He graduated in 1974 with an M.Sc. degree from the Technical University of Denmark (DTU), which in 1979 was extended with an M.Phil. degree from the University of Leeds, UK. Starting in 1976 in the telecommunication world, developing test equipment for telephone exchanges, he joined the first attempts to embed software in purely hardware-based products. His knowledge in this field was widened on a much greater scale during the 1980s, when he worked as project manager of data acquisition equipment including data collection electronics and data processing mainframe software. After a short visit to the CNC machine industry in the early 1990s, also developing data acquisition, Bjarne Månsson returned to the telecommunication business, being responsible for the introduction of SPI in embedded software for high-quality broadcasting electronics.
 

BARCO Communication Systems Denmark

The BARCO Group’s main business area is projection systems, which covers one third of the group sales. The closely related display systems and graphics systems cover another third of the group sales.

A BARCO group member is BARCO Communication Systems, which is a world leader in high-quality solutions for the broadcast, cable TV and telecommunication markets with:

BARCO Communication Systems has three divisions (Belgium, North America and Denmark) doing development, production, marketing and sales of:

BARCO Communication Systems is present world-wide with offices in Germany, France, UK, the Netherlands, Israel, USA, Mexico, Brazil, Hong Kong, China, Malaysia, Japan and Australia.

