(c) G. R. Peters

We will be using the book by Schinzinger and Martin as a text for this portion of the course. There are six chapters in the book, and below is my rough plan for the time we have left. I have a video or two to show, and some exercises on case studies. I will also spend some time on APEGN's Code of Ethics in the first couple of lectures. We have already dealt with some of the material on the professional aspects of engineering (covered in Chapter 1 of S&M), but what is there is good review. The study of our own code of ethics fits in very well with this.

I also draw your attention to the Codes of Ethics collected in Appendix A3. While these do not carry the legal weight for Canadian engineers that the Codes of their own Associations do, they are nevertheless informative, and work from the same basic philosophy.

The fact that I do not mention everything in the book in class is not to be taken to mean that it is not relevant to the course.  I encourage students to read it all; it is not a big book.
Chapter 1: The Profession of Engineering [1 lecture] 
Chapter 2: Moral Reasoning and Ethical Theories [2] 
Chapter 3: Engineering as Social Experimentation [3] 
Chapter 4: Commitment to Safety [3]
Chapter 5: Workplace Responsibilities and Rights [3]
Chapter 6: Global Issues  [?]

I want to remind you here of the quote from Paul Goodman used by Postman, which was in fact also used by Martin and Schinzinger in their book "Ethics in Engineering" which we used in the course up to this year.  It is this:

Technology is more than science...

"Whether or not it draws on new scientific research, technology is a branch of moral philosophy, not [only] of science."
-Paul Goodman, New Reformation

We have had a look at the big picture. I have already remarked that the quotation was appropriate in relation to the choices open to the engineer in technological decision making. The types of choices which I had in mind at that time were between different technological paths, some of which might have more desirable short-term results, or less adverse societal impact, than others. In later lectures we pointed out the difficulty of making predictions over even a moderately long term, and asked whether we as engineers should take it upon ourselves to make judgements for society.

Now the focus is on the individual. We will take a narrower and more specific look at the way in which choices are made, often on a personal level. There may be choices which affect your family, your employer, or your job, or which may put you on the receiving end of discipline from a professional engineering committee. A knowledge of engineering ethics will help us in making these decisions.

The idea contained in Goodman's quotation is behind the title of the first section of Schinzinger and Martin's book, which they call "Engineering and Moral Complexity". They point out how "moral values are embedded in the engineering process itself", which they diagram in Figure 1.1.

The authors emphasize that the process is not a simple one-line progression. Rather, there are many choices made along the way, and the process is iterative. The complex process of iteration and change requires more than good technical knowledge. It also requires care, attention to detail, and an attitude of professional responsibility. They give the well-known example of the Kansas City Hyatt Regency walkway collapse, which killed over 100 people. (This is well documented, and the student should look it up; see p5.)

Some of the shortcomings of people and organizations which give rise to problems are the following:

The many linkages and intertwined responsibilities are illustrated in Figure 1.3.

MOTIVATION (See also the "Illustrative Cases" in S&M p10)
We will shortly get to the detailed study of engineering ethics, and the practical professional "codes" which come under that name. But to illustrate the nature of the problems which can arise, and the need for rules such as those embodied in codes of ethics, we will undertake a little exercise. I will present three or four cases and ask you to decide - without rules other than your own morality, sense of fairness, etc. - how you would act. The first one is from our text (p10), but without the result; the next two are from another reference; and I made up the last one, roughly based on an incident from my own experience.

1. Construction manager under pressure. [1] An inspector discovers faulty construction equipment and applies a "violation tag" preventing use of the equipment. The inspector's supervisor, the construction manager, views the case as a minor infraction of safety regulations, and orders the tag removed so that the job will not be delayed. The inspector objects and is threatened with disciplinary action. You are the inspector. What do you do now?

-is it really only minor?
-should you give in to the threat?
-write a memo to the bigger boss?
-try to fix the machine?
-go to the client? -the media?
-quit?

2. Merry Christmas. [2] George is a city employee, and is the resident engineer for a large sewer contract. With his many years of field engineering experience, he is able, during the course of the job, to suggest techniques and procedures to the contractor which save time and money, although the work is done strictly in accordance with plans and specifications. At Christmas time, George receives a case of good Scotch from the contractor, with his greeting card attached. May George (you) accept the gift?

-what's wrong with it? You certainly saved him and the city money.
-does it depend on city policy?
-how much is the scotch worth? -does that matter?
-what about donating it to the city (or the contractor's) Christmas party?

3. The case of Deadfish Creek. [3] As plant engineer for Lotsa Chrome, Inc., Lisa Smith knows that the manufacturing process results in periodic discharges of cadmium and chrome into Deadfish Creek in concentrations which may cause serious long-term health effects for downstream water users. Because Lotsa Chrome Inc. is marginally profitable, management has made a policy decision to close the plant if and when waste water controls are imposed by the Government. When Lisa's boss is questioned by the Department of the Environment, he understates the levels of chrome and cadmium discharged, and Lisa knows this. Must she report the correct numbers to DOE?

-what about loyalty to the other employees who will be out of work if the company closes?
-make an anonymous phone call?
-maybe the government won't do anything.
-tell DOE that the boss's numbers are wrong, but not say what the correct ones are?
-write the boss's boss? -the media? -the company directors?

4. A Better Way. Joe is a civil engineer in a struggling firm specializing in municipal engineering. He is aware that a design contract has been awarded to a competitor for a sewage treatment system for his city.  After talking informally to friends in the local government planning department, he learns some details of the design under way. Joe thinks the design concept is a poor one, and will cost the city more in the long run. The city council then asks him to look at the system being proposed, and to prepare an alternative design for their engineering department to examine. Should Joe accept this contract?

EXERCISE: Students take a handout with all the above cases, and in about 5 minutes each, write their own answers in each case, with reasons. These are collected, summarized and then discussed following the presentation on the APEGN Code of Ethics, next lecture.

One of the marks of a profession is the existence of sets of rules or guidelines known as "codes of ethics".
These rules are intended to inform members of the profession of the standards of behaviour expected of them in the practice of their profession, and also to demonstrate to the public that standards exist and are being enforced. Doctors, lawyers and engineers all have them, as well as committees of the professional organization which enforce the rules.

In Appendix A3, Schinzinger and Martin have printed several sets, including those of the National Society of Professional Engineers (NSPE) and the Institute of Electrical and Electronics Engineers (IEEE). In this province, as in all provinces of Canada, the practice of engineering is governed by an act of the legislature. Our association (APEGN), whose home page you can reach from our web site course materials, has its own Code of Ethics, and we will use this in dealing with ethical exercises and examples.

One purpose of our study of this topic is to make you familiar with codes of ethics, ours and others. But these codes are the products of the minds of engineers, and they are not carved in stone. You may see ways they can be improved, and it is our responsibility to improve them where we can. One of the features of codes of ethics is a fairly strong element of what might be called "guild rules". By this I mean that certain guidelines are clearly shaped with the benefit of the engineer or the engineering company in mind, rather than the exclusive protection of society. This can sometimes lead to conflicting advice, as we shall see from time to time. So along with making ourselves familiar with them, we should question them with a view to improvement.

We will now briefly discuss APEGN's Code of Ethics, and follow this by applying it to the case studies we worked on in the last lecture.

See also the APEGN Home page link on the Web site, under "publications". My comments on various articles of the code appear below in italics.


Association of Professional Engineers and Geoscientists of Newfoundland

1. A professional engineer or geoscientist shall recognize that professional ethics are founded upon integrity, competence and devotion to service and to the advancement of human welfare. This concept shall guide the conduct of the professional engineer or geoscientist at all times.

This foundation is pretty straightforward except perhaps for the word "integrity."  The word implies wholeness and unity. Thus integrity is a bridge between responsibility and private life. We can't break it into parts, and say "don't blame me (for this immoral act) I was just doing my job".  Integrity makes possible the virtues of self respect and pride in one's work. It precludes the attitude that one is not personally responsible for one's work.

Duties of the Professional Engineer or Geoscientist to the Public

A professional engineer or geoscientist shall:

2. have proper regard in all his or her work for the safety, health and welfare of the public;
The safety of the public is first priority.

3. endeavour to extend public understanding of engineering and geoscience and their role in society;
This is interesting, considering our recent discussions on Postman, etc. The notion appears in other codes, e.g. NSPE.

4. where his or her professional knowledge may benefit the public, seek opportunities to serve in public affairs;

5. not be associated with enterprises contrary to the public interest;
Who decides what is in (or contrary to) the public interest? The engineer?

6. undertake only such work as he or she is competent to perform by virtue of his or her education, training and experience;
Note that you are required to judge your own competence.

7. sign and seal only such plans, documents or work as he or she has personally prepared or which have been prepared or carried out under his or her direct professional supervision;
A professional engineer has a "seal", or official stamp, to show the public who is taking responsibility for the document. You should not stamp anything that has not been your direct professional responsibility.

8. express opinions on engineering or geo-scientific matters only on the basis of adequate knowledge and honest conviction;

9. have proper regard in all his or her work for the well being and integrity of the Environment.
"Proper regard" can be open to interpretation. Is "integrity of the environment" a meaningful phrase?

Duties of the Professional Engineer or Geoscientist to Client or Employer

A professional engineer or geoscientist shall:

10. act for his or her client or employer as a faithful agent or trustee;
This can sometimes put you on the horns of a dilemma - but it must always be a serious consideration.

11. not accept remuneration for services rendered other than from his or her client or employer;
No case of Scotch.

12. not disclose confidential information without the consent of his or her client or employer;
This can make it hard to blow the whistle externally.

13. not undertake any assignment which may create a conflict of interest with his or her client or employer without a full knowledge of the client or employer;
There is a duty to keep your employer informed when you are serving some other master. Note that this does not say that you cannot have potentially conflicting jobs.

14. present clearly to his or her clients or employers the consequences to be expected if his or her professional judgement is overruled by other authorities in matters pertaining to work for which he or she is professionally responsible.

Duties of the Professional Engineer or Geoscientist to the Profession

We must watch out for potential self-serving items here. But is there anything wrong with being self-serving?

A professional engineer or geoscientist shall:

15. endeavour at all times to improve the competence, dignity and reputation of his or her profession;
A very broad requirement.

16. conduct himself or herself towards other professional engineers and geoscientists with fairness and good faith;

17. not advertise his or her professional services in self laudatory language or in any other manner derogatory to the dignity of the profession;
What moral basis is there for this, so long as the ads are factual?

18. not attempt to supplant another engineer, or geoscientist in an engagement after definite steps have been taken toward the other's employment.
Why not, if you have a better idea?

19. when in salaried position, engage in a private practice and offer or provide professional services to the public only with the consent of his or her employer and in compliance with all requirements of such practice;
Applies to consulting professors.

20. not exert undue influence or offer, solicit or accept compensation for the purpose of affecting negotiations for an engagement;
No bribes.

21. not invite or submit proposals under conditions that constitute only price competition for professional services;
Moral basis? This kind of clause has been stricken from codes in the US. Judged to be in restraint of trade, and therefore illegal.

22. advise the Council of any practice by another member of the profession which he or she believes to be contrary to the Code of Ethics.
What about our reluctance to snitch?

Review of our Cases.
1. Construction Manager under pressure
Student Responses (about 150 students)  %
Stand by decision 
Record, appeal, job continues 
Reconsider, remove 

Code of Ethics guidance: (Items 2,10,14,22) Do not remove the tag. Go to the next level if someone else does. Go to the Association for advice, especially if you are threatened.

Student Comments:
-safety more important than deadline. Safety paramount.
-inform employees in the area. They have the right to refuse.
-if tag removed it is manager's responsibility. (Can you pass on this responsibility?)
-write it up in your professional diary
-leave it on and take a sick day (???)
-you must recognize the authority of a supervisor (What kind of authority does he have in this case?)
-just because someone is your boss, it doesn't mean that they know how to do your job better.
2. Merry Christmas
Student Responses (About 150 students) %
Accept the gift
Depends, consult

Code of Ethics guidance: George should not accept the gift (10, 11)

Student Comments:
-May depend on whether public or private sector (employer policy).
-Accept, but be sure you are not influenced. (Not a defence against conflict of interest or potential influence)
-refuse at first... then accept !  (Playing hard to get)
-share the wealth - include city managers and let contractor know. (Not a bad idea - but let the contractor know first)
-no one should ever turn down good scotch... (?)
-maybe technically ok, but perception important
-take it and donate to a shelter for homeless (?)
-George may overlook mistakes in future
-No. The key is that George does not work for the contractor.
-Take it. A wise man never turns down anything that is free! (Budding sage)
-Obviously, this is a trick question. Take it. (A student survival philosophy)
3. Deadfish Creek
Student Responses   (About 150 students) %
Report correct
Inform, other ways
Don't report 

CoE guidance: Report the correct numbers (2, 5, 9). Note the conflict with (10, 12).

Student comments:
-public safety is more important than jobs. Safety paramount
-She should be trying to reduce the pollution.
-involve others
-let boss know, use chain of command. (This is basically the way to go.)
4. Better way
Student Responses (About 150 students) %
Accept the contract
Refuse it 


CoE guidance: Joe should take it only on condition that the competitor is made fully aware of his review. Note that the competitor does not have to give permission. (16,18 - the debate would be about intent). Some would argue that Joe should not take the contract unless the relationship between the city and his competitor was severed.

Student comments:
-This is not an ethical question. (Good observation. Fair play, or guild rules?)
-if inferior work is being done, he should be replaced. (How do we know the work is inferior?)
-how would Joe like it if this were done to him?
-take it - eat or be eaten ! (Real man of the world here.)
-Accept it, since it will save the taxpayers money. (How do we know?)
-Accept it. Competition in the marketplace is healthy.

Some of the comments note the need for a bigger context, and more information. This is a good observation of the weakness of the case study method. It lacks the complexity of real life.

A fundamental point from which we start is that while APEGN's code of ethics is a set of rules which members of our association must observe, the foundation on which it is built is worthy of deeper study. In fact, as we have indicated above, such codes sometimes include rules whose ethical and moral basis one might question. Take the rule in the APEGN code about not supplanting another engineer, for example, or the one about bidding for services:

18. not attempt to supplant another engineer, or geoscientist in an engagement after definite steps have been taken toward the other's employment.
21. not invite or submit proposals under conditions that constitute only price competition for professional services;

As we have noted, these articles could be considered a matter of "the rules of the business game", without a strong moral basis. That is not to say there is no such basis, since it might be thought rather lacking in respect for a colleague to behave in the way implied in (18), for example. Having "rules of the game" is part of being a profession, and one underlying rationale for the rules is that individuals are presumably better off in the long run if they obey the agreed rules. For a good paper on this complex issue, see reference 4 (Michael Davis, "Thinking Like an Engineer: The Place of a Code of Ethics in the Practice of a Profession," available on the Illinois Institute of Technology Web site).

Sometimes items in codes of ethics have run into conflict with laws. For example, the American Society of Civil Engineers had several clauses of their code overturned in the mid-1970s by the Department of Justice in the US, on the basis that they were in restraint of trade. Two of the offending clauses were similar to 18 and 21 of our code.

In another vein, the National Society of Professional Engineers (NSPE) in the US has a clause preventing participation in strikes: "Engineers shall not actively participate in strikes, picket lines, or other collective coercive action." [NSPE CoE, III(e)]. See Appendix A3, S&M. Some might regard this as an inhibition of personal freedom. Yet there is a moral basis which one can offer for such a clause, and it could go something like this: how can the engineer discharge his moral duty to protect society if he refuses to work? And what about loyalty to the employer?

Ethical questions are frequently complex, and even the question of what is an "ethical issue" is not always clear cut. A large part of the attention to these basics is in order to clarify the terminology and issues, and to make it easier for us to deal with the ethical questions and the moral judgements that go with them.

I also want to make it absolutely clear that I am not trying to encourage a cynical view of codes of engineering ethics. Among other things, they relieve the engineer of the difficult task of making her own moral judgement anew on every issue. The codes have been thought out, and part of being a professional engineer is to abide by the profession's rules. Scepticism is fine, so long as it is healthy, for there is probably nothing that cannot be improved.

Defining Engineering Ethics
It makes sense to start with a definition: not of the code, but of what we mean when we think of it as a subject for study. According to Schinzinger and Martin (p8), there are three senses in which the term "engineering ethics" is used in their book:

 (1) Engineering Ethics is the study of moral values, issues, and decisions involved in engineering practice.

In this sense ethics refers to an area of inquiry: the activity of understanding moral values, resolving moral issues, and justifying moral judgements. The moral values take many forms, including ideals, character, social policies, etc. There are at least two other ways in which the term can be used:

(2) a specific set of beliefs or attitudes. This is the sense in which we identify our "codes of ethics", for example. In fact, it is in this context that we are mainly interested. The distinction between ethical standards and legal ones (laws) is particularly significant.

(3) "ethical" can be used as the equivalent of "morally correct", according to some accepted moral principles. We say sometimes that an action may or may not be "ethical". "In this sense, engineering ethics amounts to the set of justified moral principles of obligation, rights, and ideals that ought to be endorsed by those engaged in engineering." (S&M p9). This usage is rather more general than the specific principles embodied in a code of ethics, say.

What is morality?
In the above, and in many of our discussions, we have used the word "moral". What do the philosophers (and others) mean by this? This is a big topic, and we will spend a bit of time barely scratching the surface, including a brief look at some of the underlying theory. Morality is more than simply what "ought" to be done: one ought to wear waterproof footwear to prevent wet feet, but that is hardly an issue of moral conduct.

At an intuitive and common-sense level, we observe that moral reasons require us to respect other people as well as ourselves. This includes keeping promises, respecting rights, and avoiding cheating and dishonesty, for example. We can see at once that the characterization of what is "morally right" is by no means precise and clearly definable. Two individuals might well have differing judgements in the same situation.

At this point you might be wondering whether the engineering profession - or this class - intends to inculcate moral beliefs in you all, or to give you moral training. There is no such paternalistic objective. You are responsible for your own set of beliefs and standards. We do start from a supposed base of the worth of basic honesty, respect for others, and recognition of the "golden rule". (Not the version that says that the one who has the gold sets the rules).

Your attention is drawn to the illustrative cases in the text (p10) showing how moral issues arise in real-life situations. We have already met the first one in our motivational examples (except for the last sentence).

Here is another:

Find the moral issues:

(3) A chemical plant dumped wastes in a landfill.  Hazardous substances found their way into the underground water table.  The plant's engineers were aware of the situation but did not change the disposal method because their competitors did it the same cheap way, and no law explicitly forbade the practice.  Plant supervisors told the engineers it was the responsibility of the local government to identify any problems.

Answer: Some possibilities are:

 environmental stewardship (this is explicit in the CoE)
 safety of the public
 concealing risk to the public
 what is right vs what is legal
 loyalty to employer

Sometimes the application of ethical principles leads to a conflict between two sets of moral obligations. Such a situation is called a moral dilemma. For example, the duty of loyalty to an employer may conflict with the duty to make a report contrary to the employer's interest (as in the Lotsa Chrome case).

We have already referred to the now obvious fact that sometimes a moral question, or question of ethics, is not easy to settle. It can be just a question of vagueness about where to draw the line, e.g. what constitutes a real bribe. Then there are questions where a disagreement would exist between reasonable people involved in a moral issue; this might come about, for example, in a company where a decision has to be reached collectively and the issue is not clear-cut. Furthermore, even when one tries to apply a set of agreed rules such as a code of ethics, it may turn out that there are principles which conflict in that particular case. Invariably, one has to apply the rules as the particular circumstances require, and not in a dogmatic way.

Resolving ethical dilemmas requires the careful weighing of conflicting moral obligations.  The authors break the process into three related tasks:

Conceptual inquiries are those which attempt to clarify concepts. For example, what do we mean by safety and risk? These concepts will get some elaboration later; we make do with an intuitive idea for now. What does it mean in our code of ethics when it says that you should act "in good faith"? This sort of question is often closely connected to finding out what is normal, or usually accepted.

Factual inquiries are investigations into the facts of the matter. This is usually the simplest task: how much, when, where, etc. It is important, and must be done in any inquiry.

Interpersonal disagreements. Different people see things in different ways. Even among rational, well-meaning people, there can be different interpretations as to what is right and what is reasonable. That is why in a profession there has to be due process: the hearing of all sides of an issue, careful attention to facts (and details), and fair deliberation by experienced people, fairly chosen.

The authors suggest six steps to deal with moral dilemmas:

1. Identify relevant moral factors, conflicts, etc.
2. Gather all the facts pertinent to the moral factors.
3. Rank (i.e. "prioritize") the moral considerations, if possible.  For example, a duty to serious public safety would have to rank higher than a duty to an employer.
4. Identify the alternatives in solving the moral dilemma problem.
5. Discuss with colleagues.
6. Come to a reasoned conclusion.

To put it succinctly, these are the steps to apply in any inquiry, for example if you are being asked to decide on a particular case.

Example: Consider the plant engineer responsible for the chemical plant described in example (3) above. Is his or her behaviour ethical? Use the APEGN Code of Ethics.


The facts:
-Due to the plant's disposal practice, hazardous substances are polluting the groundwater.
-The engineer and others involved are aware of this.
-The disposal method is cheap, and competitors are doing the same thing.
-There is no law which explicitly forbids the disposal practice.
-The engineer's supervisors said it was the government's responsibility to identify any problem.

Articles in CoE bearing on the case:

1. ..integrity, etc.  Permitting hazardous environmental practice and not reporting it violates integrity.

2.  proper regard for health and safety of the public...

5.  not be associated with enterprises contrary to the public interest..

9.  proper regard for environment.

10.  act for employer as faithful agent...

16.  fairness and good faith towards other engineers

Analysis. Permitting a hazardous environmental practice and not reporting it violates the first basic guideline of the code, regarding integrity. Specifically, the engineer is clearly in violation of articles 2, 5 and 9. The engineer could defend his behaviour by noting that articles 10 and 16 support it. The lack of action is serious, and a danger to the environment and to the public. The fact that the practice is legal does not change this. It could be that lawmakers are unaware that the practice is hazardous, but if the engineers are aware, that is not an excuse. Articles 1, 2, 5 and 9 have a much higher priority than loyalty to the company and to other engineers.

Conclusion. The engineer is in violation of the Code of Ethics.

The  "Professional" Engineer.
This might be a good point to note the discussion which the authors present on what it means to be a "professional". Davis (ref 4) also has some good points in this area. S&M set out useful criteria for professions such as engineering (p17); we have seen these earlier, at the start of the course:

Marks of a Profession:

1.  Advanced Expertise. The beginnings are founded in a good educational system, and built upon by experience and continual learning.

2.  Self Regulation. The group of people claiming to be a profession must be organized to set standards, generate and enforce codes of ethics, and represent members before the public. This authority for self regulation is embodied in an Act of the Provincial Legislature.

3. Public Good. The profession must serve some public good. In engineering, for example, we claim to protect the public by ensuring that public infrastructure is safely built, among other things. The question of just who constitutes the public can arise. Are fellow employee engineers the same kind of public as consumers of a company's product, for example? Generally, no. The engineer insists on a lower level of risk for the "innocent" consumer.

Employee Engineers
The question sometimes arises as to whether an engineer serving as an employee, and not taking direct responsibility for work beyond his or her employer, should be considered a professional in the sense of the criteria above, and should require registration with the profession. Some take the view that only engineers dealing personally with clients can claim to be acting as independent professionals, and that employee status prevents this.

In the opinion of most people, including me, this is an extreme view. Anyone can take personal responsibility for his or her work. A professional is one who never needs a boss.

With all this emphasis on the engineer as a responsible agent, having moral autonomy, being  accountable, and so on, what role do codes of ethics play? We have already discussed these to some extent, but it is of interest to hear what S&M have to say. They take this topic up beginning on p18.

Shared Standards. We have already seen how easy it is for reasonable people to have different views on moral issues. It is very important that all members of the profession, and the public, know what the accepted standards are.

Support. One of the very real advantages of the code of ethics is to give the engineer support in making ethical decisions. The legitimate appeal to the professional group - e.g. "to do that would be professionally unethical, i.e. against the code of ethics of my profession" - is a persuasive argument.

Guidance. A code of ethics - certainly the ones we have seen - is obviously written with the intention to inspire the reader and guide behaviour. The language is full of idealistic statements such as having proper regard for the health and welfare of the public, the integrity of the environment, and so on. The downside is that the grand language of these noble statements can be a bar to clarity, allowing vagueness to creep in and actually reducing the code's effectiveness.

Inspiration. Because the code is recognized by all members, it gives a motivation for the type of behaviour described, or at least implied, in it.

Education and Mutual Understanding. We are using codes for this very purpose at the present time. A code's standing as a formal document approved by the licensing body makes it a valuable standard, with acceptance by members, governments and the public.

Deterrence and Discipline. From the professional association's point of view, having the code written down and distributed is a deterrent to unacceptable behaviour of members. Contravention of the code can lead to penalties ranging from reprimand (which might include public notification) to the cancellation of membership and loss of the right to practice as an engineer. By the way, the right to practice cannot be revoked in the US by the profession alone (p19, S&M). In Canada, this is under the control of the profession in all provinces. The application of discipline is handled by a peer group, with proper due process and usually, legal advice.

Contribution to Profession's Public Image. The acceptance and standing mentioned above helps the reputation of the engineering profession in the public eye, and is essential in retaining its right to be "self regulating". It is very important to note that this standing can only be earned; it is not granted. It may be taken away by the public (and government) if the performance of the profession does not meet the standards being claimed. By the way, surveys consistently show that engineers have a very high public rating for trustworthiness and dependability.

Limitations and Abuse of Codes.

Protecting the Status Quo? Codes can also have negative effects, and protecting the status quo is one of them. We can also run into the syndrome of "minimal compliance". (Remember the lifeboats on the Titanic?)

Once the rules have been laid down in a code, the effect can be to discourage dissent and improvement. The status quo is not always good enough. There have been cases where engineers have been disciplined by the profession, when in fact, they have been serving the public interest. (See the text, p19, referring to a 1932 case) It is not hard to see how this could happen. In the APEGN code, for example, the requirement to keep company data confidential could be a problem for someone who felt they should reveal it to a regulator, in response to what they judge to be a public health hazard. S&M have an interesting paragraph (p19) on this subject which goes as follows:

"Probably the worst abuse of engineering codes in the past has been to restrict honest moral effort on the part of individual engineers in the name of preserving the profession's public image and protecting the status quo."

S&M go on to say: "Preoccupation with keeping a shiny public image may silence the healthy dialogue and lively criticism needed to ensure the public's right to an open expression. And an excessive interest in protecting the status quo may lead to a distrust of the engineering profession on the part of both government and the public. The best way to increase trust is by encouraging and aiding engineers to speak freely and responsibly about the public safety and good as they see it. And this includes a tolerance of criticism of the codes themselves, rather than allowing the codes to become sacred documents that have to be accepted uncritically."

Promoting Business Interests? This commercial objective is clearly contained in codes. We have already mentioned the general dislike of competition on the basis of price, although one can argue that its inclusion is for the protection of the public. In fact, clauses such as our 21 have not been permitted in the US since 1979, when a court decision struck them out on the basis that they are in restraint of trade and not in the public interest. There is a tendency for the professional group to set up rules so that newcomers cannot displace the established members. For what is perhaps a somewhat cynical view, consider a quotation from a senior American engineer, cited in an article in one of our references: "ethics are rules old men make to keep young men from getting any business." [2]
For an example of a very simple code including penalties for infractions, see...

Babylonian Building Code (1758 B.C.) (Hammurabi code)

If a builder has built a house for a man and has not made his work sound, and the house which he has built has fallen down and so caused the death of the householder, that builder shall be put to death. If it causes the death of the householder's son, they shall put that builder's son to death.

 If it causes the death of the householder's slave, he shall give slave for slave to the householder.

If it destroys property he shall replace anything it has destroyed; and because he has not made sound the house which he has built and it has fallen down, he shall rebuild the house which has fallen down from his own property.

If a builder has built a house for a man and does not make his work perfect and the wall bulges, that builder shall put that wall into sound condition at his own cost.

We note no provisions for any loss of life except that of the householder, his sons, and slaves. We have made a little progress since then.


"The Social Responsibility of Business is to Increase its Profits" -Milton Friedman. (Nobel Prize winning economist)

The famous economist argues that managers must limit their goals to those authorized by stockholders, and that is to generate a maximum return on their investment within the laws of the land. To adopt other objectives - such as protecting the environment, providing for the employment of disadvantaged workers, or other social goals - is to violate the shareholders' trust. S&M think that Friedman's philosophy is ultimately self-defeating; i.e., that approach will not maximize profits.

"The engineer's problem has centred on a conflict between professional independence and bureaucratic loyalty, rather than between workmanlike and predatory instincts." Edwin T Leyton Jr.. quoted in Ethics in Engineering by Martin and Schinzinger, p236. Ref 1.

The usual context of engineering practice is within corporations. Layton's comment is a recognition of this. In other words, the problems usually arise because the employer has a different sense than the engineer of what should be done and how, possibly involving some moral judgement. It is not that the engineer is out to create shoddy work.

Ethical corporate climate. This provides a working environment where morally responsible conduct is fostered and encouraged. There should not be a conflict between business objectives and professional responsibilities. With engineers in management, as is often the case, the potential for this kind of tension should be recognized.

There are at least four characteristics of a good ethical climate identified by S&M:

1) Ethical values in their full complexity are widely acknowledged by management and employees.
2) Ethical language is used: e.g. in a corporate code of ethics.
3) Top management sets a moral tone in words, policies and by personal example.
4) Procedures exist for conflict resolution. E.g. ombudsman or identified resource people.

It is important to avoid the trap of legalistic procedures and solutions.

Good engineering, good business, and good ethics work together in the long run... (S&M)


Ethical theory provides a basis for identifying moral issues and making decisions on moral responsibilities. The authors tell us that during the past three centuries, three ethical theories have been especially influential:

(1) utilitarianism, (which further divides into "act utilitarianism" and "rule utilitarianism")
(2) rights ethics,
(3) duty ethics

We will now take up the main features of each of these in turn, observing especially how they differ in providing a basis for making moral judgements.

Truthfulness. Utilitarianism, duty ethics and human rights ethics all focus on principles about right and wrong conduct. These three types of "right action" ethics differ in what each takes to be the most fundamental moral concept. S&M use "truthfulness" as an illustration of a basic moral consideration which any ethical theory will have to take into account. Moreover, as the authors illustrate with several examples, it often turns up in cases of discipline, and is relevant to anything we do in engineering. They quote from the NSPE Code: "Engineers shall be objective and truthful in professional reports, statements or testimony. They shall include all relevant and pertinent information in such reports, statements or testimony." (Our own code has to depend on Article 1 for this particular thought, in a less specific form.)

This  "truthfulness rule" is affirmed by all major ethical theories, but for different reasons can be linked to the "bottom line" of utilitarianism, rights, or duty ethics.

Utilitarianism. This theory maintains that we should seek the greatest "utility", i.e. the best overall balance of good over bad consequences. These are the only moral considerations; everything reduces to the "greatest good for the greatest number". Truthfulness is understood in terms of its contribution to good consequences. The philosopher most identified with this theory is John Stuart Mill (1806-1873). He was an Englishman, and you will note that he lived at the time we identify with the Industrial Revolution. You will also observe that he was a contemporary of Brunel. Perhaps it is not surprising that this era would produce a "cost/benefit" ethical theory, and one that largely underlies our codes of ethics.

Philosophers identify two types or variations of this theory: "rule utilitarianism" and "act utilitarianism".

Codes of ethics are examples of the application of "rule" utilitarianism, in that accepted behaviour is codified into a set of rules. Rule utilitarianism focuses on overall consequences; individual actions are right when they conform to the rules.

Act utilitarianism focuses on individual cases, and is more flexible. Rules are open to modification, depending on the circumstances. The "truthfulness rule" is a guideline, not an absolute. It is not hard to imagine circumstances where the rule could not be absolutely applied. For example, suppose one is dealing with an extreme situation such as a kidnapping. One would hardly be expected to be bound by the truthfulness rule when dealing with the hostage takers.

Rights ethics. This is an older theory than utilitarianism, and is identified with John Locke (1632-1704), another Englishman. Human rights, not good consequences, are fundamental. Actions which respect human rights are obligatory, regardless of whether they maximize good. Truthfulness follows because trust is essential in order to be able to exercise liberty. These ideas had a strong influence on the two great revolutions that followed in the latter part of the 18th century in America and France, and the American Constitution is built on the foundation of individual rights.

Again we can identify two branches. The first is the strongly individualistic version, called Liberty Rights. These place a duty on others not to meddle in one's life. There is a strong sense of independent action. The viewpoint is sometimes identified as "libertarian", and takes a dim view of taxes and too much government.

The second version of rights ethics is more oriented to the collective right of the community (Welfare Rights). The logic is that having moral rights is grounded in a concern for others, and in being accountable to the community in which you live.

Duty Ethics.  This can be viewed as the "mirror image"  of rights ethics.  By this is meant that one has a duty because another has a right. For most individual rights, there are corresponding duties incumbent on others.  For example, the right not to be deceived puts a duty on another to be truthful.

Duty ethics emphasizes one's responsibility to respect persons (including oneself, by the way). A principle of duty must be applicable to everyone; that is, everyone is expected (generally) to accept it. One respects another person's individuality and rationality, as one expects them to respect one's own. Also, there cannot be qualifications or conditions on a duty. For example, it does not make sense to have a duty to be honest if it only applies when convenient. Again for example, the statement "If you want to improve your health, it is your duty to stop smoking" does not constitute a valid duty principle, because it depends on whether or not you want to improve your health.

The name most associated with duty ethics is Immanuel Kant (1724-1804).  Right actions are governed by duties to others and ourselves. According to Kant, each duty expresses respect for persons.

A list of duties from a modern philosopher: 1) don't kill; 2) don't cause pain; 3) don't disable; 4) don't deprive of freedom; 5) don't deprive of pleasure; 6) don't deceive; 7) keep your promises; 8) don't cheat; 9) obey the law; 10) do your duty (work, family, etc.).

One difficulty is confusion between 1) rules being universally applicable to all rational agents, and 2) rules being exceptionless. Some duties have to come before others, and exceptions have to be allowed for.

Example: Problem 1(c), p47.

Robert is a third-year engineering student who has been placed on probation for a low average, even though he knows he is doing the best work he can. A friend offers to help by sitting next to him and "sharing" his answers during the next exam. Robert has never cheated on an exam before, but this time he is desperate. Should he accept his friend's offer?

Question: Apply utilitarianism, duty ethics, and rights ethics in resolving the moral problem. Do the theories lead to the same or different answers to the problem?

Applying utilitarianism. In this case we identify the good and bad consequences for everyone, and assess the balance.

Good consequences:
Might pass exam.
Might get off probation.

Bad consequences:
Might be discovered and disciplined, etc.
Friend would be implicated.
Loss of knowledge of what his independent performance would be.
Loss of self-confidence.

Applying duty ethics
There is a duty not to cause risk to others (his friend).
There is a duty not to deceive (those who might put trust in the results of the exam).
There is a duty not to disappoint those who expect him to succeed (e.g. parents) (??) (Note that this "duty" is in favour of cheating.)
There is a duty of realistic self-evaluation (he is working as hard as he can).
There is a duty not to disturb an exam (which would happen if he were discovered).

Applying rights ethics  (to Robert and to others)
Robert has a right to choose his own success strategy (?)
His friend has a right to avoid the risk, and possibly guilt.
His friend also has a right to use his own mind - he made the offer.
Examiners have a right not to be undermined in their assessments.
Society has a right to be able to rely on the educational standards for engineers.
Fellow students have a right to an undisturbed exam.

Conclusion? The balance is in favour of saying no, thank you.
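Purely as a caricature of the utilitarian "balance", one could tally the consequences above with numerical weights. The weights in the sketch below are invented for illustration only; no one claims that moral judgement actually reduces to such a sum.

```python
# Toy illustration of act-utilitarian weighing for Robert's dilemma.
# The utility weights are invented for illustration; moral judgement
# is not actually reducible to arithmetic.
consequences = {
    "might pass exam": +3,
    "might get off probation": +2,
    "risk of discovery and discipline": -5,
    "friend implicated": -3,
    "no measure of independent performance": -2,
    "loss of self-confidence": -2,
}

# Sum the good (positive) and bad (negative) consequences.
balance = sum(consequences.values())
decision = "decline the offer" if balance < 0 else "accept the offer"
print(balance, decision)  # a negative balance favours declining
```

Notice that the conclusion is driven entirely by the weights chosen, which is precisely the subjective element mentioned below.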

We will observe that there is no answer "in the back of the book" for these questions. You may have some different ideas. There is bound to be some subjectivity.

Sometimes the application of different theories leads to different conclusions. For example, suppose the question is whether seat belts should be worn. Utilitarianism would support it. Rights ethics probably would not.

Testing and refining ethical theories. Now you ask: which of these theories is best? The authors of the text go on to say that they do not intend to evaluate them, since they all have insights to offer and complement each other. It would take a professional philosopher, which I am not, to explain all the nuances and details of these theories, together with the history of their development by the great philosophers of the world.

They do offer a set of conditions that any ethical theory must satisfy to be useful:

1. Clear, coherent and applicable.
2. Internally consistent.
In other words, none of its basic tenets can contradict each other.
3. Not reliant on false information.
4. Comprehensive.
That is to say, it must be capable of providing guidance in specific situations of interest to us.
5. Compatible with our moral convictions.
This last one is the most important, since any ethical theory which supports a morally repugnant act must be very questionable.

"Ethical theories are developed to illuminate, unify and correct common-sense judgements; and refined common-sense judgements about specific situations are used to test ethical theories." In other words, it is an interactive process.

 VIRTUE ETHICS.  The three major theories already examined were described as "right action" theories.  They focused on what kind of action was best.  Another ethical theory does not look at it that way at all, but emphasizes the ideals of good character.

It is the oldest type of ethical theory, going back to Aristotle, (384-322 BC) and occurring in many of the world religions. It seeks balance between extremes of conduct, emotion, desires, etc. This theory can also be used as a basis for professional ethics.

The most basic and comprehensive professional virtue is professional responsibility. The authors call this an "umbrella virtue" which encompasses many others which may apply to various situations. Some of these others are what they call self-direction virtues, such as self-assessment, humility, courage, self-respect, and integrity. Secondly, there are public-spirited virtues. These focus on the good of clients, and the general public. Striving for public safety and welfare come into this category. Third, there are team-work virtues. This includes loyalty to employers, respect for colleagues, and leadership. Finally, we can identify proficiency virtues, such as competence, diligence, and creativity.

OTHER VIEWS - Just a few words

Pragmatism (William James, John Dewey) This theory is somewhat like utilitarianism, but as the name suggests, allows more flexibility in applying moral rules, in line with the circumstances. S&M say that it "carries the danger of paying insufficient attention to moral principles through immersion in specific practical contexts."  They further suggest that nevertheless, in the right spirit,  it could be seen as an extension of utilitarianism and the others we have talked about.

Customs and ethical relativism. No doubt you have sometimes read or heard of companies operating in foreign countries "where standards are different from ours" and where it is argued, for example, that it is ok to provide bribes and kickbacks, or to exploit child labour, because that is the way business is done there. This would presumably be thought, by the same companies, to be unethical in this country. So what we have here is a case of what we can call ethical relativism. This theory equates moral correctness with acceptance by laws and customs of the society. S&M argue that this is an indefensible theory in general, and I agree.

It must, nevertheless, be recognized that customs and laws have to be taken into account when making moral judgements. Although the context of time and place will influence how basic moral principles are applied, the principles themselves may well hold, if sometimes in a way that might not be apparent in our own society. We should not confuse this moral relationalism with ethical relativism.

Religion and divine command ethics. Another view that the authors do not support is that moral values depend on a particular religion. This is referred to as divine command ethics. It is recognized that religion may be personally important, but morality can exist in its absence.

Self-interest and ethical egoism. All the ethical theories recognize the importance of self-interest, but also require that it be balanced by moral responsibility to other people. (Then we call it "enlightened" self-interest.) But there is a view, known as ethical egoism, which reduces morality to the rational pursuit of self-interest alone, and maintains that in professional and business affairs, overall good will result. The authors use Adam Smith (remember him?) and the present-day Nobel Prize-winning economist Milton Friedman as examples of ethical egoists. The authors (and I) argue that surely morality requires a willingness of individuals and corporations to place some restraints on the pursuit of private interests. One of the beneficial outcomes of studying engineering ethics is to help uncover the moral limits on the pursuit of self-interest in the profession of engineering.

Meaningful Work and Professional Commitment.
S&M conclude their introduction with a short discussion on motivation for professional work such as engineering, and the commitments that go with it.  They identify three categories:

Craft motives.  Engineering is interesting and challenging work.

Moral Motives.  The idea that engineers can help society and do good for people is often a factor.  See Florman "The Existential Pleasures of Engineering"

Compensation and self interest.  There is nothing wrong in trying to make a good living.

GILBANE GOLD usually shown here.  Video, study guide, and overheads. (One or perhaps two lectures)


You will have noted the dominance of safety issues in many aspects of engineering ethical problems. Safety, particularly public safety, is paramount in much that engineers do. Chapters 3 and 4 of S&M have this as an underlying theme. As soon as we begin to consider safety in any quantitative way, we have to deal with the more formal notion of risk. Engineers deal with risk on a daily basis, and even when every effort is made to reduce it to a minimum in design and operation, we know that it is never zero. The problem is that we never have the luxury of complete knowledge. Judgements have to be made, and sometimes mistakes occur. We also know that sometimes an innovation might lead in a totally unexpected direction, or have unanticipated consequences.

It is for these reasons that the authors compare engineering to an experiment, calling it "social experimentation". This ties in nicely with our earlier study of the interaction of technology and society, except that Postman's interest was mainly in the undesirable outcomes.


Titanic. Sometimes engineering is guilty of overconfidence, perhaps forgetting that things that are untried - which much of engineering must deal with - do constitute an experiment. S&M illustrate this overconfidence and lack of preparation for the unexpected with a very brief account, at the beginning of Chapter 3, of the Titanic disaster of 1912. Over 1500 people perished.

The builders were so confident of the Titanic's safety that she carried lifeboats for only about one quarter of the passengers and crew. This was all that was required by the regulations, which did not anticipate ships of her size. What was the responsibility of the engineers here? There were many other mistakes made which might have increased the likelihood of the tragedy which took place, but there is no escaping the fact that there were not enough lifeboats.

This situation is what we sometimes call "minimal compliance", and it constitutes one of the greatest moral problems in engineering, according to S&M (p88). Obviously the engineer must obey the law, but this does not replace moral responsibility if engineering judgement indicates that the law is insufficient.

The Titanic had many innovative features - the most significant in this case being its increased safety due to multiple watertight compartments. The authors use this case of overconfidence and minimal compliance to dramatize the proposal that had the engineers looked upon their innovation as a "social experiment", they would not have been so likely to make some of the tragic errors.

Engineering as experimentation.

Overall, any engineering project may itself be viewed as a kind of experiment. The design process always involves proposing alternatives, trying out ideas, and iteration.

Innovation as a social experiment. The authors recognize that the engineer working in a firm is not the sole experimenter. But their expertise places engineers in a unique position to know what is going on and what is likely to happen. Pursuing the model of social experimentation, S&M suggest that this implies four elements if engineers are to be "Responsible Experimenters":

1. A conscientious commitment to live by moral values.

This is very high-sounding, and very broad in its application, but what exactly is meant? It must go beyond a preoccupation with narrow self-interest. It implies a consciousness, or awareness, in considering and acting upon the full range of moral values which bear on a given situation. This does not fall naturally into place in the working environment of many engineers. Ninety percent are employees, and while corporations may be run by totally moral executives (or sometimes not), there is great pressure for an engineer to be a team player.

In fact, as we have seen, it is in keeping with our ethics codes to be a "team player", quite apart from the normal pressures which come from trying to ensure personal job security, advancement in the company, and professional development. Looking at it in terms of "duty ethics", the minimum "negative duties" - for example, the duty not to falsify data, not to breach confidentiality, etc. - can become the full extent of moral commitment. In other words, we then have minimal compliance.

On the other hand, a conscientious commitment to moral values does not mean that you must force your own views of social good on society. Moral values, by definition, have to be those which are widely shared.

I should not leave this discussion without pointing out that the idea of the engineer as a social guardian is not without controversy. I have distributed an article written by Samuel Florman in 1978 (Moral Blueprints) which argues that this social responsibility is overstressed. He puts more faith in a distributed moral responsibility, in companies and other institutions, and reasonably suggests that it is expecting too much of the employee at lower levels to look out for the ultimate use (or misuse) of everything he or she might work on.

All we can do in a course of this kind is give you an introduction to the issues and the intellectual tools which can be used to deal with them, together with a few case studies. In the end you have the right to develop (and no doubt will develop) your own set of rules and your own personal slant on moral philosophy.

The next element implied in being what S&M call a "responsible experimenter" is:

2. A comprehensive perspective, or continual awareness of the possibility of unforeseen events, etc.

This is a principle which argues against the "compartmentalization" of knowledge and responsibility. It is fundamentally integrity, or "wholeness". One should look at the broad picture, and take all the facts into account. Is there an undesirable side effect of the change you are proposing in the design or system? Perhaps no one but you has thought of it. As S&M say, you should not leave it to the sales department to let the customers know - if they even ask, which they probably won't.

3. Moral autonomy.

This means taking personal responsibility. One cannot say "I'm only doing my job; my employer is responsible". That, in my view, is to abandon the idea of being a professional. It is certainly true that the attitude of management in a company will be a large factor in determining how much "moral autonomy" an employee engineer can exercise. Keep this in mind when you become managers. Other practical aspects of operating a business, such as competition and deadlines, have to be considered; one has to be realistic while not compromising one's principles.

4. There must be accountability. This means that in the end, if it is your mistake, the buck stops at your desk. (Remember Isambard Kingdom Brunel and his atmospheric railway?) But accountability is not only this narrow sense of accepting blame when it is properly placed at your door. There is also the acceptance of review, of just criticism from peers, and the willingness to defend your actions morally when appropriate. Working within a firm, under an employer's authority, can be interpreted as a reason for a narrowed accountability.

It is of some interest to note that here we find another mention of the work of Stanley Milgram, whom we became acquainted with in Postman's discussion of social science. You will recall that Milgram concluded from his experiments that, if instructed by an authority figure, people tend to do things that they otherwise would not. (Postman thought this was an obvious common-sense result being cloaked in scientism.) S&M observe that this tendency to abandon accountability if one is under someone else's authority is common in the professions.

Challenger case study usually done at this point. Video by Roger Boisjoly, and other material based on material developed at MIT and available on the internet. (Two lectures).


Concepts of Safety and Risk. We have seen that safety, particularly public safety, is a key responsibility of engineers. In terms of "rights ethics", it derives from the right of the public not to be endangered without a clear prior warning. This last part about the "clear warning" relates to the subtle point that if a risk is acceptable, the product is deemed safe. Safety does not imply zero risk; it implies acceptable risk.

In defining a safe product as one where there is acceptable risk, we are not really getting anywhere, for now we have to define what we mean by "risk". Furthermore, how low must the risk be for the item or process or whatever it is to be considered safe? Not only that, but we can see at once that it depends on the context in which the risk occurs. Using a sharp knife to slice bread is more risky if it is being done by a child than by an adult. So we cannot attach some unconditional "risk number" to the knife, although the design can influence the degree of risk to which any user is exposed.
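When risk is quantified, a common convention in engineering risk analysis (a standard convention, not S&M's definition) is to treat risk as expected harm: the probability of an incident multiplied by the severity of its consequence. A minimal sketch, with invented numbers, shows why the same knife carries a different risk in different hands:

```python
# Sketch of the standard risk-analysis convention: risk = probability x severity.
# The probabilities and severity used here are invented purely to illustrate
# that risk depends on the context of use, not on the knife alone.
def risk(p_incident: float, severity: float) -> float:
    """Expected harm per use: probability of an incident times its severity."""
    return p_incident * severity

SEVERITY_CUT = 10.0                      # arbitrary harm units for a bad cut
adult_risk = risk(0.001, SEVERITY_CUT)   # adult slicing bread
child_risk = risk(0.05, SEVERITY_CUT)    # child using the same knife

print(adult_risk, child_risk)  # same knife, very different risk
```

The design of the knife can lower the probability term for every user, but no design choice can assign the knife a single context-free risk number.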

The notion of risk is fundamental to an engineer, and it is not simple. Let me quote from the introduction to a marvellous book by Peter L. Bernstein: [9]

"What is it that distinguishes the thousands of years of history from what we think of as modern times? The answer goes way beyond the progress of science, technology, capitalism, and democracy... The revolutionary idea that defines the boundary between modern times and the past is the mastery of risk: the notion that the future is more than a whim of the gods and that men and women are not passive before nature" .

Later on in the same chapter, Bernstein tells us something of the etymology of the word "risk". It derives from the early Italian risicare, which means "to dare". In this sense, risk is a choice, rather than a fate. This is interesting, since well back in this course I pointed out that engineers make choices. Of course all people make choices, but choice is an ever-present part of the work of an engineer.

The recognition that risk is inherent in engineering is, no doubt, behind the ideas of Schinzinger and Martin in the previous chapter, where they suggest that engineers should consider themselves to be "social experimenters", and take the necessary precautions for the unpredictable.

Engineering decisions may affect many people. For example, a technological direction might be set for society - perhaps unconsciously - or choices of design methodology and materials may establish how safe a structure will be, or how environmentally friendly a manufacturing plant will be. Generally speaking, the public does not know what choices, or implied risks, are involved. The engineer takes professional responsibility, so it is very important for him or her to understand what risk means. And not only must we know, but we have a responsibility to explain it to the public. Otherwise, we cannot properly discharge our duty of safety to society.

Our text is a little less sophisticated and perhaps a bit more pragmatic than Bernstein's book. S&M defines risk as "a potential that something unwanted and harmful may occur." (p110). Their definition of safety is a bit more wordy: "A thing is safe if, were its risks fully known, those risks would be judged acceptable by a reasonable person in the light of settled value principles."

The words "settled values" refer to the need to have an agreed level of acceptable risk according to the moral values of the person or group. There needs to be an "external" point of reference, i.e. external to the individual. Safety is therefore a matter, in the authors' view, of how people would find risks acceptable or unacceptable if they knew the risks and were basing their judgements on their "most settled value" perspectives. These ideas are only partly objective; the subjective element is clearly there. Safety is therefore also a hypothetical matter, since it depends on a conditional knowledge of risks, which generally will not be complete.

Not only that, but how we judge risk depends on how the question is put. Human perception is not as rational as we might expect. (Recall Postman (Technopoly) on this point.)  A fascinating example is given in the text (p112).

The imaginary scenario is that a country is expecting a serious outbreak of disease, which, untreated, is expected to kill 600 people. A group of 150 people is given the choice of two strategies, and it is to be assumed that the results will be as follows:

Choices as put to Group 1:

Program "A" will result in 200 people surviving.
Under Program "B", there is a 1/3 probability that 600 people will be saved, and a 2/3 probability that there will be no survivors.

Results for Group 1:
72 % chose "A". The reason was apparently that the certainty of saving 200 under "A" was perceived to be more acceptable to most people than the gamble of saving everyone under "B".

A second group was given the information in the following way:
Under Program "C", 400 people will die.
Under Program "D" there is a 1/3 probability that no one will die, and 2/3 probability that all 600 will die.

Results for Group 2:
Now 22 % chose "C", and 78 % chose "D". Note that "A" and "C" are identical, as are "B" and "D".


One way of looking at the result (the first case) is that options perceived as yielding firm gains tend to be preferred over options in which even better gains are only possible.

The second conclusion (from the second case) is that options which emphasize firm losses tend to be avoided in favour of merely possible, if larger, losses.

In other words, people are more willing to take risks to avoid perceived firm losses than they are to win merely possible gains. They avoided the 1/3 chance of saving everyone in the first instance, but chose it in the second, even though the alternative was exactly the same in both cases: 400 certain deaths.
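Since the scenario is a probability question, it is worth checking the arithmetic directly. A minimal Python sketch (using the numbers of the scenario above, with exact fractions to avoid rounding) confirms that all four programs have the same expected outcome:

```python
from fractions import Fraction

# Expected number of survivors (out of 600) for each program in the
# disease scenario. A/C are stated as certainties; B/D as gambles.
TOTAL = 600

ev_A = 200                              # "200 people will survive"
ev_C = TOTAL - 400                      # "400 people will die" -> 200 survive

p = Fraction(1, 3)                      # probability everyone is saved
ev_B = p * TOTAL + (1 - p) * 0          # 1/3 chance all saved, 2/3 chance none
ev_D = p * TOTAL + (1 - p) * 0          # same gamble, phrased as deaths

print(ev_A, ev_C, ev_B, ev_D)           # 200 200 200 200
```

The frames are mathematically identical; only the wording of gains versus losses differs, yet that wording reverses the majority choice.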

I should make clear that this is an experimental result, as reported in a prestigious scientific journal, not the authors' speculation. You can find the reference in S&M. So we can take it that it is truly what we can expect from the population generally.

People thus treat the perception of benefit differently from the perception of risk. We will expend more effort to avoid perceived risk (per unit loss) than to gain perceived benefit (per unit benefit). This idea is captured in the expression we sometimes hear (but probably never question or justify) that most people are "risk averse". This obviously is not the same for everyone, and there are such things as risk takers, hard-core gamblers, and people who play video gambling machines with their grocery money.

Thus to establish what the people regard as "safe" is quite a complex undertaking. It depends on these rather irrational human characteristics. For example, people underrate the risk of familiar things - e.g. driving - and overrate that of others less familiar, e.g. flying. Perception of the risk associated with an activity can be greatly altered if something happens to someone we know, e.g. if a friend goes through the ice on a snowmobile you are likely to be much more conscious of the risk of crossing ice on your own machine.


Safety is no accident: good engineering design.  We have already noted that absolute safety, in the sense of zero risk to everyone under all possible conditions, is not attainable, let alone affordable. It will likely (but not always) cost more to manufacture safer products. I say not always, because safety is often determined by design, and sometimes innovative design can improve safety without increasing manufacturing cost. In fact, probably the most underused tool is good engineering design sense. Consider the two designs in the example on p127 (Fig. 4-5) of S&M: simple rewiring makes a safer design.

Although increased attention to safety is often accompanied by increased manufacturing cost, unsafe products carry economic consequences as well as ethical ones: high warranty costs, loss of customer goodwill, litigation, and so on. There is frequently a trade-off to be made between primary manufacturing cost and secondary costs which are a function of product safety. This is illustrated by Fig. 4-1, p117. The curve of primary design cost might decrease as risk is allowed to rise and product safety worsens, as shown in "P". On the other hand, secondary costs such as litigation and warranty cost, as discussed above, might well rise as shown in "S". The curve of total design cost, "T", has a minimum at "M". A manufacturer trying to reduce primary costs might design to the right of "M"; but for the same overall design cost, it could instead operate to the left of "M", at a point such as "H", with a safer product. That is obviously the better choice.
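The shape of this trade-off can be sketched numerically. The curve formulas below are invented purely for illustration (they are not from S&M); the point is only that the total cost, primary plus secondary, has an interior minimum like "M":

```python
# Illustrative (made-up) cost curves as functions of allowed risk r in (0, 1].
def primary(r):
    # design/manufacturing cost falls as more risk is allowed ("P")
    return 1.0 / r

def secondary(r):
    # warranty, litigation, lost goodwill rise with risk ("S")
    return 10.0 * r

def total(r):
    # total design cost ("T")
    return primary(r) + secondary(r)

# Scan for the minimum-total-cost risk level (the point "M").
risks = [i / 1000 for i in range(1, 1001)]
r_min = min(risks, key=total)
print(round(r_min, 3))   # minimum near r = 1/sqrt(10) ≈ 0.316
```

Any point to the left of the minimum buys more safety at the same or lower total cost than its mirror point on the right, which is the argument for choosing "H".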

Uncertainties in Design  It is easy to accept in principle that safety must always have a high priority in design. But many uncertainties about eventual use of a product, operating conditions, materials, or even design data reliability can undermine the objective. One of the traditional ways to cope with these unknowns is to make reasonable assumptions, and then build in a "factor of safety" - or "ignorance factor" (for the cynical). But a factor of safety is effectively overdesign, and that can be costly. Competition and over-confidence push safety factors downward, until eventually there is a failure.

A "factor of safety" can also be quite ineffective if the assumptions regarding the nature of use of the product turn out to be invalid, for example if something is designed for an expected load environment, but is in fact exposed to something which affects the structure (say) in an unexpected way. A good example is the effect of wind inducing unexpected torsional loading on the deck of the Tacoma Narrows bridge (1940), which I expect you have all seen demonstrated.
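The traditional check itself is simple enough to sketch. A minimal Python illustration (the capacity and load figures are invented) of how a safety factor works, and how an unanticipated load consumes the margin:

```python
# A conventional safety-factor check: the design is acceptable when the
# capacity exceeds the expected demand by the chosen factor of safety.
def is_safe(capacity, demand, factor_of_safety=2.0):
    return capacity >= factor_of_safety * demand

print(is_safe(capacity=500.0, demand=200.0))  # True: FoS of 2 is met
print(is_safe(capacity=500.0, demand=300.0))  # False: load assumption wrong, margin gone
```

The check is only as good as the demand estimate: if the real loading differs in kind (torsion instead of bending, say), the numerical margin may never come into play at all.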

Risk Benefit Analysis  This tool is often used for large public projects.  For example, although the mathematical analysis may not be undertaken, in a project like the development of an offshore oil field the impact assessment required by government is of this sort, at least intuitively.  The public is consulted about what they see as risk, and what benefit.

The process can be formalized, using probabilities estimated for future anticipated events and economic returns.  Some of these are easier to calculate than others.  How much value is placed on the 0.01 % chance of a large oil spill, in order to compute the financial risk?  And what do we use for an interest rate to calculate present values of risks or benefits far into the future?
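The discounting question can be made concrete with a small present-value sketch. The amounts and rates below are invented for illustration only:

```python
# Present value of a cost incurred `years` from now, discounted at `rate`:
#   PV = amount / (1 + rate)**years
def present_value(amount, rate, years):
    return amount / (1.0 + rate) ** years

# A hypothetical $100M cleanup cost 50 years out, at two discount rates:
pv_low = present_value(100e6, 0.03, 50)   # roughly $23M at 3%
pv_high = present_value(100e6, 0.08, 50)  # roughly $2M at 8%
print(round(pv_low), round(pv_high))
```

The choice of rate changes the answer by an order of magnitude, which is why the interest-rate assumption in a risk-benefit analysis is itself a value judgement, not a mere technicality.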

The authors say there is a need in today's society for some open process for judging the acceptability of potentially risky projects.  (Perhaps we already have that in the case where the environment is at risk.)

The ethical question is: "Under what conditions is someone in society entitled to impose a risk on someone else on behalf of a supposed benefit to yet others."  (p122)

Public Risk and Public Acceptance:  How much value on a life?  The problem of answering this question is obviously full of difficulty.  For a population, one can use statistics such as insurance payouts, the premium paid for hazardous work, or whatever quantifiable data there is.  Examples:

National Highway Traffic Safety Administration (NHTSA): Maybe $200,725?

Research on what would be acceptable: ~$8 million

Court awards from a plane crash:

Auto crash ('79 Malibu gas tank explosion; GM sued):
$107 million compensation, $4.8 billion punitive damages.

Obviously, court awards vary widely.

Case Studies  The student is encouraged to read the descriptions of engineering cases at the end of Chapter 4, and in many other places.

Three Mile Island, USA.  (1979) A nuclear power plant that had a technical failure compounded by incompetence, and disaster was narrowly avoided. No deaths or injuries, but over a billion dollars in decommissioning costs, and the effect on the perceived risk of nuclear plants led to a sharp downturn in the nuclear industry.

Chernobyl, Ukraine. (1986) Poorly planned and executed technical testing led to a violent explosion.  31 died at once, but by 1992 thousands of deaths had been attributed to the accident.

The Citicorp Tower. (1978) This was an innovative 59 storey building in New York, supported on pillars over a church.  A question from an engineering student started the structural engineer thinking, and upon checking, he found that not only had he underestimated the wind loading, but a factor of safety which he might have counted on had been eliminated by a design change during construction.  He took the ethical route, alerted those involved, and made provision to protect people in the event of a big windstorm while the repairs were made.  The repair cost was over $12 million, of which the engineer's insurance picked up $2 million.  There was no litigation.

Designing for failure

It is unrealistic to expect that engineering projects will never fail. The best that can be done is to ensure that when they do, they fail safely, without injury to people. To this we can add the requirement for safe abandonment, i.e. the failed product or process can be discarded or stopped without hazard to others, property, or the environment; or safe escape from the danger, e.g. from a sinking ship. Together, the authors denote this as the necessity to build in safe exit (unlike that in the Ocean Ranger), and they see it as an integral part of sound engineering. This amounts to proper management of risk, which is a key part of the responsible engineer's job.

The messages of Chapters 3 and 4.


Both employer and employee have rights and responsibilities.
Employers have a right to expect:

The employee engineer also has certain rights:

This chapter deals with some of these issues, starting with issues of responsibility, confidentiality, and conflicts of interest.

Confidentiality.  This is one of the central duties of a professional, and widely acknowledged.  Some would define it to mean anything that comes from the employer during employment, but the authors think this is a little broad. But the employer has the institutional right to identify confidential material on a reasonable basis, and S&M suggest that most would agree to the following:

...any information that the employer would like to have kept secret in order to compete effectively (S&M)

There are a couple of related terms. Privileged information refers to material that is available only to those with a particular authority, or privilege. For example, it could mean only the members of a project group.  Proprietary information is what is owned by the company (it is the proprietor).  It is a "trade secret", and the law makes it illegal for an employee to divulge it, even if they have gone on to a new job.

Another way a company might choose to protect technical matters is through patents.  A patent legally forbids others from copying the invention, but a patent is public, and sometimes the idea can be used without contravening the patent.

Confidentiality can emerge as a problem when changing employment.  For example, how can an employee who moves to a new job use his/her experience, if nothing can be based on the old job? In general this restriction is not so constraining that one cannot use the results of professional experience, but if material known to be a trade secret is revealed, there could be a lawsuit. The moral right of the employee to career advancement has to be balanced against the right of the company to protect its competitive position. Sometimes the court has to decide.  See a couple of good examples in the text.  In one case mentioned, a GM executive went to Volkswagen, which eventually settled out of court for $100 million cash, plus other considerations for the company. In an older but well documented case, a man by the name of Donald Wohlgemuth left B.F. Goodrich to manufacture space suits with a competitor. Goodrich sued, seeking an injunction to prevent him from working with any other company which manufactured space suits. But the courts essentially found in his favour, ruling that Wohlgemuth was not conveying specific trade secrets, but rather drawing on the general benefits of his professional experience, even though it was acquired at B.F. Goodrich.

Justification and management policies.  The need for confidentiality is not unreasonable for companies, and they have a right to protect their interests.  The rights of both companies and employees can be protected by sensible and open employment agreements.  For example, an employee can agree not to work for a competitor in the same business for a specified period of time after leaving the first company.  Some benefit may be offered for this undertaking, e.g. portability of a pension plan.

Conflict of Interest.   When an employee has an interest which might prevent them from meeting their obligations to the employer, there is a conflict of interest.  This could be personal, e.g. owning substantial stock in a competitor's company, or professional, e.g. taking on an assignment for a competitor, or serving on a regulatory committee which governs one's own business activity.

Our own code of ethics mentions this directly, and points out that such conflicts must be made known to the employer, and only pursued with permission.  The problem arises because having another interest different from your employer's can cloud professional judgement, which involves more than just following rules.  If there is a grey area, and you have a conflict of interest, you might be inclined to make a different judgement from what you would in the absence of that bias. It is not a defence to say that in the actual case there is no chance that judgement could be affected.  If a "typical professional" could be influenced, the conflict exists.


Gifts, bribes and kickbacks.  What constitutes a bribe?  We have had some discussion on this topic. Anything that constitutes a reward from someone other than the employer can influence how the employer's interest is discharged.

Interests in other companies.   Owning stock in a competitor's company could obviously be a problem.

Insider information.  Knowledge that is not available to those outside the firm in general could be used to favour friends or relatives - e.g. in investments.  When it comes to outside activities, no special advantage should accrue to the employee because of access to confidential matters.  Sometimes experts such as professors let their reluctance to possibly lose industry research support interfere with public service roles such as giving expert testimony.

Exercise 4, p155:  Scott Bennett and the Condo.


By this we mean the ethical questions which relate to the rights of engineers as employees. The rights we identify are in the category of "ethical rights" rather than rights under the law, although sometimes there is legal basis.  They identify the "moral high ground", and could be the basis for taking an action requiring some justification to someone in authority.

Professional employee rights.  If we look simply at an individual employee professional engineer, it is not difficult to identify rights that he/she should have by considering the various roles an engineer occupies, identified below.

As a human being, one has the fundamental right to life and pursuit of legitimate interests, such as work and career. There is also the right to non-discrimination on the basis of sex, race and age, and in many countries, such as Canada, these rights are also set in our laws.

The fact that an engineer is an employee also confers certain institutional rights. Examples are those which arise from employment contracts, e.g. the right to a salary, vacation, etc., in return for the work that you do. Then there are rights of the employee not to have to give up human and political rights, e.g. religious rights, political rights, etc. just because they are employed.

Finally, there are special rights which are a result of the professional role one has as an engineer, which we identify in more detail below.

Not every issue is a "rights" issue.  Before we get too far into this emphasis on rights, it is prudent to consider the fact that, as we have often observed, a professional life - or any life - is hardly ever black and white. We have seen how obligations, and therefore rights, can be in conflict, thereby creating moral dilemmas.  Circumstances can affect how much weight can be given to a particular right. The fact that engineers have a duty to public welfare, and therefore a right to blow the whistle in some cases, does not mean that every management decision you judge as not serving the overall good of society has to be taken to the local TV station - particularly a business decision which does not materially affect public safety. We will spend some time on the matter of whistle blowing shortly.

Professional Rights

We start with the last of the above categories, which is central to one's role as a professional engineer.

Right of professional conscience. This right is basic and generic. That is to say, it is fundamental, and forms the basis for several individual rights of the professional engineer. It is the moral right to exercise professional judgement based on knowledge, experience, moral values, and moral autonomy. It is what we have sometimes called a "negative right", or a "liberty right" in that it is a right not to be overruled or interfered with by others. It induces the duty in others not to interfere with its proper exercise.

Of course you should not get the idea that the right must be exercised in isolation. Most engineering decisions are complex, and especially those which have ethical components. Consultation and discussion with colleagues is a part of the responsible exercise of engineering judgement. Further, your right to this consultation, and perhaps access to employer resources essential to making such a conscientious decision, places an obligation on the employer to do more than simply not interfere.

Right of conscientious refusal.  But there does come a point where you consider drawing the line. Usually when this is considered, it will then have to be viewed in the light of one of two possibilities regarding prevailing standards:

(1) where there is a widely shared agreement in the profession that a proposed act is unethical. This is the easier of the two. Examples are forging documents, altering test results, lying, taking a bribe (but the definition of bribe can be a bit troublesome), padding a payroll, etc.

(2) where there is some doubt about the agreement among members of the profession on the ethical issue. This is obviously more difficult, and unfortunately, it is also more likely to occur than black and white cases.

Take a case where an engineer is designing a mine ventilation system in a developing country where regulations are not equivalent to those in Canada, say. The mine management might be prepared to accept a lower standard, and insisting on Canadian standards might put the project out of reach, so that much needed employment might be lost.

An engineer's right to refuse assignments does not put a duty on the employer to keep him employed if there is nothing else for him to do in the company. For example, if the company is building military aircraft and the engineer decides that he cannot ethically work on machines of war, because it is contrary to his sense of duty to the public, there may be no alternative but to look for a new job.

So the right to refuse assignment is there, but it is conditional. There is some legal recourse against wrongful dismissal, but it will not likely cover all cases based on ethics.

Right to recognition  S&M state without reservation that "engineers have a right to professional recognition for their work and accomplishments." This includes reasonable payment of salary or fees, and other more subtle forms of recognition. For example, someone who works hard only to have someone else take the credit is being abused and demeaned. There is a right to fair treatment.
Employee rights.   A professional employee has certain rights that result simply from the employment contract, formal and implied.  Large corporations ought to recognize a basic set.
(a) contractual, e.g.
  to a salary
  to a vacation
(b) non-contractual, e.g.
  to choose legitimate outside activities
  to privacy and employer confidentiality
  to due process (in the event of disputes)
  to non-discrimination and absence of sexual harassment at the workplace

Collective bargaining.  (see discussion topic  5 p166.)

This issue is not specifically mentioned in our code of ethics, although as we have mentioned, it is in others. Two arguments have been used for inclusion of terms against "coercive collective action" by engineers. One is the "faithful agent" argument, and the other is based, as is so much, on serving the public good. But there are "public good" arguments on the side of collective bargaining, and there is no open and shut case that shows that unionism and professionalism are incompatible. S&M presents both sides, and admits to complexity. This is an issue that depends on the particular situation. It might be more effective to argue on the basis of whether the bargaining strategy is effective, rather than to make it a moral issue.  Modern practice has led to the conclusion that there is now no bar which can be enforced against collective bargaining for engineers.

Examples: (1) The Case of the Backward Math.

Jay's boss is an acknowledged expert in the field of catalysis.  Jay is the leader of a group that has been developing a new catalyst system, and the search has narrowed to two possibilities, Catalyst A, and Catalyst B.

The boss is certain that the best choice is A, but he directs that tests be run "just for the record".  Owing to inexperienced help, the tests take longer than expected, and the results show that B is the preferred material.  The engineers question the validity of the tests, but because of the project timetable, there is no time to repeat the series.  So the boss directs Jay to work backwards and come up with phoney data to substantiate the choice of Catalyst A, a choice that all the engineers in the group, including Jay, fully agree with.

What should Jay do, and does he have a moral right to not do as he is directed?

Analysis  The relevant facts:

(1) Jay's boss is an expert in the field.
(2) On whatever basis, presumably experience, he believes the right choice is "A".
(3) Jay and other engineers agree with the boss, and they also have expertise in the area.
(4) The validity of the tests is questionable.

Jay has been directed to falsify data. This is clearly contrary to the moral obligation of integrity in all he undertakes. He also has the moral responsibility to obey direction from his employer. But this falls into the area of clause 14 of the APEGN code, which deals with professional judgement being overruled.

On no account can he simply falsify data to support the choice of catalyst "A" in the face of testing that indicated "B", even if the validity of the testing is questionable, and even when they all agree that the result was unexpected.

He should then make the best possible case for re-testing. If that is unsuccessful, then to satisfy the boss's direction he can do the required calculations of "backward math", but his report must clearly document that these are the data that would be required to support the choice of catalyst "A", not what the testing revealed.

He should also report the actual data, and state clearly that he has no test data to support the choice of "A", and that any recommendation to the client in favour of "A" would be contrary to such data as they have. That is not to say that such a recommendation cannot be made to the client, depending on the risk exposure if a mistake is made, but the client would have to be made fully aware of the facts, and the basis for the recommendation.

(2) How far does this paramount obligation to protect the public go?

A design group develops a new electronic circuit to be used in clock radios that would extend their average life from five to seven years, at an increased manufacturing cost of only one percent. When this proposal is presented to top management, it is rejected on the grounds that it is not cost-effective, and they direct that further work on the design be terminated. Does the design group's obligation to the public outweigh the employer's directive?

Analysis  In this case it is reasonable to conclude that the employer's action is within his legitimate authority, and the moral obligation of loyalty (faithful agent) to the employer is more important than the possible benefit to the public. After all, it is a business decision. It might even be a bad business decision (if that is the designers' professional opinion, they should document it in a memo to the boss (Article 14 of CoE)), but it is the employer's right to make it.  For example, a competitor might well make such an improvement and take away market share.  The question of public safety has not arisen, and the effects of innovations such as these are very hard to predict. The engineer is taking on a lot to base a moral argument on presumptions of the good to the public in this situation.


When it comes to a course in engineering ethics, whistle blowing is one of the first topics to come to mind. This predominance is understandable, in that it is usually the most dramatic situation in which an engineer takes a stand against an employer, internally or externally.  It fundamentally conflicts with the duty of loyalty to the employer, and Article 10 of our Code of Ethics applies.  In terms of the application of ethics, the attention it gets is largely undeserved, because it really is a last resort.  Some of the questions are:

In its broadest sense, the notion of a "whistle blower" is not limited to an employee and his employer. Anyone who is aware of something they think is illegal or unethical can draw it to the attention of authorities or to other people whom they think can respond. This might only be the public. Journalists, politicians, and consumer groups do it all the time. This is of course why the idea is so well known.

Definition:  In the sense that we wish to use it in engineering ethics, it occurs when an employee or former employee conveys information about a significant moral problem outside approved organizational channels or counter to authority, to someone in a position to take action on the problem. (S&M p167)

There are four main features in this definition:

(1) Act of disclosure: Information is intentionally conveyed outside normal channels or against the wishes of  supervisors.

(2) Topic: The person believes it to be a significant moral problem for him, with consequences for the organization and the public. Examples are illegal acts, unethical practices, serious threats to public safety.

(3) Agent: The person disclosing the information is an employee, former employee, or otherwise closely associated.

(4) Recipient: This would have to be a person in a position to act on the problem. For example, telling it to a relative over dinner or a friend in the pub does not count.

We can divide the process as to whether it is kept inside the organization (internal whistle blowing) or taken outside (external whistle blowing). Even when it is internal, it might amount to going against authority. Whistle blowing can be open or anonymous.

Note that the definition does not make assumptions about whether or not the whistle blowing  is justified, other than the fact the person believes the moral problem to be significant. (Some other definitions differ slightly on this point.) Each case can then be argued on its merits.

Two examples, p168: Fitzgerald and the C-5A (left for the student to read), and Applegate and the crash of the DC10.  In one case the whistle was blown, in the other not.

DC 10 crash and Applegate's dilemma

346 people were aboard a Turkish Airlines DC 10 jumbo jet which left Paris on the evening of 3 March, 1974. Only nine minutes into the flight a cargo bay door flew open, and the main passenger cabin floor collapsed, taking out control cables and causing total loss of control of the tail surfaces. There were no survivors. Many engineers knew the design was unsafe, and the danger was well documented. So why did this happen?

Details of the DC10 affair

S&M give a brief summary of this well documented case, but more details are available in reference 7. (Sawyier, in "Engineering Professionalism and Ethics")

Design for the DC10 aircraft began in 1968, in response to a call from American Airlines for a versatile wide-bodied jet. McDonnell Douglas was in fierce competition with Boeing, who had produced the 747, and Lockheed (L1011) for a piece of the "Jumbo jet" market.

As early as 1968, Dutch engineers had already warned of the danger of the collapse of passenger cabin floors in these big planes if the cargo hold should suddenly become depressurized. In moving up from smaller planes to the big wide-bodied aircraft, these engineers argued, there was insufficient additional structural strength to withstand the enormous forces on the floor if there were a one atmosphere pressure differential across it, which is what would happen if the cargo bay suddenly depressurized for any reason.

In this case, the client, American Airlines, also insisted on a change in the cargo door closing mechanism. They wanted electrical door actuators rather than conventional hydraulic. They would be lighter, and cheaper. This requirement was passed to Convair, the design contractor, and was implemented.

Dan Applegate was the Director of Product Engineering for the Convair Division of General Dynamics Corporation, which was under contract to McDonnell Douglas. The potential weakness in the design was recognized by Applegate and Convair in 1968, and by 1969 they had identified nine possible critical failure sequences involving cargo doors.

But McDonnell Douglas took full responsibility for the design, and did not inform the regulators about these results, which indicated that the system could be exposed to a catastrophic failure. Applegate's superiors in Convair apparently would not allow the information to go beyond the company, knowing that there would be a dispute as to who would bear the cost of redesign. In fact, the contract between McDonnell Douglas and Convair forbade Convair from going to the regulator (the Federal Aviation Administration) directly.

Between 1970 and 1972, more problems with the cargo door were documented (including at least two cases of floor collapse), but the plane was still allowed to fly.

Applegate became so alarmed that by June 1972 he had written an internal memo documenting his concerns. Although these renewed concerns still did not go to the regulator, the FAA was in fact aware that there was a problem, because tests had revealed it. One of the regulator's own engineers was so frustrated that he, too, wrote and filed a memo. It all came apart on 3 March 1974, when the ground crew at Paris failed to properly secure a cargo bay door, and the crash occurred a few minutes later.

What do we conclude?

No doubt it is an extreme case, but this illustrates the classic dilemma of the employee engineer. If Applegate had gone over his bosses' heads directly to the regulatory authority, or the press, he would probably have been fired, but it might have saved 346 lives. On the other hand, who knows what pressures he might have been under, and for that matter, who knows how the system might have responded.

We should not jump too quickly to the conclusion that it was obvious at the time that he should have done something other than what he did, nor can we even conclude that the accident would certainly have been prevented. But we are reminded that acting ethically can sometimes require a lot of courage. Applegate struggled internally to do what he thought he should, and was badly let down by his management.

As S&M say, quoting another source, "ethics is not for wimps".

The DC10 disaster is an example of a complicated problem, with several people and organizations sharing blame. But complexity is more often the rule than the exception, and no code of ethics can be applied like a formula in every case.

Required conditions before whistle blowing:  Under what conditions may (or should) an engineer "blow the whistle"?  The potential harm must be substantial, and may affect the public, employees, or even shareholders. It may take various forms: safety hazards, faulty products, or fraud, for example. For the whistle blower it is a very serious undertaking, especially if it is done externally. S&M draw on guidelines proposed by Richard DeGeorge (see the reference in the text) for cases where public safety is concerned. They advise that the following conditions must be met before getting involved in whistle blowing:

(1) the actual or potential harm is serious and has been documented
(2) the concerns have been made known to superiors, and
(3) after getting no satisfaction from immediate superiors, other internal channels are exhausted, up to the highest levels of management.

Sometimes these conditions can be hard to meet, and there may be exceptions.  For example, confidentiality may interfere with the first one. The next two can be very problematic if the issue involves managers above you, or there is extreme urgency. Finally, the price of whistle blowing can be high, yet engineers and others have done it. Read the examples in the text - e.g. Fitzgerald.  What about Applegate, and Boisjoly? Both did all they might have been expected to do, but they did not go external.  Should they have?

Loyalty, collegiality and authority  Every company wants its employees to be loyal team players, and teamwork is a virtue with which engineering students are very well acquainted. The ability to work in teams, even in the face of competing ideas and uneven contributions from members, is a great strength and the mark of a good engineer. Although we would identify loyalty as a virtue in an employee engineer, the word does not appear literally in the APEGN Code of Ethics, nor in any of the others in S&M. The notion of loyalty is captured in the "faithful agent and trustee" phrase, which appears in them all, in one form or another.

Another similar characteristic is collegiality; the ability to get along with people, to make everyone feel they belong, and to resolve conflict when it occurs.

Two types of loyalty:  There are two levels of loyalty to an employer. The first can be called agency loyalty, and refers to the contractual duties that the employee has, as well as accepting the legitimate authority of the company.

The other type of loyalty is called attitude loyalty, and has more to do with emotions and a sense of personal identification with the group to which one is loyal. It is of course possible to meet the formal contractual obligations implied by agency loyalty without reaching the level of commitment implied by attitude loyalty.

The extent to which "faithful agent and trustee"  goes beyond "agency loyalty" is open to interpretation. Some would undoubtedly say that the intention in the code does indeed go further than this bare minimum. It can be reasonably argued that if the employee is achieving goals of his/her own through professional employment, more is owed to the company than the lowest level of loyalty. For example, at the very least, one is gaining professional experience. Similarly, if there is a sharing of burdens and benefits among the group, a sense of identification loyalty reasonably develops.

In the final analysis, the weight given to employer loyalty has to be judged in the specific situation. There are times when it must yield to higher values. Loyalty is thus a dependent virtue: it depends on the value of the contribution being made by the organization. Covering up wrongdoing to the public could not rate as a valuable contribution, and therefore does not command loyalty.

Respect for authority.  Employers must have legitimate authority over employees. Without lines of authority in an organization, everyone would be free to do their own thing, and chaos would soon ensue. Such authority is called executive authority, and is given to individuals to carry out their tasks within the institution.

This sort of authority is different from expert authority which derives from a special competence, knowledge or skill. Of course expert authority can exist in individuals who have no institutional authority.

Another important distinction is between authority and power. Real power amounts to the ability to inspire, persuade or direct others to accomplish objectives. An individual might have a good deal of institutional authority and resources, but if he/she lacks leadership skills, he/she can be quite ineffective in getting employees to produce for the company or institution. On the other hand, someone having expert authority and the respect of colleagues, or even simply charisma, might be able to motivate or otherwise persuade people to excel.

Protecting whistle blowers

"Whistle blowing is lonely, unrewarded, and fraught with peril..." (S&M, p175, quoting a lawyer). Nevertheless, engineers feel they must do it in some circumstances. It is not surprising that employers whose illegal or unethical conduct has been exposed can be quite upset; employees can suffer greatly, and of course court cases can result.

Common-sense procedures.  S&M suggest several pieces of advice in considering whether or not to take this serious action:

1. Except in extremely rare emergencies, always try working through normal organizational channels first. Get to know both the formal and informal (unwritten) rules for making appeals within the organization.

2. Be prompt in making objections. Waiting too long may create the appearance of plotting for your advantage and seeking to embarrass a supervisor.

3. Proceed in a tactful, low-key manner. Be considerate of the feelings of others involved. Always keep focused on the issues themselves, avoiding any personal criticisms that might create antagonism and deflect attention from solving those issues.

4. As much as possible, keep supervisors informed of your actions, both through informal discussions and formal memorandums.

5. Be accurate in your observations and claims, and keep formal records documenting relevant events.

6. Consult colleagues for advice - avoid isolation.

7. Before going outside the organization, consult the ethics committee of your professional society.

8. Consult a lawyer concerning potential legal liabilities.

To these I would add the very real caution that one should be clear about one's motives. It is very easy to fall victim to the temptation to turn some personal grievance into a great case of moral principle. The moral weight of self-interest should not be zero, but it is not one of the heaviest.

S&M mention an interesting web site created by an American group called GAP, the "Government Accountability Project".

The one compelling observation which must be made is that there is surely little protection or reward for whistle blowers. This led us to consider the conditions under which engineers are justified in whistle blowing, especially externally. This action requires serious moral inquiry, since it will often be done at the sacrifice of the interests of colleagues and family.

We note that in the Challenger case, Boisjoly did not go totally public with his concerns; there did not seem to be any mechanism which would stop the launch, and time was extremely short. But in meetings with the client he made his best case against launching that cold day.

Recap of issues concerning employee engineers' rights:

Before concluding this section (and the course) let me return to the set of "issues" with which we started the material concerning the rights of engineers.

-Do engineers have a moral right to refuse to carry out what they consider to be unethical activity?

Yes, we certainly do. But the exercise of this right does not necessarily impose a duty on the employer to keep us employed.

-what duty falls to employers to respect this right? That is, if an employee takes a course of action contrary to the employer's wish, based on moral grounds, must the employer refrain from action such as discipline or dismissal?

It depends on the situation. In some cases, there is a legal constraint on the employer, and court action can be brought by the engineer. In many cases, the employer also has the right to moral autonomy, and might not respect your right. In other cases the employer may have no control over the situation, for example where a law is broken.

-do the rights or responsibilities of either the employee or the employer depend on whether it is a business or an  institution supported from public funds?

We have not talked much about this. Fundamentally, the engineer's rights should not be different, nor should the responsibilities of the employer. But the duties to the public good and to fair treatment of employees are especially accented when the employment is in the public service. This has the effect of giving even more weight to those moral obligations.



1. Mike W. Martin, Roland Schinzinger. Ethics in Engineering, 3rd ed. McGraw Hill, 1996.

2. Jack McMinn. Ethics Spun From Fairy Tales. In Engineering Professionalism and Ethics, James H. Schaub, Karl Pavlovic, editors. Wiley, 1983, p466.

3. Ibid., p467.

4. Michael Davis. Thinking Like an Engineer: The Place of a Code of Ethics in the Practice of a Profession. Available on the Illinois Institute of Technology Web site.

5. Engineering Professionalism and Ethics, James H. Schaub, Karl Pavlovic, editors. Wiley, 1983, p458.

6. Robert L. Whitlaw. The Professional Status of the American Engineer: A Bill of Rights. In Schaub and Pavlovic, p295.

7. Fay Sawyier. The Case of the DC10 and Discussion. In Schaub and Pavlovic, above, pp368-401.

8. Samuel C. Florman. Moral Blueprints. In Schaub and Pavlovic, pp76-81.

9. Peter L. Bernstein. Against the Gods: The Remarkable Story of Risk. John Wiley and Sons, 1996, p1.

10. Ibid., p8.

11. Henry Petroski. Design Paradigms: Case Histories of Error and Judgement in Engineering. Cambridge University Press, 1994, p158.

Last Modified 17 March 2003 G. R. Peters